I hate how this sounds... but this reads to me as "we lack confidence in our code's security, so we're closing the source code to conceal vulnerabilities that may exist."
I would argue that the request was invalid in the first place.
If I see a flash from a speed camera operated by a business on behalf of a police department, your argument says I should be able to use the CCPA to force the business to delete my picture and the record of me speeding, provided I can get the request to them before the police file with the court and request that data as evidence.
The data belongs to the government, and you can't get around that by going to the business that holds the data and asking them to delete it.
> If I see a flash from a speed camera operated by a business on behalf of a police department, your argument says I should be able to use the CCPA to force the business to delete my picture and the record of me speeding, provided I can get the request to them before the police file with the court and request that data as evidence.
Sounds reasonable to me. If the police want to put up a camera, then the police should put up a camera.
Offloading their legal responsibilities to a third party company is shitty.
Honestly private prisons are a farce anyways, so yeah this seems valid to me. The government doesn't get to get out of its obligations to citizens by outsourcing to third parties, and third parties don't get to wield government-level authority without government-level accountability.
So police departments should have to develop and host all their administrative software also? I think we can all see why that would be a terrible idea. Police are like any other government agency or business in that they contract with the private sector for a variety of services that are not in their area of expertise.
> So police departments should have to develop and host all their administrative software also?
Yes. We're in a high-technology and information age. Police should be well versed in, and capable of understanding, the technologies and information systems that people use.
> I think we can all see why that would be a terrible idea.
I don't.
> Police are like any other government agency or business in that they contract with the private sector for a variety of services that are not in their area of expertise.
Why shouldn't police (or some law enforcement agency) be capable of operating and maintaining law enforcement technologies?
Develop, no. Host, yes. They should buy, own, and operate any technology like this on-prem. The only involvement that 3rd-party tech should have is sales, tech support, and maybe blind, encrypted backups accessible only by the municipality.
In other countries, police contract companies to develop software, then run and manage the software themselves. Putting up a continental dragnet to sell to government agencies is something I've only heard of from the US.
Nobody is saying cops should be writing software, but Flock shouldn't have access to the data and analysis tools it has right now. If American police can afford to be armed similarly to a small army, surely they can pay to run a couple of servers in a basement somewhere.
I'm surprised the USA is letting this happen given the culture of individual freedom that seems to have traditionally driven American laws.
Except the data does NOT belong to the government; that's the whole point of Flock operating the way it does. It's not governmental data collection, it's data collection by a private company that is then made available to the government upon request. And yeah: Flock is literally allowed to delete the data, because again, it's not a government agency. It's just private data, collected by a private company, with the exact same status as you recording a public intersection with a camera from your window.
The rule of any documentation is that it is out of date as soon as the ink is dry. By the time a regulation is enacted, workarounds/loopholes have already been found (if not intentionally worked into it).
Is it just me, or does this describe most Microsoft software at the moment? I tried to sign into my personal Microsoft account to set up an OAuth flow and was greeted by an infinitely repeating error dialog about some internal service that had failed.
At work, I use Outlook. I've lost count of the times I've gotten caught in an auth loop where I enter my creds + 2FA again and again, only for the screen to flicker and start all over again.
Different users. Many people care about privacy and aren’t using Meta products. And many businesses care about it too and have information policies to protect their IP.
> Different users. Many people care about privacy and aren’t using Meta products.
Yeah but if they can rake in 100x as much by making products for people who don't care about privacy, then why spend time developing stuff for people who care?
There is still a small market left, of course, but that market will not have the billions of R&D behind it.
It's largely out of Meta's hands now anyway. The risk here is not so much to privacy (it's Apple), but they'll turn the model space into a walled garden somehow, for sure.
70% of the world’s population use at least one Meta property at least once per day. How many of the other 30% are too poor/young/computer illiterate to be part of an addressable market?
Every company has dozens of SaaS products that store their business-critical information. Amazon installs Office and Slack on every computer (they were moving away from Chime when I left), and the sales department, SAs, and Professional Services use Salesforce (former employee).
The addressable market of even the companies that do care about privacy is not large. And how long will it be before computers become cheap enough to run even GPT-4-level LLMs, and companies give them to all of their developers?
The banking industry absolutely does care about privacy of their business data btw.
We do use tools like Confluence but they're all hosted in our own data centers.
These are all great statistics, but how do you explain the ClawdBot explosion, even in lower-income countries like China? There's so much demand that Apple can't keep up with production of Mac Minis. Why aren't these folks going toward cloud solutions? Is it cost, or is there some consideration for having more control over their data?
ClawBot doesn't generally run the model locally; it just talks to remote APIs, no different than any other agentic harness. You could run a local model on the same Mac Mini as your agent, but it wouldn't be very smart, and many agentic tasks around computer GUI/browser use, etc. would be out of reach.
> Why aren’t these folks going towards cloud solutions?
They are. The majority aren't doing inference on a Mac Mini, but instead using it as a local host for cloud-based inference. You could have the same general experience on a $200 Chromebook or $300 Windows box.
They are running cloud models in almost all cases. It's like saying it isn't cloud when you use the Facebook app on your phone (it is ON your phone and running there).
This just got me. Datadog decided that they only support the current and previous major versions of Go, so 1.26 and 1.25. But in my case we're still on 1.24.13, which the Go team released less than two months ago.
So upgrade to 1.25? What reason could you possibly have to be so far behind?
I can understand staying one version behind latest, to not be exposed to brand new bugs, which do happen, but staying two versions behind is pointless.
Using a release less than two months old is hardly “so far behind”. The 1.24 series had considerable regressions that took a number of patch releases to fix, and it stands to reason that the same would be true of newer releases. Given that miscompilations were still being fixed as late as 1.25.8, and that 1.25 brought in large changesets for the new experimental GC, sticking with 1.24 while it was still receiving patches a mere handful of weeks ago is not unreasonable.
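For what it's worth, this kind of minimum-version floor shows up mechanically in `go.mod`: a module's `go` directive declares the language version it requires, and the optional `toolchain` directive pins which Go toolchain builds it. A minimal sketch (the module path here is hypothetical, not from any real project):

```go
// go.mod — illustrative; the module path is hypothetical.
module example.com/myservice

// Language version this module itself requires; patch-level
// versions like this have been allowed since Go 1.21.
go 1.24.13

// Optionally pin the exact toolchain used to build. If a dependency
// (say, a Datadog library) declares `go 1.25`, building with an older
// toolchain will fail or auto-switch, depending on GOTOOLCHAIN.
toolchain go1.25.8
```

With `GOTOOLCHAIN=auto` (the default), the `go` command fetches and uses a newer toolchain when a dependency demands one, which is how a library bumping its `go` directive effectively forces consumers forward.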