They were saying "don't write to us, talk to the people who own the cameras and ask them to delete the data". A company that manufactures video cameras is not the one to talk to when someone records you; talk to the person who recorded you.
But a reasonable person would say -- the data is stored on Flock servers, not with the camera owners. And Flock would say, just because we sell data storage functionality to camera owners doesn't mean we own the data, any more than a storage service you rent a space from owns what you put in that space.
But then an even more reasonable person would say: the infrastructure is designed in such a way as to create inadvertent sharing, and the system has vulnerabilities that compromise the data, so Flock bears responsibility for setting up a system that is basically designed to violate privacy.
And that is the main criticism of Flock. You need to have a more nuanced criticism. It would be really interesting to see this litigated.
But the data collected is property of the government, and Flock is not allowed to use that data for additional business gain (according to their statements)...
So they can't sell the fact that you're at Target at 8:00 p.m. on Thursday to anybody... Nor build profiles to sell to advertisers... And if that's the case, that's very similar to cloud storage vendors.
If I access Hacker News, and the record of my visit is stored in an AWS S3 bucket, I can't ask AWS to delete my visitor record. Even though the server, network cards, wires, and storage medium are AWS property, it was Hacker News' website that generated that record, and it's their responsibility to handle my request to delete it. AWS' stance would rightly be "talk to the website operator for CCPA requests".
The AWS analogy breaks down because AWS doesn't encourage customers to pool their S3 buckets into a nationwide searchable index.
Flock operates a federated network. If you drive past an unmarked camera, you have absolutely no way of knowing which specific HOA or town leased it, so how are you realistically supposed to know who the "data controller" is to send your CCPA or deletion request to?
Someone once set off some fireworks not too far from me at 3am a few years back. They were loud and, yeah, cops were called. A few minutes later about five cars drove past me about 30mph over the limit. Not sure how they didn't see me or try to see me. But I know they didn't catch the BRIGHT orange and lifted car.
Me being me, I submitted a FOIA request for the dashcam footage of the five cop cars and the dispatch logs.
Instead of pulling over the easily identifiable car, they pulled over some random guy. They were behind him the whole time, but five cop cars pulled in behind him thinking he had fired a gun a few minutes back.
He was let go without a citation, but the official reason for the stop, despite being paired with the dispatch call for the firecracker, was a broken headlamp.
I may or may not know a business owner who got criminals off his business' street by saying he thought he saw a gun any time criminals showed up to do things, everything from prostitution to selling drugs. Cops showed up immediately. The criminals stopped coming by altogether; it's probably the safest street in quite a rough part of town.
It's crazy how cops just rush to very specific and nuanced crimes. Someone likely said they heard gun shots, and then they scrambled to find them.
I don’t care. I don’t care who owns the data. If I can’t easily get private information like my movements removed from a database like this, the legislation does not sufficiently protect me.
It should absolutely be Flock’s responsibility to remove my data and we should absolutely require it by law. Full stop.
The problem with this is where do you draw the line? If I film you with my iPhone (e.g. you walk past in the background of my video), Apple should delete my video from my phone and iCloud account based only on your instructions?
Apple holds the data in iCloud, and Apple (or a phone network) may be leasing me the phone. That sounds pretty similar to the Flock situation.
I guess the difference is that Flock might be sharing the data from a customer's camera with other customers. Then they are definitely controlling it.
I think the bigger problem with Flock is the fact that their cyber security is so laughably bad that non-customers can easily access the data.
Not pronouncing on which path is the most dystopian, just for the fun of the exercise: what if we push in that direction?
Given the rule, I would expect (IANAL) that Apple should not have to deal with data stored on phones they sold.
People are responsible for what they store on their device. When I take a photo in the street and someone comes to me asking me to erase a photo because they or their kids were in the background, I'll tell them I don't publish any photos online, which is generally the concern people have in mind, and it usually stops there. But if they insist, I will remove it from my phone, because I'm too lazy to actually edit the photo and remove them from the picture, even if that is certainly doable with a simple prompt by now.
Now, if Apple automatically stores photos on some remote server they own, they are the ones who should be responsible for making sure they aren't storing something illegal. Microsoft, Google, and Apple use PhotoDNA to detect known CSAM, if I'm not mistaken, though legally they only need to remove it once they get a notice about it. In the same way, they could proactively blur the faces of people not recognized as whitelisted for the uploading account. And, by that logic, they should certainly remove the information regarding a person if they get a notice, just as they wouldn't keep CSAM data once notified, would they?
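To make the hash-matching idea concrete: here is a minimal sketch of checking an upload against a known-hash list. PhotoDNA itself is proprietary, so this assumes the open-source imagehash library as a stand-in; the hash entry and file name are made up.

    # Minimal sketch of matching an upload against a known-hash list.
    # PhotoDNA is proprietary; this substitutes the open-source
    # `imagehash` perceptual hash. Hash entry and path are made up.
    from PIL import Image
    import imagehash

    known_hashes = {imagehash.hex_to_hash("d1d1b4a59e5c3f08")}  # hypothetical

    def matches_known_list(path, max_distance=4):
        # Near-duplicate images hash to values a few bits apart, so
        # compare by Hamming distance rather than exact equality.
        h = imagehash.phash(Image.open(path))
        return any(h - known <= max_distance for known in known_hashes)

    print(matches_known_list("upload.jpg"))  # hypothetical upload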
Anyway, the underlying issue is not who stores what, but what societies lose by letting mass surveillance infrastructure be deployed, no matter how the ownership/responsibility dilution game is played on top of it.
The legal term is 'distinction without a difference'. Flock/others can't create a weaselly scenario to pretend it's something else. Otherwise people could bypass all kinds of laws/rules just by giving some weaselly description to everything.
This also falls under the 2026 rule "Everyone Is 12 Now". Flock is literally acting like a 12-year-old to get out of following the rules. My 12-year-old tried the same dumb parsing of things to avoid rules/consequences.
A reasonably nuanced defense could likely claim that being able to do what you want would have much worse side effects on privacy.
For example, would you want to be able to tell Public Storage (or some other storage unit place) to remove any naked photos of you stored anywhere in their storage units?
For them to actually be able to do that would require them to have nigh-omniscient knowledge of everything stored by/for everyone in every one of their storage units. Even inside closed boxes.
Now, it's not the same thing of course - but hopefully you understand what I'm referring to?
Except that the analogy is that they already have, or can easily create, that list. If they couldn’t, their value proposition would be lame. “We know you’re looking for a specific license plate, here’s a million hours of footage from all over the city, have at looking through it all.”
Only for paying customers, which you aren't of course. If those customers paid public storage to inventory their stuff, then that inventory is their property. Surely it would be inappropriate to use their inventory data to find your naked photos. A violation of privacy even. (/s, kinda)
I was enumerating the likely defense, not saying it's valid.
This is also true according to their contracts (we were one of the first munis in the country to ostentatiously cancel our Flock contract, and the lead-up to that was a bunch of progressive legal experts poring over that contract looking for holes).
>a bunch of progressive legal experts poring over that contract looking for holes
all attorneys represent their clients; your attorney does not have to share your opinion of the law or public policy, and they can still tell you what the law means for you.
if you are afraid your attorney might have a bias (they are human), you may get better advice from the "misaligned" POV: the flaws/holes in a privacy law found by a pro-business conservative attorney are more likely to find sympathy in the courts from conservative and progressive judges alike.
As a practical matter, this may be good advice. But it also places a demand on someone with a legitimate concern that they go find an ideological "beard" to make themselves more palatable and sympathetic.
It's not hard to see how this enables an institution to gate itself from criticism.
Except that Flock very clearly benefits financially from having direct access to this data: owning (and in their own documentation, they very clearly do own it) a network of 80,000 surveillance devices across the country, and owning every single transit point for the data they collect, is what gets them to a $7.5 billion valuation from investors.
The fact of the matter is that Flock is playing two-step with the concept of "ownership" of data. They disclaim ownership as a way to leave local agencies holding the bag for liabilities, but they fight tenaciously to retain complete and unfettered access to that data.
(After organizing a community group that won Flock contract cancellations in multiple jurisdictions in Oregon, I went on to coauthor state legislation regulating ALPRs. I am very familiar with all the dirty ball they play.)
Also, Flock's cameras collect more data than is provided to police agencies. Who owns that data, I wonder?
That makes them a data broker in my reading, and at least in California, data broker legislation should apply. The CA Data Broker registry gives me "access denied", but that could be because I am outside the US.
Because Flock isn't a data broker. Flock's customers own their data, not Flock, and they use Flock's platform voluntarily to share data with other customers.
Flock charges to access the data which is voluntarily shared by other customers. I am struggling to see how this practice differs from any other data brokerage service in existence.
Does Flock do some kind of P2P dance to avoid the data transiting their systems?
They are not, because they are not operating a business that acquires and resells your data. You own your document, and Google isn't selling it to third parties. Flock doesn't own municipal data, and Flock is also not "selling it to third parties"; it's facilitating a sharing system that law enforcement agencies avidly desire.
Presumably the California data brokerage statutes were written specifically to prevent the kind of nerd-lawyering happening on this thread.
I was referring to the claim that "Flock's cameras collect more data than is provided to police agencies" — that suggests that there is data not "owned" by the customers, which implies it's Flock's data, thus it might make them liable under Data Broker legislation.
I encourage you to present that analogy to an actual court and see how far it gets you. It's very easy to find the statutory definition of a "data broker" under California law.
This is what I mean by the fruitlessness of these kinds of legal discussions on HN. What do you want me to argue, that you're wrong to want the law to work that way?
Technically, most stocks are registered in the name of a securities holding company, with you named as beneficial owner. That makes it frictionless for you to buy and sell. You enjoy all the rights of ownership, unless the broker lends your shares out to someone else.
So… Flock uses their own platform and top-to-bottom tech stack to do everything technically? Your local PD doesn’t use random cameras (like Reolink), doesn’t run a custom software stack (like Frigate in a container on some random VM hosted with AWS), doesn’t store the data wherever (like Backblaze)? The customers just have to install the Flock cameras and “order” the subsequent data from Flock? But you say they’re not at all responsible or accountable for any of it because, despite doing everything at every step, they’re “just a broker”?
If Flock's customers, using Flock's infrastructure or tooling, can share data with each other, that would be bad.
I'm not saying that's what's happening, but that's what I thought was happening before reading this thread, and now I have to go and run through their policies.
Either way ALPRs and AI-facial scanners in public are a huge violation of privacy and I loathe them, but I hope it's correct that Flock customers cannot easily share information with one another.
> If Flock's customers, using Flock's infrastructure or tooling, can share data with each other, that would be bad.
Ex-employee of Flock here, that's ABSOLUTELY what's happening.
And what's more, Flock lets them do so even when they know the agencies are legally not permitted to. They turn a blind eye and say it's not their problem to enforce ("oh, doing so in state X is illegal? Well, even if your agency is in state X, we didn't disable that feature"), then happily provide training to enable those agencies to do so (and it's a nudge-nudge-wink-wink part of the sales process).
> But the data collected is property of the government
I thought this was the get-out clause from the constitutional problems with Flock? That because Flock is a non-government organisation it isn't restricted by the constitution (i.e. the constitution only restricts what the government can do).
They can't have it both ways: if Flock is collecting the data, then they are subject to the privacy laws. If it's the government collecting the data via Flock as just a service, then they are subject to constitutional restrictions.
That’s a pretty compelling argument, but what if I went round to AWS’ house, peeked into their kitchen, and saw a crate of photos on their table with me in them?
I’d absolutely say:
“Hey, that’s me! Give me those right now!”
I’d also be pretty angry if they told me:
“Sorry we’re storing those for Corp Inc. Go ask them.”
To refute my own point though, this only sounds annoying because the data processor is being irritating by manually referring me to the data controller. In practice, it would be trivial for them to automatically forward communications between me and the controller.
That’s what feels amiss with the top-level article.
If I lease out a property to a tenant (apartment, retail, industrial use, whatever) and that tenant is committing illegal activity on the property, would the landlord be liable if they knew about it? Or not?
"Sorry FBI, the tenant renting my warehouse to manufacture cocaine is not my responsibility. I won't do anything about it. You deal with them."
Nope, that's a failure of a duty to act, and aiding and abetting criminal activity if you have constructive knowledge.
This is worth validating independently, but to be clear:
Are you saying Flock itself does not have access to any of the data, and that the data they store on behalf of local governments is not fed into any central datalake? That every organization's data is completely, unalterably separate from everyone else's?
If so, that makes the panopticon slightly less powerful.
I assume they are building "meta-data" profiles of people based on the data they say they can't use directly. That seems like an easy work-around that satisfies the lip-service they've given to the issue.
Yeah, but their argument is that if someone takes a photo of you with their iPhone and it's uploaded to iCloud, you can't ask Apple to delete the photo; you need to ask the person who took it.
Your example is apples and oranges. Flock maintains private infrastructure that stores data.
If the DSLR uploaded them to Rent-A-Center owned/leased servers it would in fact require Rent-A-Center to take the necessary steps.
As Rent-A-Center would be the only group with proper access to the data storage, they would have inserted themselves into the chain of custody, and thereby have an obligation to ensure others' data is wiped from systems they control.
Flock has knowledge/use of the data. Their system processes can relate the photos “owned” by two different entities. They’re interacting with it and selling their access to it as a feature. That’s obviously distinct from S3.
I know quite a bit about Flock, having been intimately involved in the process of evicting it from our municipality, and I don't think the distinction you're trying to draw here is meaningful. Flock will say they provide a service, one avidly sought by the actual owners of the data, to generate analysis based on that data.
They're contractually forbidden from "selling their access to it" to arbitrary parties; they can share data only with the consent of their customers, almost all of whom actively want that data shared --- this is a very rare case of a data collection product where that's actually the case.
Flock's facilitation of data-sharing is a huge part of their value proposition over other cameras, and why their customers buy from them over their competitors.
As such, even if they can contract it such that they are not legally responsible for such use, they are very much knowingly facilitating it. If this were physical goods rather than data, they would probably be held as responsible as their customers.
I've read our contract. I know what it says. This isn't an abstraction. They can do lots of things. What they actually do is not data brokerage under California Law, at least not that I can tell.
What Flock names the relationship in their contract does not make it one, as courts very much duck-type.
Flock knowingly collects PII of people they have no direct relationship with, and transfers it to third parties. Whether that transfer, which Flock seems to gain from, is legally a sale is something to be argued at great expense in front of a court.
But regardless of that definition, I do think that any reasonable person (= not a corporate lawyer) would consider there is a sale of data here.
Except their customer's data isn't actually theirs: OP requested their private data to be deleted from the system. So OP expressed a clear intent for their data not to be used by Flock's customer. We could say that the data thus becomes abusively retained on these systems. As a result, IF Flock has the technical means of performing the requested data deletion, it should be compelled to perform it.
This is the same situation as a web hosting provider: if it is communicated to them that one of their customers uses their service to host illegal content, then it becomes the web hosting provider's responsibility to remove that content.
Reasonable technical feasibility for the service provider is key here, but it can be argued since the data can apparently be shared in ways that identify OP.
Probably not how the law currently works (don't know, not a lawyer), but I guess it should, as otherwise it allows creating a platform that shares abusively retained data without any reasonable recourse for the subjects of this data to remove the data from the platform.
If I as a photographer take a photograph of someone, the photo does not belong to that person—the photographer retains the IP and ownership rights.
You have rights too, such as privacy/likeness rights, which allow you to restrict what the IP owner is allowed to do with the image that they own, but you do not own the data, and your rights give you a claim against the data owner.
Flock probably have legal obligations or contractual commitments not to delete or destroy their customers' data, and changing that is not necessarily a good thing.
I don't live in a state with a law like California's so your "gotcha" isn't relevant.
Californians would have standing under the law but need expensive lawyers to litigate.
AWS has employed expensive lawyers to argue semantics; they host OS VMs and databases. This provides them legal cover for what AWS customers store.
Amazon the retailer stores customer data. A non-customer would have standing under California law to litigate removal of PII should they decide to hire lawyers.
Your reductionism is to law what a Linux beige box on a routable IP, no firewall, hosting a production health database with creds set to admin/pwd1234 is to software engineering.
Coincidentally 1234 happens to be the code to my luggage.
If AWS maintained private infrastructure that stored and indexed data associated with people's license plates and vehicles, and then charged customers to do searches against that data, then yes, you could write to them to ask them to purge data pertaining to you.
If Flock was just an opaque cloud storage service for law enforcement to back up their mass surveillance to then sure, your argument would have merit; it's not, it's a giant database of photos, locations, times, license plate information, and likely a lot more. They're not selling cloud storage, they're selling (leasing?) surveillance devices and tools.
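To put the distinction concretely, here is a minimal sketch (hypothetical schema, made-up values) of the difference: an opaque storage service holds bytes, while what Flock sells is effectively a queryable index like this.

    # Hedged sketch: if the operator can answer a query like this,
    # it is organizing the data, not just holding bytes for someone.
    # The schema and values here are hypothetical.
    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE sightings "
               "(plate TEXT, camera_id TEXT, seen_at TEXT, photo_key TEXT)")
    db.execute("INSERT INTO sightings VALUES "
               "('ABC1234', 'cam-17', '2024-05-01T20:03', 'blob-9f2')")

    # The value proposition is exactly this query: plate in, movements out.
    for camera_id, seen_at in db.execute(
            "SELECT camera_id, seen_at FROM sightings WHERE plate = ?",
            ("ABC1234",)):
        print(camera_id, seen_at)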
The argument you're making implicates way more than just Flock, and is in a practical sense novel. If you can cite jurisprudence (or even legal experts) backing it up, I'm interested in reading it. Otherwise, I'm happy to accept that we just have premises about the law that are too far apart for an argument to be productive.
My experience on HN is that these kinds of discussions almost immediately devolve into debates about what people want the law to be, as opposed to what it actually is.
Realistically speaking you're never going to get pro-Flock people in any numbers on this site writing comments at all. The anti-surveillance position's popularity when it comes to upvotes, downvotes, and flags on this site is such that antis will continue posting about what they want the law to be and pros will stay out. That's just how crowd voting dynamics shake out.
Devil's advocate here. There is currently an article on the front page about a US bill to compel operating systems to collect age verification / ID data. If something like that was actually in place and every packet on the internet was stamped with your digital ID, then you could feasibly demand that AWS purge/filter your data out of their systems.
You can't even request AWS delete your actual PII from S3. If you've been to a doctor in the last 2 years, you have HIPAA-protected PII somewhere on S3, and AWS won't do a thing about it for you. I don't know why people have this idea that service providers will scrub their customers' data for you.
Does AWS actively and by design parse and keep track of personally identifiable information of the data that AWS customers store on their S3 buckets? If that were the case they would absolutely be subject to CCPA (and GDPR) requests for deletion.
However, I suspect that is not the case. AWS is agnostic as to the type of data stored on S3, and deletion of PII stored on S3 is the sole responsibility of the AWS customer that chooses to store it.
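For contrast, a minimal boto3 sketch of what "agnostic" looks like in practice; the bucket and key names are made up.

    # Hedged sketch: to S3 an object is opaque bytes. Nothing here
    # parses or indexes the payload; only the customer's own code
    # knows it contains PII. Bucket and key names are hypothetical.
    import boto3

    s3 = boto3.client("s3")
    s3.put_object(Bucket="example-customer-bucket",
                  Key="visits/2024-05-01.log",
                  Body=b"visitor=jane; page=/item?id=123")

    # Deletion happens only when the customer, who holds the
    # credentials, decides to issue it:
    s3.delete_object(Bucket="example-customer-bucket",
                     Key="visits/2024-05-01.log")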
I think if it were only offline storage it would not be as big of an issue. A more accurate analogy would be renting a DSLR that automatically transmits every picture to Rent-A-Center servers.
If Rent-A-Center installed the camera in a bathroom, I'd contend that it does.
Flock's cameras aren't in bathrooms. However, they're still recording people who haven't opted into it. ("But you have no expectation of privacy in a public place!" "You have the expectation that someone might inadvertently overhear you. You don't have the expectation that someone is actively recording you at all times.")
> They were saying "don't write to us, talk to the people who own the cameras and ask them to delete the data".
The response to this should just be, "Yes, very well, please divulge a complete list of your customers, their contact information, and information about camera locations so I will be able to pursue this per instructions".
When that obviously doesn't work either, then we can all agree the law as written is completely useless, and feel great about rewriting it in a way that's calculated for maximum damage to both the vendor and their customers, and collateral damage to the whole panopticon. Or, just spitballing here, we could skip to the punchline and do all that anyway.
It easily goes both ways. But we do sue American gun makers for deaths caused by lunatics. We sue drug makers for drugs prescribed by a doctor. We sue cloud providers for not reporting illegal photos. Printers are forced to ID every printed page to combat counterfeiting. Banks are forced to close accounts even though it's not their dirty money.
Most of those examples have to do with the manufacturers knowing that their products were dangerous, addictive or illegal and advertising them aggressively as safe. They're mostly litigated on product liability claims. Banks are regulated by an entire architecture of laws, and we could enact laws that would regulate Flock too, as one of the other commenters pointed out they're doing in Oregon.
The way this has been addressed in complex product liability cases in the USA is that the public-facing brand owner has certain legal liability for the product, despite contractors or supply chains. In this case, it appears that Flock is the brand owner and is public-facing.