> Flock Safety’s customers own the data and make all decisions around how such data is used and shared.
which seems to directly oppose the CCPA. It's my data, not their customers'.
Again, I didn't really expect this to work. And yet, I'm still disappointed with the path by which it didn't work.
Did YC house style change a while back to drop the "(YC xxx)" annotation since so many popular firms participate, or because it's well known?
Or vote for/against them, that might work too.
Flock seems to leave the data in ownership of the government. They are just providing the service of being custodians for storing and accessing that data.
You would probably get a similar response by submitting your request to Amazon Web Services or Google Cloud or whoever hosts Flock's data: "sorry, we're just holding the data on behalf of Flock."
In either my example case or yours, you would have a very hard time convincing the hosting business to destroy its customers' data without a court order or a court case showing that its policy is invalid and it must comply.
Not a lawyer, just noting the parallel.
I do appreciate that Flock's response says they cannot use the data they've collected for other purposes, which further reinforces my cloud storage analogy: the cloud vendor can't look at the data you upload to storage to, e.g., build profiles on you or your business.
> In accordance with its Terms and Conditions, Flock Safety may access, use, preserve and/or disclose the LPR data to law enforcement authorities, government officials, and/or third parties, if legally required to do so or if Flock has a good faith belief that such access, use, preservation or disclosure is reasonably necessary to comply with a legal process, enforce the agreement between Flock and the customer, or detect, prevent or otherwise address security, privacy, fraud or technical issues. Additionally, Flock uses a fraction of LPR images (less than one percent), which are stripped of all metadata and identifying information, solely for the purpose of improving Flock Services through machine learning.
In this document, to which they linked in their reply, it says clearly "address ... privacy ... issues."
Does your case not constitute a privacy issue? I would say so.
Continuing down below, their "trust us" claim about how they employ machine learning would need some proper transparency into how that can be guaranteed.
The best source of this information is https://deflock.org/ . FWIW, this is run by a neighbor in Boulder, CO which has been wrestling with the use of these cameras.
For example, if Flock receives a legitimate request to delete some data, then Flock must forward that request to all their Data Processors (e.g. including AWS/GCP/Cloudflare) and they must delete it as well.
Does this hold water? I'm reading the CCPA rules now but if anyone knows, it would save me some tedious research.
If you write the police and ask them to delete all their data about you, that isn't a thing that they do. It shouldn't matter if the police store their data on AWS or their own servers.
Flock is a tool used by the police so it should work the same way.
> (2) (A) “Personal information” does not include publicly available information [...]
> (B) (i) For purposes of this paragraph, “publicly available” means any of the following:
> (I) Information that is lawfully made available from federal, state, or local government records.
> (II) Information that a business has a reasonable basis to believe is lawfully made available to the general public by the consumer
2x4
rebar
spraypaint
spray foam
battery powered metal cutter
And bash those pieces of shit to chunks or completely ruin the lens and solar.

Republican community? They love corporate surveillance. Democrat community? They too love corporate surveillance.
There is no "Peoples' Party" that rejects this garbage.
To the extent that Flock is only storing the data on behalf of their customers, I'd understand they wouldn't be required to delete it. But to the extent that they are indexing it, deriving from it, aggregating it across customers, and sharing it via their platform, it seems they should be required to remove that data from those services.
But then again, I am not a lawyer!
In short, Flock is a "service provider" and not the entity doing the recording.
Perhaps you can make a case that they are a "data broker" instead (https://oag.ca.gov/privacy/ccpa#collapse1i), but that is a separate law, and what you are really looking at is a combination of license plate, time, and location being collected and sold without your consent.
Obviously, I am not a lawyer (and not even US-based), but I like when privacy is respected :)
But that's not what Flock is claiming. They're claiming that they don't even have to consider the request because they don't own the data.
[0] https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-re...
[1] https://www.clarip.com/data-privacy/ccpa-erasure-exemptions/
It would be a pity if someone made dense point clouds of these devices.
Would you ask your local ISP to delete data they provided to Tinder like your IP address? That doesn't make sense to me.
But a reasonable person would say -- the data is stored on Flock servers, not with the camera owners. And Flock would say, just because we sell data storage functionality to camera owners doesn't mean we own the data, anymore than a storage service you rent a space from owns what you put in that space.
But then an even more reasonable person would say: the infrastructure is designed in such a way as to create inadvertent sharing, and the system has vulnerabilities that compromise the data, so Flock has responsibility for setting up the system in such a way that it's basically designed to violate privacy.
And that is the main criticism of Flock. You need to have a more nuanced criticism. It would be really interesting to see this litigated.
Now, with you likely not keeping that Ring tied to a business account, how that applies to non-businesses holding PII is a different matter.
But yes, data that can be used to track my movements in my vehicle is certainly a type of personally identifiable information. I'd argue there should be some exemptions for individuals operating on a small scale, which I believe the CCPA has (and if we actually got a US GDPR, that it should have). But also that kind of exception shouldn't apply to a camera jointly operated by and backhauling to Ring.
https://theonion.com/american-people-hire-high-powered-lobby...
Would our main check on this be whistleblowers?
Wait till you see their "Transparency Portal" which, if my county and its neighbors can be used as a sample, doesn't even name at least 30% of the agencies using Flock.
By analogy, Google Docs isn't marketed for healthcare use. If you wanted, you could put a bunch of PHI in a Google doc and it wouldn't be their responsibility. They certainly didn't tell you to do that. However, if they marketed Google Docs as a great place to store PHI, yeah, then suddenly they're on the hook for complying with the relevant laws like HIPAA.
(Although in this case Google will sign a HIPAA business associate agreement with you and voluntarily agree to comply. They still don't market it that way, or at least don't predominantly do so.)
[0] https://www.flocksafety.com/blog/flock-safety-does-my-neighb...
* The right to know about the personal information a business collects about them and how it is used and shared;
* The right to delete personal information collected from them (with some exceptions);
* The right to opt-out of the sale or sharing of their personal information including via the GPC;
This isn't someone incidentally taking pictures of license plates in an otherwise noncommercial setting. It's a company literally created to collect and sell PII. Laws are different for them than for us.
If I see a flash on a speed camera operated by a business on behalf of a police department, your argument says I should be able to use the CCPA to force the business to delete my picture and the record of me speeding, if I can get the request to them before the police file with the court to request that data as evidence.
The data belongs to the government, and you can't get around that by going to the business that holds the data and asking them to delete it.
Sounds reasonable to me. If the police want to put up a camera, then the police should put up a camera.
Offloading their legal responsibilities to a third party company is shitty.
We're talking about Flock. A company offering surveillance as a service. Per their website:
>Trusted by over 12,000 public safety customers including cities, towns, counties, and business partners.
If Flock's argument holds, then most of the CCPA could be circumvented this same way. All it takes is a few entities and some clever contract language.
So they can't sell the fact that you're at Target at 8:00 p.m. on Thursday to anybody... Nor build profiles to sell to advertisers... And if that's the case that's very similar to cloud storage vendors.
If I access Hacker News, and the record of my visit is stored in an AWS S3 bucket, I can't ask AWS to delete my visitor record. Even though the server, network cards, wires, and storage medium are AWS property, it was Hacker News's website that generated that record, and it's their responsibility to handle my deletion request. AWS's stance would rightly be "talk to the website operator for CCPA requests."
You're going to get a lot of cheerleading and support about this in venues like HN and Reddit, because you're narrowcasting to an audience already primed to be hyperconcerned about surveillance technology (I am too). I think you're going to find those attitudes do not in fact generalize to the public at large, and especially not to the legal system.
Best of luck either way. It'll be an interesting experience to write up, and I'm happy to read about the outcome, even if I do think it's highly predictable.
And what do you know? I got no reply, but the content disappeared in ~48 hrs.
https://i.ibb.co/WWWYznHX/flock-future.png
See also a poster from IBM’s German subsidiary, circa 1934. The approximate translation: “See everything with Hollerith punch cards.”
https://www.clevelandjewishnews.com/opinion/op-eds/new-detai...
FYI, Flock owns the cameras.
"We operate using a lease model. What does that mean? Since we own the hardware, we own the problems that occur."
> (v) (1) “Personal information” means information that identifies, relates to, describes, is reasonably capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household. Personal information includes, but is not limited to, the following [...]:
> (E) Biometric information.
> (H) Audio, electronic, visual, thermal, olfactory, or similar information.
https://leginfo.legislature.ca.gov/faces/codes_displaySectio...
To your point, the intent would presumably still matter for exceptions to when deletion requests must be honored (say for journalism), but a photo of someone walking down a public street would still logically be considered the subject's personal information, by the above definition.
If the DSLR uploaded them to Rent-A-Center owned/leased servers it would in fact require Rent-A-Center to take the necessary steps.
As Rent-A-Center would be the only group with proper access to the data storage, they would have inserted themselves into the chain of custody, and thereby have the obligation to ensure others' data is wiped from systems they control.
Flock's cameras aren't in bathrooms. However, they're still recording people who haven't opted into it. ("But you have no expectation of privacy in a public place!" "You have the expectation that someone might inadvertently overhear you. You don't have the expectation that someone is actively recording you at all times.")
The fact of the matter is that Flock is playing two-step with the concept of "ownership" of data. They disclaim ownership as a way to leave local agencies holding the bag for liabilities, but they fight tenaciously to retain complete and unfettered access to that data.
(After organizing a community group that won Flock contract cancellations in multiple jurisdictions in Oregon, I went on to coauthor state legislation regulating ALPRs. I am very familiar with all the dirty ball they play.)
Also, Flock's cameras collect more data than is provided to police agencies. Who owns that data, I wonder?
[0]: https://www.scotusblog.com/cases/case-files/chatrie-v-united...
Obviously, the idea is not to disallow someone taking a photo of you as a background, passing figure while they take a front-and-center photo of their family, but to disallow you being made the main subject unknowingly, especially when you object explicitly.
On the other hand, a photographer still owns the copyright to a photo, so a subject (including in a portrait) cannot claim it or distribute it without permission even if they can potentially stop the photographer from distributing that photo.
IANAL, but you are not by default allowed to use anyone's "likeness" for your individual profit.
As a suggestion, I saw you have RSS:
I didn't see it mentioned in the main page or About or Archive. Maybe add it to a more visible place?
"Sorry FBI, the tenant renting my warehouse out to manufacturing cocaine is not my responsibility. I won't do anything about it. You deal with them."
Nope, that's a failure of a duty to act, and aiding and abetting criminal activity if you have constructive knowledge.
The response to this should just be, "Yes, very well, please divulge a complete list of your customers, their contact information, and information about camera locations so I will be able to pursue this per instructions".
When that obviously doesn't work either, then we can all agree the law as written is completely useless, and feel great about rewriting it in a way calculated for maximum damage to both the vendor and their customers, and collateral damage to the whole panopticon. Or, just spitballing here, we can skip to the punchline and do all that anyway.
Flock operates a federated network. If you drive past an unmarked camera, you have absolutely no way of knowing which specific HOA or town leased it, so how are you realistically supposed to know who the "data controller" is to send your CCPA or deletion request to?
All attorneys represent their clients; your attorney does not have to share your opinion of the law or public policy, and they can still interpret what the law means for you.

If you are afraid your attorney might have a bias (they are human), you may get better advice from the "misaligned" POV: the flaws and holes in a privacy law found by a pro-business conservative attorney are more likely to find sympathy in the courts from both fellow conservatives and progressive judges.
This is not the case in the United States. There is no presumption of privacy in public. In fact, there is a whole genre known as "street photography" that involves taking pictures in public without explicit consent of the subjects.
Edit: from https://oag.ca.gov/privacy/ccpa
> If a service provider has said that it does not or cannot act on your request because it is a service provider, you may follow up to ask who the business is. However, sometimes the service provider will not be able to provide that information. You may be able to determine who the business is based on the services that the service provider provides, although sometimes this may be difficult or impossible.
However, I suspect that is not the case. AWS is agnostic as to the type of data stored on S3, and deletion of PII stored on S3 is the sole responsibility of the AWS customer that chooses to store it.
Even your full legal name and birth date cannot be guaranteed to refer only to you specifically (as there could be someone else with an identical name and birth date), but it's obviously still PII because it helps narrow the field immensely if you can combine it with other information - for example, your IP address.
So yeah, "anyone could have been driving my car", but if you also know that the car drove from your home to your work then that narrows down the list of likely individuals immensely.
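The narrowing-by-intersection argument above can be sketched as a toy example (all names and observations here are hypothetical, purely to illustrate the logic):

```python
# Each observation alone is ambiguous, but intersecting the
# candidate sets each one implies narrows identity very quickly.
name_dob_matches = {"person_a", "person_b"}           # same name + birth date
plate_drivers = {"person_a", "person_c", "person_d"}  # anyone with access to the car
commute_route_matches = {"person_a"}                  # drove from your home to your work

# The intersection is whoever is consistent with *all* observations.
likely_identity = name_dob_matches & plate_drivers & commute_route_matches
print(likely_identity)  # {'person_a'}
```

This is why "anyone could have been driving" is weak comfort: each data point prunes the candidate set, and license plate sightings tied to times and places prune it hard.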
Conversely, if your license plate was spotted parked near an anti-ICE rally, then they can be pretty confident that you or someone you know was near an anti-ICE rally, which means they can harass you about it, follow you around, shoot you in the street, etc.
If https://legalclarity.org/can-you-post-someones-picture-witho... is to be trusted though, at least you get protection from your likeness being used for commercial purposes, though that seems a bit more limited than I'd expect.
I'm not convinced this is the case. It might be equipment made by them, but does that necessarily mean they were ever even in possession of the data in question?
Would you ask the manufacturer of your oven what you ate for dinner last week? No, you're just using an appliance that they made.
In the case of Flock I don't think we have any evidence of whether Flock themselves ever hold or store any data produced by their devices when operated by a customer.
No we don't, there's a federal law explicitly protecting gun manufacturers from liability for gun crime. https://en.wikipedia.org/wiki/Protection_of_Lawful_Commerce_...
And that's a good point. I'll look at that when I get home.
Yes. We're in a high-technology and information age. Police should be well versed in and capable of understanding the technologies and information that people use.
> I think we can all see why that would be a terrible idea.
I don't.
> Police are like any other government agency or business in that they contract with the private sector for a variety of services that are not in their area of expertise.
Why shouldn't police (or some law enforcement agency) be capable of operating and maintaining law enforcement technologies?
I might have to do that.
It's not hard to see how this enables an institution to gate itself from criticism.
But you knew that.
If Flock was just an opaque cloud storage service for law enforcement to back up their mass surveillance to then sure, your argument would have merit; it's not, it's a giant database of photos, locations, times, license plate information, and likely a lot more. They're not selling cloud storage, they're selling (leasing?) surveillance devices and tools.
That doesn't seem correct, even leaving aside the obvious moral issues with that.
I don't like either of those activities, but I think one of them is much worse.
Are you saying Flock itself does not have access to any of the data, and that the data they store on behalf of local governments is not fed into any central datalake? That every organization's data is completely, unalterably separate from everyone else's?
If so, that makes the panopticon slightly less powerful.
It should absolutely be Flock’s responsibility to remove my data and we should absolutely require it by law. Full stop.
[0]: https://www.eff.org/deeplinks/2025/10/flock-safety-and-texas...
They're contractually forbidden from "selling their access to it" to arbitrary parties; they can share data only with the consent of their customers, almost all of whom actively want that data shared --- this is a very rare case of a data collection product where that's actually the case.
My experience on HN is that these kinds of discussions almost immediately devolve into debates about what people want the law to be, as opposed to what it actually is.
> “Personal information” does not include [...] Information that a business has a reasonable basis to believe is lawfully made available to the general public by the consumer
But I'm with you: both suck.
Me being me, I submitted a FOIA request for the dashcam footage of the five cop cars and the dispatch logs.
Instead of pulling over the easily identifiable car, they pulled over some random guy. They had been behind him the whole time, but five cop cars pulled in behind him thinking he had fired a gun a few minutes earlier.
He was let go without a citation, but the official reason for the stop, despite the dispatch log tying it to the firecracker, was a broken headlamp.
I think what's happening here is that people are trying to colloquially define "selling access to data" to fit the camera data sharing that Flock enables, and then saying that because you have to pay to be a Flock customer to get access to that data, they're effectively selling it. I don't think that's how data brokerage laws work. Flock doesn't own the data they're providing access to, and they're providing that sharing access with the (avid!) consent of their customers.
For example, would you want to be able to tell Public Storage (or some other storage unit place) to remove any naked photos of you stored anywhere in their storage units?
For them to actually be able to do that would require they have nigh omniscience on everything stored by/for everyone in every one of their storage units. Even inside closed boxes.
Now, it's not the same thing of course - but hopefully you understand what I'm referring to?
Does Flock do some kind of P2P dance to avoid the data transiting their systems?
I'm not saying that's what's happening, but that's what I thought was happening before reading this thread, and now I have to go and run through their policies.
Either way ALPRs and AI-facial scanners in public are a huge violation of privacy and I loathe them, but I hope it's correct that Flock customers cannot easily share information with one another.
This is the same situation as a web hosting provider: if it is communicated to them that one of their customers uses their service to host illegal content, then it becomes the web hosting provider's responsibility to remove that content.
Reasonable technical feasibility for the service provider is key here, but a case can be made, since the data can apparently be shared in ways that identify OP.
Probably not how the law currently works (don't know, not a lawyer), but I guess it should, as otherwise it allows creating a platform that shares abusively retained data without any reasonable recourse for the subjects of this data to remove the data from the platform.
This is what I mean by the fruitlessness of these kinds of legal discussions on HN. What do you want me to argue, that you're wrong to want the law to work that way?
I was enumerating the likely defense, not that it's valid.
I wrote to Flock’s privacy contact to opt out of their domestic spying program:
I am a resident of California. As such, and because you are subject to the CCPA, delete all information about me, my vehicle, and other household members from all of your databases. I do not give you permission to collect or store data about me, my vehicles, or my relatives, in any future situation.
[Me] [My address]
They replied today:
Dear [misspelled name, i.e. not copied and pasted],
Your request cannot be completed at this time.
Dear [misspelling again],
Thank you for submitting your privacy request. At this time, we are unable to process this request for the reasons detailed below.
Flock Safety provides its services to our customers, and our customers are owners and controllers of the data Flock Safety processes on their behalf. Flock Safety processes data as a service provider and processor for our customers and as a result, we are unable to directly fulfill your request. We recommend contacting the organization that engaged Flock Safety’s services to submit your request, as they are responsible for assessing and responding to it.
Here are a few additional points about Flock Safety’s data collection and privacy practices:
- Customer Contracts: Flock Safety’s processing activity as a service provider and processor is governed by the contract we have with our customers, which captures their instructions and the limitations on how Flock Safety may process their data. Flock Safety’s customers own the data and make all decisions around how such data is used and shared.
- No Sale of Data: Because Flock Safety’s customers own the data, Flock Safety may only process the data in accordance with our customer’s instructions, as outlined in our contracts with customers. Flock Safety is not permitted to sell, publish, or exchange such data for our own commercial purposes.
- Information Collected: Where Flock Safety’s customers leverage License Plate Reader (LPR) technology, the LPRs do not process sensitive information like names or addresses. Instead, LPRs only capture images of publicly available and visible vehicle characteristics that are taken in the public view.
- Purpose: Flock Safety customers use data for security purposes, including managing public safety or responding to safety concerns and reports. Additionally, such data may be used to help solve crimes and provide objective evidence.
- Retention: By default, Flock Safety’s systems only retain data for 30 days, which means that any data collected on behalf of customers is permanently hard deleted on a rolling 30 day basis. Flock Safety customers are able to adjust this retention period based on their local laws or policies.
For more information about how Flock Safety processes data, please refer to our Privacy Policy and LPR Policy.
Thank you,
Flock Safety Privacy Team
I think that’s legally inaccurate. They’re the entity collecting and processing my personally identifiable information, and my non-lawyer reading of the California Consumer Privacy Act (CCPA) would seem to obligate them to comply with my request. I haven’t decided to engage a lawyer yet, but neither have I ruled it out.