I don't see how this can work at a fundamental level.
You're telling me that to remove my info I have to essentially give (through your service) all these companies & data brokers my info so they can opt me out, and actually trust them that they'll do so? If anything, opting out is a signal that you may actually be of more interest to them than not doing anything. If these companies can also infer that the request is coming from your service (and they will, unless you use random proxies and browser automation), the flag becomes "this person has enough disposable money to pay for such a service" which suddenly increases the value of your profile by orders of magnitude.
How are you going to make money to justify the VC funding? VCs rarely fund boring, sustainable businesses that sell a service and make a slim profit; for them it's all about hypergrowth, but I don't see the potential here - unless of course you start doing the very thing you're currently protecting against. The simple fact that it's VC-funded tells me to steer well clear.
The only proper way to deal with this is with GDPR-like regulation and actually enforcing it - the latter has been lacking in Europe, but thankfully seems like it's somewhat picked up recently.
Anyone can submit opt outs, with or without a service like ours. The vast majority of data brokers do remove the information after an opt out request is submitted. Unfortunately, over time, many data brokers start adding it back. The CCPA (California's Privacy Law) permits a data broker to stop honoring an opt out after 12 months.
To answer your question though, yes, in order to get these companies to remove your info, you have to submit an opt out that identifies who you are. It's a catch-22: otherwise, they would not know who to opt out.
There are a multitude of reasons why people submit opt outs beyond whether or not the person can pay, e.g. victims of domestic violence, police officers, public figures, government officials, members of the military, etc. The data brokers are aware of this and generally have processes to accommodate the requests.
Millions of people use some form of identity protection to protect themselves from identity theft, email spam, phishing, scams, hacking, etc. It is a multi-billion dollar market across the consumer, business, and government levels.
I do agree that we need stronger privacy laws in the U.S. ASAP!
> The CCPA (California's Privacy Law) permits a data broker to stop honoring an opt out after 12 months.
Does this mean users have to keep opting out every 12 months? If yes, that sounds...dumb, doesn't it? Maybe I am not understanding this correctly; is there a reason behind this 12 month period? Even if they wanted to let data brokers add the data back, 12 months seems too short a time. What am I missing?
Why is every bad thing opt out instead of opt in? This is so backwards...
Here's the exact text from Section 1798.135(a)(5) of the CCPA: "For a consumer who has opted-out of the sale of the consumer’s personal information, respect the consumer’s decision to opt-out for at least 12 months before requesting that the consumer authorize the sale of the consumer’s personal information."
Data brokers can often have a very liberal interpretation of what it means for the consumer to "authorize" the sale of their personal information. I would guess 99% of data sales were never actually “authorized” by the consumer to begin with, and are usually done through some backdoor implicit authorization that the consumer has no knowledge of whatsoever.
> Does this mean users have to keep opting out every 12 months? If yes, that sounds...dumb, isn't it? Maybe I am not understanding this correctly, is there a reason behind this 12 month period? Even if they wanted to let data brokers add the data back, 12 months seems too short a time. What am I missing?
Although it often causes harm and is often considered evil, there is a lot of economic value that results from the free flow of data. U.S. lawmakers recognize this, and it's partly why they're so reluctant to pass strict privacy laws like they have in Europe. The other reason is that lobbyists for deep pocketed tech companies water down privacy laws significantly before they can pass. Even with the 12 month expiration, the CCPA is the strictest privacy law in the U.S., and most U.S. citizens have basically no data privacy rights whatsoever.
How long a data broker honors an opt out is highly variable by the data broker. But yes, in general, you have to continue monitoring these companies and re-submitting opt out requests over time, that's what Optery does with its ongoing scanning and removal technology. We have a little more info on this topic here:
Lobbying. That's the reason. There is no rational reason why a user's explicitly expressed preference would become irrelevant after a year.
The organ donor database won't ask you every month to verify that you still want to do it - there is no legitimate reason why this would be 'needed' in this case.
Not in California, but spent some time several years ago working on the tech for a sales arm.
My impression there was that people were pretty conscious of opt-outs and wanted to manage them carefully, if only because the consequences of not doing so could be costly. We spent a lot of time talking about opt-outs and trying to honor them across the different tools they used.
Many prospects came from purchased lists. Some were contracts where some other firm was specifically handling a marketing campaign for us, but others were clearly just purchased from a data broker who, presumably, filtered their data for people who indicated an interest in us.
When we asked how to handle these, the decision was that if you showed up on one of these lists, you had opted in. After all, that is what the broker told us.
This presents the first problem: If you buy data from a broker who you have not specifically engaged to campaign on your behalf, it's a little gray what that data actually represents. This may be resolved by probing into the broker's collection and segmentation more, but as far as I can tell there's not a lot of incentive for the decision makers to care.
Anyways, I was not particularly thrilled with this approach, especially after working with some lists we purchased several times a year that always had a significant number of duplicates time after time. So I wrote that particular import to check for duplicates and filter them out.
Well, this worked great until the prospects, having no recent activity, grew stale and were cleaned from the system. Then when they showed up in a list again, they wouldn't match and would get a fresh new subscription.
Which presents the second problem: If you don't hold on to their identifiable information, you cannot determine if they have previously requested to be removed from your offerings. So it may be dumb to have to opt out again, but the alternative is to trust them not to use your information in the meantime.
> Which presents the second problem: If you don't hold on to their identifiable information, you cannot determine if they have previously requested to be removed from your offerings. So it may be dumb to have to opt out again, but the alternative is to trust them not to use your information in the meantime.
This is just false. You can opt to keep e.g. a hash to check against future additions. Else you also would not be able to comply with the 12 month requirement.
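For illustration, retaining only a screening hash could look something like this minimal sketch (the salt value and the normalization scheme here are my own assumptions, not anything from the thread):

```python
import hashlib

# Hypothetical sketch: store only a salted hash of each opted-out identity,
# so the raw PII can be deleted while future re-imports are still screened.
SALT = b"per-deployment-secret"  # assumption: a fixed, private salt

def identity_hash(first: str, last: str, email: str) -> str:
    # Normalize aggressively so trivial formatting differences still match.
    key = "|".join(s.strip().lower() for s in (first, last, email))
    return hashlib.sha256(SALT + key.encode("utf-8")).hexdigest()

# Opt-out list keeps hashes only, never the original fields.
opted_out = {identity_hash("Jane", "Doe", "jane.doe@example.com")}

def should_import(first: str, last: str, email: str) -> bool:
    return identity_hash(first, last, email) not in opted_out

print(should_import("JANE", "Doe ", "Jane.Doe@example.com"))  # False: matches
print(should_import("John", "Roe", "john@example.com"))       # True
```

The normalization step matters more than the hash itself; as noted below, real profiles are messy, so an exact-match hash only catches records that canonicalize identically.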
I'm not familiar with the rules for data brokers or whether Optery is able to compel anyone to actually delete things, especially if they claim not to have a way to prevent reentry. The platforms I worked with all handled opt-outs with a "Has Opted Out" indicator that could be filtered, in particular on distribution, and deleting profiles was a separate concern, with the only crossover being that they must be kept long enough to honor opt-out requirements.
Personal profiles are messy. People have different names, addresses, emails, and phone numbers, and different people share any of the aforementioned details. This is complicated enough with the raw data, and unless you have very simple profiles or opt-out rules, such as treating the entire household as one, you're going to need more than a simple hash to figure out if there is existing opt-out history.
Profiles may also need to be kept for transactional purposes. Presumably, data brokers and people who have only heard of you through brokers wouldn't have any transactional activity. But everyone else likely needs to deal with opt-outs on the distribution side, making opt-outs a solved problem that doesn't need to happen on intake.
So...sure, you can.
But I wouldn't hold your breath that this will be seen as a reasonable expectation for most developers or businesses to implement.
I think it can be a great service but have the same question as above.
Why did you opt for VC? This seems to be a perfect candidate for a profitable and sustainable business because you don't really need a complicated infra or another 2 years to build an enterprise grade product (unless I am missing something).
> in order to get these companies to remove your info, you have to submit an opt out that identifies who you are.
I wonder if you could build an opt-out service or protocol where you share a hash of your personal information instead of the info itself. With the hash you can identify matching records but you cannot create a record from the hash.
If Optery bought full datasets from databrokers, it could use a hash to identify matching records and submit those back to the brokers for opt-out (this wouldn't work for querying their public APIs...)
Probably not a feasible solution, but it's a fun possibility to think about!
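As a toy sketch of that idea (the choice of fields and their canonicalization below are assumptions for illustration; a real protocol would need both sides to agree on one):

```python
import hashlib

# Toy sketch of the hashed opt-out idea: the user shares only a digest,
# and the holder of a dataset hashes each record the same way to find matches.
def digest(record: dict) -> str:
    canonical = "|".join(record.get(k, "").strip().lower()
                         for k in ("first", "last", "city", "state"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# The user computes this locally and shares only the hex digest.
my_digest = digest({"first": "Jane", "last": "Doe",
                    "city": "Austin", "state": "TX"})

broker_dataset = [
    {"first": "Jane", "last": "Doe", "city": "Austin", "state": "TX"},
    {"first": "John", "last": "Roe", "city": "Reno", "state": "NV"},
]

matches = [r for r in broker_dataset if digest(r) == my_digest]
print(len(matches))  # 1
```

Note this only works if the broker hashes with the exact same canonicalization, and, as the next comment points out, hashes of low-entropy fields like names are easy to brute-force.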
> With the hash you can identify matching records but you cannot create a record from the hash.
I always chuckle when I see someone saying this. A buddy of mine (email marketer) tried to convince me that his 50 million large email database is "well protected" because they are using "industry standard md5 encryption". Of course it's not encryption but rather hashing. He was so sure of his, erm, encryption, that he sent me the whole database and said "here, crack it".
I found a large hacked Facebook email database online, ran a few Python scripts to generate the most common combinations of usernames (name+numbers, numbers+name, numbers+some random chars, etc.) and some 1,000 generic email domain names (like gmail.com, yahoo.com, etc.). It took my regular i9 five days to go through the whole 50M of md5s and compare each combination of username + domain name. Oh boy, his shock when I returned some 70% "unencrypted" plain text emails back to him :)
Bottom line is, if there is some "industry standard" way of hashing data, then there are ways to unhash it. Yes, in many cases it may take millions of years to cycle through all possibilities, but if your standard is first name + last name + email address (and all caps), then you can easily plug in a database of names, download millions of email records online, and narrow down your hashing search greatly.
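The attack described above boils down to a small dictionary loop. A minimal sketch (with a made-up target address and tiny candidate lists, standing in for the real name lists and 1,000 domains):

```python
import hashlib

# Why unsalted MD5 of emails is reversible in practice: enumerate plausible
# addresses and compare digests against the leaked hash set.
leaked_md5s = {hashlib.md5(b"jane.doe1984@gmail.com").hexdigest()}

# Tiny illustrative dictionaries; the real attack used scraped name lists
# and ~1,000 common domains.
first_names = ["jane", "john"]
last_names = ["doe", "smith"]
years = range(1980, 1990)
domains = ["gmail.com", "yahoo.com"]

recovered = []
for f in first_names:
    for l in last_names:
        for y in years:
            for d in domains:
                candidate = f"{f}.{l}{y}@{d}"
                if hashlib.md5(candidate.encode()).hexdigest() in leaked_md5s:
                    recovered.append(candidate)

print(recovered)  # ['jane.doe1984@gmail.com']
```

The candidate space here is only a few dozen strings, but even the full realistic space (names × patterns × domains) stays small enough to exhaust on one desktop CPU, which is exactly the point of the anecdote.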
What's the point? The company you are asking to remove data will know anyway which dataset you asked to delete.
What would work would be a mandatory self block list (e.g. most EU countries have a 'do not call' opt-in list against telemarketers), where the brokers have to check any new data against the hashes of data points in this block list. But that requires government intervention, not startups that bandaid the problem.
> You're telling me that to remove my info I have to essentially give (through your service) all these companies & data brokers my info so they can opt me out, and actually trust them that they'll do so?
Even if one ignores intentional misuse, simple incompetence by a data broker seems like enough to cause a problem. It only takes one data broker to commingle fields from opt-out requests with existing data (and then share/sell/trade that existing data) for the opt-out fields to spread.
That's certainly one way to look at it. Another way to look at it is that if you do nothing, you information will continue to persist, multiply, and propagate unchecked. Those that take the time (or money) to remove the profiles have dramatically reduced online footprints, which is why these types of services are becoming more and more popular. Many companies are starting to mandate that their employees use services like ours to strengthen their security posture and reduce exposure to phishing, hacking, email spam, etc.
> you[sic] information will continue to persist, multiply, and propagate unchecked
Based on how the underlying problem is presented in your own business model, it seems like this will happen regardless of whether a customer uses this service or not. Like conscripting a recruit into a losing battle (and arguably an unwinnable war) at cost to oneself, with little specificity as to what qualifies as success.
I’m all for spending the time or money to remove my profile. I’d rather also spend a little extra time to see where I’m listed and evaluate that organization, though.
> I have to essentially give (through your service) all these companies & data brokers my info
Happy user of Optery here: you don't have to give Optery a whole lot of info. No SSNs or anything, just things they would search by to help you remove them. Are there fields for past addresses or people you lived with or past names you may have had? Sure. Do you have to fill them out? No. Are they provided in bulk to the data brokers? No.
So they will enter all this data on the websites of the brokers which sell such data...this reminds me of the website that promises it contains all/most bitcoin private keys - just enter yours to find out if it's there!
In the vast majority of cases Optery only submits the opt out (a) if we've already located your profile at the data broker, meaning they already have your information to begin with, and (b) with the minimum information necessary to complete the opt out, generally First Name, Last Name, Age, Current City, and Current State - no more information than what's already publicly available online.
> You're telling me that to remove my info I have to essentially give (through your service) all these companies & data brokers my info
How can you index into a hashmap, an array, or a DB table without a key? Answer: you cannot.
There's no way for data-broker opt-outs to work without uniquely identifying the individual who wishes to be removed.
Sure, I agree that "GDPR-like regulation and actually enforcing it" is the proper solution - but how long will that take? Five years? Ten?
What if I want my personal data removed from these brokers now?
Edit: now that I think about it - you could build some scheme where you give the data brokers a cryptographic hash of some personally-identifying information, so if they don't already have you in their database, then they can't get your information. But, in order to do that, you'd need regulation equivalent to the GDPR (otherwise they'd never do it), in which case the above argument still applies.