I read this and immediately thought "oh shit, yet another regulation for a small bootstrapped software business where we try to be honest while the big guys will still find a way to circumvent it". Thankfully, I looked into the fine print and was wrong. This bill only applies to corporations that do $50,000,000 or more in revenue OR (EDITED from AND) have info on 1,000,000 or more customers. Of course, I don't mean to imply that smaller tech companies are all saints, but this bill clearly seems to be drafted for the bigger fish who are powerful enough to blatantly ignore privacy laws.
EDIT: I clarified one thing user downandout pointed out: it could apply to app developers who have a million users as well, but overall it should not affect bootstrappers or smaller tech businesses that don't hold that much consumer data, so my overall point still stands.
So then big corps do all their customer-information-risking behavior in spun-out, wholly owned small subsidiaries, or even arm's-length, non-owned ones. If the subsidiary succeeds, they acquire it for some fixed amount; if it screws up under this law, the company just folds and the parent is free from financial damage.
I am not an expert bill reader, but it seems this loophole is covered, as there is another point that says:
"(ii) is not substantially owned, operated, or controlled by a person, partnership, or corporation that does not meet the requirements under clause"
Tbh that's still a better outcome than the status quo, isn't it? A single hack is now limited in damage, and multiple hacks are required to do the equivalent of today's scenarios.
What if everyone contracts out the data collection to a single party? That would only give the data a larger exposure surface.
I'm really glad this is coming to light and I hope it passes. It will be very interesting to see how companies try to avoid it, but at first glance, it seems well thought out.
Software engineers tend to assume that (a) they are the smartest in the room, (b) legislation is exactly like code, (c) caselaw doesn't exist and (d) nobody has ever had to write complex rules before software was invented. All of which, we feel, qualifies us to poke holes in any legislation we happen to stumble across.
But, of course, it doesn't. Not even close.
By way of analogy, if someone looked at a link to a github repo and said "yeah well I can't see any GOTO 10 lines, I bet this will crash when the IP trace becomes Apache'd" I hope someone would be patient and polite in explaining the several levels of wrongness involved.
There are jurisdictions where caselaw doesn't exist (which a lot of software engineers conveniently forget), or where law is written differently from Anglo-American legislation (which software engineers also conveniently forget).
That the doctrinal basis is different does not change that law requires interpretation by experts using something more sophisticated than "it's just code".
Civil jurisdictions don't have caselaw as an independent source of legal rules, but they still have cases and they still have interpretation. It is not uncommon for an ancient Roman jurist's opinion to be cited in argument in Scots law, just as it is not uncommon to cite very old English cases in Australian law. And it is also common to have civil codes in common law jurisdictions -- the criminal law I was taught was based on a statutory instrument which asserted itself to be the whole of the criminal law and which was frequently amended whenever courts began to accrete rulings around it.
That ultimately doesn't matter. The bill has zero shot at passing and becoming law. Wyden doesn't have anywhere near the votes he would need, so it's essentially his fantasy idea of a bill.
There's no privacy law that is going to get passed by the US Government, focused on large corporations, that involves sending violators to prison for up to 20 years.
You read that wrong. In order to NOT be a "covered entity," you must meet ALL of the following criteria:
1) Revenue of less than $50 million; AND 2) Must not have info on 1 million or more people; AND 3) cannot be a data broker
That means an independent app developer who gets more than 1 million installs, or a website with more than 1 million users, IS a covered entity, regardless of revenue. Also, ANY "data broker," regardless of size, is covered.
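To make the reading above concrete, here is a minimal sketch of the exemption logic as quoted (De Morgan's law: you must satisfy ALL three criteria to be exempt, so failing any single one makes you a covered entity). The threshold values are the ones quoted from the draft; the function and parameter names are my own illustration, not from the bill.

```python
def is_exempt(revenue_usd: int, consumer_records: int, is_data_broker: bool) -> bool:
    """Exempt from the bill only if ALL three criteria hold."""
    return (
        revenue_usd < 50_000_000        # 1) revenue under $50 million
        and consumer_records < 1_000_000  # 2) info on fewer than 1M people
        and not is_data_broker            # 3) not a data broker
    )

def is_covered_entity(revenue_usd: int, consumer_records: int, is_data_broker: bool) -> bool:
    # Covered whenever any one exemption criterion fails.
    return not is_exempt(revenue_usd, consumer_records, is_data_broker)

# A low-revenue app with 1M+ installs is still covered:
print(is_covered_entity(100_000, 1_500_000, False))  # True
# A data broker of any size is covered:
print(is_covered_entity(10_000, 500, True))          # True
```

This is just the "covered entity" test on pages 4-5 as described in this thread, not the compliance obligations that follow from it.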
This info is on pages 4 and 5.
Edit: How is a factual comment getting downvotes? OP read it wrong, and I told him so. There's nothing in this comment to disagree with. There are only facts.
No, it would kill all American software/web startups by limiting them to 999,999 users, unless they have millions of dollars in VC funding that they can use to comply with this law. It would strangle the startup community, as most startups (even those with 1M+ users) can never hope to have the resources to comply. You have to remember that the reason that startups get any funding is because investors hope that they will be highly successful. If there is a guarantee of a huge compliance bill once the company reaches 1M users, far fewer companies will be funded.
Getting 999K users, while a milestone, isn't necessarily enough to raise the money necessary to comply with this law.
I'm not GP, but it looks like the more burdensome things are on pages 26-33, and they are too lengthy to post here. I can see compliance costing significant sums that would be out of reach to a typical startup.
The question is not "is there anything that would clearly be burdensome?", but "am I confident enough that I am complying with these items, as retroactively interpreted by regulators?"
You need to pay a lawyer to evaluate that for you. That's the cost, no matter who the bill's sponsor says it is intended to target.
Can you cite an example of one of these requirements that you wouldn't be confident in being able to comply with? Also: how much do you think a legal consult costs? For any one item, I think we're talking a couple hundred bucks.
Almost all of the language in the section we're referring to applies to just one requirement: making the data tech companies retain about consumers available to those consumers upon request. That's something responsible companies already do, many because regulation already requires them to.
Just popping in to say that I've never had an invoice less than $800 from any firm I've ever hired, and those were very small things. Data privacy-related stuff tends to be way more complicated, and if my company didn't have me, basic data privacy stuff would cost them tens of thousands a year just in legal.
Most competent lawyers cost $400+/hr. For them to review your internal compliance policies and procedures (including your opt-in/opt-out procedures, etc.), privacy policy, and so on, you could easily be looking at a few hundred hours. That doesn't include the external auditors that the bill wants you to have.
As you said in one of your comments, fortunately this bill as written will never come into law, both due to its implications, and the fact that its author is a single member of a minority party. This is one instance in which I am happy with our system of government.
This proposal doesn't require companies to do formal internal compliance reviews. It's not SOX or GLBA. For most startups, the legal overhead here would probably amount to a few phone calls with a lawyer.
My read is that it's less onerous than the California privacy statute that already covers a huge fraction of tech startups.
We do both security and privacy engineering work for our clients, most of whom are encumbered in one way or another by regs, and it is not the norm for legal to do line-item review of policies and procedures. SOC2 Type 1 audits are much closer to a mainstream practice, would almost certainly satisfy the "data protection" requirements in any rule the FTC would come up with, and certainly do not involve "a few hundred hours" of legal.
That's just not accurate. You should read pages 26-33 in detail. It wants external auditors to come in, and while consultation with a lawyer isn't required, companies would effectively have to use them to review everything they do, lest they be found non-compliant. That could easily range into hundreds of hours of legal work.
I believe I'm one of the "auditors or independent technical experts" this bill refers to (trust me, we don't need Wyden's help getting work), and for the most part the only time we talk to client legal is when we're negotiating our contract. Note also the "if reasonably possible" attached to getting external assessment.
You're referring to that specific provision, but again you aren't considering the fact that any business interested in complying will have to have an attorney review the law, and then review all aspects of their business, software implementation, and policies/procedures in order to ensure they are compliant. That's not a requirement of the law, but how else can they ensure that they are compliant?
It starts with the claim that this law could put Flappy Bird on the hook for decades of prison time. I rebut, and you say (paraphrased) "no, read the law, anyone with 1MM users could be sent to prison for failure to comply". This is obviously not true.
Then the claim becomes that pp26-33 of the statute has so many burdensome requirements that it would be impracticable for many startups to comply. I ask for specifics; none emerge. Instead, a new claim appears: every startup would be on the hook for "a couple hundred hours" of legal to verify their compliance.
But the proposal as stated doesn't require formal compliance reviews, making it hard to support an argument that this proposal would somehow cost more than many other regulations that do have that requirement, and for which my firm has done significant engineering and compliance work without spending a hundred hours talking to legal.
But, no, it turns out that's not the argument. The real argument is that the proposal requires auditors, for which legal will have to be deployed prophylactically. Now, the proposal does not in fact have an auditor requirement, but also, the clause that discusses auditors goes out of its way to make it clear that the types of third parties they're referring to are technical experts, which startups already use.
So the argument changes again. Now the argument is that regardless of the specific construction in the proposal (again, these specifics were all brought to the discussion by you!), it would be prohibitively expensive for startups because a lawyer would have to take time to verify the meaning of the law for the startup.
I point out that this is an argument that applies equally to pretty much any privacy or security law, and you respond that this one is a special case because of the prison time and fines (the "breathtaking" fines are part of the same clauses as the prison liability) --- thus resurrecting the original false claim.
This doesn't read to me like a good-faith argument.
It's of course fine to make the argument that any new regulation would impede startups and would therefore not be worth the trouble (there are other arguments against this proposal you could just as easily make; for instance, that the field isn't mature enough for us to have the FTC use rulemaking authority to establish cybersecurity requirements for startups).
But if those are the kinds of arguments you're making, make them. Don't move the goalposts.
> It starts with the claim that this law could put Flappy Bird on the hook for decades of prison time. I rebut, and you say (paraphrased) "no, read the law, anyone with 1MM users could be sent to prison for failure to comply". This is obviously not true.
Actually, with specific regard to Flappy Bird, it is true, because it had more than 100 million installs, far surpassing the 50 million-user threshold that exposes him to criminal as well as civil penalties.
> Now, the proposal does not in fact have an auditor requirement, but also, the clause that discusses auditors goes out of its way to make it clear that the types of third parties they're referring to are technical experts, which startups already use.
I'm not sure what you mean here. There is an auditor requirement "where reasonable," and presumably "reasonable" would be entirely up to a court's discretion. Also, "technical experts" in the context of this law wouldn't necessarily be the developer of the site, but rather technical experts trained in complying with this law. Likely, that means someone brought in by a law firm or professional auditing outfit, at enormous expense.
No, you're still not correct, because the problem with your claim isn't simply that you have to be a larger company to face prison time, but that there's only one offense in the bill that includes that threat: knowingly certifying fraudulent data protection reports. I'm like the 4th person on this (broader) thread to point that out, and this is at least the 3rd time I've pointed it out to you.
By the way, did Flappy Bird even collect NPI? Or is this an even sillier example?
> there's only one offense in the bill that includes that threat: knowingly certifying fraudulent data protection reports.
That's what it says, but one would have to believe that failing to file such reports would also be a criminal violation in any final draft of the bill. Otherwise what would be the point of the bill? Does it make sense to you that they would have a bill like this, and provide a simple way to avoid it: just don't file? That appears to be an oversight by the author, but one that would undoubtedly be fixed.
> By the way, did Flappy Bird even collect NPI?
Since this bill uses a vague and legally untested definition of "personal information," simply maintaining weblogs containing IP addresses could trigger this.
"It costs money to comply" is a tired argument that gets trotted out every time a new business regulation gets proposed, no matter what the regulation is. All regulations cost money to comply with. Are you arguing against all business regulations?
If your problem is actually with the number of users, please propose a specific number of users that would make a better limit.
EDIT: This is also the exact same kind of "sky is falling" rhetoric that came with HIPAA. Lo and behold, we still have large and small doctors' offices, and have taken the first steps towards actually protecting customer health records.
There are (I have reason to believe) plenty of non-VC funded social media apps. Easy to make, and people think they’re going to be as big as Facebook. I think that type of app will still get made, as people who can’t sort out funding are people I don’t expect to know the law.
Only as much as the parent's argument was a non sequitur.
Parent mentioned startups with lim -> 1 million users unable to comply. If they can't find VC money at the million user stage, then perhaps they didn't have a viable model in the first place.
(Plus, were all the startups I mentioned "VC funded" from the first user? If not, the argument holds for them too.)
> gets more than 1 million installs, or a website with more than 1 million users,
Incorrect; that would only be true if they collected and stored personal info on their users. Hopefully we see more apps and websites stop collecting this info, or at a minimum start purging the data (i.e., if you visited a site once 5 years ago, they should not still have your data, but many do).
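The purging suggested above is just a retention window. As a minimal sketch, assuming an in-memory list of records with a `last_seen` timestamp (the schema and field names are illustrative, not from any real system):

```python
from datetime import datetime, timedelta

# Drop data for users not seen within the retention window
# (~5 years, matching the example above).
RETENTION = timedelta(days=5 * 365)

def purge_stale(records: list[dict], now: datetime) -> list[dict]:
    """Keep only records whose last_seen falls within the retention window."""
    cutoff = now - RETENTION
    return [r for r in records if r["last_seen"] >= cutoff]

now = datetime(2018, 11, 1)
records = [
    {"user": "a", "last_seen": datetime(2013, 1, 1)},   # one visit 5+ years ago
    {"user": "b", "last_seen": datetime(2018, 10, 1)},  # recently active
]
print([r["user"] for r in purge_stale(records, now)])  # ['b']
```

In a real system this would be a scheduled delete against the datastore rather than a list filter, but the policy decision (the cutoff) is the same.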
> (12) PERSONAL INFORMATION.—The term "personal information" means any information, regardless of how the information is collected, inferred, or obtained, that is reasonably linkable to a specific consumer or consumer device.
So if you use an email address for your users to log in, or even a username, you are collecting personal information based on the vague nomenclature of this law. Device IDs might be the case too (though I think Apple at least now gives a per-app id, which means it can't be linked to external sources much now iirc).
That's analogous to a construction already in place in California privacy law, which is more prescriptive and onerous than this law is, so it's hard to make the argument that this federal act would wreck the startup ecosystem.
You are posting like this is a bad thing; I think it is great. Companies need to be held accountable for gobbling up personal data, and should be discouraged from collecting anything, including email addresses. I get enough spam, thank you.
A lot of people can be tracked down by their online aliases unless the person has gone through a lot of work to make sure they keep their aliases separate (which isn't as easy as it sounds). That means that by definition most aliases and usernames can be googled to find a specific person and thus would fall under personal information according to the law.
You think email addresses are bad, but the masses hate having to remember usernames. Emails aren't used for logins because of marketing (those systems already had email verification in addition to usernames, so email was required anyway). Furthermore, without an email it becomes impossible to recover your account if you forget your password. If the app has no personal information and no email, and you forget your password (which most general users do), then it's impossible to recover your account.
So this law isn't going to discourage companies from collecting your email address, it just has the potential to add burdens to companies that end up with 1 million signups (even for a free website).
You have said a lot of words, none of which changes my opinion.
Using email as either verification or username has always been lazy and insecure, and it should stop.
If normies can't recover their Candy Crush account and need to sign up for a new one in order to protect privacy, I am fine with that, if it means companies like King stop collecting data.
I am security- and privacy-first; convenience and "free" are about 1000000000000000th on my list of importance. If a lot of free sites die, that is the price we pay for better data security and privacy. I am fine with that.
This has nothing to do with me... if more people were like me, Facebook, Twitter, etc. would not exist at all, and Google would have a massively different business model, more like when they started than what they have become.
I do however want ownership over my data, the right to demand these companies tell me what they collect on me (often without my permission; see Facebook's shadow profiles on people who don't have accounts), and the right to demand they delete said data.
If you don't put the information out there, then they won't have it. It's a simple cause and effect relationship. Not giving up this information will likely cause you to not be able to use some services, however you have no inherent right to access services run by private parties. They are offered to you under certain conditions, and if you choose not to comply with those conditions, you are free to not use the service. With regard to "shadow profiles," simply use incognito mode if you are this worried about it.
Let's be clear that this law won't pass, certainly not as it is written. In the US, it's perfectly legal for websites to track your behavior. Should you object to this, you have a simple remedy: use incognito mode.
So the app developer has to be able to demonstrate they followed some form of best practice with regard to user data.
I think you're downplaying the requirements of this law. You should read it, it's pretty onerous and carries decades in prison with it - even GDPR didn't go that far.
One interesting caveat, however, is that at least as written, I can't find anything imposing penalties for simply not filing the reports this law claims to require after all of the expensive audits etc it wants. It only imposes penalties for lying on the reports. I'm not sure if that was an oversight on the author's part or if that's intentional though. Any final version would likely "fix" that issue.
Again, I believe this is simply false. The provision carrying "decades in prison" applies only to companies making over a billion dollars in revenue, and only in the very limited case where a particular officer of the company knowingly mis-certifies a report to the FTC.
The plain language of the law says that your interpretation is not correct. The criminal provisions apply to companies with over $1 billion in revenue or those that have 1M or more users. That would expose a much larger range of independent developers to decades in prison.
A closer reading indicates you seem to be correct that the criminal provisions only apply to those larger entities. However, ALL of the provisions in pages 26-33, which are significantly burdensome, still apply to all covered entities, which you can become by just having 1 million user accounts.
I corrected my AND to an OR, so yes, good point, but overall it still doesn't impact the "entire startup community". There are plenty of tech businesses that don't hit $50 million in revenue AND don't have a million users. I am talking about those.
True, but the issue is that getting 1M+ installs isn't under the control of the developer. Sometimes things go viral - look at Flappy Bird. Under this law, that guy (if he were in the US) could be looking at decades in prison unless he took enough investment money to comply.
This law also uses a very broad definition of "personal information" that could possibly include IP addresses. So it does have an effect on the entire community, in the sense that every startup hopes to surpass 1M users, and this law will punish them for it.
I don't think it will pass as is, but if it does, this is truly a "sky is falling" moment for US startups. Because it effectively limits non-VC backed startups to less than 1M users, it also makes sure that there will never be a competitor to Facebook or Google. Like GDPR, it locks in entrenched competitors.
Flappy Bird might have a million customers, but he could opt not to collect information on those users. If Apple/Google had the information, that is their issue, not the app developer's.
No, he couldn't. I think you need to read the draft more carefully. Flappy Bird, in the scenario you describe, is explicitly exempt from imprisonment under this proposal.
That seems to be an entirely incorrect interpretation. Any app with more than 1 million users would fall under this law. You're simply reading it wrong, as the OP of this thread initially did. Any entity with personal information - as that term is (very broadly) defined in this document - on more than 1 million or more users is fully exposed to its civil and criminal penalties. This includes developers that just get lucky and get 1 million or more installs, and who have no way to pay for compliance.
No, I think you're confused. Having 1MM users makes you a "Covered Entity" in this draft. But "Covered Entities" aren't required to file data protection reports to the FTC until they make $1B in revenue or have 50MM users.† And, again: the "decades of imprisonment" 'downandout is talking about refers only to the crime of deliberately misreporting those data protection reports. It is not the case that any failure to comply with this law has prison time attached.
Happy to be wrong about this; if I am, please offer a cite.
You are both right and wrong. Flappy Bird indeed had over 50MM users (in fact it had over 100MM users), and therefore its owner would have criminal liability under this law regardless of revenue. However you are right that the lower limit excludes people from criminal penalties. Having just 1MM user accounts still exposes them to the full brunt of the civil penalties available under this law that could easily bankrupt them. So if you have between 1MM and 50MM users, you won't go to prison, you'll just be broke.
I hope it passes. If there are significant problems, updates and changes can always be made. As for your comparison with the GDPR, I think it's way too early to start drawing conclusions.
The way GDPR hurt smaller companies wasn't so much that the regulation directly applied to smaller companies, but that larger companies, due to the liability the regulation presented, no longer felt comfortable working with smaller vendors who they felt couldn't provide the assurance of GDPR compliance. External vendors became so risky that everyone reduced the number of vendors and partners they wanted to work with (either directly or indirectly), which had the cascading impact of smaller companies being cut out of the ad ecosystem and Google becoming much more dominant.
Now I'm not yet familiar with this draft nor with how it's likely to evolve from this point on but you have to do one or the other. You make the regulation overarching enough that partners' lack of compliance affects your own compliance efforts, in which case smaller players are going to have a hard time competing and will be cut out of the ecosystem by legit big companies who have more to lose than you do. If you don't do that and it becomes relatively easy to outsource data-related liability to your partners, there will be a loophole where shady data collection and processing activities will tend to aggregate and move towards entities that are either exempt from regulations or at least willing to take on liability for their customers for short-term gains. The latter is distinctly worse than the status quo because you're making the problem worse. The former is arguably bad for the startup ecosystem.
My understanding is that smaller players are, in the aggregate, much worse about pretty much everything and the effectiveness of GDPR-style regulations depends on the impact they have on smaller players who are much further away from compliance than the big tech companies who were for the most part never too far away from compliance and whose bottom line never depended on any of the alleged shady practices. Most people aren't aware of smaller ad-tech players or data vendors so often their (and their customers' or partners') wrong-doings are blamed on the big tech companies that are much more visible.
To a large extent, what happened in the ad ecosystem is that smaller shadier ad tech companies forced everyone else to become shadier - if they are out there promising marketers more data, better tracking and more accurate measurements and they are accomplishing that through questionable practices, that still raises the bar for what marketers expect and they are able to force their way into integration with other platforms or at least force everyone to do similar things. Ultimately, there's a trade-off between transparency and measurements for marketers and privacy and a world dominated by lots of small players who don't trust one another is one where marketers are forced to require proof that their money is being well-spent, which means more privacy-defeating tracking and measurements.
Below is the link to the details on this bill:
https://www.wyden.senate.gov/news/press-releases/wyden-relea...