I'm not GP, but it looks like the more burdensome things are on pages 26-33, and they are too lengthy to post here. I can see compliance costing significant sums that would be out of reach for a typical startup.
The question is not "is there anything that would clearly be burdensome?", but "am I confident enough that I am complying with these items, as retroactively interpreted by regulators?"
You need to pay a lawyer to evaluate that for you; that's the cost, regardless of whom the bill's sponsor says it's intended to target.
Can you cite an example of one of these requirements that you wouldn't be confident in being able to comply with? Also: how much do you think a legal consult costs? For any one item, I think we're talking a couple hundred bucks.
Almost all of the language in the section we're referring to applies to just one requirement: making the data tech companies retain about consumers available to those consumers upon request. That's something responsible companies already do, many because regulation already requires it of them.
Just popping in to say that I've never had an invoice less than $800 from any firm I've ever hired, and those were very small things. Data privacy-related stuff tends to be way more complicated, and if my company didn't have me, basic data privacy stuff would cost them tens of thousands a year just in legal.
Most competent lawyers cost $400+/hr. For them to review your internal compliance policies and procedures (including your opt-in/opt-out procedures), privacy policy, and so on, you could easily be looking at a few hundred hours. That doesn't include the external auditors that the bill wants you to have.
As you said in one of your comments, fortunately this bill as written will never become law, both because of its implications and because its author is a single member of a minority party. This is one instance in which I am happy with our system of government.
This proposal doesn't require companies to do formal internal compliance reviews. It's not SOX or GLBA. For most startups, the legal overhead here would probably amount to a few phone calls with a lawyer.
My read is that it's less onerous than the California privacy statute that already covers a huge fraction of tech startups.
We do both security and privacy engineering work for our clients, most of whom are encumbered in one way or another by regs, and it is not the norm for legal to do line-item review of policies and procedures. SOC2 Type 1 audits are much closer to a mainstream practice, would almost certainly satisfy the "data protection" requirements in any rule the FTC would come up with, and certainly do not involve "a few hundred hours" of legal.
That's just not accurate. You should read pages 26-33 in detail. It wants external auditors to come in, and while consultation with a lawyer isn't required, companies would effectively have to use one to review everything they do, lest they be found non-compliant. That could easily range into hundreds of hours of legal work.
I believe I'm one of the "auditors or independent technical experts" this bill refers to (trust me, we don't need Wyden's help getting work), and for the most part the only time we talk to client legal is when we're negotiating our contract. Note also the "if reasonably possible" attached to getting external assessment.
You're referring to that specific provision, but again you aren't considering the fact that any business interested in complying will have to have an attorney review the law, and then review all aspects of their business, software implementation, and policies/procedures in order to ensure they are compliant. That's not a requirement of the law, but how else can they ensure that they are compliant?
It starts with the claim that this law could put Flappy Bird on the hook for decades of prison time. I rebut, and you say (paraphrased) "no, read the law, anyone with 1MM users could be sent to prison for failure to comply". This is obviously not true.
Then the claim becomes that pp26-33 of the statute has so many burdensome requirements that it would be impracticable for many startups to comply. I ask for specifics; none emerge. Instead, a new claim appears: every startup would be on the hook for "a couple hundred hours" of legal to verify their compliance.
But the proposal as stated doesn't require formal compliance reviews, making it hard to support an argument that this proposal would somehow cost more than many other regulations that do have that requirement, and for which my firm has done significant engineering and compliance work without spending a hundred hours talking to legal.
But, no, it turns out that's not the argument. The real argument is that the proposal requires auditors, for which legal will have to be deployed prophylactically. Now, the proposal does not in fact have an auditor requirement, but also, the clause that discusses auditors goes out of its way to make it clear that the types of third parties they're referring to are technical experts, which startups already use.
So the argument changes again. Now the argument is that regardless of the specific construction in the proposal (again, these specifics were all brought to the discussion by you!), it would be prohibitively expensive for startups because a lawyer would have to take time to verify the meaning of the law for the startup.
I point out that this is an argument that applies equally to pretty much any privacy or security law, and you respond that this one is a special case because of the prison time and fines (the "breathtaking" fines are part of the same clauses as the prison liability) --- thus resurrecting the original false claim.
This doesn't read to me like a good-faith argument.
It's of course fine to make the argument that any new regulation would impede startups and would therefore not be worth the trouble (there are other arguments against this proposal you could just as easily make; for instance, that the field isn't mature enough for us to have the FTC use rulemaking authority to establish cybersecurity requirements for startups).
But if those are the kinds of arguments you're making, make them. Don't move the goalposts.
It starts with the claim that this law could put Flappy Bird on the hook for decades of prison time. I rebut, and you say (paraphrased) "no, read the law, anyone with 1MM users could be sent to prison for failure to comply". This is obviously not true.
Actually, with specific regard to Flappy Bird, it is true, because it had more than 100 million installs, far surpassing the 50 million threshold that exposes its developer to criminal as well as civil penalties. So, in contrast to your statement, it actually is true.
Now, the proposal does not in fact have an auditor requirement, but also, the clause that discusses auditors goes out of its way to make it clear that the types of third parties they're referring to are technical experts, which startups already use.
I'm not sure what you mean here. There is an auditor requirement "where reasonable," and presumably "reasonable" would be entirely up to a court's discretion. Also, "technical experts" in the context of this law wouldn't necessarily be the developer of the site, but rather technical experts trained in complying with this law. Likely, that means someone brought in by a law firm or professional auditing outfit, at enormous expense.
No, you're still not correct, because the problem with your claim isn't simply that you have to be a larger company to face prison time, but that there's only one offense in the bill that includes that threat: knowingly certifying fraudulent data protection reports. I'm like the 4th person on this (broader) thread to point that out, and this is at least the 3rd time I've pointed it out to you.
By the way, did Flappy Bird even collect NPI? Or is this an even sillier example?
there's only one offense in the bill that includes that threat: knowingly certifying fraudulent data protection reports.
That's what it says, but one would have to believe that failing to file such reports would also be a criminal violation in any final draft of the bill. Otherwise, what would be the point of the bill? Does it make sense to you that they would write a bill like this and provide a simple way to avoid it: just don't file? That appears to be an oversight by the author, but one that would undoubtedly be fixed.
By the way, did Flappy Bird even collect NPI?
Since this bill uses a vague and legally untested definition of "personal information," simply maintaining weblogs containing IP addresses could trigger this.
It is often the case under US law that failing to file paperwork is treated as a much less serious act than filing fraudulent paperwork. If you fail to file a tax return, you're nearly always assessed a penalty (it's a misdemeanor). If you file a fraudulent tax return, you can easily go to prison for a long time (it's a felony).
The original discussion was about the bill draft as it stands - not what it might become in the future - so how is it not "moving the goalposts" to build an argument on speculation about a future draft?