California passes nation’s first IoT security bill – too little too late? (diginomica.com)
44 points by m_eiman on Sept 25, 2018 | 40 comments



This is incorrect. The bill has not been signed into law. It can still be vetoed by Gov. Brown, who has a habit of vetoing bills that come under criticism, and this one has received some criticism for its vague lingo.


Quick thing on “vague lingo”.

Vague lingo is currently accepted, among white-collar crime academics, as absolutely the best path forward for reining in corporate behavior.

When you have very specific lingo, it's extremely easy to circumvent the law. You want to keep the law vague and open so you have lots of room to maneuver when prosecuting. This assumes you trust the government, which, compared with companies, I 97% do. Bruce Schneier's latest book, "Click Here to Kill Everybody", makes the same point [1].

[1] https://www.amazon.com/Click-Here-Kill-Everybody-Hyper-conne...


Counter argument:

Vague language is the best way to enable selective enforcement. Large companies that can afford the best lawyers will be able to find interpretations in their favor; smaller companies can't, and are thus more likely to lose.

And this only gets worse if large companies spend more on lobbying: they can get vague wording that makes the public feel good (like something is happening!) but requires no real effort from the company to comply with.


>When you have very specific lingo, it's extremely easy to circumvent the law. You want to keep the law vague and open so you have lots of room to maneuver when prosecuting. This assumes you trust the government, which, compared with companies, I 97% do. Bruce Schneier's latest book, "Click Here to Kill Everybody", makes the same point [1].

Companies don't have the monopoly on force that the government does. This greatly changes how you calculate who the law should favor. I much prefer specific laws to laws that are selectively applied to whomever the government decides to target, because there is a long history of governments picking their targets in deeply unjust ways.


Having recently encountered vague law - specifically the GDPR - I cannot give the experience a positive recommendation. Being unable to determine what is actually required of your company, what the consequences might be, and thus what the risks actually are makes for an extremely uncomfortable environment.

Core question: what does "reasonable" mean? Do I get to define it, or some information security peer? Is it going to be defined as a matter of convenience by some functionary looking to make quota?

It's my professional responsibility to provide informative and actionable risk assessments and guidance. Crap like this makes that impossible.


The law isn't there to make the company's job easier.


You're absolutely right! Completely, utterly, and unreservedly so.

Laws like GDPR are there to protect people from companies. Not make things easier for them. It's perhaps possible that people might be better protected by laws that are clear than by laws that gesture vaguely in the direction of security. Similarly, people are well-protected by clear automotive safety standards, and poorly protected by the regulation around dietary supplements.

Might that be worth considering?


What makes surveillance-capitalism companies scary is not that they make money through ad targeting; it's that the government can force them to give up their data to do much worse things.


I'm not sure how good this law is. For example, "best practice" is often interpreted as "includes a virus scanner", yet as Tavis Ormandy has shown over and over again, virus scanners add huge attack vectors. On top of that, the very nature of what virus scanners do means the entire OS must allow exactly the type of code injection that malicious software wants.

Yet there are places that will only purchase things that can run antivirus (because there are rules or regulations that require them to run antivirus).


You want to fix this?

Void the clauses by which software and hardware companies disclaim any and all liability. If they make bad things, they should be liable for them.

I would provide an exception: devices that are open sourced, allow users to upload their own firmware, and can be reverted to a known-safe baseline firmware.


>If they make bad things, they should be liable for them.

Define 'bad'.

What one person sees as "insecure and bad" another person will view as "configurable and open".

Ultimately the consumer shares a good percentage of blame.


Really now?

Say I have a Samsung TV with internet. I hook it up to the 'net. I use the built-in features as intended. It updates when it needs to.

Except, for a year and a half, those TVs were vulnerable to a whole range of bad things. They were used as local spying platforms. They were used as jumping-off points into local networks. And they were used in attacks on other networks.

We can quantify the damages that this malfeasant software/hardware did. They certainly weren't "intended features", nor can customers just roll their own signed firmware.

> What one person sees as "insecure and bad" another person will view as "configurable and open".

Depends. Does the default behavior leave you open to a range of unintended effects? Are there known and unpatched errors/bugs?

Yes, you can create configurations that open you to vulnerabilities; running an open SMTP relay is one. But the default Postfix configuration is sane and secure.
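
For illustration, here's roughly what that difference looks like in Postfix's main.cf (a minimal sketch from memory of the stock 2.10+ defaults; treat the exact parameter values as an assumption rather than a quote of the docs):

    # Stock default: only local networks and authenticated users may
    # relay mail; everything else is refused at the relay stage.
    smtpd_relay_restrictions =
        permit_mynetworks,
        permit_sasl_authenticated,
        defer_unauth_destination

    # The "open relay" misconfiguration: relays mail for anyone.
    # smtpd_relay_restrictions = permit

The insecure state requires someone to go out of their way to configure it; the default protects you.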

> Ultimately the consumer shares a good percentage of blame.

That's exactly what I'm disputing. When the customer has little more than a rental agreement dressed up as a purchase, 100% of the blame needs to fall on the party operating the "rental", i.e., the manufacturer.


>Say I have a Samsung TV with internet. I hook it up to the 'net. I use the built-in features as intended. It updates when it needs to.

In your proposed universe, you won't have a "smart tv" because you will choose not to pay the $3500 price tag to cover the security audits and increased litigation insurance that will accompany the manufacture of such devices.

Most Linux distros will also be sued out of existence. Such laws are a dream for authoritarian regimes. Ultimately there will be only a small handful of monopoly powers with enough resources to navigate the regulatory landscape.


Did you accurately calculate the extra expenses needed to create a secure TV? Or did you pull $3500 out of your ass to bolster your argument?


And you're going to the "OMG regulatory hell" argument. It's only been in the last 10 years that we've had idiotic "smart features".

I'm proposing excising these "smart" features from devices that don't need them. A TV/Monitor is and will be useful for the foreseeable future. A TV/Monitor with bit-rotted 'smart features' becomes a disposable device once the maintainer/owner finally abandons it.


>A TV/Monitor is and will be useful for the foreseeable future.

Then just buy a TV / Monitor?


Have you tried this recently, for a high-end device in a category for which a smart alternative exists? E.g. TVs

Without liability, there is no disincentive for manufacturers to include smart features.

I agree that "Use a user-patchable / open source / replaceable stack -> shielded from some liability" or "Lock everything down -> take on legal liability for your own security" is a sane system to put in place.

There's no reason 50%+ of IoT devices (especially Chinese clones) aren't built top-to-bottom open source, aside from manufacturer ignorance / laziness.


And I'm also rejecting this argument that if we don't let software companies shovel whatever shit they want out the door without consequence, then nothing will happen. There is no basis for that whatsoever.


That would just drive up the costs on TVs significantly... I do like that Microsoft appears to be working on smart TVs with pluggable compute modules, so people could replace the compute platform on their TV with one of their preference.


I'd disagree.

The problem with most companies that put some intelligence in dumb devices is that they:

1. Rely on their own cloud platform

2. Lock you in to their specific plan

3. Fail to respond quickly to security incidents

4. Abandon the device when their next incremental version comes out

And most of these devices are the ones that end up as members of a botnet or some other bad network.

The best defense is to make "dumb" devices. No, our microwave, refrigerator, oven, dishwasher, and the rest do NOT need network connectivity. It's hard to hack a machine with no connectivity.


I don't see what your argument is; the more I read it, the more I see it supports my argument that imposing such constraints on smart TVs (just an easy example) would shoot their costs up.

Mind you, I'm NOT advocating for insecure devices; I'm just saying be prepared for monthly service fees or short product lifecycles if this is what is expected. All of which will drive up costs and potentially reduce utility.


> that imposing such constraints on smart TVs (just an easy example) would shoot their costs up.

Or the public, being price sensitive, will, with appropriate legislation, either start killing off 'smart' devices or move to secure, local standards.


That's like saying that inspecting your meat would drive up the price. It is an absolute nonsense argument brought up by people who would prefer to have no responsibility for anything they do, ever.


Nah, inspecting meat is a one-off deal. What is being suggested is that the meat inspector has to certify that your fridge/freezer is holding the right temperature and is liable if the system fails and spoils the meat. With TVs you're looking at a one-off purchase with a long-tail support obligation for years to certify and patch against any and all liabilities. The only way that would be profitable is if every smart TV required a service contract to support such demands; otherwise the model is flawed.


> The only way that would be profitable is if every smart TV required a service contract to support such demands; otherwise the model is flawed.

That's the point I'm trying to make.

Support your shit properly, or keep the devices dumb.

I prefer dumb, so I can add my own smarts without being tied to some shitty proprietary IoT.


That's not the consumer pattern though, and to be honest, you can get dumb TVs :)


If the dumb TV is poorly engineered and sets my house on fire, I can sue the manufacturer for liability. I see no reason whatsoever why I should not be able to do the same thing if their software causes issues.


I'm sorry, but I am rejecting that argument. "Configurable" and "Secure" are not opposite ends of the same axis. They are different axes altogether that have very little to do with each other.


> Void the clauses by which software and hardware companies disclaim any and all liability

It already is void, or at least significantly limited, by law in most jurisdictions. It might be better if we waived statutes of limitations and allowed enhanced damages (e.g., a 3× multiplier) where the basis of liability would reasonably be viewed as within the scope of a disclaimer that purported to disclaim liability that cannot legally be disclaimed, so that there would be a significant disincentive against the currently routine, overly broad disclaimers.


Consumers don't want that.

There is already a kind of hardware and software where companies are liable: medical equipment. Now look at what that does to the prices.

Open source and consumer-serviceable devices make sense, but that's already being fought for by the "right to repair" movement.


The medical field has all sorts of other market failures going on, so it's not really fair to compare its prices in that way.


Look at the prices for hardware certified for permanent installation in an aircraft.


You practically have to have every bolt be traceable back to the ore that the iron for the steel was smelted from (this is only a slight exaggeration). That puts a huge additional cost on every unit shipped. Software is not subject to that: you can certify it once and amortize the cost over a million copies sold. Testing is also much cheaper.


It isn't just the cost of testing. The barrier to entry, no matter how modest, is not what increases the price so much. It's the pricing power acquired from the lack of competition, as the barrier prevents new entrants.

If three brands of GPS mapping device dominate the consumer market, but only one of them bothers to certify for aircraft, the exact same hardware in an aircraft dash-mount will cost many times the amount for a consumer handheld. Costs only determine prices for commodity suppliers. Everyone else charges what the market will bear.


Medical equipment is expensive for reasons other than just liability. Much of it comes from some combination of being quite specialized, protected by medical patents (or other IP law), produced from expensive materials or manufacturing processes, and subject to all sorts of regulations that come with prolonging/saving lives.


You can get most medical equipment from China/India at up to 100x lower prices, but nobody (in first-world countries) would even dare to use it on a patient because of the liabilities... and that's why customers only get products made by certified companies, who accept the liability, consequently tighten their manufacturing, and pay for whatever IP is involved.

The same would happen with any other kind of software/hardware where liabilities are regulated. Suddenly companies would have to tighten their development processes, triple-check for IP, formally verify every component, get certified and insured, and prices to the consumer would skyrocket.


Correlation is not enough to establish causation. Do you have a basis for claiming a causal link?


Well, that's why I feel pretty OK getting into an MRI.

I think consumers will want regulation when they see what it fixes.


How would this affect FLOSS, where pretty much all licenses disclaim liability? I personally believe an engineer bears implicit liability for everything they build, no matter what the license says, but disclaiming it is the status quo.


good time to be in law school ...



