Hacker News | comments

Trying to fight a local attacker with root (which is necessary to add a certificate to the trust stores on most platforms) isn't worth the effort. It's easy for the admin to bypass and would cause even more warning fatigue.
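To make the "requires root" point concrete: most TLS clients consult an OS-managed trust store that only an administrator can modify. A minimal sketch using Python's stdlib (just as a stand-in for any TLS client; the exact counts vary by system):

```python
import ssl

# Build a client context the way most TLS clients do: it is populated
# from the OS-managed trust store, which only root/admin can modify.
ctx = ssl.create_default_context()

# cert_store_stats() reports how many CA certificates were loaded;
# a corporate interception CA added by an admin would be counted here
# alongside the publicly trusted ones, indistinguishable to the client.
print(ctx.cert_store_stats())
```

Once an admin-installed CA sits in that store, certificates it signs validate exactly like any other, which is why fighting that admin from inside the browser is a losing game.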

That's not to say I disagree with the sentiment that this is something employers (and other organizations providing access to devices) should be obliged to disclose, but that is perhaps more of a legal and educational issue.




> Trying to fight a local attacker with root (which is necessary to add a certificate to the trust stores on most platforms) isn't worth the effort.

Hah. That's precisely the argument I made when arguing that there should be an opt-out for add-on signature verification (gated behind admin permissions if they insist), because you've already utterly lost the security game if someone has admin on the machine.

But no, they argue that they must defend against malware with admin permissions injecting add-ons into the browser. Because that's a fight worth fighting, and the perception of the browser's security is somehow more important than user freedom.


I agree. But the reason they felt forced to do this is that even "reputable" software companies were auto-injecting unnecessary extensions as a side effect of installing their popular software. Companies like Adobe and Microsoft, and "industry leading" "computer security" companies.

My first instinct is to say "it's important not to install crap software; you need to reasonably trust the software you install". But I immediately recognize that it's unintuitive that Adobe, Microsoft, Symantec, and McAfee are not on the "trusted" list. (Office and .NET have silently installed problematic Firefox extensions in the past.)

I don't really have a conclusion here, just, it sucks.


The problem in the field was application installers quietly "side loading" browser plugins.


Either those are malicious, in which case you lost the game and cannot defend against that because they have the security high ground, or they legitimately act on the user's behalf.

The "quietly" adjective suggests they are malicious. Which means they should be reported to AV vendors (including microsoft) instead of being used as a boogeyman when arguing against user freedoms.


Yes, especially since true malware creators have been able to inject code into browsers and intercept and modify pages for ages. They don't need an add-on, they'll just inject a shared library or something similar.


Don't assume there's a local attacker with root.

Employees are often required to install local certs (or applications/scripts that do that) - that doesn't mean the host is entirely compromised.


> Employees are often required to install local certs (or applications/scripts that do that) - that doesn't mean the host is entirely compromised.

If they are forced to install those certs, then the computers they use belong to their employers, and those computers are obeying their proper owners. I fail to see the problem.

Don't use your work computer for things you don't want your work to be able to detect, intercept & modify.


  Don't use your work computer for things you don't want your work to be able to detect, intercept & modify.
With that logic, don't get mail sent to the office, because your employer has every right to open and reseal the envelope.

You know who also has that right? Prisons.


> with that logic, don't get mail sent to the office because your employer has every right to open and reseal the envelope.

I don't have things sent to the office unless they comply with my employer's policies (e.g. I'd never have a weapon mailed here), and unless I'm happy with my employer having information about my packages.

I'm curious what line of reasoning would justify me doing such a thing and expecting privacy.

> You know who also has that right? Prisons.

You know who also has that right? You, on any network and hardware you own. My employer owns my laptop and the network it's connected to: of course it has every right to inspect its own property.


> *My employer owns my laptop and the network it's connected to: of course it has every right to inspect its own property.*

Careful there: there's a difference between capability and right. Many jurisdictions forbid employers from putting surveillance cameras in the bathroom, for instance. There is a level of privacy employees can legally expect.

E-mail is similar: if your employer intercepts your e-mail just because you read it at work, it is likely a breach of correspondence secrecy (in France it would be). This likely applies even if you're at fault for using the company's resources on personal matters.

While "don't trust your employer's network" is elementary operational security everyone should be taught at school, 90% of the population don't know what computers can do, let alone how they work. Because of that, their expectations are social, not technical.


Common mistake on HN: the law is not logical.

The law says that if employers own the computers they provide to employees, then employers have broad latitude to monitor how those computers are used.

A different law says that only the recipient of a US Postal letter may open it. If you receive a personal letter at your office via US Postal Service, your employer cannot legally open it.

I'm not up to date on what the law says about FedEx or UPS.


That's employment for you. Employees are not treated as autonomous agents. More like untrustworthy children.

For my last gig, I worked at one company on behalf of another. This workplace was quite explicit about intercepting and monitoring everything. I pondered for a second whether I should use this place's computers to log in to my employer's webmail. I gave up and did it, because I wasn't going to read or write work emails outside of office hours.

Personal stuff however I didn't dare.


> That's employment for you.

This might be true in the US, but it's illegal in various other countries!


That's… murky. I live in France, where your correspondence is supposed to stay secret. It obviously applies to personal snail mail, and arguably personal e-mail. It should apply to every personal IP packet (they're structurally closer to snail mail than an e-mail is), but I don't know if it does.

Work stuff… When you use the work computer to do stuff on behalf of your employer, it's not really private, and could arguably be monitored. (There are work regulations that limit how you can use that data however. Benchmarking for instance is forbidden in France.)

The problem is, since proxy servers cannot automatically distinguish work-related activities from personal errands, they tend to cast a wide net. The implied distrust kinda disgusts me, but I reckon this puts big companies in a delicate situation: one does not simply trust thousands of people. Too many single points of failure.


Yes, and that's well within their rights and within most companies' policies. Why would you send personal mail to your workplace?


That's fine, but it doesn't justify doing it silently. If you're going to claim the right to detect, intercept, and modify, you should have no problem with being completely transparent about doing so.


> It's easy for the admin to bypass

If it involves patching and recompiling the browser, it wouldn't be that trivial for your average sysadmin. Besides, I don't see why the admin would be hostile to users being aware that they're being monitored. As you point out, companies generally disclose that anyway.


> Besides I don't see why the admin would be hostile to the users being aware that they're being monitored.

Agreed. We could argue all day about companies who think they need to intercept traffic, but why would anyone who believed they had a legitimate reason to do so want to do so silently without any notification?

A persistent infobar near the address bar, for instance, would work nicely. And anyone working in a hostile environment with such monitoring imposed on them (a bank, for instance) would then have a much clearer warning that they shouldn't use their work device for anything they want to keep private.


No, sysadmins don't patch browsers. Endpoint security products do. Patching browsers to implement TLS interception is table stakes for security products. Local pin enforcement would in fact result in millions more surreptitious browser patches.


Maybe on windows. Do you know of any Android security products which do this? Why does Chrome for Android not implement this?


The warning fatigue problem is still there - and we're talking about a warning for 4-10% of all connections according to the study. Plus, as you pointed out, this would only help against the average sysadmin; if we assume an advanced sysadmin determined not to disclose their snooping, or an actual attacker, replacing the browser binary would not be a huge obstacle. In that sense, it might even add a false sense of security. It's a bit like the state of certificate revocation - it works most of the time, just not when you actually need it.


This figure of 4-10% of connections is meaningless here, either you're intercepted or you're not. The warning would only matter for websites that bother to implement certificate pinning.

I don't really know how widespread key pinning is, but if it's reserved to the more sensitive websites (banking, e-commerce, etc.) it might make sense to at least issue a warning.
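For reference, the pin a browser checks is not the whole certificate but a digest of the public key: RFC 7469 defines it as the base64 of the SHA-256 hash of the DER-encoded SubjectPublicKeyInfo. A minimal sketch of that computation, using toy bytes in place of a real SPKI structure:

```python
import base64
import hashlib

def spki_pin(spki_der: bytes) -> str:
    """Return an HPKP-style pin-sha256 value: base64(SHA-256(SPKI DER))."""
    return base64.b64encode(hashlib.sha256(spki_der).digest()).decode("ascii")

# Toy bytes standing in for a real DER-encoded SubjectPublicKeyInfo;
# a browser would compare this value against the site's declared pins.
pin = spki_pin(b"not-a-real-spki")
print(pin)  # 44 base64 characters (a 32-byte digest)
```

When an interception proxy re-signs traffic with its own key, the computed pin no longer matches any of the site's declared pins, which is exactly the mismatch that could drive the warning discussed here.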


> This figure of 4-10% of connections is meaningless here, either you're intercepted or you're not. The warning would only matter for websites that bother to implement certificate pinning.

Most Google properties use key pinning in some form (though AFAIK through static pins rather than HTTP headers). I would suspect that most users in that group would see such a warning at least daily.

> I don't really know how widespread key pinning is [...]

"Visitors may be presented with a warning if they're behind a middlebox and you deploy HPKP" would probably be a good way to slow down HPKP deployment even further.


Well, sysadmins have a way simpler solution: just tell their users to use a different browser. This would have to be coordinated between browsers to have any effect at all.


Adding certs to root stores does not require recompiling the browser.



