I believe it is essential for Firefox to allow installing extensions in a private way that does not force us to upload our source code to Mozilla, especially considering their renewed push for user privacy.

It could be a feature that can only be enabled from the browser UI, with appropriate warnings, but the choice must be offered for those who need it.

Until that happens, those with private extensions must resort to compiling their own browser, using unbranded or developer builds, or patching Firefox at runtime to disable signature checks.

This is how you allow unsigned extensions in Firefox on Arch Linux; the same files can be edited on Windows and macOS. Restart the browser after changes:

  sudo tee /usr/lib/firefox/defaults/pref/config-prefs.js &>/dev/null <<EOF
  pref("general.config.obscure_value", 0);
  pref("general.config.filename", "config.js");
  pref("general.config.sandbox_enabled", false);
  EOF

  sudo tee /usr/lib/firefox/config.js &>/dev/null <<EOF
  // The first line of config.js is ignored by Firefox, so keep this comment.
  // The code below empties the set of add-on types that require a signature.
  try {
    Components.utils
      .import('resource://gre/modules/addons/XPIDatabase.jsm', {})
      .XPIDatabase['SIGNED_TYPES'].clear();
  } catch (ex) {
    Components.utils.reportError(ex.message);
  }
  EOF
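
For reference, the same two files live in the application directory on the other platforms as well. A rough sketch of the typical locations (standard install paths, not verified for every setup; adjust if Firefox is installed elsewhere):

  # Windows: C:\Program Files\Mozilla Firefox\defaults\pref\config-prefs.js
  #          C:\Program Files\Mozilla Firefox\config.js
  #
  # macOS:   /Applications/Firefox.app/Contents/Resources/defaults/pref/config-prefs.js
  #          /Applications/Firefox.app/Contents/Resources/config.js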



> It could be a feature that can only be enabled from the browser UI, with appropriate warnings

That's not the issue. The problem is: where do you record the fact that the user has seen and acknowledged these warnings? Wherever it is, malware can go there and write "yes, the user has already seen the warnings, no need to ask them, you can run the extension".


You request administrative rights to store the configuration change. With root access malware can just replace Firefox.

Even with signature checks enabled, malware can trick users by pointing Firefox shortcuts to a patched browser; changing shortcuts doesn't require root access.
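
As a minimal illustration of that point on a typical Linux desktop (the paths and the patched-build location are made-up examples), a user-writable .desktop entry shadows the system-wide launcher entry:

  # A per-user launcher entry needs no root and takes precedence
  # over /usr/share/applications/firefox.desktop.
  mkdir -p ~/.local/share/applications
  cp /usr/share/applications/firefox.desktop ~/.local/share/applications/
  # Repoint the Exec= line at any binary, e.g. a patched build:
  sed -i 's|^Exec=.*|Exec=/home/user/patched-firefox/firefox %u|' \
    ~/.local/share/applications/firefox.desktop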


I don't understand this logic at all, it baffles me.

Malware can do a lot of things, but I've never heard of common software being designed under the assumption that malware has taken over the PC. The software I use is designed under the assumption that the user is intelligent enough to keep the computer malware-free. Only the OS or explicit security software is designed to protect me from malware, but not my text editor, office software or browser.

It is a terrible excuse for ignoring user intent.

What's next? Disallowing changing the start page from google.com because malware could change it? Disallowing downloads because users could download malware? Going down that route would eventually strip away the entire application. Maybe at some point Firefox will only be allowed to operate from within the cloud, where employees make sure it is 100% safe?

Also, others have already pointed out that changing certain settings could be made to require admin privileges.


I mean, we're not necessarily talking about malware that has taken over the computer. More like software companies whose installers 'helpfully' install their browser extension behind the user's back.

The same is true for link handlers, file extension associations, context menu entries, browser plugins. None of these should be able to be changed by applications, only by the user themselves.

We're entering an age where applications running in a user session are no longer user agents but users of your computer unto themselves that need to have their own permission boundary. Applications running as the user != the user anymore.


> More like software companies whose installers 'helpfully' install their browser extension behind the user's back.

When there's a will, there is a way...


It would be important for the Firefox team to allow a discussion with third-party developers about the exact threat model and the reasoning behind forcing signature checks and disallowing an escape hatch.

There are indications [1] that not even Mozilla employees have a good understanding of the threat model and the current solution, and that leaves one wondering how could alternative solutions be possibly explored if the topic isn't even properly understood.

[1] https://news.ycombinator.com/item?id=20423747


Maybe no one really knows anymore why the feature was implemented that way in the first place.

I think the post-mortem has shown that the biggest problem at Mozilla is fragmentation of decision-making, and the existence of probably >50 small teams that do stuff without communicating.

It is highly likely that the certificate problem isn't the only negative consequence, and that we'll see more evidence of mismanagement in the future.

I have the feeling that for some reason Mozilla has established a culture where information does not flow efficiently from top to bottom and vice versa, and it even looks like management doesn't really exist.

When a small team is formed around a task without central oversight or anyone to report back to, it will tend to justify its existence, even if that means doing unnecessary work.

I heard the last CEO who wanted to streamline the Mozilla hierarchy back to efficiency was Brendan Eich, and many people got uncomfortable when he started to demand that people actually work productively.

I think an honest post-mortem would have come to a painful conclusion: that the Mozilla of today is in no way able to compete anymore, and many employees have stopped doing real work. The company only lives on because it is living off its massive market share of the past.

I am convinced that, within 2 years, Mozilla will be confronted with massive lay-offs, threatening the Gecko engine. This incident shows me that they haven't done anything to address their structural problems, probably because most people in the company are content with the place they have, comfortable living off the massive Google revenue.


Once there's malware on the system you've already lost; there's nothing Firefox can ever do, so there's no point in trying to provide any protection in that case.


That depends. If the malware (and to be clear, "malware" here includes various anti-virus vendors and other gray-hat software that is nominally providing the user a service but _also_ attempting to insert itself into Firefox) is only running with the privileges of the user, it can't modify the Firefox install itself, if that install requires administrator access (sudo on Linux, etc) to modify.

That gives you a trusted place to bootstrap from. The addon database is stored in the user's profile, so it _can_ be edited by such malware, but that's where the trusted code could try to verify integrity in some way. Which is how we got where we are now.

Disclaimer: I work on Gecko, not extensions and the "things injecting into the browser" situation, but have overheard a bunch of discussions about it.


>[...] is only running with the privileges of the user, it can't modify the Firefox install itself

It can, however, attach to Firefox as a debugger and modify its code as needed (ptrace/WriteProcessMemory).


Attaching as a debugger may require elevated privileges too (e.g. on Mac).
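
On Linux the equivalent gate is the Yama ptrace policy, a standard kernel sysctl; a quick way to check it (value meanings per the kernel docs):

  # 0 = any process may attach to any other process of the same user,
  # 1 = only ancestors (e.g. the debugger that spawned the process)
  #     may attach without CAP_SYS_PTRACE/root
  cat /proc/sys/kernel/yama/ptrace_scope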


Couldn't the malware replace your Firefox icon with a version that points to a malicious patched Firefox? Or whenever it sees Firefox start, it kills it and starts the malicious Firefox instead.


Mozilla's reasoning is that these programs like to think of themselves as not malware. Configuring the user's computer is much easier to claim as "honest behavior" than patching or replacing executables.


But here's the thing: It's my browser, not Mozilla's.


Why not in the OS's keyring?


Malware can patch or replace your firefox.exe, that's not a very strong argument, if you ask me.


Okay, you’re not running as Administrator on your corporate Windows desktop. How does that malware replace Firefox.exe — not to mention setting a valid signature?

The answer is why this matters: we're not in the Windows 95 era any more, and a large fraction of users are running without admin privileges, often on a system which does code-signing checks. Getting a trusted binary to run an add-on is a common technique precisely because the easier direct approaches aren't as available as they were a couple of decades ago.


First of all: user separation is not a robust mechanism. There are countless privilege escalation exploits. It's just one layer of protection.

If you insist that an exploit can't escalate its privileges, then make changing this add-on setting require Administrator rights. That would actually be a good thing.


> It's just one layer of protection.

Exactly, and it’s important in cases where users are being convinced to install dodgy extensions because each additional step increases the chances of someone realizing that it’s a scam.


I for one am OK with Firefox's mandatory code signing. In fact I prefer it over Chrome's. At least with Firefox you can then publish the extension from your own servers, whereas with Chrome you must use their store, even if the extension is hidden from the general public.

Thus, I don't really understand your statement:

> those with private extensions must resort to compiling their own browser

I certainly would never recommend such a thing, myself. If I were paranoid about third-party access to my extension to the point I felt I needed to compile my own browser, then one thing I'd certainly do is also disable addons and build mine right into the core product...

I'd certainly trust Mozilla far more than I can trust every addon out there that might coexist with my private addon, which is where most of the attack surface would come from (if such were my threat model).


Chrome allows installing unsigned extensions by enabling developer mode in chrome://extensions. On browser startup you'll see a warning [1] that encourages you to disable unsigned extensions, but users are offered a choice.

I do trust Mozilla, though I'd prefer if they'd also trust me to be capable of making an informed decision and give me a choice while using their product.

[1] https://i.imgur.com/iGEYMwv.png
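
For completeness, besides the developer-mode toggle in the UI, Chrome/Chromium can also load an unpacked extension from the command line. A minimal sketch (the flag is real, the path is a placeholder, and the extension stays loaded only for that launch):

  # Load an unpacked (unsigned) extension for this browser session;
  # /path/to/my-unpacked-extension is a placeholder.
  chromium --load-extension=/path/to/my-unpacked-extension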


Do you not consider "run the Developer edition" of Firefox to be a choice?


It's not an optimal choice, because Firefox Developer Edition is not based on the latest stable release of Firefox, and because users shouldn't be forced to switch to browsers with an alternative focus just to enjoy Firefox with a local extension.

Permanently installing unpacked or unsigned extensions is a feature that all major browsers offer for their users, except Firefox.


My distribution packages regular Firefox, but not the Developer Edition (or ESR). This is a showstopper for me.


> I certainly would never recommend such a thing, myself. If I were paranoid about third-party access to my extension to the point I felt I needed to compile my own browser

That's exactly the point, you shouldn't need to compile your own browser to use a private extension!


What's the main difference for you between using their store vs. your own server?


At a minimum, the unbranded build should be able to update itself automatically, so that it can be a legitimate choice for users.


You may want to keep an eye on https://bugzilla.mozilla.org/show_bug.cgi?id=1514451 -- Mozilla wants to remove general.config.sandbox_enabled (again) in the future.


> use unbranded or developer builds

Is that such a problematic thing?


Unbranded builds do not auto-update, and Firefox Developer Edition is based on Firefox Beta, so it can be less stable than the release build.


Auto-update would be convenient indeed. That said, Developer Edition is really very stable, so while I see why you would not want that, it's not that bad a stop-gap measure until unbranded builds auto-update.


It's perplexing that Mozilla insists on forcing signature checks at any cost in release builds, when there is no consensus on the fundamental reason for doing so.

> Most fundamentally, the full Firefox team does not have a common understanding of the role, function, and operation of cryptographic signatures for Firefox add-ons. For instance, although there are several good reasons for signing add-ons (monitoring add-ons not hosted on AMO, blocklisting malicious add-ons, providing cryptographic assurance by chaining add-ons to the Mozilla root), there is no shared consensus on the fundamental rationale for doing so.

https://wiki.mozilla.org/Add-ons/Expired-Certificate-Technic...



