Hacker News

I'm very disappointed that although using Studies (Normandy) is briefly mentioned, it is treated as just an unusual option rather than a breach of policy. I was really hoping for an explanation of why they felt they could break their promise to users that Studies would only ever be used for, well, studies.

* All Shield Studies must be designed to answer a specific question - Shield is a tool for evaluating ideas and features in the product. If you are not trying to evaluate a proposed feature or idea there are other means of shipping your code. Shield studies will always respect user privacy in accordance with our data collection policies.

* All Shield studies will adhere to the scientific method for answering complex questions - Generate a hypothesis, test, collect and analyze data, validate or refute hypothesis, refine, repeat.

* All Shield studies require a PHD (Product Hypothesis Doc) - A PHD or Product Hypothesis Doc outlines the overarching research question the study is trying to answer. It requires the study author to think critically about the problem and the outcomes long before the study ships.

These guidelines (https://wiki.mozilla.org/Firefox/Shield/Shield_Studies#Guidi...) were created after the breach of trust represented by the Mr Robot cross-promotion, as a means to restore confidence. Yet now they have been ignored, and again "Studies" has been used as a convenient backdoor.




Imagine you run a very popular website, and you have a policy that says "we will never modify any site data by accessing the database directly", to try to ensure that everything goes through site functionality and can be properly logged/etc.

One day, some weird data gets into the database somehow, the site's completely unable to handle it, and it crashes and goes totally down. You're faced with two options:

1. Rewrite a major section of the site code to be able to handle the bad data. This is expected to take at least two days to complete.

2. Break the policy, fix the data in the database, and have the site back up in 5 minutes.

Do you truly believe that Option #2 needs to be treated as completely out of the question?


Not nearly analogous. First, the policy was not about accountability, it was about privacy and specifically restoring a trust already broken.

Second, it would only have taken a worst case of 24 hours vs the 6 hours of using Studies to follow that policy. I think the integrity of a company that trades on that very thing is worth 18 hours.


> it was about privacy

Yes, and adblockers, umatrix/noscript, Tor, and other add-ons were universally broken.

That has an impact on privacy too.


So how many of Mozilla's policies need to be appended with "unless we think it's important enough"? All of them? Just the ones violated this time?


The implied distinction is that adherence to principle is what distinguishes Mozilla from Google. If Mozilla is willing to sell out its principles in this instance, then when else will it be willing to sell out its principles?


It's not like people didn't tell Mozilla about the potential for problems like this when signing was added.


> I was really hoping for an explanation of why they felt they could break their promise to users that Studies would only ever be used for, well, studies.

It's literally there:

> Because we had a lot of users who were affected, getting them fixed was a very high priority, which made Studies the best technical choice.

Which makes perfect sense to me. When you have a worldwide critical degradation of service, you fix that as soon as humanly possible, ideological pearl-clutching be damned.


"Honoring the promise you depended upon to get users to install the feature in the first place" isn't ideological, or pearl-clutching.

Imagine your point in other contexts:

"If your company is about to go bankrupt unless it gets some serious revenue, then sure, you start selling the user data you promised not to, ideological pearl-clutching be damned."

"If a priest receives the confession of the killer in an unsolved mystery, he should report it to the police, religious pearl-clutching be damned."

And "critical degradation of service" is overstated; users always had the option to use another browser (like Mozilla seems to want these days). When they shut off everyone's unsigned, whitelisted addons in 2016, that wasn't regarded as a critical degradation.


The context is that people's adblockers, noscript/umatrix, and Tor add-ons were not working.


> "If a priest receives the confession of the killer in an unsolved mystery, he should report it to the police, religious pearl-clutching be damned."

Confidentiality in such areas (also med/law) is usually restricted to past actions. If you confess you're going to kill someone, that does indeed have to be reported to the police. I doubt your local laws are different here.

Thus to construct a better analogy we need ongoing harm that was initiated in the past but can only be shortened by taking action. Maybe a confession like "Just had an argument with my kid and locked them in the basement without food and water. Will release them after two days". Now should this have to be reported to CPS? I'm not sure, but I think I would favor yes.


I was making a moral (and somewhat practical) argument, not a legal one about a particular jurisdiction.


I mean, really? Mozilla's whole thing is integrity and principles. The Mozilla Manifesto is literally an ideology they probably don't want to be damned.

I'd have very much preferred them wait just 18 more hours and not have violated their privacy policies.

What exactly is Firefox's selling point if their principles can be traded for speed and convenience? That's just another tech company.


> What exactly is Firefox's selling point if their principles can be traded for speed and convenience? That's just another tech company.

If everyone cared as much as you do about this issue, then they'd have likely waited. The fact that they were jumping through hoops tells me that the users don't care, they want their crap fixed _right now_.


> The fact that they were jumping through hoops tells me that the users don't care, they want their crap fixed _right now_.

Surely what that tells you is that Mozilla thinks that users don't care and want their crap fixed; in which case the policy might as well say "These policies will be applied unless their abrogation is necessary to fix your crap", or some appropriately dignified rewording.


They didn't remotely enable telemetry/studies though, right? If you were fine with waiting, you could have just not enabled it. I don't see how anything they did violated their privacy policies.


The policies that using Studies in this way violated are listed in my initial post. Those policies were created to prevent Studies from being used as a convenience.

You might think violating them was justified, but that they were in fact violated is inarguable.


> The policies that using Studies in this way violated are listed in my initial post.

You just claimed they had violated their privacy policy, not the shield studies guidelines. Your initial post doesn't say anything about their privacy policy. And as I said, your privacy wasn't affected at all if you didn't enable the studies yourself, so your point about rather waiting doesn't make sense either.

> You might think violating them was justified, but that they were in fact violated is inarguable.

You could argue that testing the fix meets the first two conditions you listed. I don't know if they actually provided that "Product Hypothesis Doc", but at that point you're really just looking for something to complain about.



