The Facebook blog post includes this screenshot [0]. That's definitely an opt-in, although I could imagine a non-technical person clicking through it without reading the grey text. Better than nothing.
I'd like to formally welcome everyone to the world of a/b test-driven development!
Nice easy-to-read high-contrast giant-font headings that seem legit: "Text anyone in your phone!" - Yay! that's exactly what I want!!!
Tricky hard-to-read low-contrast micro-font fine print that looks like way too much trouble to try to read, detailing what ACTUALLY happens when you're tricked into clicking the:
One brightly-colored-only-thing-that-looks-remotely-like-I-should-click-it call to action.
Add in the cute emoji looking user who obviously LOVES giving up his privacy, and you've got a winner!
Wow, project manager, would you look at these great test results!!! Everyone wants to give us their private info!!! What idiots!
Yeah, that's how I recently clicked it - had to quickly install Messenger (needed to make a video call, which the lite client I normally use doesn't support), and clicked right through this. And normally I'm careful with such stuff.
It is not "better than nothing," it is the minimum that is technically and legally required.
The UI is intentionally deceptive.
The text is deceptive. The big text says "Text anyone in your phone," which sounds great, but doesn't actually say anything about data sharing. The small, low-contrast text describes what they're doing. They know that by putting it small and grey, most people won't read it.
The buttons are deceptive. It looks like there's only one button.
Even putting all the permission prompts in a row on first startup is a pushy ploy. Everyone knows people just keep clicking OK until they get to the actual app. People don't read the pages. People don't read the user agreements. They just want to get through the roadblocks as fast as possible so they can get to what they want.
None of this is by accident. All of these elements are chosen to push people towards the desired outcome: uploading their contacts, SMS, and call history as quickly as possible.
I know this might come across as a nitpick, but A/B testing and similar techniques aren't at all about what users choose. They are about constructing a statistically correct test between two alternatives, measured in some predetermined way (e.g. transactions/users, but equally well consenting users/total users), with the least amount of bias.
The bit about minimizing bias means that you couldn't possibly give users the A/B variants to choose from; instead you must ensure, as best you can, that no user is exposed to different variants of the same experiment. You would typically do this by giving the client a cookie with a random seed, which you can then hash with a secret and the id of the experiment to get a randomly distributed but sticky/consistent variant assignment that doesn't scale client storage with the number of experiments. Where things get more interesting is how to change behavior when a user logs into a new device. Do you switch to the user's set of variant assignments, or keep the experience on this device consistent?
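A minimal sketch of that sticky-assignment scheme (function and parameter names are illustrative, not any particular framework's API):

```python
import hashlib

def assign_variant(client_seed: str, experiment_id: str, secret: str,
                   variants=("A", "B")) -> str:
    """Deterministically map a client to a variant.

    Hashing the per-client random seed together with a server-side secret
    and the experiment id gives an assignment that is (a) roughly uniform
    across clients, (b) sticky for a given client+experiment, and
    (c) independent across experiments -- all from one stored seed.
    """
    digest = hashlib.sha256(
        f"{secret}:{client_seed}:{experiment_id}".encode()
    ).digest()
    bucket = int.from_bytes(digest[:8], "big") % len(variants)
    return variants[bucket]

# The same client always lands in the same variant for a given experiment:
assert assign_variant("seed-123", "signup-copy", "s3cret") == \
       assign_variant("seed-123", "signup-copy", "s3cret")
```

Because the assignment is recomputed from the seed rather than stored per experiment, the client only ever carries one cookie no matter how many experiments run.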
This being said, yeah, I'm pretty sure this is the result of optimization on "what fraction of users give consent" while keeping an eye on uninstall rate as a health metric.
Source: Redesigned/rewrote A/B testing (and general optimization) framework for large e-commerce company many years ago.
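The "statistically correct test" comparison such a framework runs can be sketched as a stdlib two-proportion z-test; the consent counts below are invented purely for illustration:

```python
from math import sqrt, erf

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test for a difference in rates (e.g. consenting users/total).

    Uses the pooled-proportion standard error and the normal approximation,
    which is reasonable for the large sample sizes A/B tests usually have.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF, via erf
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical prompt variants: B gets 5,400/10,000 consents vs A's 5,000/10,000
z, p = two_proportion_z(5000, 10_000, 5400, 10_000)
```

With samples that size, a 4-point difference in consent rate is overwhelmingly significant, which is exactly why these prompts converge on the most coercive design so quickly.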
Well then it isn’t really much of a choice, is it? It’s a false choice. The product manager will apply as much pressure as needed to get the desired effect. To quote Captain Ramsey in Crimson Tide, “[Y]ou can get a horse to deal cards. It’s just a matter of voltage.”
The ToS and other conditions attached to various services and relationships are literally far too long and complex for anyone to read, understand, and recall in full detail.
This point has been the subject of numerous articles and documentaries.
Hoback states that if one were to read everything in these user/service agreements, “it would take one full month, that’s 180 hours every year,” and that according to The Wall Street Journal, “consumers lose over $250 million dollars due to what’s hidden in these agreements.”
But first of all, the prompt shown in the article's screenshot was not hidden in a long ToS.
Second of all, I agree that we should work towards solving the problem that overly long ToSs represent. What I do not agree with is the assertion that, in the presence of an overlong ToS, we allow people to click through, then get outraged at the company for doing something they were allowed to do by the ToS. If it really is far too long and complex for you to read, don't use the product.
Tools which become ubiquitous or prerequisites for other services are not discretionary.
Facebook is the Internet for many people, is required for authenticating by numerous services, and is how numerous groups and communities organise.
As such, its terms of service, like those of numerous other services, should be defined as a standard set of obligations, rights, and responsibilities, by law.
> Tools which become ubiquitous or prerequisites for other services are not discretionary... As such, its terms of service, like those of numerous other services, should be defined as a standard set of obligations, rights, and responsibilities, by law.
I see this sentiment in some form a lot. Is there a name for this principle? Where does it come from? I don't in general agree with it - speaking generally, I think enforcing this principle is a way to solve some problems, but not the best way - and would be interested in discussions about the principle itself.
Pardon the delay, but I've been trying to think of a good brief description. I'm not sure I've got one that's particularly clear.
"Common weal", that is, the common wealth or common good, a/k/a social benefit, is probably the best general description. The notion being that there are positive externalities not addressed by the market (or there are offsetting negative externalities not imposed on the producer).
In either case, useful functioning requires some entity with the interest, capacity, and power to act in the public interest. The notion is an old one. It comprises Book V of Adam Smith's Wealth of Nations ("The Expenses of the Sovereign"), or in contemporary economics, the area of public sector economics (or welfare economics).
And is the domain of government, as my initial response indicated.
What services require Facebook for authentication? The only one I can think of is Tinder (not sure if that's actually the case, I've never used it) but I don't think a few non-essential services requiring a Facebook account makes it not discretionary.
If you needed a Facebook account to file a tax return or something, then I would agree that it's not discretionary.
The best description I recently heard of these kinds of pushy prompts is "exploiting a bug in human behavior. And when the bug is in the brain, there is no patch."
So how do we solve this? What is the general framework we use to agree to transactions of this kind, where users allow access to some data in return for something?
Effectively, what GDPR requires. You must inform users what the data will be used for when asking for opt-in permission to use it, as well as offering opt-out and right-to-deletion.
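As a sketch of the record-keeping side of those requirements (field and method names are illustrative, not taken from the GDPR text or any real compliance library):

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    """One auditable consent decision: who, for which purpose, and when.

    Consent of this kind must be purpose-specific, revocable, and provable,
    so the record keeps the exact purpose text shown and both timestamps.
    """
    user_id: str
    purpose: str                      # e.g. "upload contacts to find friends"
    granted: bool
    decided_at: datetime
    revoked_at: Optional[datetime] = None

    def revoke(self) -> None:
        """Opt-out: consent stops being valid from this moment on."""
        self.granted = False
        self.revoked_at = datetime.now(timezone.utc)

rec = ConsentRecord("u1", "upload contacts to find friends", True,
                    datetime.now(timezone.utc))
rec.revoke()
assert rec.granted is False and rec.revoked_at is not None
```

The key point is that one blanket "I agree" flag is not enough: each purpose gets its own record, and opting out has to be as recordable as opting in.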
Dark patterns like this one need to be hammered down by law. We should consider the most prominent button the "default" and treat such a form as an "opt-out".
Is this really a dark pattern? The text is right there. It's not convoluted legalese, it's extremely straightforward. If people aren't willing to read that, how exactly are you supposed to get anyone's consent for anything?
You are supposed to get consent by showing both options equally. Note also the wording of "not now". That is the kind of option that feels like the app is going to bother you about it again. Moreover, it communicates inevitability. You can opt out for now.
All of this is clearly made to get people to opt-in. Heck, it is not even clear that the "not now" option is clickable.
Consent can be tricky when you create a path of higher and lower resistance. This is especially so for people who try to avoid conflict (teenage girls especially). They may give consent simply because it's what is expected of them, rather than because it's what they actually want. UI design for such consent forms has to be carefully considered so as to make it 'socially acceptable' to decline.
That specific popup is a clear example of how to maximize acceptance in a big portion of the population. I can almost guarantee you that popup design was chosen based on A/B testing, and had the highest rate of acceptance among designs.
Yes - Call and Text history are not required to make this feature work - only your contact list (which is all iOS sends). They bundled this extra invasive data collection into the consent request. And we are used to apps requesting access to contacts to find people you know on the platform, so syncing your contact list is expected for this feature. What's not so expected is also sharing your call and text history, which is conveniently mentioned in a low contrast text blurb. That's the dark pattern.
I hate to play devil's advocate against something I agree with here, but there are a lot of subjective ideas here, most obviously determining which button is "most prominent."
In a world where we have already been preconditioned to certain design patterns, the blue button is clearly the more prominent and will subconsciously be considered the "default" option.
Worse than that - I expect most technology-illiterate users won't realise the "Not now" text is a button at all.
(It's grey-on-white, designed to look identical to the copy text above the button. The only differences are placement and a small difference in font size.)
This is exactly why I do not have any call/sms info in my FB data downloads. I read such stuff and deny the app these things.
Btw, this is something required of me by law here in Germany: I would need written permission from every contact in my phone book before uploading. Same goes for WhatsApp.
I am not sure about the wording of the law. But imho no, this would not be illegal to offer. It would be illegal to use, unless you have permission from everybody.
[0]: https://fbnewsroomus.files.wordpress.com/2018/03/opt-in_scre...