Hacker News
Facebook iOS App Scrapes Your Clipboard? (un-excogitate.org)
237 points by chillaxtian on Oct 19, 2015 | 124 comments



Apple actually improved the security of this in iOS 9 by only allowing foreground apps and extensions to read the contents of the clipboard.

Previously, Facebook could have scraped it from the background.

Now you implicitly grant permission to read your clipboard whenever you open an app or extension.

My app Workflow[1] has a great workflow in its gallery that you can run from your widget to clear your clipboard if you regularly copy sensitive information.

[1] https://workflow.is/


> My app Workflow[1] has a great workflow in its gallery that you can run from your widget to clear your clipboard if you regularly copy sensitive information.

Shameless plug, but the app looks awesome. Would be cool (if not already done) if you could plug into IFTTT to fill in features that you can't/don't want to.


Definitely! I made sure to prefix it with "my" so it was clear :)

IFTTT integration is on the roadmap!


Ah, so that's why my favourite dictionary app broke! :( You'd keep it running in the background, copy a word in another app, and the dictionary would post the definition & pronunciation as a local notification. So much better than share sheets and multitasking. RIP, hack!


> Apple actually improved the security of this in iOS 9 by only allowing foreground apps and extensions to read the contents of the clipboard.

Very cool! I was not aware of this change.


Thanks for the tip. I have Workflow on my home screen, combined with the 3D Touch menu this is very easy to use.


What are some of your favorite workflows?


For those using iOS in the enterprise and worried about data leakage, this sort of thing should give you pause.

This sort of thing is difficult to detect in the App Store review process, so apps that appear benign could quite easily use this method to try to steal sensitive information with little risk of detection.

We've been lobbying Apple for a while now to improve the controls around clipboard data; specifically to allow us a policy that prevents personal / unmanaged apps from seeing data that was copied from enterprise / managed apps. It might not solve the personal privacy issues but would significantly reduce this as a risk for corporate data.


...and this is one of the many reasons that any organization that does care about security "still" uses BlackBerry


Is the disturbing part the fact that this is Facebook doing it, or that apps have access to the clipboard? Because the latter has been the norm for all apps on desktop OSs AFAIK, and enables extremely useful sharing of information between them (I have a download manager that automatically catches URLs, for example.)


All desktop apps have access to everything. Your download manager can also unlock your browser's saved passwords, read your emails, copy your SSH private key, intercept your IMs, spoof your bank's website, etc. Some of these attacks are more difficult and leave more of a chance of detection than others, but they're all possible.

This is widely considered a mistake, but it's also an implication of steady backwards-compatibility in app design -- and in many cases, code -- since the late 1980s. No one expected Lotus 1-2-3 to steal your WordStar documents, partially because neither of these were networked. So the OSes told Lotus 1-2-3, WordStar, and anyone else who came along that they had access to the entire drive and they could be designed as if they had that access, even though they didn't need it. Apps could ship their own Open dialogs; the OS trusted that they would only open files they were told to. A quarter-century later, the Windows and Mac App Stores are trying to reverse this, but the app developers are understandably grumpy because they have a quarter-century of code that never had to deal with such a design principle.

And the Windows and Mac App Stores are attempting to reverse this by reference to mobile app stores. Mobile platforms, being an entirely new mental model for both developers and users, could abandon lots of the design principles that make sandboxing hard. There is absolutely a reason why no mobile app has a desktop-style Open dialog: the OS doesn't trust it with that power. There is also a reason that the mobile AV industry is a tiny fraction of the desktop AV industry, despite mobile use having outpaced desktop use.


> This is widely considered a mistake

Is it? I've heard some voices to that effect, but I didn't realize it was a very popular opinion. Personally, I think it's a feature and what makes the desktop useful as a tool, as opposed to mobile devices being mostly toys.


I think the mistake (at least in today's world) is that un-sandboxed is the default. Even as a power user, I would like if most of the GUI programs on my computer (web browser, word processor, etc.) were sandboxed, so I can focus most of my scrutiny on my IDE, toolchains, etc.

The other aspect is that programs are traditionally really bad at being conscientious about your system, even when not being malicious. Windows programs have a tendency to put files wherever they feel like, although UAC has helped this to some degree. Plus, every program and its brother has its own update system that absolutely must run in the background constantly, even when I'm not using it.


> I think the mistake (at least in today's world) is that un-sandboxed is the default. Even as a power user, I would like if most of the GUI programs on my computer (web browser, word processor, etc.) were sandboxed, so I can focus most of my scrutiny on my IDE, toolchains, etc.

As a desktop user I need little to no scrutiny. My web browser, IDE, toolchains, IM, video player and computer games all work together well, and while technically any one could own any other, this simply does not happen. It takes some (little) skill to avoid obvious malware, and then generally you're fine. On the other hand, I get to open my video file in my IDE, edit my IM's data files in a word processor, and move things around seamlessly with the OS; everything is expected to work and no program tries to lock me in.

> The other aspect is that programs are traditionally really bad at being conscientious about your system, even when not being malicious.

I don't think much has changed here. Phone apps are also not conscientious; they fight for your attention with other apps and with the entire operating system. The changelog of both Android and iOS is full of things done against app developers, because many popular app developers are willing to do anything imaginable to get your attention (and thus ad money).

My point is - the problem isn't with sandboxing itself; it's with the web and mobile ecosystem that makes developers work against the interest of users, instead of for it. The sandboxing is just a symptom - something that is being done to mitigate the damage of greedy "entrepreneurs". Desktop environments don't need that much sandboxing because there isn't that much crap going on.

I think the fact that desktops originally weren't connected to the Internet is a reason we've started with an open system, but it may be worth preserving this flexibility even in the Internet era.


> As a desktop user I need little to no scrutiny. My web browser, IDE, toolchains, IM, video player and computer games all work together well, and while technically any one could own any other, this simply does not happen. It takes some (little) skill to avoid obvious malware, and then generally you're fine.

It's not just downloading malware that is an issue; you also have to worry about security flaws in the apps you trust. Web browsers are a perfect example; they are constantly downloading untrusted code and executing it. Even when the javascript engine doesn't have a flaw, you have exploits like Rowhammer.js[1] which sidestep them entirely.

> It takes some (little) skill to avoid obvious malware, and then generally you're fine.

I think you're underestimating the knowledge needed to secure your computer properly, and overestimating the savvy of the majority of users. The current state of computer security (huge botnets, etc.) is a testament to the fact that the current model isn't cutting it.

> On the other hand, I get to open my video file in my IDE, edit IM's data files in a word processor, and move things around seamlessly with OS; everything is expected to work and no program tries to lock me in.

If the only "lock-in" is an Open file dialog, or passing a filename on the command line, all of the examples you list would work fine.

> Desktop environments don't need that much sandboxing because there isn't that much crap going on.

You are missing an important difference; most users are far more liberal in downloading apps for mobile than for desktop, and if they were just as liberal on their desktops they would be pretty much guaranteed to get malware. Because of this, many apps which are only used in a browser on a computer are used as apps on mobile (which is needed at the very least for performance reasons). Sandboxes aren't perfect, but they move the ability to infect a device via installation from "totally trivial" to "requiring significant vulnerability".

[1]: https://github.com/IAIK/rowhammerjs


  > Desktop environments don't need that much sandboxing because there isn't that much crap going on
http://www.howtogeek.com/198622/heres-what-happens-when-you-...

You're safe if you install only paid-for software from big companies and open-source software that hasn't been bundled with installer malware. And never install Flash or Adobe Reader. And keep your browser, AV, and anti-malware up to date.

The plural of unsandboxed desktop is "botnet".

Yes, this is tragic and a real threat to open software development.


It also means opening a video file (or browsing a website, or editing a photo, etc.) can trigger code to read your documents and email it to someone else.


Yes, it does. But that rarely happens. Why is that?

There's something about web/mobile that makes people afraid to do anything without sandboxing the living hell out of every app, even though we've managed to live with unsandboxed desktops without many such incidents. I can't exactly put my finger on what is going wrong (it definitely has something to do with installing random crap and the new business model of companies trying to monetize users' eyeballs and data), but the push towards sandboxing is a symptom of that problem.


> even though we've managed to live with unsandboxed desktops without many such incidents

Antivirus is mandatory on desktops


Sandboxing and App Stores do not prevent powerful software tools being developed. You can have first-class video editors, painting and photo manipulation, and coding environments.

You might not be able to do absolutely everything you could with unfettered access, but that hardly leads to "mostly toys." The nature of mobile apps stems far more from the use case of the devices (user attention measured in seconds, rather than minutes) than limitations of the software development platforms.


The sandboxing of mobile apps - at least on iOS - was always sold as a feature that respected this handheld computer was also a phone, and needed the voice chat reliability of a pre-smartphone.

If you have any suspicion that installing an app will cause your phone call with your grandmother to be interrupted, you will really scrutinize what you install.

Early Android devices tended to have more problems than iOS devices when it came to phone calls getting interrupted, so whether or not this was marketing making an excuse for something Apple wanted at a technical level, it seems sandboxing was done with the use case of "keep the phone working" firmly in mind.


> Sandboxing and App Stores do not prevent powerful software tools being developed. You can have first-class video editors, painting and photo manipulation, and coding environments.

I don't doubt that - but the problem is that everything you do with one app must almost always stay within that one. It's only when you want to exchange data between apps that all the power of the traditional full-access model really shines, and the reason why people have resorted to some horribly inefficient workarounds (e.g. upload to some server on the Internet from one app, then download the data in another) to do this.


That's not an insurmountable problem though, and sandboxed environments are getting better in this regard. E.g., it is now possible in iOS to launch one application from another which edits the source document directly — not a copy.

That said, the notion of a file system isn't inherently incompatible with sandboxing. If you allow the user to grant permission on a per-app basis whether those apps can read/write to a common data store, you essentially have the benefits of a local file system in a way that does not break sandboxing.

You only want to sandbox apps to keep them from touching the OS and other private app data. If it is user-permitted for them to touch a common data store then that means data does not always have to stay within the app in which it was created.


Fair, but I also want to be able to grant sandboxed apps access to the OS and other apps' private data. The way, on a desktop, I can use Emacs to edit /etc/hosts, or MS Paint to change some other program's splash screen.


There are methods for solving that, like powerboxes:

http://plash.beasts.org/powerbox.html

I believe that Apple's desktop sandbox does use powerboxes, in that the Open dialog is privileged code and hands the app an open file handle. But it means that you can't tell Emacs to edit an entire source code directory. BBEdit had exactly this complaint about the sandbox.


Well that defeats the purpose of a sandbox. A sandbox is specifically to disallow apps from touching OS or private data.

But there's no reason such apps can't live side-by-side. I use Emacs to edit my hosts file, it's pretty convenient. A sandboxed Emacs would not be.

But I don't mind if the painting app I download is sandboxed because I'm going to be dragging-and-dropping my artwork into that program and I don't want it having access to my disk or network outside its sandbox.


I tend to think of systems which don't sandbox each process effectively as 'toys' since I can't meaningfully trust them to run code I haven't audited.


You can't trust sandboxed systems either, and honestly, who ever audits the code they run? "Auditing code" is a smart-sounding phrase that is meaningless to 99% of users. Also, have you audited your sandbox's code yet?

I consider mobile apps toys because they're designed to contain and own your data, which both impairs the ergonomics of work and limits your capability for doing things to whatever is available in the MVP some startup released. Most apps don't evolve past the point of MVP.


In other words:

Real scissors are toys, but plastic scissors are not toys, since you can't hurt yourself with them.


A chainsaw is also dangerous. You have to know how to use it and you should use protective gear.

Definitely not a "toy".


Sandboxing doesn't necessarily mean that you can't share files/data between programs. I think it's easy to mix these up because iOS hides the file system from you. I would very much like both Visual Studio and Notepad++ to be able to edit the same .cs file, but I don't want Notepad++ to have internet or GPS access, because it has absolutely no reason to.


Sandbox all you want but give me root level access then. Notepad++ may not need the GPS or Internet access[0], but I need it to be able to edit Visual Studio's config files when I ask it to, without having to ask VS for permission. Don't let programs have magic databases that are inaccessible by users.

[0] - most desktop software doesn't need Internet access; even most mobile software doesn't need it in principle, it's only because of lazy business models that they have to have it.


> Sandbox all you want but give me root level access then. Notepad++ may not need the GPS or Internet access[0], but I need it to be able to edit Visual Studio's config files when I ask it to, without having to ask VS for permission. Don't let programs have magic databases that are inaccessible by users.

OS X solves this quite elegantly. If an application is sandboxed, it cannot access files outside the sandbox directly. However, when an Open (or Save) dialog is requested, it is handled by a privileged daemon outside the process (it used to be pboxd). This daemon then makes the file or directory available in the sandbox after selection.

You still get app isolation, but it does not impede the user experience. (Of course, it does not apply for every scenario, but sandboxing is not a requirement outside the App store.)
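The brokered-open idea described above can be sketched in a few lines. This is a toy Python illustration of the pattern, not Apple's actual implementation (the daemon name and division of roles are simplified): the privileged side alone resolves the user's selection into an open handle, and the sandboxed side only ever works with handles it was given, never with paths.

```python
import os
import tempfile

def broker_open(user_selected_path):
    # Plays the role of the privileged daemon: it alone may touch paths.
    # It opens the file the user picked in the system Open dialog and
    # hands back only an open file descriptor.
    return os.open(user_selected_path, os.O_RDONLY)

def sandboxed_read(fd):
    # The "sandboxed app" works purely with the handle it was given;
    # it has no ability to open arbitrary paths itself.
    with os.fdopen(fd, "r") as f:
        return f.read()

# Simulate the user choosing a file in the system Open dialog.
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as tmp:
    tmp.write("hello from outside the sandbox")
    chosen = tmp.name

fd = broker_open(chosen)     # privileged side
print(sandboxed_read(fd))    # prints: hello from outside the sandbox
os.unlink(chosen)
```

The design point is that capability flows through the user's explicit selection: the sandboxed code gets exactly one file, the one the user chose, and nothing else.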


There are several very useful OS X applications and tools that are not available on the MAS because sandboxing doesn't let them do their job.

E.g. Finder replacements, tools like Alfred, BootChamp, etc.


> Don't let programs have magic databases that are inaccessible by users

Maybe you misunderstood my post, but that's exactly what I said should not be conflated with sandboxing. Sandboxing API calls is not the same as hiding the file system.


The problem/difference is that mobile apps are like websites. You might want to "visit" any one of them, even if you're not sure you trust it. You want to see what it has to offer. You expect the app to represent some company's interests, not your own. It might have ads!

You shouldn't download and run random executables on your Desktop OS though. Ideally, you only install open-source software, from a fairly secure repository, using a package manager which knows exactly which package is responsible for each and every file outside your home dir ... That's a not-quite-attainable ideal, but some people get quite close to it.


> The problem/difference is that mobile apps are like websites.

From what I've heard, a lot of them really are websites, as all they do is wrap the OS's browser control and open it to the specified website. In other words, they're nothing more than enormously bloated browser bookmarks. That is a problem, but I think it's more attributable to the lack of support for adding browser bookmarks to the home screen than anything else since the only reason they exist is so people can go to some company's website more easily.

That said, a sandboxing feature for the apps you don't necessarily trust is certainly a good idea; nevertheless, I personally don't see much value in keeping around those which I only somewhat trust, but don't trust with full access. It's either apps I trust completely, or ones I find alternatives to which I can.


The reason this isn't a mistake is that most desktop apps (unlike mobile apps) are open source and community maintained.


Yeah but I don't have nosey apps like facebook on a desktop. I only have "utility" apps like visual studio, photoshop, vuescan, office, etc. Which is why the change of stance of Microsoft re privacy with Win10 is a big deal.


Do you have Spotify installed on your desktop? It's known to phone home metadata about your local collection of MP3s. Is playing music considered "utility" or "nosey" to you? I don't want to go into details about this particular example; the root problem is that you don't know, and that anything, even a utility program, could be nosey.


Correct, I don't have such an app, but I do have Skype, which I am regarding with increasing distrust.

But social network or advertising apps are unwelcome on a desktop, and in fact are usually referred to as malware when outside of a browser. I don't have much software on my desktop which isn't either a license I paid for or open source software.


> ...but the app developers are understandably grumpy because they have a quarter-century of code that never had to deal with such a design principle.

The thing we're grumpy about is the lack of respect for the central dogma behind the personal computing revolution: one machine, one owner, one user.

Recently, for example, Apple has literally granted themselves higher access privileges than the admin/owner of a machine running the OS X file system. There are files on the hard drive that they can write to and delete that you, the owner, can't, without extraordinary (and nonstandard) efforts. Naturally, it will only be a matter of time until Microsoft notices this innovative feature and copies it, more or less verbatim, into Windows. At that point, it's no longer "My Computer" -- it's theirs.

And all of this is accompanied by the sound of thunderous applause. Because, you see, it's For Our Security.


I assume you are talking about System Integrity Protection. You can turn this off, so do you consider the fact that Ubuntu disables the root account by default (and requires you to run everything through sudo) an imposition on your freedom as well? Or is it only barriers which require a restart to disable?

In addition, no Intel Macs have any means of operating system lock-in. You can run Windows, Linux, or anything else that can handle standard x86 hardware. So you seem to have full control over the hardware at least (except maybe the SMC, which is not easily modifiable on standard PC hardware either). And given the fact that the Darwin kernel is open source[1], you are free to patch out SIP entirely and run that instead.

[1]: https://github.com/opensource-apple/xnu


> Or is it only barriers which require a restart to disable?

Yes, that's what I meant by "nonstandard." Apple has invented a brand-new access control mechanism to keep people from tinkering with files that reside on their own computers. For now you can work around this feature. Once they close that particular gap, I'm guessing we'll still be able to use a debugger to get the job done... and I'm guessing you'll move the goalpost accordingly.


This is actually a battle that will soon merge with the primary frontline of the War Against General-Purpose Computing[0]. Windows 8 already has some things that you can't do without restarting and going through scary recovery screen with additional layers of security - in particular, installing unsigned device drivers. I predict the gap will close from both software and hardware side as Trusted Computing starts getting popular, finally sealing your fate as a leaser of computing device, not the owner.

[0] - It's not really "coming" anymore.


> installing unsigned device drivers

So, probably one of the most dangerous things you can do (opening the door to rootkits and an infected BIOS) doesn't take just a click and a nondescript warning dialog? Also, this has been around since at least Windows 7; it's not evidence of us sliding down a slope.


I didn't have to reboot a Windows 7 machine and go through the recovery interface to find an obscure option that needed to be toggled and authorized in order to do so.

Yes, rebooting Windows one time to do it isn't that bad, but it's a start. I won't be surprised if the next version of Windows (next after Win10) removed that feature completely.

Forcing users to run only signed drivers is a problem because only Trusted Vendors can get proper certification (after paying proper amount of $$ to Trusted Certification Providers). Such restriction will basically kill hardware tinkering, which is one of the goals of the War on General-Purpose Computing.


Agreed, it's not really fair of me to single out Apple for criticism. I was just reaching for a recent example of something that can't possibly be justified on any grounds other than the usual "For the children" and "For the OS vendor."


I didn't read it as singling out Apple, just wanted to share another example that you may not have heard of and that I remember vividly, after losing some time and patience trying to program a TI Stellaris LM4F120 using a Windows 8 laptop.


> Yes, that's what I meant by "nonstandard."

It's slightly more difficult than turning off UAC on Windows. In addition, they didn't even try to hide the ability to turn it off; it's available in the documentation [1].

The only way they could seal the more difficult workaround I detailed above would be to either:

1. Rewrite their kernel so they don't have to open-source it. In this case, I have no doubt the Hackintosh community will have a patched version within a week. Goalposts moved, I guess. Note you can still edit those files by restarting in another OS, which shouldn't be a problem for anyone who knows enough to be messing with such things.

2. Modify the hardware so it can only boot signed OS X. At this point, SIP is pretty irrelevant to the issue, and there doesn't seem to be any strategic advantage in even having it.

Also, I'm not sure what the motive is here. I can still install and run whatever binaries I want, signed or otherwise. I have El Capitan and I'm even using a custom shell. If Apple planned to require signatures for even command line programs to further their evil anti-Turing plan, SIP would again become irrelevant. So what slippery slope are we sliding down?

[1]: https://developer.apple.com/library/watchos/documentation/Se...


> Once they close that particular gap, I'm guessing we'll still be able to use a debugger to get the job done...

Not if the Trusted Computing Group has their way; it's already happening with things like Secure Boot. Hopefully things will change for the better, before we get to outlawing debuggers[1]...

[1] http://www.gnu.org/philosophy/right-to-read.en.html


Desktop apps aren't sandboxed.


Thank God for that. That makes them actually useful.

Personally, I find sandboxing not as something desirable, but a symptom of a problem with mobile/web ecosystem.


How is sandboxing directly relevant to the usefulness of an application?

It can be relevant to some applications: generally applications that are made more useful by being able to infer things about the global state of your OS. E.g., an unarchiver that wants to default to the current directory instead of asking you each time.

But certainly not all. And not even most.


The majority of desktop apps that I use on OS X are sandboxed (from a quick glance: Chrome helpers, Wunderlist, Microsoft Office, Photos, Airmail). I haven't experienced any functionality downgrade since pre-sandboxing times.

As long as iTerm runs without sandboxing, I am happy ;).


Desktop browsers, on the other hand, have always had this feature blocked by default. Apps on a phone came after the internet and after web browsers, and should hence have learned from all the mistakes we made when writing desktop OSs and programs. There is no reason why every program should have automatic access to the clipboard; if you want your download manager to work, then give it explicit permission.


The app doesn't send anything to the server:

https://twitter.com/alexstamos/status/655459585642225664


So says a random tweet.

According to Facebook's Privacy Policy [1], "We collect information from or about the computers, phones, or other devices where you install or access our Services, depending on the permissions you’ve granted." Since obviously you've granted it permission to access the clipboard of your device and you've agreed to this Privacy Policy while installing the app, Facebook can legally collect, store and sell the contents of your clipboard.

[1] https://m.facebook.com/about/privacy


Says a random tweet from Alex Stamos, who is very well respected in the security community...Oh...and Facebook CSO.

Looks like the FB engineers just take the pasteboard string and check for "https?://"[0] at position 0 in the string.

[0] - http://i.imgur.com/AhIpWDe.png
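For what it's worth, the check in that screenshot amounts to a one-line prefix test. A rough Python sketch of that logic (the actual Facebook code isn't public, so the function and variable names here are made up for illustration):

```python
import re

# Treat the pasteboard contents as a URL only if the string starts
# with an http:// or https:// scheme, as the screenshot suggests.
URL_PREFIX = re.compile(r"^https?://")

def looks_like_url(pasteboard_text):
    """Return True if the string begins with an http(s) scheme."""
    return bool(URL_PREFIX.match(pasteboard_text))

print(looks_like_url("https://example.com/article"))  # True
print(looks_like_url("my secret password"))           # False
print(looks_like_url("ftp://example.com"))            # False
```

Note that a prefix check like this never matches a copied password or email body, which is consistent with the claim that only link-shaped strings get special treatment locally.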


It's great to hear from Facebook that we can trust Facebook. /s


I guess it's good to be cynical at times, but do you really think the tweeter, and indeed Facebook, would put their reputations on the line to gather such small, insignificant information?


I doubt he's lying, but you have to trust that what he's saying is accurate and stays accurate. Some helpful dev that doesn't get the privacy issues could switch it to process the clipboard on the server, or accidentally add some logging code which happens to log that value remotely.

I'm more concerned that iOS gives this ability without any warning or notification to the user, because even if Facebook is virtuous (now) about this, other future apps may be less so.


It may seem small and insignificant to us, but how much do you think advertisers would pay to see the link that I just pasted into an iMessage, or the contents of the email I just copied?


They would pay significantly less than the billions of dollars Facebook stands to lose if they were found out to do this.


How could this cost them billions? They do lots of things that are pretty gross. Does this cross some sort of line they haven't crossed before?


Wasn't he only just recently CSO (or similar) of Yahoo? What's the story there?

EDIT: found this: http://www.siliconbeat.com/2015/06/24/yahoo-loses-security-c...

Where it is written:

> Stamos also won plaudits this spring at Austin’s South by Southwest festival when he revealed the company will roll out end-to-end encryption for Yahoo Mail by the end of the year.

Except, I don't believe Yahoo ever did "roll out" end-to-end encryption for email?


A Google search or two leads me here: https://github.com/yahoo/end-to-end


Thanks, that's a nice repository. I see that like its parent (Google's end-to-end), it is collecting dust for 99.99% of users.


While I wouldn't consider this very active, it's certainly not collecting dust.

https://github.com/yahoo/end-to-end/commit/8bf5dca239bb1df3f...


It is collecting dust for 99.99% of users. You've linked me to a commit of a merged pull request. That is not an indication that any of Yahoo's users are using this code (unless you count the developers). Maybe Yahoo will actually make their end of year deadline for "all users" [1], we'll see. They have less than two months left to do so. My knowledge of Google's end-to-end attempt leaves me skeptical, and not because this is any sort of a technical challenge, but because it's a political challenge.

[1] http://yahoo.tumblr.com/post/113708033335/user-focused-secur...


Can't make a coherent response, so you hit the downvote button out of frustration. lol.


> random tweet

From the Chief Security Officer of Facebook....


And the CEO of a car company recently said "This was a couple of software engineers who put this in for whatever reasons."


Cool. If FB feels like making a public commitment to financial liability in the event it's discovered that they violate this, it might almost be worth believing.


...whose twitter profile says "All tweets are my own blah blah blah [sic]"


But there's no way to verify that. Perhaps the OS should show something in the status bar when an app accesses the clipboard.


I think it's a weakness in the iOS security architecture. Apps shouldn't be able to access the clipboard by default. iOS should provide an API to invoke a context menu with a Paste item, and that menu should be protected from app intervention, so the user can paste the clipboard contents anywhere but the app can't force them to do so.

And if an app provides some extended functionality for which it needs to constantly monitor the clipboard (I've never used such an app), it should require a special request to the user and a special flag in the plist file, so a reviewer can check it.
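Something like the proposed model could be sketched as follows. This is a purely hypothetical Python illustration with invented names, not any real iOS API: programmatic reads of the pasteboard fail unless the user has explicitly granted that app access.

```python
class PermissionDenied(Exception):
    pass

class GatedPasteboard:
    """Hypothetical pasteboard that gates programmatic reads per app."""

    def __init__(self, contents=""):
        self._contents = contents
        self._granted = set()  # app ids the user has approved

    def grant(self, app_id):
        # In the proposed model this would only ever be triggered by a
        # user-driven prompt, never by the app itself.
        self._granted.add(app_id)

    def read(self, app_id):
        if app_id not in self._granted:
            raise PermissionDenied(f"{app_id} may not read the pasteboard")
        return self._contents

pb = GatedPasteboard("hunter2")
pb.grant("com.example.passwordmanager")
print(pb.read("com.example.passwordmanager"))  # hunter2
try:
    pb.read("com.example.socialapp")           # raises PermissionDenied
except PermissionDenied as e:
    print(e)
```

User-initiated pasting would bypass this gate entirely, since the system-owned Paste menu runs outside the app; only silent background reads would need the grant.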


UPS does the same thing, scanning for tracking numbers. I don't find it particularly creepy, but it probably should be a privacy setting just like location, camera, etc.


Yes on a new privacy setting. Bonus points for having policy settings that depend on both source and destination, e.g. a custom policy for a string that originates in a password manager.


And then let's default it to deny access, and suddenly we've just destroyed the clipboard.

I don't like the direction where this is going. We've created an adversarial situation, where software developers go against users and against each other and then instead of fixing that, we keep destroying features of cooperation and interoperability in the name of "security" and "privacy".


The clipboard is not there for apps to help themselves to whatever the user does; it is for the user to use directly. If apps do want to programmatically access the clipboard, I don't see why we shouldn't force them to ask first.


I don't see how that would destroy the clipboard any more than having a permission dialog when an app wants to access my camera? Every social app out there wants to access my contact list in the name of "cooperation and interoperability." I like having the choice of whether or not they get to access that data.

I wouldn't mind if I opened an app and got a one-time dialog "This app is trying to access your clipboard. Do you want to allow it?"


the iOS / OS X sandbox model of file interaction is a good compromise for a similar situation imo.

sandboxed apps may only access a file outside of the sandbox via an explicit user interaction with a system UI element, i.e. drag-and-drop a file via OS X drag-and-drop API, select a file in iOS / OS X owned file dialog, share sheets, etc.


The suggestion is to prevent apps reading the clipboard, not to prevent the user pasting their clipboard contents into the app.


How is this not creepy? Apps shouldn't be sniffing around in my data unless I tell them to.


Instapaper also does it (with URLs, suggesting adding them).


People who use password managers tend to use the clipboard to copy the passwords around. I assume those passwords are also available in these apps?


That is kinda worrying, though I don't use that functionality myself. Is there another way that passwords could be passed between apps without using the clipboard?


Have the password management app emulate a custom keyboard?


Keepass2Android does exactly that, and additionally auto-fills password and confirmation fields when you focus them with the custom keyboard active. Very handy.

(https://play.google.com/store/apps/details?id=keepass2androi...)


iOS disables custom keyboards for password fields, which is a great idea in most scenarios. This is what prevents Agilebits building a 1Password keyboard extension.


Maybe there could be a way to negotiate a secure channel between a browser and a password manager. Say, when setting up, the browser gives you a key to retype into the password manager, and that key is then used to encrypt the password before placing it on the clipboard.

Adds one more step for setting things up but saves the clipboard as a whole. People using password managers are already people willing to jump through some hoops to set things up, adding one more hoop shouldn't make things worse.
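For illustration, here's a toy Python sketch of that idea, assuming a pre-shared key typed in during setup. This is not production crypto (a real implementation would use a vetted AEAD cipher); it only demonstrates the flow of sealing a password before it touches the clipboard and unsealing it on the browser side.

```python
import hashlib
import hmac
import os


def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a keystream by hashing key + nonce + counter (toy construction)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]


def seal(key: bytes, password: str) -> bytes:
    """Encrypt-then-MAC the password before it is placed on the clipboard."""
    nonce = os.urandom(16)
    data = password.encode()
    ct = bytes(a ^ b for a, b in zip(data, _keystream(key, nonce, len(data))))
    tag = hmac.new(key, nonce + ct, hashlib.sha256).digest()
    return nonce + ct + tag


def open_sealed(key: bytes, blob: bytes) -> str:
    """Verify the MAC, then decrypt (the browser side of the channel)."""
    nonce, ct, tag = blob[:16], blob[16:-32], blob[-32:]
    expected = hmac.new(key, nonce + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("clipboard blob failed authentication")
    return bytes(a ^ b for a, b in zip(ct, _keystream(key, nonce, len(ct)))).decode()
```

With a scheme like this, anything else snooping on the clipboard would see only an opaque, authenticated blob rather than the plaintext password.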


At least 1Password on iOS can fill logins directly into Safari. It is (somewhat confusingly) listed as an action when you click the "share" icon.


1Password has a TTL setting to clear passwords copied to clipboards. They've thought this through. Kudos.


Probably doesn't work anymore. As conradev notes[0], applications used to have clipboard access when running in the background (so after the TTL expired, 1Password could check whether the clipboard still contained the password and clear it even while inactive), but that's no longer the case in iOS 9: only the currently active application can access the clipboard. So unless Apple added TTLs to clipboard items (which UIPasteboard's documentation doesn't indicate), it won't work anymore.

[0] https://news.ycombinator.com/item?id=10411166


With Protect My Privacy (http://apt.thebigboss.org/onepackage.php?bundleid=org.protec...) you can control which applications can access your clipboard (and many other things) on jailbroken iOS.


Pretty common to do this for apps where people might want to copy/paste into the app.


Why? If I want to paste, I tap and say paste.


It's a better experience to show a popup asking to save a clipboard URL to Pocket rather than making the user tap a few buttons to do it. The key is to let them switch the feature off.


The ideal user experience would be for the app to forward all my emails, texts, and other personal data to an employee so that the employee could read them and make relevant suggestions.

But I don't want that either.


Some do. It's usually called having a secretary.

We must be careful to pick sane defaults; just shutting down everything in the name of "privacy" is not a way to build useful technology.


Nobody expects or wants Facebook to act like a secretary.


If you're going to Pocket, it's likely that pasting from your clipboard is what you want to do. That's not as true for Facebook: would you say you post a link from your clipboard as often as you launch Facebook to read your timeline or post a picture?


The Pocket app does this on Android to streamline the article adding process.


Google Chrome does the same when you are trying to enter a URL.


As does the Pinner app (for Pinboard) to make it convenient to add bookmarks via copying the current URL.


As does Pinterest.


> what can we do? How about stop using this fucking terrible app?


Same on Android: my Japanese dictionary checks the clipboard for automatic searches. I was surprised that it didn't require any permissions.

https://play.google.com/store/apps/details?id=com.cal.pas.ul...


Pocket on Android can also do this. I never thought of it as a problem, but as functionality that facilitates sharing URLs from apps that can't share URLs to other apps on my phone but can copy the URL.


Pocket for iOS does this too.


Facebook is simply an evil company that you should loathe just like any other malicious operator on the internet. What the heck is wrong with people that everyone would be so smitten with the psychopaths at Facebook. How would you feel if any number of malicious actors did any number of the horrible things Facebook does on a regular basis?

I can tell you, you would be pissed off. But instead, you simply feed ever more information into the Facebook global surveillance system. Good job.


My Japanese dictionary has auto look up, so if I copy a word from a website and open the dictionary app, it will look it up automatically.

Or as the website claims: it scrapes my clipboard.


Instapaper has been doing this for a while.


The Plex app accesses your clipboard every time you open the app to check if what's in there is a link. If it finds a link it asks you if you want to add that link to your Plex Queue.

This functionality is used to add videos from the web to your Queue for later viewing.


Does that happen even when the Facebook app is not active? Does the content get uploaded?


I like this feature and don't think it is creepy. All they are probably doing is regexing for a URL and, if one matches, showing it there. Sure, they could see some other random thing I have copied, but there is no incentive for them to do anything with that information.
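That kind of check is cheap to implement. Here's a minimal sketch in Python for illustration (a real iOS app would more likely use a native detector such as `NSDataDetector`; the regex below is deliberately rough):

```python
import re
from typing import Optional

# Rough URL matcher; real apps use stricter detectors.
URL_RE = re.compile(r"https?://[^\s]+", re.IGNORECASE)


def suggested_link(clipboard_text: str) -> Optional[str]:
    """Return the first URL found in the clipboard text, if any."""
    match = URL_RE.search(clipboard_text)
    return match.group(0) if match else None
```

If this returns a URL, the app can show its "post this link?" prompt; otherwise it ignores the clipboard contents entirely.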


And Gmail scans your emails in a functional way as well...


Every third week there is a new privacy or security problem.

How do we solve these issues once and for all? Is it on the OS level, or consumer education, or must the government intervene?


[deleted]


How about any url or other data that I happened to copy and didn't intend to share with facebook? It's not at all difficult to think of such a situation.


Other stuff I'm likely to copy and paste is content I can't remember: things like passwords.


I removed the facebook app from my phone a long time ago. I do not intend to put it back.


Thanks for telling us! We all wanted to know.


Flipboard used to scrape it before Facebook did, so this is not new.



