Quick note: The report makes it sound like /bin/ls is being given special privileges. That would be reminiscent of many past macOS security issues: processes are treated differently based on their code signature and entitlements, and sometimes that has unexpected consequences.
But that's not the case here. /bin/ls has no entitlements. And if I modify the sample project to just call stat() directly rather than invoking ls, it still works.
So it's really a kernel issue where for some reason filesystem metadata is not being protected as much as the actual data.
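For reference, here's a minimal sketch of that stat() variant (this is not the report's actual sample project; the path is just one example of a normally protected location, and the output formatting is mine):

    /*
     * Minimal sketch: query metadata for a path inside a TCC-protected
     * directory from an ordinary, unentitled process.
     */
    #include <stdio.h>
    #include <stdlib.h>
    #include <sys/stat.h>

    int main(void) {
        const char *home = getenv("HOME");
        if (!home) return 1;

        char path[1024];
        snprintf(path, sizeof(path), "%s/Library/Safari/LocalStorage", home);

        struct stat st;
        if (stat(path, &st) == 0) {
            /* Metadata (size, timestamps) comes back even though reading
             * the actual contents would be blocked. */
            printf("%s: %lld bytes, mtime %ld\n",
                   path, (long long)st.st_size, (long)st.st_mtime);
        } else {
            perror("stat");
        }
        return 0;
    }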
> That would be reminiscent of many past macOS security issues: processes are treated differently based on their code signature and entitlements, and sometimes that has unexpected consequences.
Hmm, I wonder if this is the root cause of something my friend group found in high school. We had Macs that were locked down, and I think it was something the system did rather than third-party software, but I could be mistaken. Pretty much you could only launch certain applications and you couldn't edit any preferences/settings for the system. Being kids, we wanted to play StarCraft, so I brought in a bunch of copies to play during our free period. Unfortunately you couldn't launch the StarCraft app. I still don't remember how we even figured it out, but you could go into Safari and change the default browser to Terminal, then open up a Word doc, type a link, and click it (or do anything that would cause the system to open a link when you weren't already in a browser). That would launch Terminal, but with a level of permissions you couldn't get by launching it directly (or maybe it was that we couldn't even open Terminal directly, it's been a while). Once you had this "system-permissions Terminal" launched, you could type "open /path/to/Starcraft/launcher" and boom, StarCraft would launch and it was off to the races.
Good times. Our teacher questioned how we could play games and whether it was allowed, but was placated with the explanation "The computers are all locked down, so the ability to play this game means it must be OK to play". A bit of circular logic and misdirection, but this was still in a period when most teachers were woefully behind on technology and how it worked.
Oh man, this reminds me of my own experience with early computers in the classroom. Back in those days there were two computers in each classroom: one for the teacher and one for the students to share. Generally, students never used the shared computer, because what's the point when there's only one? My AP Calc teacher's "student" computer was broken. The school IT department couldn't be fussed to fix it, so we asked if we could, and we did. As a reward for fixing the computer, we were allowed to use it. We played GTA (the original) every day. One person was back there at a time during lecture. Your goal was to find a flamethrower and a chain of joggers, whom we called the school children. If you killed all the joggers in the chain you got a big bonus. You can see where this is headed. Once you got the flamethrower and the school children on the screen, you announced it to the class, lecture would stop, you'd flamethrower the school children while everyone watched, and then you traded off to the next person and the lecture continued. All but one person passed the AP exam that year.
Ahh yes, I helped fix computers throughout my time in primary education, to the point that I'd sometimes get pulled from classes to help. I had cemented my "status" back in elementary school as a "computer genius" because I knew how to mount the network drive and then open and save files to it (which led to me being tasked with helping everyone else, of course, lol). I used to take every single chance to get my hands on a computer, and it paid off: these days I get paid to do it.
Not related to Macs, but coincidentally StarCraft was also the first game I ever "cracked". ^^
For whatever reason, the copy protection was not recognizing my game disc (apparently it only worked on Windows 95 and not 98, as I found out later).
This was my most anticipated game yet, so I made myself learn Windows/PC debugging on the spot — without the internet — which basically amounted to single-stepping through every line of disassembly in Visual Studio until the disc error message appeared, then working backwards from the very last "jump" instruction, flipping the condition of each jump (I think it was JZ to JNZ or vice versa), until I finally found the 2 bytes (or was it 4?) that took me to the blessed menu music I can still recall. :)
Of course I had nobody to show my achievement off to, and it wasn't even a moment of pride or anything, just relief and sheer happiness as I was about to get lost in what would become one of my favorite games of all time.
(P.S. I hate what they did to the story in StarCraft II)
Impressive! Broken video games motivate a lot of learning.
When I was a teenager, we had three game-capable PCs, but only two had LAN cards. I also had an underpowered LAN-connected Linux machine. I connected the non-LAN PC to the Linux box with a parallel cable. Linux could route packets between the LAN and parallel-cable network. But DOS games find each other with local broadcasts which don't forward. No game had a function to specify a network address to connect to. I needed to bridge the networks. Linux could bridge ethernet, but the parallel network wasn't ethernet. So I copied the source of a kernel module and modified it to bridge IPX packets between the LAN network and the parallel-cable network.
It worked! My friends and I could play THREE-PLAYER games! DN3D, C&C Red Alert, Quake, Descent, Terminal Velocity, etc. Network drive sharing even worked. It was glorious.
Hahaha this is great. Reminds me of my first suspension from school when I used MS Word hyperlinks to get to all the drives "hidden" by the network admins. This included the homework mailbox drives, so naturally I found my least favourite teacher's inbox, hid the folder with the work in it and then created a pair of links pointing to one another. Good times. I wouldn't have been found out if I hadn't removed the graphic for the login screen and replaced it with the Christmas version in May... and then bragged about it when everyone in my class noticed.
Bragging about a very similar thing was my downfall back when I was much younger.
In middle school I had this weird idea to collect everyone's ID number. It, coupled with your name, would log you into everything on the computers. To this day I don't know why I wanted this info other than to have it. I never once used it for any purpose; I think I tested 1 or 2 logins but never touched any files. I had a HyperStudio stack (saved to my network drive) that had hidden buttons and a certain sequence you had to press them in to get to the "database" (just a text entry field where I saved one name and one ID number per line). It was painfully easy to collect the numbers, as most kids had their class schedule on the outside or inside of the binder they carried around. The ID number was only 6 or 8 digits, so it was easy to memorize, write down, and store in HyperStudio later.
But alas, stupid younger me thought it would be a good comeback to rattle off someone's ID number when they were picking on me one time, which led to a 3-day in-school suspension and loss of computer privileges till the end of the year. They made me show the IT guy where I had stored the numbers (how to navigate my HyperStudio project), and phrases like "hacking" and "hacker" were thrown around, even though this was literally equivalent to writing the numbers in a notebook; because I had used a computer to store the data, it became a way bigger thing in their minds. Even "funnier" (not to me at the time): I had a friend who helped me collect the numbers (again, this was stupid easy, and it felt like a fun game to figure out how to get them and who could collect more the fastest) who got a lighter punishment and didn't lose computer access.
Fast forward to high school, and I ended up writing two different PHP-based apps for the school. One was a library attendance program that teachers used to record that they were sending kids to the library, which the library could see (so the kids didn't just skip school, I guess? Or goof off in the halls), and which kept track of who was in the library and how long they had been there. I also wrote an online voting platform the school could reuse for things like Homecoming court/Prom court/Senior superlatives/etc. The reason I bring up both of these? The high school gave me a massive CSV of all the students in the school... and their ID numbers, to be used for login to the platforms. I still get a good chuckle out of that.
It's funny how changing grades is a pretty common trope, but it was surprisingly not that hard to do back in the old days. I remember reporting an issue where the school district had their reporting tool just open to the internet.
I don't know how I didn't get in trouble for all the snooping around I did.
You could also open restricted system preference panes by searching for a relevant term in Spotlight and going to a User Guide article. Often they would have a link to open the preference pane which would bypass any restrictions.
Hmm, let's see. It would have been 2004-2008 IIRC. I think I did it at the tail end of that time period, though. I want to say that it was on the gumdrop shaped macs but I know the school got some of the first iMacs around that time as well. The new iMacs were awesome to me because I could carry around an external hard drive and boot the newest macOS (Mac OS X at the time) off it. That meant I had root, and that coupled with a proxy gave me a fully unlocked computer.
> I want to say that it was on the gumdrop shaped macs but I know the school got some of the first iMacs around that time as well.
The "gumdrop shaped macs" were the first iMacs and were released in 1998 (I remember this well because that was around the time I worked for a publishing company so had to deal with MacOS 8 and 9 a lot as well as wiring a gigabit Apple Talk network (at the time that was very futuristic).
I had similar tales of exploiting my school network, though it was Windows 3 and I was playing Wolf3D loaded via a program called something like "Object Manager" that allowed you to embed data into winword (might have been related to OLE?). Those machines were null terminals, so the game was installed into my user area. Unsurprisingly I got caught, but thankfully I had deleted the executable just moments beforehand, so I only had to make an excuse for the WAD files.
At college I upped my game and wrote a RAT, which I installed on every PC on the network. I actually managed to get away with that one, albeit with a couple of near misses. One time I got caught because some mates sitting next to me were playing games. When questioned about what I was doing, I confessed to the lesser crime of also playing games, because writing malware would surely have seen me suspended (or worse) rather than just having my IT privileges revoked for 24 hours! That college did eventually find the RAT on the network, but only after I left, and they assumed it was someone else. It wasn't until my brother got a job at the college IT department ~5 years later that they realised it was me who had installed the software.
> The "gumdrop shaped macs" were the first iMacs and were released in 1998
Ahh, my bad. This was early-days for my "paying attention to macs". I only used them at all because that was all the school had, I was a die-hard, PC-master-race, build-your-own-computer, windows user at this point. So yeah, we had the gumdrop shaped iMacs and then we upgraded to the chunky white-bodied-on-a-stand iMacs. We did have a few Mac Pros in the library (for video editing) and in the shop class (for 3D modeling), the cheesegrater style ones.
In a way, the ever-increasing restrictions during my final year at school pushed us into exploiting various flaws in their setup for a couple of reasons. Primarily, they were arduous - by the middle of the year, any window with a title containing certain strings, even ones as innocuous as "Firefox", would be closed automatically without warning. It got in the way of legitimate activities - a number of teachers also found ways to avoid them as sites they needed were often blocked. It was also interesting to keep having to find new ways to get around it ("CGI proxies"[1] found via Google -> self-hosted proxies -> wildcard domains to bypass filter lists -> access via IP and random port -> local admin exploit to disable protection/monitoring software).
In the process, we discovered that the security was rather inadequate. A VNC server was installed on all machines, including staff machines, with the very imaginative password of "vnc" (not hard to guess once you see a member of staff typing in a three-character password), and we shoulder-surfed a domain admin password that turned out to be just "school". This was later changed[2], but we bruteforced a cached hash[3] and found it was just the name of the school with a '0' in place of an 'o'. We had a 'shadow' domain admin account for _months_ before it was noticed, even after the staff were aware people were poking at holes in the system (someone else had sent a Windows Messenger service message to the entire domain around the same time).
We never really used it for anything, though - we created the domain admin account to see if we could, then it basically went unused after that. We only got caught after someone else used a script to change the local admin password on every computer (I'm still not entirely sure why). It did provide an interesting lesson in OPSEC, though - it was only tied back to us because they were tracking USB device names, and someone had called their USB drive "<surname> USB" and still had it connected when logging into the domain admin account.
The punishment was to spend a week working with the IT technicians (mostly doing busy work such as cable managing rooms and tracking down serial numbers/asset tags), which gave us plenty of time to fully explain the flaws we found. I think they took security more seriously after that.
[2] we had no malicious intent, so upon realising that gave you read/write access to everyone's files, we left an anonymous note containing the login details at the IT technicians' office hoping they would improve things. Some of the teaching staff were also aware, and their only advice was essentially "Don't get caught" (and one asked for a copy of the Ophcrack live CD).
[3] Booted from an Ophcrack live CD, something that was "fixed" by removing the CD drive from every machine in the school
You need the r permission to read a directory, in the sense of getting a list of filenames. But the x permission lets you access files and subdirectories if you already know the filename. Of course, you need r permission on the file itself to read it, but you can always stat() it.
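To make the distinction concrete, here's a quick sketch (the paths and the chmod 111 setup are hypothetical, just to illustrate a directory with x but not r):

    /*
     * Hypothetical setup:  mkdir /tmp/xonly; touch /tmp/xonly/known.txt;
     *                      chmod 111 /tmp/xonly
     * Listing the directory fails, but stat() on a known child path works.
     */
    #include <stdio.h>
    #include <dirent.h>
    #include <sys/stat.h>

    int main(void) {
        DIR *d = opendir("/tmp/xonly");       /* needs r: expected to fail with EACCES */
        if (!d) perror("opendir");
        else closedir(d);

        struct stat st;                        /* needs only x on the directory */
        if (stat("/tmp/xonly/known.txt", &st) == 0)
            printf("stat ok: %lld bytes\n", (long long)st.st_size);
        else
            perror("stat");
        return 0;
    }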
Indeed, I actually meant "read (or check) a specified path/directory status entry" rather than "read a directory" but probably hit submit too quickly.
What this "privacy protections bypass" is doing looks like the former rather than the latter, and it seems like normal behavior if you have x/stat permission.
It could be that Apple's sandbox blocks r/readdir permission but not x/stat permission for some reason.
> I chose the example of ~/Library/Safari/LocalStorage because Safari names the files in this directory according to the web sites that you visit! Also note that the output of long format ls -l contains the last modification date of the files. Thus, one possible privacy violation from this technique is to learn the user's web browsing history.
It's a pretty serious issue if any random app can read your browsing history. Even more so that Apple hasn't fixed it more than a year after the author reported it.
No, not any random app. You knowingly chose to install the app. It either came from the App Store or it was notarized by Apple.
There are just so many reasons why software needs to access your hard drive. My app, for example, needs to write files in ~/Library/Application Support/Chrome in order to add native messaging permissions for my extension. Can you imagine the number of "Karens" that are going to email me because they "caught" me trying to "steal their data" if Apple adds restrictions to this folder?
Apple did the right thing by only adding warnings for more sensitive areas like your Downloads or Documents folder, but any more than that and I think it'll cause more harm than good.
I agree with the blog post. Apple seems to be more focused on "security theatre" right now (or at least half-assed security for the sake of marketing). They do things like add easy-to-implement (via their FileManager class) file access warnings to appease most non-technical users, but at the same time they ignore bigger looming threats like apps accessing the Internet. I think the issue isn't the warnings, it's what the warnings are about.
Anyway, my guess is that Apple will be adding network access warnings in the future (since it seems they rewrote a large chunk of the networking code recently), but let's not deny the two-faced marketing speak going on right now and the fact that they do things like making it impossible to inspect network traffic from Apple apps. The hand-wavy "trust us" argument shouldn't work for Apple either. Why do I have to trust Apple more than a third-party developer?
> Apple did the right thing by only adding warnings for more sensitive areas like your Downloads or Documents folder, but any more than that and I think it'll cause more harm than good.
Sorry, I didn't mean to argue for apps to be able to read your browsing history.
The main point I was trying to make was that apps having network access without warning is more of a security/privacy issue than apps being able to read local files without warning. It's probably why Little Snitch became so popular and why I think Apple is in the process of shoving them out of the market by building it into the OS (I'm guessing!).
I think they are implying Apple can't control the sensitivity of third-party tools across the board, so it's up to Chrome to figure out how to protect your browsing history, and they need to improve their file system layout or APIs to protect their users.
If that is a serious issue, it says a lot about how the goalposts have moved over the last decades. For most of that time, the best we could expect was that every program would be able to read all your files.
I don't necessarily have an issue with the idea that every app I install can read all my files. It's not great, because it forces me to place a lot of trust in every application, but if the alternative is compromising what those apps can do, or bombarding me with security prompts... well, I can see both sides of that argument.
However, if we are going to go down the path of compromising functionality and adding lots of annoying prompts, then that strategy had better actually work!
If despite all of these annoyances apps can still read my browsing history, that means I also still need to trust every application I install, so I'd definitely prefer we just went back to where we started.
On unix systems that still use a 1970s permissions model, this is true, but it's been fixed on mobile devices for ages, and macOS desktops for over a year (this bug notwithstanding).
Ransomware (enabled by bitcoin payments to anonymous recipients) really changed the game on desktop in the last few years. Apple stepped up, but there's crickets on the matter in Windows- and Linux-land, aside from the people who have been containerizing their desktop apps[1].
From early on, Android had strict separation of files across apps. I would think users are generally aware and expecting that apps do not have full access to all of their data.
Just out of curiosity, is there an allowed way to access directories like this? Sudo doesn't let me see the contents of the ~/Library/Safari directory. Can non-Apple programs access this, like backup programs? Can I bless a script to access this? Can Mozilla or others import bookmarks from Safari any more?
The report shows that given a path, any app can access file metadata or learn that no such file exists outside the sandbox. It does not show access to file data itself. The Safari specific part of the report is data leak caused by the fact that Safari embeds sensitive information (URLs) into file metadata (name).
I believe you can access those directories by either enabling Full Disk Access for the process of interest or by disabling System Integrity Protection.
"I continue to believe that macOS "security" is mainly theater that only impedes the law-abiding Mac software industry while posing little problem for Mac malware."
Unless we use a more specific term such as "user security" I just assume "security" means company security -- the protection of Apple Inc.'s business.
One can argue that the security of the business of Apple Inc. benefits its enthusiastically supportive customers. Thus it is possible to conflate "Apple security" with "user security" under the ambiguous term "security". However, I think these two concepts are often not interchangeable, and at odds with each other. The user is not the corporation, nor vice versa; they are separate beings with different interests. Neither can speak for the other. Apple now "secures" its BSD-derived OSX from unauthorised software -- unauthorised by Apple Inc., not necessarily unauthorised by the user. As the author notes, this restriction guards against ("impedes") not only malware but all third party software.
The same applies with respect to "privacy". Under Apple's definition, there is no such thing as privacy from the company. It is as if the company and the customer are viewed as the same person. Employees of Apple are under strict obligations of confidentiality to the company, but they are under no duty of confidentiality to the customer. When an Apple employee discloses secrets of Apple, Apple can enforce its rights and the employee's obligation via the courts. When Apple discloses the secrets of a customer, the customer has no applicable rights or obligations it can enforce against Apple. Instead, we have seen privacy "theater" as Apple protested, via the courts, against aiding in disclosure; purportedly this was done on behalf of the customer. The truth may be that Apple was acting on its own behalf to protect the business of Apple, Inc.
> Under Apple's definition, there is no such thing as privacy from the company.
Where is Apple violating customer privacy or confidentiality? They have, arguably, the largest end-to-end encrypted messaging system, store user backups and data that they cannot access, are leading in mechanisms to prevent user tracking, and sell hardware specifically designed to reduce the data customers send to them.
I would appreciate these disclosures a lot more if the author didn’t always include a flippant dismissal of security architecture improvements in macOS. Yes, it’s harder to write software with sandboxing and other modern security techniques, but that doesn’t mean we should go back to how things were.
This is a serious stance of his, with a lot of serious data and arguments to back it up, from a serious engineer who has written an impressive list of Mac software both for Apple and for Apple's customers.
You did use the word serious enough to make it compelling. But the author’s biography doesn’t mean that his comment wasn’t flippant.
He’s proved that a well-behaved, codesigned app can list file metadata about files in restricted directories. He hasn’t proven that the sandbox is compromised.
You claim he has so much serious evidence, link us there. Don’t just string adjectives together.
I have great respect for Jeff, but he is one of the more outspoken complainers among Apple devs. At least he has a better basis for his commentary than DHH.
I would like Apple to not roll out BS prompts that make my life more difficult until those prompts are actually capable of protecting some of the most sensitive data on my machine.
Did I? The only other interaction I had with you recently that I can find is a discussion about Apple's security policies, which seemed fairly reasonable to me.
The biggest issue with the author is that he complains both about the controlling/locked down nature of Apple’s platforms and about any bugs that show up in that system.
I.e., his goal is to criticize Apple no matter what they do, because he dislikes the fact that they are no longer producing the kind of open system he prefers.
My thoughts exactly. If this was just an overlooked bug, which was reported to Apple and which Apple then fixed, that would be the system working as intended.
In reality, a very simple bug was reported more than a year ago, and Apple apparently hasn't cared enough to fix it. The only way I can interpret that is to conclude Apple doesn't really care about the integrity of their sandbox.
IMO, this more than justifies the author's accusation of "security theater". My browsing history is among the most sensitive data on my machine—certainly more private than anything in my Documents folder, which Apple felt the need to protect in a highly-disruptive way. I agree that it can be worth trading some degree of usability for privacy and security, but only if those privacy benefits are real. If they're not, then we're left in the worst of both worlds.
> The only way I can interpret that is to conclude Apple doesn't really care about the integrity of their sandbox.
There are many other ways to interpret it. Here is one completely made-up example that I created just now for this reply:
"Apple can't lock this down further without breaking open() calls in the majority of existing applications; therefore, they made a pragmatic choice to allow this issue to exist until their long-term roadmap plan to remove direct disk access to protected folders ships in a future macOS update; while declining to share their decision with the reporter, as is completely normal for Apple."
If you define "security theater" as "any practice that would not stand up to a human attacker", then all security is guaranteed by definition to be security theater, since all security protections will be found to have weaknesses, compromises, and design decisions that could be theoretically exploited. That definition is clearly non-viable in reality, and so all security decisions — even Apple's — will have unpalatable outcomes that do not invalidate the relevance of security.
> while declining to share their decision with the reporter, as is completely normal for Apple
This is completely normal for Apple, but that doesn’t make it OK for them to treat security fixes like product launches where they can choose an arbitrary timeline and keep the reporter hanging forever.
Yeah, it probably is; maybe it requires substantial changes in the kernel or something. The issue is that Apple never communicates this, they just sit on bugs until they fix them. This is a really poor experience for people reporting issues.
These security features are only nominally about protecting the user. Apple implements them to protect their services and platforms from competition and sells them via the privacy argument.
Does it happen to improve the security situation? Yes, for many people it does. Is it worth the cost? That's debatable, especially because of Apple's apparent apathy (and occasional hostility) towards the community.
> These security features are only nominally about protecting the user. Apple implements them to protect their services and platforms from competition and sells them via the privacy argument.
Stallman[1] and others[2] have talked about just this issue for over a decade now.
Haven't you ever thought that the flippant attitude is exactly why you are able to learn of this now (and not 10 years later, if at all)? If everyone were "policy abiding" folks who gave a megacorp the benefit of the doubt, these issues wouldn't be disclosed for years.
The flippant attitude is exactly why and how you are reading of all these vulnerabilities now.
Knowledge of the issue (but enduring "flippancy") or not knowing it at all? You pick.
What I'm really saying is that this "flippancy" is the agency that makes someone write a blog post, sign their name to it, and put it out there with code samples. Dismissing the "flippancy" insults that agency. Without that emotion, that sense that Apple wasn't treating them well, people wouldn't find the energy to publish, to publicise.
Every single word takes strength to write. In this case, the flippancy was the driving force and it shows clearly.
Why would you dismiss that energy?
And no, it's not the author's job to "shield" you from the wrath of their flippancy. I take it and I thank "flippancy" for disclosing this issue.
We shouldn’t put the burden on the bug finder to be “nice” and persistent about doing Apple’s job for them. We should put the burden on Apple - the first trillion dollar company - to take bug reports seriously. Given that iOS and macOS have pretty novice security vulnerabilities (allowing apps to view Safari browsing history, allowing iOS apps to detect if a device is jailbroken), why is it up to the bug finder to be nice about it?
This limitation of the macOS sandbox has always driven me nuts. Even with a default-deny macOS sandbox profile (much stronger than anything that entitlements or TCC can apply, but pretty close to the restrictions some Chrome/Safari processes run with), you still get an ENOENT instead of EACCES when trying to access a path that doesn't exist. I understand not applying that behavior in default sandbox profiles, but for apps that are built to run some processes in extremely aggressive sandboxes, like browsers, it would be a real benefit.
BTW, this is also how iOS apps could detect jailbreak status of the device: just try to open paths like `/var/lib/apt`, if it does not exist, it should return ENOENT; otherwise you would know this device is “not clean”.
Didn’t think the sandboxing on macOS also has this issue.
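A minimal sketch of that probing technique, for illustration (the path is the one mentioned above; the exact error a sandbox returns for an existing-but-forbidden path may vary):

    /*
     * Existence probing: if the sandbox returns ENOENT only when a path
     * truly doesn't exist, errno leaks whether the path is present.
     */
    #include <errno.h>
    #include <stdio.h>
    #include <string.h>
    #include <sys/stat.h>

    int main(void) {
        const char *probe = "/var/lib/apt";   /* present on jailbroken devices */
        struct stat st;

        if (stat(probe, &st) == 0) {
            printf("path exists (stat succeeded)\n");
        } else if (errno == ENOENT) {
            printf("path does not exist\n");
        } else {
            /* EPERM/EACCES etc.: access denied, but the path's existence
             * is still revealed. */
            printf("denied (%s) -- path likely exists\n", strerror(errno));
        }
        return 0;
    }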
I don't know how to view specific TCC restrictions, and I don't know of any way to view the full sandbox profile applied to the current app, but in some quick tests it sure looks like the core issue is that the sandbox rule Apple applies here denies reading file data while still allowing file metadata to be read.
Okay, this is a serious issue and I want this to be taken seriously by Apple.
What can I, as a reader, do? Is there someone to forward this to? Is there a person in Apple to email? Or are we hoping for a tweet storm to stir the water?
I don’t know that voting with your wallet works with the richest company in the world, especially when a lot of professionals have to have their devices to do their jobs.
Voting with your wallet is the crudest & most direct form of power that people have. In a way, it surpasses democracy. So yes, if enough people do it, it does make a difference.
The HN crowd in particular has a sizeable influence on other people with regards to technology. Because we are the techies, people ask us what they should use/buy. People observe what knowledgeable people do, and they tend to learn from it. You have more influence than you think. It just takes time to see the changes take effect.
Is there any form of democracy in practice that doesn't involve money?
> The HN crowd in particular has a sizeable influence on other people with regards to technology. Because we are the techies, people ask us what they should use/buy.
See, many of us do recommend people to buy Apple. Because they're still very much the lesser evil among the Microsofts and Googles. If Apple does go bad, it's ridiculously easy to avoid Apple completely: Just don't buy any Apple hardware. Done. Not so easy with MSFT or GOOG, which is what we warn people about.
And that's something the other side on HN can't seem to handle, so they try to bury any opposing comments to give the impression of a homogeneous echo chamber.
iCloud is still easier to move out of and never see again than say GitHub or YouTube, both of which were not owned by Microsoft or Google before but now here we are.
It is a legitimate data leakage issue and I can't say I'm surprised Apple has done nothing on it. Their speed in responding to software issues seems to be more miss than hit :/
I just have to say I had never heard of one of his products, Stop the Madness, before. A real game changer! It fixes five things that vex me routinely.
Yes, you can get various other extensions in other browsers to fix the five issues he addresses, but his extension works in my preferred browser (Safari) and it just works. I didn't have to load a custom script into some other extension or tweak around - just install and done. I've had varying success trying to overcome these five issues before, but my goodness, his extension solves them all, and so far no site has foiled it. Amazing.
>I continue to believe that macOS "security" is mainly theater that only impedes the law-abiding Mac software industry while posing little problem for Mac malware. It doesn't take a genius hacker to bypass macOS privacy protections: calling "ls" is a script kiddie level attack.
And doing something useful with it, to the level of malware? Is that also trivial?
Also, how would that "script kiddie" do that attack in the first place, if you get your apps from the App Store? If it's an independent app, all bets are off anyway. E.g. if they have a serious 0-day to do that, they wouldn't waste time with this. And they could ask the user to disable the SIP, enter root password, or whatever as well...
> And doing something useful with it, to the level of malware?
You probably couldn't use this to steal someone's bank password, but most of TCC doesn't really protect against that. An app could certainly use it to track users and target ads, since it can reveal your browsing history in detail.
That probably won't work on the Mac App Store—but the primary complaint about TCC in recent years is that it applies to all software, not just App Store apps.
This limitation also applies to the Safari and Chrome sandboxes on macOS. Being able to get metadata like this off of the system of someone who warrants buying a V8 vulnerability to attack them seems like a reasonable possibility to me.
> The only reason I was even looking for bugs here is that I could have really used the extra money, since it's difficult nowadays to make a living as a Mac developer in the face of ever increasing (and futile) macOS lockdown. Sadly, it's not very difficult to find bugs, though it's extremely difficult to get paid a bounty for them.
Prepackage the scripts, weaponize and sell them on White House Market
You incur no liability, only the people that penetrate incur some liability and only the people that use the pilfered information incur some different liability
Just drop the lone hacker idea, the black hat world functions like a corporation that dilutes and shifts liability until it is no longer recognizable and also worthless to bother with, while everyone perfects their niche and gets paid for that. The white hat world continues undervaluing and resisting market forces.