But that's not the case here. /bin/ls has no entitlements. And if I modify the sample project to just call stat() directly rather than invoking ls, it still works.
So it's really a kernel issue where for some reason filesystem metadata is not being protected as much as the actual data.
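To make the metadata-vs-data distinction concrete, here's a minimal sketch (Python rather than the C `stat()` call, but the same syscall underneath; the `~/Library/Safari` path in the trailing comment is the protected macOS location under discussion, and whether the call succeeds there is exactly the reported bug):

```python
import os
import stat


def probe(path):
    """Return basic metadata for `path` without reading its contents.

    os.stat() needs only search (x) permission on the parent
    directories, not read permission on the file itself, which is
    the metadata-vs-data distinction the sandbox fails to enforce.
    """
    st = os.stat(path)
    return {
        "size": st.st_size,
        "mtime": st.st_mtime,
        "is_dir": stat.S_ISDIR(st.st_mode),
    }


# Hypothetical macOS usage against a TCC-protected location:
# probe(os.path.expanduser("~/Library/Safari/History.db"))
```

Even on a plain POSIX system you can see the asymmetry: remove all read permission from a file and `os.stat()` still happily returns its size and timestamps.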
Hmm, I wonder if this is the root cause of something my friend group found in high school. We had Macs that were locked down (I think it was something the system did vs. third-party software, but I could be mistaken). Pretty much you could only launch certain applications, and you couldn't edit any preferences/settings for the system. Being kids we wanted to play Starcraft, so I brought in a bunch of copies to play during our free period. Unfortunately you couldn't launch the Starcraft app.

I still don't remember how we even figured it out, but you could go into Safari and change the default browser to Terminal. Then you could open up a Word doc, type a link, and click it (or do anything that would cause the system to open a link when you weren't already in a browser). That would launch Terminal, but with a level of permissions that you couldn't get by launching it directly (or maybe it was that we couldn't even open Terminal directly, it's been a while). Once you had this "system-permissions Terminal" launched, you could type "open /path/to/Starcraft/launcher" and boom, Starcraft would launch and it was off to the races.
Good times. Our teacher questioned how we could play games and whether it was allowed, but was placated with the explanation "The computers are all locked down, so the ability to play this game means it must be ok to play". A bit of circular logic and misdirection, but this was still in a period where all the teachers were woefully behind on technology and how it worked.
For whatever reason, the copy protection was not recognizing my game disc (apparently it only worked on Windows 95 but not 98, as I found out later).
This was my most anticipated game yet, so I made myself learn Windows/PC debugging on the spot — without the internet — which basically amounted to single-stepping through every line of disassembly in Visual Studio until the disc error message, then working backwards from the very last "jump" instruction, flipping the condition of each jump (I think it was JZ to JNZ or vice versa), until finally I found the 2 bytes (or was it 4) that took me to the blessed menu music that I can still recall. :)
Of course I had nobody to show my achievement off to and it wasn't even a moment of pride or anything, just relief and sheer happiness as I was about to get lost in what would become one of my most favorite games of all time.
(P.S. I hate what they did to the story in StarCraft II)
When I was a teenager, we had three game-capable PCs, but only two had LAN cards. I also had an underpowered LAN-connected Linux machine. I connected the non-LAN PC to the Linux box with a parallel cable. Linux could route packets between the LAN and parallel-cable network. But DOS games find each other with local broadcasts which don't forward. No game had a function to specify a network address to connect to. I needed to bridge the networks. Linux could bridge ethernet, but the parallel network wasn't ethernet. So I copied the source of a kernel module and modified it to bridge IPX packets between the LAN network and the parallel-cable network.
It worked! My friends and I could play THREE-PLAYER games! DN3D, C&C Red Alert, Quake, Descent, Terminal Velocity, etc. Network drive sharing even worked. It was glorious.
Nobody around me understood what I had done.
Games with more than 2 players each with their own private screen was where the PC really started to come into its own as a gaming platform.
But now I miss couch multiplayer with friends trying to yank or knock the gamepads out of each other’s hands..
In middle school I had this weird idea to collect everyone's ID number. It, coupled with your name, would log you into everything on the computers. To this day I don't know why I wanted this info other than to have it. I never once used it for any purpose; I think I tested 1 or 2 but never touched any files. I had a HyperStudio stack (saved to my network drive) that had hidden buttons and a certain sequence you had to press them in to get to the "database" (just a text-entry field where I saved one name and one ID number per line). It was painfully easy to collect the numbers, as most kids had their class schedule on the outside or inside of the binder they carried around. The ID number was only 6 or 8 digits, so it was easy to memorize, write down, and store in HyperStudio later.
But alas, stupid younger me thought it would be a good comeback to rattle off someone's ID number when they were picking on me one time, which led to a 3-day in-school suspension and loss of computer privileges till the end of the year. They made me show the IT guy where I had stored the numbers (how to navigate my HyperStudio project), and phrases like "hacking" and "hacker" were thrown around, even though this was literally equivalent to writing the numbers in a notebook; since I had used a computer to store the data, it became a way bigger thing in their minds. Even "funnier" (not to me at the time): I had a friend who helped me collect the numbers (again, this was stupid easy, felt like a fun game to figure out how to get them, and who could collect more the fastest) who got a lighter punishment and didn't lose computer access.
Fast forward to high school, and I ended up writing 2 different PHP-based apps for the school. One was a library attendance program: teachers marked that they were sending kids to the library so the library could see it (so they didn't just skip school, I guess? Or goof off in the halls), and it kept track of who was in the library and how long they had been there. I also wrote an online voting platform for the school that they could re-use for things like Homecoming court/Prom court/Senior superlatives/etc. The reason I bring up both of these? The high school gave me a massive CSV of all the students in the school... and their ID numbers, to be used for login to the platforms. I still get a good chuckle out of that.
I don't know how I didn't get in trouble for all the snooping around I did.
The "gumdrop shaped Macs" were the first iMacs and were released in 1998 (I remember this well because that was around the time I worked for a publishing company, so I had to deal with Mac OS 8 and 9 a lot, as well as wiring a gigabit AppleTalk network, which at the time was very futuristic).
I had similar tales of exploiting my school network. Though it was Windows 3, and I was playing Wolf3D loaded via a program called something like "Object Manager" that allowed you to embed data into winword (might have been related to OLE?). Those machines were thin clients, so the game was installed into my user area. Unsurprisingly I got caught, but thankfully I deleted the executable just moments beforehand, so I only had to make an excuse for the WAD files.
At college I upped my game and wrote a RAT, which I installed on every PC on the network. I actually managed to get away with that one, albeit with a couple of near misses. One time I got caught because some mates sitting next to me were playing games. When questioned what I was doing, I confessed to the lesser crime of also playing games, because writing malware would surely have seen me suspended (or worse) rather than having my IT privileges revoked for 24 hours! The college did eventually find the RAT on the network, but only after I left, and they assumed it was someone else. It wasn't until my brother got a job in the college IT department ~5 years later that they realised it was me who installed the software.
Ahh, my bad. This was early days for my "paying attention to Macs". I only used them at all because that was all the school had; I was a die-hard, PC-master-race, build-your-own-computer Windows user at this point. So yeah, we had the gumdrop shaped iMacs and then we upgraded to the chunky white-bodied-on-a-stand iMacs. We did have a few Mac Pros in the library (for video editing) and in the shop class (for 3D modeling), the cheesegrater style ones.
In a way, the ever-increasing restrictions during my final year at school pushed us into exploiting various flaws in their setup for a couple of reasons. Primarily, they were arduous - by the middle of the year, any window with a title containing certain strings, even ones as innocuous as "Firefox", would be closed automatically without warning. It got in the way of legitimate activities - a number of teachers also found ways to avoid them as sites they needed were often blocked. It was also interesting to keep having to find new ways to get around it ("CGI proxies" found via Google -> self-hosted proxies -> wildcard domains to bypass filter lists -> access via IP and random port -> local admin exploit to disable protection/monitoring software).
In the process, we discovered that the security was rather inadequate. A VNC server was installed on all machines, including staff machines, with the very imaginative password of "vnc" (not hard to guess once you see a member of staff typing in a three-character password), and we shoulder-surfed a domain admin password: it was just "school". This was later changed, but we bruteforced a cached hash and found it was just the name of the school with a '0' in place of an 'o'. We had a 'shadow' domain admin account for _months_ before it was noticed, even after the staff were aware people were poking at holes in the system (someone else had sent a Windows Messenger service message to the entire domain around the same time).
We never really used it for anything though - we created the domain admin account to see if we could, then it basically went unused after that. We only got caught after someone else used a script to change the local admin password on every computer (I'm still not entirely sure why). It did provide an interesting lesson in OPSEC though - it was only tied back to us because they were tracking USB device names, and someone called their USB drive "<surname> USB" and still had it connected when logging into the domain admin account.
The punishment was to spend a week working with the IT technicians (mostly doing busy work such as cable managing rooms and tracking down serial numbers/asset tags), which gave us plenty of time to fully explain the flaws we found. I think they took security more seriously after that.
 We had no malicious intent, so upon realising that it gave you read/write access to everyone's files, we left an anonymous note containing the login details at the IT technicians' office, hoping they would improve things. Some of the teaching staff were also aware, and their only advice was essentially "Don't get caught" (and one asked for a copy of the Ophcrack live CD).
 We booted from an Ophcrack live CD, something that was "fixed" by removing the CD drive from every machine in the school.
Is the sandbox supposed to block stat() if you don't grant an explicit permission like 'full disk access' but it isn't doing that properly?
What this "privacy protections bypass" is doing looks like the former rather than the latter, and it seems like normal behavior if you have x/stat permission.
It could be that Apple's sandbox blocks r/readdir permission but not x/stat permission for some reason.
It's a pretty serious issue if any random app can read your browsing history. Even more so that Apple hasn't fixed it more than one year after the author reported it.
There are just so many reasons why software needs to access your hard drive. My app, for example, needs to write files in ~/Library/Application Support/Chrome in order to add native messaging permissions for my extension. Can you imagine the number of "Karens" that are going to email me because they "caught" me trying to "steal their data" if they add restrictions to this folder?
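For reference, the native-messaging setup mentioned above boils down to writing one JSON manifest into Chrome's support directory. A sketch of that (Python; the host name, binary path, and extension ID here are made-up placeholders, and the manifest keys follow Chrome's documented native messaging format):

```python
import json
import os


def install_native_host(manifest_dir, host_name, binary_path, extension_id):
    """Write a Chrome native-messaging host manifest.

    `host_name`, `binary_path`, and `extension_id` are placeholders;
    on macOS, `manifest_dir` would typically be something like
    ~/Library/Application Support/Google/Chrome/NativeMessagingHosts.
    """
    manifest = {
        "name": host_name,
        "description": "Example native messaging host",
        "path": binary_path,
        "type": "stdio",
        "allowed_origins": [f"chrome-extension://{extension_id}/"],
    }
    os.makedirs(manifest_dir, exist_ok=True)
    out = os.path.join(manifest_dir, f"{host_name}.json")
    with open(out, "w") as f:
        json.dump(manifest, f, indent=2)
    return out
```

The point being: this is a perfectly legitimate write into another app's support folder, which is why blanket prompts on that location would generate so much confusion.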
Apple did the right thing by only adding warnings for more sensitive areas like your Downloads or Documents folder, but any more than that and I think it'll cause more harm than good.
I agree with the blog post. Apple seems to be more focused on "security theatre" right now (or at least half-assed security for the sake of marketing). They do things like add easy-to-implement (via their FileManager class) file access warnings to appease most non-technical users, but at the same time ignore bigger looming threats like apps accessing the Internet. I think the issue isn't the warnings, it's what the warnings are about.
Anyway, my guess is that Apple will be adding network access warnings in the future (since it seems they re-wrote a large chunk of the networking code recently), but let's not deny the two-faced marketing speak going on right now and the fact that they do stuff like making it impossible to inspect network traffic from Apple apps. The hand-wavy "trust us" argument shouldn't work for Apple either. Why do I have to trust Apple more than a third party developer?
Browsing history is not sensitive‽
The main point I was trying to make was that apps having network access without warning is more of a security/privacy issue than apps being able to read local files without warning. It's probably why Little Snitch became so popular and why I think Apple is in the process of shoving them out of the market by building it into the OS (I'm guessing!).
>it’s up to chrome
> ~/Library/Safari/LocalStorage because Safari names the files in this directory according to the web sites that you visit! [emphasis added]
This is a serious issue for data leakage.
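To illustrate just how little access is needed, here's a sketch of the kind of harvesting the article describes (Python; the `scheme_host_N.localstorage` naming is taken from the article's description of Safari's LocalStorage directory, so treat the parsing as illustrative rather than a guaranteed format):

```python
import os


def visited_hosts(localstorage_dir):
    """Recover visited hostnames from LocalStorage filenames alone.

    Safari reportedly names these files like
    'https_example.com_0.localstorage', so listing the directory is
    enough: no file *contents* ever need to be read.
    """
    hosts = set()
    for name in os.listdir(localstorage_dir):
        parts = name.split("_")
        if len(parts) >= 2 and parts[0] in ("http", "https"):
            hosts.add(parts[1])
    return sorted(hosts)
```

That's the whole "exploit": a directory listing. Which is why protecting file data but not file metadata misses the point for this particular directory.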
However, if we are going to go down the path of compromising functionality and adding lots of annoying prompts, then that strategy had better actually work!
If despite all of these annoyances apps can still read my browsing history, that means I also still need to trust every application I install, so I'd definitely prefer we just went back to where we started.
Ransomware (enabled by bitcoin payments to anonymous recipients) really changed the game on desktop in the last few years. Apple stepped up, but there's crickets on the matter in Windows- and Linux-land, aside from the people who have been containerizing their desktop apps.
Users absolutely blindly grant it because it's such a common permission and give it no second thought. The Android sandbox is a terrible example here.
I believe you can access those directories by either enabling Full Disk Access for the process of interest or by disabling System Integrity Protection.
Unless we use a more specific term such as "user security" I just assume "security" means company security -- the protection of Apple Inc.'s business.
One can argue that the security of the business of Apple Inc. benefits its enthusiastically supportive customers. Thus it is possible to conflate "Apple security" with "user security" under the ambiguous term "security". However, I think these two concepts are often not interchangeable, and at odds with each other. The user is not the corporation, nor vice versa; they are separate beings with different interests. Neither can speak for the other. Apple now "secures" its BSD-derived OS X from unauthorised software -- unauthorised by Apple Inc., not necessarily unauthorised by the user. As the author notes, this restriction guards against ("impedes") not only malware but all third party software.
The same applies with respect to "privacy". Under Apple's definition, there is no such thing as privacy from the company. It is as if the company and the customer are viewed as the same person. Employees of Apple are under strict obligations of confidentiality to the company, but they are under no duty of confidentiality to the customer. When an Apple employee discloses secrets of Apple, Apple can enforce its rights and the employee's obligation via the courts. When Apple discloses the secrets of a customer, the customer has no applicable rights or obligations it can enforce against Apple. Instead, we have seen privacy "theater" as Apple protested, via the courts, against aiding in disclosure; purportedly this was done on behalf of the customer. The truth may be that Apple was acting on its own behalf to protect the business of Apple, Inc.
Where is Apple violating customer privacy or confidentiality? They have, arguably, the largest end-to-end encrypted messaging system, store user backups and data that they cannot access, are leading in mechanisms to prevent user tracking, and sell hardware specifically designed to reduce the data customers send to them.
This is a serious stance of his, with a lot of serious data and arguments to back it up, from a serious engineer who has written an impressive list of Mac software both for Apple and for Apple's customers.
He’s proved that a well-behaved, codesigned app can read metadata about files in restricted directories. He hasn’t proven that the sandbox is compromised.
You claim he has so much serious evidence, link us there. Don’t just string adjectives together.
I have great respect for Jeff, but he is one of the more outspoken complainers among Apple devs. At least he has a better basis for his commentary than DHH.
I would like Apple to not roll out BS prompts that make my life more difficult until those prompts are actually capable of protecting some of the most sensitive data on my machine.
I.e. His goal is to criticize Apple no matter what they do, because he dislikes the fact that they are no longer producing the kind of open system he prefers.
There are an enormous number of protections and a small number of issues, which do eventually get fixed, and of course the threats are undeniable.
However, you are right that Apple is notoriously bad at communicating about bugs.
In reality, a very simple bug was reported more than a year ago, and Apple apparently hasn't cared enough to fix it. The only way I can interpret that is to conclude Apple doesn't really care about the integrity of their sandbox.
IMO, this more than justifies the author's accusation of "security theater". My browsing history is among the most sensitive data on my machine—certainly more private than anything in my Documents folder, which Apple felt the need to protect in a highly-disruptive way. I agree that it can be worth trading some degree of usability for privacy and security, but only if those privacy benefits are real. If they're not, then we're left in the worst of both worlds.
It's really quite damning.
There are many other ways to interpret it. Here is one completely made-up example that I created just now for this reply:
"Apple can't lock this down further without breaking open() calls in the majority of existing applications; therefore, they made a pragmatic choice to allow this issue to exist until their long-term roadmap plan to remove direct disk access to protected folders ships in a future macOS update; while declining to share their decision with the reporter, as is completely normal for Apple."
If you define "security theater" as "any practice that would not stand up to a human attacker", then all security is guaranteed by definition to be security theater, since all security protections will be found to have weaknesses, compromises, and design decisions that could be theoretically exploited. That definition is clearly non-viable in reality, and so all security decisions — even Apple's — will have unpalatable outcomes that do not invalidate the relevance of security.
This is completely normal for Apple, but that doesn’t make it OK for them to treat security fixes like product launches where they can choose an arbitrary timeline and keep the reporter hanging forever.
You know as well as I do that this stuff is complicated.
Does it happen to improve the security situation? Yes, for many people it does. Is it worth the cost? That's debatable, especially because of Apple's apparent apathy (and occasional hostility) towards the community.
Stallman and others have talked about just this issue for over a decade now.
The flippant attitude is exactly why and how you are reading about all these vulnerabilities now.
Knowledge of the issue (but enduring "flippancy") or not knowing it at all? You pick.
What I'm really saying is that this "flippancy" is the agency that makes someone write a blog post, sign their name to it, and put it out there with code samples. Dismissing the "flippancy" is insulting that agency. Without that emotion, that sense that Apple wasn't treating them well, people wouldn't find the energy to publish, to publicise.
Every single word takes strength to write. In this case, the flippancy was the driving force and it shows clearly.
Why would you dismiss that energy?
And no, it's not the author's job to "shield" you from the wrath of their flippancy. I take it and I thank "flippancy" for disclosing this issue.
> but that doesn’t mean we should go back
I don't think you understand the author's stance.
I didn’t think the sandboxing on macOS also had this issue.
(deny file-read-data (home-subpath "/Library/Safari"))
(allow file-read-metadata (home-literal "/Library/Safari"))
(allow file-read-xattr (home-literal "/Library/Safari"))
(deny file-read* (home-subpath "/Library/Safari"))
What can I, as a reader, do? Is there someone to forward this to? Is there a person in Apple to email? Or are we hoping for a tweet storm to stir the water?
The HN crowd in particular has a sizeable influence on other people with regards to technology. Because we are the techies, people ask us what they should use/buy. People observe what knowledgeable people do, and they tend to learn from it. You have more influence than you think. It just takes time to see the changes take effect.
Is there any form of democracy in practice that doesn't involve money?
> The HN crowd in particular has a sizeable influence on other people with regards to technology. Because we are the techies, people ask us what they should use/buy.
See, many of us do recommend that people buy Apple, because they're still very much the lesser evil among the Microsofts and Googles. If Apple does go bad, it's ridiculously easy to avoid Apple completely: just don't buy any Apple hardware. Done. Not so easy with MSFT or GOOG, which is what we warn people about.
And that's something the other side on HN can't seem to handle, and tries to bury any opposing comments to give the impression of a homogeneous echo chamber.
The problem that is hard to solve is 'what does Apple do about issues that don't hit HN?'
I just have to say I had never heard of one of his products - Stop the Madness. A real game changer! The five things he fixes (that vex me routinely!):
Yes, you can get various other extensions in other browsers to fix the five issues he addresses, but his extension works in my preferred browser (Safari) and it just works. I didn't have to load a custom script into some other extension or tweak around: just install and done. I've had varying success trying to overcome these five issues, but my goodness, his extension solves them all, and so far no site has foiled it. Amazing.
I wonder what the fix for that one will look like.
And doing something useful with it, to the level of malware? Is that also trivial?
Also, how would that "script kiddie" do that attack in the first place, if you get your apps from the App Store? If it's an independent app, all bets are off anyway. E.g. if they have a serious 0-day to do that, they wouldn't waste time with this. And they could ask the user to disable the SIP, enter root password, or whatever as well...
You probably couldn't use this to steal someone's bank password, but most of TCC doesn't really protect against that. An app could certainly use it to track users and target ads, since it can reveal your browsing history in detail.
That probably won't work on the Mac App Store—but the primary complaint about TCC in recent years is that it applies to all software, not just App Store apps.
Prepackage the scripts, weaponize and sell them on White House Market
You incur no liability; only the people that penetrate incur some liability, and only the people that use the pilfered information incur some different liability.
Just drop the lone hacker idea, the black hat world functions like a corporation that dilutes and shifts liability until it is no longer recognizable and also worthless to bother with, while everyone perfects their niche and gets paid for that. The white hat world continues undervaluing and resisting market forces.