Today I'm disclosing a macOS privacy protections bypass. I discovered that an application can use the venerable Unix command-line tool "ls" (list directory contents) to bypass both TCC (Transparency, Consent, and Control) and the sandbox, enabling unauthorized access to file metadata in directories that are supposed to be protected. This issue remains unaddressed in the latest public versions of Big Sur, Catalina, and Mojave, and is therefore, in one sense, a zero-day. Here is the timeline leading to my disclosure:
I was pretty busy in November, so I didn't have time to look at this issue again until now. It's been almost a year since I reported it to Apple. This is well beyond the bounds of "responsible disclosure", which is typically 90 days after reporting an issue to a vendor. I've never been paid a penny by the Apple Security Bounty Program and doubt I ever will. (I disclosed a privacy protections bypass earlier this year, as well as a sandbox escape, and another privacy protections bypass last year.) So without further ado, here's my original report to Apple Product Security:
Attached is a sample Xcode project that demonstrates how a Mac app can explore the contents of directories that it shouldn't have access to because of TCC and/or the sandbox. This exploit works on the current public shipping version, macOS 10.15.2. I've also tested it on macOS 10.14.6.
To reproduce, simply build and run the sample app. The app should not have access to the contents of ~/Library/Safari, but nonetheless the console output of the app will indicate whether the file ~/Library/Safari/LocalStorage/https_www.apple.com_0.localstorage exists, and if it does, it will display the names of the extended attributes of that file.
The app calls the command-line tool "/bin/ls", which is the problem here. The "ls" tool correctly prevents the caller from listing the contents of a directory the caller doesn't have access to, such as ~/Library/Safari/LocalStorage. However, "ls" will show information for any individual file, even if it's within a restricted directory.
An attacker can use this technique to probe well-known locations on disk and learn which files exist and which don't. In the sample app, it's used to determine a user's browsing history, because Safari saves a web site's local storage in a file whose name is based on the site's URL. But the technique is far more general: an attacker can determine whether certain files exist in the ~/Downloads folder, for example, even when the app has no access to that folder. Even learning the names of a file's extended attributes may give away private information that shouldn't be disclosed to an arbitrary app.
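A probe loop over well-known locations might look like the following sketch. The candidate file names are hypothetical examples I made up for illustration, not taken from the report or from any real system.

```shell
#!/bin/sh
# Probe a list of well-known paths with "ls". On an unpatched macOS,
# each hit or miss leaks one bit of private information, even when the
# process has no access to the parent folders.
# The candidate paths below are hypothetical examples.

printf '%s\n' \
    "$HOME/Library/Safari/LocalStorage/https_www.example.com_0.localstorage" \
    "$HOME/Downloads/tax-return.pdf" |
while IFS= read -r path; do
    if /bin/ls -d "$path" >/dev/null 2>&1; then
        echo "present: $path"
    else
        echo "missing: $path"
    fi
done
```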
Download Xcode project: lstest.zip
I chose the example of ~/Library/Safari/LocalStorage because Safari names the files in this directory according to the web sites that you visit! Also note that the output of long format "ls -l" contains the last modification date of the files. Thus, one possible privacy violation from this technique is to learn the user's web browsing history.
I continue to believe that macOS "security" is mainly theater that only impedes the law-abiding Mac software industry while posing little problem for Mac malware. It doesn't take a genius hacker to bypass macOS privacy protections: calling "ls" is a script kiddie level attack. As a Mac user and a Mac developer, it exasperates me that the Mac is becoming a parody for no benefit. The only reason I was even looking for bugs here is that I could have really used the extra money, since it's difficult nowadays to make a living as a Mac developer in the face of ever increasing (and futile) macOS lockdown. Sadly, it's not very difficult to find bugs, though it's extremely difficult to get paid a bounty for them.