Almost everyone, including Apple, has painted a false picture of the security benefit of Mac app notarization. There is one true benefit, which I haven't seen anyone mention, and which I'll discuss here. The publicly touted benefits of notarization, on the other hand, are bunk, in my opinion, and I'm going to debunk them in this post too. Most notably, I'll explain why the notarization malware scan is superfluous security theater, a mere marketing gimmick that I believe ought to be abolished because of the burden it places on legitimate developers.
Notarization applies to Mac apps distributed outside the Mac App Store. For many years, macOS has required these apps to be signed by a Developer ID certificate purchased by the developer from Apple. Notarization is a new technology introduced with macOS 10.14 Mojave. According to Apple's developer documentation (I hesitate to link to Apple's docs, because the links tend to break over time), the notarization service "automatically scans your Developer ID-signed software and performs security checks. When it’s ready to export for distribution, a ticket is attached to your software to let Gatekeeper know it’s been notarized." Apple has recently announced new notarization requirements for macOS:
Note that in an upcoming release of macOS, Gatekeeper will require Developer ID signed software to be notarized by Apple.
Developers creating a Developer ID certificate for the first time need to submit their signed software for notarization in order to run on macOS 10.14.5 or later.
Starting with the public release of macOS 10.14.5, all new kernel extensions and updates to kernel extensions need to be submitted for notarization.
Many developers, including me, suspect that the "upcoming release of macOS" means macOS 10.15, presumably to be announced in June at WWDC.
The security of the Developer ID system depends on the security of the Developer ID signing certificates. Developers must keep their certs safe. If someone unauthorized has possession of your signing cert — it could be a hacker, but it could also be a former employee or contractor — then the unauthorized person has the ability to sign and distribute Mac software using your cert, entirely without your knowledge. Possession of the signing cert is the only requirement.
When notarization becomes required for distribution, unauthorized distribution using your cert becomes much more difficult. Notarization is a kind of two-factor authentication. In order to notarize an app, you first need to sign it with your Developer ID cert, but then you have to submit it to Apple using the Apple ID and password of your developer account. If your signing cert is compromised, that by itself would no longer be sufficient to distribute the app. The unauthorized person would also need to compromise your Apple ID. And recently Apple has required that all developer accounts enable two-factor authentication. Apple has a custom, nonstandard, bizarre, weaker implementation of 2FA, but it still makes secret compromise of the Apple ID, and thus distribution of Mac apps, more difficult for unauthorized persons. Furthermore, you get an email whenever you notarize an app, whereas there's no email when someone simply signs an app using a Developer ID cert. So if someone has secretly compromised your signing cert and your developer account, they still couldn't distribute software without your knowledge, unless they changed the email on the account, in which case you'd be notified of that too.
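To make the two factors concrete, here's a sketch of the distribution workflow using the command-line tools Apple provides. The signing identity, bundle ID, Apple ID, and file names are all placeholders, and the exact flags may change as Apple revises the tooling:

```shell
# Factor 1: possession of the Developer ID signing certificate.
codesign --sign "Developer ID Application: Example Corp (TEAMID1234)" \
  --timestamp --options runtime MyApp.app

# Factor 2: knowledge of the Apple ID credentials for the developer
# account. An app-specific password can be stored in the keychain.
/usr/bin/ditto -c -k --keepParent MyApp.app MyApp.zip
xcrun altool --notarize-app \
  --primary-bundle-id "com.example.myapp" \
  --username "dev@example.com" \
  --password "@keychain:AC_PASSWORD" \
  --file MyApp.zip

# After Apple approves, attach the notarization ticket to the app so
# Gatekeeper can verify it even when the user is offline.
xcrun stapler staple MyApp.app
```

The signing step requires only the cert; it's the altool submission that requires the Apple ID, which is what makes notarization a second factor.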
This is a true security benefit of notarization. It protects your Developer ID certificate from unauthorized use. Of course, it doesn't protect against malware authors simply paying $100 (perhaps with fraudulently obtained credit card numbers) to sign up for their own Apple Developer account and notarize their own software with their own Developer ID certificate and their own Apple ID. Thus, it's not a huge benefit, but it is a benefit. It seems unlikely that Apple is doing background checks on Apple Developer Program members. After all, there are many obvious scam artists in the App Store (see, for example, my blog post The Mac App Store Safari Extensions Experience, as well as the many examples exposed by the Twitter account Apps Exposed). If Apple were curating its Developer Program members, how did the scammers manage to sign up?
A myth has been spread that Developer ID certs can only be revoked in entirety, meaning that all versions of all apps signed with a Developer ID cert would be invalidated when the cert is revoked. Apple has contributed a bit to this myth:
Notarization also protects your users if your Developer ID signing key is exposed. The notary service maintains an audit trail of the software distributed using your signing key. If you discover unauthorized versions of your software, you can work with Apple to revoke the tickets associated with those versions.
The problem with the myth of blunt revocation is that we have irrefutable public evidence that it's utterly false. Consider The Case of the Stolen Source Code from Panic. According to Panic, Apple "walked us through the best way to roll our Developer ID and invalidate the old one, which we don’t think was leaked, but we’re being overly cautious. And more importantly, the right people at Apple are now standing by to quickly shut down any stolen/malware-infested versions of our apps that we may discover." It was indeed wise to revoke the old Developer ID cert: if you're not absolutely sure whether a cert has been compromised, you have to assume that it has been. Panic handled the situation correctly. What may surprise you, though, is that older versions of Panic software signed with the revoked cert still pass Gatekeeper checks. Panic keeps an archive of old versions on their web site, so you can download one and try it for yourself.
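You can check this yourself with the spctl command-line tool, which queries the same system assessment policy that Gatekeeper uses. The path below is a placeholder; substitute whichever old build you downloaded:

```shell
# Ask the system policy whether Gatekeeper would allow this app to run.
spctl --assess --verbose=4 --type execute "/Applications/Transmit.app"
# A still-valid Developer ID signature produces output along the lines of:
#   /Applications/Transmit.app: accepted
#   source=Developer ID
```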
How is this possible? An app signed with Developer ID for distribution has a secure timestamp, which can be controlled with the --timestamp flag of the /usr/bin/codesign tool. Xcode passes this flag automatically and contacts an Apple timestamp server when you build an app for distribution. You can see this for yourself by taking a packet trace, or by installing Little Snitch. If your Developer ID cert is compromised, Apple can invalidate all apps signed after a certain date and time, while leaving all earlier versions valid. Panic knew precisely when their Developer ID cert was possibly exposed, because the theft occurred via a malware-infected copy of HandBrake. Panic apps signed with a secure timestamp before that infected copy was installed were known safe, so those apps didn't need to be invalidated. Any apps signed with the old cert after that date would be suspect, so the secure timestamp can be used as a cutoff. This is surely what Apple did in Panic's case.
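You can inspect the timestamp of any signed app with codesign itself. In my experience, a secure timestamp appears as a "Timestamp=" line in the output, whereas a signature without one shows only an unverified "Signed Time=" line. The app path is a placeholder:

```shell
# Display signature details, including the secure timestamp if present.
codesign --display --verbose=4 /Applications/MyApp.app
```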
Theoretically, it's true that notarization could be used to invalidate individual builds of apps instead of simply invalidating all builds after a certain date. Notarization is potentially a little more selective. In practice, however, this makes absolutely no difference. Why? In order for an app to be notarized, it first has to be signed with your Developer ID cert. Then it has to be submitted using your Apple Developer account. Anyone who has notarized an unauthorized build has compromised both your Developer ID cert and your Apple Developer account. Therefore, the cert must be revoked, and the account password must be reset. You can't simply invalidate individual builds while leaving the old Developer ID cert valid. If someone unauthorized has your signing cert, they will still be able to distribute unauthorized software with it for older versions of macOS without the notarization requirement, and they'll be able to notarize it again if they can compromise your developer account again. You have no choice but to revoke the signing cert. And you have to be suspicious of anything that was distributed during the time after the compromise occurred. In the end, Apple still has to follow the same revocation process that occurred before notarization existed. There's no practical difference. Moreover, I'll point again at Panic's statement: "the right people at Apple are now standing by to quickly shut down any stolen/malware-infested versions of our apps that we may discover." Apple already had this capability before notarization. There's nothing novel here.
When you download a Mac app from the internet and open it for the first time, you see a macOS Gatekeeper dialog that asks "Are you sure you want to open it?" Have you ever noticed, though, that when you update the app to a new version using the app's built-in software update mechanism, you don't see a Gatekeeper dialog on first launch? This is because Gatekeeper only asks you about apps that are "quarantined". When you download a file from the internet, the web browser adds a com.apple.quarantine extended attribute to the file in the file system, and Gatekeeper checks for this attribute. Mac apps that aren't sandboxed have the ability to delete extended attributes on files, even extended attributes on their own app bundles. Therefore, a Mac app that you download from the internet has the ability to download a new version of itself, remove the quarantine from the new version, and then open the new version without a Gatekeeper dialog. This has been true for as long as Gatekeeper has existed.
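You can watch this mechanism in action with the xattr command-line tool. The path is a placeholder for any freshly downloaded app:

```shell
# Print the quarantine attribute that the browser attached on download.
xattr -p com.apple.quarantine ~/Downloads/MyApp.app

# Remove it recursively from the bundle, which is essentially what an
# updater can do programmatically; the app then launches with no
# Gatekeeper prompt.
xattr -r -d com.apple.quarantine ~/Downloads/MyApp.app
```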
The most widely used software update mechanism outside the Mac App Store is called Sparkle. Thousands of popular apps have adopted the Sparkle framework. If you look at the source code for Sparkle and search for "quarantine", you can see where Sparkle deletes the com.apple.quarantine extended attribute after it downloads the update. This is perfectly normal and expected, nothing underhanded or nefarious. In fact, Sparkle also checks the code signature of the downloaded update before opening it, to make sure it was signed with a valid Developer ID certificate, just as Gatekeeper does. Mac apps can also perform additional, custom validation checks relevant to their own specific situation to make sure that the update is authorized and hasn't been tampered with.
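For illustration, here's a rough command-line equivalent of that validation. Sparkle performs these checks programmatically through the Security framework rather than by shelling out, and the path is a placeholder:

```shell
# Verify the integrity of the downloaded update's code signature.
codesign --verify --deep --strict --verbose=2 MyApp.app

# Display the certificate chain to confirm a Developer ID identity.
codesign --display --verbose=2 MyApp.app 2>&1 | grep Authority
```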
The ability of Mac apps to update themselves shows that the notarization malware scan is security theater. Apple's notarization service scans for malware, but malware authors don't need to submit malware to Apple! They can submit a perfectly innocent app for notarization, get the app notarized, and then flip a switch on their own server to download a malware software update when the victim opens the "innocent" notarized app. The downloaded malware update doesn't need to be notarized, because the software updater will delete the quarantine attribute, thus bypassing Gatekeeper. It's impossible for Apple to detect this beforehand, because the malware update won't be made available for download until after Apple notarizes the original app.
The malware scan is unlikely to catch serious malware authors, but it does punish legitimate developers, who have to submit their apps and then sit and wait for Apple's response. Apple claims this should take less than an hour (already too long), but in practice it has taken much longer in some instances, according to developers I've heard from. Just yesterday, Apple's Developer System Status page showed two outages of the Developer ID Notary Service, each lasting 90 minutes. The whole point of distributing software outside the Mac App Store is to avoid exactly these problems of submitting to Apple for approval and waiting for a response, but now Apple is imposing those very same problems on software outside the App Store. If notarization is to be required at all, I think it should skip the security theater of the malware checks and simply notarize the app on submission, a process that would be almost instantaneous. This would reap the two-factor authentication benefit that I discussed earlier without placing an undue burden on developers outside the App Store.
This blog post has been about security issues related to Mac app notarization. If you'd like to read about privacy issues, see my blog post Mac app notarization and customer privacy.