Apple’s $1 million bug bounty could launch arms race for zero-days

Experts have given kudos to Apple for expanding its bug bounty program to all researchers. But is the $1 million top prize enough to turn black hats white?

Apple is taking bug bounties to a new level—a level that some say could spur an arms race to acquire zero-day vulnerabilities between the good guys and bad guys.

Not that bug bounty programs are new. They’ve been around for a long time—a very long time in the world of IT, where a generation is three to five years or maybe less. It was 1995 when Netscape launched the first one, offering cash rewards to those who found security bugs in its Netscape Navigator 2.0 Beta.

That’s pre-Facebook—by nine years. Pre-Google by three years. Pre-iPhone by more than a decade. Not to mention that while Netscape might have been cutting edge at the time, it has been defunct since 2008.

But bug bounties are still cutting edge, mostly out of necessity. The reality is that nobody’s software is perfect, so it makes sense to offer rewards to people who find vulnerabilities if they’ll turn them over rather than use them to hack you.

And it isn’t just the small players without superb security teams doing it. Bug bounties are hotter than ever thanks to tech giants like Apple, a company that probably wouldn’t come to mind first as a likely candidate for what amounts to crowdsourcing security.

Apple’s new bug bounty program

The company has been notorious not only for its walled-off “ecosystem,” but also for denying even most white hat hackers access to the internal workings of its operating systems. It has historically worked with hand-picked, invite-only security researchers.

Not anymore. At the recent Black Hat security conference in Las Vegas, Ivan Krstić, Apple’s head of security engineering and architecture, announced an overhaul of Apple’s bug bounty program that massively sweetens the payouts—the top award will jump from $200,000 to $1 million—and also opens it up to all researchers.

Beyond that, it expands the program, which began about three years ago and applied only to iOS, to include macOS, watchOS, tvOS, iPadOS, and iCloud as well, plus the devices that run on those operating systems.

Even more unusual for Apple, Forbes reported before the announcement was even made that the company will be providing select “rock star” security researchers with pre-jailbroken iPhones, also known as “dev devices,” as part of the iOS Security Research Device Program.

An unnamed source told Forbes that the phones would “allow the user to do a lot more than they could on a traditionally locked-down iPhone. For instance, it should be possible to probe pieces of the Apple operating system that aren’t easily accessible on a commercial iPhone.”

Who wants to be a millionaire?

The $1 million award is reserved for disclosure of a vulnerability that would allow a remote attacker to gain total, persistent control of a user’s device without any action by the victim, not even a click on a malicious link.

How likely is it that even the best hackers will find a vulnerability like that? Not very, but $1 million definitely offers an incentive.

Among the vulnerabilities that will yield smaller, but still significant, awards:

  • Lock screen bypass: $100,000
  • Unauthorized access to high-value user data: $100,000
  • Kernel code execution: $150,000
  • One-click unauthorized access to high-value user data: $150,000
  • User data extraction: $250,000
  • CPU side-channel attack on high-value data: $250,000
  • One-click kernel code execution: $250,000
  • Zero-click radio-to-kernel network attack with physical proximity: $250,000
  • Zero-click access to high-value user data: $500,000

Microsoft’s bug bounty program

Apple isn’t alone, of course. Its expansion is the most impressive, but another giant, Microsoft, joined the bug bounty expansion party at Black Hat as well, with the launch of Azure Security Lab, where it is inviting “a select group of talented individuals to come and do their worst to emulate criminal hackers in a customer-safe cloud environment … which is isolated from Azure customers.”

The company said it is doubling the top bounty award to $40,000 and expects to pay a total of $2 million to participants during the year.

The rationale behind the program expansions seems relatively obvious. First, it’s a modified version of what Eric Raymond dubbed “Linus’ Law,” named for Linux creator Linus Torvalds: “Given enough eyeballs, all bugs are shallow.”

In other words, let independent researchers and hackers attack your software, your systems, your networks, etc., and those “fresh eyes” are likely to find bugs that even the best people on your development team missed.

The difference here is, of course, money. Jeff Atwood rephrased Linus’ Law on his Coding Horror blog back in 2015 as “Given enough money, all bugs are shallow.”

A bidding war

But even that doesn’t explain it all. What also seems to be happening here is a bit of a bidding war. Companies like Apple and Microsoft (and any other that is hoping to protect itself from catastrophic breaches) know that even well-intentioned researchers could be tempted by what criminal hackers or hostile nation-states might pay for exploits nobody else knows about.

Bounties in the range of $150,000 to $1 million are more likely to persuade them to play for the good guys.

That, said John Kozyrakis, research engineer at Synopsys, is the main reason for the expanded program. “It is so teams of people that currently look for bugs will sell to Apple directly instead of middlemen,” he said, adding that “such teams already exist—Apple is just trying to change the incentives so that their research goes to Apple instead of the NSA or foreign governments through middlemen.”

“Incidentally, $1 million is also the amount of money exploit-middlemen currently pay for chains of zero-interaction code execution bugs on iOS,” he said.

Still, that kind of money raises the question of whether it could be spent better. While $1 million doesn’t even amount to pocket change for a company holding more than $200 billion in cash, Apple could instead pay a couple of “rock star” researchers $500,000 each (far better than the average salary) to spend an entire year helping to “build security in” to its operating systems.

The need for a bug bounty program

Indeed, one of the reasons hacking is a relatively easy way for criminals to make money is that there are so many “connected” products on the market with bugs or other defects that weren’t found and fixed during development.

But experts say no matter how rigorous security testing is during the software development life cycle (SDLC), the need for bug bounties remains.

As Rehan Bashir, managing consultant at Synopsys, put it, “A company as big as Apple already hires the best-of-the-best developers and security researchers, and even then a teenager was able to find a bug in the iPhone FaceTime application,” which lets users hear the audio of the person they are calling even before the call is picked up.

Sami Laine, director of technology strategy at Okta, said another benefit is that it expands the company’s reach into the research community. Apple’s platforms, especially iOS, “are arguably the most secure in the industry,” he said. “However, Apple recognizes that having global diversity in researchers is critical for uncovering novel attack vectors.”

Indeed, Kozyrakis said Apple could benefit not only from being informed about a crucial vulnerability, but also from knowing who found it.

Since there are very few people or teams around the world that would be capable of creating a kernel-level zero-interaction code execution exploit chain, “Apple would benefit from direct interaction with them, even for recruiting purposes,” he said.

Weak links in the chain

Ksenia Peguero, senior research lead at Synopsys, noted that another reason for bug bounties is that “there is a difference between the defects that internal security reviewers will find and the issues that bug bounty hunters will report.”

She noted that systems are increasingly interconnected, which can lead to a “chained attack vector.”

“When an application review is performed, we have to limit it in terms of codebase, functionality, deployment model. Some part of the system may get a rigorous review, while another part—some internal system—may get a less rigorous review,” she said.

“However, a low vulnerability in one system, chained with another low vulnerability within a second system, in addition to a vulnerability in the vendor system may together result in a significant compromise.”

Hackers, who aren’t constrained by anything considered “out of scope,” evaluate an application as a whole, and therefore “may be able to find a chained attack vector that was missed by the internal teams,” Peguero said.

Raising the security bar

Of course, there is always a risk that even good money won’t buy up every potential exploit. “This sort of bug bounty program stands to create an arms race,” said Tom Kellermann, chief cybersecurity officer at Carbon Black. “What guarantee is there that an exploit developer shares all critical vulnerabilities?”

But he agrees that “the ideal scenario is for organizations to combine solid internal DevSecOps and work with outside researchers to augment those efforts.”

Bashir said the move should raise the security bar for Apple both internally and externally. “Apple not only created a challenge for its developers and researchers to develop more secure software but also addresses the fact that no matter how effective a company’s internal secure development processes are, there is a much larger user base out there that will continue to try to break the software that Apple is producing,” he said.

*** This is a Security Bloggers Network syndicated blog from Software Integrity Blog authored by Taylor Armerding. Read the original post at: