On Bounties and Boffins

Trying to make a living as a programmer participating in bug bounties is the same as convincing yourself that you’re good enough at Texas Hold ‘Em to quit your job. There’s data to back this up in Fixing a Hole: The Labor Market for Bugs, a chapter in New Solutions for Cybersecurity by MIT Press. Bug bounties follow a Pareto distribution, exhibiting the same characteristics as the distribution of wealth and other sociological phenomena. A select few, the boffins, report the largest number and highest quality bug reports and earn the lion’s share of bounties. The rest of the population fights over the boffins’ table scraps.
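The skew the authors describe can be illustrated with a quick simulation. This is a hypothetical sketch, not the chapter's data: it assumes bounty income follows a Pareto distribution with an arbitrary shape parameter and shows how a small slice of the population captures a disproportionate share.

```python
import random

random.seed(0)

# Hypothetical illustration: draw 300 hunters' total earnings from a
# Pareto distribution. A shape of alpha = 1.16 produces roughly the
# classic 80/20 split; the real parameters are not given in the chapter.
alpha = 1.16
earnings = sorted((random.paretovariate(alpha) for _ in range(300)),
                  reverse=True)

# Share of total income captured by the top 5% of earners
top_5pct = earnings[: len(earnings) // 20]
share = sum(top_5pct) / sum(earnings)
print(f"Top 5% of hunters take {share:.0%} of all bounty income")
```

With a tail this heavy, the mean is dominated by a handful of outliers, which is exactly why the per-seller averages later in the chapter are misleading.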

Fixing a Hole does not offer much to encourage software companies looking to improve their security through bug bounty programs. HackerOne, a company that hosts bug bounty programs, boasts that over 300,000 people “have signed up” to help organizations improve their security. It’s nice to think that you have 300,000 sets of eyes scrutinizing your code, but this number includes zombie accounts and people who never find a single bug. In reality, only an elite few are getting the work done and cashing in.

So why not just hire the boffins as consultants instead of gamifying your company’s security? The authors of Fixing a Hole argue that bug bounties should be designed to incentivize the elite. They say that making bounties invite-only lowers the operational cost of managing a tsunami of trivial, non-issue, and duplicate bugs. (Only 4-5% of bugs from Google, Facebook, and GitHub’s public-facing bounty programs were eligible for payment.) According to the authors, a small number of bounty hunters are indispensable and hold significant power to shape the market for bug bounty programs. Based on this, hiring security consultants under terms and conditions that can be controlled seems more practical.

The data undermining bug bounties

In Fixing a Hole, independent researchers funded by Facebook studied data from 61 HackerOne bounty programs over 23 months and one Facebook program over 45 months. The HackerOne data set includes bounty programs from Twitter, Square, Slack, Coinbase, Flash, and others. Usernames could be tracked across different programs in the HackerOne data set, but not to the Facebook data set.


Top: Participants, sales, and payments for Facebook (45 months) and HackerOne (23 months). Bottom: Population according to bug sales per person.

The prolific group at the top doesn’t limit itself to one program: it sweeps across multiple programs, selling bugs across different technologies. What’s more, its members also report the most critical bugs, which are worth the most money. On average, those in the top 1% submitted bugs to nearly five different programs.

The authors included averages for sales per seller, earnings per seller, and price per transaction, but averages are misleading for such uneven distributions, so they are disregarded in this review. (For example, if 90 people earn a $10/hour wage and 10 people earn $1,000/hour, the average wage is $109/hour, which is characteristic of neither population.)
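The parenthetical example works out as follows; the median is the better summary statistic for a population this lopsided:

```python
# 90 people at $10/hour and 10 people at $1,000/hour
wages = [10] * 90 + [1000] * 10

mean = sum(wages) / len(wages)
median = sorted(wages)[len(wages) // 2]

print(mean)    # 109.0 -- describes neither group
print(median)  # 10    -- what the typical participant actually earns
```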

Surprisingly, Fixing a Hole does not report the variance of the data. When populations are stratified, as the authors discovered, the variance of the individual groups can reveal important insights. As a result, a lot of information about the most interesting population, the top-performing 5%, is omitted.

We reproduced and overlaid some of the plots here to show the main theme in the report: there is a small group of prolific participants in bug bounty programs. The larger the data set, the more pronounced this trend is. For the entirety of the HackerOne and Facebook data sets, the 7% of participants with 10 or more bugs were paid for 1,622 bounties, while the other 93% of the population earned 2,523.
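Using the totals above, the concentration is easy to quantify: the 7% of participants with 10 or more bugs collected nearly 40% of all paid bounties.

```python
# Totals from the combined HackerOne and Facebook data sets
top_bounties = 1_622   # paid to the 7% with 10+ bugs
rest_bounties = 2_523  # paid to the remaining 93%

total = top_bounties + rest_bounties
print(f"{top_bounties / total:.0%} of paid bounties "
      f"went to the 7% with 10 or more bugs")
```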


The most interesting group was arbitrarily lumped into the 10-or-more-bugs category. (See how the line trends upward at the end of the plot? You don’t want to do that. Plot all your data.) The top 1% (6 participants) in the HackerOne data landed 161 bounties and the top 1% (7 participants) in the Facebook data accounted for 274 bugs. That’s an average of 27 and 39 bugs per person, respectively! There may be stratification even among the top earners, but without knowledge of this distribution at the top (i.e., the variance), it remains a mystery.

As productive as the top 1% are, their earnings are equally depressing. The top seven participants in the Facebook data set averaged 0.87 bugs per month, earning an average yearly salary of $34,255, slightly less than what a pest control worker makes in Mississippi.

It gets worse for the top six earners from the HackerOne data set. Averaging 1.17 bugs per month, they earn a yearly average of $16,544. (The notes of Fixing a Hole mention two outlying data points: Google’s Chromium Rewards Program paid $60,000 for a single submission, and one Facebook participant earned $183,000 in 21 months, a $104,000/year average.)
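As a sanity check on those figures, the implied payout per bug can be back-calculated from the reported averages (assuming the reported rates and salaries are exact):

```python
def per_bug_payout(yearly_earnings: float, bugs_per_month: float) -> float:
    """Average payout per bug implied by yearly earnings and submission rate."""
    return yearly_earnings / (bugs_per_month * 12)

facebook = per_bug_payout(34_255, 0.87)   # top 7 Facebook participants
hackerone = per_bug_payout(16_544, 1.17)  # top 6 HackerOne participants

print(f"Facebook top 1%:  ~${facebook:,.0f} per bug")
print(f"HackerOne top 1%: ~${hackerone:,.0f} per bug")
```

Roughly $3,300 per bug on Facebook versus $1,200 on HackerOne, which helps explain why even the most prolific hunters treat bounties as a side income rather than a salary.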

If your heart is breaking and you’re wondering whether this top 1% is wondering where their next meal is coming from, it’s more likely that these are security professionals working a side hustle. You can get really good at finding a few classes of critical bugs, then set scanners and alerts for when relevant bounty programs come online. You find your bugs, submit your proof, grab your dollars, then move on.

What’s better than bug bounties

Who are these top bug bounty performers, and what’s their background? What separates them from the rest? There’s no way to tell from the data, but the authors suggest three possibilities: improved skills over time, differences in raw talent, and professionals vs. hobbyists. (I believe that some top performers may work as teams under one account or are individuals who specialize in a few types of critical bugs and keep a watchful eye for low-hanging fruit when new programs launch.) Whoever they are, they’re indispensable and need to be incentivized to join bug bounty programs. For this, the authors offer three solutions:

  1. Keep the talent pool exclusive through invite-only programs that are closed to the public. This ensures that the most talented will not lose any bounties to lesser talent—even the low-hanging fruit.
  2. Escalate prices with successive valid submissions to deter people from straying to other programs.
  3. Offer grants to talented researchers, and pay them even if no bugs are found.

There’s not much difference between this advice and simply reaching out to a consulting firm for a code audit. Plus, an exclusive bug bounty program faces the chicken-or-egg paradox for the participants: How do you get an invite when you aren’t given the opportunity to establish a reputation? Also, there’s a lot less control and a lot more risk to holding bug bounties than most people are aware of.

The economics of bug bounty programs are turbulent, because there’s a competing offensive market in play. Exploitable zero-days can fetch up to millions of dollars from the right buyer. Anyone who discovers a critical bug can choose not to disclose it to the vendor and try to sell it elsewhere for much more. Fixing a Hole recommends that work should be directed toward incentivizing disclosure to vendors, but offers no practical details beyond that. There’s no evidence to suggest that researchers compare defensive and offensive bounty programs searching for the highest sale. Our opinion is that the decision to disclose or not disclose to vendors is mostly a moral one.

So who is currently incentivized to participate in bug bounty programs? Two groups: citizens of economically disadvantaged countries, who can take advantage of US dollar exchange rates, and students who want to improve their security skills and learn the tools of the trade. After reading Fixing a Hole, I wasn’t convinced that the elite boffins are incentivized to participate in bug bounty programs. Perhaps they should wield some of their indispensable power to demand more from the market.

*** This is a Security Bloggers Network syndicated blog from Trail of Bits Blog, authored by Trent Brunson.
