I usually agree with what Bruce Schneier has to say about the economics of computer security, but this time he’s got it backwards.
He’s writing about the vulnerabilities market, in which people find security vulnerabilities in computer programs and sell them to the highest bidder. The buyer could be the software vendor, who will presumably remove the vulnerability from the program. Or, the buyer could be someone who does not want the vulnerability to be fixed. For example, it could be an antivirus software maker who wants to protect its customers better than its competitors, or a government agency that hopes to exploit the vulnerability for cyber espionage.
So far, so good. But then he engages in some revisionist history:
I’ve long argued that the process of finding vulnerabilities in software systems increases overall security. This is because the economics of vulnerability hunting favored disclosure. As long as the principal gain from finding a vulnerability was notoriety, publicly disclosing vulnerabilities was the only obvious path. In fact, it took years for our industry to move from a norm of full-disclosure—announcing the vulnerability publicly and damn the consequences—to something called “responsible disclosure”: giving the software vendor a head start in fixing the vulnerability. Changing economics is what made the change stick: instead of just hacker notoriety, a successful vulnerability finder could land some lucrative consulting gigs, and being a responsible security researcher helped.
The real story for most of the past 25 years has been rather different. In the overwhelming majority of cases, software vendors and governments have not been friendly to vulnerability finders. For every “lucrative consulting gig” there are many more examples of do-gooders who have reported vulnerabilities to software vendors with no expectation of monetary reward, only to be prosecuted or sued; Dan Kaminsky has a helpful and humorous White Hat Hacker Flowchart that guides initiates through the process.
So, contra Schneier, the economics of vulnerability hunting has never really favored disclosure. Not only that, there have been many cases in which software vendors were notified of vulnerabilities and did nothing about them, or delayed patching them for years. Economics is one reason for this neglect: fixing vulnerabilities requires a lot of programmer time that could instead be used to add features and sell more software, while the cost of an exploit is usually paid by the customer, not the vendor. Irresponsible vendors are the reason the full disclosure movement started in the first place.
What’s happening today is that the value of vulnerabilities has gone way up, because more and more important systems rely on software, and more and more people carry mobile computers with them at all times. This is why the vulnerabilities market has sprung up and why prices have risen. Schneier has it backwards when he says,
This new market perturbs the economics of finding security vulnerabilities. And it does so to the detriment of us all.
The market hasn’t caused the economics to change; it is the changing economics that have created the market. If we want to improve security, it’s important not to confuse cause and effect. That could lead you astray, as in Schneier’s conclusion:
As the incentive for hackers to keep their vulnerabilities secret grows, the incentive for vendors to build secure software shrinks.
On the one hand, hackers don’t have an incentive to keep vulnerabilities secret—they would be happy to sell to software vendors, if the vendors outbid the bad guys. And on the other hand, the incentive for vendors to build secure software is not shrinking. It remains close to zero, its historical average.