Monetizing Software Bugs
Published by The Lawfare Institute
in Cooperation With
Hackers gotta hack. It's in their nature. But they also want recognition and maybe a little money. In times past, a hacker who discovered a flaw in a piece of hardware's programming might announce it proudly to his friends. He might have told the source of the flaw about the problem and given them time to fix it before he went public ... or not. More recently, if the hardware was important or if the user was notable, he might have sold it on the black market to a criminal gang or a nation state. Still even more recently, he might have sold it back to the hardware manufacturer under a bug bounty program.
Now we have a new paradigm—one that attempts to monetize the bug and establish its fair market value. Andrea Peterson reports on a new use of the stock market. A security research firm called MedSec recently found a flaw in the implantable heart device manufactured by St. Jude Medical. Rather than alerting St. Jude so they could fix it, or trying to sell it to them, MedSec took a different tack. It gave knowledge of the flaw to Muddy Waters Research, a hedge fund. Muddy Waters, in turn, took a short position on St. Jude stock (betting it would go down) and then released a report, based on MedSec research, that publicly disclosed the alleged flaws. The stock duly dropped, Muddy made a profit (nobody is saying how much) and gave a cut of the profit to MedSec.
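The short-selling mechanics at the heart of this play are simple arithmetic: sell borrowed shares high, buy them back low after the disclosure, pocket the difference. The sketch below uses entirely invented numbers (neither the position size nor the prices were ever disclosed) just to show the mechanism.

```python
# Hypothetical illustration of short-sale profit. The prices and share
# count below are invented; no real figures from the Muddy Waters trade
# have been made public.

def short_profit(entry_price: float, exit_price: float, shares: int) -> float:
    """Profit from selling borrowed shares at entry_price and later
    buying them back at exit_price to return to the lender."""
    return (entry_price - exit_price) * shares

# Suppose a fund shorts 10,000 shares at $80, the vulnerability report
# drops the stock to $76, and the fund then covers its position.
profit = short_profit(80.00, 76.00, 10_000)
print(profit)  # 40000.0
```

Note the asymmetry that makes disclosure timing so sensitive: if the stock rises instead of falling, the same formula yields a loss, so the short seller has a direct financial stake in the report being as alarming as possible.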
A good result, no? Problem disclosed. Market discipline imposed. Starving cyber researchers compensated for their time. A win-win-win all around, right?
Or perhaps not. For one thing, independent researchers have been unable to replicate the MedSec findings. For another, the tactic paints researchers as greedy and opportunistic, hardly the image they want, I think. For yet another, the disclosure hypes the danger of the flaw without any real cost-benefit analysis: no serious examination of how significant the vulnerability is, or of how much benefit the implant provides. And, finally, of course, the problem with public vulnerability disclosure is always that the vulnerability persists while the fix is being sought. Everyone with a St. Jude device is at risk now, and who knows when a fix might be available.
For now, the tactic is legal, though I imagine the SEC may soon ask itself whether it amounts to stock manipulation. And for now, the FDA is telling patients to continue using their St. Jude devices. Overall, it is yet another example of how rococo cybersecurity issues can become: who would ever have thought that a hedge fund manager would be at the forefront of a cyber vulnerability disclosure?