Open source code is not enough
Automotive software issues, such as the Jeep hack and Volkswagen's cheating on emissions tests, have made headlines this year, which means the public is thinking about software in cars like never before. Some experts have argued that mandating that such software be open source would solve the problem. Although public scrutiny of the software has definite benefits, code visibility alone is no guarantee. As Sam Liles explained to me in a recent email, open source code didn't prevent Shellshock.
Dr. Liles was formerly a professor of cyber forensics at Purdue University, where he and his students researched the security of automotive and other Internet of Things devices. He says that defense-in-depth is dead, meaning we can no longer rely on multiple layers of security for protection. Our phones and other personal devices, for example, know everything about us: where we go, with whom we communicate, even when we're having sex. These devices, and all of the information they contain, live inside our personal and work networks. A compromised phone can access troves of information or spread threats to other connected devices.
The sheer volume of these devices is a challenge in itself. "Who is going to do incident response at this level?" Liles asks. For that matter, who is going to audit all of that code? In The Cathedral and the Bazaar, Eric S. Raymond wrote, "Given enough eyeballs, all bugs are shallow," a principle he dubbed Linus's Law, but we cannot rely on enough eyeballs alone. If important and well-established projects such as OpenSSL lacked the resources to prevent bugs like Heartbleed, who is going to examine the millions of lines of software that run the devices we take for granted every day?
Although the 2011 NASA and NHTSA investigation into a rash of unintended acceleration incidents involving Toyota cars found "no evidence that a malfunction in electronics caused large unintended accelerations," other researchers have identified ways to produce acceleration in automobiles through software. "If the Power Management ECU has been compromised," the IOActive report reads, "acceleration could be quickly altered to make the car extremely unsafe to operate." Clearly, software is a critical component of modern automotive safety.
Nevertheless, research such as that done by Liles' group remains relatively rare. Even analyzing the software is often difficult. "Forensics is almost never built into systems and often for the purpose of legal validity needs to be reverse engineered," Liles says. Additionally, the changing threats posed by the Internet of Things require a fundamental shift in the way research is conducted. "Many of the 'old' information assurance and security rules, doctrine, and sometimes called science is based on myths, half truths, and outdated technological concepts."
So where does open source fit into this? Accidental bugs, some of them significant, will continue to exist whether or not the source code is open; Heartbleed, Shellshock, and many other high-profile vulnerabilities in open source software tell us as much. Intentional misbehavior would become riskier in the open, but openness helps only to the degree that we can validate that the published source code is what's actually running. This becomes increasingly important as cars become open systems, connected to our phones and to mobile Internet services.
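That validation problem is the motivation behind reproducible builds: if anyone can rebuild the published source into a bit-for-bit identical binary, the firmware actually shipped on a device can be checked against it. Here is a minimal sketch of that comparison in Python; the byte strings are stand-ins for real firmware images, and the file contents and names are illustrative, not taken from any real vendor.

```python
import hashlib

# Hypothetical stand-ins for two firmware images: one extracted from the
# device, one rebuilt from the published source code. Real images would be
# read from files; these short byte strings just illustrate the comparison.
vendor_image = b"\x7fELF...firmware bytes extracted from the vehicle"
rebuilt_image = b"\x7fELF...firmware bytes built from the open source"

def digest(data: bytes) -> str:
    """Return the SHA-256 hex digest used to compare the two images."""
    return hashlib.sha256(data).hexdigest()

# If the build process is reproducible, the digests match; any mismatch
# means the shipped code cannot be assumed to equal the published source.
if digest(vendor_image) == digest(rebuilt_image):
    print("images match: shipped code corresponds to the published source")
else:
    print("images differ: openness alone proves nothing about this device")
```

The design point is that the check requires nothing from the vendor beyond a build process that is deterministic; without that property, even fully open source code tells an auditor little about the binary a car is actually running.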