On August 1, 2012, Knight Capital Group, a financial services company, lost $440 million in less than an hour because of a software bug. As I understand it, this bug could have been avoided with more thorough testing before release, but, as the Omaha World-Herald reports, the company "rushed to develop a computer program so it could take advantage of a new Wall Street venue for trading stocks...and failed to fully work out the kinks in its system."
In an op-ed on NYTimes.com, Ellen Ullman, a former software engineer and author, argues that the SEC's call for companies like Knight to "fully test their computer systems before deploying coding changes" is an impossibility. Ellen writes:
First, it is impossible to fully test any computer system. To think otherwise is to misunderstand what constitutes such a system. It is not a single body of code created entirely by one company. Rather, it is a collection of "modules" plugged into one another. Software modules are purchased from multiple vendors; the programs are proprietary; a purchaser (like Knight Capital) cannot see this code. Each piece of hardware also has its own embedded, inaccessible programming. The resulting system is a tangle of black boxes wired together that communicate through dimly explained "interfaces." A programmer on one side of an interface can only hope that the programmer on the other side has gotten it right.
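To make Ellen's point concrete, here is a minimal, entirely hypothetical sketch (not the actual Knight Capital bug) of two "modules" that each work correctly in isolation but disagree about a dimly specified interface: one reports prices in cents, the other assumes dollars, and neither can see the other's source.

```python
def vendor_feed_price():
    """Black-box vendor module: returns the current price of a share.
    Undocumented assumption: the value is in CENTS."""
    return 1050  # i.e., $10.50, expressed in cents


def order_value(shares, price):
    """Purchaser's module: computes the cost of an order.
    Silent assumption: price is in DOLLARS."""
    return shares * price


# Each side "hopes the programmer on the other side has gotten it right."
# The mismatch is invisible until the modules are wired together:
value = order_value(100, vendor_feed_price())
print(value)  # 105000 -- a 100x overstatement of the intended $1,050 order
```

Neither function contains a bug on its own; the error lives entirely in the untestable gap between them, which is exactly why "fully testing" a tangle of black boxes is so hard.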
While I agree with Ellen and teach my students that all software comes with some risk and should be evaluated with that in mind, I do think that, had the systems in this debacle been written using an open source model instead of a proprietary one, these holes might have been caught, the release deadline could have been met (or even pushed back for the betterment of the product), and this whole mess might have been avoided. Systems don't have to be a 'tangle of black boxes'—they can be an efficient network of transparent tubes instead.
I may be preaching to the choir here when I remind you of Eric Raymond's statement that 'given enough eyeballs, all bugs are shallow.' Using the open source development model means that everyone can see what everyone else is doing in the system and can more easily communicate how changes will affect each interconnected module. I'm not saying that all bugs can always be found or that any system is perfect—only that working in the open means problems can be found more quickly, communication is easier, collaboration is more effective, and bugs are patched more efficiently.
I take this incident at Knight Capital as yet another argument for developing software that affects the public (if not all software) in the open!