Why OSS is the right flavor for the government cloud

Recently, the Washington Post highlighted how the United States Navy ordered a halt to new server and data center procurement. The Navy cited a recent announcement by Federal CIO Vivek Kundra detailing a general movement to cloud-based computing, including a requirement for departments and agencies to identify, in the short term, three applications that could move to the cloud. To those unfamiliar with the background, the announcement follows a number of rather startling reports issued last year that examined in great depth how the US Government creates and consumes data. One of the alarming data points? The US Government currently supports more than 2,000 locations classified as data centers.

 
Kundra’s 25-point IT management plan sheds even more light on the subject. A large component of this plan is a serious and concentrated move to cloud architectures. While the Government’s definition of the term “cloud” was outsourced to the folks at NIST, a document released this past week entitled “Federal Cloud Computing Strategy” puts the finishing touches on an effort that has been more than a year in the making. The message is pretty clear: follow private industry’s lead to increase utilization of IT assets, provide scalable and flexible services, and provide environments that support innovation while reducing risk.
 
In fact, the Strategy provides a lot of insight into how Kundra wants departments and agencies to pull this together. At some level, there’s an expectation of wholesale outsourcing. Recent noise in the trade pubs about Microsoft and Google fighting it out in this arena shows that agencies are already moving aggressively to meet the challenges set forth in the strategic plans. And indeed, looking at the achievements in the cloud arena, it becomes readily apparent that success here means movement to service-mesh architectures: Salesforce, Google, Azure, AWS. They all exist beyond Infrastructure as a Service and make the infrastructure an abstract commodity. Data in, results out: that sounds exactly like the sort of thing the government wants and needs.

While this may accommodate the needs of a large majority of the Government, there is a reason why the Department of Defense and the Department of State have so many computing sites that were labeled by the study as data centers. Both agencies tend to put people in remote locations where sometimes the best connection you can hope for is via satellite or a shaky land-line. This reality can throw a major wrench in the works when you look at how to project the cloud through proprietary and commercial cloud architectures. As a member of the US Foreign Service, I have seen firsthand the difficulties of trying to access remotely managed, remotely provisioned services. In the Middle East, South Central Asia, and Eastern Europe, long-distance satellite hops, a flaky and heavily censored Internet, and more all conspire to make any end-user activity a productivity nightmare. The sad truth is that most of this pain has been self-inflicted, by failing to embrace the very technologies that have allowed information to propagate throughout the world. The basic tenets of the Internet, geographic distribution, multiple pathing, and geographic proximity, appear to have been lost in translation from the network engineers to the software designers. A truth of systems design, relayed to me a long time ago, still rings true: latency is something you do to computers, not to your users.

The problem is much more complex than simple laws of physics and geography. To steal an Internet meme, government data is governmental. So long as there is legislation governing how government data must be managed, outside vendors will always charge a premium to accommodate that need. If there’s a lesson that the past twenty years of government systems design should have taught us, it’s that by the time you’re done modifying the proprietary product, you might as well have built the thing yourself. The heavy cost of licensing, compounded with complex integration, results in budget-crushing expenditures and, oftentimes, elaborate and spectacular failure. These budget and design issues sit alongside yet another reason why Defense and State have so many data centers: they operate in areas where even traditional computing can be difficult to provision and maintain.

If the traditional answers don’t work, then what’s left is a requirement to evaluate the problem differently.  Instead of trying to force government computing into an external cloud, take the best technologies and bring them into a government cloud.  It is possible to do this by building on the exact same architectures that Google, Rackspace, and Amazon already use.  Using Open Source to build internal service platforms that meet the NIST definitions of Cloud computing allows the government to go directly to the same technology source without all the additional expense of finding ways to manage government data on non-government systems.

A successful Open Source Government Cloud should:

1.  Support a diverse and flexible development environment

Whether the role is diplomacy, census-taking, providing security, or serving and interacting with citizens, innovation is encouraged when there are diverse avenues through which data can be manipulated. Government IT needs to move more at the speed of IT than the speed of government. Portable code, open APIs, and open languages are absolutely critical components of a successful Government Cloud. In practical terms, beyond any specific development language, this means the Government Cloud must support standards that encourage the use of massively distributed environments. Messaging protocols like the Wave extension to XMPP, Java’s robust messaging architectures, and even SMTP are perfectly suited to this sort of widely distributed network.
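To make that last point concrete, here is a minimal Python sketch (the host and addresses are hypothetical, and it assumes a mail relay listening locally) of how an application might hand a payload to plain SMTP. The value of a store-and-forward protocol is that the relay, not the end user, absorbs the pain of slow or intermittent long-haul links:

    import smtplib
    from email.message import EmailMessage

    # Build a simple message carrying an application payload.
    msg = EmailMessage()
    msg["From"] = "post-app@mission.example.gov"   # hypothetical addresses
    msg["To"] = "ingest@cloud.example.gov"
    msg["Subject"] = "field-report"
    msg.set_content('{"post": "example", "visas_processed": 42}')

    # Hand the message to a local relay; SMTP's store-and-forward design
    # lets delivery survive slow or intermittent connections upstream.
    with smtplib.SMTP("localhost", 25) as relay:
        relay.send_message(msg)

The application finishes its work the moment the relay accepts the message; retries and routing across unreliable links become the infrastructure’s problem, which is exactly the property a widely distributed government network needs.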

2.  Provide heterogeneous access to the environment on a variety of hardware platforms

Commodity processing means removing every vestige of the notion that particular processes or services run on particular pieces of hardware. Truly heterogeneous environments abstract away even from the traditional Intel x86 architecture, instead looking for wherever the best efficiencies can be found, now and in the future. We know that AMD, Intel, IBM, and Sun (now Oracle) have for years been analyzing data centers that run at higher temperatures, use passive cooling, and adopt other novel architectures. Building on this, a successful heterogeneous hardware environment relies on the ability to move code between platforms. Interpreted languages like Java and Python are the tiny ice crystals that make up the cloud, and it makes sense to have architectures flexible enough to run in any environment, as the sketch below suggests. Doing so allows us to more effectively leverage “least cost” principles of acquisition and development.
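As a small, self-contained illustration of that portability (the function name here is my own invention), the following Python runs unmodified on x86, ARM, or POWER hardware and simply reports where it landed:

    import platform
    import sys

    # The same source file runs anywhere a Python runtime exists,
    # which is the portability property a heterogeneous cloud needs.
    def describe_host():
        return {
            "machine": platform.machine(),  # e.g. 'x86_64', 'armv7l', 'ppc64'
            "system": platform.system(),    # e.g. 'Linux'
            "python": sys.version.split()[0],
        }

    if __name__ == "__main__":
        print(describe_host())

Scheduling work onto whichever node is cheapest or coolest to run becomes a deployment decision rather than a rewrite.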

3.  Collaborate and coexist with existing legacy infrastructure

This should go without saying: the ability to interoperate with existing infrastructure is absolutely critical. It enables an excellent phased approach in which the Cloud can slowly assume the capabilities of legacy installations. As an example, creating a local authentication infrastructure that uses Kerberos to talk to Active Directory domains provides basic interoperability; a configuration sketch follows. Eventually, as the Cloud becomes more robust and less dependent on legacy services, those proprietary software components can be retired as the Cloud services functionally stand in.
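A rough sense of what that could look like in practice, assuming an MIT Kerberos client and entirely hypothetical realm and host names, is a krb5.conf that simply knows about both the local realm and the existing Active Directory domain controllers:

    # Hypothetical sketch: an open source node that can authenticate
    # against a local realm and an existing Active Directory domain.
    [libdefaults]
        default_realm = CLOUD.EXAMPLE.GOV

    [realms]
        CLOUD.EXAMPLE.GOV = {
            kdc = kdc1.cloud.example.gov
            admin_server = kdc1.cloud.example.gov
        }
        # Active Directory domain controllers double as Kerberos KDCs.
        AD.EXAMPLE.GOV = {
            kdc = dc1.ad.example.gov
            kdc = dc2.ad.example.gov
        }

    [domain_realm]
        .cloud.example.gov = CLOUD.EXAMPLE.GOV
        .ad.example.gov = AD.EXAMPLE.GOV

Because Active Directory speaks standard Kerberos, the open source side needs no proprietary client at all, and the AD realm entries can simply be deleted once the legacy domain is retired.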

4.  Provide secure, verifiable environments to Federal standards

It should be prefaced that these comments are primarily directed at the US Government. For everyone else, the folks at NIST do a fantastic job of thinking critically about what computer security means and how best to achieve it, and of publishing great documents on the subject. Meeting the requirements laid out by NIST for sensitive government computing, and by the Committee on National Security Systems for national security information, creates an enormous up-front cost for commercial entities that is passed on to the customer. Taking the time to implement these requirements in Open Source projects, however, means that once the work is done, an entire ecosystem can take advantage of it. In fact, certification of Open Source software within these security constructs can only encourage better, more diverse competition from all sources, since in a cloud-driven environment many providers are looking at the same technology platforms to deliver service.

These four points make up the foundation to which any serious government cloud-hosting architecture should adhere. Furthermore, Open Source Software can provide the building blocks to make a wholly government-owned distributed cloud a reality. Whether the architecture is Wave running on Debian on ARM, SMTP on Red Hat on x86, or something not yet described running on Cell, it doesn’t really matter. If we do it smartly, these packages can be reduced to a mobile box of COTS hardware that can exist anywhere, without the big-box cooling we deploy in force today. No more “data centers,” and those on the edge keep their local computing resources while gaining the power of the cloud. Everyone wins by ensuring widespread, reusable standards and core technologies that benefit the entire development ecosystem through a level playing field for innovation.

This future goal, a distributed, open government cloud, would meet Vivek Kundra’s needs by reducing operating and licensing costs for the environment. Through geographic distribution of the type of flexible architecture discussed here, such a construct would comprise many cloud nodes in order to provide guaranteed local access. That access would meet the needs of those of us who work every day in places where the Internet is shaky at best or where, as in Egypt recently, disconnection is only a phone call away. Just like a real cloud, these isolated wisps could separate, continue to provide functionality, and rejoin as it became possible to do so. There are subtle but marked differences between this type of organic cloud structure and distributed client-server systems. Most significantly, a distributed organic cloud provides data portability and vendor freedom, breaking the cycle of dependence that has forced solutions that never quite meet government standards.
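A toy Python sketch of that separate-and-rejoin behavior (every name here is invented for illustration) might look like the following: the node accepts work unconditionally and drains its backlog whenever a link happens to exist.

    import json
    from collections import deque

    class OrganicCloudNode:
        # A node keeps serving its local users while partitioned,
        # then reconciles with the wider cloud when a link returns.
        def __init__(self):
            self.outbox = deque()  # would be durable storage in practice

        def record(self, event):
            # Always accept work locally, connected or not.
            self.outbox.append(json.dumps(event))

        def sync(self, link_is_up, send):
            # Drain the backlog only while connectivity holds.
            while link_is_up() and self.outbox:
                send(self.outbox.popleft())

    # Example: queue a report while offline, flush once the link returns.
    node = OrganicCloudNode()
    node.record({"post": "example", "report": "weekly-cable"})
    node.sync(lambda: True, print)

The design choice is that disconnection is the normal case, not the failure case, which is precisely what distinguishes an organic cloud from a client-server system that simply stops working when the wire goes quiet.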

Moving toward outsourced clouds that don’t guarantee interoperability or answer the requirements of far-flung edge users means doing business as we’ve always done it, just with shiny new terminology. Conversely, moving toward an open government cloud, driven by open standards for architecture and data, would give us the ability to scale within an agency, across agencies, and even out to private-sector partners when a crisis demands extra-agency scaling. Building on open standards and a level playing field will create innovation and opportunities that privilege no single Department or Agency but provide substantive benefit to all.

The advances in technology that have made cloud computing possible also provide a way forward to fundamentally change how the government uses technology to collect and process data. It is a window of opportunity that presents itself only once in a generation, and it is well within the purview of the government to seize it, provide technical leadership, and establish the sort of computing model that will last now and into the future: Open Source Software is the key.

 

John Janek is a career member of the US Foreign Service and has been involved in the information technology field for more than fifteen years. He holds a BS in Information Technology as well as numerous IT and security-related credentials.


This work is licensed under a Creative Commons Attribution-Share Alike 3.0 Unported License.