Open source is about more than cost savings

I recently stumbled upon a piece discussing the cost of cloud, and it made me realize that people still mistakenly believe that open source is just about cost savings. In practice, when organizations are asked to explain their reasons for going open source, cost is rarely at the top of the list. It is perceived as a long-term benefit, but certainly not one expected during the initial ramp-up of open source projects. A good reference for this is this SDTimes article.

The move to open source technology is a much more fundamental shift, and it represents a trend that is starting to cross industries, even the most traditional ones, from financial services to telcos. It is the shift from proprietary technology to open, intelligently crowdsourced, and ultimately better code.

In a recent talk, Graeme Peacock of TD Bank, which is undertaking an ambitious project to move its infrastructure from traditional commercial and legacy tooling almost entirely to open source projects, explained the move as a by-product of accumulating too much technology. Much of it was duplicate technology, sitting in business silos with no alignment of versioning. This had become a nightmare for the bank to maintain, especially since each piece was customized and proprietary to its business unit. That was the real pain point and the crux of the matter.

The concerns about going with a single large vendor had little to do with cost and everything to do with the proprietary nature of very closed products. The way Graeme describes it, the huge vendors all use the same buzzwords while essentially shipping very proprietary code, and they are making their clouds increasingly proprietary and closed because they know the cost of exit is significant.

"So we moved to a very open stance and said, look, we actually want to partner with some small agile very technical open source companies to deliver an OpenStack based cloud,” says Graeme. All this to break the dependency on large vendors.

Graeme was a keynote speaker at the OpenStack Summit in Vancouver, where he discussed TD Bank's move to OpenStack as its cloud of choice. That pretty much sums up the sentiment of the wider community and ecosystem as well, and it ties in exactly with the overall statistics on OpenStack users.

If you take a look at the latest OpenStack user survey, from November 2014, cost savings appears only third among the business drivers for moving to OpenStack, not the primary or even secondary reason, as is often perceived.

In a fast-paced industry, where agility and time to market have become business critical, the ability to innovate through flexible technology is now a requirement. The open nature of OpenStack, and of other open source tooling, makes it possible to leverage the technology for innovation in ways that are simply not possible with closed-source options.

That said, in reality there are, and will continue to be, certain business-critical applications that remain closed source and proprietary. For those, controlling the infrastructure that runs the business has a direct impact on the competitiveness and profit margins of the applications themselves.

Alongside these, there will also be a set of less critical applications, and often it will make more sense to outsource those completely. The optimal approach is therefore a hybrid one. With new standards shaping up, such as TOSCA (Topology and Orchestration Specification for Cloud Applications), the barrier to creating a hybrid cloud has become significantly lower, as the sketch below illustrates.
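To give a feel for what TOSCA describes, here is a minimal sketch of a TOSCA Simple Profile topology, built as plain Python data and serialized to the YAML form that TOSCA tooling consumes. The app_server node and its sizing are illustrative placeholders, not taken from any real deployment.

    # Minimal sketch: a TOSCA Simple Profile topology with one compute node,
    # expressed as Python data and dumped to YAML (pip install pyyaml).
    import yaml

    blueprint = {
        "tosca_definitions_version": "tosca_simple_yaml_1_0",
        "topology_template": {
            "node_templates": {
                # "app_server" is a hypothetical node name for illustration.
                "app_server": {
                    "type": "tosca.nodes.Compute",
                    "capabilities": {
                        "host": {
                            "properties": {"num_cpus": 2, "mem_size": "4 GB"}
                        }
                    },
                }
            }
        },
    }

    print(yaml.safe_dump(blueprint, sort_keys=False))

Because the topology is declarative and portable, the same template can, in principle, be handed to any TOSCA-aware orchestrator, which is exactly what lowers the barrier to hybrid deployments.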

One such example is the move to NFV (network functions virtualization). Not coincidentally, pretty much all the projects undertaking this endeavor to date are OpenStack-based. Its API-driven, software-defined infrastructure is essentially the only existing option that exposes the under-the-hood networking functionality required to virtualize network functions.
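As a rough illustration of that API-driven model, the sketch below uses the openstacksdk Python library to create a tenant network and subnet through Neutron, OpenStack's networking service; this is the kind of programmatic network plumbing NFV builds on. The cloud name mycloud and the resource names are placeholders, not part of any real deployment.

    # Minimal sketch: programmatic networking through OpenStack's Neutron API,
    # via the openstacksdk library (pip install openstacksdk).
    import openstack

    # "mycloud" is a placeholder for an entry in your clouds.yaml.
    conn = openstack.connect(cloud="mycloud")

    # Create an isolated tenant network and an IPv4 subnet on it.
    network = conn.network.create_network(name="vnf-data-plane")
    subnet = conn.network.create_subnet(
        network_id=network.id,
        name="vnf-data-subnet",
        ip_version=4,
        cidr="10.0.0.0/24",
    )

    print(f"network {network.id} ready with subnet {subnet.cidr}")

Being able to carve up networks in code like this, rather than through a vendor's closed tooling, is precisely the capability NFV projects depend on.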

The same sentiment is echoed by Axel Clauberg, VP of Infrastructure Cloud & Aggregation, Transport, IP at Deutsche Telekom AG, in the recent OpenStack & Beyond podcast. Clauberg, too, is leading the NFV movement, and even coined the term in 2012.

"When we put together that architecture we're now productizing—the thoughts early on were, if we were to bet on just as single vendor, it would be very high risk. Therefore, we decided early on to build our product based on open source—KVM hypervisor, OpenStack as cloud orchestration framework, Ceph for distributed storage."

The fear of vendor lock-in, and, as a by-product, of losing the ability to innovate at the speed the market demands, is primarily what is driving this mindset in even the most traditional of industries. It is no secret that specific expertise comes with a price tag, and that is true in any field or industry (I have heard similar gripes about DevOps or full-stack engineers). But with the rapid growth of the OpenStack ecosystem, expertise in this area will soon be less exclusive, and the price tag associated with it will come down as well. So those with foresight, and the need to develop more innovative technology, are betting big on OpenStack.

Sharone is the open source community lead for Cloudify at GigaSpaces Technologies. In her spare time, she helps drive other local communities, including the DevOps Israel and OpenStack Israel communities, and helps organize five meetups worldwide. Find her on Twitter or LinkedIn.

This work is licensed under a Creative Commons Attribution-Share Alike 4.0 International License.