How open source and cloud-native technologies are modernizing API strategy

With new technologies, from open source stacks and DevOps tooling to AI, the common layer of protection and management is the API management layer.

I recently had the opportunity to speak at several events about API strategy for the latest open source software and cloud-native technologies, and the sessions received positive feedback. In an unusual move for me, I put the slides together first and wrote the article afterward, which meant I could fold earlier discussions and feedback into the writing. What makes this topic distinctive is that it approaches API strategy not from the usual talking points, but from the perspective of the latest technologies and how the growth of open source software and cloud-native applications is shaping API strategy.

I'll start with innovation. Nearly all of the latest software innovations are either open source or built on open source software. Augmented reality, virtual reality, autonomous cars, AI, machine learning (ML), deep learning (DL), blockchain, and more are built with open source software and use and integrate with millions of APIs.

Software development today involves both creating and consuming APIs. Everything is connected with APIs, and some organizations even suffer from API sprawl: the unchecked, unstandardized proliferation of APIs.

Technology stacks and cloud-native applications

In modern software development, there is the concept of stacks. Developers and organizations have so many options that they can pick and choose a combination of technologies to create their own stack, then train or hire what are known as full-stack developers to work on it. A typical stack is built largely from open source software: Linux, a programming language, databases, streaming technology, runtimes, and DevOps tooling, all using and integrating with APIs.

On top of technology stacks sit cloud-native applications, which refers to container-based applications. Today, there are many cloud-native options across all technologies; the Cloud Native Computing Foundation (CNCF) landscape is a good sample of the available cloud-native ecosystem.

When organizations move from applications in a handful of containers to applications in dozens or even hundreds of containers, they need help managing and orchestrating all that infrastructure. This is where Kubernetes comes into play. Kubernetes has become one of the most popular open source projects of our time and the de facto infrastructure for cloud-native applications. It has also led to a new and growing ecosystem of Kubernetes operators: most popular software now has its own operator to make it easier to create, configure, and manage in Kubernetes environments, and, of course, operators integrate with the Kubernetes APIs. Many data technologies now ship Kubernetes operators that facilitate and automate running stateful applications on Kubernetes.
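At its core, an operator is a reconcile loop: it observes actual state through the Kubernetes API, compares it with the desired state declared in a custom resource, and acts on the difference. Here is a minimal, cluster-free sketch of that pattern in Python; the `reconcile` function and the sample resources are illustrative, not a real operator framework:

```python
# Sketch of the operator reconcile pattern: compare desired state
# (from a custom resource) against actual state and converge.
def reconcile(desired: dict, actual: dict) -> list:
    """Return the actions needed to make `actual` match `desired`."""
    actions = []
    # Create anything declared but not yet running; update drifted resources.
    for name, spec in desired.items():
        if name not in actual:
            actions.append(("create", name, spec))
        elif actual[name] != spec:
            actions.append(("update", name, spec))
    # Delete anything running that is no longer declared.
    for name in actual:
        if name not in desired:
            actions.append(("delete", name, None))
    return actions

desired = {"db-replica": {"replicas": 3}, "cache": {"replicas": 1}}
actual = {"db-replica": {"replicas": 2}, "old-job": {"replicas": 1}}
for action in reconcile(desired, actual):
    print(action)
```

A real operator runs this loop continuously against the cluster's API server instead of in-memory dicts, but the shape of the logic is the same.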

What is the API management layer?

A cloud-native environment also has its own stack: cloud infrastructure, an operating system, container orchestration, containers, operators, application code, and APIs. All of this supports a software solution that integrates and exposes data to mobile devices, web applications, or other services, including IoT devices. Regardless of the combination of technologies, everything should be protected with API management platform functionality. The API management platform is the layer on top of the cloud-native applications that must be protected as data and APIs are exposed outside the organization's network.

When it comes to technology architectures, it's also important that the API management platform offers flexible deployment options. The strategy and design should always include portability: the ability to move and deploy across different architectures (PaaS, on-premises, hybrid cloud, public cloud, or multi-cloud).

[ Try API management for developers: Red Hat OpenShift API Management ]

3 API strategies to consider for cloud-native technologies

To design an API strategy for the latest technologies, the many options can be summarized in three major areas. The first is a modernization strategy: breaking monolithic applications into services, going cloud-native, and, of course, integrating with mission-critical applications on mainframes. For this strategy, secured APIs are built and maintained. The second area is what is known as headless architecture: adding features and functionality to APIs first and then optionally exposing that functionality through a user interface. This yields a granular architecture designed with microservices, or built entirely on APIs, to facilitate integration and automation. The third area focuses on new technologies: creating API ecosystems that attract customers and partners who contribute and consume public APIs, and selecting technology stacks and integrating them with new technologies such as AI, serverless computing, and edge computing. Above all, every API strategy must include API management and a security mindset.
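The headless idea can be illustrated with a single API function that returns plain data; a web UI, a mobile app, or another service can each consume the same response and decide how to present it. A minimal Python sketch, where the product catalog and field names are made up:

```python
import json

# Hypothetical catalog data; a real system would query a database.
CATALOG = [
    {"id": 1, "name": "widget", "price": 9.99},
    {"id": 2, "name": "gadget", "price": 19.99},
]

def get_products() -> str:
    """API-first endpoint: returns data only, with no UI assumptions."""
    return json.dumps({"products": CATALOG})

# Any client decides how to present the same payload; here, a web UI.
def render_html(payload: str) -> str:
    items = json.loads(payload)["products"]
    return "<ul>" + "".join(f"<li>{p['name']}</li>" for p in items) + "</ul>"

print(render_html(get_products()))
```

Because the API carries no presentation logic, adding a mobile client or a partner integration later means writing a new consumer, not changing the API.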

API management platforms should include the full lifecycle functionality for API design, testing, and security. Additional features, such as analytics, business intelligence, and an API portal, allow organizations to leverage DevOps and full lifecycle management for the development, testing, publishing, and consumption of APIs.
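Design-first lifecycles usually start from a machine-readable contract such as an OpenAPI document, which the platform then uses for testing, documentation portals, and security policies. Below is a pared-down example of such a contract expressed as a Python dict, with a small helper that audits which operations declare security requirements; the paths, titles, and scheme names are illustrative:

```python
# A minimal OpenAPI 3.0 contract, expressed as a Python dict for brevity.
openapi_spec = {
    "openapi": "3.0.3",
    "info": {"title": "Products API", "version": "1.0.0"},
    "paths": {
        "/products": {
            "get": {
                "summary": "List products",
                "security": [{"ApiKeyAuth": []}],
                "responses": {"200": {"description": "A list of products"}},
            }
        }
    },
    "components": {
        "securitySchemes": {
            "ApiKeyAuth": {"type": "apiKey", "in": "header", "name": "X-API-Key"}
        }
    },
}

def secured_operations(spec: dict) -> list:
    """List every (path, method) operation that declares a security requirement."""
    return [
        (path, method)
        for path, ops in spec["paths"].items()
        for method, op in ops.items()
        if op.get("security")
    ]

print(secured_operations(openapi_spec))
```

A lifecycle check like this, run in CI, is one simple way to enforce the security-first mindset: any new operation added without a security requirement shows up immediately.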

Two other examples show how today's latest technologies can become part of an API strategy. The first is DevOps integration: there is a variety of commercial and open source options for DevOps automation, with continuous integration and continuous delivery (CI/CD) tooling as the key pieces. The other very relevant space is data and AI: a growing field with thousands of options for every stage of the AI development lifecycle, from data collection and organization to data analysis and the creation and training of ML and DL models. The final step in that lifecycle should include automated deployment and maintenance of the resulting models. All of these steps should be fully integrated via APIs, including external integrations with data sources, with the important layer of an API management platform on top.
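As one concrete flavor of DevOps integration, a CI/CD stage can gate a deployment on an API smoke test that exercises the managed endpoints. A minimal Python sketch, where the endpoints and expected statuses are placeholders and the HTTP responses are simulated rather than fetched:

```python
# Hypothetical CI gate: fail the pipeline unless the API smoke test passes.
def smoke_test(responses: dict) -> bool:
    """`responses` maps endpoint path -> HTTP status from a test call."""
    required = {"/health": 200, "/products": 200}
    return all(responses.get(path) == status for path, status in required.items())

# In a real pipeline these would be live HTTP calls against a staging gateway.
assert smoke_test({"/health": 200, "/products": 200})
assert not smoke_test({"/health": 500, "/products": 200})
print("smoke test gate OK")
```

Wiring a gate like this into the pipeline means every deployment exercises the same API management layer that external consumers will hit.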

Open source and the API management layer

In summary, with all these new technologies, from open source stacks and DevOps tooling to AI, the common layer of protection and management is the API management layer. There should be a security-first API strategy driven by API management. Remember that APIs are everywhere today and that modern technology stacks will be integrated via APIs, with data technologies (databases and storage), DevOps, and AI leading the pack. Don't forget to design and manage APIs with security in mind. Whether the selected API strategy centers on modernization, headless architecture, or new technology, it must go hand in hand with your technology choices and vision for the future.

[ Take the free online course: Deploying containerized applications ]

Passionate about technology and open source software, Javier is Chief Evangelist for Perforce Software, responsible for technical thought leadership and advocacy for the open source and application security portfolios. Prior to Perforce, Javier led the open source program strategy for the IBM Z and LinuxONE platforms at IBM.


This work is licensed under a Creative Commons Attribution-Share Alike 4.0 International License.