Docker sparked the software container trend less than two years ago. Since its modest presentation at PyCon in 2013, the startup has vaulted to a valuation of nearly one billion dollars, drawn 2,500 attendees to DockerCon, and seen its namesake technology become a marketable skill, entering Hacker News' top 20 most frequently requested job skills.
Its raison d'être is clear enough: "Docker is an open platform for developing, shipping, and running applications." Docker enables developers to work with any virtual environment, any Linux distribution, any database back-end, and so on, right on their laptops.
This isn't entirely new; Google has used container technology since 2004. But Docker has made the most of that technology by making it quicker and easier for developers to use. Not only does Docker simplify virtualization on the developer's machine, it also lets that "virtual" infrastructure stack move into other environments, whether testing or production. This fluidity has vastly shortened the time between writing code and seeing it run in production.
"Docker was not the first container technology out there," said Vidar Langseid, Lead Cloud Infrastructure Engineer at eZ, "But you could say Docker was the one who gave container technology to the people. Not because it was open source (others were, too) but Docker suddenly made containers convenient and easier to work with."
Docker's lightweight containers also allow system administrators to scale their projects. They can quickly assemble applications from components, and Docker helps eliminate the errors that can creep in when shipping code. Nearly everyone in an organization can understand how an application works, because containers are easy to build, new versions can be iterated quickly, and the changes in each version are easy to spot. Docker containers run everywhere, from laptops to data centers and private or public clouds, so applications can easily move between an individual computer, a testing environment, and the cloud.
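To make that build-and-ship cycle concrete, here is a minimal sketch using standard Docker commands. The image name and version tag are hypothetical, not something from eZ's actual setup:

```
# Build an image from the application's Dockerfile and tag it with a version
docker build -t my-app:1.0 .

# The same image runs unchanged on a laptop, a test server, or a cloud host
docker run -d -p 8080:80 my-app:1.0

# List the tagged versions to compare iterations
docker images my-app
```

Because the image is the unit of shipment, the bits you test locally are the same bits that reach production.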
Why we love Docker
Around 2013, we were looking for ways to simplify development, accelerate developer onboarding, and facilitate collaboration by using the same technology across multiple systems inside the company. eZ considered some other technologies as well, but André Rømcke, Vice President of Engineering at eZ, landed on Docker.
"We were looking for a technology that we could use in all parts of our systems and then potentially in the future use in all our customer's systems," Andre said.
André began using Docker shortly after it was released. Since then, it has become an important part of the workflow for several eZ developers, including Vidar, who uses it daily.
"It's very cool to be able to create an environment based on any Linux distribution by issuing a simple Docker command," Vidar said. "This is great when you quickly want to test some new application knowing it won't be able to tamper with your system. It's easy to get rid of the application afterwards, too: just remove the container. On the professional side, Docker makes it possible to deploy and host web applications in a completely new way. It is really a game-changer."
Currently, we use Docker for three internal purposes:
- Automated testing of new code (pull requests) before it is accepted and merged into the code repository (a sketch of this step follows the list).
- A master version of eZ Platform and eZ Studio that is deployed daily and used by product management to evaluate the latest changes to the platform.
- Demo environments, which sales and product staff use to show customers and partners eZ's products in action.
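As a rough sketch of the pull-request testing step above, the commands below build a fresh image for a proposed change and run the test suite inside it. The image tag and test command are hypothetical; the original post does not describe eZ's actual CI configuration:

```
# Build an image containing the pull request's code (tag is hypothetical)
docker build -t ezplatform-ci:pr-1234 .

# Run the test suite in a disposable container; a non-zero exit fails the check
docker run --rm ezplatform-ci:pr-1234 vendor/bin/phpunit
```

Since every run starts from a freshly built container, test results cannot be skewed by leftover state from earlier builds.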
Not all eZ developers use Docker (or are even aware of it), but when they commit code, Docker serves as the underlying infrastructure for automated testing and deployment. Within eZ, Docker is a highly useful tool for improving the "deployability" of eZ products and putting those solutions in the hands of users.
Originally published on the eZ blog.