Jeremy Stanley

Kill Devil Hills, NC, USA

A long-time computer hobbyist and technology generalist, Jeremy Stanley has worked as a Unix and GNU/Linux sysadmin for nearly three decades focusing on information security, Internet services, and data center automation. He’s a root administrator for the OpenDev Collaboratory, a maintainer of the Zuul project, and serves on the OpenStack vulnerability management team. Living on a small island in the Atlantic, in his spare time he writes free software, hacks on open hardware projects and embedded platforms, restores old video game systems, and enjoys articles on math theory and cosmology.

Authored Comments

Perhaps also interesting is that there is no strong consensus on what the term "pipeline" means between different CI/CD systems. The oldest explicit use of "pipeline" I'm aware of by a CI/CD system is in Zuul, which has been using it since its first release 9 years ago:

In short, because Zuul was designed as a queuing/sequencing "project gating" system, it uses "pipeline" to refer to a collection of queues that provide context for a particular outcome (so in your example, testing, building, and deploying would each be separate pipelines configured in Zuul). The primary example is dependent queues built up from change approvals: each change enters a "gate" pipeline and waits for the predicted merge state of the changes it depends on, or of those approved ahead of it for related projects, to become reality.
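To give a concrete sense of the distinction, here is a rough sketch of two pipeline definitions in Zuul's YAML configuration, one independent and one dependent. The connection name (`gerrit`) and the vote values are illustrative, and the exact syntax can vary between Zuul versions:

```yaml
# An independent "check" pipeline tests each change on its own; a
# dependent "gate" pipeline queues approved changes and tests each one
# against the predicted merge state of those ahead of it.
- pipeline:
    name: check
    manager: independent
    trigger:
      gerrit:
        - event: patchset-created
    success:
      gerrit:
        Verified: 1
    failure:
      gerrit:
        Verified: -1

- pipeline:
    name: gate
    manager: dependent
    trigger:
      gerrit:
        - event: comment-added
          approval:
            - Approved: 1
    success:
      gerrit:
        Verified: 2
        submit: true
    failure:
      gerrit:
        Verified: -2
```

The `manager` setting is what determines the queuing behavior: `independent` treats every change in isolation, while `dependent` builds the shared speculative queues described above.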

The Interoperability Special Interest Group within the Continuous Delivery Foundation (a Linux Foundation project) has been attempting to survey the discrepancies between this and other related terminology here:

I should have also pointed out that Zuul's use of "pipeline" (much like many of its other terms of art) is drawn from processor design: Zuul pipelines changes in order to perform predictive parallelization, much like how a CPU pipelines operations to achieve speculative execution.
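The analogy can be made concrete with a toy model. This is an illustrative sketch, not Zuul's actual implementation: a dependent queue speculatively tests each change against the state that would exist if everything ahead of it merges, and when a change fails, it is evicted so the changes behind it can be retested without it.

```python
def gate(changes, passes):
    """Toy dependent-queue ("gate") model, analogous to CPU pipelining.

    Each queued change is tested against the speculative future state
    that includes every change ahead of it.  `passes(state)` reports
    whether a speculative state is good.  Returns the changes that
    ultimately merge, in order.
    """
    queue = list(changes)
    merged = []
    while queue:
        failed = None
        for i in range(len(queue)):
            # Speculative state: everything already merged, plus every
            # change ahead of (and including) this one in the queue.
            if not passes(merged + queue[:i + 1]):
                failed = i
                break
        if failed is None:
            merged += queue
            break
        # Changes ahead of the failure tested green, so they merge; the
        # failing change is evicted, and the remainder of the queue is
        # retested against a new speculative state without it.
        merged += queue[:failed]
        queue = queue[failed + 1:]
    return merged
```

For example, if change "b" breaks the build, `gate(["a", "b", "c"], lambda state: "b" not in state)` merges `["a", "c"]`: "c" is first tested behind "b", then retested (and merged) once "b" is evicted, without ever blocking "a".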