9 lessons from 25 years of Linux kernel development

Learn about top Linux kernel development best practices.

As the Linux kernel community celebrated a quarter-century of development in 2016, many people asked us the secret to the project's longevity and success. I usually laugh and joke that we really have no idea how we got here. The project has faced many disagreements and challenges along the way. But seriously, the reason we've made it this far has a lot to do with the community's capacity for introspection and change.

About 16 years ago, most kernel developers had never met one another in person; we had only ever interacted over email. So Ted Ts'o came up with the idea of a Kernel Summit, and now kernel developers make a point of gathering in person every year to work out technical issues and, crucially, to review what we did right and what we did wrong over the past year. Developers can openly and honestly discuss how they interact with each other and how the development process works, and then we make changes that improve the process. We build new tools, like Git, and constantly change how we work together.

Over time, this evolution has created a resiliency that has allowed the project to go from strength to strength while avoiding the forks that have split the resources of competing projects. It may be many years before we fully understand the keys to the Linux kernel's success, but a few lessons stand out even now.

1. Short release cycles are important.

In the early days of the Linux project, a new major kernel release came only once every few years. That meant considerable delays in getting new features out, which was frustrating for users and distributors alike. More importantly, such long cycles meant that huge amounts of code had to be integrated at once, and there was a great deal of pressure to get code into the next release even if it wasn't ready.

Short cycles address all of these problems. New code is quickly made available in a stable release. Integrating new code on a nearly constant basis makes it possible to bring in even fundamental changes with minimal disruption. And developers know that if they miss one release cycle, there will be another one in two months, so there is little incentive to try to merge code prematurely.

2. Process scalability requires a distributed, hierarchical development model.

A long time ago, all changes went directly to Linus Torvalds, but that quickly proved unwieldy: no single person can keep up with a project as diverse as an operating system kernel. Very early on, the idea of maintainers for different areas of the kernel arose, with responsibility for a portion of the kernel assigned to an individual familiar with that area. Examples include networking, wireless, driver subsystems such as PCI or USB, and individual filesystems such as ext2 or vfat. Spreading the responsibility for code review and integration across many hundreds of maintainers gives the project the resources to cope with tens of thousands of changes per release without sacrificing review or quality.
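
For a concrete flavor, here is a minimal sketch of what one of these assignments looks like in the kernel's MAINTAINERS file (abridged here; M: names the maintainer, L: the mailing list, S: the support status, F: the files covered), along with the helper script contributors use to find the right maintainer for a change:

    USB SUBSYSTEM
    M:      Greg Kroah-Hartman <gregkh@linuxfoundation.org>
    L:      linux-usb@vger.kernel.org
    S:      Supported
    F:      drivers/usb/

    # the tree ships a script that maps a source file (or a patch)
    # to the people and lists responsible for it
    $ ./scripts/get_maintainer.pl -f drivers/usb/core/hub.c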

3. Tools matter.

Kernel development struggled to scale until the advent of the BitKeeper source-code management system changed the community's practices nearly overnight; the switch to Git brought about another leap forward. Without the right tools, a project like the kernel would simply collapse under its own weight.
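
To give a feel for how Git shapes the day-to-day process, here is a minimal sketch of how a change typically travels from a contributor toward a maintainer (the branch name, address, and patch file name below are placeholders):

    # work on a topic branch against the mainline tree
    $ git clone git://git.kernel.org/pub/scm/linux/kernel/git/torvalds/linux.git
    $ cd linux
    $ git checkout -b my-fix

    # ...edit, build, test... then commit, adding a Signed-off-by line
    $ git commit -as

    # turn the commit into an emailable patch and send it to the
    # relevant maintainer and mailing list for review
    $ git format-patch -1
    $ git send-email --to=maintainer@example.org 0001-my-fix.patch

Maintainers then collect reviewed patches in their own trees and eventually ask Linus Torvalds to pull them, which is how the hierarchical model from lesson 2 maps onto the tooling.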

4. The kernel's strongly consensus-oriented model is important.

As a general rule, a proposed change will not be merged if a respected developer is opposed to it. This can be intensely frustrating to developers who find code they have put months into blocked on the mailing list. But it also ensures that the kernel remains suited to a wide range of users and problems. No particular user community is able to make changes at the expense of other groups. As a result, we have a single codebase that scales from tiny systems to supercomputers and is suitable for a huge range of uses.

5. The kernel's strong "no regressions" rule is also important.

Over a decade ago, the kernel developer community promised that if a given kernel works in a specific setting, all subsequent kernels will work there, too. If the community finds that a change caused a regression, the developers work very quickly to fix the issue. This rule gives users the assurance that upgrades will not break their systems; as a result, they are willing to follow the kernel as it develops new capabilities.
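
One reason the rule is workable is that the tooling makes regressions relatively cheap to track down. Here is a minimal sketch of a typical hunt with git bisect (the version numbers are only examples):

    # mark the newest kernel that shows the bug and the last one known to work
    $ git bisect start
    $ git bisect bad v4.8
    $ git bisect good v4.7

    # git checks out a revision roughly halfway between the two;
    # build, boot, and test it, then report the result
    $ git bisect good    # or: git bisect bad

    # after a few rounds git names the first bad commit,
    # which can then be fixed or reverted
    $ git bisect reset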

6. Corporate participation in the process is crucial, but no single company dominates kernel development.

Some 5,062 individual developers representing nearly 500 corporations have contributed to the Linux kernel since the 3.18 release in December 2014. The majority of developers are paid for their work, and the changes they make serve the companies they work for. But, although any company can improve the kernel for its specific needs, no company can drive development in directions that hurt the others or restrict what the kernel can do.

7. There should be no internal boundaries within the project.

Kernel developers are necessarily focused on specific parts of the kernel, but any developer can make a change to any part of the kernel if the change can be justified. As a result, problems are fixed where they originate rather than being worked around, developers have a wider view of the kernel as a whole, and even the most recalcitrant maintainer cannot indefinitely stall needed progress in any given subsystem.

8. The kernel shows that major developments can spring from small beginnings.

The original 0.01 kernel was a mere 10,000 lines of code; now it grows by more than that every two days. Some of the rudimentary, tiny features that developers are adding now will develop into significant subsystems in the future.

9. Above all, 25 years of kernel history show that sustained, cooperative effort can bring about common resources that no group would have been able to develop on its own.

Since 2005, some 14,000 individual developers from more than 1,300 companies have contributed to the kernel. The Linux kernel has thus become a common resource, developed on a massive scale by companies that are fierce competitors in other areas.

These takeaways, and more detailed information on Linux kernel development, can be found in the 2016 Linux Kernel Development Report, co-written with LWN's Jon Corbet.

Libby Clark contributed to this article.

I'm a Linux kernel maintainer and a Linux Foundation fellow.

This work is licensed under a Creative Commons Attribution-Share Alike 4.0 International License.