9 major tenets of the Linux philosophy

How the 9 major tenets of the Linux philosophy affect you

Last time I discussed a rather high-level view of the Linux philosophy in The impact of the Linux philosophy. There were some really good discussions on it, both on Opensource.com and on the many blogs that syndicate its content.

One of the comments I received from my previous article was that another operating system has just as much capability on the command line as Linux does. This person said that you could just add this software to get these features and that package if you want those features. That makes my point. With Linux, it is all built in. You do not have to go elsewhere to get access to the power of Linux.

Many people left comments stating that they could see how it might be nice to know the Linux philosophy as a historical curiosity, but that it had little or no meaning in the context of daily operations in a Linux environment. I beg to differ. Here is why.

Nine major tenets

There are nine major tenets to the Linux philosophy.

  1. Small is Beautiful
  2. Each Program Does One Thing Well
  3. Prototype as Soon as Possible
  4. Choose Portability Over Efficiency
  5. Store Data in Flat Text Files
  6. Use Software Leverage
  7. Use Shell Scripts to Increase Leverage and Portability
  8. Avoid Captive User Interfaces
  9. Make Every Program a Filter

There are also 10 lesser tenets and some corollaries to the Linux philosophy that are also important. I will cover some of those in future articles.

Here is a quick command line program as an example that encompasses most of these nine major tenets.

who | awk '{print $1}' | sort | uniq

This little command line program performs a very simple task: it lists every user who is currently logged in, but lists each of them only once, no matter how many sessions they have open. It also eliminates the extraneous data provided by the who command.
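
For illustration, here is a hypothetical run; the usernames, terminals, and times are invented, so your own output will look different.

# Hypothetical raw output of who -- two sessions for one user, one for another:
who
dboth    pts/0        Aug  1 09:15
dboth    pts/1        Aug  1 10:02
student  tty2         Aug  1 08:47

# The full pipeline reduces that to one line per logged-in user:
who | awk '{print $1}' | sort | uniq
dboth
student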

I won't cover all of the ways in which this CLI program conforms to the Linux philosophy, but I could make a good case that it meets most of them.

Make every program a filter

Each of the commands that make up this command line program is a filter. That is, each command takes its input, usually from Standard Input, “filters” the data stream by making some change to it, and then sends the resulting data stream to Standard Output. Standard Input and Standard Output are known collectively as STDIO.

The who command generates an initial stream of data. Each following command changes that data stream in some manner, taking the Standard Input and sending the modified data to Standard Output for the next command to manipulate.
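
Any program you write can join such a pipeline as long as it reads Standard Input and writes Standard Output. As a minimal sketch (the script name lc and its behavior are my own invention for illustration, not part of the pipeline above), a two-line shell script is already a well-behaved filter:

#!/usr/bin/env bash
# lc: a trivial filter that reads Standard Input, lowercases it,
# and writes the result to Standard Output.
tr '[:upper:]' '[:lower:]'

Once it is made executable, it drops straight into a pipeline, for example: who | ./lc | awk '{print $1}' | sort | uniq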

Small is beautiful and

Each program does one thing well

These two tenets go hand in hand. Each of the commands in this program is fairly small, and each performs a specific task. The sort command, for example, does only one thing. It sorts the data stream sent to it via Standard Input and sends the results to Standard Output. It can perform numeric, alphabetic, and alphanumeric sorts in forward and reverse order, but it does nothing else. It only sorts, and it is very, very good at that. Because it is very small, having only 2,614 lines of code as shown in the table below, it is also very fast.

Command    Source lines
who                 755
awk               3,412
sort              2,614
uniq                302
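
Here is a quick, hypothetical demonstration of those sorting modes; the three input values are invented purely to show the difference:

printf '10\n2\n1\n' | sort      # alphabetic order:        1, 10, 2
printf '10\n2\n1\n' | sort -n   # numeric order:           1, 2, 10
printf '10\n2\n1\n' | sort -rn  # numeric, reverse order:  10, 2, 1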

Choose portability over efficiency and

Use shell scripts to increase leverage and portability

In the long run, the portability of shell scripts can be far more efficient than the perceived efficiency of writing a program in a compiled language (not even considering the time required to compile and test such a program), because shell scripts can run unchanged on many otherwise incompatible systems.

For example, I worked on the email system at the State of North Carolina for several years. I was responsible for a collection of Perl and BASH CGI programs that ran on a Red Hat Linux host. These programs allowed remote users in all of our 100 counties, and in hundreds of large and small agencies, to perform account maintenance for their users. I was asked to test the possibility of moving all of these programs to Red Hat running on an IBM Z-series mainframe.

I created a tarball from the existing programs, data, and configuration files, copied the tarball to a Red Hat instance on the mainframe, and untarred it. I made one configuration change to the Apache httpd.conf file: I changed the IP address on which Apache listened. I started Apache, and everything worked as it was supposed to with no other changes required. The whole process took about ten minutes and is an excellent example of true portability.
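
The whole move amounted to something like the following sketch; the paths and host name are hypothetical reconstructions for illustration, not the originals:

# On the original Red Hat host: bundle the programs, data, and configuration.
tar -czf email-tools.tar.gz /var/www/cgi-bin /var/www/html /etc/httpd/conf/httpd.conf

# Copy the tarball to the mainframe Linux instance and unpack it in place.
scp email-tools.tar.gz admin@zlinux:/tmp/
ssh admin@zlinux 'cd / && tar -xzf /tmp/email-tools.tar.gz'

# Edit the Listen directive in httpd.conf to the new IP address, then start Apache.
ssh admin@zlinux 'apachectl start'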

Had those programs been compiled, I would have had to recompile them and remove any hardware-specific efficiencies that we might have programmed in.

Use software leverage

Software leverage means a couple of things to me. First, and in the context of this example, it means that by using four command line commands, we are leveraging the work of the programmers who created those commands, over 7,000 lines of C code in all. That is code that we do not have to write ourselves. We are leveraging the efforts of those other, under-appreciated programmers to accomplish the task we have set for ourselves.

Another aspect of software leverage is that good programmers write good code and great programmers borrow good code. Never rewrite code that has already been written.

One of the great advantages of Open Source software at all levels, from the kernel, through the GNU and other utilities, and all the way up to complex applications, is that there is an incredibly large amount of well-written and tested source code out there that can do almost anything you want to do. Just find the pieces you need and include them in your own code. I reuse my own code many times over in different programs. I also use a lot of code written by other folks when it meets my needs. Either way, it saves me a lot of work and keeps me from having to reinvent perfectly good code.
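
In shell terms, one simple way to practice this is to keep proven functions in a personal library file and source it from every new script; the file name and function below are hypothetical examples of mine, not a standard convention:

# ~/bin/mylib.sh -- a personal library of reusable shell functions.
logged_in_users() {
    # The same pipeline from earlier in this article, written once and reused anywhere.
    who | awk '{print $1}' | sort | uniq
}

Any new script can then begin with source ~/bin/mylib.sh and simply call logged_in_users instead of rewriting the pipeline yet again.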


This article is not meant to be a programming tutorial. Rather, it is intended to illustrate how the Linux Philosophy impacts and informs the daily work of system administrators and developers.

We are the beneficiaries of decades of code that was well-designed, well-thought out, and well-written by people who had a lot of skin in the game and actually knew what they were doing. The best code on the planet was written using these tenets.

The GNU Utilities alone represent a huge investment of time and effort by Richard Stallman and other programmers to provide open, free Unix utilities to anyone who wanted them. These GNU utilities were incorporated into the original Linux distribution by Linus Torvalds. Together this constitutes GNU/Linux and provides us with a source for an operating system and utilities that are incredibly powerful and useful.

GNU/Linux gives us free and open tools that enable us to perform incredibly complex and creative tasks, many of which cannot be performed with any other tools. Many of these small, “do one thing well” GNU utilities are designed around STDIO so that they can be strung together in ways the authors of large, monolithic programs could never have imagined. In fact, even the programmers of the GNU utilities themselves could not have anticipated the virtually infinite number of ways in which their tools would be combined to perform tasks they never envisioned. Yet these utilities, whose designs trace back more than 40 years to the original Unix tools, are still in heavy daily use on computers around the world.

Next time I will have a challenge for you.


  1. Eric Raymond: The Art of Unix Programming http://www.catb.org/~esr/writings/taoup/html/index.html
  2. Mike Gancarz: Linux and the Unix Philosophy; Digital Press, 2003, ISBN 1-55558-273-7
  3. Wikipedia: http://en.wikipedia.org/wiki/Unix_philosophy
  4. Oregon State University: http://web.engr.oregonstate.edu/~traylor/ece474/lecture_verilog/beamer/linux_philosophy.pdf


About the author

David Both - David Both is a Linux and Open Source advocate who resides in Raleigh, North Carolina. He has been in the IT industry for over forty years and taught OS/2 for IBM where he worked for over 20 years. While at IBM, he wrote the first training course for the original IBM PC in 1981. He has taught RHCE classes for Red Hat and has worked at MCI Worldcom, Cisco, and the State of North Carolina. He has been working with Linux and Open Source Software for almost 20 years. David has written articles for