For many of us, when we look around at the state of computing in 2016, we nod and think, "Yes, today is what I expected when I thought about what The Future would be like." Sure, we haven't got flying cars yet, but today's technology is flashy. We swipe fingers across screens instead of pressing buttons, and I'm told we are all very excited about all of the latest virtual reality headsets and augmented reality gadgets.
So now seems as good a time as any to look back at how people of the past used to compute, and back to the days when a "desktop" computer was so called because it took up 80 percent of your desktop. The days when the term "computer" actually meant "a machine for computation."
Why bother looking back at 30-year-old computing? After all, the computers back then were clunky, slow, and awkward, weren't they? Sure, they were, but the great thing about living in The Future is that we have the power to look back at the Old Ways and cherry-pick information from them for modern technology. The truth is, there's power in simplicity, and old computing was simple out of necessity.
Survival of the thriftiest
Have you ever played a survival video game? The kind in which you must survive a zombie apocalypse by hoarding canned food and using rolling pins as weapons? Computers are a little like that.
People make complaints in modern computing about bloat, and the cavalier answer is usually "CPU cycles are cheap!"—or in other words, "I'd prefer to treat the symptom rather than the cause." In the old days, CPU cycles were prohibitively expensive, and even now, “cheap” isn't the same as “free.”
For $89, you can get a PocketCHIP and a keyboard, and have a tiny computer ready to use at any moment, but its CPU cycles are not cheap. What you save on its cost and power requirement you pay for in computing power, but you'll hardly notice, as long as you're okay with simplified computing.
The same is true for server slices or instances on shared servers, or within Crouton. The computer is there for the taking, but you have to learn to conserve your resources, whether it's bandwidth or CPU cycles or RAM.
A workflow of your own
Old computers weren't the pre-fab ready-to-wear appliances that we're used to now. Hardware is neat, but I'm not talking about the hardware. The stuff we interact with on a daily basis is the software, and "integrated" software is a relatively new idea.
Back before big monolithic applications got dumped onto the unsuspecting public, software came in batches on "shareware" diskettes and BBSes. If you wanted to create animation, you got software to help you draw images, used more software to string images into an animated sequence, used other software to produce sound effects, and finally used software to combine the sound and the image.
There was freedom in that, both for your computer, which didn't need to be capable of running that one big app that tried to do everything, as well as for yourself because it was up to you which applications you strung together to get the job done.
One way to compute like it's 1989 is to go and get a Raspberry Pi or a PocketCHIP and dive into the wonderful world of low-powered living. The good news is that running Linux today is surprisingly similar to running UNIX or Linux in the '90s. The bulk of the commands you know and love are there, and many of the applications and the general sensibility have endured.
Most everything you do on your computer now can be done in a terminal, which obviously is the lightest of lightweight interfaces.
The old style of file management was far more direct than modern interfaces allow. Modern computers have automagical file handling according to a mimetype database. Mimetypes are useful when your main interaction with a file is pointing at it and double-clicking to open, but that becomes largely irrelevant when you yourself are making the call about which application to use. It might surprise you to learn how much faith we tend to put in auto-detection now. For instance, who knew you could run a valid text edit command with sed on a dynamic library, or that you can see metadata in a sound file with less? When you stop relying on pre-programmed decisions, you end up learning something new.
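As a quick sanity check of that claim, here's a self-contained sketch: it fabricates a small binary file with an embedded text tag (a stand-in for a real sound file; the filename and tag are invented for the demo), edits it with sed, and pulls the tag back out.

```shell
# Fabricate a stand-in "sound file": binary-ish bytes with an embedded text tag.
printf 'RIFF\0TITLE=My Song\0more-binary-bytes' > /tmp/fake_sound.bin

# sed runs a perfectly valid edit command on it, text file or not
# (writing to a copy rather than editing in place):
sed 's/My Song/Your Song/' /tmp/fake_sound.bin > /tmp/fake_sound_edited.bin

# grep -a treats the binary data as text and pulls the metadata back out:
grep -a -o 'TITLE=[A-Za-z ]*' /tmp/fake_sound_edited.bin
```

The point isn't that you should edit libraries or WAV files with sed; it's that classic UNIX tools operate on byte streams and don't refuse a file just because a mimetype database says it isn't text.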
Most modern Linux users have at least heard of file management commands such as mv, and those are entirely valid and certainly the most efficient way to manage files. They're not the only option, though. If you yearn for a happy medium between constructing valid Bash commands and the intuitiveness of graphical interfaces, look to GNU Midnight Commander.
Midnight Commander (invoked as mc) is a DOS-style utility that provides a split-pane file manager in your terminal. It's primarily keyboard-driven, although, being written with the ncurses toolkit, it can also receive mouse clicks.
The interface is wonderfully intuitive. All you need to know is that it's a file manager, and from there you can learn it quickly. Contextual commands are listed at the bottom of the window, each assigned to a function key, and a full menu (a "pull-down" accessed with F9) is always available at the top of the window.
Personal customization notwithstanding, there are always two panels in Midnight Commander, and you switch between them with the Tab key. All the usual file operations are either a menu selection or a keypress away. It's intuitive enough that you can poke at it for a few minutes and fall into using it without introduction, and yet so efficient in design that you can quickly become a power user and control it with emacs-like key combos. It is, for all intents and purposes, a "desktop" reminiscent of the MS-DOS or ProDOS experience.
To a lot of people, the Internet and the "www" subdomain are one and the same. What many of them don't realize is that "www" is just the World Wide Web portion of the larger Internet, generally the part that serves content over HTTP(S).
Web browsers have conditioned most of us to view the web as a place in which you go and "hang out." You sit and idle; you go to it, but never bring it home. This, of course, isn't strictly true—you're downloading bits to a temporary cache, but that's all abstracted from you by the browser. You can still visit the modern web whilst living a digital retro lifestyle, but it's less about loitering and more about getting stuff done. The earliest Linux distributions shipped with Lynx and ELinks, which provide the typical HTTP experience modern web users are used to, but there were and are many other ways to interact with the Internet:
Atom and RSS
The advantage of these is that they follow a "push" model instead of a "pull" one. You don't have to go out and check a website to see whether there are updated news items; the software sends you an alert instead. Much of my daily web browsing is taken care of with one look at newsbeuter or Mashpodder. Once you start using RSS and Atom, you might just find that the sheen of HTTP is a lot duller than it seemed before.
Of the two, newsbeuter is the easier to configure and use. Install it from your distribution's repository, and then launch it once to force it to instantiate its configuration file. Once that's done, you have only to edit your ~/.newsbeuter/urls file, a simple line-delimited list of the feeds you want to check. A sample from my current list:

$ head ~/.newsbeuter/urls
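The file format couldn't be simpler: one feed per line, with optional space-separated tags after the URL. Here's a hypothetical example (the feed addresses are placeholders), written to a temporary path just for illustration:

```shell
# Build a hypothetical feed list; newsbeuter actually reads ~/.newsbeuter/urls.
cat > /tmp/urls <<'EOF'
https://example.com/news/feed.atom
https://example.org/podcast/rss.xml linux podcasts
EOF
head /tmp/urls
```

Add a line, relaunch newsbeuter, and the new feed is there. No accounts, no sync services, just a text file.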
Wget, curl, fetch
Regardless of the UNIX you're running, you have available some command to access the network and fetch a file. It's browsing the web without the browsing, and it's sublime. Unfortunately, many modern sites obfuscate where the actual content is (if you're using wget or similar, then you're not clicking on their ads), but for the practical websites out there, a quick download command is liberating and efficient.
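To make the point concrete, here's a minimal sketch. It uses a local file:// URL so it runs anywhere without a network; in real use you'd substitute any http(s) address you actually want:

```shell
# Stand-in "website": a local file reachable via the file:// scheme.
printf 'hello from the practical web\n' > /tmp/page.txt

# Fetch it without a browser; curl and wget do the same job.
curl -s -o /tmp/fetched.txt file:///tmp/page.txt
# wget -q -O /tmp/fetched.txt https://example.com/page.txt   # same idea over HTTP

cat /tmp/fetched.txt
```

One command, one file on disk, and no rendering engine anywhere in sight.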
Git itself is great, but another good thing about git's popularity is that people have actually started hosting blogs and other content in git repositories, which means you can easily grab that content using just a UNIX shell.
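Here's a sketch of what that looks like, using a throwaway local repository as a stand-in for someone's real blog repo (the paths and post name are invented for the demo; in practice you'd clone over https):

```shell
# Create a stand-in "remote" blog repository.
git init -q /tmp/blog-remote
printf '# Hello\nA post served as plain files in git.\n' > /tmp/blog-remote/post.md
git -C /tmp/blog-remote add post.md
git -C /tmp/blog-remote -c user.email=demo@example.com -c user.name=demo \
    commit -qm 'first post'

# "Subscribe" with a clone, then read posts with ordinary shell tools.
git clone -q /tmp/blog-remote /tmp/blog-local
cat /tmp/blog-local/post.md
```

Updating your copy of the blog is just a `git pull`, and reading it is whatever pager you already like.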
Thriving community servers are out there that are open to new users, and you can also build your own. You can find a list of free shell accounts, and now that you can get a computer for $35, setting up your own is one install and port forward away, even if only as an experiment to see how many friends you can get to join you.
There's an unusual amount of nostalgia out there for the Gopher protocol. It's not the greatest system ever invented (Gopher servers sometimes have trouble parsing Gopher's markup), but it does underscore one point that much of the web seems to have forgotten: It's all about the content, not the ads. When your site is serving lists of text and binary files, you inherit an objectivity that just gets lost in modern sites. The Lynx browser still recognizes Gopher, so start your journey with it.
Floodgap Public Gopher Proxy
Of course there's always email, the original social network. Too many people these days fall back on Gmail and other providers that use over-complex web interfaces that cease to function at even the slightest variation of browser vendor or version. There is a better way, and that is Mutt. It's a lightweight, simple, efficient, and effective email client with more customizable config options than you could ever need. Better yet, it has next-to-transparent PGP integration, so you can start encrypting those emails from end to end.
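Getting started takes only a few lines of configuration. The following is a hypothetical minimal muttrc for a generic IMAP account (the server names and address are placeholders, not a real provider); it's written to a temporary path here just to show the shape:

```shell
# A hypothetical minimal mutt configuration for one IMAP account.
cat > /tmp/muttrc <<'EOF'
set imap_user = "you@example.com"
set folder    = "imaps://imap.example.com/"
set spoolfile = "+INBOX"
set smtp_url  = "smtps://you@example.com@smtp.example.com/"
EOF
# Try it with: mutt -F /tmp/muttrc
head /tmp/muttrc
```

Four settings and you're reading mail; everything after that is taste.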
Don't forget Usenet, Tor, GNUnet, and more. There are too many ways to access the worldwide network to list. If you look, you'll find all kinds of interesting lightweight technologies lying around out there.
Graphics without X
On some devices, an X server just isn't practical—sure, it might be possible, but you just know it's eating up a lot of precious RAM. You might run a lightweight desktop, but you'll still inherit the overhead of running X in the first place.
Mostly, when computing the Old Way, you don't need much by way of a graphical interface. A GUI just clutters things up, gets your hands away from the keyboard where they belong, and is painfully inefficient. If ever you're going to wish for a graphical display, it will be when you're online or checking email. People love graphics on the Internet, and people love embedding images into email.
Don't reach for startx just yet. What if I told you that you don't need to run X to display graphics on your screen? Thanks to the Linux framebuffer device, /dev/fb0, you can do just that.
There are a few different utilities to draw images straight to your screen without a graphic server. These don't work over remote or emulated connections (SSH, screen, tmux), but as long as you're sitting at the physical computer you're using, you can direct all kinds of output straight to the physical screen attached to it.
To view images, there's fbi (framebuffer image viewer) and its successor, fim (Fbi IMproved). Both do essentially the same thing: point one at a bitmap file, and it paints the picture on your display, abruptly, without fanfare or apology. You get various controls; you can zoom, pan, or step through a slideshow. It's easy and immediate, and it's exactly what you need.
You can even play video without X, believe it or not. You need to make sure your username is a member of the "video" and "audio" groups (this is usually a default on even the bare-bones Linux distributions), and then:
$ mplayer -vo fbdev my_movie.mp4
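Checking group membership before you try saves a puzzling permission error. A quick sketch (the suggested fix is only a hint; group names can vary by distribution):

```shell
# Check whether the current user is in the video and audio groups.
user=$(id -un)
for g in video audio; do
  if id -nG "$user" | grep -qw "$g"; then
    echo "ok: $user is in $g"
  else
    echo "missing $g (fix: sudo usermod -aG $g $user, then log back in)"
  fi
done > /tmp/group_check
cat /tmp/group_check
```

If either group comes up missing, add yourself and log in again before blaming mplayer.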
Understand, this isn't a gimmicky "convert your images to ASCII" scenario. These tools actually display the images and video on your screen without a GUI. Depending on which shell you're using, painting pixels this way can confuse your input. If your shell starts to act funny after using the framebuffer, run the reset command, and everything ought to return to normal.
The "you" in UNIX
UNIX training and training videos produced in the 1980s made it abundantly clear that the intent of the operating system was and is to empower users to take small commands and string them together to accomplish complex tasks. The individual's workflows were meant to be unique and infinitely extensible.
Beyond the mimicry of old computer interfaces and the rejection of modern network chatter is the enduring principle that computerists should be eager to find new tools, useful programs, and exciting ways to piece things together to accomplish tasks and to make life better for everyone. In other words, put the "you" in UNIX.