I turn 70 on my next birthday, and it's well known that people my age aren't computerate, at least here in Australia.
We have to be taught about email and web browsers and Facebook by grandchildren, or in special "computing for seniors" classes. We're not expected to know much about word processing or spreadsheets. And, if we're looking to buy a computer, it's best to ask a younger person to find something suitable for us.
But in fact, I know a fair bit about computers and computing, despite never having been an IT professional, or for that matter IT-trained. I program, too. Every time I sit at my computer I use programs I've written myself, and in retirement I spend a lot of my desk-time writing code.
How did that happen?
Back in the dreamtime
To begin with, I learned the basics of computer programming in high school in 1961, and no, that year isn't a misprint. My high school had a small IBM mainframe that processed data on punched cards, using an early version of the FORTRAN programming language. Kids in my high school were taught how to write simple programs in FORTRAN.
Two years after I left high school, the science-fiction writer Isaac Asimov imagined what life would be like in 50 years' time, when computer use would be universal:
"All the high-school students will be taught the fundamentals of computer technology, will become proficient in binary arithmetic and will be trained to perfection in the use of the computer languages that will have developed out of those like the contemporary 'Fortran' (from 'formula translation')."
Sorry, Mr. Asimov. Even after personal computers became everyday devices everywhere, our schools here in Australia resolutely avoided teaching any programming languages. Curriculum planners opted instead to train computer-ready workers: interchangeable parts in a digital economy where the things computers do are mainly decided by large corporations.
An Australian high school graduate today has experience with Microsoft Word, Microsoft Excel, and Microsoft PowerPoint, and not much else. They are ready for your Windows-centered workplace, Mr. Employer!
There's been a push in recent years to teach coding in schools. Some of that push has come from overseas, as in the Hour of Code events that have engaged tens of millions of schoolchildren worldwide. Here in Australia, the new national curriculum has a "Digital Technologies" component, with programming ideas to be introduced in primary school. Even the major Australian political parties have become enthusiastic about coding in education.
The drivers behind this sudden enthusiasm for coding seem to be largely economic. It's to prepare Australian children for the "jobs of the future." The president of the Business Council of Australia, Catherine Livingstone, told the National Press Club in April 2015:
"As it stands in Australia, the gap between the digital literacy of our young people and that of our competitor nations is increasing. If we want increased productivity and participation, we need urgently to embark on a ten year plan to close that gap."
In other words, Australia admits it hasn't prepared well for today's digital jobs. Or those of the past 20 years, ever since the Internet took off. To get ready for some vaguely imagined future, we're now urged to train today's teachers, who are clueless about programming, to train their students to be coding teachers, who will train their students to code. Asimov might not be wrong, after all. Getting programming languages into high school apparently just needs 75 or 80 years, not 50.
Maybe it's a good idea to teach code in schools, maybe not. It certainly helped me, more than 40 years later. But the policy-makers, I suspect, don't understand what coding is all about, except that it has something to do with computers, and that at least some of the people who write code make good money.
Safe or risky?
A June 2015 series of articles in Bloomberg Business was titled "The world belongs to people who code." How does that work? Like this, in my humble opinion: code controls computers and the people who use them. If you use pre-written code, supplied with a computer, packaged on a DVD, downloaded from the web, or bought from an IT company, you're not in charge. You'll do what the software writers have allowed you to do and nothing else. In many cases you'll pay big money for the privilege of being limited this way, and you'll be closely restricted by end-user license agreements.
If you write your own code, on the other hand, you're the boss. You control your device and any other device that can run that code. Coding liberates. Does the business community really understand that? The more code you know and can write, the less shackled you need to be to existing business software and the work patterns imposed by those programs.
Does Mr. Employer want graduates who know how to take apart and rebuild the information flows in his business? People who can replace expensive software and consultants and third-party support providers with free code and in-house expertise? People who aren't interchangeable parts, and who expect to be paid what they're worth? Maybe, but it will probably mean giving up the familiar mediocrity of Windows.
I doubt we'll get competent coders by mass education, though, any more than we've got competent writers by mass education. And if you aren't compelled to hear about coding in school, year after year, why would you voluntarily learn to code, as I did?
That's a hot topic, and I can't speak for every coder, but here are seven reasons why an old fella like me does programming:
- In most cases, I code because no available software will do what I want.
- My code often does a better job than available software, or does it faster or more simply.
- The rewards for successful coding are instant. (It works!)
- Coding is creative, personal expression. It's possible that no one else has ever written anything like the code in some of my programs.
- Coding is good and productive exercise for an old brain.
- I like feeling in charge of my computer, rather than having it constantly obey the dictates of someone else's software.
- ...and another reason, which will take a little explaining.
Ten years ago, if I'd promoted open source software I'd have been regarded as a dreamer or a boring evangelist. Why would someone write code, then give it away for anyone else to use, or modify, or redistribute?
In 2015, open source software doesn't need promoting. It's everywhere, doing everyday jobs on digital devices in the home, in your pocket, at the office, in data centres, and on networks of all sizes. Most of its writers are volunteers or paid programmers who insist that the software they write is released under a "free to use, modify and redistribute" license. The most widely used device operating system, Android, is open source and based on another hugely successful open source project, Linux.
Giving is good, and giving back to a community that shares is even better. It's a nice feeling to share code into the public domain. Mine appears in online Linux magazines. How many coders around the world have read the coding tutorials and demonstrations I've published? I have no idea. But if my code gets used, it's because I've made sure it works. That's a great incentive to keep coding.