Progress

16 Jan 2014

Almost thirty years ago, in 1984, I got my first personal computer (shared, of course, with my brother) - an Acorn Electron. It had an 8-bit 6502 CPU running at 2MHz, 32K of RAM, much of which was used for the screen, and could display graphics in eight colours on an analogue TV, once you’d retuned a spare channel¹. It opened up the world of arcade clones (I grew up on Snapper and Boxer, and wouldn’t play Pac-Man and Donkey Kong until years later), text adventures, and of course programming - first in BASIC, then in 6502 assembler. We loved it.

Twenty years ago, I had an Amiga 1200 (still shared). A 32-bit 68020 at 14MHz. 3.5” floppy disks instead of cassette tapes. Hardware-accelerated graphics in hundreds of thousands of colours². Gorgeous, deep games like Frontier and Beneath a Steel Sky. In every way, it made the Electron look like a relic of a bygone age (which, by that time, it was).

Ten years ago, I was writing up my thesis on an Apple iBook. 800MHz PowerPC G4, 512MB of RAM. Graphics that the Amiga could only dream of, and in a sleek, portable package to boot. A capacious hard drive and a wireless network connection. It was not only an incredibly powerful computer in its own right, but also part of a global network that was scarcely imaginable a decade before.

Today, I’m writing this on a MacBook Pro, which is… not all that different, to be honest. Somewhat faster, somewhat more storage, a nicer screen, but it can’t do anything fundamentally different from the iBook I was using in 2004. Has progress stalled? Has the personal computer had its day?

It’s certainly true that there haven’t been any revolutions on the scale of the move from cassette to disk to mass storage, or orders-of-magnitude leaps in processing power or graphical ability. More importantly, the fundamental capabilities are the same. Ten years ago, I had a portable, Internet-connected computer with a powerful processor, plentiful on-board storage, a decent screen and a keyboard. Today, I have the same thing.

But.

I also have a smartphone, which allows me to check email, browse the web, and perform a plethora of other tasks wherever and whenever I want.

And I have an iPad that gives me a decent chunk of the laptop’s functionality in a lighter, more comfortable package, with a battery that lasts all day.

And I have a virtual server that provides a permanent presence on the network, without relying on my flaky domestic electricity supply and data connection.

And I have a Raspberry Pi, which lets me tinker at the lowest level of both software and hardware (or hook it up to a TV and watch videos if I’m feeling less tinkery).

And I have a host of other more specialised devices that let me read books, watch films, and listen to music in ways that didn’t even exist a decade ago.

Some might argue that these aren’t PCs - indeed, some of them are often referred to as “Post-PC” devices - but I think that requires a definition of PC that’s unnecessarily restrictive. My smartphone, for example, is most certainly a computer, and both physically and functionally it’s the most personal one I’ve ever owned.

The personal computer isn’t dead. In fact, it’s doing better than ever. More people are using computers, in more ways and more often, than ever before. All that has changed is that the PC is no longer a single machine that does everything, closeted away in a spare room (or plugged into the TV via an RF modulator). Ever-improving technology, coupled with the interoperability provided by the Internet and the web, has made it easy to have many different devices, each tailored to specific needs.

In The Invisible Computer, Donald Norman relates that, in the early twentieth century, a home might have a single electric motor, with numerous attachments to adapt it to specific tasks (sewing, grinding meat, churning butter). As motors became cheaper and more reliable, they proliferated, and each device (blender, vacuum cleaner, sewing machine) would have its own. At this point, users no longer see the motor, just the device and the task - I’m not using the motor, I’m cleaning the floor.

Norman suggests that computers will follow the same trajectory, and indeed this is what’s happening. He describes it as computers disappearing, and being replaced by “information appliances”. I’m not convinced by this distinction; to me, computers are simply becoming more plentiful, more capable, and above all more personal. That’s progress.


Many thanks to The Centre for Computing History for providing the Electron and Amiga images. They’ve just opened a museum in Cambridge, with exhibits ranging from early punch-card machines and minicomputers to the home micros of the 80s and beyond. Much of the vintage hardware is up and running, so you can see if you’re still any good at GoldenEye or try to remember some BASIC. Well worth a visit if you’re in the area.

  1. A TV would typically have eight channels - mapped to physical buttons with associated tuning knobs - which left plenty spare, as there were only four broadcast channels at the time.

  2. Well, 256 in sensible display modes - to get more, you needed to employ the CPU-intensive trick of Hold-and-Modify (HAM).
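
For the technically curious, HAM is simple enough to sketch. Below is a minimal, illustrative C version of the original HAM6 scheme (the A1200’s HAM8 extends the same idea with 6-bit modify values). On the real machine the display chip did this decoding on the fly; the CPU cost came in generating images that fit the scheme. The names here are my own, not from any Amiga API.

    /* Sketch of Amiga HAM6 decoding. Each 6-bit pixel either sets the
       colour from a 16-entry palette, or holds the previous pixel's
       colour and replaces one 4-bit component - hence "Hold-and-Modify". */
    #include <stdint.h>

    typedef struct { uint8_t r, g, b; } Rgb;   /* 4-bit components, 0-15 */

    void ham6_decode_row(const uint8_t *pix, int width,
                         const Rgb palette[16], Rgb *out)
    {
        Rgb cur = palette[0];       /* each scanline starts from colour 0 */
        for (int i = 0; i < width; i++) {
            uint8_t data = pix[i] & 0x0F;       /* low 4 bits: index or value */
            switch (pix[i] >> 4) {              /* high 2 bits: control */
            case 0: cur = palette[data]; break; /* set from palette */
            case 1: cur.b = data; break;        /* hold R,G; modify blue */
            case 2: cur.r = data; break;        /* hold G,B; modify red */
            case 3: cur.g = data; break;        /* hold R,B; modify green */
            }
            out[i] = cur;
        }
    }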
