Issue #64: Retrocomputing

Return to Innocence

The pages of this magazine have often orbited around the subject of retrocomputing. Take, for example, the editions about sustainability, computer museums, hardware, hobbies, gaming, operating systems, or the one about BASIC published last summer. If you pay attention, you have most likely noticed how much retrocomputing has grown in popularity over the past few decades, with more and more people learning on YouTube or TikTok how to replace the batteries or leaking capacitors on the motherboards of all kinds of computers of yesteryear.

It is the firm opinion of this author that at least two long-term trends lie hidden beneath the appeal of retrocomputing among hobbyists and computer clubs.

First trend: Moore’s Law, or rather, the lack thereof. Despite all of Apple’s efforts to hide the flattening curve with its ARM-based chips, it is vox populi that a computer from 2014 is just about as powerful as the one you bought as a Christmas gift in 2023, bar the amount of RAM or the speed and capacity of the SSDs therein. On the other hand, consumer PCs of 2014 were, by all standards, widely superior to those of 2004, immensely superior to those of 1994, and, needless to say, to those of 1984.

In practical terms, we can safely say that we live in a weird, intermediate, transitional age, in which you are, as a matter of fact, retrocomputing as you read this article. Spoiler alert: unless there happens to be some breakthrough in physics that drives more gigahertz to our CPUs, more crypto to those GPUs, more watts to our batteries, and more money to cloud providers, or unless there is some development in the realm of affordable, “home” quantum computing (will that ever be a thing?), we most probably will not see any difference between the computer you are using today and the one you will receive as a gift for Christmas 2033. Get used to this fact.

Retrocomputing, then, emerges not only as a fun, recreational activity, but also as one with widely applicable skills, particularly for those working in sustainable computing, and those spreading its use in regions of the world where it has not yet reached peak maturity, and where refurbishing old hardware is an effective way to raise standards of living through computer technology. In this age of rising temperatures and rising oceans, reusing old computers is a very valuable skill.

The second trend: the rise of AI. The fears of dehumanization naturally brought about by ChatGPT and other copilots have shifted the spotlight back to retrocomputing. Paraphrasing Obi-Wan Kenobi, computers of yore were “elegant weapons for a more civilized age.” And just as Jedi finish their training by building their own lightsabers, many computer enthusiasts scavenge old computer stores for the kyber crystals (sorry, the vintage CPUs and capacitors) required to build or repair their own retrocomputer systems.

We marvel at the fact that the Apollo 11 guidance computer is vastly outclassed by the cheapest Apple Watch, yet it helped Armstrong and Aldrin land on the Moon. Those of us old enough to have witnessed a Commodore 64 in action chuckle as we run the VICE emulator and load a copy of Impossible Mission. We remind ourselves of times when Turbo Pascal was the most advanced IDE, and how we used it to code 2D games with our friends, tirelessly copying code from a magazine. We shake our heads in dismay watching young kids trying to use Windows 95, struggling just like Scotty did in Star Trek IV as he interacted (“how quaint!”) with a first-generation Mac… using the mouse as a microphone.

No, Scotty; neither Siri nor Alexa nor ChatGPT were available on the first Mac. And thankfully so. In our world, there is no better mechanism to get away from the Generative AI craze than using a computer that cannot run any of it. Preferably, not even a web browser. How is that for Luddism? As much as we may find ChatGPT or DALL-E dazzling inventions, we marvel at a time when computers were so simple that you could understand every circuit in them just by reading the pile of manuals the manufacturer provided in the box: yet another sign that those were elegant weapons for a more civilized age.

Yes: even if you are not aware of it, each one of you is looking for ways to get away from the modern world. We all are. All the time. There are two kinds of people in our modern Western world: exhausted people and liars.

That is why people repair and drive old cars, you know, the ones without computers on board. That is why people enjoy classical music, go to museums, or read history books. That is why we browse old photos in an album left behind by our grandparents. People have done this for ages, but the difference nowadays is that, for the first time in the history of mankind, a fifty-year-old person can have vivid memories of a childhood before the iPhone existed. The staggering level of change (or, as historians like to put it, the “pace of progress”) of our current times is draining our capacity for attention and patience.

Fans find retrocomputing to be a proverbial breath of fresh air. Free from the distractions of our wondrous, 30-year-old World Wide Web, its activities bring back memories of a time when techno-optimism and its associated utopias were still a wish, and not a dystopian reality.

Retrocomputing can be as simple as playing a game of SimCity, compiling code in THINK C, making a website with HoTMetaL Pro and Paint Shop Pro and testing it on NCSA Mosaic, or writing your next bestseller on a rock-solid (and licensed-for-life, an underrated feature of legacy software) copy of Word 5.1a for Mac, allegedly the best version of Microsoft Word ever released. After all, does not George R.R. Martin write his novels with WordStar 4.0 for DOS? (In case you would like to follow in his footsteps, you can download it for free, read the manual, and run it on FreeDOS. You are welcome.)

Speaking of old software, retrocomputing can consist of downloading and running some from the Internet Archive, VETUSWARE, WinWorld, or Bitsavers, recreating Chandler Bing’s laptop, installing the oldest possible Linux distro, playing old Mac games, watching Twitch channels dedicated to the Amiga or Objective-C, or going even further back in time and trying to run Multics on an emulated Honeywell mainframe.

You can enjoy retrocomputing simply by exploring the MacPaint and QuickDraw source code, enjoying the original music of Super Mario Bros or Pac-Man, following @mos_8502 on a retrocomputing Mastodon client, buying an old Kaypro 2 computer, loading software onto an Apple II through sound, emulating a MITS Altair 8800, emulating a Nintendo Game Boy, compiling and running programs (kids: we did not yet call them “apps” back then) written in “arcane” or “ancient” programming languages…

And so much more.

Retrocomputing is good for our geek minds, exposed as they are to a plethora of unused and unusable functionality spread across gigabytes of disk space, requiring permanent network connections for seemingly no reason at all (well, mostly for corporate surveillance). It is an escape mechanism from a world that has grown increasingly complex, even for computer geeks.

Here is a different angle, compatible with the MCU, the DCEU, and another recent movie, brought forward by Paolo Amoroso:

Retrocomputing is a multiverse.

It’s a multitude of parallel universes in which every classic computer and software achieved success, and gained the ecosystem and love it always deserved.

Why not? Paraphrasing Anton Ego, retrocomputing could be seen as an example of fresh, clear, well-seasoned perspective. We, computer programmers and users, crave a return to innocence: to a time when we were not afraid to use weak typing, when we could drop our hubris, when we could look into our kernels and return to ourselves. Alas,

It has been 50 years since the Magnavox Odyssey, and all innocence is lost.

Cover photo by Vincent Botta on Unsplash.

Adrian Kosmaczewski is a published writer, a trainer, and a conference speaker, with more than 25 years of experience in the software industry. He holds a Master's degree in Information Technology from the University of Liverpool.