It is hardly possible these days to browse any blog or news website commenting on the latest trends in the Apple galaxy without reading references to the mythical, yet entirely foreseeable, ARM-based Mac. The existence of this not-yet-announced piece of otherwise overpriced hardware is entirely predicted by a closer reading of technology history in that same galaxy, an exercise usually (sadly) abhorred by many.
In the past four and a half decades, Apple has moved its products across successive CPU architectures in a continuous flow. Looking at it from the vantage point of this dystopian future of ours, each product in the cover picture of this article features a different CPU architecture. After the 8-bit MOS Technology 6502 that powered the Apple ][ around 1980, Macs used the 16-bit (later 32-bit) Motorola 68000 around 1990, then the 32-bit (later 64-bit) IBM PowerPC around 2000, later settling on Intel’s 32-bit x86 and 64-bit x86-64 around 2010.
We are in 2020, not even around 2020 anymore, and if we believe all of those blogs, we should be able to spend an insane amount of money to buy Macs powered by ARM architecture CPUs – “finally!”, as some of those headlines online would scream. After all, even the latest iPhone SE models are beating those old crusty Macs in pretty much every Geekbench test out there. And even Microsoft is selling ARM-based laptops these days.
For each one of those CPUs enumerated above, there has always been an “official” development environment; from Integer BASIC for the Apple ][, to Object Pascal for the Mac, to C++ and MacApp, to CodeWarrior and PowerPlant, to Objective-C and Cocoa, and finally to Swift and the upcoming SwiftUI. These “sacred” environments were always historically considered the “one true way” to build software for those platforms. Steve Jobs’ famous “Thoughts on Flash” open letter from 2010 still exemplifies that hard stance. Other means are frowned upon, or simply downright banned.
As an interesting pattern, these five programming languages have all been about as different from one another as possible, thereby reducing the chances of mutual compatibility to an absolute minimum. The differences among them are not only of a syntactic nature, which would have been a lesser problem, but in terms of runtime behaviour, memory models, and module patterns; the whole Swift application binary interface affair has recently put these difficulties in perspective once again.
At each step of the way, to ensure a certain smoothness in the migration to a new architecture, Apple has provided some sort of emulation or transitional technology, allowing software from a previous era to run unmodified in the next, or at least to be easily recompiled.
This has always happened, and will happen again.
Examples abound. The Apple ][e Card allowed Apple ][ software to run on the Macintosh LC in the early 90s. The Mac 68k emulator enabled Mac apps written for the 68000 family to run on PowerPC CPUs from 1992 to 2008. From 1994 to 1998, the Macintosh Application Environment made Mac apps run on Sun and HP Unix workstations, much as A/UX, Apple’s own flavor of Unix before Mac OS X, could run them natively. The Classic Environment allowed PowerPC applications to run on Mac OS X from 2000 to 2008. The Carbon API allowed developers to “recompile and run” older Mac OS 9 apps on Mac OS X from 2000 to 2012. Rosetta allowed Mac apps built for the PowerPC architecture to run on Intel CPUs from 2006 to 2011. More or less during the same timeframe, the Universal Binary executable format allowed the same bundle to run natively on both Intel and PowerPC CPUs.
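As a concrete illustration of that last item, a Universal Binary is simply a “fat” container: a big-endian header counting architecture slices, followed by one record per slice pointing at a complete executable for that CPU, from which the loader picks the matching one at launch. Here is a minimal, hypothetical Python sketch of that header layout; the magic number and CPU type constants come from Apple’s Mach-O headers, while the offsets and sizes are made-up values for illustration:

```python
import struct

# Constants from Apple's <mach-o/fat.h> and <mach/machine.h> headers.
FAT_MAGIC = 0xCAFEBABE       # fat-binary magic number (big-endian)
CPU_TYPE_POWERPC = 18
CPU_TYPE_I386 = 7
CPU_NAMES = {CPU_TYPE_POWERPC: "ppc", CPU_TYPE_I386: "i386"}

def build_fat_header(slices):
    """Pack a minimal fat header from (cputype, offset, size) tuples."""
    blob = struct.pack(">II", FAT_MAGIC, len(slices))
    for cputype, offset, size in slices:
        # fat_arch record: cputype, cpusubtype, offset, size, align
        blob += struct.pack(">IIIII", cputype, 0, offset, size, 12)
    return blob

def list_architectures(blob):
    """Read the header back and name each slice, as `lipo -info` would."""
    magic, count = struct.unpack_from(">II", blob, 0)
    assert magic == FAT_MAGIC, "not a fat binary"
    names = []
    for i in range(count):
        cputype = struct.unpack_from(">IIIII", blob, 8 + i * 20)[0]
        names.append(CPU_NAMES.get(cputype, hex(cputype)))
    return names

# Made-up offsets/sizes for two imaginary slices of the same app.
blob = build_fat_header([(CPU_TYPE_POWERPC, 4096, 50000),
                         (CPU_TYPE_I386, 57344, 48000)])
print(list_architectures(blob))  # ['ppc', 'i386']
```

The same container trick carried the later PowerPC-to-Intel and 32-to-64-bit transitions, which is precisely why the format has aged so well.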
And as I write these lines, during one of the most complex times of our era, I hear the governor of New Jersey desperately looking for COBOL programmers, prompting a sevenfold increase in downloads of the GnuCOBOL project. And since it thankfully supports various COBOL dialects, such as COBOL85, X/Open, COBOL2002, COBOL2014, MicroFocus, IBM, MVS, ACUCOBOL-GT, RM/COBOL, and BS2000, developers can just start up their copy of Visual Studio Code on their laptop of choice, with an Intel or ARM CPU, and ask for access to a test mainframe. The COBOL cycle is starting up again.
Yes, Jean-Baptiste, you were right; plus ça change, plus c’est la même chose (“the more it changes, the more it stays the same”). Yes, Heraclitus, you were right too; πάντα χωρεῖ καὶ οὐδὲν μένει (“everything changes and nothing remains”). Everything stays the same, because the only constant is change. Yes, Peter, everything old is new again.
As a corollary, it is actually easy to predict that in the 2030s Apple (and, de facto, the rest of the industry) will once again migrate its product lineup to some other architecture, with a new fancy programming language featuring lambdas and monads and objects, most probably built with LLVM, and (somehow) understandably incompatible with its predecessor. There will be some kind of transitional technology for both users and developers. A new breed of YouTube stars and bloggers will rave about this (old) new thing and make you feel old and forgotten once again.
All of this will happen more or less at the same time COBOL once again becomes fashionable, as we approach the year 2038. Because cycles are what the computer industry is essentially made of.
Cover photo by the author.