Last month’s news that IBM would do a Hewlett-Packard and divide into two—an IT consultancy and a buzzword compliance unit—marks the end of “business as usual” for yet another of the great workstation companies.
A quick aside on computing history. You can imagine personal computing being driven by two distinct schools of thought. The “top down” school, represented by research-led organisations including Xerox PARC, Bell Labs, academia and the military, asked “what would the world be like if everyone had their own minicomputer?” They took large, time-sharing systems like UNIX and installed them first under, then on, employees’ desks for their own personal use.
The “bottom up” school was made up of hobbyists who asked “can we make an interesting computer out of inexpensive components?” Thus companies like Apple and MITS in the US, Acorn and Sinclair Radionics in the UK, and others took chips that were usually used as peripheral controllers in “real” computers and built interactive programming systems around them. The microcomputer revolution came from the bottom-up school, which made home computing affordable. The workstation revolution came from the top-down school, which made powerful on-demand computing feasible.
The two schools came into very close proximity in the 1980s, when the Motorola 68000 family of CPUs (along with the 68881/68882 FPUs and the 68851 MMU) was the processor of choice in everything from entry-level PCs like the Atari 520ST, through games consoles like the Sega Mega Drive (Genesis in the US), to the most expensive UNIX workstations from NeXT Computer, Sun Microsystems, and Apollo Computer.
But then the workstation makers invested heavily in their own CPU architectures based on RISC design principles, and the two schools diverged again. The workstation market became highly differentiated: POWER from IBM in its RS/6000 line (later PowerPC), Alpha from Digital Equipment Corp, MIPS from, well, MIPS, SPARC from Sun, PA-RISC from HP. The software on these workstations, while superficially very similar, was also differentiated and surprisingly incompatible. Take a program from HP-UX and you’ll have difficulty running it on NeXTSTEP, unless the authors shared the source code and used the nascent GNU autotools to support portable building. As Yoda said: begun, the UNIX wars have.
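For context, the autotools approach meant shipping source code together with a build description that probed each vendor’s UNIX for its quirks. A minimal sketch, in modern autoconf/automake syntax (the project name and file names here are hypothetical, not from any real project), looks like this:

```
# configure.ac — probes the build machine and generates a Makefile
AC_INIT([hello], [1.0])          # hypothetical project name and version
AM_INIT_AUTOMAKE([foreign])
AC_PROG_CC                       # find whichever C compiler this UNIX provides
AC_CONFIG_FILES([Makefile])
AC_OUTPUT

# Makefile.am — portable build rules, expanded by automake
bin_PROGRAMS = hello
hello_SOURCES = hello.c
```

A user on any of those systems would then run the familiar `./configure && make`, with the generated configure script papering over the differences between, say, HP-UX and SunOS.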
Of course we know that the (desktop) computing world today is mostly Intel and that workstations are mostly fancy PCs, rather than bespoke designs by vertically-integrated companies, Apple being the two trillion dollar outlier. We got here because the commodity parts got good enough that workstation-grade hardware no longer offered any evident advantage. A high-end PC could easily run a workstation OS like System V UNIX (Solaris was an early example), BSD (386BSD, which later became FreeBSD, or NeXTSTEP) or Windows NT.
Along the way, the workstation companies consolidated (Apollo and eventually DEC were absorbed into HP; MIPS into SGI) or disappeared altogether (Sun became Oracle Hardware; SGI went bankrupt and sold its assets to a company that took the sgi name; Symbolics did something similar, and was incidentally the first company with a .com domain). IBM long ago stopped even making its own-brand PCs, and the news of its split means that there are now very few workstation companies trading in the same form they had “back in the day”. The only ones I can think of that have not had major changes to their corporate structures are Xerox and Sony, whose management may not even have known that they sold workstations.
What got lost alongside the death of the workstation is the business model where you sell expensive computers as part of an integrated solution into a particular vertical market, and where that expensive solution costs a lot less than cobbling something together out of cheap PCs. Why? I think people have lower expectations and a higher pain threshold when using computers now; they expect a certain amount of friction based on their own experience and carry that expectation into realms where it doesn’t belong. As I described way back in issue 2, computing is a lemon market.
Organisations would go to the workstation vendors because they solved particular problems very well. If you’re in AI, you need Symbolics. Computer graphics, SGI. Telecoms, that’d be Sun. If you want to write software in Ada for the military-industrial complex, you’ll be buying a Rational workstation. Yes, the first IDE was a completely integrated package of hardware and software. And, of course, Apple for Desktop Publishing, the Mac being a workstation of sorts itself. People would buy computers because applications like AutoCAD, Quark or Mathematica ran well on them. They wouldn’t buy the computer then browse the App Store to see whether it could do anything useful.
And the strange thing is that catering to those vertical markets with integrated solutions is easier than ever now. The wide availability of free software means that the basic job of “being a desktop computer” is taken care of at zero cost, so businesses can focus on contributing valuable bespoke behaviour. And hardware costs are lower than ever: the availability of high-capability SoCs and single-board computers like the Raspberry Pi and Rock64 should make it a no-brainer to sell the computers as accessories for the applications, not the other way around.
In high-tech domains, an engineer could readily have a toolchest of suitable computers in the same way that a mechanic has different tools for their tasks. This one has an FPGA connected by both PCIe and JTAG to allow for quick hardware prototyping. This one is connected to a high-throughput GPU for visualisations; that one to a high-capacity GPU for scientific simulations.
The general purpose hardware vendors want us to believe that an okay-at-anything computer is the best for everything: you don’t need a truck, so here’s a car. But when you’re hauling a ton of goods, you’ll find it cheaper and more satisfying to shell out more for a truck. Okay-at-anything is good for nothing.
Cover photo by Serhii Butenko on Unsplash.