Among the oldest companies still in operation we find a few Japanese corporations founded between 500 and 800 AD, a restaurant in Austria, a French winery, an Italian bell maker, and quite a few hotels scattered all over the Northern Hemisphere. Through a combination of opportunism, luck, corruption, monopoly, perseverance, talent, and ingenuity, these businesses have been able to survive the inevitable chaos of the markets where they operate; in some incredible cases, for longer than a whole millennium.
The businesses enumerated above are, simply put, anomalies. The average age of businesses in the S&P 500 index hovers around 20 years, with a moving average that has been slowly sliding downwards over the past half century. The lifespan of medium and small businesses is even shorter, with half of new businesses not even making it past their fifth birthday.
IBM is the anomaly of the software industry. Founded in 1911 under the name “Computing-Tabulating-Recording Company”, or “CTR”, it will celebrate the 100th anniversary of its “International Business Machines” moniker in 2024, even though some local branches of CTR, such as the Canadian one, called themselves IBM as early as 1918.
Technically speaking, CTR was itself a merger of four different companies, one of them founded at the end of the nineteenth century by Herman Hollerith, an ill-tempered inventor and son of German immigrants, who devised tabulating machines that stored data on punched cards. That explains why various corporate biographers of IBM begin their recollections in the Victorian era.
Thus, depending on whom you ask, IBM’s history spans between 100 and 140 years. Way less than the 1500 years of the Kongō Gumi construction company, but quite a feat nevertheless. Particularly in our software industry, eternally ripe for disruption.
During the past century, IBM has excelled at proving everyone wrong. Every generation has predicted its demise, and every generation has seen its rebirth. In 2019, IBM completed its purchase of Red Hat, purveyor of various cloud-related technologies, whose revenue is now feeding the watsonx AI initiatives, and whatever quantum computing could turn out to be. Neither Hollerith nor the Watsons, Sr. and Jr., could ever have predicted such an evolution.
(And no, Watson Sr. never said the phrase “there is a world market for maybe five computers”. It is just another urban myth, like Bill Gates’ alleged remark that “640 KB of RAM ought to be enough for everyone.” Think, and then check your sources.)
As stated previously, surviving corporations show a clever mix of opportunism, luck, corruption, monopoly, perseverance, talent, and ingenuity. IBM is no exception.
One: Opportunism. As Scott Galloway explained,
the most valuable companies in the world all have one thing in common: They build a thick layer of innovation on top of investments made by the premier VC in history, the U.S. government. Apple used Darpa’s GPS to build the iPhone. Facebook built an app on top of a government-funded hosting service called the Internet. And Netflix, like Amazon, leveraged the nation’s largest content distribution platform — the U.S. postal system — to send DVDs by mail.
Needless to say, IBM rightfully belongs to this select group. The U.S. government directly fueled IBM’s growth at various points in its history: during the 1890 census, during the Great Depression of the 1930s, and during the Apollo lunar missions of the 1960s. And when New Jersey Gov. Phil Murphy desperately called for COBOL developers in 2020, he indirectly gave a boost to IBM’s Master the Mainframe contest.
Two: Luck. IBM could have disappeared into oblivion during the Great Depression era. Instead, it was lucky enough to have rather incompetent competitors, and to have warehouses filled with unused hardware the day FDR launched the New Deal, a government program that desperately needed lots of tabulating equipment to provide Social Security benefits to millions of unemployed U.S. citizens.
Three: Corruption. IBM could have lost a huge chunk of its European foothold had it shown even the smallest bit of aversion to the antisemitic policies of Nazi Germany. The same rule book worked wonders in dictatorship-laden South America and Africa after the Second World War. Its very close ties with the U.S. government, in particular with its defense and intelligence agencies, were also a huge bonus. Those headquarters on the East Coast, closer to Washington D.C. than to Silicon Valley, definitely paid off.
But there is more; did you know that
Allegations emerged in 1995 that IBM-Argentina had paid US$37m in kickbacks and bribes in 1993 to win a US$250m contract with the government-run Banco de la Nación. In 1998, arrest warrants were issued for four former IBM executives and Judge Angelo Bagnasco charged ten people with crimes, including a former president of Banco de la Nación and IBM-Argentina’s former CEO and former COO. The New York Times noted in 1996 that six months after the initial revelations, the IBM scandal was “still front page news in Argentina, as new disclosures emerge almost weekly, tainting the computer giant’s reputation for honesty here.” In 2000, IBM was ordered by the U.S. Securities and Exchange Commission to pay a civil penalty of US$300,000. IBM was concurrently involved in no bid contracts awarded by Social Security Director Arnaldo Cisilino, which later resulted in the latter’s indictment for fraud in 1998.
Four: Monopoly. IBM, known for its ruthless sales force and its anticompetitive tactics, could have been forced to split into various companies “à la Bell” after the 1969–82 antitrust trial “U.S. vs. IBM” (talk about a love-hate relationship). However, the fast pace of the technology industry was more effective than IBM’s lawyers. By the time the trial ended, punched cards and mainframes (both at the core of the litigation) had been rendered obsolete, respectively, by diskettes and hard drives (both IBM inventions, by the way), and by microcomputers and PCs (other products in which IBM was famously involved).
Five: Perseverance. IBM could have gone bankrupt in 1993, but it quickly remembered how Reaganomics worked: laying off a huge portion of its workforce, cutting costs across its portfolio, and focusing its efforts on consulting and the highly lucrative mainframe business. “Solutions for a small planet” and “e-business”, remember?
Six: Talent. IBM invented so many things that enumerating them in this article would be a waste of everybody’s time. The reader would be better served by watching IBM’s own 100-year celebration video, which, even though creepy at times, provides in just 12 minutes a useful peek at the crazy effectiveness of getting extremely smart people to work together. Heck, IBM has more Nobel Prize winners (six) than Argentina (five), and that is without counting six Turing Award winners (including the first female winner, Frances Allen, in 2006), plus an insane number of patents issued every year. IBM recently announced a chip designed specifically for AI workloads, and its quantum computers keep on breaking records.
Seven: Ingenuity. IBM could have owned the many markets it created, including databases, personal computers, hardware, printers, and more, but it naively let other companies like Microsoft, Dell, Compaq, Oracle, Hewlett-Packard, and even Apple eat its lunch. The side effect of this naivety is that, well, quite a bit of the software industry as we know it was born in IBM’s research centers in one way or another (with another big chunk coming, most famously, from Xerox).
Sigh. Whether I like it or not, writing about IBM feels somewhat personal to me. I am writing these lines on a keyboard directly inspired by the Model M, connected to a 2018 ThinkPad X1 Carbon. My editor uses the IBM Plex font. A quick search in this magazine brings up 43 issues where the word “IBM” is mentioned, including last month’s issue about database technology.
Even more personal: my mother and her brother both worked at IBM between the 1960s and 1970s. I have childhood memories of IBM brochures and magazines featuring computers the size of a room. And I remember my mother referring to the reign of legendary mogul Luis A. Lamassonne, 15 years before the inauguration of the Torre IBM in the Catalinas neighborhood (its offices were still on Diagonal Norte). She recalled fondly the glorious benefits package for pregnant employees, which covered all expenses for my birth at the Clínica del Sol in Buenos Aires.
Sadly, my birth also meant the end of my mother’s career at IBM. You see, to say that child-bearing IBMers were not appreciated by their male bosses is the understatement of the century. This was 1970s IBM Argentina, after all: computer hardware, suits and ties, and lots of cash. The testosterone could be smelled all the way from Diagonal Norte to the River Plate stadium. Such kind treatment begat a reinterpretation of the IBM acronym shared among female staff at the time: “Inmensa Bola de Mierda”. I will leave the translation of this epithet as an exercise for the reader.
Of course, not everything was gloomy during my mother’s time at IBM. As an attractive young woman in her mid-twenties, she was often called upon to demonstrate the power of the recently released System/360 mainframe computers at various local trade shows. As Mar Hicks explained perfectly well in her book “Programmed Inequality”:
Another selling technique evolved simultaneously over the course of the 1960s. Some companies, including ICT (later International Computers Limited, or ICL), employed all-women computer demonstration teams who worked on-site at the company operating the demonstration machines for potential customers and also went to trade shows. These teams ensured that business consumers saw computers as easy to staff and not overly complex to run. The young women presented a vision of effortless perfection and conveyed none of the gravitas male staff might have. For similar reasons women operators and programmers at IBM’s world headquarters on Madison Avenue in New York City were told to work on the computers in the window, in view of the sidewalk, to make the machines look easy to use.
Yup, that was it. But instead of New York City, Buenos Aires. Despite such demeaning treatment, my mother stayed forever faithful to one particular IBM product: the Selectric, by far her preferred typewriter, which she got to use for decades as part of her work as an executive assistant. Yes, the same typewriter that the Russians bugged during the Cold War, and that could be used as a printer for your microcomputer in 1977.
But I digress. Most readers of this article are probably waiting for more technologically inclined content; we will get to that in a minute. The fact that most biographers of IBM were white, male, ex-IBM employees themselves leads to a wildly distorted view of such a huge organization. IBM is, just like any other corporation of its size, a complex beast to talk about. To give you an idea, it reached almost half a million employees at several points in its history, with sales figures consistently oscillating between 11 and 12 digits for decades. The technical perspective is, by far, the simplest and easiest to digest for sensible souls.
So, let us talk about technology, albeit only for a little while. The concept of the modern computer is associated with the IBM brand through antonomasia.
IBM is at the origin of quite a few good ideas. Semiconductor memory chips. The floppy disk. The hard disk. The first laptop shipped with an SSD. Airline reservation systems. The Fast Fourier Transform algorithm. Computer algebra systems. The Data Encryption Standard algorithm. The supercomputer. The 8-bit byte. The RISC CPU. (Ironically enough, the fact that your personal computer most probably understands the x86 instruction set can also be traced back to IBM’s Boca Raton laboratory in the 1980s.) Your groceries have UPC barcodes all over their packaging… and yes, that is an IBM invention, too. Even the Mandelbrot set was first plotted at IBM.
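That last item deserves a closer look: the Mandelbrot set, famously explored by Benoit Mandelbrot while at IBM Research, boils down to iterating z → z² + c and checking whether the orbit stays bounded. Here is a minimal sketch in Python; the function name and the crude ASCII rendering are my own, purely for illustration:

```python
# Iterate z -> z^2 + c and report when the orbit escapes the disk |z| <= 2.
def escapes_after(c, max_iter=100):
    """Return the iteration at which |z| exceeds 2, or None if it never does
    within the iteration budget (i.e., c is likely inside the Mandelbrot set)."""
    z = 0j
    for i in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return i
    return None

# A crude ASCII rendering: one character per sampled point of the plane.
for im in range(-10, 11):
    row = ""
    for re in range(-20, 11):
        row += "*" if escapes_after(complex(re / 10, im / 10)) is None else " "
    print(row)
```

Points like c = 0 or c = −1 cycle forever and belong to the set; points like c = 2 escape after a single iteration. A hundred iterations is a rough budget, of course, but enough to make the famous cardioid-and-bulbs silhouette appear in the terminal.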
Speaking of programming languages: Speedcoding, FORTRAN, FORMAC, COMTRAN, GOTRAN, CLIST, Rexx, SQL, APL, PL/I, PL/S, CL, GPSS, RPG, JCL, HAScript; they were all created by IBM engineers (the first two by none other than John Backus himself). Another of those brilliant engineers, Nathaniel Rochester, wrote the first assembler, for the IBM 701. The Forth programming language received its name because the IBM 1130 operating system only allowed file names of five characters (spoiler alert: its creator wanted to name it “FOURTH”). What about COBOL? Well, the language was designed by a committee that included IBM as well as some competitors (with long-forgotten names such as Burroughs, Honeywell, Sperry Rand, RCA, and Sylvania). Jean Sammet, author of the first printed history of programming languages, was an IBM employee at the time of its publication.
How many of those programming languages are still relevant? Well, a non-negligible number of programmers write SQL queries every day, while a (sadly) shrinking number still write FORTRAN, or should I say Fortran, no need to shout. Plenty of COBOL developers make the world go round, launching programs with JCL every day, though hardly anyone notices. Amiga and OS/2 retrocomputing fans write Rexx scripts every so often. During the 1980s, IBM BASIC was a big player in the microcomputer BASIC market. In the 1990s, IBM had a love affair with Java, and it even produced the first 16-bit implementation of the JDK for Windows 3.1, which was the one I used to learn the language. Closer to modern times, Swift fans probably remember Kitura (circa 2016), a short stint of IBM and Apple trying to work together.
(By the way, this whole “IBM and Apple” relationship thing has various highlights. The “Welcome, IBM. Seriously” ad in the Wall Street Journal in 1981. The middle finger of Jobs. The PowerPC chip. Taligent. “Think different”. Kitura. More recently, Watson for CoreML…)
On the downside, IBM also has a rich history of missed opportunities. IBM could have been the leader of the PC market, but it could not keep up with Dell and Compaq. Lexmark printers might have been more popular, but Hewlett-Packard ate their lunch. After Deep Blue beat Kasparov and Watson won at Jeopardy!, IBM could have been the major AI company, but OpenAI took the throne last year. The Simon was arguably the first touchscreen smartphone, but it arrived much too early. A CPU with a RISC architecture could have been mainstream, but the Boca Raton engineers chose an Intel chip instead, and Arm went public last month. IBM OS/2 could have been the operating system of choice for the 1990s, but Microsoft Windows 95 started up first. IBM could have been at the helm of the minicomputer market, but Unix was created on a DEC PDP-11 instead. IBM could have been the inventor of “Cloud Native” technology, but Amazon got there first, and now watsonx is available on AWS.
(Did you notice how many of these “could have beens” happened between 1980 and 2000? No, it is not a coincidence, and Louis V. Gerstner, Jr. wrote a whole book about it: “Who Says Elephants Can’t Dance?”)
Apart from computer engineers, IBM was also one of the biggest employers of lawyers in the U.S. You see, its monopoly position in the computer market triggered quite a lot of scrutiny over the years. We have already mentioned one of the longest-running lawsuits in U.S. history, running from 1969 to 1982. Two decades after that, hardcore Linux users feverishly followed another heated lawsuit, this time launched by SCO. There are many more; just search for the phrase “IBM lawsuit” on your preferred search engine if you are interested.
In the same vein, IBM could have become extinct every so often, had it not been for a series of company-wide bets that put it back in the spotlight every time. The IBM 701, the IBM 1401, and the IBM System/360 come to mind. You see, the inertia created by its size, and the political power held internally by old-but-profitable business units, often made taking such bets almost impossible, which in turn nearly meant bankruptcy during the 1990s, followed by another crisis during the 2010s. The same story repeated itself various times in IBM’s history: mechanical tabulators versus electric calculators; punched cards versus tape drives; the 1401 versus the System/360; mainframes versus the PC; and so on and so forth. It is almost as if IBM’s worst enemy were, indeed, IBM itself. There is a lesson here for business managers, and a few Harvard Business Review articles left to write.
IBM also has had quite an impact on the arts.
Cinema. A scene in Stanley Kubrick’s classic movie “2001: A Space Odyssey” features an IBM logo on the wrist of Dr. Dave Bowman floating in space, discreetly visible at the one-hour-and-seventeen-minute mark. Legend has it that the “HAL” acronym was inspired by IBM’s, each letter shifted back by one, but both Kubrick and Arthur C. Clarke denied it; not least because IBM’s lawyers did not want their employer to be featured as “the bad guy” in the movie. We will never know for sure. In an earlier scene, a spaceship docks with a space station in orbit that features a prominent Pan Am logo and a Hilton hotel inside. Corporations, corporations everywhere, but not all of them have survived the test of time. Companies come and go, but IBM is somehow still there.
Design. The IBM logo, another creation of legendary American designer Paul Rand (of UPS, Enron, Westinghouse, and NeXT logos fame, among many others), was introduced in 1956 and got its striped look in 1972. Not many companies have kept their corporate identity unchanged for more than half a century. Page 151 of “The Computer” by Jens Müller and Julius Wiedemann shows the full “design system” created by Rand for IBM, applied to products, brochures, packaging, and store design; in the opinion of this author, it looks fantastic.
Music. IBM has also inspired a beautiful album mentioned in a previous article of this magazine, and worth reposting:
As this article is published, IBM has a CEO of Indian origin, just like the vast majority of its workforce. But make no mistake, the center of gravity of its power and influence is still happily located on the East Coast of the U.S. Plus ça change, plus c’est la même chose.
IBM still sells millions (billions?) of dollars worth of insanely powerful mainframe computers every year, and it regularly makes the headlines for three main reasons: lawsuits, regularly scheduled layoffs, and the growing hype around quantum computing.
Nobody can dismiss the impact of IBM on our industry; you will still read its name in the pages of this magazine in future articles, even if it (finally!) runs out of luck. I cannot help but refer to it with an equal mix of wonder and distrust. Now it is your turn: think about IBM for a moment, keeping in mind the destructive power of massive corporations in our modern world, but always remembering their uncanny capacity (in some cases, at least) for change and adaptation.