The Hype Cycle Of OOP

Even though marketing buzzwords might have an effect akin to Kryptonite on our readers, we are going to use Gartner’s famous five-phase Hype Cycle to take a closer look at the practice of Object-Oriented Programming (OOP) and its various ups and downs over the past 50 years. Remembering that one of the core tenets of this magazine is to make the impossible dialogue possible, the framework provided by Gartner fits this task perfectly.

Innovation Trigger

With a silent and barely noticed birth at the end of the 1960s, Simula 67 introduced OOP concepts to the world: classes, inheritance, polymorphism, garbage collection, and the whole package (no pun intended). We say “barely” because Jean Sammet, at least, noticed it, reporting on Simula in her masterpiece.

The next programming language to apply OOP concepts (that we know of) was, of course, Smalltalk, which we covered extensively in a previous edition of this magazine. In 2009, James Iry wrote about the birth of Smalltalk and OOP in one of the most popular posts ever published on Blogspot, “A Brief, Incomplete, and Mostly Wrong History of Programming Languages”:

1980 – Alan Kay creates Smalltalk and invents the term “object oriented.” When asked what that means he replies, “Smalltalk programs are just objects.” When asked what objects are made of he replies, “objects.” When asked again he says “look, it’s all objects all the way down. Until you reach turtles.”

Simula 67 (a superset of ALGOL 60) and Smalltalk famously appear in the “inspiration list” of almost all OOP programming languages created during the following 50 years, including C++, Java, C#, and many others.

OOP concepts took quite some time to take hold: at least 20 years of patient evangelization and, to a certain extent, indoctrination. The famous August 1981 issue of Byte Magazine featuring a colorful balloon on its cover (a symbol that would eventually become the official Smalltalk logo) remained an isolated landmark. The market for hobbyist (and, to a large degree, enterprise) computing was not ready for OOP yet. We had to wait precisely five years, until the August 1986 issue, to find another significant reference to the world of OOP in the pages of Byte.

In the case of Dr. Dobb’s Journal, the first mention of OOP appeared in a review of NEON (a cross between Smalltalk and FORTH, the latter a language dear to Dr. Dobb’s Journal in those days), an article on page 96 of the October 1985 issue (volume 10, page 785) explaining how to send messages to an object “Bic” of type “Pen” to move around the screen. An article dedicated to OOP followed in March 1987 (volume 12, page 241), featured, interestingly enough, in the “Artificial Intelligence” column. The subject of OOP became much more popular afterward; suffice it to mention March 1988’s article “Object-Oriented Programming and Databases” (volume 13, page 119) or November 1990’s “Roll your own Object-Oriented language” (about Object Prolog, in volume 15, page 991).

Finally, the introduction to the collection of articles published by Dr. Dobb’s Journal in 1989 (volume 14, page 13) begins with a phrase that says it all:

If there’s any lingering impression of the past year — and that is reflected in this collection of DDJ articles — it is that in 1989 object-oriented programming stormed to the forefront in the world of programming.

Peak Of Inflated Expectations

OOP grew dramatically during the decade between 1985 and 1995. The first OOPSLA Conference was held in 1986, chaired by none other than Daniel Ingalls of Smalltalk fame. Bjarne Stroustrup released the first versions of C++. Brad Cox melted Smalltalk on top of C and created Objective-C. Bertrand Meyer released both Eiffel and its associated major bestseller book. Borland added OOP support to Turbo Pascal 5.5 (released May 1989). Famous groups of experts like “The Three Amigos,” “The Gang of Four,” and the burgeoning Agile movement started coalescing around OOP, yielding concepts and acronyms that would change the shape of our work forever, such as Design Patterns, Test-Driven Development, Scrum, Extreme Programming, standup meetings, and finally the Agile Manifesto.

Agile and OOP are closely related to one another in terms of community and timing, as explained by James Coplien (another prominent figure of both movements) in the Vidéothèque article this month.

By 1992 OOP was the next big thing™®©, as NeXT’s marketing material conveniently reminded us, with a Nike-level slogan stating that “The object is the advantage.”

OOP reached the pinnacle of its fame during the decade starting in 1995. The great earthquake of that era was the release of Java, still one of the world’s most popular programming languages 26 years later. Those were the times of the standardization of C++ as ISO/IEC 14882:1998; the rise of the Design Patterns movement; the explosion of TDD and xUnit frameworks; the introduction of .NET by Bill Gates; the release of the “Head First” series of books by O’Reilly; and the spread of the ideas of inversion of control and dependency injection.

And to top it off, all of this happened at the same time as the rise of the World Wide Web, propelling OOP technologies into the spotlight as suitable ways to build and maintain web applications, with names such as WebObjects, Django, .NET, Ruby on Rails, and JavaBeans making the headlines of blogs and magazines.

This author argues that the OOP frenzy lasted until January 2007.

Trough Of Disillusionment

During the late 2000s and early 2010s, OOP suffered various targeted attacks, lost credibility and popularity, and was almost unanimously regarded as an antipattern. Why did this happen?

At the same time as Steve Jobs’s introduction of the iPhone in January 2007, Google started publishing a series of blog posts called “Testing on the Toilet,” or “TotT” for short. On May 15th, 2008, the series featured a famous issue: “TotT: Using Dependancy Injection to Avoid Singletons.” The writing was literally on the wall of toilets worldwide: Singletons are bad™®©. Sadly, the much more exciting idea of dependency injection contained in the article got lost in the minds of most readers.

The reference to the iPhone in the previous paragraph is not incidental: Cocoa Touch, the main application framework used to create iPhone apps, featured singletons all over the place (NSNotificationCenter, NSFileManager, UIApplication, UIAccelerometer, and the list continues)… testability be damned. But, but, but Objective-C was dynamic, fans argued, so a savvy programmer could easily OCMock their way into testable code. Yeah, maybe not. The truth is that Google’s stance prevailed, fueled by an engineering culture many developers dreamt of joining.
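
Here is a minimal sketch, in Swift and with hypothetical names (Clock, GreetingService, purely illustrative, not Google’s own example), of the idea the TotT article championed: the dependency arrives through the initializer instead of being fetched from a hidden global singleton, so that a test can substitute a predictable fake.

    import Foundation

    protocol Clock {
        func now() -> Date
    }

    // Production implementation: wraps the real system clock.
    struct SystemClock: Clock {
        func now() -> Date { Date() }
    }

    // Test double: always returns the same, predictable instant.
    struct FixedClock: Clock {
        let instant: Date
        func now() -> Date { instant }
    }

    final class GreetingService {
        private let clock: Clock

        // The dependency is injected here, instead of the method body
        // quietly reaching for some shared singleton.
        init(clock: Clock) {
            self.clock = clock
        }

        func greeting() -> String {
            let hour = Calendar.current.component(.hour, from: clock.now())
            return hour < 12 ? "Good morning" : "Good afternoon"
        }
    }

    // In production: GreetingService(clock: SystemClock())
    // In a test:     GreetingService(clock: FixedClock(instant: someFixedDate))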

And if Singletons were terrible, who knows what evils the other Design Patterns could be hiding?

Other authors blogged their concerns about OOP and Java (sometimes, sadly, mostly about Java): Paul Graham, Peter Norvig, Steve Yegge, and Joel Spolsky. I wrote about those bad times in a previous article:

Fifteen years ago, Bruce Tate published “Beyond Java.” This title was a reaction against what seemed like a chokehold of Java and the JVM on the server side of the software development industry. Proposing alternatives such as Python, Smalltalk, or Ruby, it remains the book that captured the zeitgeist of its era: the mid-2000s were not a good time for Java.

Conflating Java with OOP made things worse. Java was (and to a large degree, still is) the punching bag, the programming language we like to hate, no matter which other language we use every day. Moving away from Java became moving away from OOP.

During those days, we saw a revival of functional programming concepts, fueled by the feats and promises of “Big Data” and a newfound interest in machine learning. Many (if not all) popular OOP languages adopted filter(), map(), and collect() functions, conveniently encapsulating cumbersome for and while loops. C# went to the extreme of offering an extension called LINQ, which wrapped those functional principles in a SQL-like syntax. Meanwhile, Java found itself sidelined by Scala on the server and later by Kotlin on the mobile client.
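
A small illustration in Swift, where the last step of the chain is spelled reduce rather than collect (the names vary from language to language): the same computation, first with an explicit loop and mutable state, then with the functional-style chain that mainstream OOP languages adopted.

    let prices = [12.0, 7.5, 30.0, 3.2, 18.9]

    // The cumbersome way: an explicit loop and mutable state.
    var total = 0.0
    for price in prices {
        if price > 10.0 {
            total += price * 1.2   // add 20% tax
        }
    }

    // The functional-style chain adopted by most OOP languages.
    let sameTotal = prices
        .filter { $0 > 10.0 }
        .map { $0 * 1.2 }
        .reduce(0.0, +)

    // Both computations yield the same value.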

Programmers saw a new type of object emerge: the closure, a callable entity that can be passed between functions, closing over the values defined in the stack frame where it was created. In Objective-C, blocks are literally objects, and the only ones that can be allocated on the stack.
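
A short Swift sketch, with illustrative names, of what closing over the enclosing stack frame means in practice:

    // The returned closure captures threshold, a value defined in the
    // stack frame of makeValidator, and keeps it alive after the
    // function has returned.
    func makeValidator(threshold: Int) -> (Int) -> Bool {
        return { candidate in candidate >= threshold }
    }

    let isAdult = makeValidator(threshold: 18)
    print(isAdult(21))   // true
    print(isAdult(12))   // false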

On the other side of the fence, and more or less at the same time, functional languages like F# mixed in OOP concepts and interacted naturally with a sizeable collection of .NET classes originally intended to be consumed from C#.

The Model-View-Controller architecture, whose acronym was jokingly reinterpreted as Massive View Controller, gave way to similarly named alternatives: Django’s MVT, Swing’s MVP, Microsoft’s MVVM, the iOS community’s VIPER, and more.

Design patterns also suffered during those days. Singleton was the first victim; then came Observer, which is a pity, because it was a fine idea: a helpful mechanism to decouple the Model from the rest of the MVC triad. In Cocoa Touch, the Observer pattern became (in)famous through NSNotification and Key-Value Observing.

But since the release of Swift in 2014, the Observer pattern died and was replaced by Reactive programming, a more generic idea built around data streams that found fertile ground in web programming, particularly after Facebook and Google released the first versions of React (2013) and Angular (2016), respectively. Professor Salvaneschi of the University of St. Gallen (Switzerland) explained the road from the Observer design pattern to Reactive programming in a paper published in 2014.
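
To give a flavor of that shift, here is a deliberately tiny, hand-rolled sketch in Swift (not Combine, RxSwift, or NSNotificationCenter): the Observer pattern’s “register your interest” survives as subscribe, but values now travel through a stream that can be transformed before they reach you.

    // A toy value stream: part Observer pattern, part Reactive programming.
    final class Stream<Value> {
        private var subscribers: [(Value) -> Void] = []

        // The Observer-pattern core: register interest in future values.
        func subscribe(_ handler: @escaping (Value) -> Void) {
            subscribers.append(handler)
        }

        // The Reactive twist: derive a new stream by transforming values.
        func map<T>(_ transform: @escaping (Value) -> T) -> Stream<T> {
            let derived = Stream<T>()
            subscribe { value in derived.emit(transform(value)) }
            return derived
        }

        func emit(_ value: Value) {
            subscribers.forEach { $0(value) }
        }
    }

    let taps = Stream<Int>()
    taps.map { "Tap count: \($0)" }
        .subscribe { label in print(label) }
    taps.emit(1)   // prints "Tap count: 1"
    taps.emit(2)   // prints "Tap count: 2"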

Slope Of Enlightenment And Plateau Of Productivity?

We have reached the plateau of productivity, which is why OOP is no longer hype: it is boring, it is tested, and it works. But to reach this stage, OOP has had to morph, losing some characteristics that were once inextricably associated with it.

In particular, OOP lost inheritance along the way; in many cases, it lost classes altogether.

Look at JavaScript and its prototype-based inheritance, or at the .NET DLR (Dynamic Language Runtime) and its DynamicObject class. In a move that would make Coplien happy, OOP is once again about objects and no longer about classes.

On the other hand, we lost class inheritance altogether: multiple implementation inheritance (à la C++ and Eiffel) had already died when Java (with its multiple inheritance of interfaces, borrowed from Objective-C’s protocols) became popular. The thing is, in the last decade we also lost single inheritance: neither Go nor Rust has it, and we are talking here about two of the most popular programming languages of the 2010s.

To be honest, multiple inheritance came back in the form of traits, allowing developers to compose objects with various personalities. And single inheritance, well, it can now be found in the most unlikely of places: Docker container images can happily “inherit” FROM one another, overriding parameters and configuration settings layer by layer. Good luck finding out which base image overrode which setting.
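
Swift’s protocol extensions are one mainstream incarnation of that trait-style composition. The sketch below, with made-up protocol names, mixes behavior into a struct from two protocols, with no class inheritance in sight.

    // Trait-style composition: behavior comes from protocols and their
    // extensions, not from a base class.
    protocol HasID {
        var id: String { get }
    }

    protocol Describable {
        var summary: String { get }
    }

    // The default implementation lives in a constrained protocol
    // extension, playing the role of a trait.
    extension Describable where Self: HasID {
        var summary: String { "<entity \(id)>" }
    }

    struct Invoice: HasID, Describable {
        let id: String
    }

    let invoice = Invoice(id: "2023-0042")
    print(invoice.summary)   // "<entity 2023-0042>"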

TDD pundits can always inject any dependencies they need; there is a staggering number of dependency injection frameworks, making everyone happy. Joel Spolsky loves duct tape programmers, and he is right.

The market cherry-picked the best parts of OOP: encapsulation and messaging. The Cloud Native world co-opted these two and remarketed them as microservices, exchanging little JSON messages over HTTP or through a message broker like Kafka or RabbitMQ. Kubernetes is the new object-oriented application runtime, and you can even have Aspect-Oriented Programming thanks to eBPF.

The tradeoff? In these new OOP systems, latency is worse than with Objective-C’s objc_msgSend() or COM+ IDispatch, and far worse than with C++ vtables; but now we can safely apply Conway’s law and divide our teams conveniently, so that they communicate only with JSON messages on the cluster and with Nerf guns on the open office floor plan.

Cover photo by Steve Johnson on Unsplash.

