Issue #10: Programming Paradigms

In Which Thought Is Implied

Long-time payers of attention to my words will undoubtedly expect my article in the Programming Paradigms issue to be another rant about doing OOP correctly. That was the subject of my book, OOP the Easy Way. I wrote about it in my M.Sc. dissertation. In 2015, I gave a talk on the topic saying that the reason OOP had failed was that it hadn’t been tried.

So was object-oriented programming the silver bullet? Is everybody an order of magnitude more productive because software engineering has saved us? Kind of hard to tell, because it’s difficult to find an object-oriented program, even in 1995.

No, that’s not a typo: I set the talk in 1995. So, time for another take-down of OOP as described, more complaints about how we’ve missed the point? Not so fast. I want to look at two sides of the same coin: the importance of a paradigm.

Everything Is Awesome!

Software engineers probably have as many problems now as we did back in 1995. Actually, one more problem: the bottom has dropped out of the software market (a predictable side effect of a lemon market). On the NeXT, Lighthouse sold Diagram! for $499 (in 1995 dollars, so maybe $850 today). Today, OmniGraffle for iPad is $60, and that’s definitely at the more expensive end of the market. A factor-of-ten reduction in price is a fair trade for a factor-of-1,000 increase in addressable market. But much software today is free, and you can’t make up for that with volume.

Money aside, writing software now is about as hard as writing software was in 1995. Whatever is going on, that’s actually pretty impressive. It’s impressive because our expectations of our software have increased astronomically in that time. The average computer user in 1995 had a beige box with a floppy drive, maybe a hard drive and, exceptionally, a CD-ROM. It sat in their office, or maybe their home office, and was where computering got done. Many of these computers were not online, and those that were used dedicated online tools like Gopher, Netscape, or email.

We expect modern software systems to remain up to date over a near-ubiquitous network link. They sync data between multiple devices and a network store in real time. Apps use multiple sensors and integrate data from many sources to improve the contextual relevance of their interfaces. And, in addition to their user-facing features, many apps perform surveillance, display advertising, engage in live testing of their users, and more.

The fact that we have increased our reach so much without drowning under the additional complexity truly is astounding.

Everything Is Awful!

Something must enable that growth in our aspirations and achievements. We find no single clear-cut cause: none of object-oriented programming, functional programming, reactive programming, or any other paradigm has clearly “won”. What I mean by that is that programmers are not universally applying some programming paradigm in an identifiable way. You can see things called objects and classes all over the place, without seeing a coherent philosophy behind how to make objects and classes or how to put them together. You can see things called functions (or worse, “lambdas”), and you’ll see different things called functions in different places. They work in different ways.

Now a “paradigm” should be a conceptual framework, or pattern. When Object-Oriented Programming was still new, lots of programmers and architects expressed interest in a pattern language. The patterns community modeled itself after the community of practice in architecture that formed around Christopher Alexander and his Pattern Language. In modern times, many programmers know of pattern languages through one specific set of implementation patterns, published under the title Design Patterns. From either the book of that title or the newer Head First Design Patterns, developers learn about Singleton and Abstract Factory, and resolve never to use either.

Which is a shame. If we want to share the idea of a paradigm, we need to describe it consistently. The community comes together around shared practices by naming those practices and their motivations. By creating a pattern language. The language encapsulates a collection of related thoughts and gives them a new name: a new, higher-level thought can be transmitted.

Everything Is Mediocre

To some extent, the software industry has indeed embraced patterns. We can identify some patterns in software business models, for example. “Ad-supported” means a lot more than supported by ads. “Software as a Service” usually implies Venture Capital backed software with a Free Tier and a Subscription Model yielding Recurring Revenue, storing data in Cloud Hostage. All of the capitalised phrases are themselves titles of patterns in the software business pattern language.

Maybe we have patterns and paradigms in all of the aspects of software construction except software construction. It’s interesting that while software-business phrases like “loot box” seem to have a stable meaning, software-engineering phrases do not. OOP, Agile, functional programming, free software, MVC, technical debt, refactoring, test-driven development. All of these ideas seem to go through a transition between description and practice that obfuscates, dilutes, or outright changes their meaning.

Everything Is Normally Distributed

Evidently we don’t all need to derive the same meaning from these phrases for the software industry to expand its reach and ambition. My question is whether we’ve scaled up just through bloody-minded stubbornness. Are we due some form of reckoning, where the bottom falls away from the market far enough that people notice just how ridiculous our salaries are and “right-size” their expenditure on software? Will such a crisis make us coalesce our ideas and distill our practices? Or are we collectively spending the right amount?

The answer, of course, is that this question makes no sense. There is no collective civilisation choosing to open its communal purse and spend an amount on software, which it then apportions to valuable and worthless exercises, efficiently or wastefully. Nor is there an invisible hand deftly guiding everyone to make the correct connections between available capabilities and required services. Instead, there are local people making local decisions with local information. Some of their choices are good and others bad. We avoid bad choices by doing what the paradigms were inviting us to do in the first place: to think.

Cover photo by Aaron Burden on Unsplash.
Graham Lee

Graham is the chief labrarian of The Labrary, where the library and the laboratory intersect. He got hooked on making quality software in front of a NeXT TurboStation Color, and still has a lot to learn.