Banning, Adopting, Reckoning
Last month, OpenAI, the company (in)famous for its ChatGPT product, released a course called “ChatGPT Foundations for K-12 Educators”, an event that has raised more than a few eyebrows, and even some outrage. We must have a serious conversation about the value of a bullshit generator in the context of teaching programming skills to new generations.
It will most certainly not come as a surprise to our regular readers that this magazine has a strong tendency towards Luddism. Yes, even when it comes to one of the most recent inventions made by man, the computer, we will continue to put our foot on the brake and call for reflection and evaluation, venture capital be damned. Of course, this reflection rarely happens, and we enter again and again cycles of hype that do more harm than good.
These days, generative artificial intelligence is all the rage. All of a sudden, and seemingly out of nowhere, a statistical bullshit machine, a far cry from the artificial intelligence I dedicated an open letter to a few years ago, is helping students craft 2000-word essays in seconds, and the same bullshit machine is used by overloaded and burned-out teachers to summarize (and maybe even grade!) those same essays in an even shorter amount of time.
Oh brave new world, that has such people in it!
Can we ignore ChatGPT (or other LLMs for that matter) when teaching programming in the year 2024? Hardly. It is there, it is free to use, and it can be damn good at certain tasks. Damn good in the short term, unfortunately. In the long run, it might have a very detrimental effect, unless we adopt some measures right now.
Electronic Calculators
I belong to the generation of human beings who saw the sudden appearance of the electronic calculator in our pockets. During the 1980s, their price dropped as fast as their capabilities grew; from machines that could perform a square root calculation in a matter of seconds in 1980, we ended the decade with programmable calculators featuring hyperbolic trigonometric functions and a myriad of other capabilities.
What was the reaction of math teachers worldwide to this sudden invasion? First, as expected, they reacted with a ban. We could not use calculators of any kind, not in the classroom, and especially not (God forbid!) during tests. They were prohibited, completely, absolutely, and entirely. We had to learn how to calculate square roots by hand (seriously, it is not that difficult), as well as the algorithms for plenty of other integer and even real-number operations.
Instead of calculators, we could sit at exams with certain books on our desks, filled with logarithm and trigonometry tables, featuring formulae for algebra, calculus, and geometry. French-speaking Swiss students might remember the ubiquitous “Formulaire et Tables” of the “Commissions romandes de mathématiques, de physique et de chimie”, published by the Éditions du Tricorne, that many generations of Swiss students used during their exams.
The second step was a slow adoption process for calculators in the classroom: we could use them, in particular to solve exercises outside of exams, but we could not use the programmable kind (which were quite expensive, anyway), and very often not at all during exams. Get yourself a cheap Casio or Texas Instruments calculator, like the ones featured next to a Walmart cash register, thank you so much. Again, no use of those during exams; but hey! We could do our exercises much faster, which meant we could do more exercises (and more complex ones at that) in the same amount of time.
Finally, the reckoning phase came, after all. By the 1990s I could use my all-powerful Hewlett-Packard 48GX with 128 KB of RAM, with the RPL programming language (inspired by Forth), featuring 3D graphics, matrix calculations, and even an equation editor, at any time, in any situation: classrooms, exercises, exams, you name it.
(Disclaimer: yes, the HP 48GX had a short-range infrared port that allowed devices to exchange data via Kermit. Yes, we used that during exams to exchange information with fellow HP 48GX users. No, that was not really legal. I will not make any further comments.)
All of these possibilities came with a caveat: the problems to be solved during exercise and exam sessions were immensely more complex than those you would find on a term exam in 1981. Kermit or not, I failed too many of those exams. Turns out one needed much more than raw computing capability to get a good grade. Surprise!
And this is key to this story; we will have to learn how to deal with LLMs in the classroom in one way or another, just like we had to learn how to cope with calculators forty years ago. The genie is out of the bottle, and yes, it is a drunk genie that hallucinates quite a bit sometimes, unfortunately. But this is what we got today.
As I write these words, school systems everywhere are reacting to the sudden existence of this clumsy genie by either banning it (he? she? they?) or, in the best cases, by slowly adopting it as an integral part of the world those kids will inherit.
Should we then follow the ideas of OpenAI and attend the dreaded online course mentioned at the beginning of this article? Yes, but not blindly; be aware of who the creator of this course is, and what their agenda looks like.
As always, caveat alumni.
Large Bullshit Models
What is a teacher to do when they ask their students to program a system in some programming language, and a week later the students submit some ChatGPT-generated nonsense that they cannot even understand, let alone run properly?
We have to come to the conclusion that the “Hello World Era” is over. We cannot, and we must not, ever again teach computer programming as we have in the past. The existence of LLMs must mark the debut of a new style of programming education, whether we like it or not. Sorry for all those educators who were looking forward to retiring while applying the same good old recipe semester after semester!
The same way we have left behind BASIC, Logo, Pascal, and to a certain degree, even Scratch; the same way we have left behind the dubious practice of making students write code by hand on a piece of paper (a sad idea this author witnessed first-hand thirty years ago); we will have to leave behind the usual coding tests, the standard programming workshops, and we will have to accept the existence of LLMs in our world, whether we like them or not.
Embrace the hallucination, and teach your students to digest it, to work with it, to learn from it.
In terms of reckoning, this means effectively using LLMs in the classroom; ideally not ChatGPT or similar ones, owned by large corporate conglomerates, but smaller ones, hopefully trained ad hoc for teaching purposes, fed with open access or public domain material. We are almost at the dawn of 2025 and thankfully, there are a few open-source models to choose from that fulfill these requirements, and maybe a business or two will provide such LLMs in the near future, if they are not already doing that.
(The astute reader of the last paragraph will most probably have realized that I just gave away, for free, an idea for the next multi-billion AI-powered startup. Do not thank me, just send a few stock options my way, thank you so much.)
Once the bullshit machines generate their code, the role of the educator is then to drive the discussion around those inevitable big balls of mud. Drive the reasoning process towards higher grounds: architecture, collaboration, large systems, maintenance, documentation, testing, quality. Use that code and make your students build large, very large systems even, with potentially hundreds of lines of code, and make them (the students and the systems) collaborate with one another. Take the output from those LLMs and then make students fix it (potentially with the help of an LLM!), document it, wire it, test it, maintain it.
This is something that one of the very creators of OOP had in mind:
Contrary to general practice Kristen Nygaard argued that the teaching of object orientation should begin with sufficiently complex examples in order to expose the strength of analysing and describing complex situations in an object-oriented perspective.
(Jens Bennedsen and Michael E. Caspersen, “Teaching Object-Oriented Programming”, 2004.)
Teachers should drive their students to generate and analyze code in various programming languages. They should help them become polyglot software engineers, not just myopic single-language or single-paradigm zealots, able to deal with (at least) the top 20 languages of rankings such as TIOBE, RedMonk, and PYPL. Teachers should drive their minds to become fluent in both object-oriented and functional programming languages; in both academic and enterprise languages alike.
Empower your students so that they can make the impossible dialogue possible. We need this, more than ever.
Finally, make students sit for their final term programming exams in air-gapped environments without network access to the broader internet. Provide them with virtual machines or containerized environments, with all the information they need (including man pages, HTML websites, books in PDF and EPUB formats, etc.) and all the tools they need (compilers, runtimes, etc.) and make them create systems from scratch. Evaluate those exams using automated testing (and, please, not with another LLM!) so that they can get immediate feedback on their final grade.
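As a minimal sketch of what such automated, immediate-feedback evaluation could look like (the test cases, file names, and grading scheme here are entirely illustrative, not a real exam setup):

```python
#!/usr/bin/env python3
"""Minimal sketch of an offline exam autograder: run a student's program
against a fixed set of input/output cases and report a score immediately.
All test cases, file names, and the grading scheme are illustrative."""

import subprocess
import sys
import tempfile
from pathlib import Path

# Each case: (stdin fed to the program, expected stdout).
# Hypothetical exercise: read two integers, print their sum.
TEST_CASES = [
    ("2 3\n", "5\n"),
    ("10 -4\n", "6\n"),
    ("0 0\n", "0\n"),
]

def grade(program: Path) -> float:
    """Return the fraction of test cases the submitted program passes."""
    passed = 0
    for stdin_data, expected in TEST_CASES:
        result = subprocess.run(
            [sys.executable, str(program)],
            input=stdin_data, capture_output=True, text=True, timeout=5,
        )
        if result.stdout == expected:
            passed += 1
    return passed / len(TEST_CASES)

if __name__ == "__main__":
    # Stand-in for a student submission, written to a temporary directory.
    with tempfile.TemporaryDirectory() as tmp:
        submission = Path(tmp) / "submission.py"
        submission.write_text("a, b = map(int, input().split())\nprint(a + b)\n")
        print(f"Score: {grade(submission):.0%}")
```

In a real exam, the same harness would run inside the isolated environment, pointed at each student's working directory, so feedback arrives before the student leaves the room.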
Your students should program stuff during those exams in an isolated environment that contains the minimum required to do their work. In terms of technology choices, small LLMs like the Granite family of large language models, web-based development environments such as Eclipse Che, or even (if all else fails) VirtualBox virtual machines could certainly be used to bring such an experience to life.
(Disclaimer: at the time of this writing the author of this article works for Red Hat, an IBM company, and yes, this employer sells some of the items enumerated above.)
We are firm believers in what Seymour Papert once said:
Being a mathematician is no more definable as “knowing” a set of mathematical facts than being a poet is definable as knowing a set of linguistic facts. Some modern math ed reformers will give this statement a too easy assent with the comment: “Yes, they must understand, not merely know.” But this misses the capital point that being a mathematician, again like being a poet, or a composer or an engineer, means doing, rather than knowing or understanding.
(Seymour Papert, “Teaching Children to Be Mathematicians Versus Teaching About Mathematics”, July 1971. Emphasis in the original.)
And in the age of the LLM, we believe that the “doing” part has to change substantially… once again.
Conclusion
As the big caveat of this article, I feel I must quote a recent paper highlighting many of the risks brought by LLMs in society, including those related to the educational context:
Educational uses of generative AI pose several other challenges. One is the perpetuation of biases and discrimination, potentially reinforcing racial or gender-based stereotypes during personalized learning, automated scoring, and admission processes.
(…)
The current debate about the role of generative AI, from primary schools to universities, revolves around whether generative AI should be banned, permitted under only some cases, or allowed as assistance for teachers and students. (…)
We argue that these approaches are limited in vision. A more forward-thinking approach would involve a curricular revolution to redefine the skills and competencies necessary to effectively utilize generative AI.
(Valerio Capraro, Austin Lentsch, Daron Acemoglu, et al. “The Impact of Generative Artificial Intelligence on Socioeconomic Inequalities and Policy Making,” 2024.)
Assuming LLMs are to be accepted in education, what should be the guiding principle? First, teach your students to think like humans. Even if it hurts (spoiler alert: it does). Second, climb the abstraction ladder, using the LLM as a tool to generate the minutiae of the larger systems at hand.
If anything, this author firmly believes that programming skills are second to those related to communication; most engineers coming out of colleges these days are unable to express themselves in public, to teach their peers, to write an essay or a blog post, to communicate their ideas to stakeholders, or to put together a simple documentation bundle without suffering a seizure in the process.
If we are going to have LLMs performing the grunt work of coding, we need students to become the architects of tomorrow, not just another coder selling their work on Fiverr for a living. Help those students build larger and more complex systems, while at the same time making them conscious of the process, and helping them develop those much-needed “soft skills”.
At the end of our article about BASIC, a programming language that was quite literally created to teach students how to program, we lamented the current state of programming education with these words, which we reproduce here for the sake of memory and repetition:
We used to teach kids how to think like a computer. These days, however, we are more interested in teaching computers how to think like kids, and we are doing a terrible job in both cases.
Cover photo by Aaron Lefler on Unsplash.