Creativity: How we lost it, why that’s bad, and how we get it back
Posted on Sun 21 November 2021 in blog • 14 min read
Right now, it’s easy to open a news site and come to the conclusion that the world as we know it is well and truly fucked. It doesn’t matter if you’re looking at Covid response, or inaction on climate change, or corruption, or communications surveillance: it increasingly looks like we are being governed and managed overwhelmingly by dunderheads just bumbling along, equipped with less than the most basic empathy, and lacking the mental faculties required to understand exponential growth or conditional probability or even percentages. And people — intelligent people — are seriously pondering the situation with utter befuddlement: how the hell did we get here?
The German writer Max Buddenbohm recently asked his followers a question to that effect on Twitter, which I am taking the liberty to translate:
Do you have a considered opinion on why everything is so poorly organized, as in at its core? Historically or sociologically, what’s the real underlying reason? How did this happen?
Now it’s perhaps a bit amusing and stereotypically German to complain of ſchlechte Organiſation! in the middle of a pandemic and global climate cataclysm, but the sentiment behind the question is sound: it looks as though at every twist and turn, those empowered to make any kind of decision make the wrong one, or none at all, or — the worst — aren’t even able to come up with sensible options between which to choose.
And I have a hypothesis: I think the issue at the core of why everything appears to be going down the tubes is that we have systematically drummed creativity out of people, for at least fifty years. And as a result, we collectively have no idea how we can get ourselves out of a rut.
Let me swiftly explain what I mean by “creativity.”
The wonderful Sir Ken Robinson, who left us much too soon in 2020, described creativity as “the process of having original ideas that have value.” And in arguing for the value of creativity, we don’t need to get all hippy-touchy-feely: creativity — and teaching and learning creativity — is a simple economic necessity that is also essential to our survival as a species.
If we have no idea what our world will look like ten years — or even ten months — from now, then how the hell is any “hard skill” we acquire today guaranteed to be useful in future challenges? The paramount faculty we need to acquire, train, and nurture is the ability to come up with flexible, intelligent, creative solutions to the problems we’ll encounter tomorrow.
And this we haven’t been doing. From an early age onwards, generation after generation has been schooled in “the right way” to do things. And since “the right way” exists only for things we already know, we are now ensnared in a trap called conformity that is no good at all in the present situation where so many of us are confronted with things we haven’t a clue about.
To illustrate what I mean, allow me to offer an anecdote from my own family. When my son, who is now nearly an adult, was a nine-year-old in his fourth year of primary school, he started to be given little stories to write — it would be an exaggeration to call them “essays” at that point. And since he was quite an imaginative kid, the first couple of stories he wrote were truly charming and delightful. But after a few weeks, something strange happened: his stories were getting rather bland and boring, and were hardly a reflection of his vivid imagination anymore. So as his parents, we gently asked him about this mysterious phenomenon — and he was quite happy and forthcoming to explain the reason. He had observed, he told us, that the classmates who had got good marks and the teacher’s appreciation for their first stories were the ones who had turned in the writing with the fewest spelling and grammar mistakes. So he had resolved then and there to henceforth only turn in stories composed of words he already knew how to spell, strung together in constructions he was already familiar with — and that he was thus unlikely to stuff up. Of course that made reading the stories about as exciting as watching paint dry, but it kept the teacher happy and thus off his back.
And I can attest that this very much goes for my own schooling as well. It doesn’t matter whether we’re talking about my German classes or English classes or French classes: “correctness” always prevailed over originality or wit or creativity. I don’t mean to insinuate that clever writing should get you a free pass to absolutely butcher your spelling and grammar, but on the other hand perfect orthography and punctuation always made up for abject boredom, and that’s not quite right either. And this wasn’t restricted to just language classes: in maths exams there was no extra credit to be had for arriving at the correct solution of a problem in a novel or unconventional way. Nay, such brazen nonconformity would net you either a reprimand from the teacher for not showing the correct path to the solution, or at least a snide remark of the “oh you think you’re very clever don’t you, now sit down and behave” type.
My English and French and German teachers appear to have been unaware of another fact related to their institutional correctness obsession. Consider this: people who actually make a living from writing — no matter if it’s fiction or non-fiction — tend to work with editors, people whose calling it is not only to correct issues with orthography or punctuation, but also to helpfully point out plot holes and suggest the occasional rewrite of dialogue or rearrangement of chapters. Editors are highly respected by writers and instrumental to the success of a book, and yet, strangely, their names are normally not printed in bold letters across the book cover, and they don’t appear on bestseller lists either. If my language teachers had been correct, editors should be celebrity superstars! And they should hold far greater prestige than the silly authors who only come up with the storylines but regularly struggle with the placement of a semicolon.
And the problem with the idea that errors are awful — an idea hammered into our heads from an early age — is that it has a devastating effect on our creative imagination:
I don’t mean to say that being wrong is the same thing as being creative. What we do know is: if you’re not prepared to be wrong, you’ll never come up with anything original.
— Ken Robinson, “Do schools kill creativity?” (2006)
So, if you want people to be boring and dull and utterly devoid of originality, then foster a culture where being error-free is paramount.
In the business world, some tend to look down on others who have a tenuous relationship with spelling and grammar and punctuation, chiding such deficiencies as “unprofessional.” But at the same time, it’s commonly accepted that people get dragged into a meeting and forced to listen to someone drone on for an hour in a narcotic monologue consisting of the recitation of thirty-four wall-of-text slides with precisely seven bullet points each, in a ferocious assault on everyone’s attention and consciousness that gives an overdose of Valium a run for its money. How, pray tell, is it ever “professional” to steal people’s precious lifetime by boring them out of their fucking minds? And yet, this is somehow acceptable behavior in the world of business.
Let me add another bit of anecdotal first-hand experience: a few years ago when I was making most of my living as a travelling technical consultant, I was often brought in to help a team of engineers chart a path for solving a particular problem using one of the technologies I knew a little bit about. And in doing so, some decisions frequently boiled down to two choices, which I was always happy to lay out in detail: here is option A, it comes with these advantages and disadvantages, and here’s option B, it comes with those advantages and disadvantages. And I would explain to my client that it is now their business decision to determine whether the pros of A outweigh the pros of B for their business, and whether or not they would be able to live with the cons of whatever option they chose.
And inevitably, the reactions to this nearly always fell into one of two categories.
The first category was gratitude for me having laid out the options clearly and distilled the pros and cons of each, informed by my technical expertise on the matter and my understanding of their situation. And the business decision was either completely obvious to them, or they appreciated having a good basis on which to make a decision, which they resolved to make in the following days or weeks, presumably after some more empirical, experimental evaluation.
And the second category was complete confusion on a person’s face, followed by the exasperated question, “so what do you recommend?” Or, worse, “what’s the right way?”
You can probably guess which category of answers was more likely to come from managers — or “leaders”, as such people like to be called in a gross exaggeration of their capabilities.
And now imagine someone having gone through a conventional primary and secondary education, then continued on to university and from there into business or the law, or maybe the academic Ph.D. track, while in parallel having risen through the ranks of that cult of rigidity and conformism called a political party — and ultimately entering public office.
On that note, another illustrative anecdote. Not from myself or my family but still close to home: you may remember how early in the Covid-19 pandemic in February 2020, the Tyrolean ski resort of Ischgl became Central Europe’s first major infection hotspot, directly linked to at least 600 cases in Austria and more than twice as many across Europe. (At the time, 1,800 confirmed Covid cases seemed like a lot. As I write this, we have about 10 times as many. Per day.) The universal understanding of the majority of observers at the time — and today — was that the situation on the ground had been horribly botched, and that egregious mistakes were made that greatly facilitated the spread of Covid-19 across Europe.
The official in charge of public health in the provincial cabinet then went on national news a couple of weeks later to discuss the events. And, despite the repeated questions from the exasperated interviewer, voiced with increasing levels of disbelief, the official kept insisting that the authorities had “done everything right” and were not to be faulted at all for their actions — and inactions — in this public health emergency.
And I don’t even think that this was just hubris or an attempted cover-up. Rather, it’s a symptom of the exact problem I’m trying to describe. If you’re applying only what you already know to a situation that has never occurred before, you will fail horribly at dealing with it. But if you’ve been conditioned to believe that applying “the correct solution” is all you’ll ever have to do in life to succeed, you eventually end up genuinely believing that that’s enough.
And that’s how, in the greatest crisis that humanity has faced in peacetime in over a century, what we’re stuck with are leaders who, for the most part, have been so thoroughly molded by the perpetual vicious cycle of conformism that they now operate with the decisiveness and agility of a herd of mammoths deep-frozen in permafrost. They are just shockingly ill-equipped to deal with a global health crisis affecting a closely interconnected civilization. And the ones that actually used creative ways to get to power turn out to be sociopathic one-trick cronies that could not apply their skills to something useful if their life depended on it.
It looks as though systematically, those who have the power to make decisions, at all levels — in business, politics, anywhere — frequently lack the intellectual, emotional, and empathetic creative capabilities required to make those decisions. This is not to say that exceptions to this rule don’t exist — I’m lucky enough to live in a town where officials from the mayor on down have been exceptionally creative, empathetic, and successful in their Covid response, for example — but I would argue that the rule does stand.
Now, it would be entirely fair to counter my arguments with the observation that surely, in, say, the 1950s or 1930s or 1910s, schooling and education were even more rigid than they are today, and certainly did not allow for more creative freedom than they do now. And that is certainly true, but children in those days (at least those lucky enough to go to school; I am aware that for those who spent their childhood toiling in the fields or coal mines it was an entirely different story) had something during their schooling that is a precious rarity today: time and space to let their minds play.
John Cleese writes in his excellent book on creativity that mental play — the ability to let your mind wander and thereby become open to developing new ideas — is a key element of the creative process. And this is by no means limited to music or literature or the arts; some of the examples he lists of mental playfulness leading to groundbreaking new ideas are from the world of science. Now, it is essential to be able to keep our mind in that state of playfulness for some time, because it takes a little while for new ideas to pop into our head. And so, Cleese observes:
The greatest killer of creativity is interruption. It pulls your mind away from what you want to be thinking about. […] It might be an interruption from outside, like someone coming over and talking to you, or an email popping up in your inbox. Or it may come from inside, as when you suddenly remember something you’ve forgotten to do, or worry that time is running out, or that you don’t think you’re clever enough to solve whatever problem it is you’re trying to deal with.
— John Cleese, “Creativity: A Short and Cheerful Guide” (2020)
And this is the bit that’s incredibly difficult to achieve for most people born after about 1970, and many born earlier too. That includes any child alive today, but also their parents and many of their grandparents. We are constantly dealing with outside interruptions, many of them coming from a device we carry in our pockets all day. We have to fight for our uninterrupted mental play time. A child in the 1950s might just run off to play with friends for the afternoon, and return home for supper — without a text from a parent or a Snapchat message from the school bully rudely interrupting halfway through. It’s perhaps no coincidence that some of those 1950s children ended up being 26-year-olds who could figure out how to power up a disabled spacecraft while it was on a free-return trajectory around the Moon, saving the life of a three-man crew in the process.
There’s more evidence that even in the world of engineering, allowing your mind to let go for a bit can lead to creative breakthroughs: Jim Crocker was the Space Telescope Science Institute engineer who solved the problem of how exactly the corrective optics on the Hubble Space Telescope should be installed — obviously not a scenario that anyone accounted for in Hubble’s design. The ingenious idea that ended up saving Hubble from being a multibillion-dollar boondoggle came to him in the shower.
And in the offices of that era, where we shared a room with maybe one other colleague, we had stretches of time when the other person was off running an errand in town or taking a meeting in a conference room. And you could close the door and put up a “do not disturb” sign and do some uninterrupted thinking.
Around the year 2000, most of that started to change, dramatically. Open-plan offices, which of course were allegedly introduced to “facilitate cooperation,” eliminated any room for uninterrupted mental play at work. Emails replaced typed and printed memos, and office workers transitioned from workstations to laptops. Pagers and cell phones started to buzz people at home. Around 2010 we progressed to smartphones, tablets, and other portable devices with the ability to ping us out of playful thought with an audible notification at any time of day. Knowledge work became interrupt-driven — a sentence that should, in itself, make anyone who knows anything about knowledge work shudder.
And it’s no surprise that people who rose to corporate leadership after being imprinted by an interrupt-driven lifestyle — that is, people now in their 50s — think that such a thing is normal, and try to impress the same thing on their subordinates. That’s how you get to Slack-driven companies. That’s how you end up in a culture where people are proud of getting back to any email within 30 minutes (or less), and expecting the same from everyone. That’s how you end up with managers who take being signed into a chat (and thereby constantly listening for interruptions) as a measure of being “active”, so much so that they end up tracking metrics for it, and of course also making it a target they call “engagement” or some other abomination.
Organizations that do that are at risk of constantly shutting down creativity, problem-solving, and innovation. What they’re good for is developing efficient cookie-cutter techniques for optimizing solutions for yesterday’s issues. What we need today — in a global pandemic, and in the roiling climate crisis that’ll make this pandemic look like a walk in the park — is people and organizations equipped with the mindset for the issues of tomorrow. And a first imperative for making that happen is to let people think.
So is everything all doom and gloom? I’d say it isn’t, and there are indeed some things that make me hopeful. Some of those are related to a changed approach to creativity in education, some, to a changing approach to work.
For example, in my country, as of a few years ago, creativity and originality are indeed accorded at least some merit in marking and grading standards. And this is at the secondary school level, traditionally one of the most rigid and unchanging branches of education where I live. Much more still is happening at the primary and preschool level, where I see far more emphasis on thinking, creative play, and innovation in my younger kids’ education than I did in my older ones’.
And then, there’s the big push towards asynchronous distributed work — where obviously the asynchronous part is the bit that matters. Sure, companies suffering from offissification still exist, but for a while, so did dodos. But an ever-increasing share of humanity is beginning to understand what it’s like when you’re no longer shackled to seventeen Slack channels you constantly need to watch, when you can take a walk in the middle of the day because you know it’s OK to push something off for an hour to clear your head and come up with an idea, and when instead of spending an hour in a meeting you can spend 5 minutes reading a memo and use the other 55 minutes for thinking. And that’s where things get interesting.