THE WASHINGTON POST – Innovation, Matt Ridley tells us at the start of his new treatise on the subject, “is the most important fact about the modern world, but one of the least well understood”.
Even as it functions as a powerful engine of prosperity – the accelerant of human progress – innovation remains the “great puzzle” that baffles technologists, economists and social scientists alike.
In many respects, Ridley is on to something. After decades of careful study, we’re still not entirely sure about innovation’s causes or how it can best be nurtured.
Is innovation dependent on a lone genius, or is it more a product of grinding teamwork? Does it occur like a thunderclap, or does it take years or even decades to coalesce? Is it usually situated in cities, or in well-equipped labs in office parks?
We can’t even agree on its definition. Generally speaking, an innovation is more than an idea and more than an invention.
Yet beyond that, things get confusing. We live in a moment when we’re barraged by new stuff every day – new phones, new foods, new surgical techniques.
In the pandemic, we’re confronted, too, with new medical tests and pharmaceutical treatments. But which of these are true innovations and which are novel variations on old products?
And while we’re at this game, is innovation limited to just technology, or might we include new additions to our culture, like a radical work of literature, art or film?
Unfortunately, no one happens to be policing the innovation space to say what it is and is not.
Mostly we have to allow for judgement calls and an open mind.
As an occasional writer on the subject, I tend to define innovation simply, but also flexibly: a new product or process that has both impact and scale.
Usually, too, an innovation is something that helps us do something we already do, but in a way that’s better or cheaper.
Artificial light is an excellent case study. Over time we’ve moved from candles, to whale oil and kerosene lamps, to incandescent and fluorescent bulbs, and now to LEDs.
Or, as another example, we might look to one of the great accomplishments of the 20th Century, the Haber-Bosch process to make synthetic fertiliser, as a leap that changed the potential of agricultural production.
On the other hand, we can regard the Juicero press – a recent Silicon Valley-backed idea that promised to “disrupt” the juice market and burned through more than US$100 million in the process – as a fake or failed innovation.
And still, this leaves us plenty of room for disagreement about what falls between these extremes and why.
Ridley enters into this messy arena with the intent of organising the intellectual clutter.
The first half of his book, How Innovation Works: And Why It Flourishes in Freedom, takes us on a tour through some highlights in the history of innovation.
We visit with the early developers of the steam engine, witness the events leading to the Wright brothers’ first flight at Kitty Hawk, North Carolina, and hear about the industrialisation of the Haber-Bosch fertiliser process.
There are likewise forays back to the early days of automobiles and computing, the development of smallpox vaccines and clean drinking water, and stories that trace the origins of the Green Revolution in agriculture, which alleviated famine for more than one billion people.
For dedicated science readers, Ridley’s lessons may have a glancing and derivative feel.
He knits together stories many of us have probably heard before – say, through the renditions of writers like Steven Johnson, Charles Mann or Walter Isaacson – but somehow misses the opportunity to enliven these sketches with a sense of wonder and surprise.
More seriously, he passes up the opportunity to footnote his summaries, leaving only a skeletal guide to sources in his back pages.
What becomes clear, though, is that Ridley is focussed less on exploring the pageant of history than on fashioning a new belief system.
I don’t necessarily mean this as a critique. In fact, the second half of his book – where he looks closely, chapter by chapter, at the factors that shaped the innovations he’s spent his first 200 pages describing – is more polemical in its approach but often more engaging, even as one might disagree with a narrative direction rooted in what I would characterise as the libertarian right.
Indeed, as his book progresses, Ridley makes it obvious that he is not presenting an academic treatment of scientific history.
Mainly, he’d like to proffer an argument for the importance of free-market principles and why they’re crucial to improving our world and our lives.
Ridley’s most important chapters, and his book’s most interesting, are where he calls attention to “surprisingly consistent patterns” that describe the process of making new things. Innovation, he tells us, is usually gradual, even though we tend to subscribe to the breakthrough myth.
Or as he puts it, “There is no day when you can say: computers did not exist the day before and did the day after.”
The innovative journey, as he shows us, goes back to Jacquard looms and the step-by-step advances of a number of early tinkerers. And at some indistinct point, new computing machines achieved functionality; then impact; and then scale.
He also illustrates how innovation can be a matter of the right people solving the right problem at the right time – and that it often involves exhaustive trial-and-error work, rather than egg-headed theoretical applications.
This was typically the case with Thomas Edison, who, as Ridley notes, tried 6,000 different organic materials in the search for a filament for his electric light.
Edison, he points out, “remained relentlessly focussed on finding out what the world needed and then inventing ways of meeting the needs, rather than the other way around”.
One problem with cherry-picking the history of innovation, however, is that you tend to leave out examples that weaken your claims for universal principles.
Innovations that involve academic or state funding are given short shrift by Ridley, leaving one to naively presume that whatever governments do by way of investment or regulation hinders rather than helps the cause of progress.
Thus, you won’t find a lot here about the development of the atomic bomb, which depended almost entirely on state largesse, or about the subsidisation of renewable energy.
Nor will you read much on the transistor, many early lasers or the photovoltaic solar cell, which were created under the auspices of Bell Labs, part of a government-authorised monopoly.
There is no mention of the Massachusetts Institute of Technology’s Rad Lab, which (thanks to the cavity magnetron, a British invention) helped develop radar.
And in Ridley’s story about the origins of Google, you will not see any indication that its founders were helped in their earliest days by a grant from the National Science Foundation.
Indeed, his book consistently plays down the influence of public funding in medicine, public health, personal technology, transportation and communications; it likewise minimises – quite strenuously, and erroneously – the role of federal assistance in the development of natural gas fracking, which was kept alive by research investments from the Energy Department in the 1970s.
It may be the case that we increasingly prefer argument to evenhanded analysis.
The world is too bewildering, and the field of innovation reflects the extreme complexity of our sciences, economics and politics.
Therefore a skilled polemicist can help us cut through the confusion. Yet by the end of this book, it’s hard not to ask whether the author has avoided difficult questions about his subject.
If you were wondering how new technological capabilities – in biology, computing or material science – have substantively changed the nature and pace of innovation since the days of the steam engine, you won’t find satisfying answers here. More crucially, you won’t come to any insights about whether some economic sectors, such as energy, follow different innovative patterns because of our political systems and our legacy investments in oil, gas and coal.
Instead, Ridley’s final pages focus on esoteric debates that probably mean little to most readers – disputes about “linear innovation”, for instance, which involve whether innovation goes in one direction, from a scientific idea to an engineered product – that were all the rage in academia decades ago but are now largely exhausted.
It is, in many respects, indicative of this book’s inefficient approach to solving the puzzle that innovation presents. Indeed, by his conclusion, Ridley can tell us only that innovation “is the child of freedom and the parent of prosperity” and that “we abandon it at our peril”. It is unclear who would actually advocate abandoning it, or why the human urge to move forward is now supposedly at risk.
It seems more reasonable to believe that the pursuit of innovation will be just fine, as long as we keep encouraging and incentivising men and women who are trying to solve important problems.
And we don’t necessarily have to create an ideological schema to explain what may be happening.
For instance, our smartest scientists and engineers are now working around the clock, and around the world, to fashion a vaccine for the novel coronavirus.
They are approaching a big problem with lots of funding, lots of talent, lots of teamwork and lots of ambition. Isn’t that how innovation works, too?