By Chris Barton, Editor-in-Chief
This series aims to take a critical look at ASU’s claim to be “#1 in innovation.” What does that actually mean? What does it say about us and our school? And is it even a good thing?
Part one of the series looked at what differentiates innovation from invention, and how the process of ‘innovation’ can create technologies that both hurt and help mankind. Part two talked about how prioritizing ‘innovation’ undermines the technologies that we depend on daily, such as our transportation infrastructure and indoor plumbing. This week we’ll look at whether or not innovation is a model that can be counted on to help us toward a future we want.
Under the paradigm of ‘disruption’, the process of innovation amounts to a gladiator battle where the newest, best technologies race to make the others obsolete. Electric lights made oil and gas lamps into trash. Our cell phones are replaced by new ones every two years or so. Musk’s hyperloop aims to turn the New England rail system into nothing more than rubble.
To reject new technologies entirely out of pity for the old would be silly and counterproductive. But it would be equally silly to assume that the haphazard battle for supremacy that is innovation would lead us where we want to go. Even at its best, innovation’s ‘creative destruction’ takes us one step backwards for every two steps forward. And not all innovations benefit humanity; some – like slavery or the atomic bomb – made the world worse for large swaths of the population. Others – plastic, the internal combustion engine – have had unintended consequences that are endangering our ability to continue living on this planet.
Yet, we’ve gotten this far. Surely, with all its faults, innovation must have a net benefit; after all, it led us to this wonderful world of indoor plumbing, penicillin, and the internet. But has the process of innovation really guided us to the world we live in today? Contrary to the way innovation is often talked about, it’s not an ancient force like gravity, nor a human constant like curiosity. In fact, innovation – as we use the word today – is a relatively new concept, and a recently developed model of progress. Innovation did not create our world; rather, it is the product of a series of ideas that arose over a very specific time frame.
Thanks to Google’s nifty Ngram Viewer, we can look at how the use of the word ‘innovation’ in written English – a proxy for the concept itself – has changed over time.
The chart shows three periods where the use of the word spiked. The first is around the middle of the 17th century, in the midst of the Protestant Reformation in Europe. During this time of ideological upheaval, innovation – the establishment of a new way of doing things – was essentially synonymous with heresy. In 1636, one Henry Burton accused the English bishops of innovating in church doctrine. They, in turn, accused him of innovating, cut his ears off, and threw him in jail for the rest of his life. To the 17th century English, innovation was a threat.
The next spike in the use of innovation occurred in concert with the industrial revolution, when, thanks to Watt’s steam engine, innovation in manufacturing exploded. It was during this time that a clear distinction between invention and innovation was drawn (the latter being an invention ‘brought to market’), thanks to an Austrian economist named Joseph Schumpeter.
The third surge in usage occurred in the decades after WWII, when America’s roiling capitalist economy was exploding with such innovations as roll-on deodorant, TV dinners, birth control, the microwave, and the computer. But this time, usage didn’t drop off – in fact it has kept increasing since, at an unprecedented rate. Has our society really been innovating this explosively?
No, says George Mason economist Tyler Cowen, quoted in the Atlantic: since 1970, the “forward march of technological progress has hit something of a dry spell.” Yet we keep talking about innovation. Why?
As Harvard historian Jill Lepore points out, disruptive innovation – as a theory of past change and as a model for future progress – is an innovation itself, holding only a tenuous connection to historical fact or established ideas of how humanity advances. The idea appeared on the scene in 1997, in Clayton M. Christensen’s “The Innovator’s Dilemma,” a book that both introduced the idea of disruptive innovation and demonstrated it. Christensen’s book followed the very model of progress that it promoted: it disrupted the way people thought about human advancement, convincing everyone to depart from inherited wisdom and pursue the newest, shiniest idea – namely, his shiny new idea. The book ended up being an excellent example of how a new idea (an intellectual invention), when effectively marketed and disseminated, can become an influential innovation regardless of its actual usefulness.
In the wake of the book’s release, the message heard everywhere from newsrooms to universities to hospitals was clear and compelling: innovate and disrupt – lest someone disrupt you. Rather than calling for a careful plan to strengthen and sustain the core business, the siren song of disruptive innovation tended to provoke a panicked and ill-advised rush to forsake the old and replace it with the new. For example, The New York Times’ ‘Innovation Report’ would see the classic paper abandon its careful content curation to chase after Buzzfeed and Vox for the novel metric of online “attention minutes”. Similarly, Christensen’s “The Innovative University” (published in 2011) led many schools to discard the ancient infrastructure of in-person lectures for online classes – with questionable results.
The problem was that “disrupt or be disrupted” is not Christensen’s message. The original concept of disruptive innovation is much more nuanced and specific than what is invoked in the New York Times report or around many campuses, as his work was originally based on observations from specific business environments. His disruption model – which describes a particular process – has been shown to be right about 66% of the time, when applied correctly. But too often, it isn’t. The author himself laments that “the theory’s core concepts have been widely misunderstood and its basic tenets frequently misapplied… Too frequently, [people] use the term loosely to invoke the concept of innovation in support of whatever it is they wish to do.”
Yet Christensen’s promotion of innovation – along with that of his acolytes and contemporaries – seems largely responsible for the recent explosion in the word’s usage. Before “The Innovator’s Dilemma,” innovation was used as a descriptive model, to depict a way that change did happen. After 1997, innovation became a prescriptive model for the way change should happen. Nowadays, everyone wants to call themselves an innovator; everyone wants to disrupt.
Christensen’s succinct explanation of why companies fail has grown, feeding on constant promotion and overuse, into an unwieldy monster that has broken free from its creator’s control and taken on a life of its own. Innovation can now be found prowling the halls of institutions the world over, whispering in everyone’s ears that the world is a dangerous, tumultuous place where someone is always out to disrupt you into oblivion. Innovation is both the threat and the opportunity, the creator and the destroyer, the alpha and the omega. It claims to be omnipresent and omnipotent. And it has come to roost at ASU.
When we claim to be #1 in innovation, how are we using that word? We no longer spit it at heretics, as we did in the 1600s; nor are we using it to describe the process by which the world is changing, as we did during the industrial revolution; nor to categorize the novel things around us, as we did in the middle of last century. Nowadays, we use it to label ourselves, our projects, and our institutions – but not in order to describe what we do. Instead, we use it only to hitch ourselves to the runaway bandwagon set in motion by Christensen. Innovation is but a buzzword, a marketing phrase, a paper tiger and an empty promise. Its rhetorical use has outstripped its descriptive validity.