Having been a long-time scientist myself, I’ve observed time and time again one very persistent approach to innovation among my fellow scientists: take what’s been done, and improve it. Not a single project I’ve participated in could skip this important step: look at what’s already been done, study the literature, talk to those who walked the path before you, learn what their approaches do well and where they fall short, and see if you can keep the "good stuff" while avoiding the pitfalls, generally by tweaking things here and there. Granted, most of our technology comes from exactly this approach: learning more and more about specific methods and polishing them to perfection, until hardly anything can be improved. At that point science proudly declares the method "the state of the art" and "the best it can ever be", mathematicians formulate theorems proving that nothing better can be done with this technology no matter how hard you try, and the method enters the classical textbooks as "the way to go". Until someone invents a new technology that totally outperforms the "old and tried" ways, leaving everyone wondering what just happened…
"You don’t replace the old. You make it obsolete by introducing a superior methodology."
(Buckminster Fuller)
Remember the vacuum tubes? Neither do I. Perhaps the only surviving vacuum tubes these days are in CRT TVs and computer monitors, and even those are becoming increasingly obsolete. With the invention of the transistor, electronics suddenly became cheaper, more energy-efficient, and much more compact. I remember playing with transistors as a kid, soldering simple radios and amplifiers for my home fun projects. I still remember using the equations to balance a simple one-transistor amplifier with about 10x amplification, and stacking those stages up to get a decent output from a microphone. That was a rather delicate process, requiring some trial and error, fine-tuning, and quite a bit of time. Then came the integrated circuits. I still remember the feeling of awe when I first got hold of those IC amps that looked just like a single transistor, only with more "legs", and had a whopping 100,000x of amplification power, with all the balancing done by just two resistors in a "negative feedback loop" that limits the amplification, sacrificing it for the stability of the circuit. So much simpler, so much higher quality, so much more predictable. You could still play with transistors, but anyone with these ICs would leave you in the dust in no time. Nowadays even that technology seems ancient to me: entire systems are implemented on-chip, and all you see are the external controls, displays, and headphone jacks connected directly to a single chip on a tiny printed board. Period. How do you ever beat that with single transistors, or even the IC amps? You can’t. And if that’s not impressive enough, we now carry the equivalent of the early Cray supercomputers in our pockets merely to entertain ourselves with over 1000 hours of CD-quality music at our fingertips…
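The gain-for-stability trade that the feedback loop makes can be sketched in a few lines. This is a minimal illustration, assuming a standard non-inverting amplifier configuration; the resistor values are made up for the example, not taken from any actual circuit:

```python
# Closed-loop gain of an amplifier with negative feedback.
# The two resistors set the feedback fraction beta; the closed-loop
# gain then collapses to roughly 1 + Rf/Rg, almost independent of
# the huge (and unreliable) open-loop gain of the IC.

def closed_loop_gain(a_open: float, r_f: float, r_g: float) -> float:
    """Gain A / (1 + A*beta), where beta = Rg / (Rf + Rg)."""
    beta = r_g / (r_f + r_g)
    return a_open / (1 + a_open * beta)

a_open = 100_000            # the "whopping 100,000x" open-loop gain
r_f, r_g = 100_000, 1_000   # illustrative feedback/ground resistors, ohms

actual = closed_loop_gain(a_open, r_f, r_g)
ideal = 1 + r_f / r_g       # the textbook approximation 1 + Rf/Rg

print(f"closed-loop gain: {actual:.1f} (ideal: {ideal:.0f})")
```

Even if the open-loop gain drifts by a factor of two, the closed-loop gain barely moves; that is exactly the predictability the circuit buys by giving up raw amplification.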
Which reminds me of another story. Remember the open-reel audio tapes? Or the vinyl records? Well, if you are under 25, you probably don’t even know what I’m talking about… The reels were popular in the 1960s and were still in use in the early 70s; I had one player at home as a kid. The technology kept improving: instead of the one or two tracks of the early systems, one could soon record 4 tracks of audio on a 6.25mm tape. Further improvements reduced the tape speed from 38cm/s to 19cm/s, and then even further to 9.5 and even 4.76cm/s (the "super LP" mode). Dolby noise reduction further improved the sound quality that was previously only available at very high tape speeds. All went well, until the compact cassette came along. With its tiny form factor, convenience of handling, and up to 90 minutes of high-quality sound recording, the consumer market totally embraced the new technology. Sales of open-reel tapes dropped, and manufacturers were forced to discontinue their mass-market production. The noise-reduction technology was also streamlined and perfected to accommodate the now cheaper cassette players: the ubiquitous "Dolby B" retained much of the sound quality of the reels. The technology kept steadily improving, and around 1990 Dolby came up with "Dolby S", a recording technology that made cassettes rival the then-emerging digital audio. If you’ve never heard of Dolby S, it’s for a good reason. At about the same time, the optical Compact Discs entered the game. With the signal now represented digitally, the noise and non-linearity of the recording medium became entirely irrelevant. The tapes, reels and cassettes alike, became history. Then, at the turn of the millennium, yet another technology quietly appeared. At first mostly unnoticed by the industry, it suddenly swept over the world of audio like a tornado, creating havoc and legal panic at the RIAA: the "notorious" MP3. CD sales dropped.
People didn’t want to carry shelves full of jewel cases, or buy the bulky CD changers – they’d rather carry a tiny pocket device called an "mp3 player" and have their entire music collection with them on the go. Unlike a CD player, you can jog with it, drop it, shake it, wear it on your arm, or clip it to your belt – the music will play without skipping a beat, for hours on end. And, unlike the CDs, if you happen to run over your player with a steamroller, the music will still be there on your computer, intact. I now buy more music on iTunes than anywhere else, and even if I happen to get an occasional CD, I immediately rip it and load it onto my iPod, never to look at that CD again.
It seems that, despite the prevalent belief in the scientific community, the history of innovation tells quite a different story: the "next big thing" is not the old, fine-tuned and improved, but a totally new, superior technology that happens to be so much more efficient that it makes the old one obsolete. Just as in biological evolution, it’s not the gradual selection of the fittest, as Darwin thought, that changes a species in the long run, but a sudden mutation that happens to be more fit for the times – only to be replaced by another superior mutation later on. Evolution is not linear; it happens in quantum leaps, with a new mutant obsoleting the rest of its species.
Ironically, this has happened to science itself. In mathematics it was the invention of calculus, the discovery of logical paradoxes in set theory, the theory of types, and Cantor’s transfinite numbers, culminating in Gödel’s famous incompleteness theorems and the later move into computability and complexity theory by Turing and others, that made previous efforts like Hilbert’s program irrelevant or outright impossible. It happened in physics, where the naive models of the world held by earlier philosophers were overthrown by Newton and his mathematically precise laws, which in turn were later shown to be only approximations by the theory of relativity and quantum mechanics.
The industrial age – the economic child of precise science – was overshadowed by the information age with the invention of the computer, and now this much shorter-lived era is in turn being replaced by a new wave of technologies. Even within computer science, the early assembly code was replaced by the first high-level languages like Fortran and C, then by much more advanced object-oriented languages (C++, Java), and now graphical programming environments such as UML, Flash, and AppleScript, and even web-based apps and mash-ups like WordPress, take the development of computer applications to ever higher levels, making the old ways look archaic in comparison. It took YouTube one year to revolutionize internet video. It took another year or so for iTunes to respond with video-on-demand; it’s only a matter of time before DVDs become obsolete, just as VHS tapes are now a relic. Within a few years, the entire industry of 35mm still cameras was swept into oblivion by digital cameras. It didn’t take much longer for digital video to catch up with 35mm film, and now you can buy a $1000 camcorder that rivals the expensive Hollywood equipment of the 90s. And for the "real price tag" – the RED camera will take you places you’ve never even dreamed of with the good old film.
In the words of Buckminster Fuller, "You don’t replace the old. You make it obsolete by introducing a superior methodology." Or, as David Neenan put it, "It’s the mutants that make it."
Happy New Year, and may it be a year of superior mutations – outdating the old ways!
[techtags: mutants, outdating the old, superior methodology, Buckminster Fuller, science, electronics, music industry, mp3, calculus, logic, set theory, Cantor, Goedel, incompleteness theorem, Turing, Darwin, evolution, Newton, Relativity theory, quantum mechanics, computer science, Hollywood]