By user13366129 on Oct 14, 2009
During my summer vacation, and driven by the fact that Oracle is acquiring Sun, I have lately been giving some thought to the state of the IT industry.
Then last week, some colleagues forwarded a link to a fake interview with Bjarne Stroustrup on the invention of C++ and the drivers behind it. That interview triggered my intent to write down some of these thoughts and publish them for discussion.
Additionally, the new hype word "Cloud" made me want to add some comments on that as well, especially since my possible new boss, Larry Ellison, calls it "water vapor".
So, let's start with some basic facts that I think are important for understanding what I'm trying to explain:
- Moore's Law
- Software needs
As a side effect of this rapid growth in compute power, many companies have gone out of business or are close to it. Think DEC, Data General, Compaq, SGI, Cray, Apollo Computer, and others like them. Now Sun Microsystems will become history as well.
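To put the first fact above in numbers: Moore's Law is just compound doubling, and a tiny sketch shows how dramatic it is over the timespan this post covers. (The two-year doubling period is the commonly cited figure; the function name is my own.)

```python
# Minimal sketch of Moore's Law as compound growth.
# Assumption: transistor counts double roughly every 24 months.

def moores_law_factor(years, doubling_period_years=2.0):
    """Growth factor in transistor count after `years` years."""
    return 2 ** (years / doubling_period_years)

# From the Byte anniversary summit in 1990 to this post in 2009 is 19 years:
factor = moores_law_factor(2009 - 1990)
print(round(factor))  # roughly a 724-fold increase
```

Even at this back-of-the-envelope level, hardware capability grew by nearly three orders of magnitude over the period the Byte editors asked their panel to predict.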
The question behind this all simply is: Why?
In order to start an answer, I will first digress a bit...
In September 1990, Byte magazine published a famous 15th-anniversary edition titled "Byte Magazine 15th anniversary summit, setting the standards", which tried to sketch an outlook on the next 15 years of the computer industry and its development by asking 63 of the most influential people of the time to look back and, from that, predict the future. Sadly, Byte ceased publication in 1998, and although it continued as a web presence for some years, it is no longer available; it was removed from the web in February 2009, so the link above no longer works.
A link to the table of contents of that edition can at least be found here. The cover page is here. And some of the predictions published in that issue can be found here (that page calls it a brief excerpt; sadly, I dumped all my paper copies of Byte into the bin about 10 years ago, in the hope that it would stay online forever. How naive, wrong, and mistaken I have been... ;-( ) (discussion point two) (And, by the way: I'd like to get a copy, if anyone can spare theirs.)
In that discussion, many diverse topics were raised and answered by many, but not all, of the 63 people. One of them was Donald Knuth, author of The Art of Computer Programming and creator of the TeX typesetting system. He is quoted as saying:
Donald Knuth: ...computers are going to double in speed every year until 1995, and then they're going to run out of ideas.
How wrong he has been...
And as a proof point for my second topic above, I'd like to quote Brian Kernighan, co-author (with C's creator, Dennis Ritchie) of the classic book on the C programming language:
What about the software side of the equation? Or are all the changes coming in hardware?
Brian Kernighan: Software, unfortunately, is not nearly as easy to make go better as hardware seems to be. And the software will not get better fast enough, and so you'll piddle away more and more of [the power] on stuff that doesn't quite work the way you want it to....
And then there was the question:
What is the biggest obstacle to major new breakthroughs in computing?
In a word, software, not hardware.
Suffice it to say that I consider these predictions from almost 20 years ago a valid proof point for my thesis: software didn't grow, and doesn't grow, in its demands as quickly as hardware grew, and still grows, in speed and capability.
After this short detour into the past, via a look into the most important publication of that era, let's get back to the question of why the IT industry is in the state it is in today.
It seems obvious that some (shall I say: many?) have miscalculated the difference in development speed between software and hardware. Or they assumed that the overall demand for compute power would still ramp up and make up for the growth in produced CPU capacity, so that they could keep selling as much (in dollars) as before (which would have required selling far more units than before). That did work for quite a while during the gold-rush era of the early 21st century and the dot-com bubble, when many new enterprises were created that used the internet and needed CPUs like crazy.

But it also seems obvious from this that hardware is becoming more and more of a commodity, because the average cost per CPU cycle needed to solve a specific problem keeps going down. My favorite statement here, which I use often in virtualization talks and now in cloud talks, is: "Take your mobile phone out of your pocket, look at it, and remember: NASA had LESS CPU power (in total) to safely bring mankind to the moon and back than what you are looking at right now."
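The NASA comparison can be made concrete with some rough figures. The specific numbers below are my own illustrative assumptions, not from the original talks: the Apollo Guidance Computer had 2048 words of erasable memory (stored as 16-bit words, so about 4 KB), while a typical 2009 smartphone such as the iPhone 3GS shipped with 256 MB of RAM.

```python
# Rough, illustrative comparison (figures are approximate assumptions):
# Apollo Guidance Computer: 2048 erasable-memory words of ~16 bits each.
agc_ram_bytes = 2048 * 2

# A 2009-era smartphone (e.g. iPhone 3GS): 256 MB of RAM.
phone_ram_bytes = 256 * 1024**2

# How many AGC memories fit into one phone's RAM?
print(phone_ram_bytes // agc_ram_bytes)  # 65536
```

Even ignoring clock speed entirely, the phone in your pocket holds on the order of sixty-five thousand times the working memory that guided the lunar missions, which is the whole point of the commodity argument above.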
That then sets the stage for open and candid discussions on the state of virtualization, cloud computing, and the like. But, again, more on that later.