Moore's Law & Greg's Law

Bill Joy, in a Wired Mag article about a year ago, thinks Moore's Law will last at least another 10 years. With algorithmic work, those processors will have 1000x more power than today's (he wrote that in Dec 2003). That's about the same timeframe as general use of Jim Mitchell's DARPA-derived systems using a proximity interconnect. Hmmm. Is that possible? What has history taught us?

http://www.wired.com/wired/archive/11.12/billjoy.html

And Ray Kurzweil (yep - the same Kurzweil famous for his MIDI keyboards and synths) wrote a fascinating book in 1999 called "The Age of Spiritual Machines". On page 22, he charts the evolution of compute power from the 1900-era "mechanical" devices, through the 1940s relay-based devices, through the 1950s vacuum tube computers, up to today's modern computers. Plotting the calcs/sec available for $1,000 on log paper results in an almost perfect straight line! Moore's Law simply tracks the 5th paradigm of computing. If the IC's useful life ends around 2020 (as suggested by some scientists), a 6th paradigm will emerge and likely continue the exponential curve that started in 1900 and has held for over 100 years and across 5 compute paradigms. Hard to argue with this historic consistency. Yet exponentials always (must) tail off at some point.



If we were to extrapolate (just for fun) to the year 2020 (when Ray thinks the 5th paradigm will end), using recent trends in H/W (CPU, storage, networking, etc.), an affordable home computer would offer the following characteristics. Of course, as Jonathan points out [
http://blogs.sun.com/roller/page/jonathan/20041203], the personal desktop computer is not that interesting anymore: "...hardware is nearly identical, and the value's moved to services available through the device. Over the network. Battery life matters more than processor speed. Size of display more than disk...". However, this extrapolation might well apply to a 1 RU server blade in 2020! (A rough back-of-the-envelope sketch follows the list.)

   4 THz              Processor! (or the equiv in throughput power, as we now understand)
   10 TB              Disk (via NFS v6?)
   64 GB+             RAM
   100 Gbps           Wired network (photonic?)
   1 Gbps             Wireless network (the client/consumer end points)
   3D holographic     Video (the presentation side of the net)
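Just for fun, here is a tiny Python sketch of that extrapolation. It assumes a 2004 laptop baseline (the 2 GHz / 80 GB / 1 GB / 100 Mbps figures mentioned below) and a single 18-month doubling period for every component; that is only a rough guess, since real doubling times differ per component, which is why it doesn't match every row above.

```python
# Back-of-the-envelope extrapolation of 2004 laptop specs out to 2020.
# Assumption: every component doubles every 18 months (Moore's-Law style).
# This is a deliberate oversimplification -- disk, RAM, and network capacity
# have historically doubled on different schedules.

baseline_2004 = {              # roughly the 2004 laptop cited below
    "CPU (GHz)":      2.0,
    "Disk (GB)":      80.0,
    "RAM (GB)":       1.0,
    "Network (Mbps)": 100.0,
}

years = 2020 - 2004
doubling_period_years = 1.5    # assumed 18-month doubling

growth = 2 ** (years / doubling_period_years)   # roughly 1600x over 16 years
for name, value in baseline_2004.items():
    print(f"{name:>15}: {value * growth:,.0f}")
```

With that single assumption, the CPU and wired-network rows land in the same ballpark as the list above; the disk and RAM rows in the list clearly assume slower doubling.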


Pretty amazing. I'd bet against that kind of power in a home computer. But 10 years ago, everyone would have bet against a 2GHz CPU, 80GB disk, 1GB RAM, and 100Mbps networking in a laptop. Can you imagine what you could do with a system that contains a CPU that has the (throughput) power of 1000 PCs running 4GHz Pentiums?

Read more of Ray's thoughts on technology at: http://www.kurzweilai.net/articles/art0134.html?printable=1

A friend replied to this.... While meant to be humorous, there is a grain of truth in this sarcasm. Enjoy...

Greg replies:
You have incorrectly assumed that Moore's Law is the only law at work and have completely overlooked "Greg's Law". While Moore's Law follows a geometric progression, Greg's Law is an inverse log relationship. Further, one of the dependent variables is a function of Moore's Law, thereby making it a trinomial inverse-logarithmic function.

Briefly, Greg's Law states: "The mass and volume of software (i.e., LOC size, memory demands, and processor loading) increase in an inverse natural logarithm relationship to the available processor resources".

 SWmass = e^(PR),
    where PR follows Moore's Law: PR[1] = PR[0] * 2^(t/18),
    where t is in months

combining the equations we have:

 SWmass = e^(PR[0] * 2^(t/18))

solving for the SW increase in an 18 month period we have:

 SWmass = 7.39X

If we use your example of 20 years, processors will be roughly 1M times as powerful, yet software will be e^1M times as massive, which equates to "-E-" on my calculator.
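For what it's worth, a quick Python check of Greg's numbers (assuming PR[0] is normalized to 1, which is the only way the 7.39X figure works out; the function name below is mine, not Greg's):

```python
import math

def sw_mass(t_months, pr0=1.0):
    """Greg's Law taken at face value: SWmass = e^PR, with PR doubling
    every 18 months per Moore's Law (PR[0] assumed normalized to 1)."""
    pr = pr0 * 2 ** (t_months / 18)
    return math.exp(pr)

print(f"18 months: {sw_mass(18):.2f}X")   # e^2 = 7.39X, matching Greg
try:
    # 20 years: the exponent is ~10,000 with an 18-month doubling
    # (Greg's 1M figure implies faster doubling), but it overflows either way.
    sw_mass(20 * 12)
except OverflowError:
    print('20 years: overflow -- the "-E-" on the calculator')
```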

Now consider MTBF. With processors containing 1M times the circuitry, hardware failures will increase at a staggering rate. Most will go unnoticed, however, since the software will have grown by e^1M, causing applications and/or Windows to hang every 23.7 ms on average and masking true HW failures. This is conservative, since I have assumed the SW failure rate to be equivalent to the HW rate. Empirical data puts the SW failure rate at about 3 orders of magnitude higher.

In the near future: the DLL to support graphical display and interfacing to the file system will require 1 GB of memory alone. Single-instruction operations will be replaced with object-oriented classes that consist of 1000 LOC and consume 1 MB of RAM. The program will require the transfer of several GBs of data and so many library calls that the I/O alone will consume the first 1 GHz of processor power.

The bottom line is that it will still take 30 seconds for the CNN web page to come up, even though you will have a gazillion times the processing power of the Saturn V that took man to the surface of the moon and back.

It's all in the bloatware.
