Computer Chess Club Archives




Subject: Re: Introducing "No-Moore's Law"

Author: Robert Hyatt

Date: 15:09:47 02/27/03

Go up one level in this thread

On February 27, 2003 at 16:51:44, Jeremiah Penery wrote:

>On February 27, 2003 at 13:31:54, Robert Hyatt wrote:
>>On February 26, 2003 at 12:03:42, Steve J wrote:
>>>>5.  I am also looking for some predictions/information about processor speed in
>>>>20-30 years from now.  For micro's Moore's law still holds.  So 21 years is 7
>>>>doublings of speed or 128 times as fast as today.
>>>   I've spent 25 years in manufacturing side of the semiconductor industry and
>>>would like to introduce what I call "No-Moore's Law".  It describes the physical
>>>limitations that silicon (or any other compound) will run out of gas and can be
>>>shrunk no more.  It also talks about some of the financial limitations of
>>>shrinking die.
>>What you have written can't be true.  Because if you read this forum long
>>enough, you realize that many are convinced that designing chips is a
>>trial-and-error process where the engineers don't know anything at all about
>>how fast a chip will run until it is produced and tested.  No idea about the
>>expected wafer defect rate.  Etc.
>Nobody ever said or implied anything of the kind.  You just seem to think that
>CPUs are produced as near that theoretical clock limit as possible, which is
>simply not true.

Sorry, but it _is_ true.  Yes, parts are marked "down" when there is demand;
ask an engineer who works for one of these companies.  They don't design a chip
and hope it will run fast.  They know _before they start_ how fast it is going
to clock, within a small error margin.

>>It was a
>>nice explanation of a known issue, but it can't be right because it implies that
>>engineers really know what they are doing, rather than relying on blind luck to
>>get something to work.
>>Of course, all the engineers I personally know are repeating your story and
>>they're sticking to it.  But they must all be mistaken.  After all, engineering
>>isn't a true science; it is based mostly on serendipity.
>Not that any of what you say here has anything to do with Moore's Law in the
>first place...

Indirectly it does.  Moore's law relates to the size of a transistor, but that
size is also proportional to how fast the thing will clock: if it is made
smaller, it will go faster.  The two trends are tied together.
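The doubling arithmetic quoted at the top of the thread can be checked directly.  A minimal sketch, assuming one speed doubling every 3 years, which is the cadence the original poster implies for 7 doublings in 21 years (Moore's original observation concerned transistor count, at a somewhat faster cadence):

```python
# Assumed cadence: one speed doubling every ~3 years, as implied by
# "21 years is 7 doublings" in the quoted post above.
years = 21
years_per_doubling = 3

doublings = years // years_per_doubling
speedup = 2 ** doublings

print(doublings)  # 7
print(speedup)    # 128
```

This reproduces the "128 times as fast" figure, but only if the 3-year doubling assumption continues to hold, which is exactly what the rest of the thread disputes.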

>Engineers can't predict the future.  They may be great at telling us the
>limitations of current technology, but they can't guess about the emergence of
>new technologies.  Many engineers thought the sound barrier was impossible to
>break, but that didn't make it true.  They've been saying for years that Moore's
>Law will fail.  Here we are today, with no indication of slowdown in the next
>few years.

I disagree with that last point.  If you look at clock speeds, things are _not_
progressing as fast now as they were 5 or 10 years ago, for the reasons
already given.  Pathways are _not_ going to become sub-atomic in their
width, so there is a clear asymptote on the horizon.  How close to a
single atom they can get might be open for argument, but not beyond that.
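A rough back-of-the-envelope check on that asymptote.  The figures here are assumptions for illustration: a ~130 nm process node (roughly the state of the art circa 2003) and a silicon atomic diameter of ~0.23 nm:

```python
import math

# How many more feature-size halvings fit between an assumed 2003-era
# process node and a single silicon atom?  Both numbers are ballpark.
feature_nm = 130.0   # assumed process node, circa 2003
atom_nm = 0.23       # approximate silicon atomic diameter

halvings = math.log2(feature_nm / atom_nm)
print(round(halvings, 1))  # roughly 9 halvings remain before single-atom features
```

Whatever the exact inputs, the count of remaining halvings comes out small, around nine under these assumptions, which is the sense in which the asymptote is "on the horizon" rather than indefinitely far off.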

>  There are many possible ways its usefulness can be extended past
>what many currently believe is possible.  Examples are finding a new
>manufacturing process that allows much smaller features to be created, finding a
>better material than silicon, etc.  It's possible that a completely new
>computing paradigm may become usable, rendering Moore's Law completely obsolete.
> Examples of this may be DNA and/or Quantum computing.

You can't make a feature smaller than the molecule you are working with, if the
substance is molecular, and you can't go below the size of an atom with any
foreseeable process.  Whether quantum computing comes to pass or not is another
subject, of course.  But for _current_ computing, there's a definite limit on
the horizon with no foreseeable workaround.


Last modified: Thu, 07 Jul 11 08:48:38 -0700

Current Computer Chess Club Forums at Talkchess. This site by Sean Mintz.