Computer Chess Club Archives



Subject: Re: next deep blue

Author: Robert Hyatt

Date: 20:12:28 01/23/00



On January 23, 2000 at 18:35:03, Tom Kerrigan wrote:

>Here is something you wrote from ~2 posts ago:
>
>>emulate DB's hardware stuff _in software_."  I said that _if_ you used a typical
>>emulator (as used to debug/test chip designs before dumping them to a fab center
>>for production), that would run probably 1 billion times slower than the chip
>>would actually run.  That isn't a guess.  I have used simulators for years
>
>I will explain some terminology so you won't be so confused:
>
>I can get an EMULATOR for a 68000. It makes my PC functionally equivalent to a
>68000. It's very fast--I can actually play Sega Genesis games with this
>emulator, and the games can run faster than on a real Sega Genesis.
>
>I can also get a SIMULATOR of a 68000. It will perform a transistor-level
>simulation of the chip. It will also play Sega Genesis games, but millions of
>times slower than the emulator.
>
>You are using the words EMULATOR and SIMULATOR interchangeably. In the above
>passage, you are quite obviously talking about a SIMULATOR, because you are
>saying that it's used to test the chip design and it runs 1 billion times slower
>than the actual chip.
>

Those are two uses for the words.  But not the _only_ uses.  "Simulator" is
also commonly used to describe what you are using "emulator" for, in a
different context.  IE I wrote a thing that was known around the world as the
"Xerox Sigma 9 simulator".  It allowed operating system developers to run as
normal users and use the simulator to boot an operating system while the
machine was running normal timesharing work.  It executed most normal
instructions by using the "exe" instruction, but some (SIO, HIO, TIO, AIO,
and other privileged instructions dealing with I/O and memory management
hardware) were handled in software.  Some use the term "emulation" to
describe what happens when you use instructions on a CPU that doesn't have
them.  IE the old 386 without a 387 could still execute FPU instructions, but
would trap and execute subroutines that emulated the instruction behavior.
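
A minimal sketch of that trap-and-handle idea in C (a hypothetical toy
instruction set, not the Sigma 9 simulator's actual code): ordinary
instructions run at essentially full speed, and only the privileged ones are
intercepted and done in software.

  #include <stdio.h>
  #include <stdint.h>

  enum opcode { OP_LOAD, OP_ADD, OP_SIO, OP_HIO };  /* toy instruction set */

  struct cpu { int32_t reg[16]; };

  /* software stand-in for the privileged I/O hardware */
  static void handle_sio(struct cpu *c, int unit) {
      (void)c;
      printf("SIO intercepted: starting I/O on unit %d in software\n", unit);
  }

  static void step(struct cpu *c, enum opcode op, int a, int b) {
      switch (op) {
      case OP_LOAD: c->reg[a]  = b;         break;  /* normal: cheap       */
      case OP_ADD:  c->reg[a] += c->reg[b]; break;  /* normal: cheap       */
      case OP_SIO:  handle_sio(c, a);       break;  /* privileged: software */
      case OP_HIO:  printf("HIO intercepted: halting I/O on unit %d\n", a);
                    break;                          /* privileged: software */
      }
  }

  int main(void) {
      struct cpu c = {{0}};
      step(&c, OP_LOAD, 0, 5);
      step(&c, OP_ADD,  0, 0);   /* runs with no software overhead        */
      step(&c, OP_SIO,  3, 0);   /* would have trapped on the real machine */
      printf("reg0 = %d\n", c.reg[0]);
      return 0;
  }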

There are other 'simulator' definitions that fit just as well.  IE to an EE,
your terms are typical.  To others, they may not be.




>Now let's say the DB chip has a 300 point bonus for each bishop.
>
>What I am proposing is to make a functional equivalent of the DB chip, i.e., an
>EMULATOR, which would simply add 300 to some integer in memory when it finds a
>bishop. It would take maybe 1 clock cycle.
>
>What you seem to think I'm proposing is a DB chip SIMULATOR, which would
>simulate all the little electrons running through a full-adder array to add 300
>to some number. This would take millions of clock cycles.

I suggested that because you originally said that "Hsu must have something he
used to design the eval".  He did: the netlist passed to the fab shop to lay
out the chip.  That is a gate-level description of the chip, and running it
would take just what you are describing.
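
To make the gap concrete, here is a hypothetical sketch (not code from either
project) of the same 300-point add done both ways, functional "emulator
style" versus gate-by-gate "simulator style":

  #include <stdio.h>
  #include <stdint.h>

  /* emulator style: the bishop bonus is a single host add */
  static uint32_t add_functional(uint32_t score) {
      return score + 300;
  }

  /* simulator style: model the adder itself.  A real netlist simulator
     evaluates every gate on the chip each cycle; even this one 32-bit
     ripple-carry adder already costs dozens of operations per add. */
  static uint32_t add_gate_level(uint32_t a, uint32_t b) {
      uint32_t sum = 0, carry = 0;
      for (int i = 0; i < 32; i++) {
          uint32_t x = (a >> i) & 1, y = (b >> i) & 1;
          sum  |= (x ^ y ^ carry) << i;          /* full-adder sum bit   */
          carry = (x & y) | (carry & (x | y));   /* full-adder carry out */
      }
      return sum;
  }

  int main(void) {
      printf("functional: %u\n", add_functional(1000));       /* 1300 */
      printf("gate-level: %u\n", add_gate_level(1000, 300));  /* 1300 */
      return 0;
  }

Both produce the same answer; only the second is anything like running the
netlist, and it is still only one adder out of the whole chip.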

On the other hand, there are questions that can be asked in hardware that
would be very complex to ask in software.  IE to evaluate "potentially open
files" you have to ask "how far can each pawn on each file advance, and what
does it encounter as it does so?"  In hardware, that becomes a bunch of
parallel stuff: all 8 files tested at once.  In a normal program, it becomes
a bunch of loops and tests.  I know; I do it in Crafty already.  That turns
into a lot of code for a question that is simple to ask when the answer is
provided by hardware designed explicitly to answer it.
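
A hypothetical sketch of the loop-and-test shape this takes in software (this
is not Crafty's or DB's actual rule, and the "guarded square" test here is
deliberately simplified):

  #include <stdio.h>

  enum { EMPTY, WPAWN, BPAWN };

  /* board[file][rank], rank 0 = white's first rank */
  static int potentially_open(int board[8][8], int file) {
      int rank;
      for (rank = 1; rank < 7; rank++)        /* find our pawn on this file */
          if (board[file][rank] == WPAWN) break;
      if (rank == 7) return 1;                /* no friendly pawn: open already */
      for (int r = rank + 1; r < 8; r++) {    /* advance it square by square */
          if (board[file][r] == BPAWN) return 0;  /* blocked head-on */
          if (r + 1 < 8) {                    /* enemy pawn guarding the square? */
              if (file > 0 && board[file - 1][r + 1] == BPAWN) return 0;
              if (file < 7 && board[file + 1][r + 1] == BPAWN) return 0;
          }
      }
      return 1;                               /* the pawn can run all the way */
  }

  int main(void) {
      int board[8][8] = {{0}};
      board[4][1] = WPAWN;                    /* white pawn on e2 */
      board[4][6] = BPAWN;                    /* black pawn on e7 */
      for (int f = 0; f < 8; f++)             /* serial version of the 8-wide test */
          printf("file %c: %s\n", 'a' + f,
                 potentially_open(board, f) ? "potentially open" : "not");
      return 0;
  }

Eight calls, each walking squares one at a time, versus one hardware clock
for all eight files at once; that is the mismatch being described.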

That is why I think his eval would be horribly slow.  That would cascade into
screwing up the search, which would screw up the eval weight tuning, and then
the entire row of dominoes falls over...




>
>Now, when faced with an option between EMULATING and SIMULATING the DB chip, not
>even the stupidest person in the entire world would even consider SIMULATING.
>But for some reason you think that's what I'm talking about.

Because that is the _only_ "real piece of something" that he has to work
with.  IE he used some admittedly primitive VLSI design tools (his own
description, in his book) because their budget didn't include anything fancy.
He did his own gate-level layout because the tools at the fab shop couldn't
fit the chip into the die size he had to have, so he did it manually.  The
thing he gave the fab shop could be run on some sort of tool to simulate its
behavior as actually built.

That is all he had to work with, and it would be all he could use if he had
to use something already existing...




>
>So my challenge is this: go through my previous posts, and find a quote where I
>clearly ask/discuss how much work it would take to SIMULATE the DB evaluation
>function.
>
>You will not find anything, because like I said, the idea is phenomenally stupid
>and it never even crossed my mind. I'm not even sure why it crossed yours.
>
>-Tom


It crossed mine knowing what Hsu currently has to work with.  He didn't write
C code and then translate that into a hardware design; he didn't have time.
They decided to do DB-2 three months after match 1, nine months before match
2.  It was _very_ tight time-wise; hardly anyone turns out a design that
quickly, from concept to silicon to production...

It wasn't the typical "Intel"-style design cycle, which surprised me as I
read his book.  It reminded me of a late-night graduate-student lab project.
Yet it did work...


