Computer Chess Club Archives

Subject: Re: chess and neural networks

Author: Landon Rabern

Date: 12:44:44 07/03/03

On July 03, 2003 at 03:22:15, Christophe Theron wrote:

>On July 02, 2003 at 13:13:43, Landon Rabern wrote:
>
>>On July 02, 2003 at 02:18:48, Dann Corbit wrote:
>>
>>>On July 02, 2003 at 02:03:20, Landon Rabern wrote:
>>>[snip]
>>>>I made an attempt to use a NN for determining extensions and reductions.  It was
>>>>evolved using a GA, kinda worked, but I ran out of time to work on it at the
>>>>end of school and don't have my computer anymore. The problem is that the NN is
>>>>SLOW, even using x/(1+|x|) for activation instead of tanh(x).
>>>
>>>Precompute a hyperbolic tangent table and store it in an array.  Speeds it up a
>>>lot.
>>
>>Well, x/(1+|x|) is as fast or faster than a large table lookup.  The slowdown
>>was from all the looping necessary for the feedforward.
>>
>>Landon
>
>
>
>A stupid question maybe, but I'm very interested in this stuff:
>
>Do you really need a lot of accuracy for the "activation function"? Would it be
>possible to consider a 256-value output, for example?
>
>Would the lack of accuracy hurt?
>
>I'm not sure, but it seems to me that biological neurons do not need a lot of
>accuracy in their output, and even worse: they are noisy. So I wonder if low
>accuracy would be enough.
>

There are neural net models that work with only binary output.  If the total
input value exceeds some threshold you get a 1, otherwise a 0.  The problem
is training them by back prop.  But in this case I was using a genetic
algorithm, so no back prop at all - so no problem.  It might work, but I don't
see the benefit - were you thinking of speed?  The x/(1+|x|) is pretty fast to
calculate, but perhaps a binary (or other discrete) activation would be
faster.  Something to try - see the sketches below.
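
For the table lookup Dann suggested, something like this - just a sketch,
and the table size and input range are numbers I made up:

#include <math.h>

#define TABLE_SIZE  4096
#define INPUT_RANGE 8.0                 /* clamp inputs to [-8, 8] */

static double tanh_table[TABLE_SIZE];

void init_tanh_table(void)
{
    int i;
    for (i = 0; i < TABLE_SIZE; i++) {
        /* index i covers [-INPUT_RANGE, INPUT_RANGE) evenly */
        double x = ((2.0 * i) / TABLE_SIZE - 1.0) * INPUT_RANGE;
        tanh_table[i] = tanh(x);
    }
}

double fast_tanh(double x)
{
    int i;
    if (x <= -INPUT_RANGE) return -1.0;  /* tanh has saturated out here */
    if (x >=  INPUT_RANGE) return  1.0;
    i = (int)((x / INPUT_RANGE + 1.0) * (TABLE_SIZE / 2));
    return tanh_table[i];
}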
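
And here is the kind of feedforward loop I mean, with both activations side
by side.  The layer sizes and arrays are made up; the point is that the
multiply-add loop is where the time goes, so swapping the activation only
helps so much:

#include <math.h>

#define N_IN  64
#define N_OUT 16

static double w[N_OUT][N_IN];     /* weights, e.g. evolved by the GA */
static double b[N_OUT];           /* biases */

/* smooth: x/(1+|x|), no exp() call needed */
static double soft_sign(double x)
{
    return x / (1.0 + fabs(x));
}

/* binary: fire iff the summed input clears the threshold (0 here) */
static double hard_threshold(double x)
{
    return x > 0.0 ? 1.0 : 0.0;
}

void feed_forward(const double *in, double *out, int binary)
{
    int i, j;
    for (i = 0; i < N_OUT; i++) {
        double sum = b[i];
        for (j = 0; j < N_IN; j++)    /* this inner loop dominates */
            sum += w[i][j] * in[j];
        out[i] = binary ? hard_threshold(sum) : soft_sign(sum);
    }
}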
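
On your 256-value idea: one cheap option is to keep x/(1+|x|) but quantize
its output to 8 bits, something like this (the scaling is arbitrary).  My
guess is the lost accuracy matters more the deeper the net, since the
rounding errors compound layer by layer:

#include <math.h>

/* x/(1+|x|) lies in (-1, 1); map it onto ~256 levels.  The factor
   127 just fills out a signed char. */
signed char quantized_soft_sign(double x)
{
    double y = x / (1.0 + fabs(x));     /* in (-1, 1) */
    return (signed char)(y * 127.0);    /* in [-126, 126] */
}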

Landon


