Computer Chess Club Archives


Subject: Re: a faster neural-network activation function

Author: Jim Bell

Date: 05:21:16 06/05/01

On June 04, 2001 at 19:00:55, Landon Rabern wrote:

[SNIP]
>
>I have done some testing with a neural network evaluation in my program for my
>independent study.  The biggest problem I ran into was the slowness of
>calculating all the sigmoids (I actually used tanh(NET)).  It drastically cuts
>down the nps and gets spanked by my handcrafted eval.  I got moderate results
>playing with set ply depths, not set time controls, but that isn't saying much.
>
>Regards,
>
>Landon W. Rabern

In case you are still interested, you might want to consider what I assume is a
faster activation function: x/(1.0+|x|), where x is the total weighted input to
a node. I read about it in a paper titled "A Better Activation Function for
Artificial Neural Networks", by D.L. Elliott.  I found a link to the paper (in
PDF format) at:

   "http://www.isr.umd.edu/TechReports/ISR/1993/TR_93-8/TR_93-8.phtml"

I should warn you that I am certainly no expert when it comes to neural
networks, and I haven't seen this particular activation function used elsewhere,
but it shouldn't be too difficult to replace tanh(x) with it and see what
happens. (Of course, you would have to change the derivative as well!)
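
For what it's worth, here is a rough C sketch of what I mean (the function
names are mine, not from the paper). The derivative works out to
1/(1+|x|)^2 by the quotient rule, which is the same as (1-|f(x)|)^2, so
backprop can reuse the activation you already stored for the node:

   #include <math.h>
   #include <stdio.h>

   /* Elliott activation: f(x) = x / (1 + |x|).  Sigmoid-shaped and
      bounded in (-1, 1) like tanh, but with no exp() call. */
   static double elliott(double x)
   {
       return x / (1.0 + fabs(x));
   }

   /* Its derivative: f'(x) = 1 / (1 + |x|)^2, equivalently
      (1 - |f(x)|)^2 in terms of the stored activation. */
   static double elliott_deriv(double x)
   {
       double d = 1.0 + fabs(x);
       return 1.0 / (d * d);
   }

   int main(void)
   {
       double x;
       /* Compare against tanh over a few net-input values. */
       for (x = -4.0; x <= 4.0; x += 1.0)
           printf("x=%5.1f  elliott=%8.5f  tanh=%8.5f\n",
                  x, elliott(x), tanh(x));
       return 0;
   }

The whole thing is just an add, a fabs(), and a divide, which is presumably
where the speedup over tanh comes from. One caveat: it approaches its
asymptotes more slowly than tanh does, so a network trained with tanh would
presumably need retraining.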

Jim



