Author: Peter McKenzie
Date: 20:37:39 03/07/99
On March 07, 1999 at 23:29:31, Stuart Cracraft wrote:

>Crafty seems to have a Scale_Up/Scale_Down factor and I'm curious
>if Bob would talk a little about this. Seems costly to add a memory
>reference and a division (?).

My guess is his divisions are usually by a constant that is a power of 2, in which case the compiler will just do a bit shift.

>
>For example, seems like Bob has chosen to ScaleUp/ScaleDown fairly
>large contributions such as
>
> weak back rank
> bishop pair
> certain rooks on open files
> rook on 7th
> protected passed pawns
>
>based on the side's material.
>
>Are there alternatives to controlling the "bloating" of the positional
>part of the evaluation that aren't so computationally expensive? I know
>there are very few of these compared to all the terms in Crafty so the
>cost isn't a big deal (probably.)
>
>One I considered was going from centipawns to millipawns and then working
>backward looking at just those things that should be big (king safety)
>and sizing them accordingly letting them bloat to a certain degree beyond
>a pawn.
>
>--Stuart
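To make the per-term cost concrete, here is a minimal C sketch of the kind of material-based scaling being discussed. The names and constants (scale_factor, START_MATERIAL, the 4096 fixed-point divisor) are made up for the example, not Crafty's actual identifiers or values. The idea is that the scale factor is computed once per evaluation, and each scaled term then costs one multiply plus a division by a power-of-2 constant, which the compiler turns into a shift:

#include <stdio.h>

/* Illustrative material-based scaling of positional terms.
   These names and constants are invented for the example;
   Crafty's real identifiers and values differ. */

#define START_MATERIAL 3100   /* rough non-pawn material per side, centipawns */

/* Compute a fixed-point scale factor (0..4096) once per evaluation. */
static int scale_factor(int material)
{
   if (material > START_MATERIAL)
      material = START_MATERIAL;
   return (material * 4096) / START_MATERIAL;
}

/* Scaling a term is then one multiply plus a divide by 4096; since
   4096 is a power of 2, the compiler emits a shift (with a small
   fix-up for negative values). */
static int scale(int bonus, int factor)
{
   return (bonus * factor) / 4096;
}

int main(void)
{
   int rook_on_7th = 30;   /* example bonus, centipawns */
   printf("full material:   %d\n", scale(rook_on_7th, scale_factor(3100)));
   printf("half material:   %d\n", scale(rook_on_7th, scale_factor(1550)));
   printf("little material: %d\n", scale(rook_on_7th, scale_factor(400)));
   return 0;
}

With the factor cached like this, the only extra work per scaled term is one multiply, one shift, and the memory reference to the cached factor, which is what the "memory reference and a division" concern comes down to.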