Author: KarinsDad
Date: 10:49:06 01/26/99
On January 26, 1999 at 01:42:38, James B. Shearer wrote:

>On January 25, 1999 at 04:24:37, Will Singleton wrote (in part):
>
>>I am a bit confused by the fact that it appears to be more difficult for some
>>member programs to play against computers than humans. That is, when a
>>program doesn't play other programs, its rating will usually rise. Is this
>>generally true, or not? And why should it be so, since the rating system
>>should normalize this?
>
>The rating system is based on a model which only approximates reality. It
>would not be surprising if there are detectable deviations between the model
>and reality. Note the model was developed for human tournament play; the ICC
>environment is rather different.
>
>In any case, I don't think the rating model claims that your rating playing a
>subset of the pool of players will always be the same as your rating playing
>the entire pool.

This occurs in human ratings as well. An 1800 city player is often stronger (on average) than an 1800 rural player. This may be starting to change a little with the advent of the internet; however, I have noticed that players with access to a larger pool of opponents tend to be slightly stronger at a given rating than players without such access. Theoretically, this means that if a human suddenly started playing in a larger pool, his or her rating would initially drop (I'm not sure whether this has actually happened to many rural players playing on the internet).

For most computers, which do not "learn", it is harder to explain why this would be the case (without some explanatory data). Their ratings rise or fall based on which other computer opponents they play and on which human opponents do or do not learn how to beat them, with the frequency of each kind of pairing taken into account.

My theory is that humans in a larger pool have access to a wider variety of players and can more easily learn a variety of concepts just by playing against more people (i.e.
they get a broader understanding of the game by having more opportunities to practice against a variety of opponents). Not to bring the team vs. individual concept back, but that is effectively what happens. The humans who can play in a larger pool gain the benefits of the entire "team", yet still have a pecking order within the team. Not everyone has a high rating, even if they are good players: there are too many better players in the larger pool to advance significantly (i.e. competition is tougher, so ratings stay lower for the same strength). Yet another KD theory :)

>So it is conceivable that program x performs at a 2200 level
>against computers on ICC but performs at a 2400 level against humans on ICC.
>(Of course such programs should be balanced by other programs which are better
>against computers so there is no net flow of rating points between the human
>and computer pools).

Hard to say. I doubt there are many "balancing" programs out there. Since anyone can play anyone as many times as they want on the internet, an 1800 player (or computer) can endlessly play a 2400-rated computer and slowly bring the computer's rating up (and his own rating down). I think the structure of human/human tournaments minimizes that probability in normal human ratings (i.e. USCF, not ICC), since non-tournament rated matches between the same two players are limited in how often they can be rated within a given timeframe.

>Naturally one can speculate endlessly when lacking data. How does Amateur
>do playing humans vs. playing computers? For example, what is its performance
>rating for its last 100 games against humans as opposed to its last 100 games
>against computers? (One could divide the pool in other ways, strong opponents
>vs. weak opponents for example.)

Due to the informality of the internet, splitting ratings up into types may make sense.

KarinsDad :)

>James B. Shearer
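[Editor's note: the point-flow mechanism discussed above can be sketched with the textbook Elo update. This is an illustrative Python sketch only: the K=32 factor, the 400-point scale, and the `endless_mismatch` scenario (an 1800 player who scores below even the low Elo-expected rate against a 2400 computer) are assumptions for the example, not a description of how ICC or USCF actually compute ratings.]

```python
import random

def expected_score(r_a, r_b):
    """Textbook Elo expected score for A against B (400-point scale)."""
    return 1.0 / (1.0 + 10 ** ((r_b - r_a) / 400.0))

def play_rated_game(r_a, r_b, score_a, k=32):
    """Update both ratings after one game; score_a is 1, 0.5, or 0.
    The update is zero-sum: whatever A gains, B loses."""
    delta = k * (score_a - expected_score(r_a, r_b))
    return r_a + delta, r_b - delta

def endless_mismatch(r_low=1800.0, r_high=2400.0, games=500,
                     low_win_prob=0.0, seed=1):
    """Hypothetical scenario from the post: an 1800 player endlessly
    plays a 2400-rated computer. Elo expects the 1800 to score about
    3%; if he actually scores 0%, roughly K * 0.03 = 1 point per game
    flows to the computer, slowly inflating its rating."""
    rng = random.Random(seed)
    for _ in range(games):
        s = 1.0 if rng.random() < low_win_prob else 0.0
        r_low, r_high = play_rated_game(r_low, r_high, s)
    return r_low, r_high
```

Because each update is zero-sum between the two players, repeated lopsided pairings move points between them without limit; tournament-based rating (USCF-style), which caps how often the same pairing can be rated, damps exactly this drift.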