Author: K. Burcham
Date: 10:25:46 04/06/02
I think it is very difficult to test Deep Blue positions if you believe this article on Deep Blue's depth and search capabilities. The article says that Deep Blue searched to an average depth of 30 ply (up to 70 in rare cases); as you can see from my evaluations with two of today's Deeps, depths are only about 15 for Deep Shredder and about 18 for Deep Junior 7. The article says Deep Blue had 480 chess processors versus my two. Here are four positions run with Deep Junior 7 and Deep Shredder Paderborn on a dual AMD MP (2 x 1533 MHz = 3066 MHz). I let the programs think longer than actual tournament match time.

[D] 2kr3r/ppq1bpp1/2p3n1/6Bp/3N2nP/2P3P1/PP2QPB1/R3K2R w KQ - 0 1

Deep Junior 7: 19.O-O-O Bxg5+ 20.hxg5 Qe5 21.Rhe1 Qxe2 22.Rxe2 Rhe8 23.Rde1 Rxe2 24.Rxe2 N4e5 25.Kc2 Kd7 = (0.03) Depth: 18 00:02:03 266815kN (time = 8 minutes, no eval update)

Deep Shredder Paderborn 16.01 7:49 +0.50 19.Bxe7 Nxe7 20.O-O-O Kb8 21.Rhe1 Nc8 22.Bh3 Nf6 23.Bxc8 Qxc8 24.Qe5+ Ka8 25.Qg5 Ng4 26.f3 Nf2 27.Rd2 c5 (240.297.695) 512.1
best move: Bg5xe7  time: 11:16.093 min  n/s: 511.287  CPU: 199.6%  nodes: 345.677.573

[D] 2kr3r/ppq1bpp1/2p3n1/6Bp/3N2nP/2P3P1/PP2QPB1/2KR3R b - - 0 1

Deep Junior 7: 19...Rhe8 20.Bxe7 Qxe7 21.Qxe7 Nxe7 22.Bh3 Kc7 23.Bxg4 hxg4 24.Rhe1 a5 25.Kc2 c5 26.Nb5+ Kc6 27.Rxd8 = (0.20) Depth: 19 00:06:42 891968kN

Deep Shredder Paderborn 15.01 17:34 -0.11 19...Bxg5+ 20.hxg5 Rhe8 21.Qf1 Kb8 22.Rxh5 c5 23.Nb5 Qb6 24.Rxd8+ Rxd8 25.Rh7 N6e5 26.Be4 Nxf2 27.Qxf2 Qxb5 28.a3 (534.706.736) 507.2
best move: Be7xg5  time: 22:26.110 min  n/s: 507.029  CPU: 199.9%  nodes: 682.517.867

[D] 2krr3/ppq1bpp1/2p3n1/6Bp/3N2nP/2P3P1/PP2QPB1/2KR3R w - - 0 1

Deep Junior 7: 20.Qc2 Bxg5+ 21.hxg5 Kb8 22.Rxh5 c5 23.Nb5 Qb6 24.c4 Rxd1+ 25.Qxd1 Nxf2 26.Qd2 Ne5 27.Qd6+ Qxd6 28.Nxd6 g6 29.Rh7 = (0.10) Depth: 19 00:14:22 1842175kN

Deep Shredder Paderborn 14.40 6:32 +0.20 20.Qc2 Bxg5+ 21.hxg5 Kb8 22.Rxh5 c5 23.Nb5 Qb6 24.Rxd8+ Rxd8 25.c4 N6e5 26.Kb1 a6 27.Na3 (200.144.056) 509.6
best move: Qe2-c2  time: 8:28.750 min  n/s: 510.401  CPU: 199.8%  nodes: 259.666.911
[D] 2krr3/ppq1bpp1/2p3n1/6Bp/3N2nP/2P3P1/PPQ2PB1/2KR3R b - - 0 1

Deep Junior 7: 20...Bxg5+ 21.hxg5 Kb8 22.Bf3 N6e5 23.Bxg4 hxg4 24.Qf5 g6 25.Qf6 Qa5 26.Nb3 Qc7 27.Nc5 Rxd1+ 28.Rxd1 Nf3 = (0.06) Depth: 20 00:13:46 1818583kN

Deep Shredder Paderborn 15.01 18:48 -0.28 20...Bxg5+ 21.hxg5 Kb8 22.Nf5 Qb6 23.Rxd8+ Rxd8 24.Rd1 Rxd1+ 25.Qxd1 Nxf2 26.Qe2 Kc7 27.Nxg7 Qc5 28.Kb1 Ng4 (563.168.659) 499.1

From the article:

11. There has been some question as to the endgame databases used during the match. Hsu stated that there were 20 gigabytes of endgame databases from Ken Thompson and Lewis Stiller on the hard drive. He said that they were all of the five-man-and-down databases, plus selected six-man endgame databases. To his knowledge they were never accessed during the match, but he was not sure of this. He said that since the chess processors have some of the endgame databases built in (I have read that these are the 3-man set), he figured that it never got to the point where the SP2s would need to access the hard-disk-based databases. He said that it was probably a good psychological weapon for Kasparov to know that they were there, since, if he made one wrong move during the endgame, he would know that he would quickly look foolish in front of millions of people, and this would have to have an effect.

Other differing reports about how many processors Deep Blue used were also answered. Deep Blue employed 30 SP2 Scalable Processors. Each frame was capable of holding sixteen, and there were two frames, but in each frame two processors were tied together to form a master processor, which meant a total of 30 instead of 32. Each SP2 had 16 chess processors attached, for a total of 480 chess processors. Up until this point I had only heard 256 or 512.

Hsu said that Deep Blue used "two-level parallelism" to process positions.
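Two quick cross-checks on the numbers above, as a sketch: the Shredder logs report nodes, time, and n/s (with dots as thousands separators), which can be verified in a few lines, and the frame arithmetic for the processor count can be tallied the same way. Figures are taken from the first position's log and from Hsu's description as reported here.

```python
# --- Check Shredder's reported n/s from its nodes and time figures ---

def parse_nodes(s: str) -> int:
    """Parse a node count like '345.677.573' (dots are thousands separators)."""
    return int(s.replace(".", ""))

def parse_time(s: str) -> float:
    """Parse a 'minutes:seconds' time like '11:16.093' into seconds."""
    minutes, seconds = s.split(":")
    return int(minutes) * 60 + float(seconds)

# Figures from the first position's Shredder output
nodes = parse_nodes("345.677.573")
elapsed = parse_time("11:16.093")
nps = nodes / elapsed
print(round(nps))  # 511287, matching the reported "n/s: 511.287"

# --- Tally the processor count as described: two frames of 16 SP2 slots,
# with two processors per frame tied together into one master.
frames = 2
slots_per_frame = 16
sp2_nodes = frames * (slots_per_frame - 1)  # pairing costs one node per frame -> 30
chess_processors = sp2_nodes * 16           # 16 chess chips per SP2 -> 480
print(sp2_nodes, chess_processors)
```

The same two helper functions check out against the other three positions' logs as well.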
He described this as the master processor evaluating the first 4 moves, then sending the 1,000 or so resulting positions to the other SP2s, which would carry the search 4 moves further and then hand the positions to the chess processors, which would go on for 4-5 more moves. He said that on average Deep Blue would reach 30 ply in considering a move, but in certain cases, through selective extensions and pruning, it had reached up to 70 ply, though this was rare. On average it processed 200 million chess positions per second, reaching as high as 400 million in certain cases. The chess processors made for the rematch were capable of processing 2-2.5 million nodes per second, and with improved evaluation done with Joel Benjamin's help, and better selective search, the effective speed was 3-10 times that of the 1996 version.

I asked how many cycles it took to evaluate a position, and was told that it varied: there was a short evaluation, used approximately 80% of the time, which took only one cycle, and a long evaluation, used 20% of the time, which took 8 cycles. Move generation took 4 cycles. There were 8,000 adjustable evaluation features, including such things as the value of a rook on an unopened file which could later be forced open with a pawn exchange or sacrifice. He said this was one that was added with the help of GM Joel Benjamin, and he knew of one instance during the match when it had an effect. (I have not looked over the games to see where this would be; perhaps some helpful reader with more time could find out.) It would be very interesting to know how these evaluations can be performed in hardware, but I am not sure that this will ever be covered, especially if Hsu is really thinking of a commercial version of the program.
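Two of the figures above can be cross-checked with simple arithmetic: the 80%/20% split between the 1-cycle and 8-cycle evaluations gives an expected cost of 0.8*1 + 0.2*8 = 2.4 cycles per evaluation, and 480 chess processors at 2-2.5 million nodes per second each would give roughly 1-1.2 billion nodes per second in theory, against the ~200 million average quoted for the whole machine; the gap presumably reflects scheduling and parallel-search overhead, though nothing in the article says so directly. A sketch of that arithmetic:

```python
# Expected evaluation cost, given a 1-cycle short eval used ~80% of the
# time and an 8-cycle long eval used ~20% of the time.
short_cycles, long_cycles = 1, 8
avg_cycles = 0.8 * short_cycles + 0.2 * long_cycles
print(f"average evaluation cost: {avg_cycles:.1f} cycles")  # 2.4 cycles

# Theoretical aggregate speed of 480 chess processors at 2-2.5 Mnps each,
# versus the ~200 Mnps average quoted for the whole machine.
chips = 480
low, high = chips * 2.0e6, chips * 2.5e6
observed = 200e6
print(f"aggregate: {low / 1e6:.0f}-{high / 1e6:.0f} Mnps theoretical")
print(f"rough utilization: {observed / high:.0%}-{observed / low:.0%}")
```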
Since he also mentioned that he would be interested to see whether a single-chip chess machine could someday be built to beat the world champion, he may not be as forthcoming about his research as one would hope.

kburcham