Computer Chess Club Archives



Subject: Re: impressive Power Mac G5 !

Author: Hristo

Date: 01:56:30 07/25/03



On July 25, 2003 at 04:34:10, Tom Kerrigan wrote:

>On July 25, 2003 at 02:49:58, Ryan B. wrote:
>
>>On July 25, 2003 at 00:51:13, Tom Kerrigan wrote:
>>
>>>On July 24, 2003 at 23:14:52, Hristo wrote:
>>>
>>>>On July 24, 2003 at 20:43:14, Dann Corbit wrote:
>>>>
>>>>>On July 24, 2003 at 17:27:55, Vincent Lejeune wrote:
>>>>>
>>>>>>http://www.apple.com/quicktime/qtv/wwdc03/ -> click "watch now" -> go to
>>>>>>1:40:30; You will see Power Mac G5 perform a little more than 2 times faster
>>>>>>than a dual Xeon 3.06 !!! Live run, screens side by side, with 4 or 5 different
>>>>>>applications.
>>>>>
>>>>>Figures don't lie.
>>>>>But liars will figure.
>>>>>
>>>>>The shame from Apple's current misinformation campaign won't go away until they
>>>>>start telling the truth.
>>>>>
>>>>>A little distortion is not unexpected.  But they are simply telling absurd tall
>>>>>tales.
>>>>
>>>>Dann,
>>>>if the applications being compared were using Altivec-optimized code on the Mac
>>>>and depended heavily on that part of the code, then the Mac being 2 times
>>>>faster is easy to imagine.
>>>>
>>>>What they, Apple, don't tell "you" as a consumer is that only a few applications
>>>>gain execution speed from Altivec ... and when that does happen, the P4 can
>>>>feel slow, which is not the case for general-purpose applications.
>>>>
>>>>Why do you think Apple is not telling the truth? More precisely, what is it that
>>>>they are dishonest about in relation to the above-mentioned demo?
>>>>
>>>>Regards,
>>>>Hristo
>>>
>>>
>>>* The IBM guy was bragging about their 0.13um process, saying how great it is,
>>>saying that only IBM and Apple could deliver it... Hmmm... Intel has been
>>>selling 0.13um P4s for a year and a half and AMD has been selling 0.13um Athlons
>>>for, what, a year? Intel is going to be selling 0.09um processors not long after
>>>Apple starts shipping the 0.13um processors that only they are awesome enough to
>>>deliver, sure.
>>
>>I think it was the copper technology that he was bragging about.
>
>I'm pretty sure Intel and AMD's 0.13um processes are both copper. He could have
>been referring to SOI but he didn't mention that once, IIRC.
>
>>>* Steve says the 3.0GHz P4 was the fastest they could buy, but actually they
>>>could have bought a 3.2GHz P4.
>>>
>>>* Steve says the G5 is the world's first 64 bit desktop processor. The Opteron
>>>has been out for months now. If you want to argue that the Opteron is a
>>>workstation/server processor and not a "desktop" processor, then why are they
>>>comparing the G5 to a Xeon?
>>
>>The G5 was compared to the Xeon because you cannot get a dual P4.  Jobs clearly
>>stated that.
>
>Sure, he also clearly stated that they used GCC 3.3 for the SPEC comparison...
>that doesn't change the fact that they were comparing the G5 to a dual Xeon.
>Jobs kept going on about how the Xeon was the fastest PC you can buy, but with
>the exact same rationale they could have compared it to a dual Opteron.
>
>>>* Of course there's the fiasco with Apple's SPEC scores being _way_ lower than
>>>officially submitted SPEC scores.
>>>
>>>* For the Photoshop and music demos, it seemed like the PC was stalling out for
>>>some reason. Is it because it didn't have as much memory? Is it because they
>>>were taking advantage of some problem with a memory allocator and it was
>>>thrashing? Maybe a virus scanner was running.
>>
>>Photoshop has always been faster on Macs.  x86 computers are just obnoxiously
>>slow when dealing with images and video.  In many cases my Amiga is still faster
>>with video.  It's just not what the x86 was made for.
>
>No, that's Jobs's reality distortion field in full force. It has been shown time
>and again across the web that if you don't hand-pick your filters, PCs are
>faster at Photoshop. I can't speak to video, but I know that Adobe is only
>selling their high-end video software for PCs and I imagine that's for a reason.

Perhaps one of the reasons is that FCP (Final Cut Pro) wipes the floor with
Premiere. :-)

>
>>>BTW, you mention Altivec. What makes you think IBM's implementation of Altivec
>>>is any better than Intel's implementation of SSE2?
>>
>>Come on, this is common sense.  Obviously Altivec makes for a bigger speed
>>difference in the vast majority of applications than SSE2.  If you doubt the
>>usefulness of Altivec please read NASA's G4 study.
>>http://members.cox.net/craig.hunter/g5/NASA_G4_Study-1.pdf
>
>Yes, well, that's why I said "implementation." Just because Altivec ran well on a
>G4 doesn't mean it will run as well on a G5.

Nor does it mean it is going to run worse.
This link worked for me:
http://members.cox.net/craig.hunter/g5/
"
 Vector performance of the G5 remains excellent, and is inline with current G4
systems on a per clock cycle basis.  As a result, raw vector performance of the
G5 will be boosted simply by its higher clock speeds relative to current G4
systems.
"

It seems that he is comparing the G5, the G4, and a 2.66 GHz P4, and judging by
his findings the G5's Altivec is doing just fine compared with the G4's.
So he seems to have gotten his hands on one of those puppies. ;-)
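
To put the Altivec point in concrete terms, here is a minimal sketch (mine, not
from Hunter's study; it assumes GCC's -maltivec extensions, n a multiple of 4,
and 16-byte-aligned arrays) of the kind of loop a vector unit speeds up. The
Altivec version retires four multiply-adds per iteration, which is where the big
per-clock gains come from:

#include <altivec.h>

/* Scalar reference: y[i] = a*x[i] + y[i], one float per loop iteration. */
void saxpy_scalar(int n, float a, const float *x, float *y)
{
    int i;
    for (i = 0; i < n; i++)
        y[i] = a * x[i] + y[i];
}

/* Altivec version: same math, four floats per iteration.
   Assumes n is a multiple of 4 and x, y are 16-byte aligned
   (vec_ld/vec_st ignore the low 4 bits of the address). */
void saxpy_altivec(int n, float a, const float *x, float *y)
{
    vector float va = {a, a, a, a};  /* replicate the scalar into all 4 lanes */
    int i;
    for (i = 0; i < n; i += 4) {
        vector float vx = vec_ld(0, &x[i]);  /* load 4 floats from x */
        vector float vy = vec_ld(0, &y[i]);  /* load 4 floats from y */
        vy = vec_madd(va, vx, vy);           /* fused multiply-add: a*x + y */
        vec_st(vy, 0, &y[i]);                /* store 4 results back to y */
    }
}

A P4 gets a similar (if smaller) boost from SSE, so how much either machine
gains in practice depends entirely on how much of an application's time is spent
in loops like this ... which is exactly why only a few applications see the 2x.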


Cheers,
Hristo

>
>>>One also has to ask what in the world Jobs is doing up there, oh so happy about
>>>having a 2GHz processor, when MHz don't matter. ;)
>>
>>It is called marketing, the only reason x86 has stayed the standard in home
>>PCs.  Apple has to take a shot at the perception game.  Too bad Apple will gain
>>very little customer base from it anyway.
>
>The x86 certainly has not stayed in the game because of marketing. There were
>other factors at work. Intel only started advertising its processors beginning
>with the 486DX2 and no clone vendor was big enough to even advertise on TV for
>YEARS after x86 had become the de facto standard.
>
>-Tom


