Originally Posted by NSiNSiNSi
Well rince, I have extensively tested my 1.33 G4 against my older 1.3 Centrino and now against my 1.6, and the Mac has lost 90% of the time, often by an embarrassing margin. This is usually number-crunching stuff though: rendering, compressing, converting, etc. Launching apps, watching DVDs, using Word, etc., tends to be just about the same; most people wouldn't even notice the difference. But I don't care what they say, Opera 7.5 on my TM8003LMi runs circles around Safari (or Mac Opera) on my G4. Not even close.
So you're right, megahertz are not the ultimate measurement... Centrino taught me that better than Apple ever did.
Likewise, regarding 3D, it's not only the hardware that counts. Macs are slower at full 3D scene rendering; it's pretty much an established fact. An ATI engineer who shall remain nameless once explained to me why, under Apple's current guidelines, it is basically impossible for a current video card, designed around DirectX and OpenGL on Windows, little-endian data, etc., to catch up to a modern PC with the same card, and that's comparing a G5, forget a G4. It had to do with Apple limiting the ways you can interact with the card's registers compared to Windows, but honestly it was a little out of my depth. This guy was part of the Mac team at ATI btw, not a PC fan.
All current cards are designed for PC architecture, with Mac versions as something of an afterthought. Apple won't make big sacrifices to become fully compliant with PC specs, lest it lose its tight grip on controlling everything, and to the graphics chip makers the Mac market is a cute side business that doesn't even begin to approach the size of the PC, workstation, or consumer electronics markets.