spikegifted - Random thoughts
|nVidia's supposed cheating in 3DMark03... discussion|
May 26, 2003
Don't be confused by the phrase 'no optimization'. Nearly all modern graphics hardware and drivers are built to accelerate 3D graphics, and they are 'optimized' either to perform 3D graphics operations (e.g. triangle setup) that would otherwise be done by the CPU, or to perform certain operations that improve the visual quality of the image (e.g. anisotropic filtering). Notice that all these 'optimizations' are general - they apply to anything related to 3D and don't target a particular piece of software. Before you flame me with your views, I acknowledge that optimization for specific pieces of software or applications also takes place - e.g. special drivers for AutoCAD... But on the whole, 'optimization' happens across all applications.
Then funky things happen - some people decided to 'optimize' their drivers to perform better when being used in benchmarking situations - more recent examples include ATi's infamous 'quack Quake III drivers' and nVidia's 3DMark03 routine. This is not optimization! This is CHEATING!! Why? In these situations, the offending companies have decided to let their hardware skip certain operations that improve visual quality, trading image quality for higher frame rates in the benchmarking routine. However, when the hardware and drivers are asked to do things that are not in the routine, the hardware reverts back to its 'normal' performance level. Hence they manage to 'boost' (or 'optimize') their performance for the benchmark (and the benchmark alone)!!
That is not general optimization - it is so specific that you only benefit if you play Quake III running around the exact same route as the benchmark. (How f***ing boring is that!!)
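To make the trick concrete, here is a minimal sketch of what application-detection 'optimization' amounts to. Everything here is illustrative - the function name, the return strings, and the detection-by-executable-name logic are my assumptions, not actual driver code - but it captures why renaming quake3.exe to something the driver doesn't recognize restored normal behaviour:

```python
# Hypothetical sketch of a cheating driver's app detection.
# Names and strings are illustrative, not real driver internals.

def select_render_path(exe_name: str) -> str:
    """Pick a rendering path based on the running executable's name."""
    # The driver recognizes the benchmark app by name and quietly
    # switches to a path that skips quality-improving work...
    if exe_name.lower() == "quake3.exe":
        return "reduced-quality fast path"
    # ...while any unrecognized name - including a renamed copy of
    # the very same benchmark - gets the honest, slower treatment.
    return "full-quality path"

print(select_render_path("quake3.exe"))  # reduced-quality fast path
print(select_render_path("quack3.exe"))  # full-quality path
```

The second call is exactly the famous 'quack' test: same program, same workload, but because the name no longer matches, the scores drop back to normal.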
We, the general buying public, are spending princely sums to invest in computing hardware. The top-of-the-line vid cards cost around a month's mortgage payment! When this kind of money is being spent, we expect exceptional performance across the board, not just in some stupid benchmark.
To take the case further - we, the general buying public, often base our buying decisions on benchmark results. 'Optimizing' for benchmarks is CHEATING us out of our money!!
We're living in 2003, not 1995. When we invest in a piece of high-end graphics hardware, we expect it to be able to perform (i.e. 60fps or higher) at high resolution (1280x1024, 1280x960, or higher) with FSAA (you pick your level) and all the details set to high. Is that so difficult?