Does anyone know....?



DreamEndless
26th Jan 2004, 09:03
Finally, games are coming out that actually use the hardware features our hard-earned GFX cards have been boasting about for the last year. Specifically, we are finally seeing the rise of the pixel shader. Trouble is, it seems to be causing some problems...

DX:IW and T3 both require a pixel shader and use it prolifically throughout the game. The only other game (that the public can play around with) which uses pixel shaders (and DX9 in general) is Far Cry. The difference is that Far Cry does not require a pixel shader.

Now, to get to where I am going: everyone has lower performance than they expected with DX:IW, even those of us who now have it running well. In Far Cry, those peeps who have cutting-edge machines and are able to turn on all the special effects are having performance issues. However, people with average or indeed weak machines are having no trouble at all! :confused: Now I know this is because they are simply not using a lot of the features available in today's cards and are therefore playing without a lot of the special effects. It just seems to me that the performance drop for those people with everything cranked to 10 is disproportionate.

For example: my machine (see sig) running Far Cry at 1024x768 with 2xAA and every detail set to Very High - I get an average of 30 FPS.

However, if I turn the water detail down to low (one of the things with the heaviest pixel shader usage), my FPS soars to an average of 70+!
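To put rough numbers on that, here is a quick back-of-envelope sketch. The resolution and frame rates are from my tests above; the per-pixel instruction counts are made-up guesses purely for illustration, not measured Far Cry figures:

# Back-of-envelope: total pixel shader work scales with
# (pixels drawn per second) x (instructions per pixel), so a longer
# water shader multiplies the whole per-frame cost. Overdraw and
# multi-pass effects are ignored to keep the arithmetic simple.

pixels = 1024 * 768  # resolution from the tests above

for label, fps, instructions in [
    ("water low  (short shader)", 70, 4),   # hypothetical ~4 ops/pixel
    ("water high (long shader)",  30, 40),  # hypothetical ~40 ops/pixel
]:
    ops_per_sec = pixels * fps * instructions
    print(f"{label}: ~{ops_per_sec / 1e9:.2f} billion shader ops/sec")

Even at less than half the frame rate, the long shader is asking for roughly four times the raw shader throughput, which is why a single detail slider can more than halve the FPS.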

Now, I know that pixel shaders are very powerful programming tools and require some serious grunt, but aren't those of us with Radeon 9800 PROs, GeForce FX 5900 ULTRAs, etc., supposed to have serious grunt? Isn't that the point?

So does anyone know? Have those of us buying cutting-edge cards been misled about our cards' performance (and can we please stay away from the Nvidia v Radeon debate for once!?), or are the coders and drivers simply not up to the task yet?

thegrommit
27th Jan 2004, 03:14
Originally posted by DreamEndless
aren't those of us with Radeon 9800 PROs, GeForce FX 5900 ULTRAs, etc., supposed to have serious grunt? Isn't that the point?

So does anyone know? Have those of us buying cutting-edge cards been misled

Pixel shader performance on current cards is at a similar level to where 32-bit colour was on the first TNT card. In other words, the hardware supports it, but performance won't necessarily be that great. It'll be the next generation or two that really shines.

As an aside, this quote from Tim Sweeney is worth highlighting:

http://forums.beyondunreal.com/showthread.php?t=125713