tuteja1986
Diamond Member
- Jun 1, 2005
- 3,676
Originally posted by: redbox
Originally posted by: tuteja1986
Originally posted by: josh6079
We set up the ATI and Nvidia drivers as follows:
ATI Catalyst:
Catalyst A.I.: Standard
Mipmap Detail Level: Quality
Wait for vertical refresh: Always off
Adaptive antialiasing: Off
Temporal antialiasing: Off
Quality AF: Off
Other settings: default
Nvidia ForceWare:
Image Settings: Quality
Vertical sync: Off
Trilinear optimization: On
Anisotropic mip filter optimization: Off
Anisotropic sample optimization: On
Gamma correct antialiasing: On
Transparency antialiasing: Off
Other settings: default
:roll:
Both were standard settings, which is horrible for a review of high-end systems like these. Why didn't they put both on "High Quality"?
Because XBit Labs caters to the noobs now!!
When VR-Zone did decent benchmarking, the idiot editor didn't realise he was using High Quality settings. Ah, the irony.
Wait a sec... are you calling the editor an idiot, or Shamino? 'Cause I am pretty sure Shamino knew what he was doing; the guy is a world-class overclocker and pretty far from what I would call an idiot. There was more wrong with that bench than image quality settings, and he knew it, which is why he pulled it down.
Check the thread; the dude was confused about why Nvidia got low frame rates in a lot of benchmarks ;*(