Originally posted by: mamisano
More fishy things going on with Nvidia's drivers.
On page 13, the Neverwinter Nights scores... How can the 5600 Ultra score a 26.9 without AA/AF and then score a 30.5 WITH 4x AA/8x AF???
Just doesn't make any sense!
http://www.anandtech.com/video/showdoc.html?i=1890&p=13
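To put a number on the oddity (quick arithmetic on the scores quoted above, nothing more):

```python
# Scores quoted above for the 5600 Ultra in Neverwinter Nights.
fps_no_aa_af = 26.9   # AA/AF disabled
fps_aa_af    = 30.5   # 4x AA / 8x AF enabled

# Extra filtering work should cost frames, yet the card "gains" this much:
gain = (fps_aa_af - fps_no_aa_af) / fps_no_aa_af
print(f"Apparent speedup with 4x AA / 8x AF on: {gain:.1%}")  # ~13.4%
```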
Originally posted by: Sazar
Originally posted by: Evan Lieb
Originally posted by: lifeguard1999
Evan Lieb:
An answer to a simple question will calm my fears, and prove me wrong. Yes or No: Did you have the NV38 in your lab and run benchmarks on it?
What I fear is that NVidia sent you benchmarks for their NV38. You tried to match their configuration closely and ran the 9800XT benchmarks. Then you did a comparison review. Not that this is decidedly a bad thing to do. I now have information on which to make my purchase decision that no other site could give me. It just feels like NVidia is then indirectly controlling the hardware setup.
No editor I know on any web site that I can think of would ever do that, and I know AnandTech most certainly wouldn't. To me, doing something like that would be like... I dunno, committing suicide or something.
I know a few who have done so
anyways... I logged onto the site specifically coz I had nothing better to do...
evan... is there a particular reason WHY only a resolution of 1024x768 was utilised / why there are no IQ comparisons / why there is no discussion of the new dets that were tested / why older dets would not work (seeing as the marchitecture is pretty much the same) / why there is no explanation of the anomalies seen / and finally, why was TOMB RAIDER: AOD not benched?
I am going to assume that the new dets expose the nv38, whereas it was not being recognised by the older dets... and will assume that a thorough IQ comparison will be presented @ a later time...
are you allowed to tell us how much influence (if any) nvidia had over the testing... i.e. resolutions to use / benchmarks to use / architecture to use / settings to use?
also... can someone please correlate this for me?
We used ATI's publicly available Catalyst 3.7 drivers and in order to support the NV38 we used NVIDIA's forthcoming 52.14 drivers. The 52.14 drivers apparently have issues in two games, neither of which are featured in our test suite (Half Life 2 & Gunmetal).
and
Although we did provide some insight into the "next generation" of games with scores from Halo, the real question on everyone's mind is still Half Life 2 as well as Doom3. The performance crown under Doom3 is still in NVIDIA's camp apparently, and although the latest drivers have closed the gap significantly, ATI is still ahead in Half Life 2. The numbers we've seen indicate that in most tests ATI only holds single digit percentage leads (< 5%), although in some cases ATI manages to pull ahead by double digits.
any insight would be appreciated...
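For reference, the "single digit percentage lead (< 5%)" wording in that quote is just a simple ratio; a minimal sketch with made-up fps numbers (not from any review):

```python
# Hypothetical frame rates, purely to illustrate how a percentage
# lead like the quoted "< 5%" figure is computed.
ati_fps    = 60.0
nvidia_fps = 57.5

lead = (ati_fps - nvidia_fps) / nvidia_fps
print(f"ATI lead: {lead:.1%}")  # 4.3% -> a "single digit" lead
```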
Originally posted by: gorillaman
Originally posted by: mamisano
More fishy things going on with Nvidia's drivers.
On page 13, the Neverwinter Nights scores... How can the 5600 Ultra score a 26.9 without AA/AF and then score a 30.5 WITH 4x AA/8x AF???
Just doesn't make any sense!
http://www.anandtech.com/video/showdoc.html?i=1890&p=13
Nvidia cards have always been significantly better in Neverwinter.
Originally posted by: Sazar
Originally posted by: gorillaman
Nvidia cards have always been significantly better in Neverwinter.
naturally... the game was designed and optimized for nvidia hardware... utilizing extensions for nvidia hardware... this is to be expected... I would be surprised if it was not the case...
Originally posted by: shady06
Originally posted by: Sazar
Originally posted by: gorillaman
Nvidia cards have always been significantly better in Neverwinter.
naturally... the game was designed and optimized for nvidia hardware... utilizing extensions for nvidia hardware... this is to be expected... I would be surprised if it was not the case...
regardless of optimization, i have a hard time believing that performance is better with AA/AF on than with AA/AF off
But rewind a little bit; quite a few of these accusations being thrown at NVIDIA were the same ones thrown at ATI. I seem to remember the launch of the Radeon 9700 Pro being tainted with one accusation in particular: that ATI only made sure their drivers worked on popular benchmarking titles, with the rest of the top 20 games out there hardly working on the new R300. As new as what we're hearing these days about NVIDIA may seem, let us not be victim to the nearsightedness of the graphics industry; this has all happened before with ATI and even good ol' 3dfx.
We will be taking a much closer look at image quality very soon, but until then, it looks like ATI and NVIDIA have equal footing in the Aquamark3 arena and we are left to find more useful information about their differences elsewhere.
Originally posted by: Instigator
But rewind a little bit; quite a few of these accusations being thrown at NVIDIA were the same ones thrown at ATI. I seem to remember the launch of the Radeon 9700 Pro being tainted with one accusation in particular: that ATI only made sure their drivers worked on popular benchmarking titles, with the rest of the top 20 games out there hardly working on the new R300. As new as what we're hearing these days about NVIDIA may seem, let us not be victim to the nearsightedness of the graphics industry; this has all happened before with ATI and even good ol' 3dfx.
Burn!! For all of those ATI fanboys who acted like ATI doesn't do the same thing.
AquaMark3 Bench
Halo Bench
So much for your DX9 bashing. Oh and before you say it scored well because of reduced image quality:
We will be taking a much closer look at image quality very soon, but until then, it looks like ATI and NVIDIA have equal footing in the Aquamark3 arena and we are left to find more useful information about their differences elsewhere.
Aaah, nothing better than seeing the ATI fanboys walking around with their tails between their legs.
Image quality appears to have improved for NVIDIA in this benchmark over what has been reported of previous drivers, and the NV38 handled the massive overdraw portion of the test the smoothest of all the cards.
We strongly believe in only testing with officially released drivers and not beta level or other leaked drivers. Recent driver bugs and benchmarking issues are a very real concern to many of us. Therefore we chose to use the 45.23 NVIDIA drivers over the beta Detonator 50s that are available only to reviewers. To put it simply, we do not know what changes will be made to these non-released drivers by the time they make it to our readers in an official capacity.
To not show HL2 scores when it is the most important test to many of your readers is a mistake.
Using this latest sample from Nvidia when it is not yet in release raises the following questions.
At least there should have been some very strong provisos on its use, such as:
- this may not be the same clock speed as the product when it is publicly available.
- this may not have the same architecture or memory.
Also 1024 x 768 is far too low to be testing such high-end cards on. I'd much rather see 1600 x 1200 or higher but if you must go lower, 1280 x 960 is the bare minimum.
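Simple pixel arithmetic backs this up; the workload multipliers below are my own calculation, not from the review:

```python
# Pixel counts for the resolutions mentioned above.
base = 1024 * 768

for w, h in [(1024, 768), (1280, 960), (1600, 1200)]:
    pixels = w * h
    print(f"{w}x{h}: {pixels:,} pixels ({pixels / base:.2f}x the 1024x768 load)")

# 1600x1200 pushes ~2.4x the pixels of 1024x768, which is where
# high-end cards actually separate from one another.
```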
Originally posted by: spam
Here is a quote from HardOCP,
BTW Evan Lieb, would Nvidia have let you test their upcoming card without using their beta drivers? If not, then you should have refused to test their card.
unlikely...
refusal == loss of hits == loss of ad revenue...
@ the end of the day it is 'reviews' like this and the blog posted by anand concerning the hl2 numbers for nvidia that really get hits...
again... I fail to understand WHY tabulations are being posted here using full precision on one card and partial precision on another...
anand's hl2 table
ATI was running in their DX9 codepath and the mixed mode codepath was used for NVIDIA. No AA/AF was enabled and we're looking at 1024x768 scores
again... for the sake of sanity... why are no figures included with the nv38 using the dx9 path? this would give an excellent indication as to the performance improvements to be expected in hl2...
as it stands... even with pp... the nv38 lags behind... which is consistent with it having the exact same architecture as the nv35...
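For anyone wondering what full versus partial precision ("pp", the FP16 hint in ps_2_0 shaders) means numerically, here is a minimal sketch using numpy's float16 as a stand-in for the hardware's 16-bit format:

```python
import numpy as np

# FP32 ("full precision") vs FP16 ("partial precision") on a value
# typical of shader math.
x = 1.0 / 3.0
full    = np.float32(x)
partial = np.float16(x)

print(f"float32: {full:.8f}")     # ~0.33333334
print(f"float16: {partial:.8f}")  # ~0.33325195
print(f"relative error: {abs(float(full) - float(partial)) / float(full):.1e}")

# FP16 carries roughly 3 decimal digits, which is why benchmarking one
# card at partial precision against another at full precision is not an
# apples-to-apples comparison.
```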
It is... new DirectX... new generation of video cards... new generation of video games. This feels like 1998-99 all over again.
Originally posted by: Genx87
Spam is losing his mind here lol