NV: Everything under control. 512-Fermi may appear someday. Yields aren't under 20%


Lonyo

Lifer
Aug 10, 2002
21,939
6
81
You still don't get it; Nvidia DOES care about XP and they DO expect their GF100 line to run on XP. :p

XP is still the OS used by more mainstream gamers than either Vista or Win 7. And you forget that the BULK of Nvidia's sales - GTX 460, GT 450, GT 440, GT 430, GT 420 and GT 410, the sub-$300 down to $70 video cards - are to the mainstream buyers, not the elitists like you who forget there is more to life than $500 video cards.

The worst thing about the current GeForce drivers is that they ALSO SUCK for Vista and Win 7 - once you get outside the popular games and try to play older games with demanding settings. AMD graphics is miles ahead with their current drivers for XP and for older games.

Apparently, no one likes to hear the truth, do they?

No point in talking about mainstream cards until they come out, whenever that might be.
For the high end - the only cards that NV have managed to release so far - XP doesn't really matter. If XP drivers still suck in however many months it takes them to get a real lineup out, then it will be a real issue.
For NV, there is no life for GF100 outside the $400~500 cards. There is no mainstream. As for all the other games not playing nice in XP or 7/Vista, that is a real problem, but XP support is not (yet) a real issue for the GF100 line.

Also, having an HD4850, which I purchased for around what HD5770s go for these days, means I am not particularly an elitist. My max price for a card is the HD5850 level and I prefer to keep it below that.
 

yh125d

Diamond Member
Dec 23, 2006
6,907
0
76
Apparently, no one likes to hear the truth, do they?

No, it's just hard to relate to situations that don't affect you. For example, I couldn't care less about eyefinity/physx/what have you (except as a talking point), so it's hard for me to relate to those who do
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
No point in talking about mainstream cards until they come out, whenever that might be.
For the high end - the only cards that NV have managed to release so far - XP doesn't really matter. If XP drivers still suck in however many months it takes them to get a real lineup out, then it will be a real issue.
For NV, there is no life for GF100 outside the $400~500 cards. There is no mainstream. As for all the other games not playing nice in XP or 7/Vista, that is a real problem, but XP support is not (yet) a real issue for the GF100 line.
If XP "doesn't really matter", then why are there WHQL XP drivers for GTX 480?
:rolleyes:

The rest of the GF100 lineup is coming out next month, starting with GTX 460
- perhaps Nvidia will pull off a miracle - by then .. you think? :p

However, until a few days ago, i don't believe they really knew the sad state of their drivers with older games. And i mean with WIN 7 and VISTA drivers - *besides* XP drivers

The logical conclusion is that NVIDIA does not have "stellar drivers" for GTX 4x0; the situation today is no different than when the brand new 8800-GTX was running on CRAP Vista drivers over 3 years ago. The only difference is that back then, we were told by Nvidia's fans that Vista didn't matter and that XP was what counted.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
apoppin, I understand what you are saying. At the same time, chances are people who get a GTX470/480 are never going to be playing 5+ year old games and using super sampling and then complaining about getting 100 frames vs. 200 frames. Call of Duty 1? UT99? I don't think so. I am not going to buy a $500 graphics card to play Warcraft 3. And also, the games mentioned by BFG have such sub-par graphics by today's standards, I couldn't care less if you apply 100x AA to them; Crysis will look better at 1024x768 with 0AA.

Most will buy GTX470/480 to run STALKER: CoP, Crysis, Metro 2033, Dirt2, Hawx, Diablo 3, SC2, etc. You need the horsepower to run the games which are butchering your current graphics card. No one is going to go out there and drop $350 just to have super sampling in UT2004. I would say that for most people, what matters is smooth gameplay at the native resolution, maximum in-game visual settings (dynamic ambient occlusion, HDR lighting, soft shadows, highest quality textures) and then 4AA/16AF. Frankly, before you even apply AA, what do you do? Probably max out all the visual details first, because they BY FAR make the most difference. Poor lighting, blurry textures and subpar shadows make a massive difference. AA is really just icing that you apply last, after you at least make sure you can run your LCD at native resolution with all the visuals maxed out.

Anything beyond 8AA in a FPS (esp. online) isn't really important because you are going to be shot while you stare at the grass trying to notice the difference. This is exactly what Eyefinity is about - using the extra horsepower. Also, as has been said, GTX285 has a significant texture fillrate advantage over GTX 470 (51.8 GTexels vs. 34 GTexels), so it may actually be faster than GTX470 because of that.
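Those fillrate figures fall straight out of TMU count × core clock. A quick sanity check - just a sketch, with the TMU counts and reference clocks assumed from the cards' public spec sheets:

```python
# Peak bilinear texel fillrate = TMUs x core clock (reference specs assumed).
def texel_fillrate(tmus, core_mhz):
    """Peak texel fillrate in GTexel/s."""
    return tmus * core_mhz / 1000.0

print(texel_fillrate(80, 648))  # GTX 285: 80 TMUs x 648 MHz ~= 51.8 GTexel/s
print(texel_fillrate(56, 607))  # GTX 470: 56 TMUs x 607 MHz ~= 34.0 GTexel/s
```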
 
Last edited:

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
And i understand what you are saying, RS.

No problem. i got a GTX 480 and HD 5870 CrossFire to play all of the latest games and i also have the option to play older games although i rarely revisit them.

i am pointing out the state of NVIDIA's drivers with regard to older games. *Most* gamers who buy GTX 480 will probably have Win 7 and play the latest games. But *many* with GTX 470 may be looking in the bargain bin, as many gamers do. And they should be aware of the potential frustration the immature GF100 drivers may bring until the issues are addressed and fixed by NVIDIA - especially before the mainstream Fermi cards are released.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I just think it's easy to blame XP drivers, but it may actually be the texture advantage GTX285 enjoys. In games which are less shader heavy, it outperforms or comes very close to GTX 470.

Check out this review: http://www.techpowerup.com/reviews/Powercolor/HD_5770_PCS_Plus_Plus/5.html

They are running Windows 7 64-bit, and there are times when the GTX285 beats the 470 (Call of Duty 4, Far Cry, Prey, Quake 4). In other instances it provides nearly identical performance (Unreal Tournament 3, Quake Wars).

Look at Wolfenstein performance under Windows 7: "Moreover, the old GeForce GTX 285 proves to be faster than the GeForce GTX 480 at all resolutions, which is the result of the weaker TMU subsystem." http://www.xbitlabs.com/articles/video/display/gigabyte-gf-gtx400_10.html
 
Last edited:

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
I just think it's easy to blame XP drivers, but it may actually be the texture advantage GTX285 enjoys. In games which are less shader heavy, it outperforms or comes very close to GTX 470.

Check out this review: http://www.techpowerup.com/reviews/Powercolor/HD_5770_PCS_Plus_Plus/5.html

They are running Windows 7 64-bit, and there are times when the GTX285 beats the 470 (Call of Duty 4, Far Cry, Prey, Quake 4). In other instances it provides nearly identical performance (Unreal Tournament 3, Quake Wars).

Look at Wolfenstein performance under Windows 7: "Moreover, the old GeForce GTX 285 proves to be faster than the GeForce GTX 480 at all resolutions, which is the result of the weaker TMU subsystem." http://www.xbitlabs.com/articles/video/display/gigabyte-gf-gtx400_10.html

Agreed. Sometimes it is the texture advantage. However, BFG's review - testing 36 games under both Win 7 and Win XP - showed results that ....

... here let me quote him:
Check the Windows 7 16xS results for UT99, Quake 3, RTCW and CoD1. Those games use little to no shading and mainly rely on fillrate and memory bandwidth, yet the GTX470 is miles ahead of the GTX285 (~60% faster in Quake 3).

But in XP the GTX470 is slower in the same games pretty much across the board. The fact that these scores move around so much by simply using a different OS tells me it’s a driver issue rather than an architectural limitation.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,971
126
Apoppin’s quote sums it up best.

We can debate whether or not XP matters, but it’s actually quite irrelevant to the point. The point here is the large movement observed between XP and 7 in some games, in both directions. You can’t blame texturing performance for that, because many texture-heavy games run a lot faster on the GTX470 than the GTX285 depending on which OS you use.

Also Serious Sam 2 is a texture heavy game (back in the G7x days there was a massive hit between Q and HQ), yet it flies on the GTX470 on both OSes. This is yet another reason why testing older games gives a more accurate picture of what the hardware is capable of. Yes, we know the GF100 does well in games with tessellation, but that doesn’t really tell us much anywhere else.

As for memory bandwidth, all indications are that it’s a non-issue, especially when you have a mid-range Radeon 5770 outrunning a GTX470 in Doom 3 at 2560x1600 with 8xMSAA. You can’t blame bandwidth or texturing for that.
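The bandwidth half of that claim is easy to check: peak bandwidth is the effective memory data rate times the bus width. A rough sketch, assuming the reference memory specs of both cards (4.8 GT/s on a 128-bit bus for the 5770, ~3.35 GT/s on a 320-bit bus for the GTX470):

```python
# Peak memory bandwidth = effective data rate (MT/s) x bus width (bytes).
def bandwidth_gbs(effective_mts, bus_bits):
    """Peak memory bandwidth in GB/s."""
    return effective_mts * (bus_bits // 8) / 1000.0

print(bandwidth_gbs(4800, 128))  # HD 5770: ~76.8 GB/s
print(bandwidth_gbs(3348, 320))  # GTX 470: ~133.9 GB/s
```

The GTX470 has roughly 75% more raw bandwidth than the 5770, so bandwidth really cannot be the explanation when the 5770 wins.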

Anyway, the specs for the GTX470 don’t paint the full picture given its effective texturing is higher due to cache and TMU improvements. Based on the synthetics from nVidia and 3DMark, if the drivers are tuned I’d still expect the GTX470 to be at least 10% faster than the GTX285 if the game is primarily texture bound. But I don’t think the drivers are tuned, and that’s where the problem is.

As for image quality, Crysis at 1024x768 with no AA is butt-ugly. It doesn’t matter if a game has the prettiest effects in the world if it’s being rendered onto lego bricks or onto a swarm of angry bees. With super-sampling you get practically perfect image quality because there’s not a single pixel out of place. It’s like playing a pre-rendered CGI movie. I’d rather have CoD 1 style graphics at 2560x1600 with 8xSSAA than Crysis at 1024x768 with no AA.

In particular, Crysis has absolutely horrific vegetation aliasing, so much so that I cannot stomach playing it with less than 2xTrSS, and I’ll gladly drop the resolution and effects to get it. Some people might like sparkling Christmas trees, but I don’t. Why use effects like SSAO which cause a massive performance hit and are only visible in still screenshots when you can use 2xTrSS, which provides a massive gain to IQ during in-game movement?
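For anyone wondering why super-sampling kills shader and vegetation aliasing where plain MSAA doesn't: SSAA filters every pixel of the frame - edges, textures and shader output alike - by rendering at a higher internal resolution and averaging the samples down. A minimal ordered-grid sketch of the idea (the toy renderer and function names are purely illustrative, not any driver's actual pipeline):

```python
import numpy as np

def supersample(render, out_w, out_h, factor=2):
    """Render at factor x the target size, then box-filter down.
    Every output pixel averages factor*factor samples, so texture,
    edge and shader aliasing are all attenuated alike."""
    hi = render(out_w * factor, out_h * factor)
    blocks = hi.reshape(out_h, factor, out_w, factor, 3)
    return blocks.mean(axis=(1, 3))

# Toy "renderer": a hard diagonal edge, the classic jaggy case.
def toy_scene(w, h):
    y, x = np.mgrid[0:h, 0:w]
    return np.where((x > y)[..., None], 1.0, 0.0) * np.ones(3)

frame = supersample(toy_scene, out_w=320, out_h=200, factor=4)  # 16 samples/pixel
print(frame.shape)  # (200, 320, 3); edge pixels now hold fractional coverage
```

MSAA, by contrast, takes extra coverage samples only at polygon edges and shades once per pixel, which is exactly why it leaves shader aliasing and alpha-tested vegetation shimmer untouched.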

If you want to see the same six games being tested, then any of the 34,697 reviews out there will show the same thing. But sooner or later almost everyone will want to play a game that isn’t tested in mainstream circles, and those reviews won’t help. The more games you test, the better idea you have of how robust the drivers and architecture are. Older games aren’t for everyone, which is why I also test newer games. But there are a lot of people out there who play older games and understand the benefits of modern hardware and the increased IQ it brings to the table.
 

thilanliyan

Lifer
Jun 21, 2005
11,871
2,076
126
Hey BFG, super sampling is not available on ATI 4000 cards, right? Is there anything similar I can try?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I’d rather have CoD 1 style graphics at 2560x1600 with 8xSSAA than Crysis at 1024x768 with no AA.

See, this is where we disagree. I'd rather take Doom 3 at 1024x768 with maximum in-game visuals and 0AA over Quake 3 at 2560x1600 with 8xSSAA, Crysis over COD1 in the same situation, Dirt 2 over Need for Speed: Hot Pursuit, H.A.W.X. at 1024x768 with maximum in-game visuals over IL-2 Sturmovik, etc.

Jaggies can be completely eliminated to the human eye if you just sit farther from the screen. However, you can't just recreate extra visual details that aren't there. Every new generation, graphics on consoles are significantly better than the generation before (e.g., NES --> SNES --> PS2 --> PS3, etc.)
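That claim can actually be put to numbers: a pixel stops being individually resolvable once it subtends less than about one arcminute at the eye, the usual 20/20-acuity figure. A quick sketch for a 30" 2560x1600 panel like BFG's (the 1-arcminute threshold is a textbook assumption, not a measurement):

```python
import math

# 30-inch 2560x1600 panel; 1 arcminute ~ limit of 20/20 acuity (assumed).
diag_in, res_w, res_h = 30.0, 2560, 1600
pitch_mm = diag_in * 25.4 / math.hypot(res_w, res_h)  # ~0.252 mm per pixel
one_arcmin = math.radians(1.0 / 60.0)
d_mm = pitch_mm / math.tan(one_arcmin)  # distance where a pixel subtends 1'

print(f"pixel pitch ~{pitch_mm:.3f} mm; pixels blend beyond ~{d_mm/1000:.2f} m")
# -> roughly 0.87 m, farther back than a typical desk viewing distance
```

So the math backs the claim only if you actually sit the better part of a metre away - which is BFG's objection below.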

The major graphical improvements come from improved geometry, complex shaders, textures and lighting, shadows etc., not anti-aliasing or anisotropic filtering modes. This is what makes games look better and better.

Call of Duty series over the years:
[five screenshots showing the series' progression]

Unreal Tournament 1999:
[screenshot]

Unreal Tournament 2004:
[screenshot]

vs. Unreal Tournament 3:
[screenshot]

Quake 3:
[screenshot]

vs. Crysis with jaggies:
[screenshot]

Introducing the highest quality antialiasing to ancient games with outdated graphics is like dropping a Ferrari Enzo engine into a 2002 Honda Civic and then seeing how fast the Honda Civic is around the Nürburgring compared to the Enzo....
 
Last edited:

ZimZum

Golden Member
Aug 2, 2001
1,281
0
76
apoppin, I understand what you are saying. At the same time, chances are people who get a GTX470/480 are never going to be playing 5+ year old games.

This is like saying, no one buys a $10k stereo setup to play vinyl records. Of course they do. That may not be the primary reason for buying a high end video card. But many hardcore gamers still play older games. I still break out my Sega Genesis every now and then.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
GTX 465 launches June 1st - two and a half weeks from now - and will be available in good supply immediately on that day.

Do you think everyone who buys a $250 GTX 465 will be on Win 7? Or only plays brand new games?
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
This is like saying, no one buys a $10k stereo setup to play vinyl records. Of course they do. That may not be the primary reason for buying a high end video card. But many hardcore gamers still play older games. I still break out my Sega Genesis every now and then.

Ok, but you aren't buying a GTX470/480 to get 500+ fps in older games, are you? Chances are you are buying it to improve performance in new games only. How many people would care if their UT2004 was running at 120 or 600 fps at this point....
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Ok, but you aren't buying a GTX470/480 to get 500+ fps in older games, are you? Chances are you are buying it to improve performance in new games only. How many people would care if their UT2004 was running at 120 or 600 fps at this point....
Did you miss my post above yours?

GTX 465 launches on the first of June to compete with HD 5830

You think sub par drivers for older games will fly on it?
:rolleyes:
 

NoQuarter

Golden Member
Jan 1, 2001
1,006
0
76
This seems like the console backwards compatibility thing; I don't really get it. Even a GTX 465 should get 100+ fps in old titles even if the drivers suck, so I don't think nVidia should waste time shoring up their XP drivers.

Maybe in a few months revisit it but.. meh.. who used console backward compat for more than a month..? In with the new out with the old I say :p
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
Well, COD4 isn't that old, a ton of people still play it, and it running at 42fps would be tough to swallow if you just put down $350.

And a few other more popular games are running in the 40-50fps range while a 285 is in the 60s.
 

NoQuarter

Golden Member
Jan 1, 2001
1,006
0
76
Well, COD4 isn't that old, a ton of people still play it, and it running at 42fps would be tough to swallow if you just put down $350.

And a few other more popular games are running in the 40-50fps range while a 285 is in the 60s.

They should package CoD5 with the GTX 465 then lol :) But ok, yea, it should handle games as recent as that better than it does..
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
They should package CoD5 with the GTX 465 then lol :) But ok, yea, it should handle games as recent as that better than it does..

They already package CoD-MW2 with some Radeons; i got a free D/L with my PowerColor HD 5870 PCS+

What *would* really bug me is if i "upgraded" from GTX 285 and got this kinda crap with GTX 470 like BFG10K did (especially if i *hated* noise)
:p

How well do you think GTX 465 is going to do with these titles in a couple of weeks? It is probably a little faster than HD 5830. Nvidia should get busy with their driver team.
 
Last edited:

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Given how averse he is to loud cards, I'm surprised he went with the 470.

Well, he really wanted a GTX 480 but knew that sucker is painfully loud at full load

i guess he was *hoping* the 470 would be a lot quieter :p

i think his fellow editors are going to try and get him a waterblock
:)
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,971
126
Jaggies can be completely eliminated to the human eye if you just sit farther from the screen.
I didn’t buy a 30” screen so I can sit far away from it. I bought it so I can sit close and marvel at the details that PC gaming provides over consoles. If I’m going to sit far back, I may as well game on a console and be done with it.

Your screenshots can’t show aliasing that’s apparent during movement, nor can they portray the disadvantages of running non-native resolutions. They also can’t convey the look of a “clean” image in motion.

I’ll tell you what: you fire up Crysis at 1024x768 with no AA right now and let me know how that looks during in-game movement. It’ll look like ass.

Call of Duty 1 might not compete on the technical level, but it’ll look clean as a whistle at 2560x1600 with 8xSSAA because almost every pixel will be perfect, so it’ll be very immersive as a result. OTOH Crysis will be a horrifically blurry shimmerfest in comparison.

Introducing the highest quality antialiasing to ancient games with outdated graphics is like dropping a Ferrari Enzo engine into a 2002 Honda Civic and then seeing how fast the Honda Civic is around the Nürburgring compared to the Enzo....
Your car analogies are silly because you don’t understand the benefits super-sampling brings to the table. Once you get used to perfect image quality you immediately notice regressions in other games. The more shaders games use, the more shader aliasing they bring to the table. In many ways Far Cry 1 looks much better than Crysis because its images are much cleaner and more pleasant to the eye.

How many people would care if their UT2004 was running at 120 or 600 fps at this point....
If you’d bothered reading the article, you’d see it was the difference between 111.62 FPS and 68.79 FPS in that game. I didn’t buy a $350 MSRP card so that the six games reviewers benchmark run faster but everything else is the same speed or slower. Likewise, when a Radeon 5770 is outrunning a GTX470, that’s a problem.
 
Last edited:

BFG10K

Lifer
Aug 14, 2000
22,709
2,971
126
Given how averse he is to loud cards, I'm surprised he went with the 470.
After a year of using my GTX285 it was either that or the GTX480, so I picked the GTX470 as it was the lesser of two evils.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
What *would* really bug me is if i "upgraded" from GTX 285 and got this kinda crap with GTX 470 like BFG10K did (especially if i *hated* noise)
:p

hehe OK, it's a bit unfair to buy a GTX470/480 and then complain about noise...duh! :awe: It's commonly known they aren't quiet cards in the reference design to begin with.

BFG, why do you say you could have only purchased a 470 or 480? What about 5850, 5870 or 5970? Or do you run Linux?

Since you run older games, 5850 is actually faster than GTX470 in some older games at 2560x1600:

1) Call of Duty 4
2) Prey
3) Quake 4
http://www.techpowerup.com/reviews/Powercolor/HD_5770_PCS_Plus_Plus/7.html

It would probably have been smarter to get the 5870. 5870s are ~$390, which is only $40 more than the GTX470. That basically would have guaranteed better performance over the GTX470 in most games, while being $110 cheaper than the 480. In fact, the 5870 is far superior to the GTX470 specifically at 2560x1600.

"The bottleneck of the GF100 architecture – the low main domain frequency and the cut-down TMU subsystem (the GeForce GTX 470 has only 56 TMUs) – shows up most clearly at 2560x1600. In six out of the 15 tests the GeForce GTX 470 is more than 15% slower than the Radeon HD 5870. In five more tests, the gap is 2 to 10%. In those tests that the GeForce GTX 470 wins, its advantage is no larger than 5%. Perhaps the GeForce GTX 470 is not meant for such a high resolution." http://www.xbitlabs.com/articles/video/display/gigabyte-gf-gtx400_17.html#sect0

What's more, the GTX470 is actually 20% or more slower than the 5870 at 2560x1600 in the latest games: BF:BC2, Metro 2033, Just Cause 2, Mass Effect 2. So not only are you getting worse performance from the GTX470 than the 5870 in older games; in newer games the gap at 2560x1600 is even wider. Of course this is not the fault of the GTX470 since it is only meant to compete with the 5850.
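The xbitlabs point about the cut-down TMU subsystem is stark once you put numbers on it (TMU counts and reference clocks assumed from the spec sheets):

```python
# Peak bilinear texel rate = TMUs x core clock, reference specs assumed.
gtx470 = 56 * 607 / 1000.0  # ~34.0 GTexel/s
hd5870 = 80 * 850 / 1000.0  # ~68.0 GTexel/s
print(f"GTX470 {gtx470:.1f} vs HD5870 {hd5870:.1f} GTexel/s "
      f"-> {gtx470 / hd5870:.0%} of the 5870's texture rate")
```

Exactly half the 5870's peak texture rate, which fits with the deficit showing up most clearly at 2560x1600.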
 
Last edited:

Lonyo

Lifer
Aug 10, 2002
21,939
6
81
Point 1: If it was meant to compete with the HD5850, it would be priced to match it. This isn't true in the US market, and it certainly isn't true in Europe, where the GTX470 is priced on par with the HD5870.

Point 2: I really wish the 256 drivers with surround had arrived, because it will be *really* interesting to see how Crossfire HD58xx or HD59xx compares to SLI GTX 4xx at really high resolutions.
Even 1920x1200x3 has about 69% more pixels than 2560x1600.
Of course, if anyone had bought a DualHead2Go they could do 1920x1200x2 testing, which is over 10% more pixels than 2560x1600, to see how the various cards scale as you push resolution even higher.
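The pixel counts are easy to verify:

```python
# Megapixels per display configuration.
def mpix(w, h, panels=1):
    return w * h * panels / 1e6

single = mpix(2560, 1600)     # 4.10 MP
triple = mpix(1920, 1200, 3)  # 6.91 MP
double = mpix(1920, 1200, 2)  # 4.61 MP

print(f"3x1920x1200: {triple / single - 1:.1%} more pixels than 2560x1600")
print(f"2x1920x1200: {double / single - 1:.1%} more pixels than 2560x1600")
# -> 68.8% and 12.5% more, respectively
```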