NV: Everything under control. 512-Fermi may appear someday. Yields aren't under 20%


Lonyo

Lifer
Aug 10, 2002
21,939
6
81
Certainly not suicidal. It is one thing to block ~80% of the market; what you are talking about amounts to likely less than 1% (actually likely less than one tenth of one percent). Whether it's stupid depends on whether it drives more sales for them or loses sales, and how that relates to the costs associated with supporting the configuration down the road (in terms of additional funding required for driver development). Blocking 80% of your potential customer base is stupid, particularly when that same base is the one most likely to want your highest-margin items (right now Intel has a clear lead in gaming).
Considering the utter lack of traction (without NV paying for it) that hardware PhysX has, it certainly doesn't seem to have done them any favours. Now, part of that is the console issue, but having only a medium level of support as an option on PCs is hardly beneficial either.


So you are saying devs will never use DirectX11? PhysX runs on far, far more systems than it does.
He said "for something cool" (presuming he meant hardware PhysX and not the software that obviously isn't relevant in a discussion of PhysX and GPUs).
DX11 is used to make prettier graphics, which don't fundamentally change gameplay. Current PhysX is predominantly related to the same thing (except in maybe 2 games), mainly because software PhysX (the stuff that runs on far, far more systems, as you put it) can't do some of the interesting things which might be possible with hardware-accelerated PhysX. So it's relegated to doing DX11-type things (making stuff look prettier), which wouldn't really be considered "something cool" when other games are using software physics engines for terrain deformation and exploding buildings. Sure, floating paper and waving banners might look pretty, but blowing up the ground and smashing a building to pieces is cool. And that's not what hardware PhysX is being used for in 90% of situations/games.

Is the list on the NV website a comprehensive list of all hardware PhysX games to date, or are there more that it hasn't been updated with? (I see Metro 2033 is not on the list so it seems somewhat out of date).
It would be interesting to see how many games support GPU PhysX, how many were given money by NV, and how many use it for actual gameplay things, rather than adding graphical improvements.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
i believe that i read that there are 15 games out now that utilize PhysX
- i can try to find the link .. it was pretty recent

And you have to realize that all the new TWIMTBP games will utilize it even more

EDIT:

.. lots more than 15 (and Mafia II will have it featured also):
http://en.wikipedia.org/wiki/PhysX

Nvidia's Big List has Metro 2033 :p
http://www.nzone.com/object/nzone_physxgames_all.html

how many were given money by NV
No money is given by Nvidia to devs. Time and support are worth far more than cash.
 
Last edited:

Creig

Diamond Member
Oct 9, 1999
5,171
13
81
Ageia wasn't going anywhere. It looks like they created themselves to be sold. Without Nvidia supporting them, PhysX would be dead.
I don't know... At the time Ageia was still active, I would have agreed with you. But seeing that Bigfoot is still around putting out insanely expensive NICs and they've somehow managed to keep afloat, I'm wondering if maybe Ageia might have had a chance to make it. At least when they had PhysX it could be used with video cards from both nVidia and ATi.

i can certainly force AA in Batman: AA from the ATi CP; it runs OK on my (single) HD 5870 at 2560x1600 and 4x or even 8x AA
That's not the problem, though. Everybody has been able to force AA through the control panel. But that's FSAA, not MSAA. So you're incurring a larger framerate hit.

In addition, ATi cards were having to do the initial MSAA calculations before having to go on to do FSAA. So by forcing ATi cards to do computing unnecessary for FSAA, they experienced another framerate hit. I don't know if that's still the case or if that might have been addressed in a patch.
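The extra work described above can be sketched with a toy cost model (an illustration only; the function and figures below are invented for the sketch, not vendor data). MSAA shades each pixel once and only resolves coverage per sample, while control-panel SSAA/FSAA shades every sub-sample, so doing redundant MSAA work before the FSAA pass pays both costs:

```python
def shader_invocations(pixels, samples, mode):
    """Toy model of per-frame pixel-shader work under different AA modes."""
    if mode == "msaa":
        # MSAA shades once per pixel; only coverage is evaluated per sample
        return pixels
    if mode == "ssaa":
        # SSAA (control-panel FSAA) shades every sub-sample
        return pixels * samples
    if mode == "msaa_then_ssaa":
        # redundant MSAA pass first, then full SSAA on top
        return pixels + pixels * samples
    raise ValueError(mode)

frame = 2560 * 1600  # pixels per frame at 2560x1600
for mode in ("msaa", "ssaa", "msaa_then_ssaa"):
    print(mode, shader_invocations(frame, 4, mode))
```

At 4 samples, the combined path does about 25% more shader work than SSAA alone, on top of the already-larger hit of SSAA versus MSAA; real costs also involve bandwidth and ROPs, which this sketch ignores.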
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
I don't know... At the time Ageia was still active, I would have agreed with you. But seeing that Bigfoot is still around putting out insanely expensive NICs and they've somehow managed to keep afloat, I'm wondering if maybe Ageia might have had a chance to make it. At least when they had PhysX it could be used with video cards from both nVidia and ATi.

That's not the problem, though. Everybody has been able to force AA through the control panel. But that's FSAA, not MSAA. So you're incurring a larger framerate hit.

In addition, ATi cards were having to do the initial MSAA calculations before having to go on to do FSAA. So by forcing ATi cards to do computing unnecessary for FSAA, they experienced another framerate hit. I don't know if that's still the case or if that might have been addressed in a patch.
Maybe, maybe, maybe .. if Ageia had been bought by Intel, PhysX would be dead by now; AMD didn't seem interested, but Nvidia grabbed them up pretty quickly - you could see that acquisition coming - it was an easy one - just port PhysX to CUDA

As to Batman AA; i still don't get it
- i get a performance hit at 25x16 with maxed out (non physX) settings and 8x AA with a single HD 5870 and the framerates *Still* stay above 30 FPS :p

What is the issue again? The IQ looks the same while playing the game and the framerates are satisfactory on a HD 5870 at 25x16.
 
Last edited:

GaiaHunter

Diamond Member
Jul 13, 2008
3,628
158
106
As to Batman AA; i still don't get it
- i get a performance hit at 25x16 with maxed out (non physX) settings and 8x AA with a single HD 5870 and the framerates *Still* stay above 30 FPS :p

What is the issue again? The IQ looks the same while playing the game and the framerates are satisfactory on a HD 5870 at 25x16.

You're lucky they didn't decide that the original engine couldn't support that resolution - they might have needed NVIDIA to write some code and now your 5870 would only do 19x12.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
As to what this topic became, there is absolutely nothing new posted by either side over the last few pages
- it is all repetition and "why can't you get it through your thick skull" posts.
---- if you are going to rag on Nvidia, pick on their crap drivers for GF100

that is an easy target once you get outside a few games - PhysX is in only about 15 games and very few people use a Radeon plus a Geforce for it.
otoh, Nvidia's poor Fermi drivers affect hundreds of games (so it seems)

Well this is the first time I've heard of it, since people always claim nV's drivers are "rock solid."
Do you mind telling more about that?
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Well this is the first time I've heard of it, since people always claim nV's drivers are "rock solid."
Do you mind telling more about that?

First of all, i guess you haven't seen part 2 of BFG10K's review of the GTX 470 using 36 games (18 are from 2007 or newer) and Windows 7 (he published it this weekend)

it seems that the mainstream reviewers (me included; i only use 18 games) only test games that are on the "hot list" - the ones that Nvidia has optimized for

And the top 6 games that *most* reviewers use run very well on GF100

HOWEVER, once you get to games that are older .. the GTX 285 will often beat GTX 470 in many cases (and we are talking about mostly running the DX9c pathway)

imo Nvidia's GTX 4x0 drivers are a disgrace - it is exactly the same as when the 8800 GTX came out and only XP worked well and the Vista drivers were Crap
 

Dark Shroud

Golden Member
Mar 26, 2010
1,576
1
0
Well this is the first time I've heard of it, since people always claim nV's drivers are "rock solid."
Do you mind telling more about that?

Nvidia's drivers were the cause of most of Vista's crashes and poor game performance. Especially if you were unlucky enough to have an Nvidia mobo. I remember poor bastards uninstalling & reinstalling different driver versions and beta drivers just trying to get all the parts of the mobo working at the same time.

Then there was the recent issue where Nvidia drivers were actually bricking the cards.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
HOWEVER, once you get to games that are older .. the GTX 285 will often beat GTX 470 in many cases (and we are talking about mostly running the DX9c pathway)

Based on the architecture of the chip, that's pretty much exactly what I expected (and I discussed it already in the architecture thread). Maybe I am reading too much into the layout, but I don't expect drivers to improve the situation too much in terms of performance in mainly fill-limited games.
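A back-of-envelope check of the fill-limited argument, using the commonly published spec-sheet figures for these cards (treat the numbers as approximate; real performance also depends on memory bandwidth and ROP throughput):

```python
def texture_fillrate_gtexels(tmus, core_mhz):
    # peak bilinear texel rate = texture units x core clock
    return tmus * core_mhz / 1000.0

# Commonly published spec-sheet figures (approximate):
gtx285 = texture_fillrate_gtexels(tmus=80, core_mhz=648)  # ~51.8 GT/s
gtx470 = texture_fillrate_gtexels(tmus=56, core_mhz=607)  # ~34.0 GT/s

print(f"GTX 285: {gtx285:.1f} GT/s, GTX 470: {gtx470:.1f} GT/s")
```

In pure texturing throughput the older GTX 285 comes out well ahead, so in a shader-light older game that is texture-fill bound, even perfect drivers on both cards could leave the GTX 285 in front.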
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Based on the architecture of the chip, that's pretty much exactly what I expected (and I discussed it already in the architecture thread). Maybe I am reading too much into the layout, but I don't expect drivers to improve the situation too much in terms of performance in mainly fill-limited games.

Did you have a chance to look at BFG10K's Part 2 (with Win7) review?
- you can see the sad state of GF100 Fermi drivers with older games

i expect some solid improvement; a GTX 285 should not beat a GTX 470 :p
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Did you have a chance to look at BFG10K's Part 2 (with Win7) review?

Of course, ABT is on my daily browsing list :)

i expect some solid improvement; a GTX 285 should not beat a GTX 470

Actually, if you tested a game like Q3A with perfect drivers for each, a 285 should beat a 480 fairly easily. I have expected as much for some time; we had a discussion about it here around a month ago.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Actually, if you tested a game like Q3A with perfect drivers for each, a 285 should beat a 480 fairly easily. I have expected as much for some time; we had a discussion about it here around a month ago.


Q3A is a specific example that will likely run faster on a GTX 285 than a GTX 470; let me try again, speaking a little more generally

GTX 470 should not be so badly beaten by GTX 285 in many of BFG10K's cases.
- most of those look to me like driver issues, not architectural.


Of course, Nvidia also needs to decide if they are going to support full scene Super Sampling AA or treat it like a bug (which is probably and very sadly where they are going with it) :p

it seems to me that Nvidia has latched on to "tessellation" as the 'next big thing' and this is what GF100 Fermi excels in. And of course by the time it IS "big", no one will be using GTX 4x0 as it will be too slow to run these new DX11 games
 
Last edited:

NoQuarter

Golden Member
Jan 1, 2001
1,006
0
76
What about the consoles? PC gaming is the smaller sibling in the gaming market, and outside of a very few exceptions, PC-exclusive devs have gone away. Even if every PC gamer already had a DX11 part, PhysX would still run on more gaming systems. Devs don't live in a black and white world based on the idealism of GPUs, I can assure you of that.

So we're talking software PhysX again? I remember this conversation, where PhysX runs on everything plus your cell phone. Everyone knows this. You know we're only talking about GPGPU accelerated PhysX.

apoppin said:
They had made almost no traction in the market before Nvidia promoted them. Wasn't there just one game with PhysX for the longest time? i remember the very few forum members here who had PPUs (and would keep talking about their demos and the one or two games that utilized it; the rest of us were only mildly interested).

Well, they had made no traction in the market with their hardware PPU unit. PhysX as a software physics package was still being used quite widely, which is good enough to keep the company going on top of their seemingly well backed financing. That would give them opportunity to look for ways to integrate PhysX into GPU friendly code.

What I figure would have happened is: the PPU unit would have obviously failed, and they would've then developed APIs for PhysX to run in Stream/CUDA/DirectCompute/OpenCL or wherever they thought they could get licensing dollars. And then traction would occur... but they would never have been successful with their own hardware approach, obviously.

It's clearly hypothetical and I agree with all your points, nothing may have happened with them. But it's hard to say what kind of progress they could've made with just the PhysX SDK. It still would've been a money maker even without the PPU though so I don't believe the company would've just disappeared had nVidia not bought them.
 
Last edited:

BFG10K

Lifer
Aug 14, 2000
22,709
2,971
126
Any code that AMD wrote they should absolutely 100% be able to lock down to their hardware. Implying anything else would be moronic on a profound level.
ATi wrote the AA code for Stalker Clear Sky DX10.1 and also back-ported it to DX10 too. Are you saying it would benefit PC gamers if that code was locked down to ATi’s parts?

(TruForm working in Counter-Strike back in the day would be another example; ATi wrote the code, it only worked on their hardware, and no one whined about it because we didn't have the deity-level worship we have today).
That’s because Truform required hardware tessellation which nVidia parts didn’t have, so they were physically unable to run that code.

This is vastly different to Batman's AA code, which has no fundamental reason why it can't run given it's standard code, and indeed runs perfectly on ATi's parts after the artificial vendor check is removed.

There’s a profound difference between artificial vendor lockout, and physically being unable to run something.
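A hypothetical sketch of that difference (the function names and strings below are invented for illustration; this is not the game's actual code): an artificial lockout keys a standard feature off the adapter's vendor string, while a capability check gates it on what the hardware actually reports:

```python
def msaa_available_lockout(adapter_name: str) -> bool:
    # Artificial vendor lockout: the feature is standard code and would
    # run fine, but a string match disables it on other vendors' hardware.
    return "NVIDIA" in adapter_name.upper()

def msaa_available_capability(hw_supports_msaa: bool) -> bool:
    # Capability check: gate the feature on what the hardware reports,
    # regardless of who made it.
    return hw_supports_msaa
```

Under the first check a fully capable HD 5870 has the feature disabled purely because of its name string; under the second, any card that reports MSAA support gets it.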

Actually, if you tested a game like Q3A with perfect drivers for each a 285 should beat a 480 fairly easily. I have expected as much for some time, we had a discussion about it around a month ago here.
Check the Windows 7 16xS results for UT99, Quake 3, RTCW and CoD1. Those games use little to no shading and mainly rely on fillrate and memory bandwidth, yet the GTX470 is miles ahead of the GTX285 (~60% faster in Quake 3).

But in XP the GTX470 is slower in the same games pretty much across the board. The fact that these scores move around so much by simply using a different OS tells me it’s a driver issue rather than an architectural limitation.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Check the Windows 7 16xS results for UT99, Quake 3, RTCW and CoD1. Those games use little to no shading and mainly rely on fillrate and memory bandwidth, yet the GTX470 is miles ahead of the GTX285 (~60% faster in Quake 3).

But in XP the GTX470 is slower in the same games pretty much across the board. The fact that these scores move around so much by simply using a different OS tells me it’s a driver issue rather than an architectural limitation.

Honestly though, even 9800Pro could more or less max out those games, unless you are into unnecessary 16/32AA settings. Those FPS games ran so fast, you would hardly even notice any difference between 4 and 16AA. I remember my Radeon 8500 would easily run COD1 and RTCW and Quake 3 and that's in 2002.

As far as XP vs. Windows 7 goes, mostly everyone who is buying a GTX470 should be running Windows 7. They likely have recent systems and want DX11 in games as well. Windows XP is so old, performance on it for the majority of gamers with latest hardware is largely irrelevant. I mean we will have Windows 8 soon....I understand testing older games (i.e., Crysis and other popular games from 3-4 years ago), but again, testing games which are 8-10 years old is meaningless since cards 4 generations out maxed them out, easily. I don't think anyone really cares to get 500 fps in Quake 3 16AA in 2010 vs. 100 fps, or to get 32 times anti aliasing. BFG, you just happen to be a minority in this case. For this reason, I wouldn't hold anything against NV for not optimizing games that 99.99% of people no longer play.
 
Last edited:

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Honestly though, even 9800Pro could more or less max out those games, unless you are into unnecessary 16/32AA settings. Those FPS games ran so fast, you would hardly even notice any difference between 4 and 16AA. I remember my Radeon 8500 would easily run COD1 and RTCW and Quake 3 and that's in 2002.

As far as XP vs. Windows 7 goes, mostly everyone who is buying a GTX470 should be running Windows 7. They likely have recent systems and want DX11 in games as well. Windows XP is so old, performance on it for the majority of gamers with latest hardware is largely irrelevant. I mean we will have Windows 8 soon....I understand testing older games (i.e., Crysis and other popular games from 3-4 years ago), but again, testing games which are 8-10 years old is meaningless since cards 4 generations out maxed them out, easily. I don't think anyone really cares to get 500 fps in Quake 3 16AA in 2010 vs. 100 fps, or to get 32 times anti aliasing. BFG, you just happen to be a minority in this case. For this reason, I wouldn't hold anything against NV for not optimizing games that 99.99% of people no longer play.
Nonsense

What about GTX 460 and GT 450 and so on down the GF100 line? Should the users - who are mostly still on XP - suffer with second rate XP drivers?

Some older games simply run better on XP; it has not been discontinued by MS and i see WHQL drivers for that OS on Nvidia's site; heck, AMD's XP drivers are a LOT better. :p

As to a 9800 Pro running Quake: not the highly modified version that BFG10K runs - it will stress a GTX 480 and HD 5970 at the kinds of settings he uses
--imo he has performed a real service by pointing out the really poor GF100 drivers Nvidia has for older games (i mean 3 years old)
 
Last edited:

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Nonsense

What about GTX 460 and GT 450 and so on down the GF100 line? Should the users - who are mostly still on XP - suffer with second rate XP drivers?

Some older games simply run better on XP; it has not been discontinued by MS and i see WHQL drivers for that OS on Nvidia's site; heck, AMD's XP drivers are a LOT better. :p

It's getting there:
http://news.yahoo.com/s/infoworld/20100510/tc_infoworld/122961

XP needs to die...yesterday.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com

But it *isn't there* no matter what you or other elitists may feel :p
- and MS is ending support for Win2k and XP SP2 - please be thorough in reading your own links, which disprove what you wrote (but thanks):
:rolleyes:
If you're using Windows XP SP2 or earlier, there's a free and easy way to continue to get Microsoft support: Simply upgrade to SP3, which you can do via Internet Explorer's Windows Update utility.
MS is not dropping support for XP this year.

Take a look at what gamers use as the most popular OS in a very large monthly survey (April 2010):

http://store.steampowered.com/hwsurvey/

Most Popular Windows Version:

Windows XP 32 bit = 37.59%

Ahead of Vista and Win 7. Don't you think Nvidia ought to write decent drivers for it? AMD does for their video cards.
- a huge chunk of NVIDIA's GF100 HW sales will run on XP
(GTX 480/470 is mostly run on Vista/Win7 and is a small part of their total overall sales).
 
Last edited:

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
I still play older games, but tbh I agree with RussianSensation. To make a fuss about how many hundreds of fps you get is just silly. I still play UT2004; it ran fine on a 6600GT/9800Pro. Those cards had 8 pipes - even a GTX 450 will still have hundreds of times the power. You could have the most inefficient driver ever and it would still blaze through those games.

Now if the games didn't run properly or glitched so I couldn't play them that would be a problem, but arguing about performance is pointless.
 

Lonyo

Lifer
Aug 10, 2002
21,939
6
81
But it *isn't there* no matter what you or other elitists may feel :p
- and MS is ending support for Win2k and XP SP2 - please be thorough in reading your own links, which disprove what you wrote (but thanks):
:rolleyes:

MS is not dropping support for XP this year.

Take a look at what gamers use as the most popular OS in a very large monthly survey (April 2010):

http://store.steampowered.com/hwsurvey/


Ahead of Vista and Win 7. Don't you think Nvidia ought to write decent drivers for it? AMD does for their video cards.
- a huge chunk of NVIDIA's GF100 HW sales will run on XP
(GTX 480/470 is mostly run on Vista/Win7 and is a small part of their total overall sales).

32-bit OS with a 1.5GB $500 graphics card?
More accurately, 21.55% of systems with DX10 and DX11 cards (including older ones like the 8800 and 4800 series) are on XP, while 56.5% with DX10 and 11 cards are on Vista or 7.
To be fair, it's not really all that worth it to write great drivers for an old OS that they really wouldn't expect their cards to be used on. That's one understandable decision.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
But it *isn't there* no matter what you or other elitists may feel :p
- and MS is ending support for Win2k and XP SP2 - please be thorough in reading your own links, which disprove what you wrote (but thanks):
:rolleyes:

MS is not dropping support for XP this year.

Take a look at what gamers use as the most popular OS in a very large monthly survey (April 2010):

http://store.steampowered.com/hwsurvey/



Ahead of Vista and Win 7. Don't you think Nvidia ought to write decent drivers for it? AMD does for their video cards.
- a huge chunk of NVIDIA's GF100 HW sales will run on XP
(GTX 480/470 is mostly run on Vista/Win7 and is a small part of their total overall sales).

XP is dying.
It lost 1.4% just last month.
Win 7 gained 1.49% the same month.

There is no use for a +DX9 card on XP.

In fact, only 21.55% of all systems have a +DX9 card on XP,
compared to 56.47% with a +DX9 card on Vista/Win7.
Not to mention the change to WDDM 1.0/1.1... again a place where XP will not go.

So XP is dying by +1% a month; it won't get +DX9, it won't get WDDM.
And normal support is dead for XP; you now only have extended support for XP SP3, the "I-am-a-dying-OS" support.

Wanna bet XP loses close to 2% next month?
And Win 7 gains more than XP loses?

Cut the rope and move on; there's no point in supporting dying, obsolete tech.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,971
126
Honestly though, even 9800Pro could more or less max out those games, unless you are into unnecessary 16/32AA settings.
There’s nothing unnecessary about it. You can raise edge AA until you’re blue in the face, but major parts of the scene won’t get any benefit unless you start super-sampling.

Super-sampling provides a huge increase in image quality in many games, and the highest levels are often only playable in older games. You aren’t going to be running 8xSSAA in Crysis, so it’s obvious the feature is for older games.
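The distinction between edge-only AA and super-sampling can be illustrated with a minimal software box-filter SSAA (a toy sketch; `shade` stands in for a pixel shader, and the edge scene is invented for the demo). Because every sub-sample is fully shaded, high-frequency shader and texture detail inside polygons is smoothed too, not just geometric edges:

```python
def supersample(shade, width, height, factor):
    """Software box-filter super-sampling (a toy SSAA sketch).

    `shade(x, y)` plays the role of the pixel shader; every one of the
    factor x factor sub-samples is fully shaded, so detail inside a
    polygon is smoothed, not just its silhouette edges.
    """
    image = []
    for py in range(height):
        row = []
        for px in range(width):
            total = 0.0
            for sy in range(factor):
                for sx in range(factor):
                    # evenly spaced sub-pixel sample positions
                    total += shade(px + (sx + 0.5) / factor,
                                   py + (sy + 0.5) / factor)
            row.append(total / (factor * factor))
        image.append(row)
    return image

# A hard transition at x = 1.5; the middle pixel averages to 0.5.
edge = lambda x, y: 1.0 if x < 1.5 else 0.0
img = supersample(edge, 3, 1, factor=4)
print(img)  # [[1.0, 0.5, 0.0]]
```

The same averaging applied to a shimmering alpha-tested texture instead of a hard edge is what calms the vegetation shimmer discussed below, something per-edge MSAA can never touch.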
Those FPS games ran so fast, you would hardly even notice any difference between 4 and 16AA.
Utter nonsense; Call of Duty 1 in particular has massively shimmering vegetation without super-sampling. The difference is as plain as day, and someone would have to be blind not to see it.

As for performance, going from 111.62 FPS (GTX285) to 68.79 FPS (GTX470) is extremely visible in a twitch shooter like UT2004, not to mention the alarming performance loss. And that’s only at 8xS.
I remember my Radeon 8500 would easily run COD1 and RTCW and Quake 3 and that's in 2002.
Nope, not if you want 2560x1600 with decent levels of AA:

http://techreport.com/articles.x/3203/18

The 8500 can’t even manage 4xSSAA at 1024x768 (last graph on the page).

I’m very familiar with these games because I have them under active play rotation, so I know exactly how they look and perform on modern hardware. There are a lot of people who enjoy playing older games with new hardware, and understand the benefits of increased image quality.
For this reason, I wouldn't hold anything against NV for not optimizing games that 99.99% of people no longer play.
I don’t expect explicit optimizations like new games get, but if the Radeon 5770 is outrunning the GTX470 (as it does in Doom 3) or the GTX470 is scoring half of the GTX285’s score, there’s an obvious issue that needs to be fixed. The GTX470’s Windows 7 scores in some games prove performance can be much better.

The GTX470 on Windows 7 (64 bit) is better than it is on XP, but it’s still vastly underperforming overall compared to the GTX285 on XP.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
32-bit OS with a 1.5GB $500 graphics card?
More accurately, 21.55% of systems with DX10 and DX11 cards (including older ones like the 8800 and 4800 series) are on XP, while 56.5% with DX10 and 11 cards are on Vista or 7.
To be fair, it's not really all that worth it to write great drivers for an old OS that they really wouldn't expect their cards to be used on. That's one understandable decision.

You still don't get it; Nvidia DOES care about XP and they DO expect their GF100 line to run on XP. :p

XP is still the OS used by more mainstream gamers than either Vista or Win 7. And you forget that the BULK of Nvidia's sales - GTX 460, GT 450, GT 440, GT 430, GT 420 and GT 410 - The sub-$300 down to $70 video cards - are to the mainstream buyers - not the elitists like you who forget there is more to life than $500 video cards.

The worst thing about the current GeForce drivers is that they ALSO SUCK for Vista and Win 7 - once you get outside the popular games and try to play older games with demanding settings. AMD graphics is miles ahead with their current drivers for XP and for older games.

Apparently, no one likes to hear the truth, do they?