NV: Everything under control. 512-Fermi may appear someday. Yields aren't under 20%


GaiaHunter

Diamond Member
Jul 13, 2008
3,732
432
126
Actually, the best thing NVIDIA did for competition was to remove the ability to render PhysX with a primary ATI card - it just means that whatever % of the market uses ATI GPUs can't run it.

That means that instead of consumers having to put two cards in their rig to play a game, one is enough, because PhysX effects are not worth the money.

That is why PhysX effects are, most of the time, limited and/or unspectacular.

They are so unspectacular that the only way for a PhysX game to show a difference is to make sure that, with PhysX off, basic effects commonly found in several other titles are simply absent.

An open-source software physics engine is nice, but will it become a standard for all of gaming like back in the "Half-Life 2" days? NVIDIA and Intel have their own physics engines, and the time may come when Intel makes a GPU (Larrabee) and uses the same method as NVIDIA's hardware PhysX. Game developers might be looking forward to using hardware instead of software PhysX. That leaves AMD/ATI out of the game....
Any GPU can run an open software physics engine, but only GPU-accelerated physics cards can run hardware physics.

You are somewhat confused.

There are "two kinds" of PhysX - the software effects library and the GPU-accelerated physics portion.


ATI GPUs can accelerate hardware physics - the effects only have to be written in an API they can use. ATI wants that API to be OpenCL, as opposed to the GPU-accelerated portion of PhysX.

By having a library to compete with the PhysX library, ATI GPUs (and NVIDIA GPUs too, btw) can accelerate Bullet effects using OpenCL.

So it is simply a question of having the code written in a language the GPU can use.
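To make it concrete, here is a minimal sketch (mine, with illustrative names - this is not Bullet or PhysX source) of what vendor-neutral GPU physics looks like: a toy particle-integration kernel in OpenCL, compiled by the driver at runtime for whatever GPU is installed. Nothing in it names ATI or NVIDIA.

#include <CL/cl.h>

// A toy "effects physics" step: explicit Euler integration of particles.
// The kernel is plain OpenCL C, held in a string and built at runtime.
static const char *kSource = R"CLC(
__kernel void integrate(__global float4 *pos, __global float4 *vel,
                        float4 gravity, float dt, uint n)
{
    uint i = get_global_id(0);
    if (i < n) {
        vel[i] += gravity * dt;  // accumulate acceleration
        pos[i] += vel[i] * dt;   // advance position
    }
}
)CLC";

int main()
{
    cl_platform_id platform;
    cl_device_id gpu;
    clGetPlatformIDs(1, &platform, NULL);        // first platform found,
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, // whichever vendor it is
                   1, &gpu, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &gpu, NULL, NULL, NULL);
    cl_program prog = clCreateProgramWithSource(ctx, 1, &kSource, NULL, NULL);
    clBuildProgram(prog, 1, &gpu, NULL, NULL, NULL); // compiled for *this* GPU
    cl_kernel kern = clCreateKernel(prog, "integrate", NULL);
    // ... create buffers, set kernel args, clEnqueueNDRangeKernel as usual ...
    clReleaseKernel(kern);
    clReleaseProgram(prog);
    clReleaseContext(ctx);
    return 0;
}

The same source runs on a Radeon or a GeForce; the lock-in with PhysX is the CUDA backend, not the physics itself.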
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Every single thing about every debate on every aspect of "NV vs ATI" is exactly why I really really really wanted Intel to produce a competitive or at least somewhat useful GPU in Larrabee.

If Intel had succeeded, it would have effectively shut up every idiot who thinks that closed standards are in any way beneficial. Sure, at the moment it's a choice between A or B, but if we get a third player (Intel) in as C, then the dynamic changes.

Instead of being locked into one of two vendors, your choice is further restricted by being locked into one of three. That means there are two-thirds of the products on the market you can't use if you want certain features. And what if the other two parties end up adding vendor-specific things? Then the situation goes back to how it was in the old days, when there were multiple APIs and multiple extensions, and some people could do some things while other people could do others.

Apparently that's what some posters on this forum see as beneficial, while others do not.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
They should let ATi know how they did it then, because ATi says it won't work.

What lie would that be? That ATi sells a part with stated functionality that only works if it is the primary display device? That they failed to make note of this issue on the box? That the only reason they wouldn't support it is that they don't want to deal with the driver work it would require?

For the 9600 AIW, it wasn't listed in all of the manuals made for the different parts (the AIW Pro springs to mind as one I owned that they wouldn't let work, and it wasn't noted anywhere in the manual that shipped with the board).

And ATi sold their parts as TV tuner/video capture parts.

http://www.nvidia.com/object/physx_faq.html

Again, the 9600 AIW was sold only as an AGP part. I know for a fact the PCI versions of the cards worked alongside a GeForce/ATI primary display as capture cards. It worked with my Voodoo2 when I got my Radeon AIW (based on the Rage Pro chipset) back in 2000.

That's how Gateway designed their Destination XTV PCs. The Radeon wasn't the primary render unit: the Voodoo2 was for gaming, the ATI Radeon for video/audio capture.

If you read the manual carefully, it only says that multiple displays require the ATI card to be the primary adapter. It doesn't state that its capture functions won't work.

And if you own an AGP version of the card, why would you want to use an IGP for the primary display? I don't recall motherboards with multiple AGP slots back in the day.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
What do the Radeon AiW and PhysX have to do with the topic?
NV: Everything under control. 512-Fermi may appear someday. Yields aren't under 20%

i have been away for a week and the discussion in this topic is really getting stupid.
- is there anything important in this topic that i missed?
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
What do the Radeon AiW and PhysX have to do with the topic?

i have been away for a week and the discussion in this topic is really getting stupid.
- is there anything important in this topic that i missed?
Why not go read it instead of whining? The thread has evolved into a very interesting discussion of proprietary GPU technologies. Everyone else, please continue.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
What do the Radeon AiW and PhysX have to do with the topic?

i have been away for a week and the discussion in this topic is really getting stupid.
- is there anything important in this topic that i missed?

That all started a few pages in, when some people decided to talk about the other features of NV, which invariably results in threads becoming a discussion of the merits and drawbacks of said features.
Said features typically seem to be mentioned only by certain people, who have been accused by others of having a significant bias towards a certain GPU manufacturer.
So blame them if anything.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
That all started a few pages in, when some people decided to talk about the other features of NV, which invariably results in threads becoming a discussion of the merits and drawbacks of said features.
Said features typically seem to be mentioned only by certain people, who have been accused by others of having a significant bias towards a certain GPU manufacturer.
So blame them if anything.

Yet "said features" have absolutely ZERO to do with the topic. :p

There is nothing "interesting" about reading religious-fervor-style diatribe repeated over and over with absolutely nothing new being injected into the debate.

There is no evidence of any "evolution of a thread"; rather, i see deliberate derailment of the thread into angry argument, with plenty of insinuations.
:thumbsdown:
 
Last edited:

waffleironhead

Diamond Member
Aug 10, 2005
7,124
623
136
Yet "said features" have absolutely ZERO to do with the topic. :p

There is nothing "interesting" about reading religious-fervor-style diatribe repeated over and over with absolutely nothing new being injected into the debate.

There is no evidence of any "evolution of a thread"; rather, i see deliberate derailment of the thread into angry argument, with plenty of insinuations.
:thumbsdown:

Is that why you are here instead of the forum at ABT? :awe:
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Is that why you are here instead of the forum at ABT? :awe:

i am there *also* right now :p
- it is not a choice of "either" ..

i checked in - as i do weekly - to retrieve my PMs, to comment on this thread's deterioration, and to answer a question about Eyefinity in another thread here.

Are you proud of the way this thread is going?
 

waffleironhead

Diamond Member
Aug 10, 2005
7,124
623
136
i am there *also* right now :p
- it is not a choice of "either" ..

i checked in to retrieve my PMs and to comment on this thread's deterioration.

Are you proud of the way it has gone?

I wouldn't say I'm proud of anything, but then again I don't have any vested interest. I too check into multiple forums; your post just reminded me of the nV strokefest that goes on over there.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
I wouldn't say I'm proud of anything, but then again I don't have any vested interest. I too check into multiple forums; your post just reminded me of the nV strokefest that goes on over there.

i wouldn't say that.

Keysplayr finally stopped posting over there and Rollo has gone back to politics :)

We just discuss different things.

Right now we are *ripping* on the generally sh!tty drivers for the GTX 4x0 that Nvidia has foisted on us.

Once you get away from the popular games, the GTX 470/480 looks just like the clusterfsck that came with Vista and the 8800 GTX .... with the same old tired Nvidia PR song and dance that "only new games matter".
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,732
432
126
There is nothing "interesting" about reading religious-fervor-style diatribe repeated over and over with absolutely nothing new being injected into the debate.

There is no evidence of any "evolution of a thread"; rather, i see deliberate derailment of the thread into angry argument, with plenty of insinuations.
:thumbsdown:

Maybe you should apply for a moderation spot.

Anyway, I don't know what you can expect this thread to evolve into.

It is about yields, and those are what they are - they suck at 40 nm, and they suck more for NVIDIA than for ATI.

The rest is some people saying NVIDIA's decisions are always right (and you can see that in the money they made in the past), others saying ATI's decisions are always right, some interested in absolute performance, and others interested in value for money.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
My PPU card stopped working after the latest PhysX System Software, 9.0 (plus some other number that I can't remember now). Before that update, I used to play games with hardware-accelerated PhysX just fine, like Mirror's Edge, and even Batman: AA with a hack, which is the only impressive PhysX game thus far. Dark Void's PhysX effects are disgusting and pathetic. Cryostasis's water effect was pathetic too; it looked like milk diluted in water and silicone.

Anyway, my PPU card is now a perfect paperweight, but PhysX won't be missed except in Batman: AA. If the PhysX market weren't so closed and proprietary - like Apple, for example - I'm very sure developers would do much better work with PhysX, because more systems would be capable of it. But you wouldn't want to stagnate game sales by locking out the more than 2/3 of the market share in the hands of Intel, with their powerful GPU solutions (sarcasm), and ATi.

So let's just accept nVidia's bribery, put in some simple PhysX effects to fool nVidia into thinking the job was done, and come out clean with pockets full of money!!
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Maybe you should apply for a moderation spot.

Anyway, I don't know what you can expect this thread to evolve into.

It is about yields, and those are what they are - they suck at 40 nm, and they suck more for NVIDIA than for ATI.

The rest is some people saying NVIDIA's decisions are always right (and you can see that in the money they made in the past), others saying ATI's decisions are always right, some interested in absolute performance, and others interested in value for money.

No thanks; Keysplayr and i were the first two video mods at ATF :p
- it is a lot of work, and i mostly did not post here for the months i was a mod

Well, yeah .. yields are what they are and they are obviously getting better as we see a lot more retail stock.

As to what this topic became, there is absolutely nothing new posted by either side over the last few pages
- it is all repetition and "why can't you get it through your thick skull" posts.
---- if you are going to rag on Nvidia, pick on their crap drivers for GF100

That is an easy target once you get outside a few games - PhysX is in only about 15 games, and very few people use a Radeon plus a GeForce for it.
otoh, Nvidia's poor Fermi drivers affect hundreds of games (so it seems)
 
Last edited:

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
So, are you saying that what ATI did was bad (and therefore what NV is doing is bad), or that what ATI did was acceptable and what NV is doing is also acceptable?

Both are understandable. As a consumer who spent almost $400 on an AIW Pro back in '98, I wasn't too happy about it, but it is the reality of the PC market. Sure, I would have liked to have it work, but at the end of the day it is asking for a rather odd configuration to be supported, and it shouldn't be shocking to anyone that the companies don't want to deal with it. I'm not saying that it is 'good' or 'bad' - it is business.

Also, mens rea is a very important element in law.

In the case law already established around this same issue, nV's intent was to bar customers from using a configuration they didn't approve of, which has been shown to be completely legal; furthermore, circumventing that limit has itself been deemed illegal.

I actually read the cached PhysX FAQ the first time this conversation happened, and they didn't add that requirement to the FAQ until they disabled PhysX w/AMD in the 186 patch.

My AIW Pro didn't say it wouldn't work as a secondary device either; they included that information at a later date on different parts.

The AIW parts stated in the manual that the card had to be the primary rendering unit to get full functionality.

Because everyone who ever bought an AIW bought the one example you could find a manual for that stated it? My manual didn't state it had to be used as a primary device. It worked with my V2s, but it wouldn't work with my AGP TNT/TNT2, and nothing in the manual explained why. Years later they may well have included that information with some other part, just as the PhysX information is available now. For the record, I would have just gotten a Matrox card if I had known it was going to be an issue.

I know for a fact the PCI versions of the cards worked alongside a GeForce/ATI primary display as capture cards.

Funny, I could never get it to work, and ATi explicitly told me that it wouldn't work either. I tried under Win95, Win98, Win98SE, WinME, WinNT and Win2K. It didn't work with any of them.

If you read the manual carefully, it only says that multiple displays require the ATI card to be the primary adapter. It doesn't state that its capture functions won't work.

My TV tuner and video capture hardware would only ever work if the board was the primary display adapter. I bought the PCI version of the card with the full intention of using it as a secondary card, as I liked some of the features of the AIW's tuner better than the ones offered by other companies (some of the "WebTV-ish" features were cool while they worked).

You never answered this even though I mentioned it in several posts. Tell me, how would you feel about AMD's business tactics if their next driver release disabled 3D capabilities and HTPC abilities when an Intel chipset/CPU was detected?

I'd view it as suicidal and stupid. Not immoral, not evil, not antitrust, or any of the other shockingly stupid things some loyalists around here think about corporations' business choices. I am not a 19-year-old babe without a clue about how the world works.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,732
432
126
- it is all repetition and "why can't you get it through your thick skull" posts.
---- if you are going to rag on Nvidia, pick on their crap drivers for GF100

That is an easy target once you get outside a few games - PhysX is in only about 15 games, and very few people use a Radeon plus a GeForce for it.
otoh, Nvidia's poor Fermi drivers affect hundreds of games (so it seems)

People who need to rag on NVIDIA (at least those who look for value for money and aren't interested in paying absurd premiums for the highest performance) can just rag on the GTX 480. Depending on where they live, they can also rag on the GTX 470.

And stuff like no AA in Batman: AA is much more worrying for a consumer than PhysX, although I understand why some dislike NVIDIA's stance of trying to make PhysX take off - we already know ATI isn't going to adopt PhysX, and NVIDIA doesn't have the strength atm (at least IMO) to push PhysX while leaving ATI out, so it is lost time when we could be getting better physics effects.

And I understand NVIDIA's stance - they lost the advantage (or at least a sizeable part of it) that they had just 3 years ago.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
I'd view it as suicidal and stupid. Not immoral, not evil, not antitrust, or any of the other shockingly stupid things some loyalists around here think about corporations' business choices. I am not a 19-year-old babe without a clue about how the world works.

So is it suicidal and stupid (for hardware PhysX) for NV to block hardware PhysX when they detect an ATI graphics card?
 

NoQuarter

Golden Member
Jan 1, 2001
1,006
0
76
Disabling PhysX (and NV gave their reasons for this, whether you agree with them or not) when an AMD card, and not an Nvidia card, is present as the graphics renderer isn't good from the perspective of a gamer with an AMD card as the primary renderer.

This is actually my main point of contention in the whole situation. It isn't good from the perspective of ANY gamer unless you're extremely short-sighted. I *want* GPU physics to do something cool, even if you have to buy a dedicated PPU for it. But devs will never use it for something cool if it's a segmented market! All we will see is it being used by the marketing guys at nVidia to swirl some damn leaves around. Any nVidia fan should want to see everyone able to use PhysX. ANY. It's the only way any of us would get to see it used in games.

And that's the problem with the situation. It doesn't bother me so much that nVidia can do what they want with their shit; it bothers me that they are ensuring their own feature will never be more than a bullet point that most don't give a shit about. They should have left Ageia alone and then we would have actually had market innovation and features able to gain ground in the industry.
 
Last edited:

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
So is it suicidal and stupid (for hardware PhysX) for NV to block hardware PhysX when they detect an ATI graphics card?

Certainly not suicidal. It is one thing to block ~80% of the market - what you are talking about amounts to what is likely less than 1% (actually, likely less than one tenth of one percent). Whether it's stupid or not depends on whether it drives more sales for them or loses sales, and how that relates to the costs associated with supporting the configuration down the road (in terms of additional funding required for driver development). Blocking 80% of your potential customer base is stupid, particularly when that same base is the one most likely to want your highest-margin items (right now Intel has a clear lead in gaming).

But devs will never use it for something cool if it's a segmented market!

So you are saying devs will never use DirectX 11? PhysX runs on far, far more systems than it does.
 

Dark Shroud

Golden Member
Mar 26, 2010
1,576
1
0
It was really stupid of nVidia to do that with PhysX. They further limited its use and actually blocked sales. I will no longer buy Nvidia cards because of this, and I know a few others who feel the same.

Not only that, but OpenCL will work on both Nvidia's & ATI's cards. Bullet Physics is using OpenCL, of course. And if Havok uses OpenCL, Nvidia will have a problem on their hands.

And to the people saying some of that garbage about what Nvidia did being OK, imagine if Intel did that and possibly took it further. If Intel releases a card and only lets Havok run with their brand of card, what if they took it further and blocked Havok from running on any non-Intel system?

This was really stupid because Nvidia is being pushed out of markets, their stock isn't doing well, & they have limited products right now. It's going to be months before they get a revision out; meanwhile, ATI has SI for their 6000 series due out later this year.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
They should have left Ageia alone and then we would have actually had market innovation and features able to gain ground in the industry.

Ageia wasn't going anywhere. It looks like they created themselves to be sold. Without Nvidia supporting them, PhysX would be dead. Nvidia bought them, ported their code over to CUDA, supported their product for three years, and then dropped support for the old PPU, as their GPUs are more efficient anyway.

What is happening with Havok?
- all i hear is the same empty promises, year after year .. i don't see much physics improvement over the Painkiller series :p

And stuff like no AA in Batman: AA is much more worrying for a consumer than PhysX
i can certainly force AA in Batman: AA from the ATi CP; it runs OK on my (single) HD 5870 at 2560x1600 with 4x or even 8x AA
 
Last edited:

NoQuarter

Golden Member
Jan 1, 2001
1,006
0
76
So you are saying devs will never use DirectX 11? PhysX runs on far, far more systems than it does.

They'll use DX11 because they know all cards will be able to support it, and because it's not too difficult to use it while still having the app run on DX10/DX9 hardware.
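For what it's worth, here is a minimal sketch (my example, not anyone's shipping code) of why that fallback is cheap: D3D11CreateDevice takes a list of feature levels and hands back a device at the best level the installed GPU supports, so one D3D11 codebase can still drive DX10- and DX9-class cards.

#include <d3d11.h>

// Ask for the best feature level the hardware can manage, newest first.
HRESULT CreateBestDevice(ID3D11Device **device,
                         ID3D11DeviceContext **context,
                         D3D_FEATURE_LEVEL *chosen)
{
    const D3D_FEATURE_LEVEL wanted[] = {
        D3D_FEATURE_LEVEL_11_0,  // full DX11 (tessellation, compute)
        D3D_FEATURE_LEVEL_10_1,
        D3D_FEATURE_LEVEL_10_0,
        D3D_FEATURE_LEVEL_9_3,   // DX9-class cards still get a device
    };
    // The runtime walks the list in order; the app then gates optional
    // effects on whatever level comes back in *chosen.
    return D3D11CreateDevice(NULL, D3D_DRIVER_TYPE_HARDWARE, NULL, 0,
                             wanted, (UINT)(sizeof(wanted) / sizeof(wanted[0])),
                             D3D11_SDK_VERSION, device, chosen, context);
}

One codebase, several classes of hardware - which is why the segmented-market problem bites PhysX much harder than it bites DX11.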

Ageia wasn't going anywhere. It looks like they created themselves to be sold. Without Nvidia supporting them, PhysX would be dead.

I disagree. They did make themselves ripe to be sold, but they were also very heavy on licensing. I don't doubt that they would have adapted the PhysX platform to CUDA and DirectCompute and then tried to license it to ATI/Nvidia/game devs, which would have been a much better situation for gamers.

It wasn't until after nVidia took over Ageia that the PhysX fees went away, because nVidia has the bucks to subsidize it.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
They'll use DX11 because they know all cards will be able to support it, and because it's not too difficult to use it while still having the app run on DX10/DX9 hardware.

What about the consoles? PC gaming is the smaller sibling in the gaming market, and outside of very few exceptions, PC-exclusive devs have gone away. Even if every PC gamer already had a DX11 part, PhysX would still run on more gaming systems. Devs don't live in a black-and-white world based on the idealism of GPUs, I can assure you of that.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
I disagree. They did make themselves ripe to be sold, but they were also very heavy on licensing. I don't doubt that they would have adapted the PhysX platform to CUDA and DirectCompute and then tried to license it to ATI/Nvidia/game devs, which would have been a much better situation for gamers.

It wasn't until after nVidia took over Ageia that the PhysX fees went away, because nVidia has the bucks to subsidize it.

Ageia was practically advertising themselves to be sold :p
- for a long time i thought their sole purpose for existing was to be purchased by one of the big 3.

They had gained almost no traction in the market before Nvidia promoted them. Wasn't there just one game with PhysX for the longest time? i remember the very few forum members here who had PPUs (they would keep talking about their demos and the one or two games that utilized them; the rest of us were only mildly interested).

Everyone realizes that they need PHYSICS to make a game more realistic; AMD and Intel are promoting Havok (which apparently isn't going anywhere either), so Nvidia took their own approach.

If you really like PhysX/CUDA, i figure you like Nvidia.
:D