DX11 questionnaire from nVidia


Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: Creig
Originally posted by: Keysplayr
Anyone else who tried it and wasn't impressed is just blowing biased smoke. It IS impressive and makes you want to replay or try every game you have, old and new.
Isn't that a matter of personal opinion? Not everybody is necessarily going to be impressed by the exact same things as you.

I suppose you're correct. Not everybody is impressed by Eyefinity either. So I can see what you mean.

Originally posted by: Keysplayr
PhysX isn't going anywhere guys/gals.
At least we can agree on one thing. PhysX isn't going anywhere at the moment, especially now that they've locked out a good percentage of their potential audience.

I meant that PhysX isn't going away. But you knew that.

Originally posted by: Keysplayr
That Porsche/diesel analogy was cute.
And was totally accurate.

But it was said that a Porsche can't run on diesel, while a CPU can still run PhysX. Not accurate. But I see what you "could" mean.


Originally posted by: Keysplayr
But to all those who heard the comment about a GTS250 being faster than a 5870, didn't realize it was meant to be understood in the context of PhysX titles, and instead screamed bullshit? hehe. Well, it isn't bullshit. I know you all know that it's true when it comes to PhysX games. Can't deny it. Captain Obvious statements do not make the situation false.
No, but it doesn't make Ujesh Desai's claim any less laughable, either.

But you'd be laughing at a true statement, no matter how it was worded. Granted, there aren't that many PhysX games out yet, but it IS true that a GTS250 will outpace a 5870 when it comes to PhysX titles meant to run on a GPU. That is a much more accurate way to describe the situation. In all other games, a 5870 would lay waste to a GTS250.

Originally posted by: Keysplayr
And AA in Batman. It seems pretty obvious to me that AMD users wanted to use AA in Batman. It obviously means something to them. AMD didn't seem to want to do the same thing Nvidia did with Eidos. Has anyone written AMD to ask them why they didn't?
What, you mean pay developers to artificially cut features from their competitor's cards? Nvidia seems to be the only one around sinking that low. Besides, who would want that kind of bad publicity once word got out to the gaming community?

You can't cut what was never there to begin with. Do you disagree?


Originally posted by: Keysplayr
Sure, you can get AA by either forcing it in the CCC or changing an ID, but why should you have to? But no, blame is shifted immediately to the bigger company for not enabling it for their smaller competition. WTF? :D
They didn't have to "enable" anything, they chose to "disable" it. Big difference.

"it" being the key word. There wouldn't even be an "it" if Nvidia didn't implement it. Chicken or egg anyone?

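For reference, the "changing an ID" workaround mentioned above works because the game appears to gate its in-game AA option on the adapter's reported PCI vendor ID. Below is a purely hypothetical C++ sketch of what such a gate could look like under D3D9 (the API Batman:AA shipped on); the game's actual code has never been published, so the function name and logic are illustrative only.

#include <windows.h>
#include <d3d9.h>

// Hypothetical sketch only -- not Eidos's actual code. PCI vendor IDs
// are public: 0x10DE is NVIDIA, 0x1002 is ATI/AMD.
static const DWORD VENDOR_NVIDIA = 0x10DE;

bool AllowInGameMsaa(IDirect3D9* d3d)
{
    D3DADAPTER_IDENTIFIER9 ident = {};
    if (FAILED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &ident)))
        return false;
    // Every non-NVIDIA vendor gets the AA option greyed out. Spoofing the
    // reported ID, or forcing AA in the driver control panel, sidesteps
    // a check like this entirely -- which is exactly the workaround
    // described above.
    return ident.VendorId == VENDOR_NVIDIA;
}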

Originally posted by: Keysplayr
I'm very surprised that most of you do not think Nvidia has anything at all worthwhile feature-wise over AMD's offerings. That just doesn't make sense.
Eyefinity is pretty cool for anything but gaming IMHO. DX11 isn't really a feature advantage for AMD, as Fermi is DX11 as well. And Stream? Compared to CUDA? Please guys, really?
Maybe people are just getting fed up with all the BS Nvidia has been pulling lately.

People are getting fed up on both sides. I don't particularly care for PR responses from companies.

 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Originally posted by: SlowSpyder Yea, maybe AMD should have taken a page out of Nvidia's book regarding their 8800 -> 9800 transition. Now that's innovation! :thumbsup: :D


They did, and the two transitions were very similar, actually!

When ATI moved to the RV670, they offered similar performance to the R600 but at much lower price points.

When nVidia moved to the G92, they offered similar performance to the G80 but at much lower price points.

 

Qbah

Diamond Member
Oct 18, 2005
3,754
10
81
I can see Keys replied to my post.... but where is it? I can see bits and pieces in quotes... but don't see his original post. Where is it?

From the few things:

- the PhysX and Porsche/Ford comparison is accurate. PhysX in Batman does what on CPU? 1FPS? 2FPS? Might as well not run it at all (you could also get out of the Porsche and push it to "race" the Ford once you fill it up with diesel). Why even bother with such a claim in an OFFICIAL statement? You can say an HD3450 runs HAWX or Stalker:CS faster than a GTX295 using DX10.1. True? Yes. But I would be the first one to ridicule such a statement and the person saying it.

- as for AA in Batman? The AMD tech tweeted that they sent the AA code to the devs, but it was ignored in the end - believe it or not, it's your choice. And to me it looks like nVidia paid for the lockout. They paid the devs to implement a standard feature that both companies already paid big bucks to have used generally. So now they have to additionally pay for something they earlier helped establish as a standard? Now if this AA were somehow made to run through CUDA or whatever (not really a good example, but as a concept) - great. No problem there, as it's nVidia-specific. MSAA is not!

- the HD5870 is a DX11 card; Fermi WILL be DX11 when it arrives. No games use DX11 now anyway, so it's a moot point, but we also have no idea when Fermi will be available - some rumors push it to early 2010, and since there's no official word from nVidia, that's all we've got. ATi has a card capable of running 3 monitors - and you can span your games across those monitors. With additional features for such a scenario incoming (like compensating for bezel thickness, which will be almost perfect by then imo). So the only thing nVidia has is PhysX - gaming-wise. Which has one blockbuster title - Batman.

I can see why nVidia would lock PhysX out in an ATi combo scenario. They probably had leaks about HD58xx performance. An HD5870 + a cheap'o 8800GT and you've got yourself a great rig that can play anything on the market at crazy speeds. So with Fermi being so far away, that was their only choice to keep the GTX series selling. Hell, even I could see myself trying such a thing (an 8800GT is what? an intense night out in the city here...) just to give PhysX a spin.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
I tend to look at it this way:

If PhysX or improved physics isn't important to you at all and you see no benefit -- then it isn't worth discussing. All the negativity and cheap shots must mean that there is at least something that may be important to some, and maybe to more mind-sets down the road.

Some may disagree, but the key to me is that nVidia is spending resources and putting their money where their mouth is. They talk about it -- they don't STFU about it -- and it seems to me they believe physics is the next big thing.

Say what you will -- they're trying to push it forward. Some may disagree with how they're doing it, but trying to improve physics is a good thing. Eventually it creates more awareness, slowly works its way into more gaming titles, and it will be wonderful when all the players are on the same page.

Physics is a lightning-rod topic... and all the attention, negative or positive, really just helps bring awareness to physics as a whole.



 

waffleironhead

Diamond Member
Aug 10, 2005
6,917
429
136
Originally posted by: SirPauly
Originally posted by: SlowSpyder Yea, maybe AMD should have taken a page out of Nvidia's book regarding their 8800 -> 9800 transition. Now that's innovation! :thumbsup: :D


They did, and the two transitions were very similar, actually!

When ATI moved to the RV670, they offered similar performance to the R600 but at much lower price points.

When nVidia moved to the G92, they offered similar performance to the G80 but at much lower price points.

I think you missed the sarcasm...
 

waffleironhead

Diamond Member
Aug 10, 2005
6,917
429
136
Originally posted by: SirPauly
I tend to look at it this way:

If PhysX or improved physics isn't important to you at all and you see no benefit -- then it isn't worth discussing. All the negativity and cheap shots must mean that there is at least something that may be important to some, and maybe to more mind-sets down the road.

Some may disagree, but the key to me is that nVidia is spending resources and putting their money where their mouth is. They talk about it -- they don't STFU about it -- and it seems to me they believe physics is the next big thing.

Say what you will -- they're trying to push it forward. Some may disagree with how they're doing it, but trying to improve physics is a good thing. Eventually it creates more awareness, slowly works its way into more gaming titles, and it will be wonderful when all the players are on the same page.

Physics is a lightning-rod topic... and all the attention, negative or positive, really just helps bring awareness to physics as a whole.

FWIW, I don't think anyone is arguing that PhysX is worthless; they are just upset at the implementation restrictions placed on it.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Originally posted by: waffleironhead
Originally posted by: SirPauly
Originally posted by: SlowSpyder Yea, maybe AMD should have taken a page out of Nvidia's book regarding their 8800 -> 9800 transition. Now that's innovation! :thumbsup: :D


They did, and the two transitions were very similar, actually!

When ATI moved to the RV670, they offered similar performance to the R600 but at much lower price points.

When nVidia moved to the G92, they offered similar performance to the G80 but at much lower price points.

I think you missed the sarcasm...



Hehe, it was meant to be sarcastic but his sarcastic view had logic -- which I had fun pointing out.



 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Originally posted by: waffleironhead
Originally posted by: SirPauly
I tend to look at it this way:

If PhysX or improved physics isn't important to you at all and you see no benefit -- then it isn't worth discussing. All the negativity and cheap shots must mean that there is at least something that may be important to some, and maybe to more mind-sets down the road.

Some may disagree, but the key to me is that nVidia is spending resources and putting their money where their mouth is. They talk about it -- they don't STFU about it -- and it seems to me they believe physics is the next big thing.

Say what you will -- they're trying to push it forward. Some may disagree with how they're doing it, but trying to improve physics is a good thing. Eventually it creates more awareness, slowly works its way into more gaming titles, and it will be wonderful when all the players are on the same page.

Physics is a lightning-rod topic... and all the attention, negative or positive, really just helps bring awareness to physics as a whole.

FWIW, I don't think anyone is arguing that PhysX is worthless; they are just upset at the implementation restrictions placed on it.

And I don't disagree -- there are ways to get a point across: some try it rationally, some do it sarcastically, and some offer personal attacks and cheap shots. Personally, I like rational discussions! :)


 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
Originally posted by: Keysplayr
I'm very surprised that most of you do not think Nvidia has anything at all worthwhile feature-wise over AMD's offerings. That just doesn't make sense.
Eyefinity is pretty cool for anything but gaming IMHO. DX11 isn't really a feature advantage for AMD, as Fermi is DX11 as well. And Stream? Compared to CUDA? Please guys, really?

What kind of features does Nvidia have that ATI doesn't that are worthwhile? Why doesn't it make sense? At every single price point except arguably the top, you can get a better card than Nvidia's.

Let's take a walk to Newegg. I will arrange the cards from cheapest to most expensive.

Radeon HD 4850 $99.99
Geforce GTS 250 $109.99
Radeon HD 4870 512MB $124.99
Radeon HD 4870 1GB $144.99
Geforce GTX 260 Core 216 $164.99
Radeon HD 4890 $184.99
Geforce GTX 275 $219.99
Radeon HD 5850 $259.99
Geforce GTX 285 $319.99
Radeon HD 5870 $379.99
Geforce GTX 295 $464.99

Now I'll put them in order of performance from fastest to slowest (a "~" pairs cards that perform roughly the same):

Radeon HD 5870 $379.99~Geforce GTX 295 $464.99
Radeon HD 5850 $259.99
Geforce GTX 285 $319.99
Radeon HD 4890 $184.99~Geforce GTX 275 $219.99
Radeon HD 4870 1GB $144.99~Geforce GTX 260 Core 216 $164.99
Radeon HD 4870 512MB $124.99
Radeon HD 4850 $99.99
Geforce GTS 250 $109.99

So if you care about performance and price, there isn't really a reason to buy an Nvidia card. Most people would also argue that Nvidia offers no worthwhile features over ATI that would in any way curb the sting of that price and performance difference across the board.

You keep saying 3D Vision, but people don't care about it. You might, but the majority does not. People don't care about PhysX. You might, but the majority does not. CUDA is hardly an advantage over Stream. There's absolutely nothing on the hardware level that CUDA can do that Stream can not. It's merely a lack of developer support compared with CUDA at this point in time and nothing more. That isn't much of a problem, though, because CUDA support on the software side is still so limited that just about everybody ignores it. When it comes to two things that aren't really used by anybody, it's not the most impressive thing in the world to say one is far better than the other. When non-gaming GPU-accelerated applications become popular, I doubt Nvidia will hold any advantage whatsoever in the field. There's no reason to think that they would.

Fermi isn't out yet. It probably will not be out for quite a few months still. DX11 is a feature advantage of ATI's.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
LOL @ Eyefinity being anything more than a gimmick. Matrox has provided this for years. It even works with current cards and monitors. NVIDIA has had this on their Quadro line for a while as well.

Besides, there are numerous people here who will swear up and down how much of an image quality difference there is between 4xAA and 8xAA. For those same people to say that a big fricking monitor bezel right in the middle of their game is not a problem... OK.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Wreckage
It's why they dominated last generation and probably why they will continue this trend. I think what they said is backed by a lot of facts, instead of the usual biased opinion that gets thrown around here.

Hahahaha! Do you actually believe the BS coming out of your mouth? It's just crap used as a premise for more crap.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: Keysplayr
I still don't see the difference between a DirectX solution and a Nvidia specific solution.
Nvidia did the work regardless of either situation. You feel that if it was a DirectX standard that it should be enabled across all hardware. Why?

That is actually the whole point of a standard API like DirectX. It doesn't matter who does or pays for the work. The fact of the matter is that if Eidos puts out a game that adheres to the DirectX standard, then any DirectX features should work with any hardware that is compliant with the same version of DX. Once we have a situation where standards can be modified on an ad hoc basis for the highest bidder, you no longer have a standard.

IMO, MS should look into this, and if NV/Eidos did in fact gerrymander Direct3D in Batman:AA, they should pull DirectX sanctioning from the game unless it is patched. I'm not sure if MS has a DirectX compliance team that monitors that sort of thing (they probably never thought they would need one), but IMO selective implementation of standards like this dilutes what DirectX means to the consumer and in turn hurts the brand for MS.

If this type of thing becomes commonplace with AAA titles, "Games for Windows" and "DirectX" will mean nothing, and people will look for TWIMTBP on the box instead. Although I bet Batman:AA on the 360 outsold the PC version like 5:1 anyway, so it's not as if MS really gives a rat's ass.

PhysX is a different story. While I disagree with some of the choices that NV has made with regards to PhysX, they have the right to make them. IMO, their recent moves with PhysX highlight the importance of standard APIs like DirectX, OpenGL, etc.
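To make the standards point concrete: requesting MSAA through Direct3D involves only vendor-neutral calls. Below is a minimal C++ sketch under D3D9 (the API version Batman:AA shipped on) -- an illustration, not the game's code. Nothing in it can tell, or needs to know, whose GPU is underneath, which is why a standards-compliant implementation would run on both vendors' cards.

#include <windows.h>
#include <d3d9.h>

// Minimal sketch: asking Direct3D 9 for 4x MSAA the standard way. Any
// driver that reports support here is expected to honor it, regardless
// of the GPU vendor.
bool Msaa4xSupported(IDirect3D9* d3d, D3DFORMAT backBufferFormat)
{
    DWORD qualityLevels = 0;
    return SUCCEEDED(d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, backBufferFormat,
        TRUE /* windowed */, D3DMULTISAMPLE_4_SAMPLES, &qualityLevels));
}

void EnableMsaa4x(D3DPRESENT_PARAMETERS& pp)
{
    pp.SwapEffect         = D3DSWAPEFFECT_DISCARD; // MSAA requires DISCARD
    pp.MultiSampleType    = D3DMULTISAMPLE_4_SAMPLES;
    pp.MultiSampleQuality = 0;
}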
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: nitromullet


That is actually the whole point of a standard API like DirectX. It doesn't matter who does or pays for the work. The fact of the matter is that if Eidos puts out a game that adheres to the DirectX standard, then any DirectX features should work with any hardware that is compliant with the same version of DX. Once we have a situation where standards can be modified on an ad hoc basis for the highest bidder, you no longer have a standard.

This has nothing to do with DirectX. The Unreal Engine does not natively support anti-aliasing.

NVIDIA worked with the developer to implement it. AMD did not. AMD has failed its customers. They are the ones to blame.
 

WelshBloke

Lifer
Jan 12, 2005
30,331
7,987
136
Originally posted by: Wreckage
Originally posted by: nitromullet


That is actually the whole point of a standard API like DirectX. It doesn't matter who does or pays for the work. The fact of the matter is that if Eidos puts out a game that adheres to the DirectX standard, then any DirectX features should work with any hardware that is compliant with the same version of DX. Once we have a situation where standards can be modified on an ad hoc basis for the highest bidder, you no longer have a standard.

This has nothing to do with DirectX. The Unreal Engine does not natively support anti-aliasing.

NVIDIA worked with the developer to implement it. AMD did not. AMD has failed its customers. They are the ones to blame.

I'll copy and paste myself as I can't be arsed to look it up again.

According to Ian McNaughton (AMD/ATI senior product guy), they did provide the devs with help, but it wasn't put in the final product. He hints that this is because it was a TWIMTBP title.

Whether you believe him is up to you I guess.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
Originally posted by: Wreckage
LOL @ eyefininty being anything more than a gimmick.

And that fake 3D crap from Nvidia isn't a gimmick?

Fanboys are fun!

 

error8

Diamond Member
Nov 28, 2007
3,204
0
76
Originally posted by: Wreckage
LOL @ Eyefinity being anything more than a gimmick. Matrox has provided this for years. It even works with current cards and monitors. NVIDIA has had this on their Quadro line for a while as well.

Well yeah, LOL @ the Eyefinity "gimmick", since most surely 99% of people will not use it. But we can also LOL at CUDA since, again, 99% of us video card buyers only play games and don't care about what CUDA can or can't do. LOL at 3D Vision, since again it's not being used by us and the supporting games are scarce. LOL at DX11, since no game is using it at this time, and so on...

You see, all these features that both companies poke our eyes with are, for most of us, unimportant. You are one of the few people who really fall for these very "important" features that Nvidia has. We mostly care about the price/performance ratio and less about unused features.
 

ClownPuncher

Junior Member
Mar 31, 2009
5
0
0
Having used 3D Vision for the better part of 2 days, I can say that it is cool. I can also say that it made my eyes tired and gave me a headache at the end of the day. I can also say that the ONLY game that makes impressive use of PhysX is Batman.

I will be buying a 3rd monitor for Eyefinity; I just don't mind the bezels one bit. I recently picked up an XFX 5850. The initial plan was to use my old G92 8800 GTS for PhysX, but since the 186.xx drivers kinda ruined that plan, it is now on eBay.

I have very little use for CUDA or Stream. What is useful to me is that I paid $259 for a card that smokes a $310 285. Sure, nVidia had multi-monitor support on their Quadro series, but unless you can afford a Quadro... who gives two squirts of piss? Matrox, yep, they had it too, but there is a cost associated with that in addition to still needing a video card.

When I hear the argument "Fermi has that", it makes me smack my forehead. Fermi has nothing; it is an intangible product. There are 3 important things to remember when talking about Fermi: past, present, and future tense. I know the HD 6870 will also have a fusion reactor... but since I can't buy one yet... nobody cares.

The product that AMD has out now is superior to the nVidia product in most ways. It is very possible the Fermi architecture will change those roles in several months, but today it is undeniable that the price/performance crown is not colored green.

P.S. Keys... you should ban yourself from using the IMHO phrase. If you have to state that your opinion is "humble", there is a 99.9% chance that it is not ;)
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: WelshBloke

According to Ian McNaughton (AMD/ATI senior product guy), they did provide the devs with help, but it wasn't put in the final product. He hints that this is because it was a TWIMTBP title.

In his "blog" post he mentioned nothing about doing any work on Batman. Unless you have an updated link.

Also you left out that Ian is their Marketing guy. Funny how you left that smidgen of information out of his title.
 

WelshBloke

Lifer
Jan 12, 2005
30,331
7,987
136
Originally posted by: Wreckage
Originally posted by: WelshBloke

According to Ian McNaughton (AMD/ATI senior product guy), they did provide the devs with help, but it wasn't put in the final product. He hints that this is because it was a TWIMTBP title.

In his "blog" post he mentioned nothing about doing any work on Batman. Unless you have an updated link.

Also you left out that Ian is their Marketing guy. Funny how you left that smidgen of information out of his title.

From Hard forums

I didn't say he was marketing because I didn't know. :confused:

You seem to know an awful lot about AMD employees.

Anyway, it seems to be just marketing on both sides, from what I can see.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: Wreckage

This has nothing to do with DirectX. The Unreal Engine does not natively support Ant-aliasing.

It has everything to do with DirectX. If Eidos and NV made AA work with DirectX calls, then this should work on all DirectX compliant hardware. It doesn't matter if the engine originally supported it or not. The game supports AA now, and as a DirectX title it should work on all DirectX compliant hardware.

NVIDIA worked with the developer to implement it. AMD did not. AMD has failed its customers. They are the ones to blame.

Both NV and ATI contribute to the DirectX standard, so they both invested in the work. Honestly, both NV and ATI should take a stance against developers not implementing standard features like AA in games, instead of giving them money and support for putting out a sub-standard product.

I'm not sure why a gaming environment where features are added/omitted on specific hardware at the request of the highest bidder is attractive to some. If games start to become coded for ATI or NV cards specifically, not only does this hurt us as gamers now, it also will make it really difficult for a third player to compete in the market.
 

Schmide

Diamond Member
Mar 7, 2002
5,581
712
126
Originally posted by: Keysplayr
Look guys, you all seem way too angry. Are you going to calm down or not?

That's just lame. I'm not angry and I'm sure most here are not. Maybe a bit disappointed that your argument has been reduced to talking points and emotional retorts.

Every time I give a long-winded explanation of why standards matter, you ignore me and poke others who may actually produce an emotional response. If you're here to discuss the nuances, let's go; if you want to just convey the Eidos/nVidia PR line, start a blog.

Edit: torts retorts same thing.
 

thilanliyan

Lifer
Jun 21, 2005
11,848
2,051
126
Originally posted by: Qbah
I can see why nVidia would lock PhysX out in an ATi combo scenario. They probably had leaks about HD58xx performance. An HD5870 + a cheap'o 8800GT and you've got yourself a great rig that can play anything on the market at crazy speeds. So with Fermi being so far away, that was their only choice to keep the GTX series selling. Hell, even I could see myself trying such a thing (an 8800GT is what? an intense night out in the city here...) just to give PhysX a spin.

PhysX in Batman:AA is not that impressive IMO. I've already tried it with my 4870 + 8800GT. I have Batman on my PS3, and I played through that and didn't notice any effects from the PC demo that were missing (i.e., I didn't notice them in the actual gameplay). I think the PhysX in Cryostasis is a much more integral and impressive part of the game, with all the ice/water effects. I don't think PhysX will take off as long as it is restricted to nVidia.
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
Originally posted by: Wreckage
Originally posted by: WelshBloke

According to Ian McNaughton (AMD/ATI senior product guy), they did provide the devs with help, but it wasn't put in the final product. He hints that this is because it was a TWIMTBP title.

In his "blog" post he mentioned nothing about doing any work on Batman. Unless you have an updated link.

Also you left out that Ian is their Marketing guy. Funny how you left that smidgen of information out of his title.

You are criticizing someone for using a marketing guy as a source? Really? You?