DX11 questionnaire from nVidia


evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: Qbah
I can see Keys replied to my post.... but where is it? I can see bits and pieces in quotes... but don't see his original post. Where is it?

A few things:

- the PhysX and Porsche/Ford comparison is accurate. PhysX in Batman does what on the CPU? 1 FPS? 2 FPS? Might as well not run it at all (you can also get out of the Porsche and push it so it "races" the Ford, once you fill it up with diesel). Why even bother with such a claim in an OFFICIAL statement? You could say an HD3450 runs HAWX or Stalker: CS faster than a GTX295 using DX10.1. True? Yes. But I would be the first one to ridicule such a statement and the person saying it.

Something I wanted to add to the mix regarding PhysX: games which use GPU PhysX, like Batman: AA and Mirror's Edge, are intentionally made single threaded, so when PhysX is activated with an ATi card it runs like crap. Both games only use one core when PhysX is activated and the FPS drops dramatically. I'm pretty sure that if PhysX used multi-threading, the performance drop would be less dramatic than it currently is. My quad core doesn't even ramp up to full speed when I play Mirror's Edge with PhysX on. What a waste of computing power...
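
To show what I mean, here is a rough, hypothetical sketch in plain C++ (just an illustration of the threading idea, not the PhysX SDK or code from either game): one physics step over a pile of independent debris particles, split across every available core with std::thread. A single-threaded CPU fallback would run the same inner loop on one core and leave the rest of a quad core idle, which is exactly the behaviour I'm describing.

// Hypothetical sketch (not the PhysX SDK): splitting a CPU physics step over
// many independent debris particles across all hardware threads.
#include <algorithm>
#include <cstddef>
#include <functional>
#include <thread>
#include <vector>

struct Particle {
    float x, y, z;    // position
    float vx, vy, vz; // velocity
};

// Integrate one contiguous slice of particles. Slices are independent,
// so each worker thread can run without any locking.
static void integrateSlice(std::vector<Particle>& p, size_t begin, size_t end, float dt) {
    for (size_t i = begin; i < end; ++i) {
        p[i].vy -= 9.81f * dt;          // gravity
        p[i].x  += p[i].vx * dt;
        p[i].y  += p[i].vy * dt;
        p[i].z  += p[i].vz * dt;
    }
}

// One simulation step spread across all available cores.
void stepParticles(std::vector<Particle>& particles, float dt) {
    const size_t workers = std::max(1u, std::thread::hardware_concurrency());
    const size_t chunk   = (particles.size() + workers - 1) / workers;

    std::vector<std::thread> pool;
    for (size_t w = 0; w < workers; ++w) {
        size_t begin = w * chunk;
        size_t end   = std::min(begin + chunk, particles.size());
        if (begin >= end) break;
        pool.emplace_back(integrateSlice, std::ref(particles), begin, end, dt);
    }
    for (auto& t : pool) t.join();
}

int main() {
    std::vector<Particle> debris(100000);
    for (int frame = 0; frame < 60; ++frame)
        stepParticles(debris, 1.0f / 60.0f);
}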

Originally posted by: Wreckage

This has nothing to do with DirectX. The Unreal Engine does not natively support anti-aliasing.

NVIDIA worked with the developer to implement it. AMD did not. AMD has failed its customers. They are the ones to blame.

It does support anti-aliasing in DX10 mode; deferred rendering can't support MSAA in DX9 because the API doesn't let shaders read the individual sub-samples of a multisampled render target. nVidia did fail its customers by not providing support for future features like DX10.1 or DX11 on time, rehashing the G92 again and again and thinking that all customers are a bunch of bumbling fools. Heck, nVidia will now release a DX10.1 card; what's the point? nVidia is almost a year behind ATi in technology and feature set.
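
Here is a tiny conceptual sketch of why that matters (plain C++ with made-up numbers, not real D3D code): with per-sample access, a deferred renderer can light each G-buffer sample and then average the results; if the multisampled G-buffer has to be resolved (averaged) before the lighting pass, an edge pixel gets lit with a meaningless averaged normal.

// Conceptual CPU sketch of deferred shading + MSAA: a 2x-MSAA edge pixel
// whose two samples hit surfaces with opposite normals.
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Simple N.L diffuse term, clamped to zero.
static float lambert(Vec3 n, Vec3 l) { return std::fmax(0.0f, dot(n, l)); }

int main() {
    Vec3 n0 = { 0.0f, 1.0f, 0.0f };    // sample 0: faces the light
    Vec3 n1 = { 0.0f, -1.0f, 0.0f };   // sample 1: faces away
    Vec3 light = { 0.0f, 1.0f, 0.0f };

    // DX10-style path: shade each G-buffer sample, then average (correct).
    float perSample = 0.5f * (lambert(n0, light) + lambert(n1, light)); // = 0.5

    // DX9-style path: the G-buffer can only be resolved (averaged) before
    // lighting, so we shade an averaged normal (wrong).
    Vec3 avg = { 0.0f, 0.0f, 0.0f }; // (n0 + n1) / 2: the normals cancel out
    float resolvedFirst = lambert(avg, light); // = 0.0, the edge goes dark

    std::printf("light per sample: %.2f   resolve then light: %.2f\n",
                perSample, resolvedFirst);
}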
 

Kuzi

Senior member
Sep 16, 2007
572
0
0
That Questionnaire was hilarious, let me see what I got from it:

Q1)What do u think of DirectX 11?
You know, PhysX and 3D Vision rock. Batman Arkham Asylum physics FTW!

Q2)What has PhysX got over Havok?
iPhone PhysX is really something. Did I mention PhysX is sexy?

Q3)Is Fermi better than two HD5870 in Crossfire?
Sure Fermi is faster, even a Geforce4 MX wins here. After all, the MX has AGP 8X support, the HD5870 doesn't.

Q4)Can you release drivers on a monthly basis?
Our notebook drivers support CUDA and PhysX, and best sellers such as Batman Arkham Asylum.
 

akugami

Diamond Member
Feb 14, 2005
6,210
2,551
136
Originally posted by: Keysplayr
PhysX isn't going anywhere guys/gals. Batman:AA is the best PhysX title out yet.

The question is, would Batman: AA be just as good without the PhysX lock-in? Would the effects that were supposedly "only possible on PhysX" be possible on video cards from other companies?

I'm going to say yes.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Man, I haven't seen his name in a while. nVidia really should keep a muzzle on Brian; he always sounds like such an idiot, particularly when talking to enthusiast sites.

Would the effects that were supposedly "only possible on PhysX" be possible on video cards from other companies?

What do you mean, on video cards? In terms of rendering the graphics, easily, and you can turn them on with an ATi card if you want to, no problem. The issue will then be your CPU.

If you mean in terms of whether other video cards could process the same level of physics, we can't really say for sure one way or the other. ATi is pretty adamant that they don't want those kinds of effects in games yet; they have been fighting hard against that sort of progress for a while now.
 

Qbah

Diamond Member
Oct 18, 2005
3,754
10
81
Originally posted by: thilanliyan
PhysX in Batman: AA is not that impressive IMO. I've already tried it with my 4870 + 8800GT. I have Batman on my PS3 and I played through that, and I didn't notice any effects from the PC demo that were missing (i.e. I didn't notice them in the actual gameplay). I think PhysX in Cryostasis is a much more integral and impressive part of the game, with all the ice/water effects. I don't think PhysX will take off as long as it is restricted to nVidia.

I was more trying to make the point that by combining an HD5870 and an 8800GT, one would be able to play anything imaginable and available on the market at crazy settings :) By introducing this artificial block, nVidia wanted people to buy their cards instead - which are slower and more expensive.

Originally posted by: evolucion8
Something I wanted to add to the mix regarding PhysX: games which use GPU PhysX, like Batman: AA and Mirror's Edge, are intentionally made single threaded, so when PhysX is activated with an ATi card it runs like crap. Both games only use one core when PhysX is activated and the FPS drops dramatically. I'm pretty sure that if PhysX used multi-threading, the performance drop would be less dramatic than it currently is. My quad core doesn't even ramp up to full speed when I play Mirror's Edge with PhysX on. What a waste of computing power...

I did write a small paragraph about how PhysX is heavily unoptimized when running on the CPU (single-threaded only, for example), but it wasn't relevant to the point I was trying to make - namely that saying a GTS250 runs physics in Batman faster than an HD5870 is misleading. So I removed it in the end. The comparison isn't true, as the Radeon isn't the one running PhysX; it's the CPU.

To sum up, the whole "interview" was the nV PR guy dodging answers by saying how PhysX and 3D Vision are awesome and so much better, and that nVidia isn't pushing their solutions but industry standards ;) And this was an official response from nVidia, which in general makes the points made by that guy a total joke.
 

thilanliyan

Lifer
Jun 21, 2005
12,033
2,246
126
Originally posted by: Kuzi
That Questionnaire was hilarious, let me see what I got from it:

Q1)What do u think of DirectX 11?
You know, PhysX and 3D Vision rock. Batman Arkham Asylum physics FTW!

Q2)What has PhysX got over Havok?
iPhone PhysX is really something. Did I mention PhysX is sexy?

Q3)Is Fermi better than two HD5870 in Crossfire?
Sure Fermi is faster, even a Geforce4 MX wins here. After all, the MX has AGP 8X support, the HD5870 doesn't.

Q4)Can you release drivers on a monthly basis?
Our notebook drivers support CUDA and PhysX, and best sellers such as Batman Arkham Asylum.

Lol...someone playing dodgeball there? :p
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Originally posted by: Kuzi
That Questionnaire was hilarious, let me see what I got from it:

Q1)What do u think of DirectX 11?
You know, PhysX and 3D Vision rock. Batman Arkham Asylum physics FTW!

Q2)What has PhysX got over Havok?
iPhone PhysX is really something. Did I mention PhysX is sexy?

Q3)Is Fermi better than two HD5870 in Crossfire?
Sure Fermi is faster, even a Geforce4 MX wins here. After all, the MX has AGP 8X support, the HD5870 doesn't.

Q4)Can you release drivers on a monthly basis?
Our notebook drivers support CUDA and PhysX, and best sellers such as Batman Arkham Asylum.

:D Nice summary. When these PR people are asked questions (from any company) I wouldn't expect much substance.
 

MODEL3

Senior member
Jul 22, 2009
528
0
0
Originally posted by: Kuzi
That Questionnaire was hilarious, let me see what I got from it:

Q1)What do u think of DirectX 11?
You know, PhysX and 3D Vision rock. Batman Arkham Asylum physics FTW!

Q2)What has PhysX got over Havok?
iPhone PhysX is really something. Did I mention PhysX is sexy?

Q3)Is Fermi better than two HD5870 in Crossfire?
Sure Fermi is faster, even a Geforce4 MX wins here. After all, the MX has AGP 8X support, the HD5870 doesn't.

Q4)Can you release drivers on a monthly basis?
Our notebook drivers support CUDA and PhysX, and best sellers such as Batman Arkham Asylum.

lol

 

Equ1n0x

Member
Oct 9, 2009
28
0
0
That's about as truthful as Intel's claims that their chips offer a "rich 3D experience for games". Intel is now downplaying GP-GPU by claiming there could be security issues. There *could be* if Intel had anything remotely capable of doing GP-GPU, and LRB certainly is nowhere near ready.

If NVidia had something to compete with that was DX11 capable, they would be screaming it from the rafters.

 

lopri

Elite Member
Jul 27, 2002
13,310
687
126
Originally posted by: Equ1n0x
Intel is now downplaying GP-GPU by claiming there could be security issues. There *could be* if Intel had anything remotely capable of doing GP-GPU, and LRB certainly is nowhere near ready.
A brilliant observation. Thank you for the insight.
 

Rezist

Senior member
Jun 20, 2009
726
0
71
I don't see how people can say that ATi didn't help Eidos make the game work for their cards. They stated they had it working in an earlier build, and if that isn't enough, did they not supply a method to make it work? All Eidos has to do is enable it on ATi cards.

This is complete intentional blocking of features. Just like Intel is doing to nVidia on chipsets.
 

Equ1n0x

Member
Oct 9, 2009
28
0
0
Originally posted by: lopri
Originally posted by: Equ1n0x
Intel is now downplaying GP-GPU by claiming there could be security issues. There *could be* if Intel had anything remotely capable of doing GP-GPU, and LRB certainly is nowhere near ready.
A brilliant observation. Thank you for the insight.

I'm not wrong.
 
Dec 30, 2004
12,553
2
76
Originally posted by: Rezist
I don't see how people can say that ATi didn't help Eidos make the game work for their cards. They stated they had it working in an earlier build, and if that isn't enough, did they not supply a method to make it work? All Eidos has to do is enable it on ATi cards.

This is complete intentional blocking of features. Just like Intel is doing to nVidia on chipsets.

goes around/comes around
:thumbsup:
 

nemesismk2

Diamond Member
Sep 29, 2001
4,810
5
76
www.ultimatehardware.net
Originally posted by: Barfo
Nvidia's FUD marketing strategy has worked well for them so far, I don't see them changing it anytime soon.

I remember the viral marketing from Nvidia; it was a fun time, with Rollo getting banned again and again and again. God, he was like a broken record lol ;)
 

error8

Diamond Member
Nov 28, 2007
3,204
0
76
Originally posted by: Kuzi
That Questionnaire was hilarious, let me see what I got from it:

Q1)What do u think of DirectX 11?
You know, PhysX and 3D Vision rock. Batman Arkham Asylum physics FTW!

Q2)What has PhysX got over Havok?
iPhone PhysX is really something. Did I mention PhysX is sexy?

Q3)Is Fermi better than two HD5870 in Crossfire?
Sure Fermi is faster, even a Geforce4 MX wins here. After all, the MX has AGP 8X support, the HD5870 doesn't.

Q4)Can you release drivers on a monthly basis?
Our notebook drivers support CUDA and PhysX, and best sellers such as Batman Arkham Asylum.

That was funny. :laugh:
 

akugami

Diamond Member
Feb 14, 2005
6,210
2,551
136
Originally posted by: BenSkywalker
Would the effects that were supposedly "only possible on PhysX" be possible on video cards from other companies?

What do you mean, on video cards? In terms of rendering the graphics, easily, and you can turn them on with an ATi card if you want to, no problem. The issue will then be your CPU.

If you mean in terms of whether other video cards could process the same level of physics, we can't really say for sure one way or the other. ATi is pretty adamant that they don't want those kinds of effects in games yet; they have been fighting hard against that sort of progress for a while now.

Why would the issue be the CPU? Most of the effects in Batman seem to be possible with previous tech, such as the fog effects and stuff like the caution tape that breaks when you cross it. I think there was also some extra glass breaking.

Don't get me wrong, it increases the player's immersion in the game, but these effects seem possible before PhysX, especially with the power of today's video cards - and I'm counting the Radeon 4xx0 series and GT200 series.

Also, anyone who wants full eye candy usually has a tricked-out system anyway, so needing a more powerful CPU vs. needing an extra video card for PhysX isn't going to be much of a barrier for those who really want the extra effects. I just question whether the effects were possible only with PhysX, like nVidia claims.
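
For example, something like the caution tape that breaks doesn't need anything vendor-specific. Here is a rough, hypothetical C++ sketch (not taken from any shipping engine or from PhysX): a chain of verlet points connected by distance constraints that snap when overstretched, which gives the same kind of tearing-tape effect on any CPU or, in principle, any programmable GPU.

// Hypothetical sketch: tearable "caution tape" as a verlet chain with
// breakable distance constraints. Values are illustrative only.
#include <cmath>
#include <cstdio>
#include <vector>

struct Point {
    float x, y;    // current position
    float px, py;  // previous position (verlet state)
    bool  pinned;  // the two ends of the tape are fixed
};

struct Link {
    int a, b;
    float rest;    // rest length
    bool broken;
};

void step(std::vector<Point>& pts, std::vector<Link>& links, float dt) {
    // Verlet integration with gravity.
    for (auto& p : pts) {
        if (p.pinned) continue;
        float nx = 2.0f * p.x - p.px;
        float ny = 2.0f * p.y - p.py - 9.81f * dt * dt;
        p.px = p.x; p.py = p.y;
        p.x = nx;   p.y = ny;
    }
    // Enforce distance constraints; "tear" any link stretched past 2x rest length.
    for (auto& l : links) {
        if (l.broken) continue;
        Point& a = pts[l.a];
        Point& b = pts[l.b];
        float dx = b.x - a.x, dy = b.y - a.y;
        float dist = std::sqrt(dx * dx + dy * dy);
        if (dist < 1e-6f) continue;
        if (dist > 2.0f * l.rest) { l.broken = true; continue; }
        float corr = 0.5f * (dist - l.rest) / dist;
        if (!a.pinned) { a.x += dx * corr; a.y += dy * corr; }
        if (!b.pinned) { b.x -= dx * corr; b.y -= dy * corr; }
    }
}

int main() {
    std::vector<Point> pts;
    std::vector<Link> links;
    // A horizontal strip of 20 points, pinned at both ends.
    for (int i = 0; i < 20; ++i)
        pts.push_back({ i * 0.1f, 0.0f, i * 0.1f, 0.0f, i == 0 || i == 19 });
    for (int i = 0; i < 19; ++i)
        links.push_back({ i, i + 1, 0.1f, false });
    for (int frame = 0; frame < 120; ++frame)
        step(pts, links, 1.0f / 60.0f);
    std::printf("tape midpoint y after 2 seconds: %.3f\n", pts[10].y);
}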
 

nicnas

Junior Member
Sep 19, 2009
22
0
0
Originally posted by: MegaWorks
With all the BS that came out of nVidia this past couple of weeks, I swear this is the last time I'm buying their products.

Yep, me too. I've had a Riva TNT, GF2 MX 400, GF Ti 4200, GF 5200FX, GF 6600GT and a GF 9600GT, so I've considered myself an nVidia fan (not a fanboy). Now I'll be looking for my next GPU among ATI's offerings.
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
Originally posted by: waffleironhead
Originally posted by: SirPauly
Originally posted by: SlowSpyder
Yea, maybe AMD should have taken a page out of Nvidia's book regarding their 8800 -> 9800 transition. Now that's innovation! :thumbsup: :D

They did, and the two transitions were very similar actually!

When ATI moved to the RV670, they offered similar performance to the R600 but at much lower price points.

When nVidia moved to the G92, they offered similar performance to the G80 but at much lower price points.

I think you missed the sarcasm...


He wasn't being sarcastic.

With the transition from the HD2000 to the HD3000 series, AMD actually significantly lowered power consumption and lowered the price by moving the GPUs to a smaller manufacturing process.

The 9800GT is simply a rebadged 8800GT. There is no difference between the cards at all besides the name. The same can be said for the 512MB 9800GTX+ to GTS250 transition: you weren't guaranteed to get a GTS250 with the new board and lower power consumption, so some cards were simply re-badged. And now OEMs can have the GTS240, which is a rebadged 9800GT, which is in turn a rebadged 8800GT.

edit: why is my text italicized?... Strange that the quote feature left an unclosed tag, because I definitely didn't erase it.
 

novasatori

Diamond Member
Feb 27, 2003
3,851
1
0
There is an unclosed italics tag at the end of [Q ][I ]Originally posted by: [B ]SlowSpyder[/B ] - a closing [/I ] is needed right there.

 

alyarb

Platinum Member
Jan 25, 2009
2,425
0
76
i'm just going to ask this since i've been drinking and it's been bugging me:

how do you pronounce cusideabelincoln. all i see in there is "abe lincoln," so that's all i say in my head when i see it. but what the hell is it, really? and what does it mean?
 

Schmide

Diamond Member
Mar 7, 2002
5,702
956
126
Originally posted by: alyarb
i'm just going to ask this since i've been drinking and it's been bugging me:

how do you pronounce cusideabelincoln. all i see in there is "abe lincoln," so that's all i say in my head when i see it. but what the hell is it, really? and what does it mean?

It's a penny?

Cu = copper

Copper Side Abe Lincoln
 

Equ1n0x

Member
Oct 9, 2009
28
0
0
Originally posted by: lopri
Originally posted by: Equ1n0x
Intel is now downplaying GP-GPU by claiming there could be security issues. There *could be* if Intel had anything remotely capable of doing GP-GPU, and LRB certainly is nowhere near ready.
A brilliant observation. Thank you for the insight.

Well, we could put out every month, or in this case every year, a questionnaire asking the LRB team which functions they think they can support with the x86 ISA, "paving the way" with its "rich history".

You know, until we see hardware.
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
Originally posted by: Schmide
Originally posted by: alyarb
i'm just going to ask this since i've been drinking and it's been bugging me:

how do you pronounce cusideabelincoln. all i see in there is "abe lincoln," so that's all i say in my head when i see it. but what the hell is it, really? and what does it mean?

It's a penny?

Cu = copper

Copper Side Abe Lincoln

Holy crap, I think you're the first poster to correctly deduce my username. After using this name for damn near a decade... all I have to say is wow! Most people at least need a hint, and the vast majority need a complete explanation.