Nvidia: DirectX 11 Will Not Catalyze Sales of Graphics Cards


alyarb

Platinum Member
Jan 25, 2009
2,425
0
76
Originally posted by: Henrah
With NV pushing CUDA and ATI pushing OpenCL, will developers mind programming for both (and how hard is it to do so), or might this be like Blu-ray/HD-DVD?

for most people, the encoder of choice is x264, and we use frontends like handbrake or automkv and others. x264 is parallelizable up to 16 threads or so, but beyond that it actually starts to slow down. motion estimation is the primary thing we want to run on the GPU, leaving the rest for the CPU, so you have each chip doing what it does best, and the result is faster than a GPU-only or CPU-only encode.
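
to make that concrete, here's a toy sketch (my illustration, not x264 code) of the block-matching core that motion estimation spends its time on: a sum-of-absolute-differences kernel in OpenCL C, one work-item per candidate motion vector. all names and the launch geometry are made up for the example:

```c
// hypothetical OpenCL kernel: SAD cost of one 16x16 macroblock against
// every candidate position in a +/- search_range window of the reference
// frame. one work-item per candidate vector; assumes the host clamps
// mb_x/mb_y so every candidate stays inside the frame.
__kernel void sad_16x16(__global const uchar *cur,   // current frame, luma
                        __global const uchar *ref,   // reference frame, luma
                        int stride,                  // bytes per row
                        int mb_x, int mb_y,          // macroblock origin
                        int search_range,            // search window radius
                        __global int *costs)         // one cost per candidate
{
    int side = 2 * search_range + 1;
    int gid  = get_global_id(0);
    if (gid >= side * side) return;

    int dx = gid % side - search_range;  // candidate motion vector
    int dy = gid / side - search_range;

    int sad = 0;
    for (int y = 0; y < 16; y++)
        for (int x = 0; x < 16; x++) {
            int c = cur[(mb_y + y) * stride + (mb_x + x)];
            int r = ref[(mb_y + dy + y) * stride + (mb_x + dx + x)];
            sad += abs(c - r);
        }
    costs[gid] = sad;  // CPU reads these back and picks the winner
}
```

the CPU then takes the per-candidate costs back and does the mode decisions and entropy coding, the parts that don't parallelize well.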

this is why the earliest implementations of GPGPU-based encoding applications were not that fast. in fact, cyberlink's mediashow espresso is only ~3x faster with a CUDA-assisted encode. the problem is that every GPGPU encoder, including cyberlink's, is horrible. the output quality is not great and the user has absolutely no control over encoding parameters (they are too obfuscated into the writer's narrow "let's just get this working" framework). the program GUIs are insultingly graphical and there's absolutely no ability to customize your work. there are 3 or 4 presets and that's it. maybe they are marketing the program to kids. can't say, since it's so revolutionary in this one aspect, yet completely unsuitable in all other aspects. i really don't even know why they bothered. they thought they could take the same lazy-ass approach that worked with PowerDVD for so long, but no. parallel computing is hard.

still, people with GPUs are going to want that 300% boost. we are just waiting for handbrake or automkv to implement it so that we can have more "expert" encoding options available rather than a few pathetic presets. it probably isn't going to happen from them, though. it'll more likely be someone else starting a totally new open source project, and it's likely to be opencl because it's royalty free and will run on both ATI and nvidia.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: Dribble
Performance is the normal reason, but tbh anyone with a 4870 can run nearly all games maxed out. Since the sort of people who spend big on GPUs probably already have a pretty fast one, unless you must play Stalker/Crysis again there's no point spending $$$ on a replacement if you aren't going to see any difference.

I agree, and I think in this way what NVIDIA is doing with PhysX is smart. Take Batman, for instance: NVIDIA has inserted PhysX as a differentiator between the PC version and the console versions. While it isn't critical to the game, what PhysX adds in terms of visual experience is not insignificant.

 

Scotteq

Diamond Member
Apr 10, 2008
5,276
5
0
While I understand the theory behind GPGPU, so far I've not seen much in the way of practical application on the street. So in my humble opinion, it's a good story. But since the world at large is still by and large application driven, I feel GPGPU won't take off until developers deliver on the promise.
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
Coming from the company that said DX10 would complete the Windows Vista experience.
 

Patrick Wolf

Platinum Member
Jan 5, 2005
2,443
0
0
Originally posted by: Griswold
Originally posted by: Denithor
DX10 was essentially a flop due in large part to the slow adoption rate of Vista.

That's the second time I've read this "flop" nonsense in this thread.

D3D10 doesn't even remotely sport the feature set D3D11 does, so it was pretty clear that it wouldn't be a beast as far as visual improvement and, to some degree, performance goes. If you believe otherwise, you fell for the marketing hoopla of MS, ATI and NV.

Yet, it paved the way for D3D11 by getting rid of a lot of legacy junk that was standing in the way of progress with the D3D API.

Everyone and their mother could have been using Vista on day 1 after the launch and we still wouldn't have seen much better-looking games due to D3D10, simply because it was never capable of delivering that...

It's not clear to most consumers. DX10, as I remember, was supposed to be big, and all we really got were hyped features and poor performance. Now, if DX11 is supposed to be the beast we've been waiting for, where's the marketing hoopla from MS, ATI, and NV?

Maybe we'll get it when Win7 is finally out, or maybe PC gaming really is dying...
 

nyker96

Diamond Member
Apr 19, 2005
5,630
2
81
NV keeps playing the same game: if they can't get something done, they just keep saying that thing isn't too useful. First DX10.1, now DX11. News has it they got their samples back from the lab, and yields of their DX11 parts are around 2%. That's too low, and at roughly 400mm2 the chip is a monster. So most sites are predicting an early-next-year launch of their DX11 parts, if TSMC can work out the yield issues with them. Now that they're behind in the DX11 race, they're claiming DX11 is useless. This all points to yet another delay in production of their DX11 parts. Whatever; if they can't get DX11 parts out before Xmas, we can always get them from ATI. Hopefully they won't be late.
 

Modular

Diamond Member
Jul 1, 2005
5,027
67
91
If NV believes that video graphics aren't going to drive new sales, then why are they so afraid to let PhysX work alongside an ATI card?

Seems to me like they just say what they think people want them to say in any given situation...
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: Modular
If NV believes that video graphics aren't going to drive new sales, then why are they so afraid to let PhysX work alongside an ATI card?

Seems to me like they just say what they think people want them to say in any given situation...


A couple of things about your comments here.

1. Nvidia believes that DX11 will not catalyze sales.
Your interpretation: Nvidia believes that video graphics will not drive sales.

2. Nvidia disables PhysX for use with any ATI card present in a system.
Your interpretation: Nvidia is afraid of ATI.

How do you arrive at these interpretations? What goes on up there?

 

HurleyBird

Platinum Member
Apr 22, 2003
2,817
1,552
136
Originally posted by: Keysplayr
Direct Compute has nothing to do with DirectX 11 specifications.
As stated, Direct Compute is already fully supported by DX10 hardware (G80 and up).

Compute shaders have everything to do with DX11, at least on the software side. Sure, DX10 hardware can do simplified compute shaders, but that's not really the point. When Compute shaders and OpenCL move forward, and that's going to happen in a very short time now, GPGPU is going to explode. It's hypocritical in the extreme for Nvidia to say that DX11 doesn't matter as much as GPGPU because DX11 is going to be the catalyst for GPGPU to take off, and having DX11 compliant hardware is going to let you extract the most performance out of it.

Bottom line: GPGPU is not important in the slightest, and will not be until we have a universal language (DX11, OpenCL). G200 only has a partial implementation, 'RV870' has the full implementation. That in itself is huge, and the huge difference in shading power between ATI's new stuff and Nvidia's old stuff makes matters much worse. Nvidia needs a DX11 card out ASAP, or everyone is going to be writing code for ATI's architecture when the GPGPU revolution commences shortly.
 

alyarb

Platinum Member
Jan 25, 2009
2,425
0
76
i don't know about GPGPU "exploding" right off the bat. it'll take a while. i mean, do you think there are people furiously writing down OpenCL algorithms that they've been saving all these years? i doubt it. even vectorizing scalar programs is difficult. of course, the lack of a universal royalty-free platform wasn't helping matters.

can SSEx code be translated into DC/OpenCL without a lot of work? that would be something, and i'll bet it can.
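
as a rough illustration of how direct the mapping can be, here's a hypothetical saxpy-style loop written once with SSE intrinsics and once as an OpenCL kernel. the OpenCL host-side buffer/queue setup is omitted, so only the C functions below actually run as-is:

```c
#include <xmmintrin.h>  /* SSE intrinsics */

/* SSE version: 4 floats per iteration. assumes n is a multiple of 4
   and that x and y are 16-byte aligned. */
void saxpy_sse(int n, float a, const float *x, float *y)
{
    __m128 va = _mm_set1_ps(a);
    for (int i = 0; i < n; i += 4) {
        __m128 vx = _mm_load_ps(x + i);
        __m128 vy = _mm_load_ps(y + i);
        _mm_store_ps(y + i, _mm_add_ps(_mm_mul_ps(va, vx), vy));
    }
}

/* the same per-element math as an OpenCL kernel: the runtime supplies
   the parallel "loop" over work-items instead of the explicit stride-4
   loop above. */
static const char *saxpy_cl =
    "__kernel void saxpy(float a,                  \n"
    "                    __global const float *x,  \n"
    "                    __global float *y)        \n"
    "{                                             \n"
    "    int i = get_global_id(0);                 \n"
    "    y[i] = a * x[i] + y[i];                   \n"
    "}                                             \n";
```

the per-element math carries straight over; the real work, as noted above, is restructuring programs that weren't data-parallel to begin with.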
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: alyarb
Originally posted by: ronnn
So their master plan is to sell a faster cpu? Time to buy a console I guess.

i hate it when people continuously juxtapose the CPU and GPU.


Yep, my bad. I was really responding to this recording of the statement. Seemed to me he was saying: no profit in GPUs, as fps are very high already and prices will always drop, so let's make money by making programs run faster.

Hell, I will spend money on graphics; a faster Photoshop may be worth a bit extra, but ....

Actually, I spend much more on GPUs than CPUs, so I really believe Nvidia is contradicting itself here.

 

HurleyBird

Platinum Member
Apr 22, 2003
2,817
1,552
136
The way I see it is that when you combine the truly massive performance of modern GPUs with a universal language, it pretty much has to explode. Even if programs don't start coming out in droves until after G300 launches, those programmers will in the meantime be doing most of their coding on, and for, ATI's next-gen products.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: HurleyBird
Originally posted by: Keysplayr
Direct Compute has nothing to do with DirectX 11 specifications.
As stated, Direct Compute is already fully supported by DX10 hardware (G80 and up).

Compute shaders have everything to do with DX11, at least on the software side. Sure, DX10 hardware can do simplified compute shaders, but that's not really the point. When Compute shaders and OpenCL move forward, and that's going to happen in a very short time now, GPGPU is going to explode. It's hypocritical in the extreme for Nvidia to say that DX11 doesn't matter as much as GPGPU because DX11 is going to be the catalyst for GPGPU to take off, and having DX11 compliant hardware is going to let you extract the most performance out of it.

Bottom line: GPGPU is not important in the slightest, and will not be until we have a universal language (DX11, OpenCL). G200 only has a partial implementation, 'RV870' has the full implementation. That in itself is huge, and the huge difference in shading power between ATI's new stuff and Nvidia's old stuff makes matters much worse. Nvidia needs a DX11 card out ASAP, or everyone is going to be writing code for ATI's architecture when the GPGPU revolution commences shortly.

I'd love to know where you're getting this stuff. Come now Hurley, how can Direct Compute have everything to do with DX11, when Direct Compute is already fully supported by Nvidia DX10 hardware? Think about that............. :music: jeopardy music :music:

Ok, what did you come up with?
 

br0wn

Senior member
Jun 22, 2000
572
0
0
Originally posted by: Keysplayr

I'd love to know where you're getting this stuff. Come now Hurley, how can Direct Compute have everything to do with DX11, when Direct Compute is already fully supported by Nvidia DX10 hardware? Think about that............. :music: jeopardy music :music:

Ok, what did you come up with?

This is incorrect: Direct Compute is not fully supported by Nvidia DX10 hardware.

DX11 introduces Direct Compute (Compute Shader) version 4.0 for DX10 hardware, 4.1 for DX10.1 hardware, and 5.0 for DX11 hardware.

There is a big jump in functionality from Compute Shader 4.1 to 5.0 (you can read any presentation about DX11 Compute Shader 5.0; developers are mostly excited about Compute Shader 5.0). See http://www.xbitlabs.com/news/v...g_Compute_Shaders.html
 

Forumpanda

Member
Apr 8, 2009
181
0
0
Originally posted by: Keysplayr
how can Direct Compute have everything to do with DX11, when Direct Compute is already fully supported by Nvidia DX10 hardware?
I guess
'The main aim of compute shaders 4.x is to allow game developers to practice with compute shaders technology, enable GPGPU via DirectX as well as let game developers to use CS for complex rendering-related tasks instead of pixel shaders so to gain performance.'

is now the same as 'fully supported'?

I appreciate reading your posts, keys, but you sometimes make poorly researched, nVidia-favored posts that don't really make sense.

I think John Carmack's take on it is about where I stand:
http://www.gamephys.com/2009/0...ot-a-big-fan-of-physx/

Look, I'm all for GPGPU acceleration in games and general software, but it will require a standardized interface, much like using SSE instructions does not require you to worry about which vendor the CPU is from.
I am all for nVidia releasing a DX11 card to compete with ATI ASAP, because that is where we will see GPGPU used in desktop applications (I personally do not play games).
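
To make the SSE analogy concrete, here is a minimal, hypothetical C sketch that enumerates whatever OpenCL implementations happen to be installed, NVIDIA, AMD/ATI, or otherwise, through the one standard API; the same binary runs unchanged regardless of vendor (assumes an OpenCL runtime and headers, e.g. build with `gcc list_cl.c -lOpenCL`):

```c
#include <stdio.h>
#include <CL/cl.h>

int main(void)
{
    cl_platform_id platforms[8];
    cl_uint nplat = 0;

    /* one call, any vendor: each installed OpenCL runtime shows up
       here as a platform */
    if (clGetPlatformIDs(8, platforms, &nplat) != CL_SUCCESS || nplat == 0) {
        fprintf(stderr, "no OpenCL runtime found\n");
        return 1;
    }

    for (cl_uint p = 0; p < nplat; p++) {
        char name[256];
        clGetPlatformInfo(platforms[p], CL_PLATFORM_NAME,
                          sizeof name, name, NULL);
        printf("platform: %s\n", name);

        cl_device_id devices[8];
        cl_uint ndev = 0;
        clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_ALL, 8, devices, &ndev);
        for (cl_uint d = 0; d < ndev; d++) {
            clGetDeviceInfo(devices[d], CL_DEVICE_NAME,
                            sizeof name, name, NULL);
            printf("  device: %s\n", name);
        }
    }
    return 0;
}
```

Kernels compiled through this API run on whichever device you pick, which is exactly the vendor-neutrality SSE code enjoys on CPUs.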

Of course if you prefer, you can have it from the man himself.
http://www.ditii.com/2009/08/2...tcompute-in-windows-7/
I guess that is where you got your 'fully supported' quote from. Welcome to marketing, but I guess you are already considered part of nV marketing ;) (I hope that wasn't an insult - I'd just expect you get paid for your time).

For people who still do not get it, this is the basic problem.
http://www.gamephys.com/2009/0...equirements-for-physx/

Buying current-gen hardware for PhysX is like buying a Pentium processor for MMX: while technically novel, by the time it is broadly utilized, new hardware will be many times faster and a much more relevant platform for the technology.
DX11 has real value not just for games (I don't care about those) but for applications using GPGPU capabilities. PhysX and CUDA and the like will eventually be run over and replaced by widely used industry standards, on future hardware that makes current performance look like it's still the 20th century.
 

Janooo

Golden Member
Aug 22, 2005
1,067
13
81
Originally posted by: Forumpanda
Originally posted by: Keysplayr
how can Direct Compute have everything to do with DX11, when Direct Compute is already fully supported by Nvidia DX10 hardware?
I guess
'The main aim of compute shaders 4.x is to allow game developers to practice with compute shaders technology, enable GPGPU via DirectX as well as let game developers to use CS for complex rendering-related tasks instead of pixel shaders so to gain performance.'

is now the same as 'fully supported'?

I appreciate reading your posts, keys, but you sometimes make poorly researched, nVidia-favored posts that don't really make sense.

I think John Carmack's take on it is about where I stand:
http://www.gamephys.com/2009/0...ot-a-big-fan-of-physx/

Look, I'm all for GPGPU acceleration in games and general software, but it will require a standardized interface, much like using SSE instructions does not require you to worry about which vendor the CPU is from.
I am all for nVidia releasing a DX11 card to compete with ATI ASAP, because that is where we will see GPGPU used in desktop applications (I personally do not play games).

QFT
Keys, I feel sorry for you. Hard times ahead till GT300 comes.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: Forumpanda
I guess that is where you got your 'fully supported' quote from. Welcome to marketing, but I guess you are already considered part of nV marketing ;) (I hope that wasn't an insult - I'd just expect you get paid for your time).

Seriously, why? How would that not be an insult?

Do you treat your fellow forum colleagues the same way when they acknowledge they are AMD or Intel employees?

Do you pepper your posts to them (AMD and Intel employees) with comments suggesting their reason for participating in this forum is to be a paid mouthpiece spreading marketing BS?

It's insulting to me to read this crap. This is not how we treat each other; it is insulting and disrespectful to imply or outright state that you think people are here for reasons other than to simply contribute to a community discussion composed of fellow enthusiasts.

How many people do you think are probably lurking right now and won't publicly acknowledge their business relations IRL out of fear that it will simply become a method of making subtle and not-so-subtle personal attacks as a means to undermine credibility rather than just sticking to scrutinizing the topic itself as explained in the TOS?

Let's just stop it, please.
 

Forumpanda

Member
Apr 8, 2009
181
0
0
Originally posted by: Idontcare
Originally posted by: Forumpanda
I guess that is where you got your 'fully supported' quote from. Welcome to marketing, but I guess you are already considered part of nV marketing ;) (I hope that wasn't an insult - I'd just expect you get paid for your time).

Seriously, why? How would that not be an insult?

Do you treat your fellow forum colleagues the same way when they acknowledge they are AMD or Intel employees?
I just believe that no one does something in their spare time without being motivated by something.
It could be that it simply provides personal pleasure to talk positively about one company's hardware over another's, but personally I have always had a hard time attaching emotions to companies.

I also have no intention of becoming a serious or credible poster here; take my unfiltered opinion FWIW. If you'd rather I not post, then I have no problem with that.

Originally posted by: Idontcare
How many people do you think are probably lurking right now and won't publicly acknowledge their business relations IRL out of fear that it will simply become a method of making subtle and not-so-subtle personal attacks as a means to undermine credibility rather than just sticking to scrutinizing the topic itself as explained in the TOS?
And this is where you misunderstand me.

If I am aware that someone is actively working, or has worked, in the industry (re: yourself), that makes me many times more likely to read and follow their posts with interest. If someone comes here and says, "I worked on designing the G80 for nVidia; this is my take on ATI's new cards," then I will be much more likely to take seriously any negative comments about possible flaws in ATI's hardware, and why you should or should not buy it, coming from him, as I would regard him as a person with knowledge in the field.

Facts can be researched; poor logic stands out much more easily when it comes from people with knowledge.

If someone I know nothing about posts something that *sounds* like insightful knowledge, then I treat it with kilos of salt, because more often than not your average forum poster has no in-depth knowledge but just likes to sound like it.


If I could filter this forum to *only* read posts from people who have worked in the industry, I would do it in a heartbeat. I consider the bickering about this or that hardware by people who haven't even done a simple google search, or at least read the wiki page on the topic, to be quite a waste of my time.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: br0wn
Originally posted by: Keysplayr

I'd love to know where you're getting this stuff. Come now Hurley, how can Direct Compute have everything to do with DX11, when Direct Compute is already fully supported by Nvidia DX10 hardware? Think about that............. :music: jeopardy music :music:

Ok, what did you come up with?

This is incorrect: Direct Compute is not fully supported by Nvidia DX10 hardware.

DX11 introduces Direct Compute (Compute Shader) version 4.0 for DX10 hardware, 4.1 for DX10.1 hardware, and 5.0 for DX11 hardware.

There is a big jump in functionality from Compute Shader 4.1 to 5.0 (you can read any presentation about DX11 Compute Shader 5.0; developers are mostly excited about Compute Shader 5.0). See http://www.xbitlabs.com/news/v...g_Compute_Shaders.html

Thanks for the article, br0wn. Articles I read a while back suggested differently.

@Forumpanda: I can certainly admit when I am mistaken, as in this case. Like I said, I read contrary information a while ago and it stuck with me. But dude, you really need to stop with this BS.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: Forumpanda
Originally posted by: Idontcare
Originally posted by: Forumpanda
I guess that is where you got your 'fully supported' quote from. Welcome to marketing, but I guess you are already considered part of nV marketing ;) (I hope that wasn't an insult - I'd just expect you get paid for your time).

Seriously, why? How would that not be an insult?

Do you treat your fellow forum colleagues the same way when they acknowledge they are AMD or Intel employees?
I just believe that no one does something in their spare time without being motivated by something.
It could be that it simply provides personal pleasure to talk positively about one company's hardware over another's, but personally I have always had a hard time attaching emotions to companies.

Originally posted by: Idontcare
How many people do you think are probably lurking right now and won't publicly acknowledge their business relations IRL out of fear that it will simply become a method of making subtle and not-so-subtle personal attacks as a means to undermine credibility rather than just sticking to scrutinizing the topic itself as explained in the TOS?
And this is where you misunderstand me.

If I am aware that someone is actively working, or has worked, in the industry (re: yourself), that makes me many times more likely to read and follow their posts with interest. If someone comes here and says, "I worked on designing the G80 for nVidia; this is my take on ATI's new cards," then I will be much more likely to take seriously any negative comments about possible flaws in ATI's hardware, and why you should or should not buy it, coming from him, as I would regard him as a person with knowledge in the field.

Facts can be researched; poor logic stands out much more easily when it comes from people with knowledge.

If someone I know nothing about posts something that *sounds* like insightful knowledge, then I treat it with kilos of salt, because more often than not your average forum poster has no in-depth knowledge but just likes to sound like it.


If I could filter this forum to *only* read posts from people who have worked in the industry, I would do it in a heartbeat. I consider the bickering about this or that hardware by people who haven't even done a simple google search, or at least read the wiki page on the topic, to be quite a waste of my time.

I agree. The signal-to-noise ratio of any forum and thread is something each of us must wrestle with, and ultimately the process itself can sap the enjoyment we extract from participating in said forum or thread. That is why we rely on the mods to be vigilant in rooting out the trolls, spammers, and viral marketeers.

But my point is simply this: restrict your engagements to those you are somewhat convinced are worth your time, and you'll find that your posts are only adding to the signal instead of exacerbating the noise in any given thread.

For those posters you might be wary of, suspecting they are not legitimately knowledgeable or might actually be paid mouthpieces, just ignore them (or, if the infraction is particularly egregious, report them to the mods).

But no harm comes from choosing personal inaction and not posting a response or a rebuttal.

However, harm can come from posting a rebuttal if perchance you misread the other person's character... why risk needlessly offending someone who just might be legitimately knowledgeable but was misinterpreted?

Who loses out in that situation? Everyone. We risk stifling another potential source of signal.

I'm a big believer in the so-called golden rule: do unto others as you would have done unto you.

So I'm just saying: if you are going to make a post suggesting a fellow poster has less-than-admirable reasons for contributing to the community, take a moment to ask yourself what it says about you that you're wasting your time making such a post.

It probably doesn't send the kind of message about yourself that you really want others to view you through, so just don't do it.

If people are here with the wrong kinds of intentions, trust me, they will eventually out themselves, because they tend to be the kind of one-track-mind posters who aren't capable of knowing when they've gone too far in adding noise to the forums.

Eventually the weeds get weeded out of the garden, but you have to wait to see what kind of leaves and flowers come from the roots; if you just keep pulling up roots, then for all you know you are weeding out your vegetables too.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: Janooo
Originally posted by: Forumpanda
Originally posted by: Keysplayr
how can Direct Compute have everything to do with DX11, when Direct Compute is already fully supported by Nvidia DX10 hardware?
I guess
'The main aim of compute shaders 4.x is to allow game developers to practice with compute shaders technology, enable GPGPU via DirectX as well as let game developers to use CS for complex rendering-related tasks instead of pixel shaders so to gain performance.'

is now the same as 'fully supported'?

I appreciate reading your posts, keys, but you sometimes make poorly researched, nVidia-favored posts that don't really make sense.

I think John Carmack's take on it is about where I stand:
http://www.gamephys.com/2009/0...ot-a-big-fan-of-physx/

Look, I'm all for GPGPU acceleration in games and general software, but it will require a standardized interface, much like using SSE instructions does not require you to worry about which vendor the CPU is from.
I am all for nVidia releasing a DX11 card to compete with ATI ASAP, because that is where we will see GPGPU used in desktop applications (I personally do not play games).

QFT
Keys, I feel sorry for you. Hard times ahead till GT300 comes.

Yeah, bad research on my part. Happens to everyone once in a while. But why do you feel sorry for me? This makes zero sense. Can you explain this?
 

Forumpanda

Member
Apr 8, 2009
181
0
0
Originally posted by: Keysplayr
@ Forumpanda: I can certainly admit when I am mistaken, as in this case. Like I said, I read contrary information a while ago and it stuck with me. But dude you really need to stop with this BS.
Hey, it wasn't meant as an offensive remark; I appreciate the time you put into posting. It's just that sometimes it can get a little obnoxious when you have like 5 posts on a page that are pretty much a repeat of the previous page ;) (obvious exaggeration here), instead of just letting other people who weren't even debating with you to begin with have their own debate.

I'm happy you admit you are wrong sometimes, and I will happily admit you know more about the graphics industry than I do. I just come here to read what people who put time into this part of the computer industry think.

<insert beer smiley because I suck at forums>