Nvidia: DirectX 11 Will Not Catalyze Sales of Graphics Cards

MODEL3

Senior member
Jul 22, 2009
528
0
0
http://www.xbitlabs.com/news/v...of_Graphics_Cards.html

Nvidia Corp. said during a conference for financial analysts that the emergence of the next-generation DirectX 11 application programming interface will not drive sales of graphics cards. The firm believes that general-purpose computing on graphics processing units (GPGPU), along with its proprietary tools and the emergence of software taking advantage of these technologies, will be a better driver of graphics board sales than demanding new video games and high-end cards.


Q1 2010?
 

alyarb

Platinum Member
Jan 25, 2009
2,425
0
76
dx10 didn't catalyze sales either. people bought the cards because they were faster; that's the only reason you need when your present card isn't fast enough.

of course, OpenCL and CUDA 3.0 aren't going to catalyze sales, either. so it's a pretty stupid press statement. kind of like how "physx makes our 9400 GT faster than a 5870 X2" or whatever the comment was.

but they are right. the API precedes games by about a year, and it will not catalyze sales. i don't see how a launch date can be inferred from this statement, however. they have no choice but to stay in the media while continuously skirting the GT300 issue, and that's what they're doing.
 

Barfo

Lifer
Jan 4, 2005
27,539
212
106
I have to agree with that statement. However, they would be saying the opposite if their DX11 parts were hitting the market before ATI's.
 

alyarb

Platinum Member
Jan 25, 2009
2,425
0
76
of course. they would probably tell us we need dx11 support to make our dx10 and dx9 games better, and that PhysX is faster on non-PhysX supporting titles. all i said was that they were telling the truth this time. not that they're saints. or even "good" people.
 

thilanliyan

Lifer
Jun 21, 2005
12,040
2,256
126
Disregarding DX11, the 58XX series should be faster than anything else in DX9 and DX10, so THAT should (ideally) drive sales of cards. nV is basically right in their statement; however, would anyone in the market when the new ATI/NV cards come out disregard DX11? I doubt it.
 

yacoub

Golden Member
May 24, 2005
1,991
14
81
Originally posted by: Barfo
I have to agree with that statement. However, they would be saying the opposite if their DX11 parts were hitting the market before ATI's.

Absolutely.
 

Beanie46

Senior member
Feb 16, 2009
527
0
0
Of course, when they have DX11 parts, they will make damned sure everyone knows that their cards support that "not important" feature.

Make no mistake about this: if they had DX11 parts ready to go and ATi didn't, they'd launch a smear campaign telling the whole world how important DX11 is and how ATi isn't worth the money because of its ancient technology.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
I think what is interesting here is that NV said this during a conference with financial analysts.

This isn't a smokescreen for undermining AMD's marketing efforts, there are other more appropriate avenues for that.

This is a conference with people who don't like to be fed BS; they aren't going anywhere, and they'll be back conference after conference, quarter after quarter.

And what do they do with the info used to set their expectations? They determine where the institutional money is going to be invested, stock valuations, earnings projections, etc. Again, not the kind of folks that you try to manipulate, because they are going to be the same people sitting in the conference call 6 months from now, and have been the same people sitting there for the past 3+ yrs.

These people have long memories and you do not cross them or blow smoke up their asses lightly. You will get knocked down (ticker price) and you'll forever find your stock trading in a narrow band of 6-8 times forward P/E instead of the happier band of 10-12 forward P/E because these are the guys who determine how much of your float is going to be tied up with institutional holdings.

Now, the reason I say it is interesting is this: look at what NV is doing there. They are setting the expectations of the people who determine where big money goes, and the expectation they are trying to set is that the market growth opportunity is in GPGPU, not GPU, for the next year or so. It doesn't matter at these kinds of conferences whether they say market growth is going to be in selling dog food or making ballpoint pens; what matters is that you (a) have credibility in your growth-forecasting abilities (hence the don't-fuck-with-them factor), and (b) convince them you have the ability to capitalize on said growth market opportunity.

If NV says the growth opportunity, as they see it when they crunch their demographics data, is in the GPGPU applications market, then I think that is very telling. Remember, both of these companies, AMD and NV, have executive decision makers who are trying to convince their BOD, shareholders, and financial analysts that there is a method to their madness for bothering to stay in the profitless GPU business to begin with.

If their stated business plan is to stick with the status quo, then that isn't exactly the kind of talk to inspire bullish sentiment on the stock price. And when you have no profits, Wall Street loves a good growth story. So NV is trying to give an indication of where they think the growth story is and where it isn't.

You guys can say that you think NV would be singing a different tune if their DX11 hardware were coming out first, but I really don't think that is the case; they wouldn't risk torquing off these kinds of folks just to attempt to manipulate their institutional holdings for the next 60-90 days before reality comes back to confirm or deny their stated outlook. It would be remarkably short-sighted of them to do that.
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
Originally posted by: Barfo
I have to agree with that statement. However, they would be saying the opposite if their DX11 parts were hitting the market before ATI's.

Why? They still have to sell DX11 parts sooner rather than later as well.

If DX11 were an ATi exclusive, then this would just be smoke.

As someone pointed out, DX10 wasn't exactly mind-blowing either.

Originally posted by: Beanie46
Of course, when they have DX11 parts, they will make damned sure everyone knows that their cards support that "not important" feature.

Make no mistake about this, if they had DX11 parts ready to go and ATi didn't, they'd launch a smear campaign telling the whole world how important DX11 is and how ATi isn't worth its money because of its ancient technology.


Yes, it is all part of the master plan. Trash a technology that they themselves will have to promote in the next couple of months in front of financial analysts.

They were then hoping that a tech-nerd would post it on an internet forum, and everyone would read it and not buy 5XXX!


Actually, they weren't even trashing DX11; they were just saying that it won't be the only reason to buy the next gen. Basically they are talking up the other features.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
So their master plan is to sell a faster cpu? Time to buy a console I guess.
 

Bill Brasky

Diamond Member
May 18, 2006
4,324
1
0
Originally posted by: Idontcare
I think what is interesting here is that NV said this during a conference with financial analysts.

...

(Shortened for post size only, not relevancy)

Your common sense and logic are not welcome here.

In all seriousness, I think they are discussing market growth potential because anyone who doesn't have their head up their butt (fanbois) knows that price/performance is what moves cards. Kudos to NV for recognizing the potential in GPGPU, since the true mainstream market is pretty much nonexistent.
 

alyarb

Platinum Member
Jan 25, 2009
2,425
0
76
Originally posted by: ronnn
So their master plan is to sell a faster cpu? Time to buy a console I guess.

i hate it when people continuously juxtapose the CPU and GPU. larrabee's IDF slides only make things worse, as they depict the words "CPU" and "GPU" literally on a collision course. larrabee can partly get away with this because it's x86, but the reality is that GPGPU is not about replacing the CPU; it's about putting a bound on the unreasonably huge and hungry CPU. x86 CPUs are designed to run many, many different types of applications both quickly and simultaneously, while there are only a handful of workloads, usually SIMD-like, that GPUs do exceptionally well. And when it comes to this SIMD-like work, the CPU and GPU are not on a collision course, because GPUs have simply been launched too far ahead.

With the GPU, especially in terms of Fusion/Haswell-type architectures, I suspect the role they seek to play is that of a power-efficient vector coprocessor that is switched on only when called for. this is a good way to "get idle asap," which seems to be a common sentiment these days. as a result, the amount of die area on a mobile CPU that is actually powered up will continue to shrink. i see the native direct compute video transcoder of windows 7, and perhaps a GPGPU-based adobe flash player, as immediate needs that could be effortlessly satisfied by the coming generation of IGPs.

This doesn't take the GPU's focus away from gaming at all; it's just a new benefit that falls out of the move to inherently faster architectures. The stream processing concept advocates that the majority of your die area be allocated to large arrays of FPUs, which is pretty anti-x86. With x86, all your die area goes to cache and branching and trying to keep everybody happy and the pipeline full at the same time. This is a good thing, but there's no room for earth-shattering FPU in this architecture. I'll look into this, but i'm going to guess that there has not been a single x86 processor in history to devote more than 5% of its die area to FPU and/or SSE.

Earth-shattering FPU is a good thing too, and as time goes on, the library of VLIW-type programs, along with cuda/opencl/direct compute, will continue to diversify, and we'll all have fast laptops and phones. if you people have n scalar equations, just write a vector equation instead.
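To make that last scalar-vs-vector point concrete, here's a minimal CUDA sketch (my own illustration; the names and sizes are arbitrary): the n scalar equations y[i] = a*x[i] + y[i] collapse into one kernel launch, with a lightweight GPU thread per element instead of a CPU loop.

    // saxpy.cu - one vector equation instead of n scalar ones
    #include <cuda_runtime.h>

    // Each thread handles a single element of y = a*x + y.
    __global__ void saxpy(int n, float a, const float* x, float* y) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) y[i] = a * x[i] + y[i];
    }

    int main() {
        const int n = 1 << 20;  // ~1M elements
        float *x, *y;
        cudaMalloc(&x, n * sizeof(float));
        cudaMalloc(&y, n * sizeof(float));
        // (fill x and y via cudaMemcpy from host buffers in real code)

        int threads = 256;
        int blocks = (n + threads - 1) / threads;
        saxpy<<<blocks, threads>>>(n, 2.0f, x, y);  // one launch covers all n elements
        cudaDeviceSynchronize();

        cudaFree(x);
        cudaFree(y);
        return 0;
    }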

your big GPUs are going to get faster too, as we're beginning to see from these speculative RV870 data. We (universal we) are continuously doubling the size and speed of our frame buffers and approximately doubling execution units and TMUs (with the exception of g200). so there's no need to whine about 3D performance not being center stage, because it absolutely is, and you have consoles to thank: console revenue is high, and console games are not readily pirated, especially now that the focus has shifted to online play and a high level of internet integration. ATI and nVIDIA will continue to compete for design contracts with future consoles. They may bounce back and forth between xbox and playstation as they have previously done, but it was the little wii that was the wildcard with outrageous sales, and that console runs on an AMD GPU. so don't fret. AMD and nVIDIA will continue to chafe every season of every year under the burden of 3D rendering rivalry.

in the meantime we have a sub-$400 Radeon that is supposed to be good for 2.7 TFLOPS. It would take almost fifty i7 920s to achieve that, almost $13,000 in CPUs alone (except you would need more expensive xeons to make a real bloomfield-based 3 TFLOPS cluster). The radeon is cheap, it's one chip, and it runs crysis. Modelers, encoders, photoshoppers, cryptographers, folders, gamers, and most importantly scientists are in for something cool, and soon. GPGPU is simply badass, whether your implementation is large or small (prepare to see petaflop mATX systems and TFLOPS notebooks).
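For anyone checking that back-of-the-envelope math, here's a rough sketch (the per-CPU throughput and the ~$280 price are my assumptions, not measurements):

    // flops_estimate.cu - host-only code; builds with nvcc or any C++ compiler
    #include <cstdio>

    int main() {
        // Assumed figures for a Core i7 920, single precision:
        double i7_peak      = 2.66 * 4 * 8;  // GHz x cores x SSE FLOPs/cycle ~= 85 GFLOPS
        double i7_sustained = 55.0;          // rough sustained-throughput assumption
        double radeon       = 2700.0;        // the rumored 2.7 TFLOPS RV870 figure

        printf("CPUs vs theoretical peak: %.0f\n", radeon / i7_peak);       // ~32
        printf("CPUs vs sustained rate:   %.0f\n", radeon / i7_sustained);  // ~49
        printf("At ~$280 per CPU: about $%.0f\n",
               (radeon / i7_sustained) * 280.0);                            // ~$13,700
        return 0;
    }

Depending on whether you count theoretical peak or sustained throughput, you land somewhere between roughly 30 and 50 CPUs, consistent with the "almost fifty" estimate above.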
 

jimhsu

Senior member
Mar 22, 2009
705
0
76
In a way, they contradict themselves because one of the key features that I'm looking at for DX11 is compute shaders. Imagine that - running general purpose code on a massively parallel architecture for your apps - outsourcing physics, AI, post-processing, etc. Sounds like "compute" to me.

The CPU is great for single-threaded linear workloads. For massively parallel processing, of course Nvidia and ATI (and Intel) are looking at GPUs.
 

dev0lution

Senior member
Dec 23, 2004
472
0
0
Originally posted by: jimhsu
In a way, they contradict themselves because one of the key features that I'm looking at for DX11 is compute shaders. Imagine that - running general purpose code on a massively parallel architecture for your apps - outsourcing physics, AI, post-processing, etc. Sounds like "compute" to me.

Umm, actually they don't. DirectCompute in Windows 7 is not tied to DX11 hardware. NVIDIA's DX10 parts already support the compute function. CUDA has already been doing "compute" - DirectCompute and OpenCL only add to the programming options you have for GPU computing.
 

Patrick Wolf

Platinum Member
Jan 5, 2005
2,443
0
0
This article makes me think otherwise. Not right away, but if DX11 isn't a flop like DX10, then eventually I could see DX11 driving sales.

It mostly depends on how hard the game devs push it.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
Originally posted by: jimhsu

In a way, they contradict themselves because one of the key features that I'm looking at for DX11 is compute shaders.
I came in here to post exactly the same thing. ;)
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: dev0lution
Originally posted by: jimhsu
In a way, they contradict themselves because one of the key features that I'm looking at for DX11 is compute shaders. Imagine that - running general purpose code on a massively parallel architecture for your apps - outsourcing physics, AI, post-processing, etc. Sounds like "compute" to me.

Umm, actually they don't. DirectCompute in Windows 7 is not tied to DX11 hardware. NVIDIA's DX10 parts already support the compute function. CUDA has already been doing "compute" - DirectCompute and OpenCL only add to the programming options you have for GPU computing.

This ^

Direct Compute isn't tied to DX11-class hardware.
As stated, Direct Compute is already fully supported by DX10 hardware (G80 and up).
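As a rough CUDA-side illustration of that hardware-generation point (my own sketch; DirectCompute itself is queried through DirectX, not this API): every part reporting compute capability 1.0 or higher, which means G80 and up, can already run general-purpose kernels regardless of DirectX feature level.

    #include <cstdio>
    #include <cuda_runtime.h>

    int main() {
        int count = 0;
        cudaGetDeviceCount(&count);
        for (int d = 0; d < count; ++d) {
            cudaDeviceProp prop;
            cudaGetDeviceProperties(&prop, d);
            // Compute capability 1.0 = G80; any DX10 part can already do "compute".
            printf("%s: compute capability %d.%d\n", prop.name, prop.major, prop.minor);
        }
        return 0;
    }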

 

Henrah

Member
Jun 8, 2009
49
0
0
Originally posted by: alyarb
Originally posted by: ronnn
So their master plan is to sell a faster cpu? Time to buy a console I guess.

i hate it when people continuously juxtapose the CPU and GPU. larrabee's IDF slides only make things worse as they depict the words "CPU" and "GPU" literally on a collision course. larrabee can partly get away with this because it's x86, but the reality is that GPGPU is not about replacing the CPU, it's about putting a bound on the unreasonably huge and hungry CPU...... if you people have n scalar equations, just write a vector equation instead.

your big GPUs are going to get faster too, as we're beginning to see from these speculative RV870 data. We (universal we) are continuously doubling the size and speed of our frame buffers and approximately doubling execution units and TMUs (with the exception of g200).......

in the meantime we have a sub-$400 Radeon that is supposed to be good for 2.7 TFLOPS. It would take almost fifty i7 920s to achieve that........ GPGPU is simply badass, whether your implementation is large or small (prepare to see petaflop mATX systems and TFLOPS notebooks).

(Shortened for post size only, not relevancy)

A very good and interesting read. Thanks alyarb :)



About GPGPU:

With NV pushing CUDA and ATI pushing OpenCL, will developers mind programming for both (and how hard is it to do so), or might this be like Blu-ray/HD DVD?
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Originally posted by: WelshBloke
Does this mean NV are giving up on gamers? ;) :evil:

Nah, I think it is NV recognizing that PC gamers have given up on the PC game developers (or vice versa, PC game developers have given up on PC gamers, in a chicken-vs-egg situation).

At what point did DX10 become a differentiating catalyst in the growth of PC game sales?

DX11 looks interesting, but can game developers really rely on DX11 becoming a significant enough selling point to catalyze PC game sales?

If game developers aren't confident in this then what basis is there to support the notion that DX11 will spur the growth of GPU sales?

I'm not saying I agree with NV, or that I feel PC game sales are lackluster, etc. I'm just saying there is a need for self-consistency in the logic here as we connect the dots... and if we take NV's statements as fact, there are ramifications to what that implies they are hearing from PC game developers about next year's titles. (And who would know better what the PC game developers are saying at a high level about DX11 than the GPU guys?)
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
Obviously DX11 will only catalyse sales when there are games people want to play that require it for a noticeably better experience. Until then it's a nice-to-have, because you know you'll need it in the future. However, no one is going to buy a card primarily for DX11 games - you might as well wait until the games arrive and then purchase, as by then the cards will have got cheaper/faster.

Same with opencl/havok - it's a nice-to-have, but until games you want to play support it, it's pointless buying a card for that reason.

Performance is the normal reason, but tbh anyone with a 4870 can run nearly all games maxed out. And the sort of people who spend big on gpus probably already have a pretty fast one; unless you must play stalker/crysis again, there's no point spending $$$ on a replacement if you aren't going to see any difference. Due to the prevalence of console ports, that's not looking too likely to change right now. Even those without a fast card would probably be wiser to save their money and just buy a cheap 4870.

Hence the big sales push for eyefinity - that's the only thing that absolutely requires a 58xx card (and at least 3 monitors :) )
 

Denithor

Diamond Member
Apr 11, 2004
6,298
23
81
DX10 was essentially a flop due in large part to the slow adoption rate of Vista.

I see Win7 gaining a much larger market share in a considerably shorter time frame, which should make DX11 more influential (there will be a large enough installed base to support DX11-only titles, something we haven't really seen with DX10).

Until games are released that require DX11 it will not be a true driver of hardware sales.
 

Red Irish

Guest
Mar 6, 2009
1,605
0
0
Originally posted by: Idontcare
Originally posted by: WelshBloke
Does this mean NV are giving up on gamers? ;) :evil:

Nah, I think it is NV recognizing that PC gamers have given up on the PC game developers (or vice versa, PC game developers have given up on PC gamers, in a chicken-vs-egg situation).

...

(Shortened for post size only, not relevancy)

In other words, game on a console and forget about upgrading your PC?

 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
there is certainly going to be a lot of growth in gpgpu going forward, but I can't help but feel that if nvidia had a monster card coming in the next 90 days, they would have at least mentioned it to the analysts. In fact, huge performance jumps like the 8800GTX and incredible value plays like the 8800GT have in the past made nvidia a LOT of money. I don't know when the gt300 release is going to be, but I think we can rule out an imminent release of a 5xxx killer, if nothing else.
 

Griswold

Senior member
Dec 24, 2004
630
0
0
Originally posted by: Denithor
DX10 was essentially a flop due in large part to the slow adoption rate of Vista.

That's the second time I've read this "flop" nonsense in this thread.

D3D10 doesn't even remotely sport the feature set D3D11 does, so it was pretty clear that it wouldn't be a beast as far as visual improvement and, to some degree, performance go. If you believe otherwise, you fell for the marketing hoopla of MS, ATI and NV.

Yet it paved the way for D3D11 by getting rid of a lot of legacy junk that was standing in the way of progress with the D3D API.

Everyone and their mother could have been using vista on day 1 after the launch and we still wouldn't have seen much better-looking games due to D3D10, simply because it was never capable of delivering that...