What happens to nvidia?

Page 2 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
But he said AMD won't make a dent in Nvidia's profits. I think a competitor that appears to be out maneuvering Nvidia in one of their big revenue areas and is putting up strong competition will in fact have an impact on their profits.

I'm not saying Nvidia is going anywhere. I'm not saying that Nvidia can't make money. But I don't know how someone can say AMD won't have any effect on Nvidia's bottom line.

Obviously AMD can affect nvidia - they are in a lot of the same markets. The problem is there's a lot of exaggeration; all this talk of the 6XXX series "killing" nvidia is highly unlikely, whatever Charlie says. Even if it turns out to be everything AMD hopes for, in truth the impact will still be fairly small.

If you want to talk about a competitor out manoeuvring them it was Intel - how much more money could nvidia have made if they hadn't been locked out of the chipset market?
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
How can you ask for no trolling when you're instigating it yourself? And how can you be so naive? You are either ignorant or in denial if you think AMD could even put a dent in nvidia's profits with Radeon cards.

Nvidia has profit margins with Quadro and Tesla cards that AMD can only dream of, in a market that AMD can't even touch; nvidia has no competition there at the moment except for Intel.

You have to look at the big picture.
http://www.nvidia.com/object/quadro-fermi-home.html
Up to $5000 a piece.
http://www.nvidia.com/object/tesla_computing_solutions.html
I don't even know how much they cost, but you certainly can't buy one.

Plenty of huge corporations purchase these. All you're focusing on is what you are interested in: the small, ludic gaming market, which isn't even 30% of their profits. And even there they are doing just fine; as you can clearly see, Fermi cards sell well.

If anything you should be thankful that AMD is early to the game with a good competitive end user gaming product, because if they kept releasing products like R600 they probably wouldn't exist by now. The two companies compete but they aren't the same.

JaG, no need to attack the OP; this is in fact a legitimate topic.
But to say AMD hasn't a clue as to how to attack the high-margin areas NV is attacking is wrong. Fusion, although meager to start with, is a real threat. You must consider that Intel and AMD have a common goal: retain the x86 market and do what NV is trying to do. Intel and AMD will work together in the area of compatible programming, and in that hunt NV is the fox, is it not? So your remarks are inflammatory and invalid. The OP is correct in that this is a valid topic.
 

tweakboy

Diamond Member
Jan 3, 2010
9,517
2
81
www.hammiestudios.com
Very nice posts, I read them all. Especially after saying their branding will only be AMD on graphics cards starting next year, I think it's pretty much go with an AMD CPU and graphics card, or go with the solo companies, Intel and nVidia. I chose Intel and nVidia many years ago with my rig and I will continue to do the same in 2020 when I upgrade. I never liked ATI; the x800 gave me hell with driver issues and CCC issues, so I'm one and done with them. The AMD CPU and video card are priced lower than Intel and nVidia. If you're looking to save cash or to go as low as you can, go with an AMD package; if you have money and want the best, go with Intel and nVidia. Thank you and gb and gl,
 

Flipped Gazelle

Diamond Member
Sep 5, 2004
6,666
3
81
... If you're looking to save cash or to go as low as you can, go with an AMD package; if you have money and want the best, go with Intel and nVidia. Thank you and gb and gl,

Your post makes no sense.

Until the release of Fermi, if you wanted "the best" this year, you had to go with AMD/ATI for a video card.

At this point, one can make a very strong argument that Nvidia is offering better value than Radeon throughout the range where the GTX 4xx and Radeon 5xxx lines intersect.
 

Will Robinson

Golden Member
Dec 19, 2009
1,408
0
0
Now that NVDA is in 2nd place..."We don't care about the enthusiast video card market" ...yeah right:rolleyes:
Green team oughta take it like men instead of trying to move the goal posts.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
So much for nVidia's winning streak. I really hope they can recover; without them, the next Radeon HD 6870 will be stuck at the same price point forever, like the HD 5870...
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I think they could indeed put a dent in Nvidia's profits... didn't Nvidia lose money last quarter?

In the earnings announcement [August 2010], the company addressed a longstanding issue--and ongoing financial burden--centered on a defect in some of its earlier GPUs and chipsets. The problem was first cited by Nvidia in July 2008 when it announced a charge ranging from $150 million to $200 million to cover costs to repair and replace GPUs and chipsets due to "weak die/packaging material" in older laptop products. "Die/packaging" essentially describes the chip. Nvidia also announced additional charges after July 2008.

On Thursday, Nvidia said it recorded an "additional net charge" of $193.9 million related to the same problem. "The charge includes additional remediation costs as well as the estimated costs of a pending settlement of a class action lawsuit related to this matter," the company said in a statement. Combined with the $282 million of net charges announced previously, the total net charge related to the issue comes to $475.9 million, the company said.

Read more: http://news.cnet.com/8301-13924_3-20013543-64.html#ixzz0yxiDXOem

So the one-time $193.9 million write-off related to GPU/chipset defects was part of the $141 million net loss. If you take that out of the equation, NV still made a profit, albeit a marginal one.
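The arithmetic behind that claim can be sanity-checked with the figures quoted from the article (the adjusted-profit number is my own back-of-the-envelope calculation, not something NV reported):

```python
# Figures (in millions of USD) as quoted in the CNET article above.
new_charge = 193.9     # "additional net charge" announced Thursday
prior_charges = 282.0  # net charges announced previously
net_loss = -141.0      # reported net loss for the quarter

# Total charge related to the defect issue.
print(round(new_charge + prior_charges, 1))  # 475.9, matching the stated total

# Backing the one-time charge out of the reported loss suggests a
# small underlying profit for the quarter.
print(round(net_loss + new_charge, 1))  # 52.9
```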

The reality is that both AMD and NV are far more susceptible to the economic downturn rather than who has the performance crown of some discrete videocard only 5% of users care about. After all, consumer graphics is not what people consider a necessity -- not like an iPhone :). Both companies had their share of failures - 5800 and 2900xt/3800 series come to mind. Fermi isn't as bad of a failure as NV5800/2900xt/3800 series were though - the pricing and performance are very competitive.

Credit must be given to ATI for releasing a more power-efficient and cost-effective architecture for gaming with the HD4000/5000 series (and probably the 6000 series). The problem with NV not being competitive with the HD6000 series is that ATI will again maintain a $400 price on the high end for another 6-9+ months.

One sure way to lose a fight... is to not show up at all. NV's biggest pitfall this generation was releasing its cards 6 months late, with the GTX460 9 months late and nothing on the low end under $180.
 
Last edited:

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
Why bother then?

Because nvidia has been known to do things just for glory in this segment.

If they cared about this market, they wouldn't release a GPU that runs at 95C, sounds like a leaf blower, and uses 300W of power. They do it just so they can claim they are leaders. The original design wasn't meant for the gaming market; it was meant for HPC. That's what pays off the R&D.

So you take GF100 and make it run fast enough to beat the competitor without causing a meltdown. It's clear that Fermi could run even at 800 MHz with all 512 SPs enabled, and that would probably make even the 5970 beg for mercy, but it wasn't feasible in terms of power and heat. So nvidia settled for the fastest single GPU and called it a day. And they barely squeezed that within spec.

As you can see, the GeForce product is just an offspring of the original design, adjusted in such ways as to be competitive with AMD while retaining some sort of leadership. Nvidia always does this; they claim to be leaders with CUDA, PhysX, and all the other candy they pack in and proudly advertise. And driver and application support is generally what sets them apart and drives people into buying their hardware as opposed to AMD's.


JaG, no need to attack the OP; this is in fact a legitimate topic.
But to say AMD hasn't a clue as to how to attack the high-margin areas NV is attacking is wrong. Fusion, although meager to start with, is a real threat. You must consider that Intel and AMD have a common goal: retain the x86 market and do what NV is trying to do. Intel and AMD will work together in the area of compatible programming, and in that hunt NV is the fox, is it not? So your remarks are inflammatory and invalid. The OP is correct in that this is a valid topic.


Fusion is more a part of AMD's CPU and GPU design. It is meant to compete with Intel, not with Nvidia. Since Nvidia produces neither CPUs nor chipsets anymore, they aren't very concerned with integrated solutions.


Now that NVDA is in 2nd place..."We don't care about the enthusiast video card market" ...yeah right:rolleyes:
Green team oughta take it like men instead of trying to move the goal posts.


Not really. It's more like they found much more business elsewhere (CUDA and HPC) and decided to focus on that.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
What kind of effect does the release of the 6XXX series have on nvidia? After the beating they have taken due to the early release and success of the 5XXX series, I'm sure they aren't happy to know it's happening again with the 6 series. I'm not sure what it costs to make the GTX 4 series, but they can't be making much selling them much cheaper than the competition. Are they gonna survive this, or could nvidia become what AMD's CPU division has become?

Keep it civil. No reason for trolling. At this point it's become a valid discussion. The economy has affected everyone, and how long can a company survive today when they make no money?
It is funny how you ask to keep this civil when the way you began this thread is itself a troll. I will try to break down your arguments one by one, in a civil way.

First, it is true that the Cypress series from AMD is a success, but there is no guarantee for the 6xxx. Look at Nvidia: with the success of the G92 core, they rolled out the entire 2xx series with success. It was at least as good as the 4xxx series from ATI at the time, and they could have simply recycled the design just like ATI did with their 4xxx series, but they didn't. The Fermi series is more or less a complete redesign in terms of architecture. While the first product isn't perfect, that is to be expected. The GTX460 clearly indicated that the Fermi architecture works at multiple ends, unlike what people said: "It isn't for gaming."

So now Nvidia has the first generation of their new architecture out, plus all sorts of CUDA, 3D and PhysX to back it up. What does ATI have? Are they going to recycle the 4xxx design again and use 25 nm this time? Sooner or later ATI needs to make a new design, and it will take a hit just like Fermi did. You can't expect something new to just work without problems. Here is the catch, though: if they take a hit like the one Nvidia took, they may not be able to come back like Nvidia.

How did Nvidia come back? Well, with those proprietary features. Those who have Nvidia 3D Vision probably won't even try an ATI card unless it offers something like 150% of the performance at 50% of the price. Those who are utilizing the power of CUDA will not switch until ATI comes up with something that can suit their needs. Yes, some can hack their way through and use their old Nvidia card as a PhysX card, which many have done, but Nvidia's new drivers won't support such a setup, so it is a matter of time before they sell their ATI card for an Nvidia card. Why do I say that? Well, they could have sold their old Nvidia card in the first place and ditched PhysX, but they didn't. An outdated Nvidia card can serve as a PhysX card. An outdated ATI card can do what?

The one thing that led to the success of the Cypress series wasn't what ATI did right, but the amount of screw-ups Nvidia made. Nvidia ceased production of the GTX285 way too early. The reason may have been their confidence in the Fermi architecture; they were ready to mass produce it as soon as it was done. On one hand my beloved Charlie was busy generating FUD about how Fermi was unmanufacturable, too big, and too hot while Nvidia was busy making their big hit. All was supposed to go well until TSMC delivered the message to both Nvidia and ATI about the manufacturing problems with their 40nm yields. While ATI could modify their design with ease, since it really wasn't new, Nvidia was more or less screwed. They couldn't go all out with Fermi, and a redesign would take too long as the factory line had stopped. They had no choice but to go forward with what they had, just not at full force. Rumors said that Fermi would have very limited supply due to yield, which was busted. Charlie's FUD about it being unmanufacturable was busted. Rumors about no tessellation were also busted. It was, however, power hungry, as it isn't 100% efficient and generates too much heat.

Usually this wouldn't have been as serious as it was, but it happened right when Windows 7 and DirectX 11 first arrived. This was a golden time when consumers wanted to buy new hardware, and Nvidia had nothing to sell between September 2009 and March 2010. Meanwhile ATI's Cypress was the only option for a new video card, which eventually led to such unexpected sales that ATI had to increase prices to smooth out demand. All the OEMs were asking for goods and Nvidia really had nothing: the new wasn't ready and the old had ceased. Just when things were bad enough on this end, their other product, Tegra 2, was discovered to have a problem all the way down at the design level. Many new products (tablets) never made it to market in time, which benefited the iPad, and therefore the iPhone.

As if things were not bad enough, some smart programmers actually caused some aftermarket heatsink fans not to spin with a new Nvidia driver. They knew SC2 is a demanding game and will cause video cards to get very hot. The intention was to adjust the fan speed (spin faster), but somehow some aftermarket heatsink fans decided not to follow instructions/specifications. Now we know that SC2 kills video cards, but at the time people believed that the driver was the sole cause of the dying cards. GG!

As you can see, Nvidia's wounds were not caused by ATI, but by TSMC and themselves. Of course public attention only focused on the 480, and people related the failure of everything to the 480, as if the 480 were the only thing Nvidia ever made. Many believed that it was too big and too hot. The truth is, the 480 is not bad and heat isn't a problem. I was expecting to find lots of threads about the death of the GTX480 in the SC2 forum, at least a few. To my surprise, no deaths of 480s! Many G92-based cards were fried, some ATI 4xxx and Cypress cards too, but no 480s! Too hot? I don't think so. The idea of it being hot, however, stayed.

So with the 6 months of glory ATI had, people are slowly selling their ATI cards for Nvidia cards, even if it is a side-grade or downgrade. Why? Well, there are things that you can only do with an Nvidia card. The FUD about the 480/470 will keep people away from them, which allows the new 460 to sell. Interestingly, the 460 is indeed a better make, with lower cost and higher yield. This leaves the 460 as the only card they should make, so they are mass-producing it and are therefore able to reduce the cost of the BoM. Eventually, the 485/475 will replace the 480/470, and a dual-GPU card is not far away.

It is true that most Cypress users aren't going to drop the card now because there is no reason to, but that is going to change really soon. Lots of 3D movies are coming out, and it won't be long until 3D home theaters kick in. Lots of laptop manufacturers see this and are now using Nvidia chips for Nvidia 3D. Where is AMD's version? I hope the 6xxx series will retrofit 3D, or else the market will swing back to Nvidia as soon as the Avatar 3D video boxset arrives. IMO ditching the ATI name is not a smart thing for AMD to do at this time.

Programmers are not gods. Think of them as smiths who craft software, where video cards and processing units are the wood and nails. Without the proper tools, the quality of that wood and those nails is meaningless. The thing about CUDA is not the quality, but the tools that drive it. The fact is, although the 2xx architecture supports CUDA, the new Fermi architecture allows better utilization for CUDA computing; think of this as making nails that go into the wood more easily. This is an improvement in quality where ATI has not even begun. CUDA is based on C++, which is something all programmers know, and Nvidia provides a CUDA development suite that programmers can forge software with, while using "open whatever" and/or ATI's Stream is like trying to make a table with bare hands. Someday someone will invent the tools to utilize them, but none have been invented yet. Even if AMD starts now, they are still years behind.

Video games are important; more important than food, some may believe. They also believe that the video card that pushes the most FPS is going to have the last laugh, and I agree. The question is, how do they get there? AMD creates quality wood and nails and hopes someone will figure out how to utilize them. Nvidia creates quality wood, nails, and the tools to use them. Who will have the last laugh?

Forget about theorycrafting; look at what we have now. Nvidia users were the first to experience Adobe Flash accelerated by the GPU. Video editing is also accelerated only for Nvidia users. When will ATI users get a taste of what the GPU can do for them other than playing games?

Will the 6xxx change all this? Will Fusion change all this? We will see.
 
Last edited:

GaiaHunter

Diamond Member
Jul 13, 2008
3,700
406
126
Because nvidia has been known to do things just for glory in this segment.

If they cared about this market, they wouldn't release a GPU that runs at 95C, sounds like a leaf blower, and uses 300W of power. They do it just so they can claim they are leaders. The original design wasn't meant for the gaming market; it was meant for HPC. That's what pays off the R&D.

Of course, the GF100 parts released for the HPC market are all cool chips that don't consume insane amounts of power... and HPC really loves high-power-consuming parts...

And of course the GeForce FX 5800 never existed... and all the recent GPUs from NVIDIA are best known for being tiny, cool, and consuming small amounts of power.

So you take GF100 and make it run fast enough to beat the competitor without causing a meltdown. It's clear that Fermi could run even at 800 MHz with all 512 SPs enabled, and that would probably make even the 5970 beg for mercy, but it wasn't feasible in terms of power and heat. So nvidia settled for the fastest single GPU and called it a day. And they barely squeezed that within spec.

Except those 512 SPs parts are nowhere to be seen.

As you can see, the GeForce product is just an offspring of the original design, adjusted in such ways as to be competitive with AMD while retaining some sort of leadership. Nvidia always does this; they claim to be leaders with CUDA, PhysX, and all the other candy they pack in and proudly advertise. And driver and application support is generally what sets them apart and drives people into buying their hardware as opposed to AMD's.

That the GF100 isn't a gaming design, there is little doubt.

The GF104 is a much more balanced part from a gaming POV. Although it must have been someone's hobby project, because clearly NVIDIA wouldn't spend resources on the gaming market.

But soon we will see whether AMD retakes the single-GPU performance crown with a smaller power budget and a smaller GPU.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
So you take GF100 and make it run fast enough to beat the competitor without causing a meltdown. It's clear that Fermi could run even at 800 MHz with all 512 SPs enabled, and that would probably make even the 5970 beg for mercy, but it wasn't feasible in terms of power and heat. So nvidia settled for the fastest single GPU and called it a day. And they barely squeezed that within spec.

Doubtful. Fermi with the remaining 32 stream processors enabled will not make a huge difference in performance; everything else in the uncore is already enabled (I think they had 1 TMU disabled), so simply adding 32 stream processors and 1 TMU will not make up for the fact that the HD 5970 is considerably faster. Look at these benchmarks.

http://en.expreview.com/2010/08/07/more-benchmarking-results-of-512sp-gtx-480-exposed/9018.html

"Under the Extreme mode with 3DMark Vantage, 512SP GTX 480 got a GPU score of 10072 while the 480SP version scored at 9521, showing a slight performance difference.

According to the Feature Test 1, 512SP delivers better texture fill performance of 41.55 GTEXELS/S, compared to 38.82 GTEXELS/S on the 480SP-equipped model. Under the enthusiast mode with Crysis Warhead (1920×1200, 8xAA on), the 512SP GTX 480 runs at 34.72fps, while the 480SP model at 32.96fps."
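Those quoted results track the extra shader count almost exactly; here is a quick sketch of the uplift ratios (computed by me from the benchmark numbers above, not taken from the article):

```python
# Uplift ratios computed from the expreview figures quoted above.
results = {
    "shader count (512 vs 480)": 512 / 480,
    "3DMark Vantage GPU score": 10072 / 9521,
    "texture fill (GTexel/s)": 41.55 / 38.82,
    "Crysis Warhead fps": 34.72 / 32.96,
}

for name, ratio in results.items():
    # The measured gains hover around the ~6.7% theoretical shader uplift,
    # far short of closing the gap to an HD 5970.
    print(f"{name}: +{(ratio - 1) * 100:.1f}%")
```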
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81

Seero, enough with the troll-thread accusations. You call the G92 a success, and indeed it was; in the end they made so many of those chips that they had to write off lots of inventory that is still sitting in warehouses. And if you didn't know, the GT200 was basically a tweaked G92 with twice of everything, just like the RV770 was compared to the RV670, so no new architecture from either GPU vendor. The failure here is that nVidia, with its two-times-bigger GT200, was only competitive with the two-times-smaller RV770, which shows the engineering prowess AMD is capable of when they get their act together. I don't call the HD 3870 a failure either: it basically offered HD 2900XT performance at 1/3 of its power consumption, with DX10.1 and competitive prices. It couldn't outperform the 8800GT, but it added pressure to force nVidia to lower prices. It sold well but was far from spectacular compared to the HD 4x00 series.

nVidia has to resort to a GPU to address CUDA, the CPU market, HPC stuff etc. because they lack a CPU. AMD does not want their GPUs to cannibalize CPU sales in the HPC, server or supercomputer markets. They want to create a more flexible solution combining a CPU and a GPU (Fusion is one such concept); it's more powerful to combine the flexibility of a CPU for general and serial code with the massive parallelism of a GPU to create a complete HPC or supercomputing solution. GPUs are horrible for serial, branchy and dependent code. (You know that!)

PhysX is a gimmick and will not take off until nVidia pulls their greedy head out of their arse and makes it more useful; bribing developers to add unrealistic debris is not what I call realism and immersion. CUDA indeed has potential, and AMD has nothing to beat it for now. 3D will take off, and who said that you can't use 3D on AMD hardware?

You can't blame nVidia entirely for Cypress's huge success. I'm pretty sure that if nVidia had released Fermi at the same time as Cypress, AMD would still have had a slight edge in sales thanks to its lower power consumption, competitive performance and competitive price. I'm pretty certain it would have been GT200 vs HD 4x00 all over again: the much bigger die with higher power consumption trying to compete with the smaller die that has much lower power consumption and is cheaper to manufacture and sell, hence more profits.

TSMC affected both nVidia and AMD. And the GTX 480 isn't bad? Well, it ain't bad, but not great either: it isn't much faster than the HD 5870 and consumes way more power. And heat doesn't have much to do with chips dying; the G92 was a fairly cool chip and yet they died quite often, while HD 4870 deaths are rare but do exist. The GTX 480 is too new to make such empty claims about. GTX 480 SLI is the hottest setup ever created by man, and there's no way you can deny that; the HD 5970, two chips on a stick, runs cooler than the GTX 480, and you say the GTX 480 isn't too bad? :rolleyes:

Like I said before, 3D can run on AMD hardware with 3rd-party software like EZ3D; let's see if AMD will eventually come up with their own solution. 3D is only barely starting to take off now.

And the argument that the GTX 460 is selling well isn't totally right: why would you drop the prices of a product that's selling well? That does no good for your brand. Like Phynaz said:

"With all those glowing reviews you don't devalue your product and your brand by cutting its price if it is selling well."

The prices of the GTX 460 are dropping like crazy because the product isn't selling fast enough off the shelf, and many of them are sitting in warehouses waiting for shipment.

You are trying to devalue AMD's efforts in the GPGPU arena. They did increase their GPGPU performance with Evergreen: a single HD 5870 can match the GTX 280 in general GPGPU performance; in some calculations it can be faster, in others slower. AMD's efforts in the GPGPU arena can't be compared with nVidia's great efforts, but AMD is a CPU company; they don't need to make GPGPU performance a top priority. nVidia is making an "all in one" solution to increase their market share because Intel and AMD are squeezing nVidia out: Intel kicked them out of the chipset market, and AMD is dominating the OEM, notebook and low-end integrated markets with their SKUs. What would nVidia have to compete with if they had only created a GPU for gaming?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
The prices of the GTX 460 are dropping like crazy because the product isn't selling fast enough off the shelf, and many of them are sitting in warehouses waiting for shipment.

Is that another one of Charlie's FUD statements that has nothing to do with reality? I am pretty sure Charlie has zero background in finance, but an honorary PhD in marketing from AMD.

Let me add more fuel to the thread:

Nvidia Tops AMD in the High-End Graphics Market in July

"The release of the GTX 460 card helped NVDA to retake the majority of the market share in the mid-range, and NVDA also gained share in the low-end," analyst Daniel Berenbaum said in a note to clients.

NVDA's share came in at 55 percent, while AMD tapped 45 percent of the high-end graphics card market. In June, NVDA had 57 percent of the high-end graphic card segment, compared to AMD's 43 percent.

Separately, the brokerage said its checks suggested that NVDA's new Fermi-based products are not doing as well with original equipment manufacturers (OEM) as in the after-market. However, NVDA share gains in the after-market would seem to imply that the weakness evident in its recent negative pre-announcement may be a market issue and this weakness will likely affect AMD as well.
 
Last edited:

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Let me add more fuel to the thread:

Nvidia Tops AMD in the High-End Graphics Market in July

NVDA's share came in at 55 percent, while AMD tapped 45 percent of the high-end graphics card market. In June, NVDA had 57 percent of the high-end graphic card segment, compared to AMD's 43 percent.

Separately, the brokerage said its checks suggested that NVDA's new Fermi-based products are not doing as well with original equipment manufacturers (OEM) as in the after-market. However, NVDA share gains in the after-market would seem to imply that the weakness evident in its recent negative pre-announcement may be a market issue and this weakness will likely affect AMD as well.


The GTX 460 is hardly a high-end card. :) So I really doubt that such news is true, especially when the Steam survey states the opposite.

http://news.cnet.com/8301-13924_3-20012025-64.html

"AMD's ATI graphics unit took 51 percent of the standalone, or "discrete," graphics chip market compared to Nvidia's share that was just shy of 49 percent, according to Mercury Research, a Cave Creek, Arizona firm that tracks graphics chip shipments. This is a sharp reversal from the same period a year ago when Nvidia had about 59 percent of the market and AMD had just under 41 percent.

Total market figures for the second quarter of 2010 had Intel with a 54.3 percent share, AMD with 24.5 percent, and Nvidia with 19.8 percent. This, also, is a setback for Nvidia: in the second quarter of last year, Nvidia had a 29.6 percent share compared to AMD's 18.2 percent.

Read more: http://news.cnet.com/8301-13924_3-20012025-64.html#ixzz0yyEInYiC"

How could nVidia catch up with AMD in less than 3 months? :rolleyes:
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
The GTX 460 is hardly a high-end card. :) So I really doubt that such news is true, especially when the Steam survey states the opposite.

Hmmm, let's see. Should we believe a Steam survey that captures a tiny fraction of a mostly North American market (and North America itself is a small market for NV), or a financial analyst who has access to exact info from NV's management regarding global sales levels? :sneaky:

AMD has taken away market share from NV, without a doubt. But you have to remember that NV had no cards priced below $250 for almost 9 months... With the GTX460/470/480 at better price/performance points than the ATI 5830/5850/5870 at the moment, ATI will need to lower prices or risk losing back some of that share.

Also, you are discussing total discrete market share in the results you posted above; that includes mobile GPUs and <$200 GPUs as well. I am only talking about the high-end discrete GTX4xx series in the desktop space.

Wait until the mobile GTX4xx series gets to market and the GTS450/455 and HD6770 launch, and let's revisit the overall discrete market share in 2 months :)
 
Last edited:

busydude

Diamond Member
Feb 5, 2010
8,793
5
76
Hmmm, let's see. Should we believe a Steam survey that captures a tiny fraction of a mostly North American market (and North America itself is a small market for NV), or a financial analyst who has information from NV's management regarding global sales levels?

Their definition of high end is totally borked. Did the survey include professional graphics cards too, or just consumer products?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Their definition of high end is totally borked. Did the survey include professional graphics cards too, or just consumer products?

Agreed about the definition. Since NV has no low-end GTX4xx cards, I presume they just grouped GTX460/470/480 as high-end. The results don't include professional graphics because that's a separate segment from "Discrete Graphics".

BTW, NV has also extended their market share in the professional space by another 2.2%.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Hmmm, let's see. Should we believe a Steam survey that captures a tiny fraction of a mostly North American market (and North America itself is a small market for NV), or a financial analyst who has access to exact info from NV's management regarding global sales levels? :sneaky:

AMD has taken away market share from NV, without a doubt. But you have to remember that NV had no cards priced below $250 for almost 9 months... With the GTX460/470/480 at better price/performance points than the ATI 5830/5850/5870 at the moment, ATI will need to lower prices or risk losing back some of that share.

Mercury Research is far more realistic than a ghost link that looks like one of those dodgy stock-picking investment spam operations. How can you call the GTX 480 the undisputed winner? A card that's barely faster than the HD 5870, more expensive, and slower than the HD 5970? The only card competitive with the HD 5870 from a price/performance perspective was the GTX 470, which only recently had price cuts and is now a great buy. Remember that the high-end market is a niche, and the "high end" definition used in that link is flawed: it basically calls the entire GTX 4x0 series high-end cards. Competing against what? The HD 5970? A card that's not breaking any sales records? Please. :rolleyes:
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
Lol steam survey.

There are like 25 million accounts on Steam, of which maybe 10 percent participate in the hardware survey, of which maybe 10 percent are actual enthusiasts ready to dish out money when the latest and greatest comes out. That makes 250 thousand enthusiasts, fanboys, or people with more money than brains. This is your idea of a representative sample?
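The estimate above is just two assumed 10% rates compounded over the account base (both percentages are the poster's own guesses, not Valve's published figures):

```python
# Back-of-the-envelope sample-size estimate from the post above.
accounts = 25_000_000    # rough Steam account count cited in the post
survey_rate = 0.10       # assumed fraction that runs the hardware survey
enthusiast_rate = 0.10   # assumed fraction of those who are enthusiasts

sample = accounts * survey_rate * enthusiast_rate
print(int(sample))  # 250000
```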
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
<Snip you because you Snip me>:twisted:
First, it is IZ3D. It isn't software, but a hardware setup consisting of its own monitor and shutter glasses. It is a 22" LCD at 1680x1050 and doesn't support multi-display.

Second, PhysX does have its value, and people who have bought a new ATI card will try to hack their drivers for it. People wouldn't be doing this if it were just a gimmick.

Everything else is covered by the post you <snipped>.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
Their definition of high end is totally borked. Did the survey include professional graphic cards too or just consumer products?
Please excuse my ignorance, but why would I want to spend $600+ on a high-end card when I can simply SLI two GTX 460s for better performance at $400?
 

busydude

Diamond Member
Feb 5, 2010
8,793
5
76
BTW, NV has also extended their market share in the professional space by another 2.2%.

There is no competition for Nvidia in the professional space.

Period.
Please excuse my ignorance, but why would I want to spend $600+ on a high-end card when I can simply SLI two GTX 460s for better performance at $400?

Your statement will be valid when Nvidia releases a $400 dual-GF104 card.