
Here's what AMD didn't want us to see - HAWX 2 benchmark

False.

Ubisoft said no to code that lowered the amount of tessellation (for both vendors).

Quit lying.

Look at the facts. Look at the numbers. Do you really want to believe this is real and not constructed with a purpose? And that the rest of the DX11 games on this planet are wrong?
 

Oh really :hmm:
Go look at the Civ 5 numbers again.

Why are there so many false claims in favour of AMD?
You did read the whole thread right?
If not...why post? :thumbsdown:
 
What I don't get is: if tessellation is so important, and such a big deal, meant to save memory so cards can run a game faster than if they produced the same IQ without it, why are some reporting this:

"The performance drop in this particular case is almost 50%" -quote xbitlabs

That's from Metro 2033, where it's only used on the characters.
Is that because they used overly detailed models, so the benefit is almost nothing beyond hurting performance for nearly no IQ improvement?

If tessellation in its current form gets used the way Metro 2033 uses it, you get basically NO IQ improvement for a 50% fps hit... ermmm, NO THANKS.

Metro 2033 -- 2560x1600, 16x AF
GTX 480 gets 56.3 fps without tess.
GTX 480 gets 30.6 fps with tess turned on.

PS: check it out for yourselves (there's basically NO difference with or without tess):
http://www.xbitlabs.com/articles/video/display/hardware-tesselation_8.html
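For what it's worth, those numbers do back up the "almost 50%" quote. A quick sanity check (plain Python, using only the fps figures above):

```python
# GTX 480 at 2560x1600 with 16x AF in Metro 2033 (xbitlabs figures above).
fps_off = 56.3  # tessellation off
fps_on = 30.6   # tessellation on

drop_pct = (fps_off - fps_on) / fps_off * 100
print(f"fps drop with tessellation: {drop_pct:.1f}%")  # ~45.6%, i.e. "almost 50%"

# The same gap expressed as extra frame time per frame:
frametime_increase_pct = (fps_off / fps_on - 1) * 100
print(f"frame time increase: {frametime_increase_pct:.1f}%")  # ~84%
```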
 
This shouldn't surprise anybody. AMD's own graph shows that above tessellation factors of 10 or so, these chips' performance falls back to near the 5000 series' level. In other words, the tessellator is still weak compared to Nvidia's 400 series. It will be interesting to see whether the 6900 series fixes this. But considering tessellation was one of the three big additions in DX11, it's disappointing that AMD didn't address the issue with these cards.

Ironic that there has been so much online hype about how AMD was first to market with DirectX 11, yet the 5000-series cards are quite slow in certain DirectX 11 games compared to the Geforce 400-series cards. This is especially true with games that use heavy tessellation.

If Nvidia were using the same die area (255 mm^2), their tessellator throughput would be reduced accordingly, because for them it scales with shader count.

So in a heavily tessellation-limited benchmark like this, to make comparisons about the tessellator being weak or strong you should actually be testing a GTS 450 against the 6870. Similar die sizes means a fair fight.

I'm sure that if AMD were allowed a 366 mm^2 die like the GTX 460, with the extra area spent purely on tessellator units, it would own that benchmark.

The world doesn't work like that. A "fair" fight is comparing the current generation AMD cards to the current generation Nvidia cards. The end.

AMD took one approach to make their 5000-series cards. Nvidia took another approach with their 400-series cards.

Due to the differences in architectures, I doubt that even an increase in die size would dramatically boost the tessellation performance of AMD's 5000-series cards.

Read what I said. They all demonstrate a tessellation bottleneck. Stop being in denial. This isn't about games or benchmarks or whatever. This is about hardware limitations.

I think we're now seeing that there are indeed plenty of paid AMD shills on the forum as well. Hard evidence is denied left and right, AMD's PR spin is held as absolute truth, and nVidia is being blamed for everything. The cat is out of the bag guys, give it up.

You hit the bullseye exactly.

Paid PR personnel posting on forums are becoming a big problem all over. Not just on tech sites, but other kinds of sites as well.

I think a lot of forums need to take a much harder stance against these PR people. Moderation needs to be much more strict regarding some of these PR people. Many companies are well aware of the marketing opportunities available on the internet. Ironic then that many enthusiasts have yet to catch on to the sleazy and sly undercover marketing and ghost blogging tactics that companies are utilizing.

As you say, just because games don't yet *use* tessellation factors that are in the 'danger zone' of AMD's hardware doesn't guarantee that there won't *be* games that use 'dangerous' tessellation factors.

Here we are, HAWX 2 is apparently the first game that enters the 'danger zone' for AMD. Result: AMD PR panics.
I'm betting there will be plenty more games to come. AMD had better come up with a good tessellator.

I'm expecting Crysis 2 to have very heavy use of tessellation. I think we will see some noticeable performance gaps between AMD and Nvidia hardware in Crysis 2.

I have noticed a trend.
AMD PR makes a claim.
The claim gets debunked.
AMD fans still cite the original (now debunked) claim as "fact".
Makes you wonder...

Indeed, makes you wonder if the number of times they re-use the debunked claims corresponds to how much they get paid.

It seems ironic that people are jumping to conclusions already about AMD wanting to fix a performance bug. We still don't know if it will impact image quality. Stuff like this happens all the time to both sides, so why do we foam at the mouth when AMD wants to increase performance?

Also, I am quite confident that you can easily find a benchmark where AMD cards destroy Nvidia cards. That doesn't mean either is faster overall; it's just the performance of that specific feature.

Also, Cayman is supposed to have more radical changes to the hardware itself, so we'll see whether there are any further tessellation speed-ups.

Just like the 6870 and 6850 were supposed to have "radical" hardware changes if you were to believe all the hype. Look how that turned out.

Cayman should be an improvement, but I don't expect anything radical.

Where is the killing being done? Can you back that up with gaming benchmarks? If not, enjoy buying a 200 dollar video card to run synthetics and argue on messageboards that you've purchased the superior product.

It will be a card that many will enjoy in future DirectX 11 games making very heavy use of tessellation.

I'm not bothered at all. Read my sig.

And I certainly don't think Radeon 5xxx or 6xxx series owners will be bothered by your take on tessellation for these cards versus the GTX 4xx series, or by your attempts to make the 5xxx and 6xxx series out to be a much worse buy.

The fact remains that AMD has 90% of the DirectX 11 market, and AMD will no doubt increase its total market share with the release of these new products this year.

Let's not forget that a lot of people are still sitting on Radeon HD 4xxx series and GTX 8800/98xx and 200 series cards. These cards can't do tessellation. A lot of these people will buy either a 5xxx or a 4xx series card, but I believe most of them will in fact buy a 6xxx series card.

Why will they buy the 6xxx series cards? Their tessellation performance is good enough while also providing the best performance/watt and performance/cost. I don't know how Nvidia can top that, except of course by lowering GTX 460 1GB prices even more, and I mean the overclocked versions. The stock GTX 460 has been left in the dust.

I'm very doubtful that AMD still has 90% of the DirectX 11 market. I would be willing to bet real money in real life on this too. I think its share of the DirectX 11 market is less than 90% and will continue to decrease now that Nvidia has a full 400-series lineup and as more of the 400-series mobile cards make their way into notebooks.

So you think a lot of people with GTX 88xx/98xx cards are suddenly going to buy AMD cards? That is an awfully bold assumption, and one that I would strongly argue is incorrect.
 
You can argue as strongly as you like.

As of 14 October 2010, AMD has/had 90% market share of DX11 products. AMD is actually way ahead of Nvidia on the laptop side of things, since their 5xxx series launched a full 6-8 months ahead of Nvidia's 4xx series.

25 million DX11 cards sold, 90% market share.

The link goes to the Xbit article.

Of course AMD's 90% share will drop as competition from Nvidia reaches the market, but they are likely to keep outselling Nvidia by a good margin, and that is not going to change with the release of the 6xxx series.
 
Are you kidding? Those models kick ass with tessellation on. It's clearly a noticeable improvement.


I hope you're being sarcastic... have you looked at the two pictures side by side in the link I posted? The IQ difference is so small to me; the only place I can see it on the soldier is that the pocket thingy is bloated/full/round instead of being square.

Otherwise it's not even noticeable, and it comes at a 50% fps hit on a 480? No thanks.


PS: the 6870, which is meant to compete against the 460, does faster tessellation than a 460 in Unigine 2.0. So the current 6xxx series probably has better tessellation than the 5xxx ones.
HAWX 2 slept with Nvidia, so of course that benchmark runs better on Nvidia products; it was optimised to run on theirs.


I'm very doubtful that AMD still has 90% of the DirectX 11 market. I would be willing to bet real money in real life on this too. I think its share of the DirectX 11 market is less than 90% and will continue to decrease now that Nvidia has a full 400-series lineup and as more of the 400-series mobile cards make their way into notebooks.


You're forgetting that APUs will have DirectX 11 too... CPUs that'll go into FAR more notebooks than the 4xx will. AMD has a new line in the 6850/6870/6950/6970/6990, which will be out before the 580 and will likely be much stronger cards.

Llano is supposed to offer 56xx-class performance on the die of a CPU (this will kill a lot of Nvidia 4xx mobile sales, because they'll be slower and use more power, which is bad in a notebook).
With dual-channel fast DDR3 you'll have over 30 GB/s of bandwidth, and more efficient use of that bandwidth because it's on the CPU.
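The 30 GB/s claim is at least plausible as peak-bandwidth arithmetic. A sketch, assuming dual-channel DDR3-1866 (the actual Llano memory speed is an assumption here, not a confirmed spec):

```python
# Theoretical peak bandwidth = transfer rate x channel width x channels.
transfers_per_sec = 1866e6  # DDR3-1866 (assumed), in transfers/s
bytes_per_transfer = 8      # each DDR3 channel is 64 bits wide
channels = 2                # dual channel

peak_gb_s = transfers_per_sec * bytes_per_transfer * channels / 1e9
print(f"peak bandwidth: {peak_gb_s:.1f} GB/s")  # ~29.9 GB/s
```

Real sustained bandwidth is of course well below this theoretical peak.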

If you go by Steam numbers it's closer to 85% of the DirectX 11 market, but those only represent statistics from 30+ million users or so.

With 85% of the market using AMD cards... it doesn't make sense to use tessellation that works best on Nvidia and hurts the other 85%. Those are the realities of having market share, just like we'll probably see less and less PhysX in new games as well (simply because most people don't have it, and it's Nvidia-only).
 
I hope you're being sarcastic... have you looked at the two pictures side by side in the link I posted? The IQ difference is so small to me; the only place I can see it on the soldier is that the pocket thingy is bloated/full/round instead of being square.

Otherwise it's not even noticeable, and it comes at a 50% fps hit on a 480? No thanks.

Tess didn't have a 50% hit at other resolutions, just 25x16 for that one card.

You can turn tess off if you want. Turn it on if you want. Everybody is happy. :thumbsup:
 
Paid PR personnel posting on forums are becoming a big problem all over. Not just on tech sites, but other kinds of sites as well.

I think a lot of forums need to take a much harder stance against these PR people. Moderation needs to be much more strict regarding some of these PR people. Many companies are well aware of the marketing opportunities available on the internet. Ironic then that many enthusiasts have yet to catch on to the sleazy and sly undercover marketing and ghost blogging tactics that companies are utilizing.
No doubt, those posting on forums that are paid to give a biased opinion (which they don't declare) are scum, and a problem that needs to be addressed.

But the rest of your post suggests you think this is something AMD is guilty of and Nvidia is not, which I find unlikely (you appear to be biased in favour of Nvidia). If, like me, you are interested in removing shills, then you'd want to remove them from all sides. Nvidia has been found guilty of misinformation regarding benchmarks in the past, so I think it's unlikely that AMD is guilty of this and Nvidia is not.
 
Tess didn't have a 50% hit at other resolutions, just 25x16 for that one card.

You can turn tess off if you want. Turn it on if you want. Everybody is happy. :thumbsup:

Ah... he cherry-picked the benches, found one place where there was a deviation from the norm, and presented that as the norm?

I guess that settles his "credibility" *cough*
 
No doubt, those posting on forums that are paid to give a biased opinion (which they don't declare) are scum, and a problem that needs to be addressed.

But the rest of your post suggests you think this is something AMD is guilty of and Nvidia is not, which I find unlikely (you appear to be biased in favour of Nvidia). If, like me, you are interested in removing shills, then you'd want to remove them from all sides. Nvidia has been found guilty of misinformation regarding benchmarks in the past, so I think it's unlikely that AMD is guilty of this and Nvidia is not.

AMD is no better?
Fuddy has made so many false claims that I have actually lost count...
 
You can argue as strongly as you like.

As of 14 October 2010, AMD has/had 90% market share of DX11 products. AMD is actually way ahead of Nvidia on the laptop side of things, since their 5xxx series launched a full 6-8 months ahead of Nvidia's 4xx series.

25 million DX11 cards sold, 90% market share.

The link goes to the Xbit article.

Of course AMD's 90% share will drop as competition from Nvidia reaches the market, but they are likely to keep outselling Nvidia by a good margin, and that is not going to change with the release of the 6xxx series.

I didn't catch this link explicitly mentioning 90% DX11 share, but no matter.

Another thing to realize is that DX11 is still a tiny percentage of the overall graphics market.

By the time DX11 becomes more prevalent and has a bigger share of the overall graphics market, AMD's DX11 share will go down quite a bit.

You are correct in the mobile market, as many major mobile brands have yet to use the Nvidia 400M-series of cards. Some models with these cards are already out though, and many more are coming in the next few months.

I hope you're being sarcastic... have you looked at the two pictures side by side in the link I posted? The IQ difference is so small to me; the only place I can see it on the soldier is that the pocket thingy is bloated/full/round instead of being square.

Otherwise it's not even noticeable, and it comes at a 50% fps hit on a 480? No thanks.

PS: the 6870, which is meant to compete against the 460, does faster tessellation than a 460 in Unigine 2.0. So the current 6xxx series probably has better tessellation than the 5xxx ones.

You need to look closer. There are far more differences. Look at the details of the guns and all of the details of the soldiers. This is a game that doesn't even tessellate the environment, and I can spot noticeable differences in visual quality.

You're forgetting that APUs will have DirectX 11 too... CPUs that'll go into FAR more notebooks than the 4xx will. AMD has a new line in the 6850/6870/6950/6970/6990, which will be out before the 580 and will likely be much stronger cards.

If you go by Steam numbers it's closer to 85% of the DirectX 11 market, but those only represent statistics from 30+ million users or so.

With 85% of the market using AMD cards... it doesn't make sense to use tessellation that works best on Nvidia and hurts the other 85%. Those are the realities of having market share, just like we'll probably see less and less PhysX in new games as well (simply because most people don't have it, and it's Nvidia-only).

You're forgetting that Intel will soon have DX11 integrated graphics available. You're also forgetting that Nvidia dominates professional and non-consumer markets where AMD has a weaker presence.

What exactly is your point? That AMD has a high DX11 share? This is obvious. Do you mean to say AMD will continue to have a high DX11 share? That is arguable.

You have no idea how well AMD's APUs are going to sell.

Furthermore, AMD currently has no answer for Nvidia's Optimus technology. Optimus continues to become more common on notebooks, and it is a very strong feature that allows for significantly better battery life than notebooks equipped with AMD GPUs.

Wait, did you just group together the 68xx cards with AMD's other upcoming 6000-series cards and say that they will likely be much stronger cards than the GTX 580? That is one of the most ridiculous statements I've heard in a while. I would strongly argue that AMD's upcoming 69xx cards won't be much stronger than Nvidia's 500-series cards. What makes your statement any more correct than mine? Nothing, so this sort of posting is meaningless.

Your logic is flawed. According to this logic, DX10 cards still make up the majority of the graphics market. Therefore, game developers and future games should cater only to DX10 since that is the majority of the market. That is just plain silly. Also using your logic, developers should not care about their games being optimized in DX10 on AMD/ATI hardware, since Nvidia cards make up the majority of the DX10 market.

No doubt, those posting on forums that are paid to give a biased opinion (which they don't declare) are scum, and a problem that needs to be addressed.

But the rest of your post suggests you think this is something AMD is guilty of and Nvidia is not, which I find unlikely (you appear to be biased in favour of Nvidia). If, like me, you are interested in removing shills, then you'd want to remove them from all sides. Nvidia has been found guilty of misinformation regarding benchmarks in the past, so I think it's unlikely that AMD is guilty of this and Nvidia is not.

I never denied that Nvidia might use such PR people on forums. I wouldn't be surprised if they did.

However, in this case I call it like I see it, and on these forums specifically I see a clear trend of primarily AMD PR people, based on username patterns and posting behavior. If Nvidia PR people post on these forums, there seem to be fewer of them.

I agree that all such people need to be removed from forums, no matter what company they work for.
 
Nvidia seems a lot more likely to pull PR stunts like that; their marketing is one of their main strengths. I'm with Triggaaar on that one. Also, most people here have a brand they like more than the other; for me that happens to be AMD/ATI.

I used to be an Nvidia-only guy... then I tried the other side, and all the Nvidia fanbois with "ATI has bad drivers" nearly drove me nuts reading about it on the net. It's flat-out FUD. I've used the 4xxx series and the 5xxx series and haven't had any driver issues at all.

Also, as far as I know, AMD/ATI has never had drivers that turned off fans, resulting in cards getting too hot and being destroyed by a driver. FFXI has driver issues at the moment with the 4xx series, and has had them for many months; people on 4xx cards are getting 5-10 fps where an old 2xx series card can easily hold a constant 30+ fps (it's a 30 fps limited game).

Also, I feel we've gotten totally off track from the supposed subject of the HAWX 2 benchmark.

Nvidia paid to have it optimised for them, then pushed reviewers to use this benchmark on launch day, even though the game isn't out yet, to make AMD/ATI's new launch look bad. Then, to show the 460 as fast compared to the new cards, they're shoving the fastest overclocked 460s you can buy down reviewers' throats.

Anyway, I hope prices stay as they are... but I don't see how Nvidia can keep the 470, a 448 mm^2 chip, priced around the same as AMD's 255 mm^2 one. Prices might go up again after the 68xx launch has blown over, because Nvidia is probably losing money every time it sells a 470.
 
I didn't catch this link explicitly mentioning 90% DX11 share, but no matter.

Another thing to realize is that DX11 is still a tiny percentage of the overall graphics market.

According to Steam, DX11 is around 11% of the market:
http://store.steampowered.com/hwsurvey/videocard/


And of that, NVIDIA has 15.72%.

I guess 90% sounds better than 85% 😉

I predict the next results will put that closer to 80%.
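To spell out the arithmetic behind those Steam figures (a sketch; the only inputs are the survey percentages above, and it assumes every non-NVIDIA DX11 card is an AMD one, which was true at the time):

```python
# Steam survey: DX11 cards are ~11% of all surveyed cards,
# and NVIDIA holds 15.72% of that DX11 slice.
dx11_share_of_market = 0.11
nvidia_share_of_dx11 = 0.1572

amd_share_of_dx11 = 1 - nvidia_share_of_dx11
print(f"AMD share of DX11 cards: {amd_share_of_dx11:.2%}")  # 84.28%

# And as a slice of the whole surveyed market:
amd_dx11_overall = dx11_share_of_market * amd_share_of_dx11
print(f"AMD DX11 cards among all cards: {amd_dx11_overall:.1%}")  # ~9.3%
```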
 
Nvidia seems a lot more likely to pull PR stunts like that; their marketing is one of their main strengths. I'm with Triggaaar on that one. Also, most people here have a brand they like more than the other; for me that happens to be AMD/ATI.

I used to be an Nvidia-only guy... then I tried the other side, and all the Nvidia fanbois with "ATI has bad drivers" nearly drove me nuts reading about it on the net. It's flat-out FUD. I've used the 4xxx series and the 5xxx series and haven't had any driver issues at all.

Also, as far as I know, AMD/ATI has never had drivers that turned off fans, resulting in cards getting too hot and being destroyed by a driver. FFXI has driver issues at the moment with the 4xx series, and has had them for 7+ months; people on 4xx cards are getting 5-10 fps where an old 2xx series card can easily hold a constant 30+ fps (it's a 30 fps limited game).

Also, I feel we've gotten totally off track from the supposed subject of the HAWX 2 benchmark.

Nvidia paid to have it optimised for them, then pushed reviewers to use this benchmark on launch day, even though the game isn't out yet, to make AMD/ATI's new launch look bad. Then, to show the 460 as fast compared to the new cards, they're shoving the fastest overclocked 460s you can buy down reviewers' throats.

Anyway, I hope prices stay as they are... but I don't see how Nvidia can keep the 470, a 448 mm^2 chip, priced around the same as AMD's 255 mm^2 one. Prices might go up again after the 68xx launch has blown over, because Nvidia is probably losing money every time it sells a 470.


And do you have proof for the first bolded statement?
Can you show me where it's not standard DX11 code, but optimized for NVIDIA GPUs only?


If not, I must ask you:
Do you work for AMD? 🙂

And all the numbers I have seen so far show that NVIDIA has way better margins on their G1xx series than AMD has with the 5xxx series... so how do you come to the "conclusion" in the second bolded part? :hmm:
 
Just a few thoughts to provide some clarity and debunk some arguments:

- Despite all the ranting and raving from people that Nvidia is losing money selling certain products, they're doing very well financially at the moment and they are likely to have some decent Q3 results, better than AMD probably

- Despite AMD's dominance so far in the DX11 market, they still failed to make a profit in Q3 this year. The reasons for this are not even ATI-related necessarily, but it is what it is. Point is that AMD's dominance of the DX11 market so far made very little difference to AMD's financial situation overall.
 
Because if you're selling chips for a living, and your chip is TWICE as big as your competitor's product, and they're both priced the same, you have to be making less than your competitor.

No, I don't work for anyone; I'm just tired of listening to all the pro-Nvidia fanbois.
There are people who like AMD too, regardless of what you guys think of them.

Do you mean, do I have bank records tracking money from Nvidia into the pockets of the HAWX 2 developers? No. I'm going by rumors on the net, but it's most likely true, given the two companies' history.

Did you know Assassin's Creed had DirectX 10.1 support removed when it turned out it ran better on AMD/ATI cards, and Nvidia had it pulled? Nvidia can phone up Ubisoft and have them pull DirectX 10.1.

link:
http://techreport.com/discussions.x/14707


What did Ubisoft get for pulling DirectX 10.1 from their game? Money? Nvidia says no; they just got to put "The Way It's Meant to Be Played" on their game.

"...As part of this program, Nvidia may promote Assassin's Creed for Ubisoft in various ways (including magazine advertising) and may offer engineering resources and development assistance."

So: free advertising by Nvidia, and you let Nvidia code the game for you, so it runs really well on theirs and badly on their competitors'.



"Point is that AMD's dominance of the DX11 market so far made very little difference to AMD's financial situation overall. "

Many suspect it's tax evasion... having so much of the market and selling so many cards has made them profits. Others suspect it's because they paid for the research on the new CPUs with graphics elements... their APUs.

The point still stands though: if you're using a chip half the size of your competitor's at the same price, and you sell chips for a living, you should be making buttloads of money.
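The die-size argument can be made concrete with a crude candidate-dies-per-wafer estimate. This is only a sketch: it assumes a 300 mm wafer and ignores edge loss, scribe lines, wafer pricing, and above all defect-driven yield, which hurts big dies even harder than this simple ratio suggests. The 255 and 448 mm^2 figures are the ones quoted earlier in the thread:

```python
import math

WAFER_DIAMETER_MM = 300  # assumed 300 mm wafer

def dies_per_wafer(die_area_mm2: float) -> int:
    """Upper-bound candidate dies per wafer: wafer area / die area."""
    wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2  # ~70,686 mm^2
    return int(wafer_area // die_area_mm2)

amd_dies = dies_per_wafer(255)     # ~255 mm^2 die
nvidia_dies = dies_per_wafer(448)  # ~448 mm^2 die
print(amd_dies, nvidia_dies)  # 277 vs 157 candidates per wafer
```

So even before yield enters the picture, the smaller die gets roughly 1.75x as many candidates out of each wafer, which is the whole basis of the margin argument above.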
 
Having so much of the DX11 market ... which is still a small fraction of the overall graphics market? So what.

I can quickly see where this argument is devolving to. I will state this one last time, that DX10 cards still make up the majority of the graphics market, and Nvidia made a lot of profit with their DX10 cards. The majority of the DX10 market is made up of Nvidia cards.

You also continue to ignore Nvidia's strong market presence in non-consumer graphic markets.
 
According to Steam, DX11 is around 11% of the market:
http://store.steampowered.com/hwsurvey/videocard/


And of that, NVIDIA has 15.72%.

I guess 90% sounds better than 85% 😉

I predict the next results will put that closer to 80%.

Simple maths, really. AMD had 100% of the DX11 market for quite some time, and from 100% there is no going up. You must know this. You probably also know that Steam hardware surveys are not exactly perfectly accurate for the big picture. DX11 might be a small part of the market, but it's what's being sold now. If you have 60% of the whole market but only 10% of the DX11 market, that shows how poorly you've performed in this generation.
 
Can you show me where it's not standard DX11 code, but optimized for NVIDIA GPUs only?
I find it amazing that there is such a thing as a fanboy (for any brand), but unfortunately there is, so we have to deal with lots of accusations flying around, and who knows whether any of them ever had proof.

Regarding a HAWX 2 benchmark though: would a reviewer generally choose to compare graphics cards using a game that's not yet been released? I'd have thought not, because you're not likely to get results that help consumers choose which card is for them. So personally, I'm just going to ignore the annoying PR of both companies, and ignore the results of games that aren't released yet.

Regarding tessellation: I looked at those screenshots from Metro, and I can't appreciate the difference between the two. Tessellation sounds like an interesting technology that will help improve graphics, but I just don't see an improvement in those pictures. And that's with me looking at stills; if I were in the middle of playing a game, there's no way I'd notice whether tess was on or off (in that example).
 
Regarding tessellation: I looked at those screenshots from Metro, and I can't appreciate the difference between the two. Tessellation sounds like an interesting technology that will help improve graphics, but I just don't see an improvement in those pictures. And that's with me looking at stills; if I were in the middle of playing a game, there's no way I'd notice whether tess was on or off (in that example).

omg finally some sense 🙂

Triggaaar, you're not alone; I can't notice any difference in those models' image quality either, with tess on or off... and if it sometimes comes with a performance loss of near 50%, then regardless of whether AMD or Nvidia puts it into games, I won't be using it when playing them.

Also, I find it ironic that for a long time, over many generations of cards, AMD/ATI was pushing to get people to use tessellation while Nvidia was saying it's a waste: it costs performance without IQ improvements! Nvidia is the reason DirectX 10 didn't get tessellation, from what I heard.

Now both companies have done a 180: it's ATI saying to use tessellation within limits, and Nvidia pushing to get people to use as much as possible.
 
I find it amazing that there is such a thing as a fanboy (for any brand), but unfortunately there is, so we have to deal with lots of accusations flying around, and who knows whether any of them ever had proof.

Regarding a HAWX 2 benchmark though: would a reviewer generally choose to compare graphics cards using a game that's not yet been released? I'd have thought not, because you're not likely to get results that help consumers choose which card is for them. So personally, I'm just going to ignore the annoying PR of both companies, and ignore the results of games that aren't released yet.

How many times do we need to repeat this?
Read the *bleep*ing thread; there you will find mention of:
Far Cry 2
AvP
DiRT 2

All released as benchmarks and used by reviewers before the games were out.

Regarding tessellation: I looked at those screenshots from Metro, and I can't appreciate the difference between the two. Tessellation sounds like an interesting technology that will help improve graphics, but I just don't see an improvement in those pictures. And that's with me looking at stills; if I were in the middle of playing a game, there's no way I'd notice whether tess was on or off (in that example).

Ah, the "I can't see it..." argument.
Only comes up when performance isn't where it should be.
 
Just a few thoughts to provide some clarity and debunk some arguments:

- Despite all the ranting and raving from people that Nvidia is losing money selling certain products, they're doing very well financially at the moment and they are likely to have some decent Q3 results, better than AMD probably

- Despite AMD's dominance so far in the DX11 market, they still failed to make a profit in Q3 this year. The reasons for this are not even ATI-related necessarily, but it is what it is. Point is that AMD's dominance of the DX11 market so far made very little difference to AMD's financial situation overall.



While true AMD as a whole didn't turn a profit, neither did Nvidia, at least according to the last few quarterly reports.

As of the latest report, issued Aug. 30, 2010, Nvidia had an overall net loss, just like last quarter and last year.

Their business for the last reported quarter, as noted above, had a comprehensive net loss of just over $140M, but that was worse than the comparable quarter from last year, which was a loss of $105M.

For the six months prior to Aug. 1, 2010, they had a comprehensive loss of $3.3M, which is a big improvement over the same period last year: a loss of $306M.


By division: (For the last quarter)

GPU.....loss of $221M

PSB (Professional side...Quadro).....$85M profit

CPB (Tegra, etc.).......loss of $38M


This represents larger losses in the GPU and CPB divisions as compared to the same quarter a year ago.

For the six months ending with this quarterly report, again, the only division showing a profit is the PSB division with both the GPU and CPB divisions losing money, again....same as last year.

You can read it in their last quarterly report, found here:

http://msnmoney.brand.edgar-online....&companyid=4967&ppu=/Default.aspx?ticker=NVDA


The division breakout is on Page 27.

But as a company, Nvidia lost $140M for last quarter. Page 4 of the linked report shows their overall performance for the quarter and it compared to last year's same quarter.

So, while Nvidia has $1.7B in cash and cash equivalents (stock, bonds, etc.), Nvidia certainly isn't making huge profits but instead losing money.
 