
Here's what AMD didn't want us to see - HAWX 2 benchmark

So, we have 14 pages for one demo for one game that will likely suck on the PC anyway?


Amazing.

Nice troll...next time read the thread before you make yourself look like nothing more than a flamebaiter.

It's not just HAWX2 that reveals AMD's limited tessellation implementation, and you would know that if you had bothered to read the thread.

*still amazed at all the derailing attempts by AMD fans in this thread*
 
Nice troll...next time read the thread before you make yourself look like nothing more than a flamebaiter.

It's not just HAWX2 that reveals AMD's limited tessellation implementation, and you would know that if you had bothered to read the thread.

*still amazed at all the derailing attempts by AMD fans in this thread*

I'm an AMD fan and a troll now?

I'm not sure there is a way to derail people who are clearly unhinged.
 
foo..snip


No. YOUR arguments are foolish. HAH

See what I did there?

Let me put it to you like this: prior to the GF100 release, AMD was considered by what I can only call the majority of the people here (since only a few write while a lot view, you can't really be sure either way) to be ahead of Nvidia on graphics tech. By almost a year.

Now, you can talk about proprietary features all day long, but the graphics that you see on your screen were that much faster and better on AMD tech.

Now we have AMD releasing a GTX 460 1GB OC punisher in the HD 6870, JUST 3 months after Nvidia's card launched. The advance in tech, and the steps toward the next launch, just got better in AMD's favor.

Unless Nvidia can manage to launch something before this year is over, they will be more than a year behind in graphics tech for gamers and enthusiasts.

This is my opinion. Nvidia may very well, as many suggest on this forum, opt out of the gamer market, something I'm sure you as well as I will consider a surrender 🙂 But let's hope that doesn't happen, since we like these price wars and we like to get our cool hardware for lower prices.

Madcat out

 
The reason I did two 6870's in CF is because together they are about the same price, power consumption, and die size as a 480. You can do GTX 460's as well if you want; pretty sure the 6870's still win. IMO, superior technology is when, with the same input normalized (price, power consumption, die size, or any combination of them), the performance is better. So, IMO, 6870's in CF are doing pretty well technology-wise. Also, these things are a lot smaller than Cypress and almost as fast (better in CF), and Cayman is supposed to be almost double the size of Barts with more changes to the core. I think AMD will have a comfortable lead till the true next-gen chips, which will be very exciting (Fermi definitely needs 28nm, but with some good tweaking and the lessons nVidia learned, I think we will have a good competitive end of 2011).
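The normalization idea the post describes can be sketched numerically. Note that the performance, price, wattage, and die-size figures below are illustrative placeholders, not benchmark results:

```python
# Compare cards by performance per normalized input (price, watts, mm^2).
# All numbers here are made-up placeholders, not measurements.
cards = {
    "GTX 480":    {"perf": 100, "price": 450, "watts": 250, "mm2": 529},
    "HD 6870 CF": {"perf": 115, "price": 480, "watts": 300, "mm2": 510},
}

def ratios(card: dict) -> dict:
    """Performance divided by each normalizing input."""
    return {k: round(card["perf"] / card[k], 3) for k in ("price", "watts", "mm2")}

for name, card in cards.items():
    print(name, ratios(card))
```

With placeholders like these, the CF pair comes out ahead per dollar and per mm² of silicon, which is the shape of the argument being made, whatever the real numbers were in 2010.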
 
Nvidia's problem is PCIe (power restrictions)... the 480s use so much more power... *IF* they managed to fix that with the 580s, we might see them compete with the 6950/6970's.

If not... well, there is always late 2011 OR early 2012, when nVidia does a 28nm graphics card.
 
Unless Nvidia can manage to launch something before this year is over, they will be more than a year behind in graphics tech for gamers and enthusiasts.

I think you meant 1 year behind in market share, not performance. :thumbsup:

NV still has a good product in GTX460 and its overclocked variants, while GTX460 768mb has the $150 market all to itself. HD6870 is no faster than the GTX470 (but a better overall card due to power consumption and noise).

Since AMD still has not released any high-end HD69xx series, the only high performing card for enthusiasts they have is the HD5970. So imo, they are both 1 year behind for enthusiasts! The countdown for NV's high-end card won't start until HD69xx is out.
 
I think you meant 1 year behind in market share, not performance. :thumbsup:

NV still has a good product in GTX460 and its overclocked variants, while GTX460 768mb has the $150 market all to itself. HD6870 is no faster than the GTX470 (but a better overall card due to power consumption and noise).

Since AMD still has not released any high-end HD69xx series, the only high performing card for enthusiasts they have is the HD5970. So imo, they are both 1 year behind for enthusiasts! The countdown for NV's high-end card won't start until HD69xx is out.

From a manufacturing and economic point of view, a 480/470 die is about 550 mm2, a 460 is 380 mm2, and a 6870 is 255 mm2. As yield is not linearly correlated with size, and profit is the last money left on the card, this just spells big, big trouble on the financial side for NV.

Looking at the Steam numbers confirms they have a small market share now (it might well stay the same because of the die/yield issue), but the future looks even worse with Fusion (Ontario/Llano), SB integrated graphics, and BD coming.

I don't know if the 460 is a good card, but it can't compete with the slim size of Barts. It's not economically feasible in the short, medium, or long term.

We all prefer competition, as you say, but if you look at your rig, you owe the low prices primarily to AMD. The first computers I bought 15-20 years ago were just so expensive because Intel was the only one (AMD kind of made bad copies). There will always be competition in the graphics field. Intel is coming on strong with SB; I know it's not exactly high-end, but it caters to a large part of the market. Without strong graphics prices, no AMD.

This market has a tendency towards a natural monopoly given the extremely high entry cost for development and fabs. And that's what is starting to hit NV now. AMD has been hit for nearly all these years - it's a cash-burning machine and has never made a profit for more than a few years at a time.
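The yield-vs-die-size argument can be made concrete with a back-of-envelope sketch. The defect density and the die areas below are assumptions (TSMC's real 40nm numbers were never public); the sketch uses the standard edge-loss dies-per-wafer approximation and a simple Poisson yield model:

```python
import math

WAFER_DIAMETER_MM = 300   # standard 300 mm wafer
DEFECTS_PER_CM2 = 0.4     # assumed defect density, purely illustrative

def dies_per_wafer(die_area_mm2: float) -> int:
    """Gross dies per wafer via the common edge-loss approximation."""
    d = WAFER_DIAMETER_MM
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2: float) -> float:
    """Poisson yield model: Y = exp(-area * defect_density)."""
    return math.exp(-(die_area_mm2 / 100.0) * DEFECTS_PER_CM2)

def good_dies(die_area_mm2: float) -> float:
    return dies_per_wafer(die_area_mm2) * poisson_yield(die_area_mm2)

for name, area in [("GF100-class (529 mm2)", 529), ("Barts-class (255 mm2)", 255)]:
    print(f"{name}: {dies_per_wafer(area)} gross, "
          f"{poisson_yield(area):.0%} yield, {good_dies(area):.0f} good dies")
```

The point of the sketch is the non-linearity: the small die gets more than twice as many gross dies per wafer, *and* each one is more likely to work, so the good-die count diverges far faster than the area ratio alone suggests.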
 
From a manufacturing and economic point of view, a 480/470 die is about 550 mm2, a 460 is 380 mm2, and a 6870 is 255 mm2. As yield is not linearly correlated with size, and profit is the last money left on the card, this just spells big, big trouble on the financial side for NV.

Looking at the Steam numbers confirms they have a small market share now (it might well stay the same because of the die/yield issue), but the future looks even worse with Fusion (Ontario/Llano), SB integrated graphics, and BD coming.

I don't know if the 460 is a good card, but it can't compete with the slim size of Barts. It's not economically feasible in the short, medium, or long term.

We all prefer competition, as you say, but if you look at your rig, you owe the low prices primarily to AMD. The first computers I bought 15-20 years ago were just so expensive because Intel was the only one (AMD kind of made bad copies). There will always be competition in the graphics field. Intel is coming on strong with SB; I know it's not exactly high-end, but it caters to a large part of the market. Without strong graphics prices, no AMD.

This market has a tendency towards a natural monopoly given the extremely high entry cost for development and fabs. And that's what is starting to hit NV now. AMD has been hit for nearly all these years - it's a cash-burning machine and has never made a profit for more than a few years at a time.

Who cares about die size when gaming?
I care about performance/features.

And from where I am sitting, it looks like AMD's focus on die size is costing them in features/performance.
The things I care about when gaming.
 
Who cares about die size when gaming?
I care about performance/features.

And from where I am sitting, it looks like AMD's focus on die size is costing them in features/performance.
The things I care about when gaming.

If you care about performance/features, then you probably should, to a non-zero extent, care about the financial health of these companies, because it is (in large part) the cash flow generated by today's products that determines the R&D budget for the team tasked with creating and implementing the performance/features of tomorrow's products.

This is one of those occasions where you don't have to agree, but we are within our rights to wonder why you take the time and expend the effort to crap on a conversation between forum colleagues who so clearly do care to talk about such things.

And I know you probably know this, but this is VC&G, not the PC Gaming forum, so you are going to encounter a fair amount of non-gaming-specific discourse and dialogue in here.

If this is not the sort of discussion you wish to be exposed to, then you will probably find yourself in better company in a more relevant (to you) sub-forum.
 
I'm also not convinced that die size directly correlates to whether my favorite graphics card maker can make profits and continue to grow and invent.
Wall Street is puzzled by recent AMD earnings, or the lack thereof.
Well, if I wasn't impressed with $33 million, you'd better believe I think last quarter's $1 million operating profit on $392 million in sales is a fly in the company's ointment.
An interesting point, IMHO, is that there is no current workstation product based on the GF104 GPU, that I know of.
 
notty22, that is $1M after AMD is done paying for whatever their R&D teams are working on for next year and the year after.

Basically they are re-investing nearly every dime they make, which is about as awesome as it gets if you are a consumer who wants to buy next year's graphics cards. Not so awesome if you're an investor with an eye for the short term.
 
I'm also not convinced that die size directly correlates to whether my favorite graphics card maker can make profits and continue to grow and invent.
Directly correlates? Definitely not. It's only one part of the equation. But it is the most fundamental aspect of manufacturing cost. Generally, the smaller the die, the lower the power consumption, the better the yields, and the more dice per wafer. This all corresponds to cost savings. And it trickles down into board costs: lower power means less robust power regulation, a smaller fansink, etc.

One thing I think a lot of people overlook is the importance of the OEM market. The 57xx cards have been a big hit here, not because of their performance (which is fine but not earth-shattering) but because the product is small and power-efficient. OEMs love this because they can get away with cheaper support components. Nvidia is in the same boat; the rebranded G80/G92 et al. and the low-end Fermis sell in large numbers in the OEM market.

As for AMD's financials, you won't get an answer out of me as to why they can't turn a decent profit. But on the flip side, they are slowly improving; it was not too long ago they were losing 400 million/quarter.
 
If you care about performance/features, then you probably should, to a non-zero extent, care about the financial health of these companies, because it is (in large part) the cash flow generated by today's products that determines the R&D budget for the team tasked with creating and implementing the performance/features of tomorrow's products.

This is one of those occasions where you don't have to agree, but we are within our rights to wonder why you take the time and expend the effort to crap on a conversation between forum colleagues who so clearly do care to talk about such things.

And I know you probably know this, but this is VC&G, not the PC Gaming forum, so you are going to encounter a fair amount of non-gaming-specific discourse and dialogue in here.

If this is not the sort of discussion you wish to be exposed to, then you will probably find yourself in better company in a more relevant (to you) sub-forum.

I think you misunderstand me.
Making a small die by cutting out features is not progress to me...it's the opposite.
I could turn the tables around and say:
This is not the "financial forum", where the cost of dies would be more suited.

We are talking GPUs here.
High-end GPUs.
Their primary use is...gaming.

Given the choice of:
a) small die, but less features/performance/power consumption
b) large die, with more features/performance/power consumption

I go with option b.

Otherwise I would just buy a console.

Again, when I'm using my GPU, its die size is irrelevant.
Any cost savings due to lower power consumption is negated by the "exchange" rate of GPUs (short lifespan before upgrade)...and the lesser performance.

In my view, if your primary concern is die size/power consumption, you should stick to IGPs...if your primary concern is performance, you should buy a GPU.

My experience tells me that whenever anything other than performance is brought up (be it power consumption, noise, die size, etc.), it's due to not having the best-performing product.

It happens every cycle, and it's always the "side" with the lesser-performing GPU that tries to shift the focus from performance onto something else.
 
I think it's almost impossible to guesstimate the business model of Nvidia or AMD just from knowing the die size. We know the same dies are used in workstation products that help pay for the cost of a wafer.
Consider trying to guesstimate how a newspaper turns a profit. The cost of the paper alone, never mind the ink, presses, and people to manufacture it every day, is MANY times the 1 dollar cover price. It's obviously the advertising where most of the actual profit comes from. Newspaper page size is directly related to profit: a smaller (fewer pages) newspaper means less profit.
Funny how things work, lol
 
I think you misunderstand me.
Making a small die by cutting out features is not progress to me...it's the opposite.
I could turn the tables around and say:
This is not the "financial forum", where the cost of dies would be more suited.

We are talking GPUs here.
High-end GPUs.
Their primary use is...gaming.

Given the choice of:
a) small die, but less features/performance/power consumption
b) large die, with more features/performance/power consumption

I go with option b.

Otherwise I would just buy a console.

Again, when I'm using my GPU, its die size is irrelevant.
Any cost savings due to lower power consumption is negated by the "exchange" rate of GPUs (short lifespan before upgrade)...and the lesser performance.

In my view, if your primary concern is die size/power consumption, you should stick to IGPs...if your primary concern is performance, you should buy a GPU.

My experience tells me that whenever anything other than performance is brought up (be it power consumption, noise, die size, etc.), it's due to not having the best-performing product.

It happens every cycle, and it's always the "side" with the lesser-performing GPU that tries to shift the focus from performance onto something else.

I agree about die size. Who cares what it costs them to make? Care about how much it costs *you* to *buy*! NV can subsidize its operations from Quadros and Teslas anyway, something I encourage since I buy only GeForces. Make those professional-graphics and HPC people pay more for GPU development and wafer costs, I say! 🙂

I sort of disagree about power/heat/noise, because if it gets bad enough, it can offset the performance advantage to some degree. But it has to get pretty bad.

An obvious example is if a card eats so much power that you need to upgrade your PSU just to use it, then that card is probably not as cost-effective as the next-fastest card that doesn't need a new PSU to run.

Also, if you pay a lot for electricity then the operating cost of the card might be a deterrent to buying it vs. a similar-speed-but-more-efficient alternative.

Everyone has different noise tolerance levels. Since I typically game with a headset, I have a high tolerance. However, the card can't get so loud that my gf complains about it, or so loud that it makes my PC unsuitable for watching movies on.
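The operating-cost point above is simple arithmetic; the wattage delta, hours, and electricity rate below are illustrative assumptions, not figures from any review:

```python
# Rough yearly electricity cost of a card's extra power draw.
# All inputs are illustrative assumptions.
def yearly_cost(extra_watts: float, hours_per_day: float,
                usd_per_kwh: float = 0.15) -> float:
    """Cost of the *additional* draw vs. a more efficient card, per year."""
    return extra_watts / 1000 * hours_per_day * 365 * usd_per_kwh

# e.g. a card drawing ~100 W more, gamed 3 h/day at $0.15/kWh:
print(f"${yearly_cost(100, 3):.2f} per year")  # → $16.43 per year
```

A few tens of dollars a year rarely flips a buying decision on its own, which is consistent with the post's "it has to get pretty bad" framing, but it is not zero either.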
 
Making a small die, cutting out features, is not progress to me...it's the opposite.

Small die: Check
Cutting out features: 404 Evidence not found.

I do not understand what you mean by cutting out features; what features did they cut when compared to Cypress/Juniper?

In fact, they added a few features, like 3D Blu-ray and support for 4 monitors.
 
Are you seriously going to argue that this is the single most important feature they cut, and that it affects their features and performance? Really? Your argument is, well...weak.

Did you read the story? They cut out more than that one feature, but they didn't want to talk about them on the record (except for Sideport).


I also want to add that the negative press surrounding "Fermi" mostly had to do with TSMC's 40nm process; their process has matured since then:

http://benchmarkreviews.com/index.p...k=view&id=607&Itemid=72&limit=1&limitstart=15

No respin, just a more mature 40nm process.
Sadly, not all manufacturers have Intel's fabs and process technology.

I agree, TSMC really let NV and AMD down. And then they cancelled 32nm! About that article, though: there was a thread on here discussing it. The process is more mature, but there are lots of variables that could also explain, or partially explain, the deltas.
 
Are you seriously going to argue that this is the single most important feature they cut, and that it affects their features and performance? Really? Your argument is, well...weak.

Perhaps you should read the article and what I wrote before posting again:

Eric was telling me about how they trimmed down 870 from over 400mm2 down to 334mm2 and how wonderful the end product was. I stopped him and asked for more detail here. I wanted an example of a feature that they had to throw out but they really wanted to keep in. Manufacturers rarely tell you what they threw out, marketing likes to focus on what’s in the chip and make everything sound like a well calculated move. Thankfully, marketing wasn’t allowed to speak at my dinner.
Eric turned to Carrell and said: “I know one feature we could talk about.”

&

In early 2008 ATI realized they had to cut this chip down from 20 - 22mm on a side to 18mm, everyone had to give up something. Carrell was the big advocate for making 870 smaller, he couldn’t be a hypocrite and not give anything up.

Your "counterargument" seems very weak.

That AMD only specified one thing doesn't mean they didn't cut out more...like the article also stated.
 
I'm also not convinced that die size directly correlates to whether my favorite graphics card maker can make profits and continue to grow and invent.
Wall Street is puzzled by recent AMD earnings, or the lack thereof.

An interesting point, IMHO, is that there is no current workstation product based on the GF104 GPU, that I know of.

I've read some information indicating that, with the timing of the release of those AMD financials, deductions were taken for on-hand inventory of 6870/6850 cards that were being stockpiled for release.
 
Perhaps you should read the article and what I wrote before posting again:

Eric was telling me about how they trimmed down 870 from over 400mm2 down to 334mm2 and how wonderful the end product was. I stopped him and asked for more detail here. I wanted an example of a feature that they had to throw out but they really wanted to keep in. Manufacturers rarely tell you what they threw out, marketing likes to focus on what’s in the chip and make everything sound like a well calculated move. Thankfully, marketing wasn’t allowed to speak at my dinner.
Eric turned to Carrell and said: “i know one feature we could talk about.”

&

In early 2008 ATI realized they had to cut this chip down from 20 - 22mm on a side to 18mm, everyone had to give up something. Carrell was the big advocate for making 870 smaller, he couldn’t be a hypocrite and not give anything up.

Your "counterargument" seems very weak.

That AMD only specified one thing doesn't mean they didn't cut out more...like the article also stated.

I'm a little confused. What specific features did they cut that you would like to have? Compare Fermi to the HD 5xxx/6xxx: the only features either side lacks are PhysX (aka CUDA) and Eyefinity.


I assume he's talking about parts of the die that could have increased performance but would have taken up a large portion of it - not something like adding a new DX extension (those don't really exist anymore; MS said no).
 
I've read some information indicating that, with the timing of the release of those AMD financials, deductions were taken for on-hand inventory of 6870/6850 cards that were being stockpiled for release.


I've heard things like tax evasion, and paying for research on the CPUs (APUs), since they have graphics hardware in them.

Anyways, I'm pretty sure that:

1) Die size does directly correlate with sales prices/profits.
2) Selling 25 million graphics cards (90% market share) will make you money (unless you're selling big chips for less than it costs to make them).

3) Nvidia is losing money on its discrete GPU sales and its Tegra. The thing Nvidia has that makes them money is their professional card sales (but their profits here are not enough to cover their GPU losses + Tegra losses). And for the last 3 or 4 quarters overall, Nvidia has been losing tons of money (not making profits).


I think it's almost impossible to guesstimate the business model of Nvidia or AMD just from knowing the die size. We know the same dies are used in workstation products that help pay for the cost of a wafer.
Consider trying to guesstimate how a newspaper turns a profit. The cost of the paper alone, never mind the ink, presses, and people to manufacture it every day, is MANY times the 1 dollar cover price. It's obviously the advertising where most of the actual profit comes from. Newspaper page size is directly related to profit: a smaller (fewer pages) newspaper means less profit.
Funny how things work, lol

I'm also not convinced that die size directly correlates to whether my favorite graphics card maker can make profits and continue to grow and invent.
Wall Street is puzzled by recent AMD earnings, or the lack thereof.


That is the craziest analogy I've ever heard for big chip sizes = more profits.
I'm sorry, but a graphics card isn't a newspaper... in this world, wafers have a given (costly) price, and the bigger the chip => the smaller the yields => the higher you would need to price your product.

That's all fair and good; no reason that can't work out.

However, here's the kicker... if your competitor has smaller chips with the same performance in a market (PC gamers), then when you price your performance against theirs, you're making less than them.

AMD can price their cards to the point where they only profit $1-2 per card sold = AMD makes little money on their financial reports. However, Nvidia matching the competitor's cards in price/performance would then be losing money each time they sold a card that matches up to the competitor's.

Now throw in a price war, where a 255mm2 (6870) chip is fighting a 529mm2 (470) chip, and you'll see Nvidia is bound to be losing more money this quarter than in the last 3 or so.

Also, it's likely to stay this way, with Nvidia far behind in performance/mm2, so they can't make the same profit per graphics card sold that AMD can. This won't change until 28nm is introduced... so it'll be about a year from now until Nvidia has a chance to make money on GPU sales to the non-professional market.

Luckily, Nvidia has a lot of money in the bank... it might have to liquidate some stuff and sell some things to free up cash. A year from now, Nvidia most likely won't have as much money in the bank, though.
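The price-war argument reduces to per-card arithmetic. Every number in this sketch (wafer cost, board cost, street prices, good-die counts) is a made-up assumption chosen only to show the mechanism:

```python
# Back-of-envelope per-card gross margin in a price war.
# All figures are illustrative assumptions, not real 2010 costs.
WAFER_COST = 5000.0   # assumed cost of one 40 nm wafer, USD
BOARD_COST = 60.0     # assumed PCB + memory + cooler, USD

def margin(street_price: float, good_dies_per_wafer: float) -> float:
    """Gross margin per card: price minus die-cost share minus board cost."""
    return street_price - WAFER_COST / good_dies_per_wafer - BOARD_COST

# Small die yielding many good dies per wafer vs. a big die yielding few,
# at roughly matched price/performance:
small = margin(street_price=240, good_dies_per_wafer=85)   # Barts-class
large = margin(street_price=260, good_dies_per_wafer=13)   # GF100-class
print(f"small-die margin: ${small:.0f}, large-die margin: ${large:.0f}")
```

With these assumptions, the small-die vendor stays profitable at a price point where the big-die vendor loses money on every card, which is exactly the asymmetry the post describes: the low-cost producer sets the floor.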
 
I don't follow the forum too closely, but from what I can gather, is Lonberg the new wreckage?


Personal attacks/insults are not acceptable.

Moderator Idontcare
 