6870 VS GTX460 Die sizes and price/performance


psoomah

Senior member
May 13, 2010
416
0
0
Once again, NV designs a GPU for 3 tasks: (1) games (2) Quadro line (3) GPGPU computing.

ATI designs GPU for 1 task: games.

Unless this changes, NV will always have a larger GPU relative to AMD for the same performance. This was true during G80, G200/b and GF100 times. Nothing has changed.

Why is it always news when a chip designed specifically for graphics is more lean and more efficient than a chip designed for multiple tasks? NV amortizes its R&D/costs across 3 product lines while AMD amortizes it against 1 product line.

Furthermore, last quarter AMD had only $1 million in operating profit despite the smaller die sizes of the HD5000 line. So it's pretty obvious that other factors are involved, such as marketing costs, distribution costs, etc.

From an efficiency perspective, AMD has led for as long as I can remember.

From everything I've read, the OEM and low end AIB markets are the most vital $$$ links in the chain. Without them Nvidia cannot operate in the black with the rest of their graphics board markets.

Keep in mind AMD is pushing hard to get OpenCL and OpenGL up to speed, to the point where it can compete on a level field with Nvidia in the professional market (which uniformly prefers a platform future built on OpenCL and OpenGL). Nvidia is far from price competitive in the professional markets. As soon as AMD becomes development-platform competitive, Nvidia will lose major market share to AMD.

Game developers are also very interested in developing OpenCL and OpenGL.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
The gf100 was late. The gf104 was right on time. So why would the gf110 be late? I'm sure it was planned well before the gf100 launch.

The gf100 being late has little or nothing to do with the gf110 launch date.
Well, wasn't the gf110 designed only after they realized the gf100 was not what they hoped for?
 

HurleyBird

Platinum Member
Apr 22, 2003
2,814
1,550
136
Once again, NV designs a GPU for 3 tasks: (1) games (2) Quadro line (3) GPGPU computing.

ATI designs GPU for 1 task: games.

Which is why RV770 and RV870 both had DP support. Because, you know, games make use of that :rolleyes:

To be clear, both companies design their products for all three tasks, the big difference is how they *prioritize* those three tasks in their designs.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
The gf100 was late. The gf104 was right on time. So why would the gf110 be late? I'm sure it was planned well before the gf100 launch.

The gf100 being late has little or nothing to do with the gf110 launch date.

GF104 was about 5~6 weeks later than expected (in April the rumour was 1st June, it arrived 12th July)
GF106 and GF108 were also later than expected (all based on rumours).
NV hasn't delivered "on time" based on rumours at all. And rumours are what we base our targets on since we don't have access to insider knowledge.
NV said in January that nothing had been delayed by the Fermi delay, they also said there would be a refresh out in 2010. There are still just under 2.5 months left to deliver a refresh, but I'm not holding my breath.

Not to mention that since June there have been rumours of a dual-GPU GF104-based product. That also hasn't appeared. There have been rumours of a 512-core Fermi and a 384-core GF104. Those haven't appeared either.
In fact, almost every rumour and prediction about when things are coming from NV has been wrong.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,701
406
126
The gf100 being late has little or nothing to do with the gf110 launch date.

Depends if NVIDIA was planning on doing a refresh of GF100 at 28 nm or at 40nm.

28nm was supposed to be out around now(?) when all the planning of this gen started. There were some rumours of a GF102, which supposedly was a GF100 die shrink at 28nm, that seems to have completely disappeared.

We already know that AMD had to fall back on a backup plan (or was something rushed, because Cypress was already less than it was originally supposed to be and 28nm is delayed?) due to the 28nm node being MIA.

This GF110 seems to be a recent plan or a backup plan - unless NVIDIA was already predicting it wouldn't be able to launch fully operational GF100s/104s?
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,701
406
126
Which is why RV770 and RV870 both had DP support. Because, you know, games make use of that :rolleyes:

To be clear, both companies design their products for all three tasks, the big difference is how they *prioritize* those three tasks in their designs.

And we can't discount the fact that AMD also produce CPUs and is starting to release APUs which can lead to a different approach.

We should also consider how the market looked a few years ago, with NVIDIA having a good presence (not saying overwhelming, because of Intel) in chipsets, laptops and OEM entry cards.
 
Last edited:

nyker96

Diamond Member
Apr 19, 2005
5,630
2
81
I wonder if the 6xxx is pin-compatible with the older 5xxx. Could existing owners be offered a way to upgrade, say, send in the 5xxx card and pay some extra to have it swapped for a 6xxx? That would be very interesting for the GPU industry. First upgradable platform ever.
 

HurleyBird

Platinum Member
Apr 22, 2003
2,814
1,550
136
I wonder if the 6xxx is pin-compatible with the older 5xxx. Could existing owners be offered a way to upgrade, say, send in the 5xxx card and pay some extra to have it swapped for a 6xxx? That would be very interesting for the GPU industry. First upgradable platform ever.

Lol, you could probably hack it if you were hardcore enough. But what's the point? The 68xx seems like more of a sidegrade for current HD58xx users anyway. Those guys would be looking at an HD69xx if anything.
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
The gf100 was late. The gf104 was right on time. So why would the gf110 be late? I'm sure it was planned well before the gf100 launch.

The gf100 being late has little or nothing to do with the gf110 launch date.

wow really?

THINK about what you just said for a second. The entirety of Fermi development was delayed by several months. GF104 (and 106 and 108) are derivatives of GF100; if GF100 was 6 months late, then the derivatives are just as late. They were on time relative to when GF100 debuted, sure, but that's the whole point regarding development schedules.

I don't think you're fast enough to keep arguing with.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
If this is what's holding NVIDIA's gaming performance back, then well... they should stop, and make two products instead of one.

You realize it costs NV too much to design 2 separate GPUs when a GPU design probably costs $2-3 billion every 2-3 years? :biggrin:

It's interesting how AMD absolutely bombed the HD2900 and HD3800 series, yet all it takes is 1 missed generation for NV and everyone says NV is doomed from now on.

From everything I've read, .... Nvidia is far from price competitive in the professional markets.

I stopped reading right there...because what you are saying makes no sense. NV gained even more market share last quarter and now commands about 88% of the professional market segment.

If you are going to use Fudzilla or SemiAccurate as your backup for understanding of NV's business lines, then let's just end this thread now.

The 460 IS price/performance competitive with the 5830/50 cards, although at what HAS to be razor thin profit margins.

Considering AMD earned $1 million in operating profits last quarter, I wouldn't be flaunting AMD's profit margins right now, despite the smaller die sizes of their products. The average consumer doesn't really care how large the chip is as long as the performance, price, power consumption and noise levels are reasonable. NV's historical profit margins are in the range of 43-47%, while AMD can only dream of that. Honestly, every week there is a new thread about NV being on the brink of collapse and how it's such a horribly run company.

I think I am going to pull out a lawnchair and a couple beers :D

After all, Asus could just pop a 6870 into a ROG 5870 board and get an immediate 20-30% kick over the reference 6870. Same goes for any of the premium o/c 5870 boards from MSI, Sapphire, Gigabyte, etc.

I doubt the 460 has the headroom to match that.

What did you expect, for the HD6870 (next generation) to only match the GTX460? That would be an utter failure. Of course the AMD card is supposed to be faster than a GTX460. That's what happens in the graphics card market: first the 5850/5870 were leading, then the GTX460/470 became the better price/performance cards, now the HD6000 series will be the best. Etc. etc.
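To put actual numbers on the performance-per-dollar comparison this thread keeps coming back to, here is a minimal sketch; every frame rate and price below is a hypothetical placeholder for illustration, not a measured benchmark:

```python
def perf_per_dollar(avg_fps: float, price_usd: float) -> float:
    """Average frames per second delivered per dollar spent."""
    return avg_fps / price_usd

# Hypothetical example figures, not measured benchmarks
cards = {
    "GTX 460 1GB": (50.0, 200.0),  # (avg fps, street price in $)
    "HD 6870":     (60.0, 240.0),
}
for name, (fps, price) in cards.items():
    print(f"{name}: {perf_per_dollar(fps, price):.3f} fps per dollar")
```

With these placeholder numbers both cards land at 0.250 fps per dollar: the faster card only wins the metric when its price premium is smaller than its performance lead, which is why a new generation can be faster yet no better value.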
 
Last edited:

n0x1ous

Platinum Member
Sep 9, 2010
2,574
252
126
Because the people that are surprised paid for a chip specifically for graphics? :awe:

NVIDIA would do well to have a more varied lineup. Joe Schmo does not care nor want to pay for non-graphics silicon. In his mind, he's paying for a graphics card. If this is what's holding NVIDIA's gaming performance back, then well... they should stop, and make two products instead of one.

It's not as if NVIDIA is pricing their chips out of the market because they have bigger dies... if anything, NV absorbs the cost at the foundry, not the consumer.
 

n0x1ous

Platinum Member
Sep 9, 2010
2,574
252
126
Once again, NV designs a GPU for 3 tasks: (1) games (2) Quadro line (3) GPGPU computing.

ATI designs GPU for 1 task: games.

Unless this changes, NV will always have a larger GPU relative to AMD for the same performance. This was true during G80, G200/b and GF100 times. Nothing has changed.

Why is it always news when a chip designed specifically for graphics is more lean and more efficient than a chip designed for multiple tasks? NV amortizes its R&D/costs across 3 product lines while AMD amortizes it against 1 product line.

Furthermore, last quarter AMD had only $1 million in operating profit despite the smaller die sizes of the HD5000 line. So it's pretty obvious that other factors are involved, such as marketing costs, distribution costs, etc.

From an efficiency perspective, AMD has led for as long as I can remember.

exactly this ^^++
 

Rezist

Senior member
Jun 20, 2009
726
0
71
From what it looks like to me, with the 4800 series they sacrificed profit for market share; now they're sacrificing market share for profit with the 6800 series.

I won't mention the 5800 series, since those cards enjoyed 90% of their life with no competition and got both market share and most likely good profit.
 

n0x1ous

Platinum Member
Sep 9, 2010
2,574
252
126
From everything I've read, the OEM and low end AIB markets are the most vital $$$ links in the chain. Without them Nvidia cannot operate in the black with the rest of their graphics board markets.

Keep in mind AMD is pushing hard to get OpenCL and OpenGL up to speed, to the point where it can compete on a level field with Nvidia in the professional market (which uniformly prefers a platform future built on OpenCL and OpenGL). Nvidia is far from price competitive in the professional markets. As soon as AMD becomes development-platform competitive, Nvidia will lose major market share to AMD.

Game developers are also very interested in developing OpenCL and OpenGL.

nvidia supports OpenCL/GL better than AMD already anyway. Pro customers are much more resistant to change than regular consumers: they have something that works, and they are reluctant to switch because they don't want to risk problems. AMD is so far behind in professional/HPC, much further behind there than NV is in gaming graphics. Your entire post is AMD marketing FUD.
 
Last edited:

n0x1ous

Platinum Member
Sep 9, 2010
2,574
252
126
You realize it costs NV too much to design 2 separate GPUs when a GPU design probably costs $2-3 billion every 2-3 years? :biggrin:

It's interesting how AMD absolutely bombed the HD2900 and HD3800 series, yet all it takes is 1 missed generation for NV and everyone says NV is doomed from now on.



I stopped reading right there...because what you are saying makes no sense. NV gained even more market share last quarter and now commands about 88% of the professional market segment.

If you are going to use Fudzilla or SemiAccurate as your backup for understanding of NV's business lines, then let's just end this thread now.



Considering AMD earned $1 million in operating profits last quarter, I wouldn't be flaunting AMD's profit margins right now, despite the smaller die sizes of their products. The average consumer doesn't really care how large the chip is as long as the performance, price, power consumption and noise levels are reasonable. NV's historical profit margins are in the range of 43-47%, while AMD can only dream of that. Honestly, every week there is a new thread about NV being on the brink of collapse and how it's such a horribly run company.

I think I am going to pull out a lawnchair and a couple beers :D

+1
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
http://www.amazon.com/HIS-Eyefinity-...7525569&sr=8-1

$275 for a 6870, which is really just the 5770 replacement, is a freaking rip-off IMO. Nvidia has no reason to lower prices now, as the GTX460 actually delivers the same performance per dollar, which is important at this level of card.

You know why I love your posts? Whether it comes to CPU bottlenecks or overpriced videocards, you never take sides; you just say it straight like it is. When you upgraded to a GTX470 and weren't satisfied with its performance increase, you openly said so. You also didn't defend the HD5770 when it was horribly overpriced at $170 at launch vs. the $125 HD4870 at the time, while everyone else was trying to justify the $50 price premium (i.e., DX11, lower power consumption, Eyefinity, etc.).

Respect +!! :thumbsup:

7900 vs X1900 - NVIDIA had a big lead

In terms of what? X1900XT/X consumed more power but it was the better performing card and had better image quality. I recommended X1900 series over 7900 every day of the week despite the much larger die size. Die sizes don't sell videocards to consumers. :awe:
 
Last edited:

happy medium

Lifer
Jun 8, 2003
14,387
480
126
The entirety of Fermi development was delayed by several months.

You are wrong. The gf104 was not delayed; it released on schedule, and there were many leaked slides that backed that up. What was the original release date for the gf110 chip?
Do you know? I would bet it was about 1 year after the original gf100. So expect a refresh of the gf100, a.k.a. the gf110, at the end of this year or January at the latest.
That's normal for a refresh: one year later.
So where's the delay?
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
You know why I love your posts? Whether it comes to CPU bottlenecks or overpriced videocards, you never take sides; you just say it straight like it is. When you upgraded to a GTX470 and weren't satisfied with its performance increase, you openly said so. You also didn't defend the HD5770 when it was horribly overpriced at $170 at launch vs. the $125 HD4870 at the time, while everyone else was trying to justify the $50 price premium (i.e., DX11, lower power consumption, Eyefinity, etc.).

Respect +!! :thumbsup:

Yea, Toyota, even though he comes across like a jerkoff sometimes (and he knows this :) ), he tells it like it is. :thumbsup:
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
I have read other accounts that the die size of GF104 is smaller than what that site 'measured'. I don't have a link handy, and Nvidia has not (I believe) officially stated what the actual die size is.
GPU-Z lists it as 332mm².

IMHO, the die size is transparent to most prospective buyers.
Nvidia has idle/2D power usage optimized to levels as low as its competitors', which is probably important to some.
In fact, the behavior when o/c'd is exactly the same as stock, meaning it down-volts and down-clocks in 2D seamlessly. From what I've read, this is not the case with most 5-series cards, though vendors have different BIOSes that behave differently.
 

psoomah

Senior member
May 13, 2010
416
0
0
If you are going to use Fudzilla or SemiAccurate as your backup for understanding of NV's business lines, then let's just end this thread now.

Having looked over your three responses to my posts, I find myself in agreement with part of that sentiment.

The part that involves you continuing to post in this thread.
 

psoomah

Senior member
May 13, 2010
416
0
0
From what it looks like to me, with the 4800 series they sacrificed profit for market share; now they're sacrificing market share for profit with the 6800 series.

I won't mention the 5800 series, since those cards enjoyed 90% of their life with no competition and got both market share and most likely good profit.

With prices $40 to $70 over MSRP for the bulk of that 90%, most likely = most definitely.
 

psoomah

Senior member
May 13, 2010
416
0
0
nvidia supports OpenCL/GL better than AMD already anyway. Pro customers are much more resistant to change than regular consumers: they have something that works, and they are reluctant to switch because they don't want to risk problems. AMD is so far behind in professional/HPC, much further behind there than NV is in gaming graphics. Your entire post is AMD marketing FUD.

We'll see when CS6 releases.