Far Cry 2 results: GF100 vs. GTX 285

My mistake, not thinking right. The leaked pics for Fermi have one 6-pin and one 8-pin connector, which puts it at 300W max. From other sources, the 360 will have two 6-pins. A 360x2 is possible if they downclock majorly.

I like nV, but they've been selling the 200 series at a loss for a while now before EoL'ing them. If they are again forced to sell GF100 at a loss, it's going to be a major hurting. How long till they learn their lesson and stop trying to make one giant die to target two very different markets?
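For reference, the 300W figure falls straight out of the PCIe power limits: up to 75W from the slot, 75W per 6-pin connector, and 150W per 8-pin connector. A quick sketch of that arithmetic, with the connector configurations taken from the leaks and rumors rather than any confirmed spec:

```python
# PCIe power-budget arithmetic behind the "300W max" figure.
# Spec limits: the slot supplies up to 75 W, a 6-pin connector 75 W,
# an 8-pin connector 150 W. Connector configs are rumored, not confirmed.
PCIE_SLOT_W = 75
CONNECTOR_W = {"6pin": 75, "8pin": 150}

def max_board_power(*connectors: str) -> int:
    """Upper bound on board power for a given set of PCIe power connectors."""
    return PCIE_SLOT_W + sum(CONNECTOR_W[c] for c in connectors)

print(max_board_power("6pin", "8pin"))  # rumored GTX 380: 75 + 75 + 150 = 300 W
print(max_board_power("6pin", "6pin"))  # rumored GTX 360: 75 + 75 + 75  = 225 W
```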
 

Daedalus685

Golden Member
My mistake, not thinking right. The leaked pics for Fermi have one 6-pin and one 8-pin connector, which puts it at 300W max. From other sources, the 360 will have two 6-pins. A 360x2 is possible if they downclock majorly.

I like nV, but they've been selling the 200 series at a loss for a while now before EoL'ing them. If they are again forced to sell GF100 at a loss, it's going to be a major hurting. How long till they learn their lesson and stop trying to make one giant die to target two very different markets?

My 4870 had two 6-pin connectors, and they managed to double that up without "downclocking majorly".

The number of plugs can give us a rough estimate, but in the end it means almost nothing. There are dozens of reasons one would want the card rated for more power than it will actually draw. It is certainly logical to expect Fermi to use a lot of power, but there is no way we can be as certain as you seem to claim.

While I personally like ATI's small-die approach better right now, who is to say that in 4 years building one massive die, with HPC users eating the early consumer losses, won't be the better model? It is perfectly possible that a strong showing in the HPC market could allow them to sustainably sell at a loss in the consumer market for a few months every release while the yields come up to money-whoring levels.
 

TheRickRoller

Member
I like nV, but they've been selling the 200 series at a loss for a while now before EoL'ing them. If they are again forced to sell GF100 at a loss, it's going to be a major hurting. How long till they learn their lesson and stop trying to make one giant die to target two very different markets?

I read a joke a while back; someone said nVidia wouldn't be happy until they had a die the size of a whole wafer :sneaky:. They can afford to keep hurting and put off that lesson of yours for a good while, though.
 

tviceman

Diamond Member
My mistake, not thinking right. The leaked pics for Fermi have one 6-pin and one 8-pin connector, which puts it at 300W max. From other sources, the 360 will have two 6-pins. A 360x2 is possible if they downclock majorly.

I like nV, but they've been selling the 200 series at a loss for a while now before EoL'ing them. If they are again forced to sell GF100 at a loss, it's going to be a major hurting. How long till they learn their lesson and stop trying to make one giant die to target two very different markets?

http://www.nvidia.com/object/io_1249591520243.html

vs.

http://www.nvidia.com/object/io_1257455428199.html

I'm not saying you're wrong, but if they're selling their chips at a loss, then how did they turn a $105 million loss into a $107 million gain in one quarter? If they are selling the 200 series at a loss, they must be making an absolute killing in other markets and would be better off withdrawing completely from the desktop graphics space.
 

lopri

Elite Member
As long as competition is there and prices get knocked down to within Earth's orbit, we should be happy. Don't worry yourself about profits.
I agree. I don't care how much profit a hardware company makes. I think Mr. Huang said something along the lines of "Our customers don't care about the die size" around the G80 launch, and that's exactly how I feel. What matters to me is whether a product is high quality at a reasonable price. Die size and all those bits are only interesting as pure technology, or to the extent they affect the quality and price of a product. If NV decides to sell Fermi at a loss, more power to them. I have zero complaints as long as it's competitive. (I might even be secretly gleeful. :twisted: )
 

Daedalus685

Golden Member
Before we go comparing apples to half-eaten apples: which map were the tests run on for the Anand numbers (and the others) that most people are comparing the leak to?

I was under the impression that Anand used a recorded action demo (the 60-ish fps for the 5870 being thrown around) and not the small farm map.

For instance,
http://www.pcper.com/article.php?aid=820&type=expert&pid=6

puts this card neck and neck with the 5870. Very likely it is the 360... but not in an "omg, the 380 will own the sky" kind of context.

Also, was the box these leaks were run on the liquid-cooled one or a different one?
 

sandorski

No Lifer
Looks like AMD's sense of confidence for 2010 was correct. Nvidia will be competitive performance-wise, but AMD has a lot of room to either improve performance or simply compete on price. Even on the price side they won't have to cut much at all, if any. If Fermi chip production can't significantly improve soon, AMD will own DX11 throughout the year on supply issues alone.

Nvidia will survive the year, but I wouldn't be surprised if they take losses while AMD/ATI profits (they may take losses as well, depending on how their production ramps).
 

Keysplayr

Elite Member
Dunno about you, but when I buy a GPU, I use it to play games and do a bit of video encoding on the side. I would consider that typical of the vast majority of consumers. Sure, a small % may want to F@H, but that's really it. Why pay more for an architecture that is not suited to your needs?

This is why nV is going to hurt big time this round. CUDA cores, pfft. A gimmick is a gimmick.

We will know the full details soon enough. Let's agree on that. :)

I can see this is just about all we would agree on.
 

Keysplayr

Elite Member
My mistake, not thinking right. The leaked pics for Fermi have one 6-pin and one 8-pin connector, which puts it at 300W max. From other sources, the 360 will have two 6-pins. A 360x2 is possible if they downclock majorly.

I like nV, but they've been selling the 200 series at a loss for a while now before EoL'ing them. If they are again forced to sell GF100 at a loss, it's going to be a major hurting. How long till they learn their lesson and stop trying to make one giant die to target two very different markets?

How long? I can answer that pretty easily. It depends on the adoption rate of Fermi (Tesla), now that Fermi offers everything that was asked for that GT200 didn't have.
If the adoption rate is high, which is Nvidia's hope, they won't think twice about continuing to create large dies.

And, whether you realize it or not, they don't "have" to make two different dies for two different markets. Hence the term General Purpose Graphics Processing Unit (GPGPU).
General Purpose. Dwell on that for a bit.
 

Daedalus685

Golden Member
How long? I can answer that pretty easily. It depends on the adoption rate of Fermi (Tesla), now that Fermi offers everything that was asked for that GT200 didn't have.
If the adoption rate is high, which is Nvidia's hope, they won't think twice about continuing to create large dies.

And, whether you realize it or not, they don't "have" to make two different dies for two different markets. Hence the term General Purpose Graphics Processing Unit (GPGPU).
General Purpose. Dwell on that for a bit.

I think the more important part we are missing is that if they make two different GPUs, I won't be able to game on the hardware I want at work...

Edit: I know I'm not the only one in that boat!
 

T2k

Golden Member
I agree. I don't care how much profit a hardware company makes. I think Mr. Huang said something along the lines of "Our customers don't care about the die size" around the G80 launch, and that's exactly how I feel. What matters to me is whether a product is high quality at a reasonable price. Die size and all those bits are only interesting as pure technology, or to the extent they affect the quality and price of a product. If NV decides to sell Fermi at a loss, more power to them. I have zero complaints as long as it's competitive. (I might even be secretly gleeful. :twisted: )

Err, excuse me, but some of us are professionals, not gamers (only) - let us debate things that (apparently) for some reason Keys really doesn't want to see us debating...
 

Daedalus685

Golden Member
Err, excuse me, but some of us are professionals, not gamers (only) - let us debate things that (apparently) for some reason Keys really doesn't want to see us debating...

What does die size have to do with professionals?

I mean, I want power-efficient and cheap as well... but unless you hold stock, the size and the cost hit they take shouldn't matter that much.

It could be 6x6 cm or 6x6 mm, and as long as the thing does what I paid for, I'm pretty happy... granted, if I have the choice of the smaller one with equal power, it's a no-brainer, but that is not often an option.
 

RussianSensation

Elite Member
So what competes with a dual-GPU GTX 360? A refreshed HD 5890?
So you are saying AMD will never own the single-card crown again?

I am not saying AMD will never own the single-card crown again. But their current strategy is NOT to have the single fastest GPU chip (that's NV's strategy). AMD is about price/performance and profitability. So if you want a faster version of the 5870, get the 5970. If you want a slower version of the 5870, you have the 5770, and so on.

A dual-GPU GTX 360 (with likely revisions to specs) is going to be the GTX 395 (in much the same way the GTX 295 is right now). Given the power consumption requirements of a large monolithic GPU, NV will likely need a die shrink to get two full-fledged GTX 380s on one card.

Of course, I am guessing at the power consumption of NV's new cards. When we compare load levels of the 5870 vs. the 4890, the power usage is very close (total system):
http://www.techreport.com/articles.x/18288/9

Since the Fermi GPU is going to be at least 2x more complex than the GTX 285, it's hard to imagine NV getting good power consumption numbers.
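For a rough sense of why that's hard to imagine, take the public transistor counts (roughly 1.4 billion for the GTX 285's GT200b, 3.0 billion for GF100) and scale the GTX 285's rated board power naively, discounting for the 55nm-to-40nm shrink. The shrink factor below is an assumption, not a measurement; the point is only that even a crude estimate lands near the 300W connector ceiling:

```python
# Back-of-the-envelope sketch of the "2x more complex" power argument.
# Transistor counts are the public figures; the per-transistor power
# reduction from the 55 nm -> 40 nm shrink is a guessed factor.
GTX285_TRANSISTORS = 1.4e9   # GT200b, 55 nm
GF100_TRANSISTORS = 3.0e9    # Fermi, 40 nm
GTX285_TDP_W = 204           # Nvidia's rated board power for the GTX 285

SHRINK_POWER_FACTOR = 0.7    # assumption: ~30% less power per transistor at 40 nm

naive_gf100_tdp = (GTX285_TDP_W
                   * (GF100_TRANSISTORS / GTX285_TRANSISTORS)
                   * SHRINK_POWER_FACTOR)
print(f"naive GF100 estimate: {naive_gf100_tdp:.0f} W")  # ~306 W, over the 300 W cap
```

Which is one reason to expect clocks or voltages to come down relative to a straight scale-up.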
 

Keysplayr

Elite Member
Err, excuse me, but some of us are professionals, not gamers (only) - let us debate things that (apparently) for some reason Keys really doesn't want to see us debating...

Professional what? ;)
We'll have to excuse you anyway. Demonstrating congeniality on forums isn't really your strength.
That's a double-edged sword you're wielding there, though. On one hand, in this forum it has been said to the point of exhaustion that 99.999999% of customers buying a graphics card have no use for a GPGPU beyond gaming or the occasional video encode. They couldn't care less about all the other stuff it could do. So why would they care about the company's die size or its profits?

Short answer: they really wouldn't.

Look, dude. If a graphics card company delivers competitive performers at competitive prices, what exactly about your "professionalism" persuades you to care how big the die is, or how much profit, or lack thereof, a graphics card company makes?

Frankly, I'm stymied. :)
or
Stymie, I'm Franklied. :eek:
 

Rezist

Senior member
In a way, the GTX 360 has to at least equal the 5870, if not surpass it, just due to cost to manufacture: a bigger die and a larger bus than the 5870.

I'm not sure how they will add up performance-wise, but cost-wise it should probably look like this: 5970 > GTX 380 > GTX 360 > 5870 > 5850.

Also, if the GTX 360 = 5870, that really leaves no 40nm part to fill in for the GTX 285. We're all counting on AMD to release a 5830 to fill the gap from the 5770 to the 5850 once the 4890 is gone.
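To put rough numbers on the manufacturing-cost point: a standard dies-per-wafer estimate on a 300mm wafer shows how quickly a bigger die eats into the number of candidates per wafer, before yield differences even enter the picture. The Cypress area is the widely reported ~334 mm²; the GF100 area and the wafer price are illustrative assumptions, not confirmed figures:

```python
import math

# Rough dies-per-wafer estimate: usable wafer area minus edge loss.
WAFER_DIAMETER_MM = 300
WAFER_COST_USD = 5000  # assumed 40 nm wafer price, illustrative only

def dies_per_wafer(die_area_mm2: float) -> int:
    """Standard approximation: pi*(d/2)^2/S - pi*d/sqrt(2*S)."""
    d = WAFER_DIAMETER_MM
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

for name, area in [("Cypress (5870)", 334), ("GF100 (assumed)", 530)]:
    n = dies_per_wafer(area)
    print(f"{name}: ~{n} candidates/wafer, ~${WAFER_COST_USD / n:.0f}/die before yield")
# Cypress (5870): ~175 candidates/wafer, ~$29/die before yield
# GF100 (assumed): ~104 candidates/wafer, ~$48/die before yield
```

And the bigger die will also yield worse per candidate, widening the gap further.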
 

Daedalus685

Golden Member
In a way, the GTX 360 has to at least equal the 5870, if not surpass it, just due to cost to manufacture: a bigger die and a larger bus than the 5870.

I'm not sure how they will add up performance-wise, but cost-wise it should probably look like this: 5970 > GTX 380 > GTX 360 > 5870 > 5850.

Also, if the GTX 360 = 5870, that really leaves no 40nm part to fill in for the GTX 285. We're all counting on AMD to release a 5830 to fill the gap from the 5770 to the 5850 once the 4890 is gone.

Well, they could cripple a GTX 360 to put something head to head with the 5850. They will need something down there, unless they retool the 285 for that role.
 

akugami

Diamond Member
@Keysplayr

I don't think we can completely isolate ourselves from looking at the size of the GPU dies, as well as the other costs of the card, compared to the performance we are getting. While it's almost unquestioned that the GF100 will be a more powerful card than the Radeon 5870, we still have to know by how much, as well as at what price. The reason is that most of us have a certain limit we can't/won't go over when spending on new hardware, even if we're enthusiasts. For that reason, the large die size of the GF100, as well as the need for higher-cost parts (PCB, VRMs, etc.), will affect the final price and therefore the bang for the buck.

This will affect the purchasing decisions of those looking to upgrade in the next 3 months if Fermi is relegated only to the upper end of the price scale, with prices starting well above $400.

Not all of us can afford the top-end gaming card of each generation. Most of us have to weigh our options and choose the best card based on certain criteria, such as performance per dollar; or, if two cards cost roughly the same and perform roughly the same, we'd weigh value by seeing which card performs better in the games we play.

You said it yourself: "competitive performers for competitive prices", and that is the key phrase. Can nVidia produce Fermi at competitive prices? To a degree, I'd say yes. The 8800 series is an example of that. But with their ever-monolithic GPUs, it's troubling to see that there are still no true GT200 derivatives in the lower end of the market. Even well after the introduction of the GT200 series, we were getting multiple rehashed cards that were named as if they were in the GT200 family but really were 9800 series chips (which were tweaked 8800s). While this attests to how great the 8800s were, it also points out the trouble nVidia is having scaling the GT200 down to the lower end of the market.

With this in mind, I wonder how long it will take for them to scale the GF100 down beyond the enthusiast market. I have a lot of friends I make video card recommendations to, and pretty much all of them buy cards in the $150 range. I really can't recommend they buy an nVidia card at the moment if there is a competitive option, performance-wise, that also supports DX11 from AMD. In the last round of upgrades I recommended a lot of nVidia cards; if my friends were to ask me for upgrade advice in the next 3 months, it'd almost have to be an AMD.

My next laptop, which I am likely to buy in 6 months, will almost undoubtedly have a mobile Radeon 5 series solution. nVidia likely won't have a DX11 mobile GPU for at least another year.

The monolithic nature of nVidia's chips means it's harder for them to scale down, and that is affecting the purchasing decisions of lower-end users as well as mobile users.
 

Daedalus685

Golden Member
The monolithic nature of nVidia's chips means it's harder for them to scale down, and that is affecting the purchasing decisions of lower-end users as well as mobile users.

Why exactly is this?

I mean, in the world I'm used to, it is always easier to make something smaller than bigger. What about the latest Nvidia cards makes them so hard to cut down?

I was under the impression that they didn't bother cutting down the GTX 200s because of how strong the G92b parts already were in that market. Not really because it would have been technically challenging; it would just have been pointless to release a GTX 250 that performed within 3% of a 9800 GTX, given the cost of the extra R&D.

This certainly paints a bleak picture of Nvidia's current size:performance ratio, but there is no reason we can't see something nice in that market eventually... though they may have to improve area:perf if it is to be cost-effective for them.
 

Rezist

Senior member
Well, isn't the GT240 a GT200 derivative? I agree, though, that it's not much of a card/chip.
 

Daedalus685

Golden Member
Well, isn't the GT240 a GT200 derivative? I agree, though, that it's not much of a card/chip.

Indeed it is... but it is worse than the 9600s, so they never bothered until they needed something with GDDR5, DX10.1 and 40nm. I don't think I'd pay it any more mind than the 4770, as I don't think it is representative of an attempt at a top-tier product they intended to sell en masse.
 

thilanliyan

Lifer
I am? Try me. Not all of us are hurting in this economy, it's just you.

:\ Chill... I was just kidding. Plus, what does someone not wanting to bet have to do with the economy? Gambling isn't everyone's cup of tea, regardless of how much money they may have.
 

grimpr

Golden Member
There's no doubt whatsoever that GF100 will be a fast card, kicking ass especially in physics, GPGPU, and the new DX11 engines, albeit hot and power-hungry. All AMD has to do is lower prices, put up a 5890, and still be in profit thanks to its excellent yields and the small size of its chips. Come on NV, pull the cork on that GTX 360 and launch a price war.
 

Kuzi

Senior member
Dunno, even if it's true I'm not particularly blown away by this - remember, Crysis and Far Cry 2 are the archetype of TWIMTBP games; they have always performed much better on Nvidia cards relative to their ATI counterparts than other games do.

Show me Clear Sky or Pripyat benchmarks, DiRT 2 or BioShock 2 benches, etc. If those don't show at least a 20% advantage over the 5870, then Fermi is a complete dud for me - like an old, crappy prostitute: big, hot and bothered, and overpriced. :D

T2k, I didn't even compare with the ATI numbers, because one game can't prove anything. But I compared the 84 fps number supposedly for the GTX 360 vs. 51 fps for the GTX 285 - about a 65% difference. That's a nice jump in performance.
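For anyone checking the arithmetic, the relative speedup of the leaked GTX 360 figure over the GTX 285 in the same Far Cry 2 test works out as follows:

```python
# Relative speedup of the leaked GTX 360 figure over the GTX 285.
gtx360_fps = 84  # leaked figure
gtx285_fps = 51

speedup = (gtx360_fps - gtx285_fps) / gtx285_fps
print(f"{speedup:.0%}")  # ~65%
```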