Radeon 7870/7850 Reviews Are up!


Zanovar

Diamond Member
Jan 21, 2011
3,446
232
106
Already sold off my HD 5870, so yeah, right now just waiting. Sucks cuz newegg has it in stock ONLY if you buy something with it, but nothing they bundle with it is worth the extra cost to me.

nVidia may be kind to us :p, what do you reckon? The 7950 is my upgrade path, so...
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
nVidia is in the business to actually sell GPUs. They don't just have AMD to compete with but themselves -- in other words, they have to offer compelling reasons for their customers to upgrade. And offering 10-25 percent more performance at the same price points isn't really very compelling, one may imagine.
 

Zanovar

Diamond Member
Jan 21, 2011
3,446
232
106
nVidia is in the business to actually sell GPUs. They don't just have AMD to compete with but themselves -- in other words, they have to offer compelling reasons for their customers to upgrade. And offering 10-25 percent more performance at the same price points isn't really very compelling, one may imagine.

*coughs*ok
 

Arzachel

Senior member
Apr 7, 2011
903
76
91
nVidia is in the business to actually sell GPUs. They don't just have AMD to compete with but themselves -- in other words, they have to offer compelling reasons for their customers to upgrade. And offering 10-25 percent more performance at the same price points isn't really very compelling, one may imagine.

And AMD isn't? :confused:
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
626
126
Nick Stan may actually have a clue -- the both of you?
Yes, we are all completely clueless. :rolleyes:

How about you explain yourself instead of resorting to insults? I asked you to explain your derogatory comment; instead, you linked to a site in Japanese where an Nvidia rep is talking about how they are trying to make Kepler more efficient. So? This somehow proves AMD is not out to "actually sell GPUs" but Nvidia is?
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Yes, we are all completely clueless. :rolleyes:

How about you explain yourself instead of resorting to insults? I asked you to explain your derogatory comment; instead, you linked to a site in Japanese where an Nvidia rep is talking about how they are trying to make Kepler more efficient. So? This somehow proves AMD is not out to "actually sell GPUs" but Nvidia is?

I didn't say that.

What I did say is that if nVidia's 28nm parts keep raising the price premium over 40nm price/performance overall, that makes selling product much more difficult -- nVidia is in the business to sell GPUs, and their competition is not only AMD but themselves -- nVidia has to offer compelling reasons for their customers to upgrade, too.

I guess this is laughable -- we'll see.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
And AMD isn't? :confused:

Imho,

Each consumer has to ask themselves what is important -- price/performance isn't one of AMD's strengths so far compared to 40nm, and it may be the most important metric considering this is a new node and architecture. Dismissing that seems very one-sided to me.

Edit: And the irony is that it was one of their strengths for many years, until just a short time ago.
 
Last edited:

Arzachel

Senior member
Apr 7, 2011
903
76
91
They don't have to compete with themselves when they can phase out their 40nm products, one of the advantages of being fabless. And with all the evidence pointing to TSMC not being exactly generous with 28nm wafers, I can't see why you would think that Nvidia would be eager to start a price war.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
THEY WON'T BE ABLE TO SELL OTHERWISE. Not in large quantities.

Everyone already upgraded for Crysis 2, Skyrim and BF3.

Only the absolute high end would sell, and very little of anything else.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
THEY WON'T BE ABLE TO SELL OTHERWISE. Not in large quantities.

Everyone already upgraded for Crysis 2, Skyrim and BF3.

Only the absolute high end would sell, and very little of anything else.

That's odd, I know a lot of people who haven't. Weird.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
So they missed BF3/Crysis 2/Skyrim and will upgrade NOW?


Like HD 5850/GTX 460 -> 7850, a whole 25-30% more performance, for only $250... riiight.

Trigger-happy midrange potential upgraders already did so en masse recently, according to NV.
 
Last edited:

railven

Diamond Member
Mar 25, 2010
6,604
561
126
So they missed BF3/Crysis 2/Skyrim and will upgrade NOW?


Like HD 5850/GTX 460 -> 7850, a whole 25-30% more performance, for only $250... riiight.

Trigger-happy midrange potential upgraders already did so en masse recently, according to NV.

I didn't know you HAD to upgrade to enjoy those games. I mean, Crysis 2 launched as a DX9 port that could be maxed out on a GTX 460, BF3 is playable on almost all cards with MSAA off (I played it fine on ultra minus MSAA on my HD 5870), and Skyrim is CPU-bottlenecked and very playable on a GTX 460.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
Yeah, but you have dinosaurs who also upgraded for COD:MW3.

A large part of those dinosaurs can now play 2012 games just fine.

And if there is no incentive and no injection of new games, they will happily keep gaming on their new 560s.

HINT: What we call a GPU, they call a GeForce.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Yeah, but you have dinosaurs who also upgraded for COD:MW3.

A large part of those dinosaurs can now play 2012 games just fine.

And if there is no incentive and no injection of new games, they will happily keep gaming on their new 560s.

HINT: What we call a GPU, they call a GeForce.

Okay, that doesn't change the fact that I know a good number of people who are looking to buy this year. So there is an audience for both vendors.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
The population with a burning need to upgrade is much smaller than in 2011.


If your hardware was fine in 2011, it will be fine in 2012, is what I'm saying.

Oh, and monitor resolution seems to have settled at 1920x1080.
That was a big driver of upgrades through all these years.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
From what I've seen, the 7870's power consumption should be around or lower than a GTX 460 1GB's, with many sources pointing to it being even lower than a 6870, which consumes less power than a GTX 460.
[Attached image: HD7870-58.jpg]
Just saw this, but Hardware Canucks uses 3DMark11 to measure load power consumption? Are they serious? There's another site I won't waste my time reading.