Q: max-OC'd FTW GTX460 vs max-OC'd XFX Vapor Chamber 6850, which wins? Answer inside.


blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
The title of the thread is GTX460 overclocked vs. 6850 overclocked, who wins.

I said that after reading the review posted, the GTX460 won every test at a normal 1920x1080 resolution.

Blastingcap said it doesn't, because in three years the 6850 will make up for the performance by using less electricity. :whistle:
A whole 11 watts less at load. Then he said yeah, but it uses 15 fewer watts at idle. :) Unbelievable.

I went off, basically. What's next, the 6850 is better because it's lighter and uses higher-gloss paint?

Leave me out of it, dude. I said it's a close call and the GTX460 might even have a price/perf advantage, and that it's hard to go wrong. You then spouted off on how superior the GTX460 was and I commented on power, at which point you said some pretty nasty things, and I gave you a lesson in electricity pricing 101. You seemed incensed at this and attempted to create a poll.

My overall conclusion from the very first post of this thread STILL remains my opinion: it's hard to go wrong either way and both cards are good on price/perf. And virtually all 6850 reviews I've read agree. Even Notty agrees with my conclusion, even though we disagree on how to get there. You have a pretty lonely position if you're trying to make one card sound sooo superior to the other. Reading through this thread, I don't think even one other person disagrees with my conclusion, except for you.

As for my 3-year example, it doesn't even need to be 3 years. One year of +$17 in costs is probably enough to wipe out whatever price/perf advantage a GTX460 has over an HD6850, when the divisor is only $180 and when the perf advantage isn't even 15% in benchmarks, let alone in actual gameplay.
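For anyone who wants to check the arithmetic, here is a quick sketch of where a figure like $17/year comes from. It assumes roughly a 15 W round-the-clock draw difference and about $0.13/kWh; both numbers are illustrative stand-ins, not measurements of any specific card or bill:

# Back-of-the-envelope electricity cost of a small, constant wattage delta.
watts_delta = 15            # assumed round-the-clock draw difference, in watts
hours_per_year = 24 * 365
rate_per_kwh = 0.13         # assumed electricity price, in $/kWh

kwh_per_year = watts_delta * hours_per_year / 1000.0
cost_per_year = kwh_per_year * rate_per_kwh
print(f"~{kwh_per_year:.0f} kWh/year -> ~${cost_per_year:.2f}/year")
# prints: ~131 kWh/year -> ~$17.08/year

Change the wattage, the hours, or the rate and the dollar figure moves with it.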

So please, stop it with your revisionist history and putting words in my mouth.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
As for my 3-year example, it doesn't even need to be 3 years. One year of +$17 in costs is probably enough to wipe out whatever price/perf advantage a GTX460 has over an HD6850,

I am not even sure how both of you arrived at this argument. There is no conclusive evidence that supports the view that the HD6850 consumes less power at idle than the GTX460:

GTX460 850MHz consumes 5 watts less at idle: http://www.techreport.com/articles.x/19844/15

GTX460 850MHz consumes +/-2 watts more at idle depending on the model of HD6850:
http://www.anandtech.com/show/3987/...enewing-competition-in-the-midrange-market/20

GTX460 and HD6850 have identical power consumption at idle:
http://www.techpowerup.com/reviews/HIS/Radeon_HD_6850/26.html

GTX460 850MHz consumes 4 watts less at idle: http://www.tomshardware.com/reviews/radeon-hd-6870-radeon-hd-6850-barts,2776-22.html

etc. etc. etc.

The bottom line is, if we are going to start nitpicking 10-15W power differences, should we start washing our clothes by hand in cold water? Both the HD6850 and GTX460 are excellent cards. There is no need to focus on minute details like idle power consumption differences. Features, price, warranty, gaming bundles, noise levels, and the particular games you play are far more important when comparing these two. Plus, why does one absolutely have to be better than the other? With the GTX460, you need to deal with rebates or the price is $200, while HD6850s are $20 less without the rebate. This alone might sway a person to buy one over the other. :p

At the end of the day, you can't go wrong with either card.
 
Last edited:

thedosbox

Senior member
Oct 16, 2009
961
0
0
What's next, the 6850 is better because it's lighter and uses higher-gloss paint?

Huh? Some people care about the aesthetics of their PCs. Just look at the demand for cable management features in cases, LED fans, or case windows.

I went off, basically.

That much is true.

In short, different strokes for different folks. Some people care about outright performance, others care about noise levels, others care about having the latest features (however useless they might be today). Get over it.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
At the end of the day, you can't go wrong with either card.

Yet another person who agrees with my conclusion even if they differ on how to get there. :thumbsup:

For some reason, system power draws tend to be higher in reviews with the GTX; not sure why, because the boards idle at nearly equal wattage. System power draw != board power draw. I talked about this earlier in a response to Notty. Also, idle draw with an overvolt may differ from idle draw without one.

By the way RS, I think that I am even more of a cheapskate than you. :p (Referring to our past discussion with Grooveriding, on waiting for deals on hardware and how it doesn't necessarily mean buying the worst possible brands.)
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
By the way RS, I think that I am even more of a cheapskate than you. :p (Referring to our past discussion with Grooveriding, on waiting for deals on hardware and how it doesn't necessarily mean buying the worst possible brands.)

Grooveriding has 2x GTX480s in SLI. 90% of people on our board are "cheapskates" compared to him.

Scroll down to the review by "Heather":

"I purchased two of these beasts .... since I drive 6 monitors they even sent me extra DVI to mini-DVI connectors." ;)

That's $2400 for 2 videocards.... :whistle:
 
Last edited:

cusideabelincoln

Diamond Member
Aug 3, 2008
3,274
41
91
Keep the spin up, buddy. You honestly think 11 watts or 15 watts is worth arguing about vs. performance/price?

If people are going to argue about minute differences in performance per dollar, then taking the total cost of ownership of the products into consideration is completely legitimate and valid. In today's competitive market, ever since AMD introduced the Radeon HD 4000 series, just about every card has been priced according to its performance. As such, I find "arguing" over price/performance stupid. You get what you pay for these days.

I've posted on this general topic before, and my sentiment then was that if the electrical costs associated with the power consumption of your choice of GPU represent a make-or-break decision based on the financial aspects, then one should really question whether they can truly afford to be spending their time and money on the hobby in the first place.

On the other hand, some people really do get a real emotional charge out of feeling like they snagged the best deal possible, or a real feeling of satisfaction from knowing they analyzed their choices to the nth degree and took the option that saved them a penny.

Meaning the cost-savings portion of the decision matrix becomes a hobby in its own right.

If you talk to some hard-core F@H people, you'll know what I'm talking about.

I agree, along the same lines as what I just wrote above. Arguing over these minute differences in power isn't generally worth it (unless it's a legitimate concern for the user; case-by-case basis), and arguing over $10-$20 differences in upfront (point-of-sale) video card cost is also not generally worth it. A user should just buy whatever he needs under the price he has set for himself.

There is no need to focus on minute details like idle power consumption differences. Features, price, warranty, gaming bundles, noise levels, and the particular games you play are far more important when comparing these two. Plus, why does one absolutely have to be better than the other? With the GTX460, you need to deal with rebates or the price is $200, while HD6850s are $20 less without the rebate.

Power consumption is (or can be) just as important, since it influences noise levels and price (total cost of ownership). Its importance, like the importance of any of the other attributes you listed, is context-specific. As such it can be more or less important, and the same can be said of price, performance, features, warranty, etc.

Blastingcap absolutely has a point, and under the premises he assumes, he has a very legitimate argument. If people are going to question it as if he's pushing an agenda (which I don't think he is doing at all), then we might as well question the stupid performance/price metric. Yes, I called it stupid, because I think the ratio doesn't matter very much. Why? Because it is context-specific and changes when any variable, no matter how minute, changes.
And they have no idea why each is doing so.
Curious statement. I demand (yes, DEMAND!) that you expand on your thought here. OK, demanding might be too harsh, but I do politely ask you to do so, even though I think it may be off topic, as I am quite curious as to what you have to say on the matter. And curious in the absolutely innocent sense, not the malicious one, in case you have second thoughts about my motives.
 
Last edited:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Grooveriding has 2x GTX480s in SLI. 90% of people on our board are "cheapskates" compared to him.

Scroll down to the review by "Heather":

"I purchased two of these beasts .... since I drive 6 monitors they even sent me extra DVI to mini-DVI connectors." ;)

That's $2400 for 2 videocards.... :whistle:

Just curious, what res and monitor arrangement are you running?

Sweet mother of mercy. I bet those run Crysis.

...and make it look like the Enterprise's (NCC-1701) bridge main viewing screen. :)
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
If people are going to argue about minute differences in performance per dollar... *snip*

Either it IS or IS NOT a big deal to have a <10% performance delta, or a <10% price delta. If you think one is significant the other should be too. And if you think one is insignificant the other should be too. If you are consistent about these things then you see that the price/performance ratio for the cards is about equal.

If you think the power draw issue, at less than 10% of the cost of the card, is "minute," then it should also follow than overvolt+overclock giving the GTX460 less than 10% performance advantage is also minute. <-- this is consistent

Same thing if you are saying $17 is unimportant in the big picture, then a slightly faster video card is ALSO unimportant in the big picture. <-- this is consistent

I am saying the overall price difference is mild (including electricity costs) but so is the performance difference. <-- this is consistent

A certain person on this thread seems to think it's okay to hype up the performance difference but--in a fit of inconsistency--to then wholly dismiss the price difference, in order to try to make it look like one card is significantly superior to the other. <-- this is inconsistent
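To put toy numbers on that consistency point, here is a rough perf-per-dollar sketch. The prices are the ones already mentioned in this thread ($200 for the GTX460 before rebate, about $20 less for the HD6850); the 7% performance edge for the GTX460 is purely an assumed stand-in for "less than 10%":

# Toy perf-per-dollar comparison; the 7% GTX460 advantage is an assumption.
cards = {
    "HD6850": {"price": 180.0, "perf": 100.0},
    "GTX460": {"price": 200.0, "perf": 107.0},
}

for name, card in cards.items():
    print(f"{name}: {card['perf'] / card['price']:.3f} performance per dollar")
# HD6850: 0.556 performance per dollar
# GTX460: 0.535 performance per dollar

Both land within a few percent of each other, which is the point: neither the price delta nor the performance delta is decisive on its own.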
 

Will Robinson

Golden Member
Dec 19, 2009
1,408
0
0
Dude, it's similar in performance, has a new video decode engine and DisplayPort, and can run 6 monitors off one card.
Looks like the 6850 overclocks pretty well, if that's your thing as well.
Power usage is really only super important if it makes the card run way hot or very loud. If not, I don't think the power difference is much of a deal breaker with either card.
More goodies to come, like Morph AA with next month's drivers, too... grab one and enjoy!
 

Rhezuss

Diamond Member
Jan 31, 2006
4,118
34
91
My XFX 6850 is better than the GTX 460 because it is kewl, and the faint blue glow from my case is making my 6850 look like heaven.

OT: I don't know why it always has to end up like that, heheh... for what, a few watts less at load from the Red team and a few more FPS from the Green team?

Maybe I'm just not as much of a perfectionist as you are, but I must say the debates are very interesting nonetheless.
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,274
41
91
Either it IS or IS NOT a big deal to have a <10% performance delta, or a <10% price delta. If you think one is significant the other should be too. And if you think one is insignificant the other should be too. If you are consistent about these things then you see that the price/performance ratio for the cards is about equal.

That's what I was saying, and I agree to an extent.

Price to performance changes depending on the variables you include. Most people don't include power consumption. Also, we mostly don't take into detailed consideration the specific games a person will be playing, the resolution, the level of AA, the matched CPU, etc. All of those things, as well as driver updates, affect the price to performance ratio for a specific person.

So if we're going to argue about price/performance at all then it is certainly well within anyone's right to bring in these variables. They are legitimate considerations.

But as a whole I now find arguing over price to performance a bit pointless. From top to bottom almost all cards are priced accordingly anyway. Such is the way of a competitive market, and it's basically been that way since Nvidia lowered prices on their GTX 200 cards to match AMD. Even before that I suppose the HD3000 series was priced to match the 8000 and 9000 cards.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Actually I think we agree 100%. I was talking about just price/perf in the previous post in this thread, but that doesn't mean I don't think there are other variables. Earlier in this thread I said that since price/perf is so close, people should choose based on whatever else matters to them, whether the tiebreaker is CUDA, PhysX, Eyefinity, or even something like the color of the cooler or whatever. And I agree about stuff like the specific games played, etc. And I understand that price/perf is NOT the top variable to consider for some people.

That's what I was saying, and I agree to an extent.

Price to performance changes depending on the variables you include. Most people don't include power consumption. Also, we mostly don't take into detailed consideration the specific games a person will be playing, the resolution, the level of AA, the matched CPU, etc. All of those things, as well as driver updates, affect the price to performance ratio for a specific person.

So if we're going to argue about price/performance at all then it is certainly well within anyone's right to bring in these variables. They are legitimate considerations.

But as a whole I now find arguing over price to performance a bit pointless. From top to bottom almost all cards are priced accordingly anyway. Such is the way of a competitive market, and it's basically been that way since Nvidia lowered prices on their GTX 200 cards to match AMD. Even before that I suppose the HD3000 series was priced to match the 8000 and 9000 cards.
 

Concillian

Diamond Member
May 26, 2004
3,751
8
81
Total utter bullshit. $17! No effen way, man. 13 cents a kilowatt-hour? Is that the US average or something? I pay 8 cents here in Philly.

I pay about 11 cents, but we have a limited allotment of power at that cost. If we go above that allotment, the next "tier" of pricing jumps all the way to 30 cents a kWh.

Power usage does matter to some. 11 cents isn't too bad, but if I use "too much" and jump into the 30 cents a kWh tier, the power my stuff uses starts making a considerable difference. There was a time when every kWh I cut from my usage was worth 30 cents, because I was always in the higher tier. Now I've reduced to the point where I'm not ALWAYS in the 30-cent tier, but it's pretty easy to justify the difference between more efficient and less efficient parts at 30 cents a kWh.
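Here is a rough sketch of why the tier matters so much for the marginal cost of a hotter part. The 11-cent and 30-cent rates are the ones above; the 500 kWh monthly allotment is a made-up placeholder, not my actual number:

# Two-tier residential pricing sketch; the allotment value is hypothetical.
TIER1_RATE = 0.11        # $/kWh within the monthly allotment
TIER2_RATE = 0.30        # $/kWh above the allotment
TIER1_LIMIT_KWH = 500.0  # hypothetical monthly allotment

def monthly_bill(usage_kwh: float) -> float:
    """Cost of one month's usage under the two-tier scheme."""
    tier1 = min(usage_kwh, TIER1_LIMIT_KWH)
    tier2 = max(usage_kwh - TIER1_LIMIT_KWH, 0.0)
    return tier1 * TIER1_RATE + tier2 * TIER2_RATE

# An extra 15 W running around the clock is ~11 kWh/month. If the household
# is already over the allotment, every one of those kWh bills at 30 cents:
extra_kwh = 15 * 24 * 30 / 1000.0
print(f"${monthly_bill(510 + extra_kwh) - monthly_bill(510):.2f}/month")  # ~$3.24

Once you're over the allotment, the marginal kWh costs nearly three times the base rate, so the same wattage delta costs roughly three times as much.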
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Just curious, what res and monitor arrangement are you running?

I think you got confused about the post. Blasting and I were talking about trying to find good deals on videocards, even if it means waiting for months until a good deal pops up. Then I just brought up an example of how even a person with 2x 480s can appear to be a "cheapskate" to someone else who just dropped $2400 on videocards... That's why I linked the amusing review of someone else (not me) who bought 2x $1200 5970s.

Power consumption is (or can be) just as important, since it influences noise levels and price (total cost of ownership).

Yes, it can be important (e.g. HD6870 vs. GTX470/480). In this particular example, comparing the GTX460 vs. the HD6850, it isn't important. I just googled 4-5 websites and they all show nearly identical idle power consumption, and if anything it slightly favours the 460 by 2-3W. You can see the links I provided above.
 
Last edited:

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Yes, it can be important (e.g. HD6870 vs. GTX470/480). In this particular example, comparing the GTX460 vs. the HD6850, it isn't important. I just googled 4-5 websites and they all show nearly identical idle power consumption, and if anything it slightly favours the 460 by 2-3W. You can see the links I provided above.


If it turns out that they're tied at idle (even systemwide, not just card), then the load wattage delta still makes the 6850 eat less energy, but it would be a far smaller amount. Instead of $50 extra over three years, it'd be more like $5 (at 13 cents/kWh and assuming a few hours of gaming per day on average).
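For the load-only case, the same kind of back-of-the-envelope sketch gives the ballpark. The ~11 W load delta and $0.13/kWh come from earlier in the thread; the 3 hours of gaming per day is just an assumption:

# Load-only sketch: small wattage delta, a few gaming hours a day.
watts_delta = 11      # approximate load draw difference, in watts
hours_per_day = 3     # assumed average daily gaming time
rate_per_kwh = 0.13   # assumed electricity price, in $/kWh
years = 3

kwh = watts_delta * hours_per_day * 365 * years / 1000.0
print(f"~{kwh:.0f} kWh over {years} years -> ~${kwh * rate_per_kwh:.2f}")
# prints: ~36 kWh over 3 years -> ~$4.70

That's where the "more like $5" figure above comes from.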

Also, idle system draw isn't the same as idle *card* draw:

HWC shows +14W (or +15W, depending on which card) system
Guru shows +13W system (can't access it right now, but look earlier in the thread for the link)
HardOCP shows +4W system http://www.hardocp.com/article/2010/10/21/amd_radeon_hd_6870_6850_video_card_review/8
AT shows zero difference at idle: http://www.anandtech.com/show/3987/...enewing-competition-in-the-midrange-market/20

On the other hand you do have other sites saying system draw is lower on the GTX460:

http://www.pcper.com/article.php?aid=1022&type=expert&pid=13 (-5W)

I speculated as to why that might be the case already--see my previous post about this (e.g., maybe NV cards require higher CPU utilization or something; or different cards by different AIBs with different circuitry and cooling could impact things).

One additional X-factor that I forgot about until now can easily explain idle discrepancies, too: according to reviewers like Anandtech, the voltage of GTX460s is not fixed! The voltage can therefore vary from card to card; they just assign enough voltage to hit 675MHz with a safety cushion and call it a day: http://www.anandtech.com/show/3809/nvidias-geforce-gtx-460-the-200-king/17

"As we've discussed in previous articles, with the Fermi family GPUs no longer are binned for operation at a single voltage, rather they're assigned whatever level of voltage is required for them to operate at the desired clockspeeds. As a result any two otherwise identical cards can have a different core voltage, which muddies the situation some. All of our GTX 460 cards have an idle voltage of 0.875v, while their load voltage is listed below."

While AT's small sample all had 0.875v at idle, I can imagine that not always being the case, and that some cards need more voltage at idle. Perhaps HWC's GTX460 was at a slightly higher idle wattage?

And because there are almost no stock 6850s, the voltages may vary from AIB to AIB in the 6850's case as well.

If we distrust HWC's wattage numbers, maybe we should distrust its performance results as well. And performance averages vary depending on which games are sampled. Food for thought.

Anyway I no longer have much interest in this matter because it'd take a miracle for me to waver from the Sapphire 6850 at this point--Amazon started running out of stock last night, which got me to commit to buying one before they ran out entirely. I was going to wait till AT's roundup, but the time pressure made me cave. Sorry ASUS, but I wanted Sapphire's larger shroud (the better to funnel hot air OUT of my case rather than push it back into the case) and am willing to give up a PCB stiffener for it. The $7 premium for the ASUS doesn't help. (For anyone else making the same ASUS vs Sapphire comparison: I looked at Rage3d's review and Vortez and others and it appears to me that Sapphire and ASUS both used solid caps for their 6850s. They also have the same number of power phases and similar GPU temps, noise, etc. And the cooler designs are very similar, except ASUS has a smaller shroud and direct-contact heatpipes.)

Of course if Amazon ships me a defective 6850, I may revisit this matter. :)
 
Last edited:

blckgrffn

Diamond Member
May 1, 2003
9,288
3,429
136
www.teamjuchems.com
So is the only thing keeping Nvidia from releasing an updated 460 with an 800MHz core clock the fact that they already used the number 465? Or are we going to see a "refreshed" 560?

All the OC bull crap tends to wear on my patience - if Nvidia wanted to sell 460s clocked almost 200MHz higher, the solution is simple.

Create a new SKU.

That goes for getting cheaper ones and clocking them way, way up - what kept Nvidia from clocking them all around 750MHz?

Factory "OC" cards in the past have been a PITA for me because even if BFG/EVGA/Visiontek tested their cards with the majority of the current games, something can come out and stress a part of the card that maybe wasn't stressed before (like Crysis and the damn floating-around-in-the-ship part), you can get instability/hard locks, and you might find yourself in the position of manually downclocking your card even if you aren't the type of person who likes to dork around with that crap.

My $.02 on the deal.
 
Last edited:

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
So is the only thing keeping Nvidia from releasing an updated 460 with an 800MHz core clock the fact that they already used the number 465? Or are we going to see a "refreshed" 560?

All the OC bull crap tends to wear on my patience - if Nvidia wanted to sell 460s clocked almost 200MHz higher, the solution is simple.

Create a new SKU.

Factory "OC" cards in the past have been a PITA for me because even if BFG/EVGA/Visiontek tested their cards with the majority of the current games, something can come out and stress a part of the card that maybe wasn't stressed before (like Crysis and the damn floating-around-in-the-ship part), you can get instability/hard locks, and you might find yourself in the position of manually downclocking your card even if you aren't the type of person who likes to dork around with that crap.

My $.02 on the deal.

NV has never had a problem with re-using numbers before.
They could always change the preceding letters or add a suffix anyway, like GTX465 Ultra.

IMO it's more a play to be nice to partners, allowing them to heavily differentiate offerings, rather than having specs mandated by NV.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Nice write-up, blastingcap; thanks for posting all that. It looks like the 6850 is the new price/performance king.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
So is the only thing keeping Nvidia from releasing an updated 460 with an 800MHz core clock the fact that they already used the number 465? Or are we going to see a "refreshed" 560?

All the OC bull crap tends to wear on my patience - if Nvidia wanted to sell 460s clocked almost 200MHz higher, the solution is simple.

Create a new SKU.

That goes for getting cheaper ones and clocking them way, way up - what kept Nvidia from clocking them all around 750MHz?

Factory "OC" cards in the past have been a PITA for me because even if BFG/EVGA/Visiontek tested their cards with the majority of the current games, something can come out and stress a part of the card that maybe wasn't stressed before (like Crysis and the damn floating-around-in-the-ship part), you can get instability/hard locks, and you might find yourself in the position of manually downclocking your card even if you aren't the type of person who likes to dork around with that crap.

My $.02 on the deal.

I'm guessing here, but if I had to speculate, I would suspect this has more to do with NV's AIB partners wanting to have this little niche opportunity to differentiate their products as they see fit, versus having Nvidia tell five different partners that they need to build and compete with each other over a new SKU at a pre-defined clockspeed and MSRP.

Consider that, unlike the case with CPUs, all these AIBs are scrambling and clawing to find some way to "add value" to the supply chain above and beyond their competitors' offerings.
 

Aristotelian

Golden Member
Jan 30, 2010
1,246
11
76
I've posted on this general topic before, and my sentiment then was that if the electrical costs associated with the power consumption of your choice of GPU represent a make-or-break decision based on the financial aspects, then one should really question whether they can truly afford to be spending their time and money on the hobby in the first place.

That's only an issue for people who treat a few extra dollars per month on their electricity bill as the only relevant downside of using extra electricity for no extra performance gain. Yet I have posted on this topic before too, and nobody seems to realize that we're in an age where our energy consumption matters collectively.

When I leave my apartment I turn off the lights, and I hope you do too. Similarly, if I can have a computer that draws 100W less while gaming than another computer, all else being equal, I will buy that one. If I did not, I would be committed to saying that (all else equal) I'm happy to leave at least one 100W light on at home when I leave FOR NO REASON. (Aside: if you think it's a security deterrent because you live in Compton, blah blah, then you have an additional reason to leave that light on, but that's not analogous to my 'all else equal' scenario.)

In essence, there are plenty of us with loads of money who care about efficiency not simply for efficiency's sake, but because of the effect that WASTING ENERGY WHERE ALL ELSE IS EQUAL has on this planet. When gamers choose between cards, they ought to care about power efficiency as well.