HD 6850 Meta-Analysis of oc'ing headroom and EyeFinity performance


T2k

Golden Member
Feb 24, 2004
1,665
5
81
Drawing conclusions from OC'd cards never made much sense, as chips/cards can vary wildly if they come from, say, a different batch or a different card manufacturer.
It's fine in reviews as informal info, but you have to check your own card for your own results.
 

betasub

Platinum Member
Mar 22, 2006
2,677
0
0
blastingcap: thanks for your thoughts on GPU requirements for 3x1080p gaming. As it stands, my budget & PSU can stretch to 2 mid-range cards, if Crossfire or SLI can handle my less graphics-heavy games across 3 screens. Holding out for Cayman makes sense - if I can resist from jumping on a current mid-range hot deal.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
blastingcap: thanks for your thoughts on GPU requirements for 3x1080p gaming. As it stands, my budget & PSU can stretch to 2 mid-range cards, if Crossfire or SLI can handle my less graphics-heavy games across 3 screens. Holding out for Cayman makes sense - if I can resist from jumping on a current mid-range hot deal.

Single-GPU = less power/heat/noise usually. Also means not having to worry about scaling and microstutter as much as people who use SLI/CF. But I understand your itch to upgrade.
 

busydude

Diamond Member
Feb 5, 2010
8,793
5
76
Single-GPU = less power/heat/noise usually. Also means not having to worry about scaling and microstutter as much as people who use SLI/CF. But I understand your itch to upgrade.
There is no single GPU today that can give good performance for a 3x1080p config. A 5870 is brought down to its knees, and the GTX 480, although the fastest GPU today, cannot be used unless you SLI.

IMO, single-GPU multi-monitor gaming is at least a generation away; maybe next-gen cards like the 7xxx series or Kepler will be able to handle it.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
There is no single GPU today that can give good performance for a 3x1080p config. A 5870 is brought down to its knees, and the GTX 480, although the fastest GPU today, cannot be used unless you SLI.

IMO, single-GPU multi-monitor gaming is at least a generation away; maybe next-gen cards like the 7xxx series or Kepler will be able to handle it.

Read the thread and his and my comments. He isn't trying to run Metro 2033 at 3x1080p. And Cayman is not "today" but reportedly ~5970-level.
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
*edit

You're right blastingcap, I misread it.
The 5970 is the fastest single card, no doubt about that.

Thought he meant the 480 being the fastest single chip, and was gonna defend him saying that's true, but the 5970 is the fastest card on the market.
 
Last edited:

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
blastingcap, he said fastest GPU today, which he's right about; the 480 is the fastest single-chip GPU there is today.

He didn't say fastest card on the market or such, in which case the 5970 would be.


nvidia has the fastest chip.
amd has the fastest card.

all that might change with the 6970 and 6990 and the 580, though.

I don't believe Nvidia could have fixed Fermi enough in such a short time to get 20% increases over the 480s. I think when the 6970 releases, that'll be the fastest chip, and the 6990 will be the fastest card on the market.

Read the thread before commenting on this please, not just the last few comments. The person I responded to wanted to run games on medium in 3x1080p. I gave my $.02 about CF/SLI which I dislike for above-stated reasons. If we narrow it down to single-GPU, I said that an oc'd 5870 could sorta get the job done (depending on what level of fps and settings you are willing to tolerate), but that a Cayman XT GPU is really what he might want to wait for assuming that it is ~5970 performance level.
 

busydude

Diamond Member
Feb 5, 2010
8,793
5
76
and cayman is not "today" but reportedly ~5970

The reason I didn't mention the 5970 was very obvious: it's a dual-GPU card, which you do not prefer.

I was not trying to interfere between you and betasub; I should have made it clear in my previous post, and I quoted the wrong post. It was my opinion based on the OP.

Even a ~5970 level of performance is not going to cut it; IMO, better IQ > multi-monitor gaming.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
The reason I didn't mention the 5970 was very obvious: it's a dual-GPU card, which you do not prefer.

I was not trying to interfere between you and betasub; I should have made it clear in my previous post, and I quoted the wrong post. It was my opinion based on the OP.

Even a ~5970 level of performance is not going to cut it; IMO, better IQ > multi-monitor gaming.

OK, you responded to a response I posted to someone else, which is why I thought you were not commenting on the first post of this thread. Glad we cleared that up.

When I wrote Cayman as being ~5970, I was talking about expected performance level, not about a 5970 card.

If you are going back to the first post of this thread: I was happy playing TF2 on medium-high settings at 5040x1050 on an oc'd 5850, but I understand that others may think that's not good enough IQ or high enough fps for them. Bully for them, they can get tri-SLI GTX480 or whatever is necessary to meet their standards. :thumbsup:
 

busydude

Diamond Member
Feb 5, 2010
8,793
5
76
O_O I'd much rather have a 3-6 screen Eyefinity setup than better IQ? What are you even talking about? IQ-wise, what are the ATI cards lacking?

You cannot be serious. It's not the AMD cards that lack IQ; when I say better IQ, I mean playing at native resolution + high quality + AF. I am much happier playing with high-quality settings on a single monitor than at medium/low settings with no AF or AA on multiple monitors.

What is the use of pushing in excess of 5 million pixels when all you get is slideshow performance?
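The "in excess of 5 million pixels" figure checks out; here is a minimal sketch of the per-frame pixel load, assuming plain 1920x1080 panels and ignoring the extra columns bezel compensation adds:

```python
# Per-frame pixel load: single 1080p monitor vs. a 3x1080p Eyefinity group.
# Bezel compensation would add a few hundred extra columns per bezel;
# that's ignored here.

def pixels(width, height, screens=1):
    """Total pixels rendered per frame across all screens."""
    return width * height * screens

single = pixels(1920, 1080)        # one 1080p panel
eyefinity = pixels(1920, 1080, 3)  # 5760x1080 Eyefinity group

print(f"single 1080p: {single:,} pixels")
print(f"3x1080p     : {eyefinity:,} pixels ({eyefinity // single}x the load)")
```

That is over 6.2 million pixels per frame, triple the fill-rate and memory-bandwidth demand of a single screen, which is why a card that is comfortable at 1920x1080 can drop to slideshow territory across three panels.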
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
ahh yeah... agreed.

however... check out that video :) gawd that is some smexi eyefinity setup.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
OK, you responded to a response I posted to someone else, which is why I thought you were not commenting on the first post of this thread. Glad we cleared that up.

When I wrote Cayman as being ~5970, I was talking about expected performance level, not about a 5970 card.

If you are going back to the first post of this thread: I was happy playing TF2 on medium-high settings at 5040x1050 on an oc'd 5850, but I understand that others may think that's not good enough IQ or high enough fps for them. Bully for them, they can get tri-SLI GTX480 or whatever is necessary to meet their standards. :thumbsup:

Cayman coming in at 5970 performance levels would be pretty sweet.

Going on 6870 performance and the estimates we've seen of Cayman's die size, it seems possible, especially with the improvements to CrossFire scaling. I guess we could see the same situation repeated again with the efficiency improvements:

a 6990 using two downclocked 6970 Cayman cores, and a 6950 being a mildly stripped-down 6970.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
(oc to 900MHz @ 1.087V drove the GTX460's power consumption to 237 watts at load!)

I have seen quite a few individual posts boasting that HD6800s overclock to >1000MHz at "only" 1.35V-1.45V. Who actually thinks it's "reasonable" to overvolt their aircooled HD58xx/HD68xx series for 24/7 operation at that voltage?!

1.087V and 237W for 900mhz GTX460 overclock isn't that bad actually. Take a look what happens once you clock Cypress RV870 chips beyond 1000mhz @ 1.35V.

ocgraph.png


I agree with the consensus that the HD6850 @ $179 is the better value over the HD6870 once clocked to 900MHz. But if we take overclocking into the equation, you can overclock the $180 GTX460 with better coolers to 850-900MHz, at which point the HD6850 will actually be slower. For overclockers with single monitors <1920x1200, the GTX460 is still the better card imo (and the $140 GTX460 768MB is a screamer too).
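To put rough numbers on that comparison, here is a quick sketch of the relative clock headroom involved, assuming the reference core clocks (675MHz for the GTX 460, 775MHz for the HD 6850) and the 900MHz targets mentioned above:

```python
# Relative overclocking headroom at a common 900 MHz target, starting from
# each card's reference core clock. Illustrative only -- actual performance
# gains depend on how each architecture scales with clock.

def headroom_pct(stock_mhz, target_mhz):
    """Percentage clock increase from stock to the overclock target."""
    return (target_mhz - stock_mhz) / stock_mhz * 100

print(f"GTX 460 (675 -> 900 MHz): +{headroom_pct(675, 900):.0f}%")
print(f"HD 6850 (775 -> 900 MHz): +{headroom_pct(775, 900):.0f}%")
```

The GTX 460 gets roughly twice the relative clock bump, which is the crux of the argument that an overclocked 460 can catch or pass an overclocked 6850.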
 
Last edited:

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
I have seen quite a few individual posts boasting that HD6800s overclock to >1000MHz at "only" 1.35V-1.45V. Who actually thinks it's "reasonable" to overvolt their aircooled HD58xx/HD68xx series for 24/7 operation at that voltage?!

1.087V and 237W for 900mhz GTX460 overclock isn't that bad actually. Take a look what happens once you clock Cypress RV870 chips beyond 1000mhz @ 1.35V.

ocgraph.png

That's why I was reluctant to go over 1.15V on my 5850 (I left it at stock voltage and ~866MHz core for 1680x1050 and went to 1.15V + 940MHz *only* when in Eyefinity), and why I will be similarly reluctant to push my GTX460-768 over 1.15V. Heck, I think I will just leave it at 1.087V unless I have some pressing reason to do otherwise.

People who think they can get free performance from overvolting (as opposed to oc'ing without voltage modification) are paying extra for electricity and potentially shortening the lifespan of the GPU while driving up heat and noise.

Oc'ing without a voltage mod will also drive up power/heat/noise, but it (a) won't shorten the life of the GPU, and (b) won't drive power/heat/noise up nearly as much as overvolting does.

The above paragraphs are still relevant if you watercool a GPU or have some other powerful cooling solution like a good vapor chamber and fan. Although powerful cooling offsets the increased temps due to voltage and thus prolongs the life of the GPU, the drawn wattage still has to go somewhere: the heat gets dumped out the back of the computer and heats up the room. And you still pay extra for the extra electricity drawn. Furthermore, water cooling and vapor chamber designs cost more, which defeats the purpose if one oc's to get "free" performance.

Although AT had this article about CPUs and not GPUs so it's potentially less applicable, this is an article about how sustained overvoltage can erode CPU lifespan: http://www.anandtech.com/show/2468/6
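The overvolting penalty can be sketched with a first-order dynamic-power model, P proportional to f*V^2 (a rough approximation that ignores leakage and board losses). The 900MHz / 1.087V / 237W baseline is the GTX 460 figure quoted earlier in the thread; the 950MHz and 1.15V operating points are hypothetical:

```python
# First-order dynamic power: P scales linearly with clock and with the
# square of voltage. Leakage and VRM losses are ignored, so treat the
# outputs as ballpark figures only.

def scaled_power(p_base, f_base, v_base, f_new, v_new):
    """Scale a measured load power to a new clock/voltage operating point."""
    return p_base * (f_new / f_base) * (v_new / v_base) ** 2

BASE_W, BASE_MHZ, BASE_V = 237.0, 900.0, 1.087  # GTX 460 figures from above

# Same voltage, +50 MHz: power grows only linearly with clock.
same_volts = scaled_power(BASE_W, BASE_MHZ, BASE_V, 950, 1.087)

# Overvolted to 1.15 V for the same 950 MHz: the V^2 term piles on top.
overvolted = scaled_power(BASE_W, BASE_MHZ, BASE_V, 950, 1.15)

print(f"950 MHz @ 1.087 V: ~{same_volts:.0f} W")
print(f"950 MHz @ 1.150 V: ~{overvolted:.0f} W")
```

Even that modest overvolt adds roughly 30W over the same-clock stock-voltage case, all of it ending up as heat; and the model understates things, since leakage itself grows with voltage and temperature.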
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Although AT had this article about CPUs and not GPUs so it's potentially less applicable, this is an article about how sustained overvoltage can erode CPU lifespan: http://www.anandtech.com/show/2468/6

Definitely, and the same applies to GPUs. This is why my i7 is staying put at 1.312V and my GTX470 @ 750MHz at 1.0V. I always try to find the highest overclock achievable as close to default voltage as possible. Once serious overvolting is added into the equation, you can kiss performance/watt and efficiency good-bye <-- now do you see why I sometimes get a little irritated when people discuss GTX480 vs. HD5870 power consumption running Core i7 920/930s @ 4.0GHz+ hehe.
 
Last edited:

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Definitely, and the same applies to GPUs. This is why my i7 is staying put at 1.312V and my GTX470 @ 750MHz at 1.0V. I always try to find the highest overclock achievable as close to default voltage as possible. Once serious overvolting is added into the equation, you can kiss performance/watt and efficiency good-bye <-- now do you see why I sometimes get a little irritated when people discuss GTX480 vs. HD5870 power consumption running Core i7 920/930s @ 4.0GHz+ hehe.

One thing I never got was why one would want to pay, say, $100 to watercool a GPU X in order to push it to the performance level of GPU Y, when GPU Y costs less than $100 more than GPU X and draws less power than the overclocked GPU X. Not to mention how watercooling voids most companies' warranties. Not to mention that GPU Y will have headroom to oc without overvolting, on top of that. :confused:

I guess they must really value silence or something. :)
 

NoQuarter

Golden Member
Jan 1, 2001
1,006
0
76
You cannot be serious. It's not the AMD cards that lack IQ; when I say better IQ, I mean playing at native resolution + high quality + AF. I am much happier playing with high-quality settings on a single monitor than at medium/low settings with no AF or AA on multiple monitors.

What is the use of pushing in excess of 5 million pixels when all you get is slideshow performance?

Meh, not me. I find the immersion from multi-monitor much more enjoyable than having the settings cranked up for increased IQ, especially when some of the IQ settings are things I don't want anyway, like depth of field. But I don't know why you'd need to turn off AF; that's the one setting I can't live without, and I never have an issue with it at 16x on Eyefinity.
 

Triggaaar

Member
Sep 9, 2010
138
0
71
Even a ~5970 level of performance is not going to cut it, IMO better IQ>Multi monitor gaming.
If I could drop a bit of IQ (1 drop in res, less/no AA) and keep reasonable frame rates, I'd rather have 3 screens than 1 at perfect settings. Obviously it depends on what style of game you like too.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
So Amazon sent me a semi-defective 460-768MB that may die at any moment but which I have 30 days to return. I ordered a 460-1GB Cyclone to replace it, and it's been 3 days now with no activity re: shipping? The card is listed as in stock. Wtf? It only took TWO business days for the 460-768MB to ship; what's the holdup with the 1GB?

They took so long that I canceled the 460-1GB order to order the ASUS 6850 instead, now that the price on those has dropped to a reasonable $187 and the shipping time has an estimate (instead of no estimate). I'll be on vacation or otherwise busy for mid-November, so 2-4 weeks isn't so bad. Better yet, in the meantime I can use the 460-768 from now till mid-November, since I have 30 days to return it.

Oh, and I realized just how much I actually *do* miss Eyefinity over the last few days with this GTX460... funny how much I took for granted with my old 5850 setup in TF2.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,722
418
126
The 6870 hit 1000 on the core here with no voltage bump; the fps graphs are a let-down, as they are merely bar graphs lacking specific numbers.

http://www.overclock3d.net/reviews/gpu_displays/asus_hd6870_crossfire_exclusive_review/4

Hover your mouse pointer over each bar to see the number.

All these darn sites trying to innovate where it isn't needed - just put the fracking number there; I don't want to go bar by bar!


We allow cussing in P&N and OT, not in the tech forums.

Moderator Idontcare
 
Last edited by a moderator:

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Got my Sapphire 6850 yesterday and ran some 20- to 25-minute OCCT tests to get an idea of how hard the card could be pushed at stock volts with case fans on Low.

Overnight results:

90 minutes of OCCT with zero errors: 951/1236 @ stock volts. With my mildly aggressive fan profile, temps and fan peaked together at ~65C/65%. TriXX + GPU-Z mimics Afterburner. Afterburner has some sort of problem whenever I switch Eyefinity modes--which is a lot--so it looks like TriXX + GPU-Z will have to do until Afterburner fixes that problem.

I might be able to squeeze a few more MHz but 951 is good enough, and I want to leave a cushion in case of hot summer days or something.

I'm curious how high this thing goes at 1.3V but honestly I don't like overvolting much and as far as I'm concerned, this card's stock volts is ALREADY an overvolt, compared to my reference HD5850 (something like 1V or 1.087V).

951 would rank as the second-highest result on the list of review site oc's on the first page of this thread, as of today, so I think my card did relatively well. :)
 
Last edited: