X-Bit Labs review 68xx vs. 460/470

IndyColtsFan

Lifer
Sep 22, 2007
33,656
687
126
Perfect, thanks OP. I'm debating between a 6870, an OC 460, and a 470 and this will be very helpful.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
It took them only like 3 weeks to have their review out. I've been waiting for this one.
 

Hauk

Platinum Member
Nov 22, 2001
2,808
0
0
It took them only like 3 weeks to have their review out. I've been waiting for this one.

Me too. They took their sweet time, but it's very good. Don't miss the power consumption graph. Wow...
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Are they kidding? Look at this graph. What's wrong with this picture?

hd6800_power.png
 

Termie

Diamond Member
Aug 17, 2005
7,949
48
91
www.techbuyersguru.com
Are they kidding? Look at this graph. What's wrong with this picture?

hd6800_power.png

Looks like they had a bum HD6850 card:

"Thus, the new Radeon HD series proves to be very energy efficient in our tests except for the disappointing failure of the PowerPlay technology in the Radeon HD 6850 in some modes, but we don’t think that off-the-shelf versions of the card will have that defect."
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
How does the GTX460 768MB beat a 6850 in half the tests @ 1920x1080? That would mean a GTX460 1GB is faster.
I don't get these results.

Quote from review:

" But it is not definitely better than the GeForce GTX 460 768MB."
They do state the 6850 wins at 2560x1600.

 
Last edited:

toyota

Lifer
Apr 15, 2001
12,957
1
0
The thing I like about Xbit is that they are legit and do a good job.

I honestly doubt some of their numbers in many reviews. A while back they had a benchmark that would get horrible performance with the GTX260, and I contacted them about it; on more than one occasion they literally just changed the numbers in the benchmark after I complained. o_O
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
I honestly doubt some of their numbers in many reviews. A while back they had a benchmark that would get horrible performance with the GTX260, and I contacted them about it; on more than one occasion they literally just changed the numbers in the benchmark after I complained. o_O

Yea, there is no way these numbers are right. It would put the GTX460 1GB faster than the 6850.

It also makes the GTX460 768MB a cheaper and equally fast card vs. the 6850.

I have a feeling Xbit Labs reviews will no longer be posted.
They must be Nvidia fanboys. :p

Wonder how this one will be spun? Incoming in 3, 2, 1... :whistle:
 
Last edited:

Sickamore

Senior member
Aug 10, 2010
368
0
0
The thing is I didn't like the AnandTech review, because they added the FTW to the review and made it look like the 460 was that powerful compared to a 6-series card.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
The thing is I didn't like the AnandTech review, because they added the FTW to the review and made it look like the 460 was that powerful compared to a 6-series card.

I believe they used a stock GTX460 also?

I wonder what Xbit used to make the GTX460 768MB look like it's faster and cheaper than the 6850?

There will probably be a 6850 hotfix 10.2b version 5 driver for their update of this article. j/k :)
 

Lonyo

Lifer
Aug 10, 2002
21,939
6
81
A lot of the results are waaaaaay off from what AT had in their review (sure, different settings etc., but nowhere near enough to account for the sorts of swings that seem to have happened in the Xbitlabs review).
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
A lot of the results are waaaaaay off from what AT had in their review (sure, different settings etc., but nowhere near enough to account for the sorts of swings that seem to have happened in the Xbitlabs review).

I didn't notice, could it have been new Nvidia drivers?
The review is a little later than the others.
Nah, they used the same 260.89s, just checked.
 

Sickamore

Senior member
Aug 10, 2010
368
0
0
I believe they used a stock GTX460 also?

I wonder what Xbit used to make the GTX460 768MB look like it's faster and cheaper than the 6850?

There will probably be a 6850 hotfix 10.2b version 5 driver for their update of this article. j/k :)


Yeah, they did use a 460. However, if I had to pick a card I would go for the 460 FTW, because it was cheaper. Why pay more for the same performance?

Also, the reason I like the Xbitlabs reviews is that they have done a great job reviewing certain hardware that I own.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
A lot of the results are waaaaaay off what AT had in their review (sure, different settings etc) but nowhere near enough to do the sorts of swings that seem to have happened in the Xbitlabs review.

The numbers are almost never the same from one review to another due to different settings and benchmarks/methodology used. For example, AT tests BattleForce in DX10 vs. Xbitlabs in DX11. Also, since they use Fraps in some of their games, it's impossible for the numbers to be aligned since the reviewers aren't going to run identical areas of the game. However, look at the general trends between many websites, such as Xbitlabs and AT. You will see similar trends.

AMD cards are generally faster in:
BF:BC2, Crysis, Mass Effect 2, Mafia 2

NV cards are generally faster in:
Lost Planet 2, Metro 2033, Starcraft 2, Far Cry 2, STALKER: CoP, Hawx 2, Civilization 5.

Other games like Just Cause 2 and Dirt 2 are a toss up depending on the review site.

So while the numbers are not comparable, the "trends" in Xbitlabs review are not much different than what is reported in TechReport, Hardware Canucks, Legion Hardware/Techspot, AT, Computerbase.de, BitTech.net, etc.
 
Last edited:

Lonyo

Lifer
Aug 10, 2002
21,939
6
81
The numbers are almost never the same from one review to another. For example AT tests BattleForce in DX10 while Xbitlabs tests in DX11. Also, since they use Fraps in some of their games, it's impossible that the numbers will be aligned since the reviewers aren't going to run identical areas of the game. However, look at the general trends between many websites, such as Xbitlabs and AT. You will see similar trends.

AMD cards are better for:
BF:BC2, Crysis, Mass Effect 2, Mafia 2

NV cards are better for:
Lost Planet 2, Metro 2033, Starcraft 2, Far Cry 2, STALKER: CoP, Hawx 2, Dirt 2.

So while the numbers are not comparable, the "trends" in Xbitlabs review are not much different than what is reported in TechReport, Hardware Canucks, Legion Hardware/Techspot, AT, Computerbase.de.

The AT review shows ATI being better at Starcraft 2, with the 6850 beating the GTX460 1GB (non-OC) by a decent margin, while Xbitlabs shows the 6870 to barely match the GTX460 1GB.

That's a pretty damned huge difference.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
The AT review shows ATI being better at Starcraft 2, with the 6850 beating the GTX460 1GB (non-OC) by a decent margin, while Xbitlabs shows the 6870 to barely match the GTX460 1GB.

That's a pretty damned huge difference.

You picked one game that's 100% certain to not have identical results, since there is no built-in benchmark for SC2 and each review site had to create their own from scratch. Even then, I'm not sure what you mean by "better" when the GTX460 is putting out 50fps in AT's SC2 bench at 1920x1200 4AA vs. 65fps on the HD6870 - basically impossible to tell the difference in gameplay in a strategy game.

That aside, AT's review is in fact more of an outlier compared to what everyone else is reporting. The HD6870 is not much faster than the GTX460 1GB in SC2 (not that this matters, since both of them can play the game very well, so any performance advantage by either side is trivial).

sc2hardwarecanucks.jpg


starcraft2201920.png


sc2m.gif


At 2560x1600 4AA, Xbitlabs has 38 fps for the GTX470 and 33 fps for the HD6870, vs. 39 fps for the GTX470 and 36 fps for the HD6870 on TechReport. That's easily within a margin of error.

The bottom line: HD5850/70, HD6850/70, and GTX460/470 all play SC2 with no problems. That's why I think this game is a waste of time for testing modern videocards above $200. I mean, you can't tell the difference between 42 fps and 36 fps in a strategy game like SC2 ;)
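If you want to put numbers on that site-to-site spread, here's a quick back-of-the-envelope sketch (the fps figures are just the ones quoted above from the two reviews; the site names are labels, nothing more):

```python
# Relative GTX470-vs-HD6870 gap at 2560x1600 4AA, per review site.
# fps figures are the ones quoted above from each review.
results = {
    "Xbitlabs":   {"GTX470": 38, "HD6870": 33},
    "TechReport": {"GTX470": 39, "HD6870": 36},
}

for site, fps in results.items():
    gap = (fps["GTX470"] - fps["HD6870"]) / fps["HD6870"] * 100
    print(f"{site}: GTX470 leads by {gap:.1f}%")
# → Xbitlabs: GTX470 leads by 15.2%
# → TechReport: GTX470 leads by 8.3%
```

Either way you slice it, both cards land in the same playable range at that resolution.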
 
Last edited:

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
The numbers are almost never the same from one review to another due to different settings and benchmarks/methodology used. For example, AT tests BattleForce in DX10 vs. Xbitlabs in DX11. Also, since they use Fraps in some of their games, it's impossible for the numbers to be aligned since the reviewers aren't going to run identical areas of the game. However, look at the general trends between many websites, such as Xbitlabs and AT. You will see similar trends.

AMD cards are generally faster in:
BF:BC2, Crysis, Mass Effect 2, Mafia 2

NV cards are generally faster in:
Lost Planet 2, Metro 2033, Starcraft 2, Far Cry 2, STALKER: CoP, Hawx 2, Civilization 5.

Other games like Just Cause 2 and Dirt 2 are a toss up depending on the review site.

So while the numbers are not comparable, the "trends" in Xbitlabs review are not much different than what is reported in TechReport, Hardware Canucks, Legion Hardware/Techspot, AT, Computerbase.de, BitTech.net, etc.

RS, I am largely in agreement with you in that other thread where you talked about how the most stressful new games were the ones that really mattered, since older games would get maxed out more easily anyway. (Exception: games that you plan to play in Eyefinity. That's why I still care about DX9 performance, because I like to play Source games at 5040x1050. So even though Source games are not typically thought of as new or stressful, I like it when people bench new cards with Left4Dead2.)

I would like to add that canned benchmarks matter less than actual playthroughs (especially in the past, when NV/ATI optimized for game benchmarks more than they do now), hence why I read AT for hardware analysis and HardOCP for performance comparisons in *games* rather than *game benchmarks*. In fact, your recommendation of testing actual performance in a few new, stressful games, is basically what HardOCP already does: http://www.hardocp.com/article/2010/10/21/amd_radeon_hd_6870_6850_video_card_review/ They just do it in a way that is a bit weird to some people, in that they try to push details and resolutions up to the limit of each card, rather than have apples-to-apples comparisons at specific resolution/setting combinations. They do have a few apples-to-apples charts up in each review, though, to appease people who want to see those kinds of things.

Something I would like to see is HardOCP-style reviews that compare oc vs. oc, but due to the variable nature of oc I understand why those kinds of reviews are rarer. I mean, you might luck out and get a GTX460 that hits 880MHz@stock volts, or you could get a card that hits a wall at 825MHz. (GURU3d tested six cards and that was the approximate range.)

On the other hand, with enough data points we can get a sense of what a likely range of max OCs at stock volts is, like how most stock-clocked and moderately-OC'd GTX460s max out at about 840MHz with good cooling (see the Guru3d review), or how most stock-clocked Radeon 6850s max out at ~900MHz with good cooling (I made a meta-analysis thread about this). So I'd like to see more reviews that pit, say, a GTX460 1GB @ 840MHz @ stock volts vs. an HD6850 @ 900MHz @ stock volts, because those are "typical" maximum OCs without overvolting. The reviews should use actual gameplay, not canned benchmarks or synthetic tests like Vantage.

I'd also like to see similar studies with overvolting, but all we have so far are reviews like HWC and TR where they maxed out volts on specially-binned GTX460s but did not max out volts on 6850s.

Edited to add, for those who have no idea why HardOCP does things the way it does: http://enthusiast.hardocp.com/article/2008/02/11/benchmarking_benchmarks
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
RS, I am largely in agreement with you in that other thread where you talked about how the most stressful new games were the ones that really mattered, since older games would get maxed out more easily anyway.

I agree. That's why it's trivial to compare "wins" in Call of Duty 4: MW2, Far Cry 2, Starcraft 2, Dirt 2, Hawx 1 & 2, Mass Effect 2, and Wolfenstein. For older games, they should dedicate one page and only test them at 2560x1600 resolution or higher (multi-monitor setups). Look at Medal of Honor, which just came out last month: a GTX275 is running it at 80fps at 1920x1200... Even with 4AA, you'll still have 60fps+ on something like an HD6850. Not exactly cutting-edge graphics.

With consoles slowing down the progress of DX11 and graphics in general, that would leave only 5-6 modern games in reviews. :oops: I guess HardOCP is loving this because they don't need to do extensive reviews.

I would like to add that canned benchmarks matter less than actual playthroughs (especially in the past, when NV/ATI optimized for game benchmarks more than they do now), hence why I read AT for hardware analysis and HardOCP for performance comparisons in *games* rather than *game benchmarks*. In fact, your recommendation of testing actual performance in a few new, stressful games, is basically what HardOCP already does:

Agreed, unless the canned benchmark closely resembles actual real-world performance. HardOCP does have cool reviews. Just look back at the time when Doom 3, F.E.A.R., Far Cry 1, and Crysis came out. We haven't had anything that really begged for an upgrade besides Metro 2033, I would say. The HD6850 @ 900MHz and the HD6870 are so fast for today's games, I can't see much demand for an HD6970 at 1920x1080 resolutions. These cycles always happen where software is way behind, and then bam, 3-4 next-generation titles and your card is a paperweight. :p
 
Last edited:

happy medium

Lifer
Jun 8, 2003
14,387
480
126
RS, I am largely in agreement with you in that other thread where you talked about how the most stressful new games were the ones that really mattered, since older games would get maxed out more easily anyway. (Exception: games that you plan to play in Eyefinity. That's why I still care about DX9 performance, because I like to play Source games at 5040x1050. So even though Source games are not typically thought of as new or stressful, I like it when people bench new cards with Left4Dead2.)

I would like to add that canned benchmarks matter less than actual playthroughs (especially in the past, when NV/ATI optimized for game benchmarks more than they do now), hence why I read AT for hardware analysis and HardOCP for performance comparisons in *games* rather than *game benchmarks*. In fact, your recommendation of testing actual performance in a few new, stressful games, is basically what HardOCP already does: http://www.hardocp.com/article/2010/10/21/amd_radeon_hd_6870_6850_video_card_review/ They just do it in a way that is a bit weird to some people, in that they try to push details and resolutions up to the limit of each card, rather than have apples-to-apples comparisons at specific resolution/setting combinations. They do have a few apples-to-apples charts up in each review, though, to appease people who want to see those kinds of things.

Something I would like to see is HardOCP-style reviews that compare oc vs. oc, but due to the variable nature of oc I understand why those kinds of reviews are rarer. I mean, you might luck out and get a GTX460 that hits 880MHz@stock volts, or you could get a card that hits a wall at 825MHz. (GURU3d tested six cards and that was the approximate range.)

On the other hand, with enough data points we can get a sense of what a likely range of max OCs at stock volts is, like how most stock-clocked and moderately-OC'd GTX460s max out at about 840MHz with good cooling (see the Guru3d review), or how most stock-clocked Radeon 6850s max out at ~900MHz with good cooling (I made a meta-analysis thread about this). So I'd like to see more reviews that pit, say, a GTX460 1GB @ 840MHz @ stock volts vs. an HD6850 @ 900MHz @ stock volts, because those are "typical" maximum OCs without overvolting. The reviews should use actual gameplay, not canned benchmarks or synthetic tests like Vantage.

I'd also like to see similar studies with overvolting, but all we have so far are reviews like HWC and TR where they maxed out volts on specially-binned GTX460s but did not max out volts on 6850s.

Edited to add, for those who have no idea why HardOCP does things the way it does: http://enthusiast.hardocp.com/article/2008/02/11/benchmarking_benchmarks

In short, don't believe canned benches and use this set at HardOCP?
Here, I even give you a link?

I found the answer to my question in post #10. Jeeeez :(
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
I can't see much demand for an HD6970 at 1920x1080 resolutions. I guess it gives more time to save up :p

3D

Surround/Eyefinity

Both will soak up those spare frames per second. Surround/Eyefinity may also be useful for productivity tasks.

Btw, am I smoking crack or does Crytek seem to have "solved" the 3D performance problem with their nifty software workaround? Instead of halving framerates by alternating left/right eye frames and forcing you to get a 120Hz monitor, they render the image ONCE, and then IN SOFTWARE they slightly shift the image information. Then it gets combined into one output image, for which you can use stereoscopic 3D glasses.

Source: http://www.pcgameshardware.com/aid,...perfomance-drop-and-8-core-optimization/News/
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
In short, don't believe canned benches and use this set at HardOCP?
Here, I even give you a link?

I found the answer to my question in post #10. Jeeeez :(

I was responding to RS's post which I think may be the answer to your questions: benchmark results can apparently differ from site to site, though usually there is some sort of trend. I suggested that perhaps the only real way to get to the bottom of it is to do what HardOCP does. Please don't be so rude. Thanks. If I were as rude as you, I'd be pestering you all the time about whether you owned NVDA stock or something. Okay? So please stop it with your insinuations.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I was responding to RS's post which I think may be the answer to your questions: benchmark results can apparently differ from site to site, though usually there is some sort of trend.

One of the reasons HardwareCanucks is one of my favourite review websites right now is because they aren't cool with "canned", "rolling demos" benches. They even did a short article on that. Based on the fact that each review website will differ in their methodologies, it makes it extremely hard to draw conclusions on which review is the most accurate. Personally, I read many reviews and just try to get an idea of a general trend (I am not saying this is the right method, just what I like to do). I mean if we start comparing exact numbers across reviews, it's just going to leave us frustrated (45 fps on Website #1, 32 fps on Website #2, 38 fps on Website #3, 2 of them are using Fraps, 1 is using a pre-built bench, etc. too many variables).

It's way too easy to fall back on "selective tactics" - find at least one website with contradictory results and present it as evidence of a general trend. Oh look, I just found a GTX470 beating an HD5870 in Crysis Warhead, therefore the 470 is faster than the HD5870 in Crysis... However, if we stack up 10 reviews, we'll probably find 9 of them showing the HD5870 to be faster than the 470 in Crysis. Therefore, it's fair to say the HD5870 is 'generally' faster in Crysis.
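The "stack up 10 reviews" approach is basically a majority tally across sites. A tiny sketch with made-up verdicts (NOT real benchmark data), just to illustrate the idea:

```python
# Tally which card each (hypothetical) review shows ahead in a game,
# then call the game for whichever card wins the majority of sites.
from collections import Counter

# Placeholder per-site verdicts -- invented for illustration only.
verdicts = ["HD5870"] * 9 + ["GTX470"]

tally = Counter(verdicts)
winner, wins = tally.most_common(1)[0]
print(f"{winner} is 'generally' faster ({wins} of {len(verdicts)} reviews)")
# → HD5870 is 'generally' faster (9 of 10 reviews)
```

The point is that one contradictory site gets outvoted by the trend instead of being cherry-picked as the conclusion.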
 
Last edited: