Official GTX560 Review Thread (updated with 17 reviews at this time)


SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
Does anyone have an explanation for the ridiculous performance difference between AMD and nV in Civ 5?

NV released new drivers that now appear to remove the CPU bottleneck for the 500-series cards. Ryan Smith talks about it in the AnandTech review; he says SLI performance has been fixed too!
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
The graph notty22 posted shows the 6950 and overclocked 560 equal. The default 6950 is faster than the default 560.

If you want to bring in things that are not guaranteed for every card, then the 6950 unlocks and "slaughters" the default 560.

Yeah, but it uses more power and is noisier compared to the 560...LOL
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
The difference in overclocking potential is what I see. Hardware Canucks was able to get an extra 169MHz out of a GTX 560 but only 55MHz out of the 6950 1GB at stock voltages. The 560 also seems to run cooler, so when volts are added the 6950 is going to get past the comfort zone quicker.
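To put those headroom numbers in perspective, a percentage view helps. A quick sketch, assuming stock clocks of roughly 822MHz for the GTX 560 Ti and 800MHz for the 6950 1GB (the stock clocks are my assumption, not from the Hardware Canucks piece):

```python
# Back-of-envelope OC headroom at stock voltage, as a percentage.
# Assumed stock clocks: ~822 MHz (GTX 560 Ti), ~800 MHz (HD 6950 1GB).

def headroom_pct(stock_mhz: float, extra_mhz: float) -> float:
    """Overclock headroom as a percentage of the stock clock."""
    return 100.0 * extra_mhz / stock_mhz

print(f"GTX 560 Ti:  +{headroom_pct(822, 169):.1f}%")  # ~ +20.6%
print(f"HD 6950 1GB: +{headroom_pct(800, 55):.1f}%")   # ~ +6.9%
```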


[Image: GTX-560-98.jpg]

Don't forget it's quieter...
 

Reckoner

Lifer
Jun 11, 2004
10,851
1
81
Ended up ordering the Gigabyte SOC off TigerDirect. With the $10 Visa promo, I ended up paying $272 shipped.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Ended up ordering the Gigabyte SOC off TigerDirect. With the $10 Visa promo, I ended up paying $272 shipped.

You, my friend, just got the best price/performance video card deal on the internet.
Let us know how it overclocks.

Any reason why you didn't choose the 6950 1GB for $268 shipped? It is cheaper.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
You forgot to put "when both are overclocked".

We don't know that. We don't know how much those 1050MHz overclocks increase power consumption for both cards. Seeing as the 6950 is faster when both are overclocked, a bit more power use is OK.

I have to admit the 560 is a quiet card; we just need to see those custom 6950s benched (the Sapphire, XFX, and HIS cards we saw).
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
What? The 6950 uses less power than the 560.

http://www.anandtech.com/show/4135/nvidias-geforce-gtx-560-ti-upsetting-the-250-market/16
"6950 1GB rig is drawing 44W less"

And it was only 3dB louder. You make it sound like it was some 20dB difference.

Xbit measures power at the card, instead of the entire system. (Should be the only way to measure, but whatever).

http://www.xbitlabs.com/articles/video/display/geforce-gtx-560-ti_4.html#sect0

Seems the 6950 does indeed draw more power.
 

Reckoner

Lifer
Jun 11, 2004
10,851
1
81
You, my friend, just got the best price/performance video card deal on the internet.
Let us know how it overclocks.

Any reason why you didn't choose the 6950 1GB for $268 shipped? It is cheaper.

I like NVIDIA's driver support better than AMD's. Can't go wrong with either one, but I have been an NVIDIA fan since my Ti 4600 (loved that card).
 

Castiel

Golden Member
Dec 31, 2010
1,772
1
0
Ended up ordering the Gigabyte SOC off TigerDirect. With the $10 Visa promo, I ended up paying $272 shipped.

Why the SOC over the DirectCU? They'll both overclock about the same, and Asus has a longer warranty.
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
Forget OC for a moment; take two cards at default factory settings, like the GTX 560 Ti SOC and a GTX 570 (non-OC model). You have almost the same performance with less power usage, less heat, and a lower price.

For people that don't OC, the GTX 560 Ti SOC is better and cheaper.
The situation changes if we OC both cards, and everyone knows that the GTX 570 will be the winner.
Better performance than a stock 560, sure, and that makes it a good value. But it's not always better than the 570. And if we're talking about the Guru3D benchmarks, I have my doubts; I don't think they are using the same driver for the 570 and 560, since the 570 gets the same framerates in the 560 review as it does in their 6900 review, and they used 263.09 in that review.

Oh look, I think I was right. This TechReport review, with the Gigabyte 560 OC, never shows it faster than the GTX 570, contrary to what the Guru3D review showed, and they are using the latest drivers:
http://techreport.com/articles.x/20293/1

Plus, as you can see, it actually shows the Gigabyte card using more power. To get that speed you simply have to use more power; you're not going to get that performance for free, since both GF110 and GF114 are refined for performance and power. Plus they use a better and more comparable method for measuring power than Guru3D does.

Simply put, concerning Nvidia chips, if you want GTX 570 performance you are going to consume 570-like power. Overclocking a 560 is going to raise the power consumption, and the TechReport results show power consumption correlates with performance: the Gigabyte card uses about the same power and delivers about the same performance. Technically the 570 is offering better performance/watt in their review, but just by a hair; I'd basically call it even. As far as performance goes, the 560 OC does look good if you're only concerned about out-of-the-box performance. But the 570 is simply faster. It has a lot more shader units, so overclocking them has a bigger impact than on chips with fewer units (and thus it doesn't need to reach such high speeds), it has more memory, and it has more memory bandwidth.
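To make the "by a hair" comparison concrete, here is a trivial performance-per-watt sketch; the framerate and wattage figures below are invented placeholders, not TechReport's actual numbers:

```python
# Illustrative perf/W comparison. All numbers below are made-up
# placeholders, not taken from the TechReport review.

def perf_per_watt(avg_fps: float, watts: float) -> float:
    """Average framerate delivered per watt drawn."""
    return avg_fps / watts

print(f"GTX 570:    {perf_per_watt(60.0, 220.0):.3f} fps/W")  # hypothetical
print(f"GTX 560 OC: {perf_per_watt(58.0, 215.0):.3f} fps/W")  # hypothetical
```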

Why should we forget about it? The 570 does not OC like the 560, so I'd say it's very important to think about, since it's not even close to a 900MHz OC.
http://www.techpowerup.com/reviews/P...atinum/31.html
http://www.techpowerup.com/reviews/A...TX_570/30.html

If 560s regularly get 1000MHz, it's a faster card with both OCed, for $100 cheaper.

Even if both are not OCed, the 570 is an awful value compared to the 560/6850.
The 570 doesn't need to overclock the same in terms of GHz because it has more shader units (a lot more), more memory bandwidth, and more memory.

Besides, he wasn't saying we need to forget about it. He was posing a hypothetical to illustrate a point: don't consider home-brew overclocking, but instead consider what you pay and what you are guaranteed to get out of the box.

And faster cards always command a premium, especially the ones at the very top like the 580 and 570. This is nothing new; we saw the same thing with the GTX 285 and 275.
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126

Odd, they show the 6950 using +3 watts while AnandTech has it using 25 less.

Does that mean the nvidia drivers make the i7 CPU work harder than AMD's drivers, causing a system with a 560 to draw an extra 28 watts of power?

happy medium

Lifer
Jun 8, 2003
14,387
480
126
We don't know that. We don't know how much those 1050mhz overclocks increase power consumption for both cards. Seeing as the 6950 is faster when both are overclocked a bit more power use is ok.

You mean you don't know that. :)
Castiel and I do.
A maximum overclocked, overvolted 6950 will draw 300 watts. Why? Because to get a maximum overclock on AMD cards, you need a lot of extra voltage.
Do you think a GTX 560 @ 1050MHz core will do that? It draws 195 watts @ 1000MHz core, with no voltage tweak. So 50MHz more will add 105 watts? I don't think so.
Amazing, huh?
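As a sanity check on that arithmetic: to a first approximation, dynamic power scales linearly with clock and quadratically with voltage (P ≈ k·f·V²). A minimal sketch, taking the 195W @ 1000MHz figure at face value and assuming the voltage stays put (the model and numbers are illustrative, not measured):

```python
# First-order CMOS dynamic-power model: P ~ k * f * V^2.
# Rough sketch only; real cards also have static/leakage power,
# so this overstates how sensitive total draw is to clock alone.

def scaled_power(p0_watts: float, f0_mhz: float, f1_mhz: float,
                 v0: float = 1.0, v1: float = 1.0) -> float:
    """Scale power from (f0, v0) to (f1, v1), assuming P is
    proportional to frequency times voltage squared."""
    return p0_watts * (f1_mhz / f0_mhz) * (v1 / v0) ** 2

# 195 W at 1000 MHz, same voltage, pushed to 1050 MHz:
print(f"{scaled_power(195, 1000, 1050):.0f} W")  # ~205 W, nowhere near 300 W
```

So 50MHz more at constant voltage adds roughly 10W; getting from 195W to 300W would take a substantial voltage bump, which is the point about overvolting.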
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
Xbit measures power at the card, instead of the entire system. (Should be the only way to measure, but whatever).

http://www.xbitlabs.com/articles/vid...i_4.html#sect0

Seems the 6950 does indeed draw more power.

Two failures here:

1. Xbit is using a synthetic test, OCCT, and that is not indicative of the real world because Nvidia artificially limits their cards under OCCT:

http://www.anandtech.com/show/4008/nvidias-geforce-gtx-580/3
As a result having this protection in place more or less makes it impossible to toast a video card or any other parts of a computer with these programs. Meanwhile at a PR level, we believe that NVIDIA is tired of seeing hardware review sites publish numbers showcasing GeForce products drawing exorbitant amounts of power even though these numbers represent non-real-world scenarios. By throttling FurMark and OCCT like this, we shouldn’t be able to get their cards to pull so much power. We still believe that tools like FurMark and OCCT ...

The vast majority of websites, utilizing a consistent and clear power consumption testing methodology, show the 6950 using less power than the 560.

2. Measuring power consumption at the card level is a good method, but the other method (employed by AnandTech and several other sites) is just as valid, so long as they test the cards under the same conditions (which AnandTech and most, but not all, sites do). It's valid simply because total system power consumption does matter in the long run, since a video card has to run in a system and can't run by itself. How a video card and its drivers interact with the rest of the system can also impact power consumption and performance. Since AMD and Nvidia most likely take slightly different approaches to how their drivers utilize the CPU in conjunction with their GPUs, measuring the power consumption of the entire system can bring such behavior to light, especially if you were to compare card-only results to system-wide results.
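As a toy illustration of why comparing card-only numbers against system-wide numbers can be revealing (every figure below is invented for the example, not from any review):

```python
# If two cards draw the same power at the card but different power at
# the wall, the difference lives in the rest of the platform (CPU load
# from the driver, chipset, PSU losses). Invented numbers throughout.

def platform_overhead(system_watts: float, card_watts: float) -> float:
    """Everything the system draws beyond the card itself."""
    return system_watts - card_watts

card_a = platform_overhead(system_watts=320.0, card_watts=150.0)  # 170 W
card_b = platform_overhead(system_watts=345.0, card_watts=150.0)  # 195 W

# A 25 W gap at the wall despite equal card-level draw would suggest
# one driver keeps the CPU busier than the other.
print(f"Card A platform overhead: {card_a:.0f} W")
print(f"Card B platform overhead: {card_b:.0f} W")
```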


Odd, they show the 6950 using +3 watts while AnandTech has it using 25 less.

Does that mean the nvidia drivers make the i7 CPU work harder than AMD's drivers, causing a system with a 560 to draw an extra 28 watts of power?

No, they are using OCCT, and Nvidia (and AMD) limit power usage under FurMark and OCCT. Well, I don't know if AMD does it under OCCT, but they do it under FurMark. If you look at reviews that use real games for testing power, they will show the 6950 being lower.

Do you think a GTX 560 @ 1050MHz core will do that? It draws 195 watts @ 1000MHz core, with no voltage tweak. So 50MHz more will add 105 watts? I don't think so.
Amazing, huh?

LOL, are you still taking the 195 watts that Guru3D got as gospel? Their power consumption testing methodology is a joke compared to other sites'.

And even then, the Gigabyte cards are probably using "cherry" cores, so their power consumption numbers may not be representative of all GTX 560s at those clock speeds.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
The power consumption, temps, and noise are mostly acceptable for all these cards until you overclock. This is where the GTX 560 pulls ahead.

I think we can all accept that?
 

Castiel

Golden Member
Dec 31, 2010
1,772
1
0
I'm probably the only one who doesn't care how much power overclocked cards consume. Just give me the best bang for the buck.

As for the 560s... I want them, but they're a sidegrade. Think I might just wait till the Asus GTX 580 DirectCU II gets released.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
I'm probably the only one who doesn't care how much power overclocked cards consume. Just give me the best bang for the buck.

As for the 560s... I want them, but they're a sidegrade. Think I might just wait till the Asus GTX 580 DirectCU II gets released.

I agree, unless they are way out of whack.
 

4ghz

Member
Sep 11, 2010
165
1
81
I would love to get the SOC, but the 1-year warranty really makes me nervous.

I'm pretty sure that's a typo. All Gigabyte cards should carry a 3-year warranty from the manufacture date, which can be found in the serial number.
 

Nged72

Member
Jan 25, 2011
131
0
71
I still don't know what to get, seeing as I'm using an 8800 GTS OC from a loooong time ago.

All the cards look good to me, but since I have never really overclocked a GPU, I don't know. I have heard mixed things about the old MSI 460 Frozr as well as the Gigabyte SOC and its MOSFETs.

Now the Asus is cheaper and apparently overclocks just as much, and it's a newer design.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
You mean you don't know that. :)
Castiel and I do.
A maximum overclocked, overvolted 6950 will draw 300 watts. Why? Because to get a maximum overclock on AMD cards, you need a lot of extra voltage.
Do you think a GTX 560 @ 1050MHz core will do that? It draws 195 watts @ 1000MHz core, with no voltage tweak. So 50MHz more will add 105 watts? I don't think so.
Amazing, huh?

No, that's for the specially binned Gigabyte card, not the average 560.

300W for a "maximum overclocked, overvolted 6950"? What exactly is that, a 1.5V 1500MHz overclock?

Seems a bit odd that you would compare an overvolted card with a stock-volted card.

FYI, a 6950 will do over 900MHz with stock volts, IIRC.

http://www.overclockersclub.com/reviews/nvidia_asus_gtx560ti/16.htm

http://www.overclockersclub.com/reviews/amd_hd6970_hd6950_review/6.htm
 