Finally, a GTX 460 @ 900MHz core review. It beats both a GTX 470 and a 5870!


GaiaHunter

Diamond Member
Jul 13, 2008
3,700
406
126
There we see a 211 watt spike for the OC model over the base model. People don't make a big deal about it because those who buy performance parts and then overclock them heavily aren't looking for the ideal solutions in terms of power efficiency. Go into the CPU forum and read the threads dealing with the largest overclocks, how many of the people are investigating their performance per watt? If this were a thread about the general properties of a specific architecture it would be one thing, lamenting the power draw of a heavily overclocked part? Really?

The point is interesting because people say the GTX 460 consumes much less power than a GTX 470 and, in exactly the same sentence, say it can be OC'd to GTX 470 performance, while bashing the GTX 470 for consuming too much power.

See the point?

If you get the same performance you are also consuming the same power.

So it is either "with the GTX 460 you get a card that consumes less power, is cheaper, and is a bit slower" or "you get a card that OCs great, so you are saving money".

Although, I don't even see a lament, only a shocked expression. I guess toyota was under the impression the GF104 was much more power efficient (and that is because of what I said at the beginning of my post).
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,700
406
126
Still, NV could have clocked those GTX 460s much faster from the factory. Now imagine 2x GTX 460s @ 850MHz for $400 vs. a single 5870 for $350. Fermi failed (sarcasm).

You could say it failed: if AMD can sell half the die size for around the same performance and price, they are doing something well. :D (PS: This post is not to be taken too seriously.)

EDIT: And I see you repost that link I mentioned.
EDIT2: Actually, isn't that the one that shows the GTX 470 and 5870?
 
Last edited:

Leyawiin

Diamond Member
Nov 11, 2008
3,204
52
91
It is a little surprising to see the hefty power consumption increase with a heavily OC'd GTX 460. I read a lot of reviews before purchasing, and what sold me is that I can either take the increased performance, power consumption, and heat generated by a big GTX 460 overclock, or I can leave it at stock for games that don't need it flat out and avoid the two "cons". It's still a capable midrange card at 675MHz. Or I can do a more flexible middle-ground OC like what I have at the moment. It's my choice, and I like having the three options for a little over $200.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Sorry, some parts below are a bit off topic.

How many of us have recommended getting a 5850 and OCing it to the moon rather than buying a 5870? I sure did. I can therefore understand the frustration when every thread turns into "Card A is better because it consumes less power and makes less noise." (I am guilty of that myself too btw).

We are also proud of overclocking since we are able to take $150-250 CPUs and clock them to match the performance of $500+ CPUs. Why not the same with videocards? Sometimes I need to snap back, because times are changing and people are starting to care more and more about factors such as power consumption. So now we are faced with two conflicting objectives: maximizing performance while minimizing power consumption.

Back in the day I remember the fuss about the inflated 480+W power supply requirements for 6800GT/Ultra cards (I quote: "The GeForce 6800 Ultra is power-hungry (haha!) and is capable of consuming up to 110W under a peak load. The GeForce FX 5950 Ultra topped out at 80W."). Many people went bonkers on this forum because they thought they had to upgrade their 350W power supplies :) Now we consider 500-700W power supplies "mainstream" and a GTS 450 at 106W TDP child's play. Constantly increasing power consumption of components is eventually unsustainable, so once in a while both companies should reassess the performance per watt of their parts.

Let's say this again together: the 5950 Ultra, top of the line, was only 80W at load; the GTX 480 is 250W today, and the 5870 is not much better at 188W. And in 10 years should we expect 300-400W? Seems like it.

A little interesting tid-bit from Anandtech's review of X800XL vs. 6800GT:

"The X800 XL does consume less power and thus, will run cooler than the 6800GT, which is a plus for ATI. Given that the 6800GT is already a single slot solution, the power/heat advantage isn't one that is entirely noticeable considering that the X800 XL cannot be run fanless."

6 years ago, Anand basically dismissed the 10W power difference between the two cards as immaterial. Today, it's not unusual to see bickering in the forums over 10-20W of idle power consumption between HD5000 and Fermi (for example), or 30W at load. Of course, we know the 6800GT sold extremely well despite the power consumption disadvantage and the $100 higher price tag, simply because it was faster than the X800XL. The same argument isn't made for the GTX 470 over the 5850 today, is it? No, because the power consumption difference is more like 90W.

Still, we have even seen arguments that a $100 more expensive 5870 is worth it over, say, a GTX 470, often on the basis that it consumes less power and runs cooler. Wait a second: when did we last see a cooler-running, $100 more expensive videocard with lower idle power consumption lose a lot of market share to a hotter-running card with similar performance that cost $100 less at launch? The 4870 at $299 was that card, positioned right against a $399 GTX 260. Fast forward, and hardly anyone is recommending a 470 over a 5870 or thinks the 5870 should get a $100 price cut. So what happened??? Has price/performance become secondary to enthusiasts? Well, 5870s have been selling for $350-370 for 12 months (and still are)... while NV had to quickly cut the GTX 260 to $299...

What about when the X1900XTX replaced the X1800XT? An increase of almost 60W at load, about the difference between a GTX 470 and GTX 460 today. I bet those who remember the X1800XT would say it was a disappointment and the X1900XTX was far superior. What happened to the 60W of extra power? It didn't matter, because the performance gains were there in spades. Now if a faster-performing card consumes 40-60W more power, it's basically dismissed as a hot, inefficient dud. (Not trying to start an argument about which card is better, only looking at power consumption in relative terms.)

I guess the environmentalists are doing a great job because now we have to deal with (1) price (2) noise (3) power consumption (4) heat and (5) performance when recommending videocards, with #5 dead last?

Just my 2 cents.
 
Last edited:

GaiaHunter

Diamond Member
Jul 13, 2008
3,700
406
126
How many of us recommended getting a 5850 and OCing it to the moon rather than buying a 5870? I sure did. I can therefore understand the frustration when every thread turns into "Card A is better because it consumes less power and makes less noise." (I am guilty of that myself too btw). We are proud of overclocking in the first place because we are able to get $150-250 CPU and clock them to match $500+ CPUs. Why not the same with videocards? Sometimes I need to snap back because the times are changing and people are starting to care more and more about factors such as power consumption. I guess the environmentalists are doing a great job!!

People also bash AMD processors for requiring more power than Intel processors to do the same amount of work (not talking about the cases where AMD simply can't reach Intel performance) and the difference is much smaller.

And people did talk about the 4800 series being less power efficient even though the differences were minimal; it was exacerbated by the temperatures the AMD cards ran at stock (AMD chose less noise over better heat dissipation to the surroundings).

Generally, though, the differences in power consumption haven't been as high as this generation, particularly in the case of the GTX 480 (the GTX 470 is much more tame). Read: people can be picky because they can choose between similar performance at very different power consumption levels.

Of course, once the price of the power-hungry cards drops, people become more receptive to trading power consumption/higher room temperatures/noise for cheaper performance.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
If you could point these many people out, it would be greatly appreciated. I've never met anyone who bought a graphics card in this segment with power consumption as their biggest concern. If power consumption is your biggest concern in your video card choice, use an i530; it really is that simple. Nothing nVidia or ATi releases in the form of a dedicated video card is going to come close in performance per watt. Why not troll some Corvette and Porsche forums and crap in threads with fuel economy ratings? It would be the same thing :)

It's probably right that GTX 460 buyers aren't really concerned with efficiency. If they want an efficient card that performs at ~GTX 470 levels, they'll buy an ATI card in the first place. There are people out there, though, who believe a card that uses ~100W less for the same performance is worth paying more for. I assume you are just being dramatic throwing out integrated graphics as an alternative. Unless you actually believe it can do the job? :rolleyes:
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
BTW guys, the reason Happy medium didn't post the added power consumption as a negative when overclocking was because the Pros/Cons were provided by Xbitlabs, not by Happy medium. So if anything, the author of the original article should have noted it. Don't shoot the messenger!

I just read the whole thread and feel like I've been shot to death. :)

Really, while the power draw is much greater, personally I wouldn't give a crap.
I'm sure most cards are the same way, as Russian has already pointed out.
It still runs cool and quiet according to the review.

I must say, I love the enthusiasm in this thread. :)
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
WTF, everyone went haywire when power consumption was mentioned. He stated a FACT. If the point he made doesn't concern you, then his post isn't intended for you, as some people actually care about power consumption (not everyone lives in a place where electricity is cheap). Some of you are even saying that because it is not included in the OP it shouldn't even be discussed, which is.........

Personally I think the noted passion for discussion that we observe in this thread is healthy and indicative of a lively and diversified forum community.

In regards to the OP, a 900MHz GTX460 and power-consumption, I noticed a similar thing with my recent OC'ing adventures with the GTX460 MSI Cyclone.

In OCCT stability testing, my APC UPS was reporting a power load of about 200W just for my OC'ed card at 1V and a mere 815MHz.

(The system idles at ~135W and full load is ~380W... my PSU is ~80% efficient, so of the extra 245W drawn at the wall under load, I figure about 200W goes to supporting the vid-card.)

My temps were much higher than those in the review though, I see 83C with this cyclone card. I think the shroud design on the Palit in the review is a much better way to force airflow over the PCB.
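The wall-power arithmetic in that parenthetical can be sketched as follows; the 80% PSU efficiency is the post's own assumption, and the split is only an estimate since a UPS reading covers the whole system:

```python
# Estimate the extra DC power delivered to the GPU from UPS wall readings.
# Figures are the ones reported above; 80% PSU efficiency is assumed.
IDLE_WALL_W = 135       # wall draw at idle (W)
LOAD_WALL_W = 380       # wall draw during OCCT load (W)
PSU_EFFICIENCY = 0.80   # assumed PSU efficiency at this load point

extra_wall_w = LOAD_WALL_W - IDLE_WALL_W    # 245 W more at the wall
extra_dc_w = extra_wall_w * PSU_EFFICIENCY  # ~196 W delivered inside the case
print(f"Extra at the wall: {extra_wall_w} W, ~{extra_dc_w:.0f} W to the card")
```

That rounds to the ~200W quoted above, though some of that 245W also goes to the CPU under a stability test.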
 

Shmee

Memory & Storage, Graphics Cards Mod Elite Member
Super Moderator
Sep 13, 2008
8,216
3,130
146
Interesting results. I just wish my 5870s were easier to OC past 900. Does anyone know of any modified drivers or other programs that will do it?

I am kinda regretting paying over $800 for these cards :C Could have gotten two 470s at $250 each and overclocked those very nicely.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
interesting results. I just wish my 5870's were easier to OC past 900. Does anyone know of any modified drivers or other programs that will do it?

I am kinda regretting paying over 800 for these cards :C Could have gotten 2 470's at $250 each and overclocked those very nicely.

If you have a card that can overvolt, try MSI Afterburner. Heck, you can use that program even if your card can't overvolt; just push the OC past what Catalyst can give you.

http://event.msi.com/vga/afterburner/
 

Shmee

Memory & Storage, Graphics Cards Mod Elite Member
Super Moderator
Sep 13, 2008
8,216
3,130
146
The latest one has the CCC limits for me. It even mentions it under the changes. Any way to unlock it?
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
the latest one has the CCC limits for me. Even mentions it under the changes. Any way to unlock it?

Yes. Go to the Afterburner directory, find Afterburner.cfg, and change the unofficial overclocking variable to "EnableUnofficialOverclocking = 1". You may also be able to unlock voltage in Afterburner itself, depending on your card.
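If you'd rather script the edit than do it by hand, a minimal sketch of flipping that flag is below. The config path and the key's line format are assumptions based on the description above, so back the file up first:

```python
# Sketch: set EnableUnofficialOverclocking = 1 in an Afterburner-style cfg.
# The path and key layout are assumptions; back up the file before editing.
from pathlib import Path

def enable_unofficial_oc(cfg_path):
    cfg = Path(cfg_path)
    lines = cfg.read_text().splitlines()
    for i, line in enumerate(lines):
        # Replace the existing setting line if present...
        if line.strip().startswith("EnableUnofficialOverclocking"):
            lines[i] = "EnableUnofficialOverclocking = 1"
            break
    else:
        # ...otherwise append it at the end.
        lines.append("EnableUnofficialOverclocking = 1")
    cfg.write_text("\n".join(lines) + "\n")
    return lines

# Hypothetical usage (path depends on where Afterburner is installed):
# enable_unofficial_oc(r"C:\Program Files\MSI Afterburner\Afterburner.cfg")
```

As noted above, the file may need its permissions relaxed before it can be written.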
 

Shmee

Memory & Storage, Graphics Cards Mod Elite Member
Super Moderator
Sep 13, 2008
8,216
3,130
146
Nice, thanks, that worked. I had to edit permissions on the file to get it to work, though :C

UPDATE: the 2nd card, a VisionTek non-reference model, does not work with Afterburner :C

A small bump in core MHz causes a BSOD.
 
Last edited:

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
nice, thanks, that worked. Had to edit permissions on the file though to get it to work :C

UPDATE: the 2nd card, a visiontek non reference model, does not work with afterburner :C

a small bump in core mhz causes BSOD.

You're welcome. Sorry to hear about the non-reference version; this is why I usually try to buy reference cards, as it eliminates incompatibilities like this. Perhaps you could get better compatibility from the AMD GPU Clock Tool, though that's not supposed to be publicly available. http://downloads.guru3d.com/AMD-GPU-Clock-Tool-v0.9.26.0-For-HD-5870-download-2383.html

If that still doesn't work, I think it's the GPU and not the software. Good luck! :thumbsup:
 

Shmee

Memory & Storage, Graphics Cards Mod Elite Member
Super Moderator
Sep 13, 2008
8,216
3,130
146
No worries, I may trade the VisionTek card for a reference card that my friend has from XFX. He runs his at 900/1200, so it won't be an issue for him.

Oddly enough, I thought I was getting a reference card, but they shipped me a different one. Got it from CyberPowerPC, but w/e.
 
Feb 19, 2009
10,457
10
76
The GTX 460 is awesome. It's getting cheaper by the day too.

When you OC, power use goes up, but it's not that important. As long as it's not too noisy, I'd leave it at OC speeds and enjoy the performance gains. It's a very good card and the only thing keeping NV competitive these past months. It'll be very competitive with Barts if 1) you don't mind OCing and 2) NV lowers the price some more. The Barts samples we've got seem to OC ~30%.

If you OC the 5870, you need to bump the vcore a bit. In Afterburner, in the options, enable vcore monitoring/adjustment; otherwise it's off by default.
 

zebrax2

Senior member
Nov 18, 2007
977
69
91
Personally I think the noted passion for discussion that we observe in this thread is healthy and indicative of a lively and diversified forum community.

In regards to the OP, a 900MHz GTX460 and power-consumption, I noticed a similar thing with my recent OC'ing adventures with the GTX460 MSI Cyclone.

In OCCT stability testing my APC UPS was reporting an power-load that was about 200W just for my OC'ed card at 1V and a mere 815MHz.

(idles at ~135W and full-load is ~380W...my PSU is 80%, so I figure of the extra 245W used by the PSU at load about 200W is to support the vid-card)

My temps were much higher than those in the review though, I see 83C with this cyclone card. I think the shroud design on the Palit in the review is a much better way to force airflow over the PCB.

While I do agree that the passion for discussion we observe in this thread is healthy, some people are rather quick to dismiss the problem. In my country an additional 100W would cost me ~$106.12 a year.

Edit:
Found a flaw in my math: I used 24hr as the daily usage. Dividing by 3 gives me around $33 a year, which is still quite high. (The PC in our house is used 12+ hours a day, but since not all of that time is spent gaming, I reduced it to 8.)
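For anyone checking the numbers, here is the same arithmetic spelled out. The per-kWh rate is inferred from the quoted $106.12/year at 24 h/day; the 8 h/day figure lands near $35, close to the ~$33 quoted:

```python
# Yearly cost of an extra 100 W of draw, using the post's figures.
# The per-kWh rate is inferred from $106.12/year at 24 h/day of usage.
EXTRA_WATTS = 100
RATE_PER_KWH = 106.12 / (EXTRA_WATTS / 1000 * 24 * 365)  # ~$0.121/kWh

def yearly_cost(watts, hours_per_day, rate=RATE_PER_KWH):
    """Cost per year of `watts` of extra draw for `hours_per_day` of use."""
    return watts / 1000 * hours_per_day * 365 * rate

print(f"24 h/day: ${yearly_cost(EXTRA_WATTS, 24):.2f}/year")  # $106.12
print(f" 8 h/day: ${yearly_cost(EXTRA_WATTS, 8):.2f}/year")   # $35.37
```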
 
Last edited:

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
While I do agree that the passion for discussion we observe in this thread is healthy, some people are rather quick to dismiss the problem. In my country an additional 100W would cost me ~$106.12 a year.

Is that gaming an hour or two per day? Or 24/7 365 days a year at full load?
 

zebrax2

Senior member
Nov 18, 2007
977
69
91
Is that gaming an hour or two per day? Or 24/7 365 days a year at full load?

Edited my post above.

I would also like to say that I keep my cards for 3+ years, so it does really add up. For example, I could buy the 460 and OC it for better performance, but in the long run it might actually be cheaper to buy a 5850 OC'd to ~5870 speed. Or I could buy a 470 instead; it would cost about the same in the long run, but I would actually have a warranty, plus the assurance that my card is made to run at that speed.

Edit
Just looked at the 470's power consumption again; it looks like I was wrong about the 470, but the 5850 scenario looks plausible.
 
Last edited:

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
While I do agree that the passion for discussion we observe in this thread is healthy, some people are rather quick to dismiss the problem. In my country an additional 100W would cost me ~$106.12 a year.

Edit:
Found a flaw in my math: I used 24hr as the daily usage. Dividing by 3 gives me around $33 a year, which is still quite high. (The PC in our house is used 12+ hours a day, but since not all of that time is spent gaming, I reduced it to 8.)

I happen to be one of those people that would be quick to dismiss the subject as much ado about nothing.

$33/year for electricity is super silly cheap entertainment in my book.

You already paid how much for the video card? And how much for the computer that houses said video card? And how much are you paying in rent/mortgage for the square footage you are allocating to said computer area? How much to light said computer area? Heat? AC? Electricity for everything related to powering up the computer, LCDs, lights, etc. so you can play your video games? And the expenditures for acquiring said video games?

Add all that up and you still want to make a big deal about a measly $3 a month extra cost that goes towards enhancing the gaming experience that cost you how much once you total up everything I listed above?

Do you analyze/scrutinize every source of power-consumption in your lifestyle? Take super-short showers so your water bill is a few pennies less, heat your food by sunlight instead of using that microwave or oven?

My electric bill is right around $200-300 a month, depending on the time of year (higher in the winter). For me, an extra $3 a month is unnoticeable.

Heck I spend $3 just buying one gallon of gas that might move me and my mini-van about 16 miles, and my daily driving is around 20 miles.

Not everyone lives the same "luxurious" lifestyle, I'll admit, but from my point of view, if your economic situation is such that you can't easily afford an extra $3 a month in electricity, then you probably have more pressing needs to spend money on than buying a computer and a GTX 460 to game with to begin with.
 

zebrax2

Senior member
Nov 18, 2007
977
69
91
I happen to be one of those people that would be quick to dismiss the subject as much ado about nothing.

$33/year for electricity is super silly cheap entertainment in my book.

You already paid how much for the video card? And how much for the computer that houses said video card? And how much are you paying rent/mortgage for the square-footage you are allocating to said computer area? How much to light said computer area? heat? AC? Electricity for everything related to powering up the computer, LCD's, lights etc so you can play your video games? And the expenditures for acquiring said video games?

Add all that up and you still want to make a big deal about a measly $3 a month extra cost that goes towards enhancing the gaming experience that cost you how much once you total up everything I listed above?

Do you analyze/scrutinize every source of power-consumption in your lifestyle? Take super-short showers so your water bill is a few pennies less, heat your food by sunlight instead of using that microwave or oven?

My monthly electric bill is right around $200-$300 month depending on the time of year (higher in the winter). For me an extra $3 a month is unnoticeable.

Heck I spend $3 just buying one gallon of gas that might move me and my mini-van about 16 miles, and my daily driving is around 20 miles.

Not everyone lives the same "luxurious" lifestyle, I'll admit, but from my point of view, if your economic situation is such that you can't easily afford an extra $3 a month in electricity, then you probably have more pressing needs to spend money on than buying a computer and a GTX 460 to game with to begin with.

Read my post above

I'm just really cheap but a saving is still a saving :p

BTW nice performance. It really is a nice card for its price. Is there anyone here who has a benchmark of a stock 460 vs OCed 460 in terms of CUDA?
 
Last edited:

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
While I do agree that the passion for discussion we observe in this thread is healthy, some people are rather quick to dismiss the problem. In my country an additional 100W would cost me ~$106.12 a year.

Edit:
Found a flaw in my math: I used 24hr as the daily usage. Dividing by 3 gives me around $33 a year, which is still quite high. (The PC in our house is used 12+ hours a day, but since not all of that time is spent gaming, I reduced it to 8.)

You game 8 hours per day, 7 days per week, 365 days per year? That is more hours than a full-time job, even including weekends and not a single holiday. If so (and it isn't), you would probably have to lay out more money in headache medicine (Excedrin, Advil, whatever) than you would pay in extra electricity. I would go ahead and cut your current electricity usage guesstimate by another factor of three; that would be 2+ hours, 7 days a week, 365 days a year. Generous. A dollar per month, if you're lucky.
 
Last edited:

zebrax2

Senior member
Nov 18, 2007
977
69
91
You game 8 hours per day, 7 days per week, 365 days per year? That is more hours than a full time job per year if you include the weekends and not a single holiday. If so, (and it isn't) you would probably have to lay out more money in headache medicine (excedrin, advil, whatever) than you would pay in extra electricity. I would go ahead and cut your current electricity usage guesstimate by another third. And that would be 2+ hours 7 days a week 365 days a year. Generous. 1 dollar per month if you're lucky.

I'm not the only one using the computer :p I have brothers and sisters who play as well.

The more I talk about it with you guys, the more I realize how minuscule the problem is, but that doesn't mean it should be ignored. I guess I owe you guys an apology, or at least a thank you, for spending a little bit of your time with me :D