Nvidia shows some details of the vapor chamber cooler on its upcoming card, plus video of Black Ops


edplayer

Platinum Member
Sep 13, 2002
2,186
0
0
I believe you that lots of us overclock our cards, but when responding to something like that, I would rather just be upfront instead of trying to twist it.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76

I'm not even going to step into the crossfire where you guys are calling each other liars or whatever.

I will comment about power use though:

Idle power draw is probably more important when calculating actual costs, for those who leave their PCs on 24/7. And on that front, the 5850 annihilates the GTX480's idle power draw: 18W vs. 54W, a 36W difference.

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/30.html

Assuming 80% PSU efficiency, that difference is actually 45W at the wall (1.25 x (54-18)). 0.045 kW x 24 hours x 365 days = 394.2 kWh of electricity. Assuming 13 cents per kWh, that's $51.25 per year. If you use it to game, the difference between load power draws may vary, but the point is that idle draw matters more when calculating 24/7 costs: there are only so many hours a day someone can game, so the vast majority of the power draw is at idle.
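
For anyone who wants to sanity-check that, here's the same back-of-the-envelope math as a quick script. The 54W/18W idle figures are from the TechPowerUp link above; the 80% PSU efficiency and 13 cents/kWh are assumptions, same as in the paragraph:

```python
# Quick sanity check of the idle-cost math above.
# 54 W / 18 W idle draws are from the TechPowerUp review linked above;
# the 80% PSU efficiency and $0.13/kWh rate are assumptions, same as in the post.
IDLE_GTX480_W = 54
IDLE_HD5850_W = 18
PSU_EFFICIENCY = 0.80
PRICE_PER_KWH = 0.13

delta_at_wall_w = (IDLE_GTX480_W - IDLE_HD5850_W) / PSU_EFFICIENCY  # ~45 W
annual_kwh = delta_at_wall_w / 1000 * 24 * 365                      # ~394 kWh
annual_cost = annual_kwh * PRICE_PER_KWH                            # ~$51

print(f"extra wall draw:  {delta_at_wall_w:.0f} W")
print(f"extra energy/yr:  {annual_kwh:.1f} kWh")
print(f"extra cost/yr:    ${annual_cost:.2f}")
```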

Take this example if you still don't understand:

A gamer named Jack leaves his PC on 24/7. Inside his PC is a 5850. He is at work, school, sleeping, surfing the web, downloading pr0n, etc. so the GPU is idle 22 hours per day on average.

Jack games 2 hours per day on average (sometimes more, sometimes less).

Of those 2 hours, 1.9 hours are spent on the usual console-port trash that doesn't even stress a stock 5850, so he leaves the 5850 at stock.

The remaining 0.1 hours per day are spent on the one or two games per year he plays that actually stress his GPU, maybe Metro 2033 or something. (Note that 0.1 hour/day over a year = 36.5 hours.) So he overclocks to 1000/1200 and takes the energy hit. But it's only for 36.5 hours per year, so it's not that much money.

The VAST majority of the relevant power draw is idle: 22 hours of each day.
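
If it helps to see the proportions, here's a minimal sketch of that split. The hour counts come from the example above; the wattages are purely hypothetical placeholders (not measured numbers), since only the proportions matter for the point being made:

```python
# Rough annual energy split for the "Jack" scenario above.
# Hours per year come from the example; the wattages below are hypothetical
# placeholders, not measurements, just to show how little the 36.5 overclocked
# hours contribute compared to idle time.
HOURS_IDLE = 22 * 365    # 8030 h/yr idle
HOURS_LIGHT = 1.9 * 365  # 693.5 h/yr gaming at stock clocks
HOURS_HEAVY = 0.1 * 365  # 36.5 h/yr gaming overclocked

IDLE_W = 20         # hypothetical idle draw
LIGHT_LOAD_W = 120  # hypothetical stock gaming draw
HEAVY_LOAD_W = 200  # hypothetical overclocked gaming draw

kwh = {
    "idle":         HOURS_IDLE * IDLE_W / 1000,
    "stock gaming": HOURS_LIGHT * LIGHT_LOAD_W / 1000,
    "oc'd gaming":  HOURS_HEAVY * HEAVY_LOAD_W / 1000,
}
total = sum(kwh.values())
for name, val in kwh.items():
    print(f"{name:>13}: {val:6.1f} kWh ({val / total:5.1%})")
```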

That said, there can be indirect costs of high power loads, such as needing a bigger PSU. But that would apply in either case in the situation you are arguing about (a highly overclocked 5850 or a GTX480).
 
Last edited:

edplayer

Platinum Member
Sep 13, 2002
2,186
0
0
The VAST majority of the relevant power draw is idle: 22 hours of each day.


I agree with you

Not too many people really care about power consumption in this subforum (it seems), but I'm one of them. I liked how ATi's cards were pretty efficient, but I actually decided to get a GTX 460 because, even though its maximum draw isn't great, its idle draw is pretty good.
 
Feb 19, 2009
10,457
10
76
I'm a gamer and i hate loud PC setups. Headphones are a solution, but i prefer a nice surround sound setup with a few woofers.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
Yup, because to gamers fan noise is a MEH-topic...surprised?
I'm a gamer, and it's certainly not a "MEH-topic" to me. A GTX480 is intolerable under load without headphones playing at reasonable volumes, and it'll easily disturb anyone in the same room without them.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
I'm a gamer, and it's certainly not a "MEH-topic" to me. A GTX480 is intolerable under load without headphones playing at reasonable volumes, and it'll easily disturb anyone in the same room without them.

So it's pretty safe to say, then, that we have gamers at completely opposite ends of the spectrum. We have Lonbjerg, who isn't bothered by fan noise because he can't hear it over his 7.1 surround system, and on the extreme other side we have BFG10K, who was affectionately dubbed "noise princess" by his bud Apoppin a while back.

In this video, however, it doesn't look like they were that excited for a quieter card. It was like, "Yeah, so?" Bring on the games!!!!

We all have to accept that there are VERY different people in this world.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Actually, depending on the source you use, the GTX480 uses about 385W when running Crysis. How do I know? I don't, I just pulled that number out of thin air with no link to back it up.

Seriously?
You used to be a mod and you're going to just troll rather than actually providing evidence against other people's incorrect statements?


[image: z460_power.png]


http://www.xbitlabs.com/articles/video/display/gigabyte-gf-gtx400_6.html#sect0

I think the 3D load test is Crysis (since that's what's shown in the breakdowns above).
The HD5850 uses ~120W, not ~100W; still below TDP.

@ blastingcap
I've posted this before, but NV doesn't always use "gaming load" as TDP, or rather, hasn't historically. It seems to vary with card.
This is a little out of date (all numbers from Xbitlabs IIRC, but I lost my original spreadsheet with the numbers): http://img408.imageshack.us/img408/509/nvtdp.png


Mod callouts are never acceptable, ever.

Re: "Seriously? You used to be a mod..."

AnandTech Forum Guidelines

13) Baiting moderators will not be tolerated nor will Mod Call Outs. Any action that reasonably can be considered baiting a moderator, or multiple consecutive actions that heavily push the boundaries of any of these guidelines will result in an instant short term vacation. Repeated violation of this rule may result in a permaban.
Moderator Idontcare
 
Last edited by a moderator:

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
I'm a gamer, and it's certainly not a "MEH-topic" to me. A GTX480 is intolerable under load without headphones playing at reasonable volumes, and it'll easily disturb anyone in the same room without them.

I guess we have different opinions about "reasonable"...I consider 50% volume on my setup to be reasonable...75% is LOUD...100% is warfare on the neighbours :twisted:

(I might add I like to feel the bass, e.g. when in an FPS..."damage" from my army time and actually being in real battles *shrugs*)
 
Last edited:

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Seriously?
You used to be a mod and you're going to just troll rather than actually providing evidence against other people's incorrect statements?


[image: z460_power.png]


http://www.xbitlabs.com/articles/video/display/gigabyte-gf-gtx400_6.html#sect0

I think the 3D load test is Crysis (since that's what's shown in the breakdowns above).
The HD5850 uses ~120W, not ~100W; still below TDP.

@ blastingcap
I've posted this before, but NV doesn't always use "gaming load" as TDP, or rather, hasn't historically. It seems to vary with card.
This is a little out of date (all numbers from Xbitlabs IIRC, but I lost my original spreadsheet with the numbers): http://img408.imageshack.us/img408/509/nvtdp.png

Because "YOU" think it's trolling doesn't automagically make it so. I was making a point. Which you apparently missed. And leave my being a moderator out of public discussions please??? Thank you.
 
Last edited:

Aristotelian

Golden Member
Jan 30, 2010
1,246
11
76
In this video, however, it doesn't look like they were that excited for a quieter card. It was like, "Yeah, so?" Bring on the games!!!!

A 10 minute video with about 40 seconds of game play? With today's attention spans you could have set off firecrackers in there and still found the vast majority of that crowd laser locked onto the display hoping for a glimpse of a game after all of that chit chat.

That Nvidia are addressing the issue means that they are aware that, at least for some people, noise and heat matter. So, it's not a 'meh' topic for Nvidia as they try to address as much of the market as possible. Whether or not it's 'meh' for a specific individual is irrelevant really: the proof is in Nvidia's stance on the topic. I'm excited to see what the 580 offers, and some leaked pricing information looks promising (though I have no idea how accurate that is).
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Yes, Sapphire may have been the first to debut a video card with a vapor chamber cooler, but I think this is the first time a reference design has used the technology. It should be interesting to see how much of a temperature/sound-level drop it gives over a standard heat pipe cooler once some non-standard 580s hit the market.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
A 10 minute video with about 40 seconds of game play? With today's attention spans you could have set off firecrackers in there and still found the vast majority of that crowd laser locked onto the display hoping for a glimpse of a game after all of that chit chat.

That Nvidia are addressing the issue means that they are aware that, at least for some people, noise and heat matter. So, it's not a 'meh' topic for Nvidia as they try to address as much of the market as possible. Whether or not it's 'meh' for a specific individual is irrelevant really: the proof is in Nvidia's stance on the topic. I'm excited to see what the 580 offers, and some leaked pricing information looks promising (though I have no idea how accurate that is).

You're assuming now? Could I really have set off some fireworks in there and not had half of them notice? Please don't exaggerate. You saw the vid, you saw the MEH expression on their faces as the Nvidia guy even had to coax them into applause because they didn't seem to care about the improvements in noise/cooling. Do you deny this? Or are their attention spans only selectively short, coming into full focus only when evaluating the noise a given PC generates?
I need consistency here.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
So you can tell in that video with 100% certainty that the HD5850 was running at 1050MHz and the Core i5 @ 4.0GHz??? No CPU frequency, CPU voltages, GPU voltage or GPU frequency are visible. Plus you can just move the slider to any voltage/frequency value you want in MSI Afterburner, just don't press Apply. Have you ever thought about that?

Even if true, those results look rather odd considering that's lower than any stock HD5850 and a stock Core i system. Don't you think that's somewhat suspicious?
So what, I'm lying now? It's funny how you go on and on about maturity and respectfulness and then you act like this. Rather than admit you were wrong, you're going to accuse me of lying. Really mature :rolleyes:.

And those results were at 1000MHz, not 1050MHz. Your source referred to 1000MHz, that's what the test was run at.

A 10 minute video with about 40 seconds of game play? With today's attention spans you could have set off firecrackers in there and still found the vast majority of that crowd laser locked onto the display hoping for a glimpse of a game after all of that chit chat.

That Nvidia are addressing the issue means that they are aware that, at least for some people, noise and heat matter. So, it's not a 'meh' topic for Nvidia as they try to address as much of the market as possible. Whether or not it's 'meh' for a specific individual is irrelevant really: the proof is in Nvidia's stance on the topic. I'm excited to see what the 580 offers, and some leaked pricing information looks promising (though I have no idea how accurate that is).
I think that shows a general truth - a lot of that audience was there for the Black Ops footage. They might remember the name "NVIDIA" after they leave, and that's just how the general public is.
 

Aristotelian

Golden Member
Jan 30, 2010
1,246
11
76
You're assuming now? Could I really have set off some fireworks in there and not had half of them notice? Please don't exaggerate. You saw the vid, you saw the MEH expression on their faces as the Nvidia guy even had to coax them into applause because they didn't seem to care about the improvements in noise/cooling. Do you deny this? Or are their attention spans only selectively short, coming into full focus only when evaluating the noise a given PC generates?
I need consistency here.

Here we go again (and I bet you won't respond to the full content of this post, Keys: I'm still waiting for your 'consistency' in the post about two mid-range vs. one high-end card).

If you're right, that it's a meh topic, then Nvidia wasted money designing a card that generates less noise and heat. Conclusion = Nvidia is one stupid company for not knowing their target market before wasting money designing a product that has features that the vast majority of people don't care about...(oops!)

Or, if you look at your previous post, you have people at opposite ends of the spectrum. Some care about noise and heat and some don't. This is what I agreed with. I'm saying that in that crowd it's likely some cared, some didn't. Technical improvements like that in efficiency don't generally get people going "YEAH MAN! Did you see how QUIET that is? SWEET!" but they can be appreciated features nonetheless. Understand?

The 'meh' looks in this video could have occurred for a number of reasons and my assumption (that some of them cared and some don't) is in line with your previous post, yet you jump into this context claiming that NONE of them cared? (LOL). Maybe all of them cared but just didn't feel the need to applaud Nvidia for something that should never have been a problem in the first place, right? Or did AMD fan boys come up with Thermi themselves? Or maybe some people wanted to see the game play after all of that rambling?

But YOU don't know any more about their individual motivations than I do. So the consistency you're looking for should start with your own posting, followed by your reading comprehension (especially the ability to see both sides of the same coin), and so on. How's that for consistent?
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Here we go again (and I bet you won't respond to the full content of this post, Keys: I'm still waiting for your 'consistency' in the post about two mid-range vs. one high-end card).

If you're right, that it's a meh topic, then Nvidia wasted money designing a card that generates less noise and heat. Conclusion = Nvidia is one stupid company for not knowing their target market before wasting money designing a product that has features that the vast majority of people don't care about...(oops!)

Or, if you look at your previous post, you have people at opposite ends of the spectrum. Some care about noise and heat and some don't. This is what I agreed with. I'm saying that in that crowd it's likely some cared, some didn't. Technical improvements like that in efficiency don't generally get people going "YEAH MAN! Did you see how QUIET that is? SWEET!" but they can be appreciated features nonetheless. Understand?

The 'meh' looks in this video could have occurred for a number of reasons and my assumption (that some of them cared and some don't) is in line with your previous post, yet you jump into this context claiming that NONE of them cared? (LOL). Maybe all of them cared but just didn't feel the need to applaud Nvidia for something that should never have been a problem in the first place, right? Or did AMD fan boys come up with Thermi themselves? Or maybe some people wanted to see the game play after all of that rambling?

But YOU don't know any more about their individual motivations than I do. So the consistency you're looking for should start with your own posting, followed by your reading comprehension (especially the ability to see both sides of the same coin), and so on. How's that for consistent?

Dude, we get it. You hate Nvidia and I don't. No big surprise. We will always argue and never see eye to eye. Never.
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
I've long had a dream of putting my PC in another room from my keyboard/mouse/screen, by means of reeeeeaally long cables.

Why?

I'd love to have 0 dB from the PC I'm playing on. Unfortunately I've never done that... my PC is right next to me and it's loud enough to bug me (I have an AMD card).

To me the idle-mode noise is the most important though... because like Keys said, when you have speakers on with sound etc. it takes a bit away from the fan noise (so it doesn't seem as bad).

Idle and movies... I'd love a PC that was 0 dB.

I hope both AMD and Nvidia aim to make quieter cards, because I think overall it would please a lot of people.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
I've long had a dream of putting my PC in another room from my keyboard/mouse/screen, by means of reeeeeaally long cables.

Why?

I'd love to have 0 dB from the PC I'm playing on. Unfortunately I've never done that... my PC is right next to me and it's loud enough to bug me (I have an AMD card).

To me the idle-mode noise is the most important though... because like Keys said, when you have speakers on with sound etc. it takes a bit away from the fan noise (so it doesn't seem as bad).

Idle and movies... I'd love a PC that was 0 dB.

I hope both AMD and Nvidia aim to make quieter cards, because I think overall it would please a lot of people.

You game with no sound? o_O

I am amazed at gamers spending $1000+ on their PC...but who don't have sound :hmm:
 

WelshBloke

Lifer
Jan 12, 2005
31,413
9,307
136
I've long had a dream of putting my PC in another room from my keyboard/mouse/screen, by means of reeeeeaally long cables.

Why?

I'd love to have 0 dB from the PC I'm playing on. Unfortunately I've never done that... my PC is right next to me and it's loud enough to bug me (I have an AMD card).

To me the idle-mode noise is the most important though... because like Keys said, when you have speakers on with sound etc. it takes a bit away from the fan noise (so it doesn't seem as bad).

Idle and movies... I'd love a PC that was 0 dB.

I hope both AMD and Nvidia aim to make quieter cards, because I think overall it would please a lot of people.


If you got cards which made zero sound, you'd soon find other sounds your PC makes which send you round the bend. It's a never-ending quest that leads to madness and ruin. :$

I'm seriously considering some SSDs, not for the performance but because the clicking of my hard drives is driving me mad.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
Dude, we get it. You hate Nvidia and I don't. No big surprise. We will always argue and never see eye to eye. Never.

Funny, I don't see any hatred for nV; all I see is him pointing out that your posts seem a bit weird, inconsistent and vague.
 

Aristotelian

Golden Member
Jan 30, 2010
1,246
11
76
Dude, we get it. You hate Nvidia and I don't. No big surprise. We will always argue and never see eye to eye. Never.

I do not think that it is fair for you to argue that I hate Nvidia, especially as nothing in my post history supports that claim. Rather, it should be obvious that I have standards, and that is not necessarily a bad thing, because I do not see these discussions of products in terms of hate or love, like or dislike. I am not that sort of consumer when it comes to performance parts, whether they are dishwashers, video cards, vacuum cleaners, stand mixers, cell phones and so forth.

As a gamer, especially at a LAN party, the last thing I would want is to show up to an event advertising game play footage and have that footage be less than 10% of what is, on the whole, essentially an advertisement. This isn't an Nvidia call-out. I'm sure AMD does similar things, and when there are topics about AMD doing that I'll be there to comment as well.

Why you did not notice our agreement boggles my mind. We both agree there are people at opposite ends of the spectrum (that is what you told BFG10K). He is not alone, nor am I. That is why I use a pair of UE-10 Pro Customs when I game (for the immersion). Lonbjerg's setup sounds nice too, and I had a sweet Klipsch setup for a similar purpose before. However, neighbours come and go, and some are not as tolerant as their predecessors :$. So, if I can hear video cards even through my noise-isolating earphones, I get annoyed. A friend with a dual-card setup from a company that shall remain nameless does not seem to mind that his PC sounds like a small vacuum cleaner, can be heard in his adjacent living room, and so on.

Stating that I (not in isolation) have a preference that Nvidia seems to be catering to with the 5XX series...should show that I am glad that Nvidia is doing this. You have no grounds to see my posts about Nvidia in an entirely negative light. You are very defensive these days, and I do not think it is fair for you to lump anyone who is critical of anything (however trivial) Nvidia does into some 'Nvidia hate group'.
 

Teizo

Golden Member
Oct 28, 2010
1,271
31
91
Improving upon heat and noise if you can is common sense. However, it would be naive to think that Nvidia is not directly responding to the AMD marketing campaign against Fermi. That is how competition goes, though, and how good competition = better products for the consumer.