SweClockers: Geforce GTX 590 burns @ 772MHz & 1.025V


taltamir

Lifer
Mar 21, 2004
13,576
6
76
Heat that is radiated cannot travel towards the HDD bay because there are obstacles blocking its path. Radiation can't bend and will only travel in a straight line. The thermal reading of the HDD bay can't be a result of thermal radiation. Look at the bottom of the case in the 6970 CF's IR graph; look at the heat distribution and compare it with the heat distribution at the HDD bay in the 6990 IR graph.
1. Technically, black-body radiation is a wave... but it will travel in a generally straight line.
2. Convection is not radiation... when we say heat is "radiated" we typically mean "heat is transferred to the surrounding fluid via convection", not "heat is converted into black-body radiation and emitted as photons".
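Rough numbers for scale (a minimal sketch; the temperatures, emissivity and convection coefficients below are assumptions, not anything measured in the review): with case fans moving air, convective transfer dominates the radiative flux from a hot surface, and unlike radiation it follows the airflow rather than a straight line.

```python
# Back-of-the-envelope comparison of black-body radiation vs. convection
# for a hot surface inside a case. All numbers (temperatures, emissivity,
# convection coefficients) are assumed for illustration only.
SIGMA = 5.670e-8                              # Stefan-Boltzmann constant, W/(m^2*K^4)

surface_c, ambient_c = 90.0, 35.0             # assumed card surface / case air temps
t_s, t_a = surface_c + 273.15, ambient_c + 273.15
emissivity = 0.9                              # typical for painted/anodized surfaces
h_natural, h_forced = 5.0, 25.0               # rough convection coefficients, W/(m^2*K)

q_rad = emissivity * SIGMA * (t_s**4 - t_a**4)     # net radiative flux, W/m^2
q_conv_natural = h_natural * (t_s - t_a)           # still air, W/m^2
q_conv_forced = h_forced * (t_s - t_a)             # case fans moving air, W/m^2

print(f"radiation:          {q_rad:.0f} W/m^2")
print(f"natural convection: {q_conv_natural:.0f} W/m^2")
print(f"forced convection:  {q_conv_forced:.0f} W/m^2")
```

With these assumptions radiation is comparable to still-air convection, but forced convection carries a few times more heat, which is why heat "dumped into the case" ends up wherever the airflow takes it.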
 

kevinsbane

Senior member
Jun 16, 2010
694
0
71
Regardless, their testing method for the IR images is slightly flawed. Even a small difference in the time between when the case is opened and when the images are taken can have a large effect, especially with regard to how hot the AIR in certain places is.


Their procedure page indicates that the side of the case is left on for 30 minutes, opened briefly, then left on again for another 15 minutes, at which point it is quickly taken off and a thermal image taken as soon as possible afterwards. I do not believe inaccuracies due to time effects will have a significant impact. The temperatures of the components themselves (what the IR image measures) will not change significantly, unless they leave it off for a long (1 min+) period of time. I would think it would be trivial to make a setup where taking off the side of the case and taking a thermal image is possible within 5 seconds. The IR images do not measure the temperature of the air.
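As a rough sanity check on that (a sketch with assumed numbers for the heatsink mass, area and airflow, not the site's actual hardware), a lumped-capacitance estimate suggests a component's surface temperature barely moves in the first few seconds after the panel comes off:

```python
# Lumped-capacitance estimate of how much an assumed aluminium heatsink
# cools after the side panel is removed. All values (mass, area,
# convection coefficient, temperatures) are assumptions.
import math

mass_kg = 0.5            # assumed heatsink mass
c_al = 900.0             # specific heat of aluminium, J/(kg*K)
area_m2 = 0.05           # assumed exposed surface area
h = 15.0                 # assumed convection coefficient, W/(m^2*K)
t_start_c, t_amb_c = 90.0, 25.0

tau = (mass_kg * c_al) / (h * area_m2)   # thermal time constant, seconds
for t in (5.0, 60.0):
    temp_c = t_amb_c + (t_start_c - t_amb_c) * math.exp(-t / tau)
    print(f"after {t:>4.0f} s: {temp_c:.1f} C (drop of {t_start_c - temp_c:.1f} C)")
```

With these assumptions the drop is around half a degree after 5 seconds and several degrees after a minute, which lines up with the "1 min+" threshold above.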

Hypermatrix, very nice overclock.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Their procedure page indicates that the side of the case is left on for 30 minutes, opened briefly, then left on again for another 15 minutes, at which point it is quickly taken off and a thermal image taken as soon as possible afterwards. I do not believe inaccuracies due to time effects will have a significant impact. The temperatures of the components themselves (what the IR image measures) will not change significantly, unless they leave it off for a long (1 min+) period of time. I would think it would be trivial to make a setup where taking off the side of the case and taking a thermal image is possible within 5 seconds. The IR images do not measure the temperature of the air.

Hypermatrix, very nice overclock.

This experiment was nowhere near controlled. No valid claims can be made from it if there are literally dozens of open arguments as to why this test is, or isn't, valid.
Trash it. Begin anew. Third-party site.
 

kevinsbane

Senior member
Jun 16, 2010
694
0
71
This experiment was nowhere near controlled. No valid claims can be made from it if there are literally dozens of open arguments as to why this test is, or isn't, valid.
Trash it. Begin anew. Third-party site.

That's a pretty bold statement, considering they do have all of the variables laid out on their test setup page, including their procedure and what they were running when the test was done. You've made a big accusation against that site. The dozens of open arguments you speak of can arise even with the strongest proof of the tests' validity. I can think of some real-life movements with rather large followings that raise similar concerns and seek to disprove perfectly valid, well-documented scientific studies that have been peer reviewed and independently reproduced.

The only concern that has been raised that has some sort of solid backing is the idea that there is a cool spot in the HD 6990 case when it is running, as compared to the lack thereof in the GTX 590's case. Otherwise, test procedures were well documented and laid out by the site itself; see their procedures & test setup page. To say that they didn't follow those procedures would be similar to saying to Anandtech, "I don't believe you followed your procedures when you did those tests!"
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Sadly, no. I've tried a few applications that are designed to allow voltage increases, but apparently it's been blocked in the new beta driver. At least for those apps. I haven't tried all software options.

The voltage adjuster has been completely removed from SmartDoctor with these new drivers.

I am running at 678 MHz and 3708 memory without any voltage increase at the moment. That's 11.7% better results than the original drivers gave without OC'ing. 11.7% without upping the voltage... while fully stable... isn't all that bad, to be honest with you.

Thanks, appreciate it!:)
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
That's a pretty bold statement, considering they do have all of the variables laid out on their test setup page, including their procedure and what they were running when the test was done. You've made a big accusation against that site. The dozens of open arguments you speak of can arise even with the strongest proof of the tests' validity. I can think of some real-life movements with rather large followings that raise similar concerns and seek to disprove perfectly valid, well-documented scientific studies that have been peer reviewed and independently reproduced.

The only concern that has been raised that has some sort of solid backing is the idea that there is a cool spot in the HD 6990 case when it is running, as compared to the lack thereof in the GTX 590's case. Otherwise, test procedures were well documented and laid out by the site itself; see their procedures & test setup page. To say that they didn't follow those procedures would be similar to saying to Anandtech, "I don't believe you followed your procedures when you did those tests!"

If that were true, then why are there hundreds of posts of arguments? That, in itself, is far more than enough evidence to question the merit of the test. It's not as cut and dried as you believe. Actually, nowhere near.
Bold nothing. It's as true a statement as any. If there are literally dozens of arguments (or even one argument that can't be proven or disproved) as to whether this test is conclusive or inconclusive, then the test should be scrapped altogether and re-run the RIGHT way: a controlled environment, with ALL details and aspects of the testing WELL documented, leaving NO QUESTIONS about what was done, how it was done, and why.
Ever work in a lab?
 

kevinsbane

Senior member
Jun 16, 2010
694
0
71
If that were true, then why are there hundreds of posts of arguments? That, in itself, is far more than enough evidence to question the merit of the test. It's not as cut and dried as you believe. Actually, nowhere near.

Because people will believe whatever they want to believe. Re: (Obama) Birthers, moon landings were faked, world is flat, 9/11 was planned, vaccinations cause autism, America is awesome, America sucks, metric's the way to go, AMD > NVidia/NVidia > AMD, Intel>AMD> Intel, etc etc etc. Those have not created hundreds of posts; no, they have created many thousands of gigabytes worth of arguments. Doesn't mean the moon landing was faked, or that vaccinations cause autism, or even that the metric system is superior to imperial (I think metric's awesome, I hate imperial construction drawings; drives me nuts). Arguments alone do not mean that there is a problem, merely that people disagree about the interpretation of results.

Bold nothing. It's as true a statement as any. If there are literally dozens of arguments (or even one argument that can't be proven or disproved) as to whether this test is conclusive or inconclusive, then the test should be scrapped altogether and re-run the RIGHT way: a controlled environment, with ALL details and aspects of the testing WELL documented, leaving NO QUESTIONS about what was done, how it was done, and why.
Ever work in a lab?
As a matter of fact, I have worked in a lab. I worked in a structural concrete testing laboratory; we were destructively impact-testing ultra-high-strength fibre-reinforced concrete. Very interesting tests; very boring to set up the testing apparatus. Sometimes it felt silly: spend three days setting up a test, and it's over in less than a second.

I don't understand, though. How is the test setup not a controlled environment? How are the details not well documented? In their GTX 590 review, they even state that cards are "Always placed in the same box". How is it any different from, say, how Anandtech or Tom's or TechPowerUp or even Xbitlabs documents and carries out their procedures? Even where they didn't record every detail, if they used the same setup, the relative comparisons are at least internally consistent. They all have enough information that you can set up a system that is very similar to theirs and get similar kinds of results. It's very true that hardware.fr's review is not as well documented as a scientific paper, but it's as well documented as any other internet video card review. If we were to throw it out based on the number of arguments alone, then we might as well throw out the validity of any other review (many dozens...) where people argue over the validity of the results.

I say it's a bold statement because you're basically saying that the reviewers at www.hardware.fr are either straight-out lying, or totally incompetent, when you said the following:
This experiment was nowhere near controlled. No valid claims can be made from it if there are literally dozens of open arguments as to why this test is, or isn't, valid.
Trash it. Begin anew. Third-party site.
From what I read, you're saying that either they lied to us and changed the conditions of the tests without telling us, or they are totally incompetent, and could not even manage to maintain the conditions they set out when they told us how they performed the tests.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
I say it's a bold statement because you're basically saying that the reviewers at www.hardware.fr are either straight-out lying, or totally incompetent, when you said the following:

Keys referenced quote: "This experiment was nowhere near controlled. No valid claims can be made from it if there are literally dozens of open arguments as to why this test is, or isn't, valid.
Trash it. Begin anew. Third-party site."

Bolded above: Your words, not mine.
I believe the reviewers could have done much better in this case.
I mean, why leave this much room for doubt? If I were reviewing, you could bet that I would do the absolute best I could to remove any and all variables in my testing. Leave no room for questions, or at least keep them to an absolute bare minimum. Nice and clean.
Anyway, a question for you: would you call that part of the review flawed, or flawless?

Also, you keep asking how this and how that, but Kevin, do you not see all the questions raised in this thread? MANY. Unanswered, debatable, contestable, questionable. Give it up, man, and let's find someone else who CAN and is willing to re-do this test. Maybe even the original dudes themselves, but that would be admitting some measure of incompetence if they agreed to that, so forget it. :D
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Because people will believe whatever they want to believe. Re: (Obama) Birthers, moon landings were faked, world is flat, 9/11 was planned, vaccinations cause autism, America is awesome, America sucks, metric's the way to go, AMD > NVidia/NVidia > AMD, Intel>AMD> Intel, etc etc etc. Those have not created hundreds of posts; no, they have created many thousands of gigabytes worth of arguments. Doesn't mean the moon landing was faked, or that vaccinations cause autism, or even that the metric system is superior to imperial (I think metric's awesome, I hate imperial construction drawings; drives me nuts). Arguments alone do not mean that there is a problem, merely that people disagree about the interpretation of results.

As a matter of fact, I have worked in a lab. I worked in a structural concrete testing laboratory; we were destructively impact-testing ultra-high-strength fibre-reinforced concrete. Very interesting tests; very boring to set up the testing apparatus. Sometimes it felt silly: spend three days setting up a test, and it's over in less than a second.

I don't understand, though. How is the test setup not a controlled environment? How are the details not well documented? In their GTX 590 review, they even state that cards are "Always placed in the same box". How is it any different from, say, how Anandtech or Tom's or TechPowerUp or even Xbitlabs documents and carries out their procedures? Even where they didn't record every detail, if they used the same setup, the relative comparisons are at least internally consistent. They all have enough information that you can set up a system that is very similar to theirs and get similar kinds of results. It's very true that hardware.fr's review is not as well documented as a scientific paper, but it's as well documented as any other internet video card review. If we were to throw it out based on the number of arguments alone, then we might as well throw out the validity of any other review (many dozens...) where people argue over the validity of the results.

I say it's a bold statement because you're basically saying that the reviewers at www.hardware.fr are either straight-out lying, or totally incompetent, when you said the following:
From what I read, you're saying that either they lied to us and changed the conditions of the tests without telling us, or they are totally incompetent, and could not even manage to maintain the conditions they set out when they told us how they performed the tests.

This post is amazing. I agree with you that people will cherry-pick and hear what they want to hear. Don't agree with the conclusion? Simply discredit the authors and inject your own fabrications about how you'd have handled it, yada yada yada.

That's why I try to rely on only one source for my comparisons. I use Anandtech, someone will counter me with TechPowerUp because it fits their opinion, and in the end we're left arguing two different scenarios with different results.

I completely agree with you; people are accusing the authors of being either liars or incompetent. I guess in the end everyone is one of the two if their findings don't fit your perception of the products.

I can probably dig out some people who praised W1zzard for his OC results when they painted their brand nicely and now slam him as incompetent for doing what he's always done.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
This post is amazing. I agree with you that people will cherry-pick and hear what they want to hear. Don't agree with the conclusion? Simply discredit the authors and inject your own fabrications about how you'd have handled it, yada yada yada.

That's why I try to rely on only one source for my comparisons. I use Anandtech, someone will counter me with TechPowerUp because it fits their opinion, and in the end we're left arguing two different scenarios with different results.

I completely agree with you; people are accusing the authors of being either liars or incompetent. I guess in the end everyone is one of the two if their findings don't fit your perception of the products.

I can probably dig out some people who praised W1zzard for his OC results when they painted their brand nicely and now slam him as incompetent for doing what he's always done.

I think the point is that it is one example and shouldn't be the be-all and end-all, blanket view; more testing would be welcome, too. I have no problem with the data and like Tridam's site, HardwareFR, which is one of my over-all favorites.

Don't have a problem with Techpowerup's worst-case OC investigations either.

Personally, I read all the reviews to gauge hardware and don't rely on just one, but I do have favorites like ComputerBase, PCgamesHardware and HardwareFr. I will not close my mind because a site may have a different opinion than mine and read only views that are similar to my own.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
...
I don't understand, though. How is the test setup not a controlled environment? ...
You are grossly mistaken. Yes, there are procedures and it is in a controlled environment, but are those designed to determine whether or not the 590's VRMs are frying due to heat?

The purpose of those IR pictures is to show how the heat released can affect the rest of the PC setup in the case. Some heatsinks only vent from the back of the card, while others may simply draw heat off the card.

Those IR graphs clearly showed that both the 6990 and the 590 release a lot of heat into the case, while 6970 CF and 580 SLI are far better. Note that, with or without a vent inside the case, different locations heat up. The test clearly demonstrated what it was designed to show.

But that isn't what other people are using that data for. Clearly, people are using the difference between the temperatures measured on the 590 and the 6990 as proof of the 590's VRMs overheating. That is not the purpose of those IR graphs; their purpose is nothing but to demonstrate the heat released by the GPU, not the temperature of a specific component of a specific card. If testing is what you do every day, this should not be anything new to you.

If I were to design the test, the things I would need to control are the intake of the heatsink, the load, the amount of power going into the card, and the readings on various parts. If my goal were to compare VRM temperatures, I would take readings off the VRMs at different loads and power inputs. These are missing from the procedure, and the recorded input power is way off: the 6990 doesn't draw 375 W at stock, nor 450 W with BIOS 2. That means that data is irrelevant to the subject at hand.
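A minimal sketch of the kind of VRM-focused sweep described above might look like this; apply_load, read_vrm_temp and read_input_power are hypothetical placeholders for whatever stress tool, thermocouple and power meter a tester actually has, not real APIs:

```python
# Hypothetical sketch of a VRM temperature sweep across load levels.
# The three callbacks are placeholders for real instrumentation.
import time

LOAD_LEVELS = [0.25, 0.50, 0.75, 1.00]   # fraction of full GPU load
SOAK_SECONDS = 15 * 60                   # let temperatures stabilise at each step

def run_vrm_sweep(apply_load, read_vrm_temp, read_input_power):
    """Log VRM temperature against applied load and measured input power."""
    results = []
    for level in LOAD_LEVELS:
        apply_load(level)                # hold the card at this load level
        time.sleep(SOAK_SECONDS)         # wait for thermal equilibrium
        results.append({
            "load": level,
            "input_power_w": read_input_power(),
            "vrm_temp_c": read_vrm_temp(),
        })
    return results
```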

Besides, TechReport did an IR test on the 590 at max load and got 106°C, which isn't alarming. What caused the difference in readings? Does it mean one of the two reviews lied? Nope, just a different setup. So which one is accurate? Well, we will have to wait until someone creates a setup for this purpose; then we will know.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
I am under the impression that those images were removed by mods, as linking images directly from another site may have violated the rules of the forum. Yet going back to hardware.fr and digging up those images yourself is the same thing.

Edit: I fixed one of them; the others magically appeared again.

For the record, we only remove hotlinked embedded images if they are NSFW or if a hosting website contacts us and asks us to take them down.

Images that are not coming up in these forums are most likely not coming up because the hosting site has detected the presence of, and disabled, hotlinking of their images.

If your embedded images are not showing up here, you should try linking the image with a URL rather than as an embedded image, or link to the actual hosting web page.

Idontcare
Super Mod
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
For the record, we only remove hotlinked embedded images if they are NSFW or if a hosting website contacts us and asks us to take them down.

Images that are not coming up in these forums are most likely not coming up because the hosting site has detected the presence of, and disabled, hotlinking of their images.

If your embedded images are not showing up here, you should try linking the image with a URL rather than as an embedded image, or link to the actual hosting web page.

Idontcare
Super Mod

Curse HTML!!!!

Thanks for the info though
 

thilanliyan

Lifer
Jun 21, 2005
12,085
2,281
126
The temperatures of the components themselves (what the IR image measures) will not change significantly, unless they leave it off for a long (1 min+) period of time. I would think it would be trivial to make a setup where taking off the side of the case and taking a thermal image is possible within 5 seconds. The IR images do not measure the temperature of the air.

Yep, I never mentioned the temps of the components. I said the "AIR" around the HDD bay would be affected quickly once you take off the case door. That's a good point about air and the IR images, though. It won't measure exact temps, but you will still see whether an area is cool or warm relative to the other components, right? If it doesn't, then this whole debate about the "extra" fan is pointless.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
I think the point is that it is one example and shouldn't be the be-all and end-all, blanket view; more testing would be welcome, too. I have no problem with the data and like Tridam's site, HardwareFR, which is one of my over-all favorites.

Don't have a problem with Techpowerup's worst-case OC investigations either.

Personally, I read all the reviews to gauge hardware and don't rely on just one, but I do have favorites like ComputerBase, PCgamesHardware and HardwareFr. I will not close my mind because a site may have a different opinion than mine and read only views that are similar to my own.

I completely agree with you that more data on the subject would be reasonable. But I disagree that the point of this was to request more data.

Disqualifying someone's findings, when they've explained how they reached them, just because you don't agree with their methodology is questionable. What motive does one have for requesting that someone's findings be ruled inadmissible when their data is provided and their controls are present too?

I'd call "give us another graph to look at" reasonable, not "clearly this one is useless since it leaves so much up for speculation." People's motives and posting history speak louder to me sometimes, specifically when someone tries to pass off bogus physics as "factual."
 

kevinsbane

Senior member
Jun 16, 2010
694
0
71
Bolded above: Your words, not mine.
I believe the reviewers could have done much better in this case.
I mean, why leave this much room for doubt? If I were reviewing, you could bet that I would do the absolute best I could to remove any and all variables in my testing. Leave no room for questions, or at least keep them to an absolute bare minimum. Nice and clean.
Anyway, a question for you: would you call that part of the review flawed, or flawless?

If you feel that my characterization of your position is flawed, then I apologize. It is, however, the message I got when I read your words.

I wouldn't consider this review flawless. It is flawed in certain respects. However, I do consider that it has enough credibility that the data itself can be taken at face value. Absent any other hard data, it is difficult to claim that the data is worthless.

Also, you keep asking how this and how that, but Kevin, do you not see all the questions raised in this thread? MANY. Unanswered, debatable, contestable, questionable.
I see all the questions, and note that most of the questions are answered. People keep asking questions after the moon landing conspiracists have been disproved; people asking questions does not mean something is invalid. Where is the

Give it up, man, and let's find someone else who CAN and is willing to re-do this test. Maybe even the original dudes themselves, but that would be admitting some measure of incompetence if they agreed to that, so forget it. :D
It would be good for someone else to do a test too, so it can be compared against this review. But to completely dismiss this particular review as invalid and worth trashing out of hand because people are asking questions is definitely a bold statement. They have "proof" that it was controlled. To the extent that they've shared it, it has been posted here.

This is what I think.

To the extent of our knowledge, this review was well controlled. The reviewers' methodology is documented, and no obvious errors or biases are present. Claims of discrepancies based on this data must allow that discrepancies may not necessarily be due to flaws in testing, but may be due to the internal factors (the video cards, in this case) the tests were designed to find. Third-party testing would be good to confirm or disprove any facts and findings from this particular review.

Claims of flawed methodology should be accompanied by similar levels of proof as what has been provided to validate that methodology if the claim is to be taken seriously. No such level of proof has been provided as of yet. Alleged discrepancies have been noted, but there is nothing that says what was shown isn't what actually happened.

Seero, if people are trying to use these photos as proof for the GTX 590 popping, I agree, any link you can draw here is pretty weak. I haven't and will not try to make a case that the (alleged) VRM issues are a result of VRM temps, because frankly, there isn't enough data overall. This particular review doesn't try to make the link and hasn't made any such link. It shows that it gets hot, and that part of the card gets to 112C under their test. But what I will do is defend the data.

thilanliyan, it will measure the infrared emissions, which can be mapped to the surface temperature of anything the camera can see. The relative temperatures will be accurate provided the components are not given enough time to cool off once the case is opened. Relative to its own image, and to images taken by the same scanner under similar conditions, the colours will provide good relative temperature readings.
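As a rough illustration of that (a grey-body estimate with assumed emissivities, not the reviewers' method), an error in the camera's assumed emissivity shifts the absolute readings but preserves which surface reads hotter, as long as the settings stay the same:

```python
# Grey-body illustration of IR thermography: the reported temperature
# depends on the emissivity the camera assumes, but relative ordering
# is preserved for surfaces with similar emissivity. Reflected ambient
# radiation is ignored; all values are assumptions.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2*K^4)

def radiance(temp_c, emissivity):
    """Total emitted power per unit area for a grey body."""
    t_k = temp_c + 273.15
    return emissivity * SIGMA * t_k**4

def apparent_temp_c(measured_radiance, assumed_emissivity):
    """Temperature the camera reports for a given assumed emissivity."""
    t_k = (measured_radiance / (assumed_emissivity * SIGMA)) ** 0.25
    return t_k - 273.15

# Two surfaces at 90 C and 60 C with true emissivity 0.95, read back
# with the camera set to 0.90: absolute values shift by a few degrees,
# but the hotter surface still reads hotter.
for true_c in (90.0, 60.0):
    m = radiance(true_c, 0.95)
    print(f"{true_c} C reads as {apparent_temp_c(m, 0.90):.1f} C")
```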

railven, the problem with using only a single source of data is present in this thread itself; i.e., people will disagree with your findings, no matter how well you document or how good a test you might have. I do not claim this particular test is perfect. But it is "good enough" to show data; redundancy can only improve the quality of that data.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
Seero, if people are trying to use these photos as proof for the GTX 590 popping, I agree, any link you can draw here is pretty weak. I haven't and will not try to make a case that the (alleged) VRM issues are a result of VRM temps, because frankly, there isn't enough data overall. This particular review doesn't try to make the link and hasn't made any such link. It shows that it gets hot, and that part of the card gets to 112C under their test. But what I will do is defend the data.
I agree with what you said.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
This experiment was nowhere near controlled. No valid claims can be made from it if there are literally dozens of open arguments as to why this test is, or isn't, valid.
Trash it. Begin anew. Third-party site.

There's plenty of valid data available from this test. You just don't like the results, which is why you want the tests thrown out. What's next? You ask for the thread to be locked and pretend the issue doesn't exist?
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
railven, the problem with using only a single source of data is present in this thread itself; i.e., people will disagree with your findings, no matter how well you document or how good a test you might have. I do not claim this particular test is perfect. But it is "good enough" to show data; redundancy can only improve the quality of that data.

Oh, I get you on that. But the major issue with starting to introduce different articles is the variability in how they arrived at their findings.

I wouldn't say any one singular article is flawless, but I can assume whatever flaws exist are spread across all the products tested.

Now if the source has a big discrepancy with other reviews, then my red flag goes up. But so far I've enjoyed Anandtech's reviews and try to use them in my buying/suggesting decisions.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
There's plenty of valid data available from this test. You just don't like the results, which is why you want the tests thrown out. What's next? You ask for the thread to be locked and pretend the issue doesn't exist?

Valid data. Sure, OK. Valid data that tells us what? Data that raises questions.
There have been numerous arguments for and against the test. There has been no definitive conclusion. I don't know why you can't accept this fact. There are too many variables and possibilities for us to take any of it seriously.

What's the harm in a retest? Is there a problem with requesting such a thing after about 30 pages or more of disagreements? I don't believe there is any harm in it at all. Actually, it's the logical thing to request, don't you think?
 

Will Robinson

Golden Member
Dec 19, 2009
1,408
0
0
Perhaps NV News will do a comprehensive test.
That way there would be no chance of their findings being flawed or questionable...seeing they are a disinterested 3rd party...right?
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Valid data. Sure, OK. Valid data that tells us what? Data that raises questions.
There have been numerous arguments for and against the test. There has been no definitive conclusion. I don't know why you can't accept this fact. There are too many variables and possibilities for us to take any of it seriously.

What's the harm in a retest? Is there a problem with requesting such a thing after about 30 pages or more of disagreements? I don't believe there is any harm in it at all. Actually, it's the logical thing to request, don't you think?

No, we don't need to keep retesting until the results you like appear.
 

thilanliyan

Lifer
Jun 21, 2005
12,085
2,281
126
Valid data. Sure, OK. Valid data that tells us what? Data that raises questions.
There have been numerous arguments for and against the test. There has been no definitive conclusion. I don't know why you can't accept this fact. There are too many variables and possibilities for us to take any of it seriously.

I disagree. The test results ARE useful, but NOT to prove/disprove the existence of an "extra" fan IMO.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
I disagree. The test results ARE useful, but NOT to prove/disprove the existence of an "extra" fan IMO.

I agree with this. Someone sees a blue spot, they yell conspiracy, and all members of the same mindset run in to defend/support.

I personally didn't even take these pictures as anything to debate about until people started to lie to win their case.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,732
432
126
Valid data. Sure, OK. Valid data that tells us what? Data that raises questions.
There have been numerous arguments for and against the test. There has been no definitive conclusion. I don't know why you can't accept this fact. There are too many variables and possibilities for us to take any of it seriously.

What's the harm in a retest? Is there a problem with requesting such a thing after about 30 pages or more of disagreements? I don't believe there is any harm in it at all. Actually, it's the logical thing to request, don't you think?

It seems like valid data that tells us the temperatures the cards (and not only the GPU core) can reach and the effect they have on the case interior.

And the site refrains from drawing any conclusions other than that maybe a front exhaust fan might be a better solution for the 6990.

Other than that, not much.