cusideabelincoln
Diamond Member
Joined Aug 3, 2008
> and I was talking about REAL games right here in front of me. There was literally over 20 watts difference between the 4670 and the 8600GT. If my 8600GT had been at stock at the time, then it would have been 25 watts difference in games. And AGAIN, the 6670 clearly uses more power than a 4670.

> wow, you have some problems. What part of the stock 4670 using 20 more watts than a slightly OCed 8600GT in games are you confused about? A 6670 can use more power than a 4670 in games, so again, what are you confused about?

It's also possible if we overclock! But we're not talking about overclocked cards! So it's pointless if you are "right"!
> you would blame my testing just like I said earlier.

I'll trust professional reviewers with accurate equipment, like TPU, over you and your Kill-A-Watt.
> but RIGHT IN FRONT OF YOU in the Anandtech review is a 9500GT, which is within 5 watts of an 8600GT according to most reviews. It's using 23 watts less than a 4670, so that backs up my experience of using about 20 watts less than a 4670 with my 8600GT. So if my REAL-world results back up what Anandtech is showing, then why can't you accept that?

And yet you are completely disregarding TPU's testing. Hell, you're even disregarding Xbit's testing...
http://www.xbitlabs.com/articles/graphics/display/axle-radeon-hd5670-1gb_3.html#sect0
http://www.xbitlabs.com/articles/graphics/display/gainward-bliss9600gt-512gs_7.html#sect0
Using your logic, the 5670 is peaking at 30W... so how can there be a 30W difference in favor of the 8600GT... unless the 8600GT uses 0W?
Two different reviewers show you are wrong. So yes, I don't believe you. Actually, I don't believe the conclusion you've drawn from your tests. The root of the issue is the testing methodology you are using, which I do not trust one single bit.
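To spell that arithmetic out, here's a minimal sketch in Python reusing the figures already cited in this thread (the ~30W peak from the Xbit link above, and the claimed 25-30W gap); these are the thread's numbers, not new measurements:

```python
# Sanity check on the claimed gap, using figures cited in this thread:
# Xbit shows the 5670 peaking around 30 W in 3D, and the claim is that
# an 8600GT draws 25-30 W less in games.

CARD_PEAK_W = 30  # approx. peak 3D draw of the 5670 per the Xbit link above

for claimed_gap_w in (25, 30):
    implied_8600gt_w = CARD_PEAK_W - claimed_gap_w
    print(f"claimed gap {claimed_gap_w} W -> "
          f"implied 8600GT draw: {implied_8600gt_w} W")

# claimed gap 25 W -> implied 8600GT draw: 5 W
# claimed gap 30 W -> implied 8600GT draw: 0 W
# A discrete card can't render a 3D load on ~0 W, so the claim doesn't add up.
```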
> So again, a 6670 uses more power than a 4670, which already uses about 20 watts more than an 8600GT. So yes, it is "possible" to have a 25-30 watt difference at some point.
> I did not compare Furmark, so MY experience in REAL games still reflects what Anandtech showed. I have results for an OCed 8600GT, a 4670, and a 9600GT, which were all used in the exact same PC. The stock 4670 used between 16-22 watts more than the OCed 8600GT, and the stock 9600GT used between 49-54 watts more in the few games I tested them in.

Furmark. You're slipping.
> sometimes cards can use more, or sometimes they can use less, than their TDP. I believe the 6670 actually uses the same, if not a couple watts less, than a 5570. It will probably just come down to the individual review, since they are pretty close.

> I would hope for 30 bucks extra Dell would put something in there that wasn't terrible. If a 6450 is rated at 27W and a 6670 at 60W or so, I'd imagine you have that 33W of headroom. If you want to take the safe route, I'd say get a 5570, which uses about 43W supposedly.
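The headroom arithmetic in those quotes is just differences between rated TDPs. A quick sketch using the wattages as quoted there (27W, 43W, 60W are the thread's figures, not verified specs, and as the quote notes, actual draw can land above or below TDP):

```python
# Headroom estimate from the quoted post: if the OEM build is budgeted
# for a 6670-class card, a lower-rated card leaves the difference as slack.
# TDP figures are as quoted in this thread, not verified specs, and real
# draw can be above or below the rating.

TDP_W = {"HD 6450": 27, "HD 5570": 43, "HD 6670": 60}

budget_w = TDP_W["HD 6670"]  # assume the system is budgeted for a 6670
for card, tdp in TDP_W.items():
    print(f"{card}: rated {tdp} W, headroom vs. a 6670 budget: {budget_w - tdp} W")

# HD 6450: rated 27 W, headroom vs. a 6670 budget: 33 W  <- the "33W" above
# HD 5570: rated 43 W, headroom vs. a 6670 budget: 17 W
# HD 6670: rated 60 W, headroom vs. a 6670 budget: 0 W
```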
> I said I had the same wattage difference between games as they did in whatever testing they did. Instead of being a man and admitting that it's possible to have a 25-30 watt difference, which it clearly is, all you can do is twist and use whatever numbers are convenient for your side of the story. Grow up and realize that what you link to does not change what Anandtech and my own testing show.

Your experience does not match TPU's and Xbit's findings in 3D-rendered graphics. I bet you're also subtracting the measurements you took at the wall, which is not reflective of the actual video cards' difference, because that number is inflated by the efficiency of the PSU and the quality of the voltage coming over your AC power grid.
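To illustrate why wall-socket deltas overstate the card-to-card difference, here's a rough sketch assuming a typical ~80% efficient PSU (the 0.80 figure is an assumption for illustration, not a measured value):

```python
# A Kill-A-Watt reads AC power at the wall, which includes PSU conversion
# loss, so the measured delta between two cards is inflated by roughly
# 1 / efficiency. The 0.80 efficiency here is an assumed typical value.

PSU_EFFICIENCY = 0.80

def wall_delta_w(dc_delta_w: float, efficiency: float = PSU_EFFICIENCY) -> float:
    """DC-side power difference as it would appear at the wall socket."""
    return dc_delta_w / efficiency

real_gap_w = 16  # suppose the cards truly differ by 16 W at the PSU output
print(f"{real_gap_w} W of real difference reads as "
      f"{wall_delta_w(real_gap_w):.0f} W at the wall")
# -> 16 W of real difference reads as 20 W at the wall
```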
You need to wake up and realize YOUR EXPERIENCE is not the end-all truth of the situation, like you make it out to be.
Anandtech did use Furmark, and yet you are trying to use them to back up your claim. That was my point, and obviously it flew right over your head. You're slipping.
