Looking for new Optiplex 790 video card

Page 2 - AnandTech Forums

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
It's also possible if we overclock! But we're not talking about overclocked cards! So it's pointless if you are "right"!
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
And I was talking about REAL games right here in front of me. There was literally over a 20-watt difference between the 4670 and 8600GT. If my 8600GT had been stock at the time, it would have been a 25-watt difference in games. And AGAIN, the 6670 clearly uses more power than a 4670.

I'll trust professional reviewers with accurate equipment, like TPU, over you and your kill-a-watt.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
It's also possible if we overclock! But we're not talking about overclocked cards! So it's pointless if you are "right"!
wow, you have some problems. What part of the stock 4670 using 20 more watts than a slightly OCed 8600GT in games are you confused about? A 6670 can use more power than a 4670 in games, so again, what are you confused about?
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
You would blame my testing, just like I said earlier.

And yet you are completely disregarding TPU's testing. Hell, you're even disregarding Xbit's testing...

http://www.xbitlabs.com/articles/graphics/display/axle-radeon-hd5670-1gb_3.html#sect0
http://www.xbitlabs.com/articles/graphics/display/gainward-bliss9600gt-512gs_7.html#sect0

Using your logic, the 5670 is peaking at 30W... how can there be a 30W difference in favor of the 8600GT... unless it uses 0W?

Two different reviewers show you are wrong. So yes, I don't believe you. Actually, I don't believe the conclusion you've drawn from your tests. The root of the issue is the testing methodology you are using, which I do not trust one single bit.

What part of the stock 4670 using 20 more watts than a slightly OCed 8600GT in games are you confused about?

The part where two professional reviewers do not show that discrepancy. Duh.
 
Last edited:

toyota

Lifer
Apr 15, 2001
12,957
1
0
And yet you are completely disregarding TPU's testing. Hell, you're even disregarding Xbit's testing...

http://www.xbitlabs.com/articles/graphics/display/axle-radeon-hd5670-1gb_3.html#sect0
http://www.xbitlabs.com/articles/graphics/display/gainward-bliss9600gt-512gs_7.html#sect0

Using your logic, the 5670 is peaking at 30W... how can there be a 30W difference in favor of the 8600GT... unless it uses 0W?

Two different reviewers show you are wrong. So yes, I don't believe you. Actually, I don't believe the conclusion you've drawn from your tests. The root of the issue is the testing methodology you are using, which I do not trust one single bit.
But RIGHT IN FRONT OF YOU in the Anandtech review is a 9500GT, which is within 5 watts of an 8600GT according to most reviews. It's using 23 watts less than a 4670, so that backs up my experience of about 20 watts less than a 4670 with my 8600GT. So if my REAL world results back up what Anandtech is showing, then why can't you accept that?

So again, a 6670 uses more power than a 4670, which already uses about 20 watts more than an 8600GT.
So yes, it IS "possible" to have a 25-30 watt difference at some point.
 
Last edited:

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
But right in front of you in the Anandtech review is a 9500GT, which is within 5 watts of an 8600GT according to most reviews. It's using 23 watts less than a 4670, so that backs up my experience of about 20 watts less than a 4670 with my 8600GT. So if my real world results back up what Anandtech is showing, then why can't you accept that?

So again, a 6670 uses more power than a 4670, which already uses about 20 watts more than an 8600GT. So yes, it is "possible" to have a 25-30 watt difference at some point.

Furmark. You're slipping.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Furmark. You're slipping.
I did not compare Furmark, so MY experience in REAL games still reflects what Anandtech showed. I have results for an OCed 8600GT, a 4670, and a 9600GT, which were all used in the exact same PC. The stock 4670 used between 16-22 watts more than the OCed 8600GT, and the stock 9600GT used between 49-54 watts more in the few games I tested them in.
 
Last edited:

Wizlem

Member
Jun 2, 2010
94
0
66
I would hope for 30 bucks extra Dell would put something in there that wasn't terrible. If a 6450 is rated at 27W and a 6670 at 60W or so, I'd imagine you have that 33W of headroom. If you want to take the safe route, I'd say get a 5570, which supposedly uses about 43W.
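The headroom arithmetic above can be sketched as a quick check; the `TDP` table and `extra_watts` helper are illustrative only, using the wattage figures quoted in this post (nominal ratings, not measured in-game draw):

```python
# Quick TDP-headroom check using the board-power figures quoted above.
# These are nominal TDP ratings; actual in-game draw varies per review.
TDP = {"HD 6450": 27, "HD 6670": 60, "HD 5570": 43}

def extra_watts(current, candidate):
    """Additional watts the candidate card needs over the current card."""
    return TDP[candidate] - TDP[current]

print(extra_watts("HD 6450", "HD 6670"))  # 33 W over the stock 6450
print(extra_watts("HD 6450", "HD 5570"))  # 16 W over: the safer swap
```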
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
I would hope for 30 bucks extra Dell would put something in there that wasn't terrible. If a 6450 is rated at 27W and a 6670 at 60W or so, I'd imagine you have that 33W of headroom. If you want to take the safe route, I'd say get a 5570, which supposedly uses about 43W.
Sometimes cards can use more, and sometimes less, than their TDP. I believe the 6670 actually uses the same as, if not a couple of watts less than, a 5570. It will probably just come down to the individual review since they are pretty close.
 
Last edited:

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
MY experience

Your experience does not match TPU's and Xbit's findings in 3D rendered graphics. I bet you're also subtracting measurements you took at the wall, which are not reflective of the actual video cards' differences, because inflated into that number are the efficiency of the PSU and the quality of the voltage coming over your AC power grid.
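To illustrate the at-the-wall point, here is a rough sketch; the `card_delta` helper and the 80% PSU efficiency figure are assumptions for illustration, not numbers from this thread:

```python
# A kill-a-watt reads AC power at the wall, so the PSU's conversion
# losses are baked into any delta it shows between two cards.
# The 80% efficiency used here is an assumed, illustrative figure;
# real efficiency varies with load and with the individual PSU.
def card_delta(wall_delta_watts, psu_efficiency=0.80):
    """Estimate the DC-side (card-level) difference implied by an AC-side delta."""
    return wall_delta_watts * psu_efficiency

# A 25 W gap on the meter implies roughly 20 W at the cards themselves:
print(card_delta(25))  # -> 20.0
```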

You need to wake up and realize YOUR EXPERIENCE is not the end-all truth of the situation like you make it out to be.

I did not compare Furmark

Anandtech did and yet you are trying to use them to back up your claim. That was my point, and obviously it flew right over your head. You're slipping.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Your experience does not match TPU's and Xbit's findings in 3D rendered graphics. I bet you're also subtracting measurements you took at the wall, which are not reflective of the actual video cards' differences, because inflated into that number are the efficiency of the PSU and the quality of the voltage coming over your AC power grid.

You need to wake up and realize YOUR EXPERIENCE is not the end-all truth of the situation like you make it out to be.



Anandtech did and yet you are trying to use them to back up your claim. That was my point, and obviously it flew right over your head. You're slipping.
I said I had the same wattage difference between games as they did in whatever testing they did. Instead of being a man and admitting that it's possible to have a 25-30 watt difference, which it clearly is, all you can do is twist and use whatever numbers are convenient for your side of the story. Grow up and realize that what you link to does not change what Anandtech and my own testing show. :rolleyes:
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
I said I had the same wattage difference between games as they did in whatever testing they did. Instead of being a man and admitting that it's possible to have a 25-30 watt difference, which it clearly is, all you can do is twist and use whatever numbers are convenient for your side of the story. Grow up and realize that what you link to does not change what Anandtech and my own testing show.

You use a completely different testing methodology than Anandtech, with completely different hardware and different measuring devices. There is no correlation, which is why I think it's completely stupid of you to keep hammering on the same point: "I said I had the same wattage difference."

I already admitted there could be a 30W difference... when a card is overclocked and/or using synthetic tests.

I'm not twisting anything, you fool. Xbit and TPU are among the few, really the only ones I can think of, that actually measure the card's power consumption and not the system power consumption. You are trying to say card X uses more than card Y by using total system power consumption measurements, when you have two reviewers who measure actual card power consumption with more accurate and precise equipment showing a different conclusion.

Grow up? What the hell are you talking about? Look in the mirror and start using logic. The Anandtech numbers show the difference in total system power consumption (which includes the inefficiency of the PSU) for FURMARK. That is a completely different context than the numbers I have been trying to show. That's not twisting at all; those are just the facts: two entirely different contexts, not really comparable to each other.

And I'm sorry (not really), but I simply don't trust your numbers at all, because no doubt you are using a cheap kill-a-watt while the other reviewers are using better equipment. You haven't provided a detailed report about the system specs and the setup of your test, you haven't shown any pictures verifying your claim, and you expect me to just take your word on it? Fat chance. So if you're going to ask me, and anyone else, to choose, I'm going with the two more popular reviewers.

Grow up? Maybe you should grow up and realize your testing methodology (your EXPERIENCE) does not show the actual truth you are trying to say it does. My links do not change your experience; all they do is simply tell us that your experience is your own.
 
Last edited:

e-drood

Member
Jun 15, 2011
169
0
0
The MSI card in that Newegg link is a good choice and may run well off the 250W case PSU.

Please see the user reviews in the right-hand margin.
 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
Keep in mind that it lacks an HDMI output. It might come with a DVI to HDMI adapter dongle in the box, though. If not, most HDTVs also have a VGA input. Other than that it's great, a real steal if you cash in on the rebate to $46.99.
 
Last edited:

weirdwaldo

Member
Nov 29, 1999
65
0
66
I currently have a DisplayPort/HDMI adapter, so that's not much of an issue. However, it does bring up a point: do these cards do audio out via HDMI or DisplayPort? I'm kinda (way) outta the PC loop nowadays.
 

Voxata

Member
Jun 26, 2012
27
1
0
You most definitely can put a real nice graphics card in your Optiplex 790 (MicroATX non-slim); however, you WILL need to upgrade the power supply. Choose one that has the same measurements as the stock power supply - I chose a Silverstone 600W Strider. I then had to drill out the HDD cage to remove it so my graphics card could fit, and tossed in an ATI 6950 2GB video card. This is a high-end card for this system for sure - it runs games like a dream now.
I have the i3-2100 version with 4GB RAM. I obtained this unit used for 150 dollars and turned it into an HTPC with a good punch behind it.