What is the best graphics card I can use?


AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: dclive
http://www.tomshardware.com/re...e-gtx-275,2266-14.html is another interesting post from Tom's Hardware. Basically it shows those (fully modern) GPUs all would be fine in a system with a PSU of 300W or so, call it 350W for lots of excess capacity. The 4870x2 would need about 500W.

That's it. And that's for an i7/965 setup with all power saving disabled. That's a fully modern, fully current system.

This doesn't tell us much. It just shows AC power usage. Nor did they test a modern system with a 300 or 350 watt power supply; instead they tested with an 1100 watt power supply with plenty of amps. Just because a power supply is rated at a given wattage doesn't mean it can run a modern full-fledged system.

Now if you showed me a 300/350 watt power supply with results and extensive testing for random reboots and stability, I would believe it.
 

dclive

Elite Member
Oct 23, 2003
5,626
2
81
Originally posted by: Azn


This doesn't tell us much. It just shows AC power usage. Nor did they test a modern system with a 300 or 350 watt power supply; instead they tested with an 1100 watt power supply with plenty of amps. Just because a power supply is rated at a given wattage doesn't mean it can run a modern full-fledged system.

Now if you showed me a 300/350 watt power supply with results and extensive testing for random reboots and stability, I would believe it.

Earlier someone said a GTX 280 used 236W. Both Anand and Tom say an entire SYSTEM with a 285 (granted, a slightly different card) uses about 260W... and it's a nice, high-end system to boot.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: dclive
Originally posted by: Azn

Which GPU is rated to 236 watts?

The GTX 280 is rated at 236 watts.

Per Tom's Hardware, a fully loaded system with a GTX 285, high-end i7 CPU, RAM, 10,000 RPM hard drives, etc... takes about 260 watts from the wall, total.

http://www.tomshardware.com/re...e-gtx-275,2266-14.html

(Anandtech's data is almost exactly the same...)

Why don't you explain to me why I get random reboots when I raise the voltage on my CPU with a 450 watt power supply with two 12V rails @ max 30A? I didn't get random reboots when I had a G92 8800GTS in the same system when I raised the voltage, BTW, nor do I get random reboots at default voltage with the GTX 260.

edit: I run my system 24/7, BTW, so I notice any little thing like this right away.
 

dclive

Elite Member
Oct 23, 2003
5,626
2
81
Originally posted by: Azn


Why don't you explain to me why I get random reboots when I raise the voltage on my CPU with a 450 watt power supply with two 12V rails @ max 30A? I didn't get random reboots when I had a G92 8800GTS in the same system when I raised the voltage, BTW, nor do I get random reboots at default voltage.

A GTX 280 doesn't need 236W by itself, which is the only point I'm rebutting. As one can see by looking at the Anandtech and Tom's Hardware reviews, the GTX 280 / 285 likely only needs a fraction of that 236W because an entire system only needs about 260W of power - and that's with a pretty modern system being powered.

If your system needs more power due to overclocking a CPU or whatnot, go for it. That has nothing to do with the GTX 280/285's power requirements.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: dclive
Originally posted by: Azn


Why don't you explain to me why I get random reboots when I raise the voltage on my CPU with a 450 watt power supply with two 12V rails @ max 30A? I didn't get random reboots when I had a G92 8800GTS in the same system when I raised the voltage, BTW, nor do I get random reboots at default voltage.

A GTX 280 doesn't need 236W by itself, which is the only point I'm rebutting. As one can see by looking at the Anandtech and Tom's Hardware reviews, the GTX 280 / 285 likely only needs a fraction of that 236W because an entire system only needs about 260W of power - and that's with a pretty modern system being powered.

If your system needs more power due to overclocking a CPU or whatnot, go for it. That has nothing to do with the GTX 280/285's power requirements.

Default voltage for my CPU is 1.325 volts. If I raise it to 1.4 volts, I get random reboots.

Raising the voltage 10% raises power usage on the CPU itself by 20%. I use a 65 watt rated CPU, which at 20% more comes to 78 watts, a difference of 13 watts, and I'm being generous here even though I didn't raise the voltage by a full 10%. Yet you say you can run a full-fledged GTX 280 system with a 300/350 watt power supply. My GTX 260 is the 55nm version, BTW, which uses less power than a GTX 280, probably by 50 watts or more.

So tell me, do you use a 300/350 watt power supply to power a GTX 280? If you don't, why do you suggest running such a power supply with such video cards when you haven't had any experience with it, other than reading someone else's testing, which doesn't tell much beyond the fact that they are running a much higher rated power supply to test for wattage?
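For what it's worth, the arithmetic above assumes CPU dynamic power scales with the square of core voltage (the usual CMOS approximation, P ~ C * V^2 * f). A minimal Python sketch of that calculation, using the 65 watt TDP and the 1.325 V / 1.4 V figures from this post as example inputs; real draw also depends on leakage and the actual workload:

# Sketch: estimate CPU power after a voltage bump, assuming dynamic
# power scales with the square of core voltage (P ~ C * V^2 * f).
# The 65 W TDP and the 1.325 V / 1.4 V values come from the post above.
tdp_watts = 65.0      # rated CPU power at stock voltage
v_stock = 1.325       # stock core voltage
v_raised = 1.4        # raised core voltage

scaled = tdp_watts * (v_raised / v_stock) ** 2
print(f"Estimated power at {v_raised} V: {scaled:.1f} W "
      f"(+{scaled - tdp_watts:.1f} W over stock)")

With these inputs the estimate comes out to roughly 73 W; the 78 W figure in the post corresponds to a full 10% voltage increase.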
 

dclive

Elite Member
Oct 23, 2003
5,626
2
81
Originally posted by: Azn


So tell me, do you use a 300/350 watt power supply to power a GTX 280? If you don't, why do you suggest running such a power supply with such video cards when you haven't had any experience with it, other than reading someone else's testing, which doesn't tell much beyond the fact that they are running a much higher rated power supply to test for wattage?

Someone suggested that a GTX 280 used 236W.

I showed two tests - from two good sources - that showed it doesn't. Both tests show the entire system running at under 300W.

Are Anandtech and Tom's Hardware unreliable?
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: dclive
Originally posted by: Azn


So tell me, do you use a 300/350 watt power supply to power a GTX 280? If you don't, why do you suggest running such a power supply with such video cards when you haven't had any experience with it, other than reading someone else's testing, which doesn't tell much beyond the fact that they are running a much higher rated power supply to test for wattage?

Someone suggested that a GTX 280 used 236W.

I showed two tests - from two good sources - that showed it doesn't. Both tests show the entire system running at under 300W.

Are Anandtech and Tom's Hardware unreliable?

I said it was rated at that on paper. I didn't say it uses 236 watts in real life. How many watts it uses will fluctuate depending on the situation.
 

dclive

Elite Member
Oct 23, 2003
5,626
2
81
Originally posted by: Azn

I said it was rated at that on paper. I didn't say it uses 236 watts in real life. How many watts it uses will fluctuate depending on the situation.

Can you detail (or even guess at) a condition under which it would hit 236W, given that both THW and AT stress-tested their systems (with newer CPUs and with older ones) and, for the entire SYSTEM, only required a little bit more power than that - 260W for the entire SYSTEM (in the case of Tom's results)?

Where is it rated at 236W? I'd be interested in reading that paper. Is that in nVidia's whitepapers, or...?

That seems really high, because per http://techreport.com/articles.x/16504, they're saying that the max that a single-plug PCIe board can use is 150W. I have no other information on the topic, but it's an interesting discussion point.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Originally posted by: dclive
Originally posted by: Azn


So tell me, do you use a 300/350 watt power supply to power a GTX 280? If you don't, why do you suggest running such a power supply with such video cards when you haven't had any experience with it, other than reading someone else's testing, which doesn't tell much beyond the fact that they are running a much higher rated power supply to test for wattage?

Someone suggested that a GTX 280 used 236W.

I showed two tests - from two good sources - that showed it doesn't. Both tests show the entire system running at under 300W.

Are Anandtech and Tom's Hardware unreliable?

No, they are not unreliable, but they do give the impression to novice users that you can use a cheap 300/350 watt PSU to run a high-end gaming system just because it only draws 275 watts from the wall. It depends on the 12V line amperage, PSU efficiency, and the ambient temperature in which these components operate.

The OP has a 300 watt (el cheapo) PSU and should not even consider running anything more than a 4770, if even that.
 

dclive

Elite Member
Oct 23, 2003
5,626
2
81
Originally posted by: happy medium
Originally posted by: dclive
Originally posted by: Azn


So tell me, do you use a 300/350 watt power supply to power a GTX 280? If you don't, why do you suggest running such a power supply with such video cards when you haven't had any experience with it, other than reading someone else's testing, which doesn't tell much beyond the fact that they are running a much higher rated power supply to test for wattage?

Someone suggested that a GTX 280 used 236W.

I showed two tests - from two good sources - that showed it doesn't. Both tests show the entire system running at under 300W.

Are Anandtech and Tom's Hardware unreliable?

No, they are not unreliable, but they do give the impression to novice users that you can use a cheap 300/350 watt PSU to run a high-end gaming system just because it only draws 275 watts from the wall. It depends on the 12V line amperage, PSU efficiency, and the ambient temperature in which these components operate.

The OP has a 300 watt (el cheapo) PSU and should not even consider running anything more than a 4770, if even that.

Can a quad-core 2.4 Intel CPU with 3-4 HDDs, 2 tuners, and 4GB of RAM be sufficiently powered by an Acer E700's default 300W PSU once one adds an nVidia 8800GTS?

So many people said it couldn't be done, but I've been happily running that system (and later variants) for quite some time. Those who also bought an 8800GTS/320 and put it into that system also reported that it runs great.

Why does PSU efficiency have anything to do with this? It seems like you're now saying it has _nothing_ to do with wattage (which I agree with - but PSU efficiency is usually tied to wattage) and everything to do with the 12V rail power - is that right?
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Originally posted by: dclive
Originally posted by: Azn

I said it was rated at that on paper. I didn't say it uses 236 watts in real life. How many watts it uses will fluctuate depending on the situation.

Can you detail (or even guess at) a condition under which it would hit 236W, given that both THW and AT stress-tested their systems (with newer CPUs and with older ones) and, for the entire SYSTEM, only required a little bit more power than that - 260W for the entire SYSTEM (in the case of Tom's results)?

Where is it rated at 236W? I'd be interested in reading that paper. Is that in nVidia's whitepapers, or...?

That seems really high, because per http://techreport.com/articles.x/16504, they're saying that the max that a single-plug PCIe board can use is 150W. I have no other information on the topic, but it's an interesting discussion point.

It's my understanding that a PCIe slot gives 75 watts and each PCIe 6-pin plug also gives you an additional 75 watts: 75 x 3 = 225 watts.
 

dclive

Elite Member
Oct 23, 2003
5,626
2
81
OK, so that's the maximum theoretical power that a PCIe board could conceivably use. Hopefully everyone can agree that using that, rather than actual posted benchmarks from trusted sources (hopefully AT and THW are trusted), is not a good idea.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: dclive
Originally posted by: happy medium
Originally posted by: dclive
Originally posted by: Azn


So tell me, do you use a 300/350 watt power supply to power a GTX 280? If you don't, why do you suggest running such a power supply with such video cards when you haven't had any experience with it, other than reading someone else's testing, which doesn't tell much beyond the fact that they are running a much higher rated power supply to test for wattage?

Someone suggested that a GTX 280 used 236W.

I showed two tests - from two good sources - that showed it doesn't. Both tests show the entire system running at under 300W.

Are Anandtech and Tom's Hardware unreliable?

No, they are not unreliable, but they do give the impression to novice users that you can use a cheap 300/350 watt PSU to run a high-end gaming system just because it only draws 275 watts from the wall. It depends on the 12V line amperage, PSU efficiency, and the ambient temperature in which these components operate.

The OP has a 300 watt (el cheapo) PSU and should not even consider running anything more than a 4770, if even that.

Can a quad-core 2.4 Intel CPU with 3-4 HDDs, 2 tuners, and 4GB of RAM be sufficiently powered by an Acer E700's default 300W PSU once one adds an nVidia 8800GTS?

So many people said it couldn't be done, but I've been happily running that system (and later variants) for quite some time. Those who also bought an 8800GTS/320 and put it into that system also reported that it runs great.

Why does PSU efficiency have anything to do with this? It seems like you're now saying it has _nothing_ to do with wattage (which I agree with - but PSU efficiency is usually tied to wattage) and everything to do with the 12V rail power - is that right?

Did you check your voltage? Are you undervolting? Good way to kill your system slowly.
 

yh125d

Diamond Member
Dec 23, 2006
6,886
0
76
Originally posted by: dclive
http://www.tomshardware.com/re...e-gtx-275,2266-14.html is another interesting post from Tom's Hardware. Basically it shows those (fully modern) GPUs all would be fine in a system with a PSU of 300W or so, call it 350W for lots of excess capacity. The 4870x2 would need about 500W.

That's it. And that's for an i7/965 setup with all power saving disabled. That's a fully modern, fully current system.

Those numbers were at full GPU load, not full system load. With the CPU also loaded, expect those numbers to be higher.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Originally posted by: dclive
Originally posted by: happy medium
Originally posted by: dclive
Originally posted by: Azn


So tell me, do you use a 300/350 watt power supply to power a GTX 280? If you don't, why do you suggest running such a power supply with such video cards when you haven't had any experience with it, other than reading someone else's testing, which doesn't tell much beyond the fact that they are running a much higher rated power supply to test for wattage?

Someone suggested that a GTX 280 used 236W.

I showed two tests - from two good sources - that showed it doesn't. Both tests show the entire system running at under 300W.

Are Anandtech and Tom's Hardware unreliable?

No, they are not unreliable, but they do give the impression to novice users that you can use a cheap 300/350 watt PSU to run a high-end gaming system just because it only draws 275 watts from the wall. It depends on the 12V line amperage, PSU efficiency, and the ambient temperature in which these components operate.

The OP has a 300 watt (el cheapo) PSU and should not even consider running anything more than a 4770, if even that.

Can a quad-core 2.4 Intel CPU with 3-4 HDDs, 2 tuners, and 4GB of RAM be sufficiently powered by an Acer E700's default 300W PSU once one adds an nVidia 8800GTS?

So many people said it couldn't be done, but I've been happily running that system (and later variants) for quite some time. Those who also bought an 8800GTS/320 and put it into that system also reported that it runs great.

Why does PSU efficiency have anything to do with this? It seems like you're now saying it has _nothing_ to do with wattage (which I agree with - but PSU efficiency is usually tied to wattage) and everything to do with the 12V rail power - is that right?

PSU efficiency has nothing to do with wattage.

I have a 350 watt, high-quality PSU with a 26 amp 12V rail.
Now correct me if I'm wrong, but I believe the motherboard, hard drives, and video cards draw off the 12V line. I find it hard to believe that a quad-core CPU with 5 hard drives and a motherboard will run off a 300 watt PSU's 12V line for more than a few minutes unless it has a high-amperage 12V rail. My 26 amp rail is rated at 300+ watts. An 8800GTS's power draw is much less than a GTX 280's, also.
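As a quick sanity check, the 12V capacity being described is just volts times amps. A small Python sketch, using the 26 amp and 350 watt figures from the post above as example inputs:

# Sketch: convert a 12 V rail's current rating into watts and compare it
# against the PSU's label wattage. The 26 A / 350 W figures are from the post.
rail_volts = 12.0
rail_amps = 26.0
psu_label_watts = 350.0

rail_watts = rail_volts * rail_amps   # 12 V * 26 A = 312 W on the 12 V rail
print(f"12 V rail capacity: {rail_watts:.0f} W, "
      f"{rail_watts / psu_label_watts:.0%} of the {psu_label_watts:.0f} W label")

A cheaper 300 W unit with a weaker 12V rail would have correspondingly less of its label rating available to the CPU, drives, and graphics card.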
 

yh125d

Diamond Member
Dec 23, 2006
6,886
0
76
Originally posted by: dclive
Originally posted by: Azn

Which GPU is rated to 236 watts?

The GTX 280 is rated at 236 watts.

Per Tom's Hardware, a fully loaded system with a GTX 285, high-end i7 CPU, RAM, 10,000 RPM hard drives, etc... takes about 260 watts from the wall, total.

http://www.tomshardware.com/re...e-gtx-275,2266-14.html

(Anandtech's data is almost exactly the same...)

Again, the only load present is GPU load; the CPU is barely being used. If you read Anandtech's disclaimer (I assume you're referring to this article: http://www.anandtech.com/video...wdoc.aspx?i=3520&p=11), they noted that the CPU and memory were not being taxed, and to expect an extra 50-100W when gaming, for example.
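Putting numbers on that caveat, a tiny Python sketch adding Anandtech's suggested headroom to the GPU-load wall reading quoted earlier; the 260 W and 50-100 W figures come from the posts above, the rest is illustration:

# Sketch: pad a GPU-load-only wall measurement with the extra draw expected
# when the CPU and memory are also loaded (figures from the thread above).
gpu_load_wall_watts = 260.0          # whole-system wall reading, GPU stressed
extra_low, extra_high = 50.0, 100.0  # Anandtech's suggested gaming headroom

print(f"Estimated wall draw while gaming: "
      f"{gpu_load_wall_watts + extra_low:.0f}-{gpu_load_wall_watts + extra_high:.0f} W")

That lands in the 310-360 W range at the wall, which is why the margin on a 300/350 W unit looks much thinner once the whole system is loaded.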
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Originally posted by: yh125d
Originally posted by: dclive
Originally posted by: Azn

Which GPU is rated to 236 watts?

The GTX 280 is rated at 236 watts.

Per Tom's Hardware, a fully loaded system with a GTX 285, high-end i7 CPU, RAM, 10,000 RPM hard drives, etc... takes about 260 watts from the wall, total.

http://www.tomshardware.com/re...e-gtx-275,2266-14.html

(Anandtech's data is almost exactly the same...)

Again, the only load present is GPU load; the CPU is barely being used. If you read Anandtech's disclaimer (I assume you're referring to this article: http://www.anandtech.com/video...wdoc.aspx?i=3520&p=11), they noted that the CPU and memory were not being taxed, and to expect an extra 50-100W when gaming, for example.

Very good point, I missed that. :thumbsup:
 

yh125d

Diamond Member
Dec 23, 2006
6,886
0
76
Originally posted by: Azn
Originally posted by: dclive
Originally posted by: Azn


Why don't you explain to me why I get random reboots when I raise the voltage on my CPU with a 450 watt power supply with two 12V rails @ max 30A? I didn't get random reboots when I had a G92 8800GTS in the same system when I raised the voltage, BTW, nor do I get random reboots at default voltage.

A GTX 280 doesn't need 236W by itself, which is the only point I'm rebutting. As one can see by looking at the Anandtech and Tom's Hardware reviews, the GTX 280 / 285 likely only needs a fraction of that 236W because an entire system only needs about 260W of power - and that's with a pretty modern system being powered.

If your system needs more power due to overclocking a CPU or whatnot, go for it. That has nothing to do with the GTX 280/285's power requirements.

Raising the voltage 10% raises power usage on the CPU itself by 20%.

False. Voltage and power are linearly related; a 10% increase in voltage is a 10% increase in power when current is constant.
 

yh125d

Diamond Member
Dec 23, 2006
6,886
0
76
Originally posted by: dclive
Originally posted by: Azn

I said it was rated at that on paper. I didn't say it uses 236 watts in real life. How many watts it uses will fluctuate depending on the situation.

Can you detail (or even guess at) a condition under which it would hit 236W, given that both THW and AT stress-tested their systems (with newer CPUs and with older ones) and, for the entire SYSTEM, only required a little bit more power than that - 260W for the entire SYSTEM (in the case of Tom's results)?

Where is it rated at 236W? I'd be interested in reading that paper. Is that in nVidia's whitepapers, or...?

That seems really high, because per http://techreport.com/articles.x/16504, they're saying that the max that a single-plug PCIe board can use is 150W. I have no other information on the topic, but it's an interesting discussion point.

http://www.nvidia.com/object/p...eforce_gtx_280_us.html

Scroll down to "Maximum Graphics Card Power (W) - 236w".


A GTX 280 could *theoretically* pull 300W from its power sources before going above their specs: 75W from the PCIe slot, 75W from the 6-pin connector, and 150W from the 8-pin connector.
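That 300 W ceiling is just the sum of the PCIe power-delivery limits. A small Python sketch of the budget per connector configuration, using the per-connector limits described in the posts above:

# Sketch: maximum power a graphics card may draw while staying within
# PCIe power-delivery limits, per connector configuration.
PCIE_SLOT_W = 75     # x16 slot
SIX_PIN_W = 75       # per 6-pin PCIe power connector
EIGHT_PIN_W = 150    # per 8-pin PCIe power connector

def board_power_budget(six_pins: int, eight_pins: int) -> int:
    return PCIE_SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

print(board_power_budget(1, 1))   # GTX 280: slot + 6-pin + 8-pin = 300 W
print(board_power_budget(2, 0))   # a two-6-pin card: 225 W
print(board_power_budget(0, 0))   # a slot-only card: 75 W

As noted above, these are spec ceilings, not measured draw; a card's real consumption under load is what the AT and THW wall readings are trying to capture.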
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: yh125d
Originally posted by: Azn
Originally posted by: dclive
Originally posted by: Azn


Why don't you explain to me why I get random reboots when I raise the voltage on my CPU with a 450 watt power supply with two 12V rails @ max 30A? I didn't get random reboots when I had a G92 8800GTS in the same system when I raised the voltage, BTW, nor do I get random reboots at default voltage.

A GTX 280 doesn't need 236W by itself, which is the only point I'm rebutting. As one can see by looking at the Anandtech and Tom's Hardware reviews, the GTX 280 / 285 likely only needs a fraction of that 236W because an entire system only needs about 260W of power - and that's with a pretty modern system being powered.

If your system needs more power due to overclocking a CPU or whatnot, go for it. That has nothing to do with the GTX 280/285's power requirements.

Raising the voltage 10% raises power usage on the CPU itself by 20%.

False. Voltage and power are linearly related; a 10% increase in voltage is a 10% increase in power when current is constant.

You might want to research a bit more. I forget the web page where I read this, but it's 20%.

Here's a CPU calculator that shows 20% as well...

http://bakkap.free.fr/Misc/wCalc.html
 

yh125d

Diamond Member
Dec 23, 2006
6,886
0
76
Originally posted by: dclive
Originally posted by: happy medium
Originally posted by: dclive
Originally posted by: Azn


So tell me, do you use a 300/350 watt power supply to power a GTX 280? If you don't, why do you suggest running such a power supply with such video cards when you haven't had any experience with it, other than reading someone else's testing, which doesn't tell much beyond the fact that they are running a much higher rated power supply to test for wattage?

Someone suggested that a GTX 280 used 236W.

I showed two tests - from two good sources - that showed it doesn't. Both tests show the entire system running at under 300W.

Are Anandtech and Tom's Hardware unreliable?

No, they are not unreliable, but they do give the impression to novice users that you can use a cheap 300/350 watt PSU to run a high-end gaming system just because it only draws 275 watts from the wall. It depends on the 12V line amperage, PSU efficiency, and the ambient temperature in which these components operate.

The OP has a 300 watt (el cheapo) PSU and should not even consider running anything more than a 4770, if even that.

Why does PSU efficiency have anything to do with this? It seems like you're now saying it has _nothing_ to do with wattage (which I agree with - but PSU efficiency is usually tied to wattage) and everything to do with the 12V rail power - is that right?

Efficiency always matters, especially when you're judging based on AC draw. For example...

Systems A and B both require 250W DC at full load. System A has an 85% efficient PSU at that load; System B has a 75% efficient PSU at that load. System A will pull 250W / 0.85 = 294W AC from the wall. System B will pull 250W / 0.75 = 333W AC from the wall. By only changing efficiency, System B appears to be using about 40W more than System A, when it's really using the same power.


Also, an efficient PSU will run cooler, as more of the AC power is being converted into DC power rather than heat. Cooler-running PSUs deliver cleaner, more stable, quieter power and operation.
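The System A / System B arithmetic is easy to reproduce; a minimal Python sketch with the 250 W load and 85% / 75% efficiencies from the post above:

# Sketch: AC wall draw for the same DC load at two different PSU efficiencies,
# reproducing the System A / System B example above.
def wall_draw(dc_load_watts: float, efficiency: float) -> float:
    return dc_load_watts / efficiency

dc_load = 250.0
for name, eff in (("System A", 0.85), ("System B", 0.75)):
    print(f"{name}: {wall_draw(dc_load, eff):.0f} W at the wall")
# Same 250 W of DC load, but roughly a 40 W difference at the wall.

This is also why a wall-meter reading alone can't be compared directly against a PSU's DC rating without knowing (or assuming) the efficiency at that load.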
 

yh125d

Diamond Member
Dec 23, 2006
6,886
0
76
Originally posted by: Azn


You might want to research a bit more. I forget the web page where I read this, but it's 20%.

Here's a CPU calculator that shows 20% as well...

http://bakkap.free.fr/Misc/wCalc.html

Flimsy internet calculators mean nothing compared to Joule's law.

http://en.wikipedia.org/wiki/Joule%27s_Law


Power = Potential Difference (voltage) x Current. 10% increase in voltage will ALWAYS mean a 10% increase in power. Always. There is no maybe. There are no exceptions.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: yh125d
Originally posted by: Azn


You might want to research a bit more. I forget the web page where I read this, but it's 20%.

Here's a CPU calculator that shows 20% as well...

http://bakkap.free.fr/Misc/wCalc.html

Flimsy internet calculators mean nothing compared to Joule's law.

http://en.wikipedia.org/wiki/Joule%27s_Law


Power = Potential Difference (voltage) x Current. 10% increase in voltage will ALWAYS mean a 10% increase in power. Always. There is no maybe. There are no exceptions.

How can you be so sure? There are hundreds of calculators that say 20%. So are they all wrong about determining the wattage? You don't think they know about Joule's law?

As far as I'm concerned, the math goes something like this...

65W x (1.4575/1.325)^2 = 78W

I'll look for that article in the process.
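For what it's worth, the disagreement here comes down to what is held constant. P = V x I is exact, but if the current also rises with the voltage (as with CMOS switching power, roughly P ~ V^2), the increase is closer to quadratic. A Python sketch comparing the two assumptions for the voltages in this post; which model best describes a real CPU is exactly what is being argued:

# Sketch: power increase for a 1.325 V -> 1.4575 V bump under two different
# assumptions about what the current does. Base power and voltages from the post.
base_power = 65.0
v_old, v_new = 1.325, 1.4575
ratio = v_new / v_old                        # = 1.10, a 10% voltage bump

constant_current = base_power * ratio        # P = V * I with I fixed: ~+10%
v_squared = base_power * ratio ** 2          # P ~ V^2 (constant R / CMOS-style): ~+21%

print(f"Current held constant: {constant_current:.1f} W")
print(f"P ~ V^2 scaling:       {v_squared:.1f} W")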
 

brblx

Diamond Member
Mar 23, 2009
5,499
2
0
You're wrong.

It's more Ohm's law than Joule's law, though; I guess either one arrives at the other. V (voltage) = I (current) x R (resistance), and P (wattage) = I x V.

Increasing the voltage does not magically double anything; it's linear. By your calculations, using P = I x V with the current held at 1, 1 x 1 = 1 but 2 x 1 would somehow equal 4. Sorry, no.