confirmation on r600 power req


firewolfsm

Golden Member
Oct 16, 2005
1,848
29
91
Intel may not be changing sockets, but support for a particular motherboard only lasts a year or two, so it ends up having worse support than AMD.
 

OneOfTheseDays

Diamond Member
Jan 15, 2000
7,052
0
0
Originally posted by: keysplayr2003
Originally posted by: Creig
Originally posted by: jim1976
Originally posted by: Pale Rider
Ojhhh Noeess!!11 Change is bad!! Run!!! We want to run our CRAY SUPER COMPUTER on our 300 watt no name brand PSU!!! WHY CAN'T WE!?!?!?!!!111oenelolwtfbbq!!

You do realize that this has nothing to do with the purchase of a high end psu don't you? Anyone that doesn't invest in a good psu and risks his high end rig is irrational to say the least.. That's not the point.. The point is your electricity bill every month...

Not really. A video card is only drawing maximum power when engaged in some sort of 3D rendering. For web browsing, word processing and other 2D functions, the card is using significantly less electricity. In the case of the 8800GTX, that can mean an average of 100 watts less at idle than during gaming. So if you take the number of hours spent actually playing games per month and divide that by 10, you'll come up with the number of kilowatt hours used while in 3D mode.

The average residential cost in the US in 2006 per kilowatt-hour was 9.86¢. This means that if you spent the national average of 8 hours a week gaming, your total cost per month would be $3.15, minus whatever you were already paying with your previous, less power-hungry card. All in all, it would amount to a very small price to pay considering the increased level of graphics you would be enjoying.

Right, so what you're saying is that somebody who games a lot on an R600 is in for steep electricity bills, as opposed to the web-surfing non-gamer, who would never own anything like an R600 in the first place.

No, he's saying that for all but the most hardcore gamers, maximum power draw won't be experienced the majority of the time. If you are gaming 40 hrs a week, then you have more problems than a power-hungry graphics card.
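For anyone who wants to sanity-check the math quoted above, here is a minimal sketch of the kWh-and-cost arithmetic in Python. The defaults are assumptions lifted from the quoted post (a ~100 W idle-to-load difference, 9.86¢ per kWh, 8 hours of gaming a week); the exact dollar figure depends entirely on which wattage and rate you plug in, so treat it as an estimator rather than a verdict on anyone's numbers.

# Rough estimator for the monthly electricity cost of running a given load
# for a given number of hours per week. All inputs are assumptions; swap in
# your own card's draw and your local electricity rate.
def monthly_cost_dollars(watts, hours_per_week, cents_per_kwh=9.86):
    hours_per_month = hours_per_week * 52 / 12          # ~4.33 weeks per month
    kwh_per_month = watts / 1000.0 * hours_per_month    # kW x hours = kWh
    return kwh_per_month * cents_per_kwh / 100.0        # cents -> dollars

# Extra cost of the ~100 W a high-end card adds in 3D over idle, 8 h/week:
print(round(monthly_cost_dollars(100, 8), 2))
# Cost of a hypothetical 300 W card running flat out for the same hours:
print(round(monthly_cost_dollars(300, 8), 2))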
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: Sudheer Anne
Originally posted by: keysplayr2003
Originally posted by: Creig
Originally posted by: jim1976
Originally posted by: Pale Rider
Ojhhh Noeess!!11 Change is bad!! Run!!! We want to run our CRAY SUPER COMPUTER on our 300 watt no name brand PSU!!! WHY CAN'T WE!?!?!?!!!111oenelolwtfbbq!!

You do realize that this has nothing to do with the purchase of a high end psu don't you? Anyone that doesn't invest in a good psu and risks his high end rig is irrational to say the least.. That's not the point.. The point is your electricity bill every month...

Not really. A video card is only drawing maximum power when engaged in some sort of 3D rendering. For web browsing, word processing and other 2D functions, the card is using significantly less electricity. In the case of the 8800GTX, that can mean an average of 100 watts less at idle than during gaming. So if you take the number of hours spent actually playing games per month and divide that by 10, you'll come up with the number of kilowatt hours used while in 3D mode.

The average residential cost in the US in 2006 per kilowatt-hour was 9.86¢. This means that if you spent the national average of 8 hours a week gaming, your total cost per month would be $3.15, minus whatever you were already paying with your previous, less power-hungry card. All in all, it would amount to a very small price to pay considering the increased level of graphics you would be enjoying.

Right, so what you're saying is that somebody who games a lot on an R600 is in for steep electricity bills, as opposed to the web-surfing non-gamer, who would never own anything like an R600 in the first place.

No, he's saying that for all but the most hardcore gamers, maximum power draw won't be experienced the majority of the time. If you are gaming 40 hrs a week, then you have more problems than a power-hungry graphics card.

And would you say there are a lot of hardcore gamers here among us? Or no...? C'mon, let's be real here. People have been complaining about power usage forever.

I personally doubt the R600 uses 300W. Just like I doubt it uses 200W. Somewhere in the middle sounds right: 240-250W at load. Why? Because 300W is just ridiculous. It can't be 300.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Gstanfor

I disagree. I'm sure there will be quite a few disgruntled parents who won't buy AMD products again after searching their house for the radiant heater that seemed to have been left on 24/7, only to discover it was little Johnny's AMD-based computer chewing through power like there was no tomorrow.

RotFL

keep going ... you've almost captured Rollo's spirit ... :p

:D

it better *not* be 300w
 

TanisHalfElven

Diamond Member
Jun 29, 2001
3,512
0
76
Originally posted by: jim1976
Originally posted by: Pale Rider
Ojhhh Noeess!!11 Change is bad!! Run!!! We want to run our CRAY SUPER COMPUTER on our 300 watt no name brand PSU!!! WHY CAN'T WE!?!?!?!!!111oenelolwtfbbq!!

You do realize that this has nothing to do with the purchase of a high end psu don't you? Anyone that doesn't invest in a good psu and risks his high end rig is irrational to say the least.. That's not the point.. The point is your electricity bill every month...

dumb asses like you need to get one thing straight. poor ppl who can't pay a few more dollars in electricity bills don't buy 600 dollar cards or 100 dollar PSUs. get that through your ****** head you ****** moron.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: tanishalfelven
Originally posted by: jim1976
Originally posted by: Pale Rider
Ojhhh Noeess!!11 Change is bad!! Run!!! We want to run our CRAY SUPER COMPUTER on our 300 watt no name brand PSU!!! WHY CAN'T WE!?!?!?!!!111oenelolwtfbbq!!

You do realize that this has nothing to do with the purchase of a high end psu don't you? Anyone that doesn't invest in a good psu and risks his high end rig is irrational to say the least.. That's not the point.. The point is your electricity bill every month...

dumb asses like you need to get one thing straight. poor ppl who can't pay a few more dollars in electricity bills don't buy 600 dollar cards or 100 dollar PSUs. get that through your ****** head you ****** moron.

You'd be amazed. Trust me, you get plenty of low-income people who get some seasonal work, decide to upgrade their graphics or CPU and are surprised when their 550W Codegen PSU can't cope or their PCI-e card won't fit the PCI slot on their motherboard...
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
well, one thing is for certain. this is all good for psu makers.

psu's markup is only exceeded by cables, heh...
 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
Originally posted by: tanishalfelven
Originally posted by: jim1976
Originally posted by: Pale Rider
Ojhhh Noeess!!11 Change is bad!! Run!!! We want to run our CRAY SUPER COMPUTER on our 300 watt no name brand PSU!!! WHY CAN'T WE!?!?!?!!!111oenelolwtfbbq!!

You do realize that this has nothing to do with the purchase of a high end psu don't you? Anyone that doesn't invest in a good psu and risks his high end rig is irrational to say the least.. That's not the point.. The point is your electricity bill every month...

dumb asses like you need to get one thing straight. poor ppl who can't pay a few more dollars in electricity bills don't buy 600 dollar cards or 100 dollar PSUs. get that through your ****** head you ****** moron.

First of all try to control your prehistoric emotions and name calling.. Are you still living in a cave? Probably, by the style of your writing.. And what you claim is totally irrelevant, do you have the ability to comprehend a sentence rationally? Read again and tell me what's the purpose of your answer..
And yes, one might have the money to buy the card or the money to pay the electricity bill, but he might choose not to do so.. It's called choice, have you ever heard the word, genius? It doesn't mean that because I have money I don't care about my bills or anything else.. Got it now, or do you want me to spell it out for you? :disgust:

*edit: for the poor ppl, I really don't understand why you are mentioning them.. Since they can't afford it, why should they care about the power requirements of R600 in the first place, genius? The mainstream or low end cards are not a problem in the first place anyway.. WTF are you talking about? As for those that barely have the ability to buy a high end gpu and run it on a cheap psu, I told you my opinion.. Now tell me, what was the purpose of your stupid comment?
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: ronnn
Originally posted by: Matt2

AMD also announced a January release.

Link?

are you *denying* that r600 is delayed? :p

-it was "expected" just after g80's launch
-it was *scheduled* for CeBit ... and a Hard Launch

Even AMD's *partners* are praying for a miracle:

:Q

http://www.digitimes.com/mobos/a20070302VL201.html
Graphics partners try to keep the faith as AMD pulls the rug on CeBIT

Ricky Morris, DigiTimes.com, Taipei [Friday 2 March 2007]

Reports from Taiwan's graphics card manufacturing industry are revealing that AMD's recently announced delay of its upcoming flagship graphics processing unit (GPU), the R600, was not just a shock and disappointment to fans of the former-ATI, who have been waiting for graphics cards based on the new chip, but also to AMD's partners, who were set to introduce products based on the chip at CeBIT in March.

AMD and its partners were arranging to launch R600-based products at CeBIT, according to industry sources, and live demos were being planned by individual partners. These have now almost all been canceled after the announcement by AMD, with only a select few partners being allowed to show demo systems (provided by AMD) in private showings, while the rest will have to make do with showcasing old products or, at best, new variations of old products at one of the industry's largest trade shows, which starts in a little over two weeks, on March 15.

But it's not the inconvenience that's causing problems for the card makers, but rather the uncertainty that the late change of plans casts over AMD's ability to deliver in the graphics card market and its hope of being able to compete with Nvidia following the AMD-ATI merger. Nvidia launched its current high-end GPU in November 2006.

According to graphics card makers, the R600 is AMD's hope of getting back on top and the delay only serves to highlight how far behind the company has fallen. The fear that many card makers now face is that supporters of the ATI brand who have been waiting months for the new product will finally decide enough is enough, the effect of which would be reduced confidence, and therefore sales, of all ATI-branded products.

However, despite the doubts, it still appears that graphics card makers are backing AMD and hope that the company will succeed. "We have our fingers crossed," said one maker while another asked everyone to "keep the faith." With the R600 launch now scheduled for early in the second quarter amid added speculation that mainstream (R630) and low-end (R610) parts will launch at the same time, the new plans could result in AMD having a complete top-to-bottom product lineup at launch, a rare occurrence in the graphics market, but one that many industry experts believe may prove successful for the company.
do you need someone to draw a more descriptive "picture" for you?
:confused:

keep the "faith"

:Q

--especially when their big Partner AMD is acting as confused as hell and is putting out stupid FUD as their *lame excuse* for this disastrous delay
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Posting the same altered quote (major hard-on-the-eyes bolded text) over and over again is pathetic. Get some new material.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: jim1976


Creig, of course this applies to a small change in the bill.. Nobody is trying to accuse ATI here of being the only company that runs up our bill. But for some, even this change is significant.. We do have O/C systems and certainly many of our household devices consume much more electric power than our PCs.. The point is that ATI is generally not so "power friendly and efficient", as is obvious from the last generations compared to nVIDIA in general, with some exceptions.. And while this does not make much of a difference overall, they should be looking for better solutions..
Besides that, many users don't just use their GPU for gaming, but for other applications as well that require 3D mode..
I think a rational user is not trying to bash ATI for its power requirements, but if I am to choose between two similar products, performance/spec wise, then I'd definitely go the nVIDIA way..

While I totally agree that using less power is better, I am not so sure that, with Intel's and Nvidia's high idle usage, they are much if any better in daily power consumption. The G80 currently has the power pig award and it certainly has not hurt sales. Unfortunately the r600 could use twice as much and, if 40% faster in new games, people would still buy. Better tech is needed from both companies, not marketing spin about fps per frame.

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: ronnn
Posting the same altered quote (major hard-on-the-eyes bolded text) over and over again is pathetic. Get some new material.
why? ... it tends to shut up all but the most retarded fanboys ;)

no alteration whatsoever ... bolding does not alter the quote

and it is *much* better than posting pathetic fanboy excuses over-and-over
:roll:

the "quote" shows how disappointed AMD's PARTNERS are ...

Some are asking other partners to "keep the faith" ...

i always thought *prayer* was an act of desperation
:evil:

... in the business world :p
 

jdoggg12

Platinum Member
Aug 20, 2005
2,685
11
81
I think the R600 will use 322.75311 watts at all times, but i'm only basing that on the same crap pulled out of the same asses that most of the misinformation about the r600 comes from on this forum.

/instigation :)
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: jdoggg12
I think the R600 will use 322.75311 watts at all times, but i'm only basing that on the same crap pulled out of the same asses that most of the misinformation about the r600 comes from on this forum.

/instigation :)

Well that would certainly win the power pig award. :D
 

secretanchitman

Diamond Member
Apr 11, 2001
9,352
23
91
that's too much wattage for my taste, imo. even 200w is a bit too much.

that's one hell of a long card though.
 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
Originally posted by: ronnn
Originally posted by: jim1976


Creig, of course this applies to a small change in the bill.. Nobody is trying to accuse ATI here of being the only company that runs up our bill. But for some, even this change is significant.. We do have O/C systems and certainly many of our household devices consume much more electric power than our PCs.. The point is that ATI is generally not so "power friendly and efficient", as is obvious from the last generations compared to nVIDIA in general, with some exceptions.. And while this does not make much of a difference overall, they should be looking for better solutions..
Besides that, many users don't just use their GPU for gaming, but for other applications as well that require 3D mode..
I think a rational user is not trying to bash ATI for its power requirements, but if I am to choose between two similar products, performance/spec wise, then I'd definitely go the nVIDIA way..

While I totally agree that using less power is better, I am not so sure that, with Intel's and Nvidia's high idle usage, they are much if any better in daily power consumption. The G80 currently has the power pig award and it certainly has not hurt sales. Unfortunately the r600 could use twice as much and, if 40% faster in new games, people would still buy. Better tech is needed from both companies, not marketing spin about fps per frame.

Ronn, if R600 proves to be 40% faster, then by all means it deserves to use more power..
I just don't like this trend.. The problem is that ATI should produce a more power-friendly/efficient lineup.. Because if they continue like this they will eventually face a lot of problems, don't you agree?
ATM G80 is using more power than the previous lineup. But not that much compared to X1950XT/XTX, especially in Xfire.. If the rumors about R600 needing at least 220-240w per card turn out to be true, then this is a problem.. Eventually they should look at the power requirements for a change.. nVIDIA has proved much more efficient in almost every generation over the last 2-3 years.. Yes, slower in many instances, but less power-hungry and generating less heat.. This might not be so important for part of the enthusiast market, but for some it does play a significant role, don't you agree? And proportionally I think they are beyond the limits in power terms..
I really hope R600 proves us idiots and turns out to be an awesome gpu in all aspects. Honestly.. I want it.. And I will be one of the first to buy it...
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: ronnn
Posting the same altered quote (major hard-on-the-eyes bolded text) over and over again is pathetic. Get some new material.

OK ... how about this shocker

http://www.theinquirer.net/default.aspx?article=37982
AMD asks to double common stock to 1.5 billion shares

Not an anti-takeover move

By Cher Price: Saturday 03 March 2007, 15:37
CHIP FIRM AMD said its authorised capital stock is 750 million shares of common stock and 1,000,000 preferred shares.

But, it said in an SEC filing it made yesterday, that the board of directors approved an amendment to double the shares from 750 million to 1.5 billion. That needs shareholder consent.

The AMD SEC filing said the motivation wasn't to prevent a takeover. "Our Board of Directors believes that it is in the best interests of the stockholders for the Board of Directors to have the flexibility to issue additional shares of Common Stock in any or all of the above circumstances. Although the issuance of additional shares of Common Stock could, in certain instances, discourage an attempt by another person or entity to acquire control of us, we have not proposed the increase in the number of authorized shares of common stock with the intention of using the additional authorized shares for anti-takeover purposes."

But, to the extent that the additional shares are issued they might decrease shareholders' percentage equity and dilute voting rights, earnings, and book value. The board recommends that shareholders vote "for" this proposal.

The shareholders' meeting is on May 3rd. If shareholders don't vote, a "for" will be presumed.

Currently the following outfits own more than five per cent of common stock as of this coming Monday: Capital Research Management Company of LA (57,267,600), Oppenheimer (42,722,190) and AXA of Paris, France (28,182,589). µ

* MEANWHILE A SET of separate SEC filings was made yesterday. Harry Wolin disposed of 2,380 shares at $15.07 and took options* of 9,000 shares at $0; Martin Seyer disposed of 2,500 shares at $15.07 and acquired 9,000 options at $0; Hector Ruiz disposed of 16,663 shares at $15.07 and took options on 63,000 at $0; Robert River disposed of 18,000 shares at between $14.66 and $14.99 and took options of 18,000 at $0; Henri Richard disposed of 6,435 shares at $15.07 and took options of 18,000 at $0; Derrick Meyer disposed of 4,761 shares at $15.07 and took options of 18,000 at $0; and Thomas McCoy disposed of 12,000 shares at $15.0359 and took options of 12,000 at $0.

* CORRECTION These are restricted stock units (RSUs), less risky than options.
looks like an anti-takeover move to me .. *amazing timing* :p
:Q
of COURSE i'll post it in the *Delayed* thread ... where it really belongs ...
this thread should have never been started, anyway.

but since you asked

:evil:

 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: jim1976

Ronn, if R600 proves to be 40% faster, then by all means it deserves to use more power..
I just don't like this trend.. The problem is that ATI should produce a more power-friendly/efficient lineup.. Because if they continue like this they will eventually face a lot of problems, don't you agree?
ATM G80 is using more power than the previous lineup. But not that much compared to X1950XT/XTX, especially in Xfire.. If the rumors about R600 needing at least 220-240w per card turn out to be true, then this is a problem.. Eventually they should look at the power requirements for a change.. nVIDIA has proved much more efficient in almost every generation over the last 2-3 years.. Yes, slower in many instances, but less power-hungry and generating less heat.. This might not be so important for part of the enthusiast market, but for some it does play a significant role, don't you agree? And proportionally I think they are beyond the limits in power terms..
I really hope R600 proves us idiots and turns out to be an awesome gpu in all aspects. Honestly.. I want it.. And I will be one of the first to buy it...

The 200w rumour is too much! The fact is that every new generation is using more power. This is not good and can't sustain itself. If the r600 is faster at all, there will come situations where it is 40% faster - but that doesn't make it right. Just because the G80 only uses a bit more power than the r580 at load (quite a bit more at idle) doesn't make it more efficient. My point was that all of these companies have work to do here. Even vista idles at slightly higher consumption. This is insane - we should be reducing usage. Am personally thinking that maybe I should just get a ps3 or xbox360 - and a quiet, efficient multimedia machine.
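To make that point concrete, here is a small sketch comparing two hypothetical cards over a day. The wattages are invented for illustration (they are not measured figures for G80, R600, or anything else); the point is simply that idle draw, multiplied over the many hours a machine sits at the desktop, can outweigh a higher load draw.

# Daily energy for a card with given idle and load draws. All watt and hour
# values below are made-up illustration numbers, not measurements of any card.
def daily_kwh(idle_watts, load_watts, load_hours, total_hours=24):
    idle_hours = total_hours - load_hours
    return (idle_watts * idle_hours + load_watts * load_hours) / 1000.0

# Card A: modest load draw but high idle; Card B: higher load draw, low idle.
card_a = daily_kwh(idle_watts=70, load_watts=140, load_hours=2)
card_b = daily_kwh(idle_watts=40, load_watts=180, load_hours=2)
print(card_a, card_b)  # card B uses less energy per day despite the hotter load number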

 

TanisHalfElven

Diamond Member
Jun 29, 2001
3,512
0
76
Originally posted by: Gstanfor
Originally posted by: tanishalfelven
Originally posted by: jim1976
Originally posted by: Pale Rider
Ojhhh Noeess!!11 Change is bad!! Run!!! We want to run our CRAY SUPER COMPUTER on our 300 watt no name brand PSU!!! WHY CAN'T WE!?!?!?!!!111oenelolwtfbbq!!

You do realize that this has nothing to do with the purchase of a high end psu don't you? Anyone that doesn't invest in a good psu and risks his high end rig is irrational to say the least.. That's not the point.. The point is your electricity bill every month...

dumb asses like you need to get one thing straight. poor ppl who can't pay a few more dollars in electricity bills don't buy 600 dollar cards or 100 dollar PSUs. get that through your ****** head you ****** moron.

You'd be amazed. Trust me, you get plenty of low-income people who get some seasonal work, decide to upgrade their graphics or CPU and are surprised when their 550W Codegen PSU can't cope or their PCI-e card won't fit the PCI slot on their motherboard...

their fault for not researching. it's always more trouble to buy smart than to buy expensive. also poor ppl ALWAYS buy midrange/low end, and the midrange r600 will have regular power requirements.
 

TanisHalfElven

Diamond Member
Jun 29, 2001
3,512
0
76
Originally posted by: jim1976
Originally posted by: tanishalfelven
Originally posted by: jim1976
Originally posted by: Pale Rider
Ojhhh Noeess!!11 Change is bad!! Run!!! We want to run our CRAY SUPER COMPUTER on our 300 watt no name brand PSU!!! WHY CAN'T WE!?!?!?!!!111oenelolwtfbbq!!

You do realize that this has nothing to do with the purchase of a high end psu don't you? Anyone that doesn't invest in a good psu and risks his high end rig is irrational to say the least.. That's not the point.. The point is your electricity bill every month...

dumb asses like you need to get one thing straight. poor ppl who can't pay a few more dollars in electricity bills don't buy 600 dollar cards or 100 dollar PSUs. get that through your ****** head you ****** moron.

First of all try to control your prehistoric emotions and name calling.. Are you still living in a cave? Probably, by the style of your writing.. And what you claim is totally irrelevant, do you have the ability to comprehend a sentence rationally? Read again and tell me what's the purpose of your answer..
And yes, one might have the money to buy the card or the money to pay the electricity bill, but he might choose not to do so.. It's called choice, have you ever heard the word, genius? It doesn't mean that because I have money I don't care about my bills or anything else.. Got it now, or do you want me to spell it out for you? :disgust:

*edit: for the poor ppl, I really don't understand why you are mentioning them.. Since they can't afford it, why should they care about the power requirements of R600 in the first place, genius? The mainstream or low end cards are not a problem in the first place anyway.. WTF are you talking about? As for those that barely have the ability to buy a high end gpu and run it on a cheap psu, I told you my opinion.. Now tell me, what was the purpose of your stupid comment?

i was responding to ppl complaining over electricity bills. dumb ****** can't get it through their heads that ppl who buy expensive computer hardware don't care about a few more dollars in electricity.

the only reason high power requirements make any difference is the PSU (again, why are you buying a $600 card if you can't afford a reasonable $70-80 PSU) and, more importantly, overclocking and temperatures.
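On the PSU point, a rough headroom check is easy to sketch. The component wattages below are illustrative guesses rather than specs for any particular system, and the check ignores per-rail (12V) limits, which matter at least as much as the total rating.

# Back-of-the-envelope PSU headroom check. Component draws are hypothetical
# placeholder values; real sizing should also account for 12V rail limits.
def psu_headroom_ok(component_watts, psu_rating_watts, max_load_fraction=0.8):
    total = sum(component_watts.values())
    return total <= psu_rating_watts * max_load_fraction, total

fits, total = psu_headroom_ok(
    {"cpu": 95, "gpu": 240, "board_ram_drives": 80, "fans_misc": 25},
    psu_rating_watts=600,
)
print(f"{total} W estimated draw; within 80% of a 600 W unit: {fits}")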
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: jim1976
Ronn, if R600 proves to be 40% faster then by all means it deserves to use more power..
I just don't like this trend.. The problem is that ATI should produce a more power friendly/efficient lineup.. Because if they continue like this they will eventually face a lot of problems,don't you agree?

umm.. you mean kinda like intel w/ netburst? ;)

 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: tanishalfelven
their fault for not researching. it's always more trouble to buy smart than to buy expensive. also poor ppl ALWAYS buy midrange/low end, and the midrange r600 will have regular power requirements.
I can assure you that not all (or even most of them) always buy low/midrange, especially not when they pick up some seasonal/casual work.

I'd say it's more the fault of shonky computer store owners (and there are plenty of them) convincing them that a Codegen PSU and PC-Chips motherboard will handle anything they can throw at it (most small computer stores stock nothing *but* Codegen PSUs)...
 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
Originally posted by: tanishalfelven
Originally posted by: jim1976
Originally posted by: tanishalfelven
Originally posted by: jim1976
Originally posted by: Pale Rider
Ojhhh Noeess!!11 Change is bad!! Run!!! We want to run our CRAY SUPER COMPUTER on our 300 watt no name brand PSU!!! WHY CAN'T WE!?!?!?!!!111oenelolwtfbbq!!

You do realize that this has nothing to do with the purchase of a high end psu don't you? Anyone that doesn't invest in a good psu and risks his high end rig is irrational to say the least.. That's not the point.. The point is your electricity bill every month...

dumb asses like you need to get one thing straight. poor ppl who can't pay a few more dollars in electricity bills don't buy 600 dollar cards or 100 dollar PSUs. get that through your ****** head you ****** moron.

First of all try to control your prehistoric emotions and name calling.. Are you still living in a cave? Probably, by the style of your writing.. And what you claim is totally irrelevant, do you have the ability to comprehend a sentence rationally? Read again and tell me what's the purpose of your answer..
And yes, one might have the money to buy the card or the money to pay the electricity bill, but he might choose not to do so.. It's called choice, have you ever heard the word, genius? It doesn't mean that because I have money I don't care about my bills or anything else.. Got it now, or do you want me to spell it out for you? :disgust:

*edit: for the poor ppl, I really don't understand why you are mentioning them.. Since they can't afford it, why should they care about the power requirements of R600 in the first place, genius? The mainstream or low end cards are not a problem in the first place anyway.. WTF are you talking about? As for those that barely have the ability to buy a high end gpu and run it on a cheap psu, I told you my opinion.. Now tell me, what was the purpose of your stupid comment?

i was responding to ppl complaining over electricity bills. dumb ****** can't get it through their heads that ppl who buy expensive computer hardware don't care about a few more dollars in electricity.

the only reason high power requirements make any difference is the PSU (again, why are you buying a $600 card if you can't afford a reasonable $70-80 PSU) and, more importantly, overclocking and temperatures.

Pointless.. Read again, everything is answered in my previous post..