Confirmation on R600 power requirements


bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
Originally posted by: Gstanfor
And I don't normally carry vendettas at all, but the fanatics, and those within and without ATi who supported them, are a special case. Not even the Intel shills at the introduction of the K7 were as bad. I guess everyone has something that gets his goat. Fanatics are what get mine.

You do realize that you're the equivalent of an nVidia fanatic, right?
 

tuteja1986

Diamond Member
Jun 1, 2005
3,676
0
0
R600 doesn't consume 300W!!! These reports are lies; AMD came out and said the R600s they are demonstrating at GDC consume 200W each. Also, ATI didn't want to delay the R600 GPU... it was at the insistence of AMD's Henri Richard.
 

MmmSkyscraper

Diamond Member
Jul 6, 2004
9,472
1
76
Originally posted by: munky
The claims of this article are too ridiculous to be true. It's been long rumored that the retail version is shorter, and yet these guys claim it will be over 13 inches long...:roll:

I know of something else that's 13 inches long :D

Oh, that's shens too :(

 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
HardOCP lost all credibility when they said the AMD X2s were as good as Core 2 Duos, but apart from that...

That would explain why a chip as large and hot as R600, supposedly featuring a 512-bit bus, is reportedly having trouble keeping up with G80.

Having trouble? Heh... At this point I bet it will be a good 50% faster, but it's so late it doesn't matter anyway.
 

nrb

Member
Feb 22, 2006
75
0
0
Originally posted by: Matt2
AMD also announced a January release.
No they didn't. They announced an April release. That was their only firm announcement on a release date. Yes, they certainly did intend to launch in January at one time, but it was never actually announced.

 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
Originally posted by: Gstanfor
Originally posted by: jim1976
Wait a minute..
If I'm seeing this correctly, this is an Iwill server board with dual Xeons, if I'm not mistaken..
Who on earth would try X-fire on a server board? Is this a fake one or an OEM version, and are they BSing us? :p

You may remember I linked to an AMD interview recently where Ruiz said AMD were focussed on strong server performance.

The thing is, AMD may be ***TOO FOCUSSED*** on server performance and not focussed enough on consumer performance.

I fear that so far as AMD is concerned, R600 is merely a data cruncher to accelerate servers and not a GPU...

Oh well, at least the fanatics will have a physical representation of their e-penis egos...

:confused:

Nm...
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: gersson
Originally posted by: Gstanfor
Originally posted by: jim1976
Wait a minute..
If I'm seeing this correctly, this is an Iwill server board with dual Xeons, if I'm not mistaken..
Who on earth would try X-fire on a server board? Is this a fake one or an OEM version, and are they BSing us? :p

You may remember I linked to an AMD interview recently where Ruiz said AMD were focussed on strong server performance.

The thing is, AMD may be ***TOO FOCUSSED*** on server performance and not focussed enough on consumer performance.

I fear that so far as AMD is concerned, R600 is merely a data cruncher to accelerate servers and not a GPU...

Oh well, at least the fanatics will have a physical representation of their e-penis egos...

OMG you are as annoying as your avatar...

Can you not be such an nVidia fanboy? It makes you look stoopid.

Agreed
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
At this point the R600 is the Duke Nukem Forever of video cards. It takes 300W or 200W. It's a foot long or it's not. It's coming out in December 2006 or July 2007. It will beat the 8800 or it won't. Whatever; by the time they do get it out the door, it will be outdated.
 

Mem

Lifer
Apr 23, 2000
21,476
13
81
Originally posted by: Aberforth
it means huge electricity bills too...

It means a lot of people won't buy it. I'm looking for a nice new video card with around 8800 GTS power requirements.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: tuteja1986
R600 doesn't consume 300W!!! These reports are lies; AMD came out and said the R600s they are demonstrating at GDC consume 200W each. Also, ATI didn't want to delay the R600 GPU... it was at the insistence of AMD's Henri Richard.

While 300W certainly could be incorrect, did you read how that statement was worded?

"In the configuration demoed in San Francisco, each R600 is said to have used 200W..."

I'm not sure that's any kind of "definitive" statement...

Besides, 300W is better. At 200W it would be hard for me to get the word "porn" in the subtitle :D
 

HannibalX

Diamond Member
May 12, 2000
9,359
2
0
Ojhhh Noeess!!11 Change is bad!! Run!!! We want to run our CRAY SUPER COMPUTER on our 300 watt no name brand PSU!!! WHY CAN'T WE!?!?!?!!!111oenelolwtfbbq!!
 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
Originally posted by: Pale Rider
Ojhhh Noeess!!11 Change is bad!! Run!!! We want to run our CRAY SUPER COMPUTER on our 300 watt no name brand PSU!!! WHY CAN'T WE!?!?!?!!!111oenelolwtfbbq!!

You do realize that this has nothing to do with the purchase of a high-end PSU, don't you? Anyone who doesn't invest in a good PSU and risks his high-end rig is irrational, to say the least. That's not the point... The point is your electricity bill every month...
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: jim1976
Originally posted by: Pale Rider
Ojhhh Noeess!!11 Change is bad!! Run!!! We want to run our CRAY SUPER COMPUTER on our 300 watt no name brand PSU!!! WHY CAN'T WE!?!?!?!!!111oenelolwtfbbq!!

You do realize that this has nothing to do with the purchase of a high-end PSU, don't you? Anyone who doesn't invest in a good PSU and risks his high-end rig is irrational, to say the least. That's not the point... The point is your electricity bill every month...

Not really. A video card is only drawing maximum power when engaged in some sort of 3D rendering. For web browsing, word processing and other 2D functions, the card is using significantly less electricity. In the case of the 8800GTX, that can mean an average of 100 watts less at idle than during gaming. So if you take the number of hours spent actually playing games per month and divide that by 10, you'll come up with the number of kilowatt hours used while in 3D mode.

The average residential cost in the US in 2006 per kilowatt-hour was 9.86¢. This means that if you spent the national average of 8 hours a week gaming, your total cost per month would be $3.15 minus whatever the cost you were paying with your previous, less power hungry card. All in all, it would amount to a very small price to pay considering the increased level of graphics you would be enjoying.
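
For readers who want to plug in their own numbers, here is a minimal Python sketch of the cost arithmetic Creig describes; the 100 W gaming delta, 8 hours a week, and 9.86¢/kWh rate are just the assumptions quoted above, not measured values.

# A minimal sketch (Python) of the electricity-cost arithmetic above.
# The wattage delta, weekly gaming hours, and 9.86 c/kWh rate are the
# thread's quoted assumptions, not measurements; swap in your own numbers.

def monthly_gaming_cost(extra_watts=100.0, hours_per_week=8.0, cents_per_kwh=9.86):
    """Added monthly electricity cost, in dollars, of a hungrier card."""
    hours_per_month = hours_per_week * 52 / 12            # ~4.33 weeks per month
    kwh_per_month = (extra_watts / 1000.0) * hours_per_month
    return kwh_per_month * cents_per_kwh / 100.0

print(f"100 W, 8 h/week:  ${monthly_gaming_cost():.2f}/month")          # roughly $0.34
print(f"300 W, 24 h/week: ${monthly_gaming_cost(300, 24):.2f}/month")   # roughly $3.08

With those defaults the extra draw works out to a few tens of cents a month, and even a 300 W delta over 24 hours a week of gaming stays around $3 a month, which is the broader point being made.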
 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
Originally posted by: Creig
Originally posted by: jim1976
Originally posted by: Pale Rider
Ojhhh Noeess!!11 Change is bad!! Run!!! We want to run our CRAY SUPER COMPUTER on our 300 watt no name brand PSU!!! WHY CAN'T WE!?!?!?!!!111oenelolwtfbbq!!

You do realize that this has nothing to do with the purchase of a high-end PSU, don't you? Anyone who doesn't invest in a good PSU and risks his high-end rig is irrational, to say the least. That's not the point... The point is your electricity bill every month...

Not really. A video card is only drawing maximum power when engaged in some sort of 3D rendering. For web browsing, word processing and other 2D functions, the card is using significantly less electricity. In the case of the 8800GTX, that can mean an average of 100 watts less at idle than during gaming. So if you take the number of hours spent actually playing games per month and divide that by 10, you'll come up with the number of kilowatt hours used while in 3D mode.

The average residential cost in the US in 2006 per kilowatt-hour was 9.86¢. This means that if you spent the national average of 8 hours a week gaming, your total cost per month would be $3.15 minus whatever the cost you were paying with your previous, less power hungry card. All in all, it would amount to a very small price to pay considering the increased level of graphics you would be enjoying.

Creig, of course this amounts to only a small change in the bill. Nobody here is accusing ATI of being the only company driving up our bills. But for some, even this change is significant. We do have overclocked systems, and certainly many of our household devices consume far more power than our PCs. The point is that ATI has generally not been as power-friendly and efficient as nVIDIA over the last few generations, with some exceptions. And while this doesn't make much of a difference overall, they should be looking for better solutions.
Besides, many users don't just use their GPU for gaming, but for other applications as well that require 3D mode.
I think a rational user isn't trying to bash ATI for its power requirements, but if I had to choose between two similar products, performance- and spec-wise, I'd definitely go the nVIDIA way.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Oh, I don't disagree. Given two video cards that offer identical performance in all respects except power consumption, I'd take the one that uses less electricity as well. The nice thing is that less power consumption also means less heat generated.

And some people do use their cards for things other than gaming (3D rendering, F@H, etc.), which can mean hours, days or even months of having the GPU run flat out at 100% usage. In these instances, the electricity consumption will be much higher.

I just wanted to point out that, for the average user, the monthly electric bill shouldn't really be a factor in choosing a video card.
 

thilanliyan

Lifer
Jun 21, 2005
12,019
2,235
126
Originally posted by: 40sTheme
Originally posted by: thilan29
Originally posted by: CrystalBay
GDC, GoW Screens

HOLY MOTHER OF..........!!!!!!!!!!!!!!!!
IF that is for real (and is not just an ingame cinematic)...WOW. :Q :confused: :D

I hope GoW on my 8800 will look like that.

That's not GoW; that's Battlefield: Bad Company.

Is that a brand new game or a sequel to something else?? Still...it looks amazing if that's in-game screens.
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: thilan29
Originally posted by: 40sTheme
Originally posted by: thilan29
Originally posted by: CrystalBay
GDC, GoW Screens

HOLY MOTHER OF..........!!!!!!!!!!!!!!!!
IF that is for real (and is not just an ingame cinematic)...WOW. :Q :confused: :D

I hope GoW on my 8800 will look like that.

That's not GoW; that's Battlefield: Bad Company.

Is that a brand new game or a sequel to something else?? Still...it looks amazing if that's in-game screens.

It's the sequel to the console versions of BF2; AFAIK it's not coming to the PC.

 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: Creig
I just wanted to point out that, for the average user, the monthly electric bill shouldn't really be a factor in choosing a video card.

I disagree. I'm sure there will be quite a few disgruntled parents who won't buy AMD products again after searching their house for the radiant heater that seemed to have been left on 24/7, only to discover it was little Johnny's AMD-based computer chewing through power like there was no tomorrow.
 

Praxis1452

Platinum Member
Jan 31, 2006
2,197
0
0
Originally posted by: Gstanfor
Originally posted by: Creig
I just wanted to point out that, for the average user, the monthly electric bill shouldn't really be a factor in choosing a video card.

I disagree. I'm sure there will be quite a few disgruntled parents who won't buy AMD products again after searching their house for the radiant heater that seemed to have been left on 24/7, only to discover it was little Johnny's AMD-based computer chewing through power like there was no tomorrow.

Umm, OK, so triple Creig's calculations: 8 x 3 = 24 hours a week at 100W. Hmm... about $10 a month.
R600 = $650... 10/650 = 1/65.

Oh, shut up. You're sure of ******, are you? Quite a few disgruntled parents? If parents are spending $650 on a video card, $10 will not be much. In fact, 300W isn't a radiant heater. When I use my oil heater, 600W is barely warm; I have to go to 900W to actually warm my room.

900W running 24/7 is roughly $15 a week at 9.86 cents a kWh.

BTW, I'll bet R600 is going to use closer to 200W; they respun it for a reason. If so, that's 20W more than your precious 8800GTX, though the gap will widen with the 8900GTX, and it still might grow because of clock speed.

Gstanfor, you never change. I used to want to reply to every post you make, since it's almost always full of crap. Sadly, I've realized that you're mostly worthless.


 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
You know, not all countries in the world enjoy the same cheap energy prices the USA does. AMD has traditionally been very big in those countries because they offer value for money. People accustomed to thinking they can just buy AMD and be assured of a cheap purchase price, long platform life and low running costs versus the competition are in for a rude awakening, and AMD will see their market share evaporate as a result (it's already happening on the CPU side thanks to AMD's incessant socket-changing fetish).
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: Gstanfor
The fanatics had their fun with NV30; I don't see why nVidia supporters aren't allowed to return the favor.

And apart from my last paragraph, there is nothing idiotic about what I wrote. ATi fans may not like it, but that's hardly my problem.

Your inability to rise above it is actually sadder than any of the ATI fanatics you have problems with. You are in no position to say anything about anything, at any time.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: Creig
Originally posted by: jim1976
Originally posted by: Pale Rider
Ojhhh Noeess!!11 Change is bad!! Run!!! We want to run our CRAY SUPER COMPUTER on our 300 watt no name brand PSU!!! WHY CAN'T WE!?!?!?!!!111oenelolwtfbbq!!

You do realize that this has nothing to do with the purchase of a high-end PSU, don't you? Anyone who doesn't invest in a good PSU and risks his high-end rig is irrational, to say the least. That's not the point... The point is your electricity bill every month...

Not really. A video card is only drawing maximum power when engaged in some sort of 3D rendering. For web browsing, word processing and other 2D functions, the card is using significantly less electricity. In the case of the 8800GTX, that can mean an average of 100 watts less at idle than during gaming. So if you take the number of hours spent actually playing games per month and divide that by 10, you'll come up with the number of kilowatt hours used while in 3D mode.

The average residential cost in the US in 2006 per kilowatt-hour was 9.86¢. This means that if you spent the national average of 8 hours a week gaming, your total cost per month would be $3.15 minus whatever the cost you were paying with your previous, less power hungry card. All in all, it would amount to a very small price to pay considering the increased level of graphics you would be enjoying.

Right, so what you're saying is that somebody who games a lot on an R600 is in for steep electricity bills, as opposed to the web-surfing non-gamer, who would never own anything like an R600 in the first place.