NV G70 To Be Power Hungry

fbrdphreak

Lifer
Apr 17, 2004
17,555
1
0
Nvidia asked PCI-SIG, the PCI and PCIe standardisation body, to provide some more juice for Nvidia's next-generation cards. It turns out that 75W from the external connector and 75W from the motherboard is not enough. Nvidia wants an additional 75W for its G70 card.
Link

:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q
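For anyone doing the math at home, here's a quick sketch of the budget arithmetic the story implies. The numbers are the per-source delivery ceilings from the article, not measured draw:

```python
# Back-of-the-envelope PCIe power budget implied by the article.
# These are delivery ceilings, not what a card actually draws.

PCIE_SLOT_W = 75         # a PCI-E x16 slot supplies up to 75W
AUX_CONNECTOR_W = 75     # each auxiliary connector supplies up to 75W

def power_ceiling(aux_connectors: int) -> int:
    """Maximum power available to a card with the given connector count."""
    return PCIE_SLOT_W + aux_connectors * AUX_CONNECTOR_W

print(power_ceiling(1))  # 150 -> today's limit, per the article
print(power_ceiling(2))  # 225 -> what Nvidia is reportedly asking for
```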
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
It's the Inquirer...aka Speculation Station. I severely doubt it will pull 225W, even under load. That would be about a 100W bump from the most power-hungry cards out there today, and running them in SLI would be a ludicrous proposition.

I doubt Nvidia is anywhere near stupid enough to shoot themselves in the foot in this manner.

I remember how everyone crapped a brick about the 6800U's power consumption until Nvidia noted that they were being cautious, and you could actually run the card with much less than they had rated it at.

Good companies have to inflate their specs to allow for stupid people who don't know what they're doing and would otherwise break their shiny new stuff.
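A rough sanity check on the SLI point, as a sketch: the ~125W current high-end figure is implied by the "100W bump" above, and the rest-of-system figure is an assumption for illustration.

```python
# Rough sanity check on the SLI arithmetic above. RUMORED_CARD_W comes
# from the article; the other two figures are assumptions.

RUMORED_CARD_W = 225      # the rumored per-card ceiling
CURRENT_HIGH_END_W = 125  # assumed: 225W minus the "about a 100W bump"
REST_OF_SYSTEM_W = 200    # assumed: CPU, motherboard, drives, fans

for cards in (1, 2):      # single card vs. SLI
    total = cards * RUMORED_CARD_W + REST_OF_SYSTEM_W
    print(f"{cards} card(s): roughly {total}W total system draw")
# Two cards at the rumored ceiling push past 600W total, hence
# "ludicrous" next to the ~400W PSUs typical of 2005-era systems.
```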
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
that's scary :p

can you just buy two quality 300W PSUs and use them both in your rig, cos that will be far cheaper than the monster PSUs you will need to run that alone?
 

shabby

Diamond Member
Oct 9, 1999
5,782
45
91
Forget about SLI then, unless you get an "SLI only" secondary PSU with 4 6-pin plugs.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: fbrdphreak
Nvidia asked PCI-SIG, the PCI and PCIe standardisation body, to provide some more juice for Nvidia's next-generation cards. It turns out that 75W from the external connector and 75W from the motherboard is not enough. Nvidia wants an additional 75W for its G70 card.
Link

:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q

Ummm dude? 225W would be the power at the G70's disposal with 75W from the PCI-E bus and 2 external 75W connectors. This does not mean the card will use it all. I do wish folks would use a little common sense when reading these things. And I also wish the Inquirer would shrivel up and fold, because they publish misleading crap like this.
225W would just be what is at the card's disposal. I'm guessing Nvidia would rather have more than enough than not enough, e.g. 225W instead of 150W.

Nothing against you, Pontiac boy. It's just that whoever reads that is going to "oooohhh & ahhhhhh".

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Beyond that, who cares?

Even if that speculation is correct, I'm guessing people who want what it has to offer will pay?

The rest will post "That thing uses too much power/heat/noise/space! My 9800Pro was good enough in 2003, it's good enough today!"
 

MDE

Lifer
Jul 17, 2003
13,199
1
81
Originally posted by: keysplayr2003
Originally posted by: fbrdphreak
Nvidia asked PCI-SIG, the PCI and PCIe standardisation body, to provide some more juice for Nvidia's next-generation cards. It turns out that 75W from the external connector and 75W from the motherboard is not enough. Nvidia wants an additional 75W for its G70 card.
Link

:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q

Ummm dude? 225W would be the power at the G70's disposal with 75W from the PCI-E bus and 2 external 75W connectors. This does not mean the card will use it all. I do wish folks would use a little common sense when reading these things. And I also wish the Inquirer would shrivel up and fold, because they publish misleading crap like this.
225W would just be what is at the card's disposal. I'm guessing Nvidia would rather have more than enough than not enough, e.g. 225W instead of 150W.

Exactly. The card may only use 135W, but Nvidia wants to be extra sure they have enough clean power going to the card without pushing the limits.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: keysplayr2003
Originally posted by: fbrdphreak
Nvidia asked PCI-SIG, the PCI and PCIe standardisation body, to provide some more juice for Nvidia's next-generation cards. It turns out that 75W from the external connector and 75W from the motherboard is not enough. Nvidia wants an additional 75W for its G70 card.
Link

:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q

Ummm dude? 225W would be the power at the G70's disposal with 75W from the PCI-E bus and 2 external 75W connectors. This does not mean the card will use it all. I do wish folks would use a little common sense when reading these things. And I also wish the Inquirer would shrivel up and fold, because they publish misleading crap like this.
225W would just be what is at the card's disposal. I'm guessing Nvidia would rather have more than enough than not enough, e.g. 225W instead of 150W.

Nothing against you, Pontiac boy. It's just that whoever reads that is going to "oooohhh & ahhhhhh".

I agree that the card won't necessarily use all 225W of power potentially available.

Here is an extract from nVidia's forthcoming GPU Gems 2 book, Chapter 29:
Power
Although smaller transistors require less power than larger ones, the number of transistors on a single processor die is rising faster than the rate at which power per transistor is falling. Consequently, each generation of processors requires more power: the ITRS estimates that the maximum power allowed for 2004 chips with a heat sink is 158 watts and will gradually rise to a ceiling of 198 watts by 2008. This power constraint will be one of the primary limitations of future processors; the future figure of merit may no longer be the number of operations per second but instead the number of operations per second per watt.
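As a minimal sketch of that "operations per second per watt" figure of merit (the sample card numbers below are made up, purely for illustration):

```python
# Performance-per-watt figure of merit from the extract above.
# Both sample cards below are hypothetical, purely for illustration.

def ops_per_sec_per_watt(gflops: float, watts: float) -> float:
    """Operations per second per watt, counting operations as GFLOPS."""
    return gflops * 1e9 / watts

fast_hot_card = ops_per_sec_per_watt(gflops=200.0, watts=150.0)
slow_cool_card = ops_per_sec_per_watt(gflops=120.0, watts=60.0)
print(f"{fast_hot_card:.2e} vs {slow_cool_card:.2e} ops/s per watt")
# The slower card wins on this metric despite losing on raw throughput.
```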
 

fbrdphreak

Lifer
Apr 17, 2004
17,555
1
0
Eh, I searched and didn't find anything about the G70 power consumption in Video.

Ummm dude? 225W would be the power at the G70's disposal with 75W from the PCI-E bus and 2 external 75W connectors. This does not mean the card will use it all. I do wish folks would use a little common sense when reading these things. And I also wish the Inquirer would shrivel up and fold, because they publish misleading crap like this.
225W would just be what is at the card's disposal. I'm guessing Nvidia would rather have more than enough than not enough, e.g. 225W instead of 150W.

Nothing against you, Pontiac boy. It's just that whoever reads that is going to "oooohhh & ahhhhhh".
Gee, thanks for the expert analysis. I was posting this for ppl who would be interested in knowing that Nvidia is trying to get PCI-SIG to add another power connector, due to the higher power consumption. I know it won't necessarily use all 225W. But thanks for assuming other people's ignorance due to your post count :roll:
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: fbrdphreak
Eh, I searched and didn't find anything about the G70 power consumption in Video.

Ummm dude? 225W would be the power at the G70's disposal with 75W from the PCI-E bus and 2 external 75W connectors. This does not mean the card will use it all. I do wish folks would use a little common sense when reading these things. And I also wish the Inquirer would shrivel up and fold, because they publish misleading crap like this.
225W would just be what is at the card's disposal. I'm guessing Nvidia would rather have more than enough than not enough, e.g. 225W instead of 150W.

Nothing against you, Pontiac boy. It's just that whoever reads that is going to "oooohhh & ahhhhhh".
Gee, thanks for the expert analysis. I was posting this for ppl who would be interested in knowing that Nvidia is trying to get PCI-SIG to add another power connector, due to the higher power consumption. I know it won't necessarily use all 225W. But thanks for assuming other people's ignorance due to your post count :roll:

You're welcome. But you'd never guess that from your thread topic, caption, and no fewer than 13 surprised smileys. Now would ya.... ;)

Edited to add winking smiley face for context.

 

Rent

Diamond Member
Aug 8, 2000
7,127
1
81
Originally posted by: keysplayr2003
Originally posted by: fbrdphreak
Eh, I searched and didn't find anything about the G70 power consumption in Video.

Ummm dude? 225W would be the power at the G70's disposal with 75W from the PCI-E bus and 2 external 75W connectors. This does not mean the card will use it all. I do wish folks would use a little common sense when reading these things. And I also wish the Inquirer would shrivel up and fold, because they publish misleading crap like this.
225W would just be what is at the card's disposal. I'm guessing Nvidia would rather have more than enough than not enough, e.g. 225W instead of 150W.

Nothing against you, Pontiac boy. It's just that whoever reads that is going to "oooohhh & ahhhhhh".
Gee, thanks for the expert analysis. I was posting this for ppl who would be interested in knowing that Nvidia is trying to get PCI-SIG to add another power connector, due to the higher power consumption. I know it won't necessarily use all 225W. But thanks for assuming other people's ignorance due to your post count :roll:

You're welcome. But you'd never guess that from your thread topic, caption, and no fewer than 13 surprised smileys. Now would ya.... ;)

Edited to add winking smiley face for context.

QFT :thumbsup:
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
G70 should be the best card available when released. The bad news is that we will all have to spend $600.00 to keep up. :beer:
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: ronnn
G70 should be the best card available when released. The bad news is that we will all have to spend $600.00 to keep up. :beer:

You must know something the rest of us don't? Why do you think a G70 will be better than an R520?

All I've seen are rumors.
 

McArra

Diamond Member
May 21, 2003
3,295
0
0
Originally posted by: Rollo
Originally posted by: ronnn
G70 should be the best card available when released. The bad news is that we will all have to spend $600.00 to keep up. :beer:

You must know something the rest of us don't? Why do you think a G70 will be better than an R520?

All I've seen are rumors.


Completely true, you cannot call a winner yet. Even when they are out, drivers can change things a lot, let alone calling a winner from unconfirmed rumors. We will see who is the best overall, and why. Maybe the best is not the best performer ;)
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: McArra
Originally posted by: Rollo
Originally posted by: ronnn
G70 should be the best card available when released. The bad news is that we will all have to spend $600.00 to keep up. :beer:

You must know something the rest of us don't? Why do you think a G70 will be better than an R520?

All I've seen are rumors.


Completely true, you cannot call a winner yet. Even when they are out, drivers can change things a lot, let alone calling a winner from unconfirmed rumors. We will see who is the best overall, and why. Maybe the best is not the best performer ;)

True dat homey.

Gotta love the vaporware wars though; they're right up there with debating religion for speculation.

The price of gas is God's retribution for Britney Spears!
No way! It's Zeus getting back at us for eating fewer high fat olives and the decline in Greece's economy!


 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
Originally posted by: Rollo
Originally posted by: ronnn
G70 should be the best card available when released. The bad news is that we will all have to spend $600.00 to keep up. :beer:

You must know something the rest of us don't? Why do you think a G70 will be better than an R520?

All I've seen are rumors.



Posts like these make me wonder why people consider Rollo an NV fanboy. If you've been around here a while, it really doesn't add up. Foo' just likes videocards.

A lot.
 

fierydemise

Platinum Member
Apr 16, 2005
2,056
2
81
Nvidia wouldn't shoot themselves in the foot by making a card that power-hungry, and if they did I would just laugh.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: Rollo
Originally posted by: ronnn
G70 should be the best card available when released. The bad news is that we will all have to spend $600.00 to keep up. :beer:

You must know something the rest of us don't? Why do you think a G70 will be better than an R520?

All I've seen are rumors.



I am 100% confident that when both cards are released, you will realize yet again that Nvidia is the winner. :beer: And this time you likely will not be alone, if the hints on b3d are accurate. The hints also suggest that present-day cards will not be good value, so waiting to buy may be a good thing right now.
 

fierydemise

Platinum Member
Apr 16, 2005
2,056
2
81
Originally posted by: ronnn
Originally posted by: Rollo
Originally posted by: ronnn
G70 should be the best card available when released. The bad news is that we will all have to spend $600.00 to keep up. :beer:

You must know something the rest of us don't? Why do you think a G70 will be better than an R520?

All I've seen are rumors.



I am 100% confident that when both cards are released, you will realize yet again that Nvidia is the winner. :beer: And this time you likely will not be alone, if the hints on b3d are accurate. The hints also suggest that present-day cards will not be good value, so waiting to buy may be a good thing right now.
Now this is an example of a true fanboy: someone who will pull for something even though they have no proof to back up their claims. Before calling someone else a fanboy, please compare them to this.
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
Originally posted by: ronnn
Originally posted by: Rollo
Originally posted by: ronnn
G70 should be the best card available when released. The bad news is that we will all have to spend $600.00 to keep up. :beer:

You must know something the rest of us don't? Why do you think a G70 will be better than an R520?

All I've seen are rumors.



I am 100% confident that when both cards are released, you will realize yet again that Nvidia is the winner. :beer: And this time you likely will not be alone, if the hints on b3d are accurate. The hints also suggest that present-day cards will not be good value, so waiting to buy may be a good thing right now.


I'm not going with any for-sures, but I bought a 6600GT because it can slam around anything out right now, and the next major technology leap won't show up until Unreal Engine 3 next year. By then, we'll have seen what G70 and R520 can do, and whether or not high-end cards this generation were a good buy.

I personally believe that we're going to see disgusting (in a good way) numbers from both R520 and G70.

Of course, this is all still just rumor, but that's why I went with an SLI motherboard but just a single bang/buck card right now. If G70 and R520 are disappointing, I can SLI for cheap and wait for the next big leap. If not, I won't have wasted money on a $500 card that will be vastly outdone in 3 or 4 months.
 

fstime

Diamond Member
Jan 18, 2004
4,382
5
81
Originally posted by: ronnn
Originally posted by: Rollo
Originally posted by: ronnn
G70 should be the best card available when released. The bad news is that we will all have to spend $600.00 to keep up. :beer:

You must know something the rest of us don't? Why do you think a G70 will be better than an R520?

All I've seen are rumors.



I am 100% confident that when both cards are released, you will realize yet again that Nvidia is the winner. :beer: And this time you likely will not be alone, if the hints on b3d are accurate. The hints also suggest that present-day cards will not be good value, so waiting to buy may be a good thing right now.


What are you talking about? The X850XT PE is the fastest card right now, and the X800XL is the best bang for the buck, going for as low as $270... although the 6600GT is the winner in the $200 area.

Either way, I consider both companies equal, and I'm grateful there are 2 high-end gfx card makers.


Either way, like others said, with that statement you just proved you're just a fanboy, meaning your comments mean nothing.
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: Insomniak
Originally posted by: Rollo
Originally posted by: ronnn
G70 should be the best card available when released. The bad news is that we will all have to spend $600.00 to keep up. :beer:

You must know something the rest of us don't? Why do you think a G70 will be better than a R520?

All I've seen are rumors.



Posts like these make me wonder why people consider Rollo an NV fanboy. If you've been around here a while, it really doesn't add up. Foo' just likes videocards.

A lot.


His comment is the only logical conclusion one could come to right now.