Single-board GTX 295 more power-hungry than original

ArchAngel777

Diamond Member
Dec 24, 2000
If you check out the EVGA forums, there are a lot of people having problems with the high-end cards. My GTX 280 is failing already... I have it underclocked in order for it to work. I narrowed it down to the memory or the memory controller, since I can increase the core and shader clocks quite a bit, yet just dropping the memory clock slightly makes it stable again. I am going to RMA it to EVGA.
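
If anyone wants to reproduce this kind of isolation test, the logic is basically a bisection search over one clock domain at a time while the others stay at stock. Here's a minimal Python sketch of that logic - set_memory_clock and run_stress_test are hypothetical stand-ins, since RivaTuner-era tools aren't directly scriptable:

def find_max_stable_mem_clock(lo_mhz, hi_mhz, set_memory_clock, run_stress_test):
    # Bisect the memory clock while core/shader stay at stock, so any
    # instability can be pinned on the memory or the memory controller.
    while hi_mhz - lo_mhz > 5:          # stop once bracketed within 5 MHz
        mid = (lo_mhz + hi_mhz) // 2
        set_memory_clock(mid)
        if run_stress_test():           # e.g. loop a texture-heavy game scene
            lo_mhz = mid                # stable: the real limit is higher
        else:
            hi_mhz = mid                # unstable: the limit is lower
    return lo_mhz                       # highest clock that passed the test

If the card tolerates core/shader increases but fails this sweep well below its stock memory clock, the memory subsystem is the prime suspect.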

But anyway, the point is that maybe the power delivery on some of these cards was less than optimal, and that could be the reason for the increase in power consumption. Or perhaps this review site did not run the test properly.

Insufficient data for any conclusion at this point.
 

Keysplayr

Elite Member
Jan 16, 2003
You could go to any manufacturer's forums and find people with problems. That's primarily what those forums are for.
And if your cards fail, so what? That's what the warranties are for. You may get an upgrade out of it if they have no more GTX 280s.
:::crosses fingers for ya:::
 

Snakexor

Golden Member
Feb 23, 2005
Originally posted by: Keysplayr
You could go to any manufacturer's forums and find people with problems. That's primarily what those forums are for.
And if your cards fail, so what? That's what the warranties are for. You may get an upgrade out of it if they have no more GTX 280s.
:::crosses fingers for ya:::

One can only expect such a "rose-colored" post from a member of an Nvidia focus group.....

Why don't we let this play out and see if it is a legitimate problem... remember the 7800/7900 GT/GTXs?
 

ArchAngel777

Diamond Member
Dec 24, 2000
Originally posted by: Keysplayr
You could go to any manufacturer's forums and find people with problems. That's primarily what those forums are for.

While this is a true statement, I don't believe this is just some isolated issue that only affects a very small userbase. A lot of people are coming out of the woodwork to report the same problem - people experienced with computers who have already ruled out other possibilities. While I don't think every card is affected, there is certainly a large number of cards out there with these specific problems. In most cases, the issue seems to manifest itself in MMOs, probably because of all the texture loading and unloading involved in those types of games.

I have a feeling only a few people stress the card the way I do, turning on virtually every eye-candy option, including transparency anti-aliasing. If I disable most of the eye-candy features, the card is certainly more stable. Still, my main MMO (LOTRO) causes this card to crash, lose video signal, etc. after only 2 minutes in the game at stock nVidia reference speeds.

But my point isn't to start a thread or hysteria claiming that nVidia cards are plagued with problems, because I don't necessarily believe that to be the case. But I do know that there are a ton of GTX 285s (and some 280s) that refuse to operate correctly even at their stock speeds in certain applications. The 'no signal' on the monitor is becoming more and more common, and the nvlddmkm-type driver error is also very common.

As soon as the 4th of July is over, I plan to RMA it. The card works fine if I underclock the memory to 1000, so that is where I have it set in RivaTuner for now. For the most part, I am losing ~5% or so performance versus a stock GTX, and it really doesn't bother me that much. However, I paid for an OC card that can't seem to run stable even at the lower reference speeds. Reason enough to RMA, but only at my convenience.
 

imported_Shaq

Senior member
Sep 24, 2004
Well, my personal experience is this. I got a new 295 that was artifacting like mad in Crysis, to the point where it was unplayable. It looked like you were in Wonderland, it was so bad. I got a refund on it and then went to 260 SLI. One of the 260s died within a day... no signal at all.

I have owned a 5900, 6800GT, 7800GTX, 7800GTX 512, 8800GTX 640, 8800GT and a 9800 GX2, and none of them ever failed on me. I play games up to 40 hours a week sometimes, so I stress the cards pretty hard. It is my conclusion that there is something wrong... unless you think I was "due" for some failures. lol. I do believe it is memory-related somehow, because of the artifacts, and because the one 260 that died would display only 8-bit color before it died completely.
 

thilanliyan

Lifer
Jun 21, 2005
Hmm... thankfully, of all the nV and ATI cards I've owned, I haven't gotten any duds (knock on wood).
 

Keysplayr

Elite Member
Jan 16, 2003
Originally posted by: Snakexor
Originally posted by: Keysplayr
You could go to any manufacturer's forums and find people with problems. That's primarily what those forums are for.
And if your cards fail, so what? That's what the warranties are for. You may get an upgrade out of it if they have no more GTX 280s.
:::crosses fingers for ya:::

One can only expect such a "rose-colored" post from a member of an Nvidia focus group.....

Why don't we let this play out and see if it is a legitimate problem... remember the 7800/7900 GT/GTXs?

Ok, explain to the rest of the forum my motives for posting such slander. LOL.

Know what? Never mind. Keep it on topic if you can handle it.

 

Keysplayr

Elite Member
Jan 16, 2003
Originally posted by: thilan29
Hmm... thankfully, of all the nV and ATI cards I've owned, I haven't gotten any duds (knock on wood).

I've gotten 4 duds in my video card history.

5900XT
9700
9700pro
9500pro
 

ArchAngel777

Diamond Member
Dec 24, 2000
Originally posted by: Keysplayr
Originally posted by: thilan29
Hmm... thankfully, of all the nV and ATI cards I've owned, I haven't gotten any duds (knock on wood).

I've gotten 4 duds in my video card history.

5900XT
9700
9700pro
9500pro

This is my first card that has given me a problem. I have owned mostly nVidia cards. I prefer ATi, but they always seem to fail to deliver whenever I am about to purchase a card. Just the luck of the draw, I guess.
 

Keysplayr

Elite Member
Jan 16, 2003
Originally posted by: ArchAngel777
Originally posted by: Keysplayr
Originally posted by: thilan29
Hmm... thankfully, of all the nV and ATI cards I've owned, I haven't gotten any duds (knock on wood).

I've gotten 4 duds in my video card history.

5900XT
9700
9700pro
9500pro

This is my first card that has given me a problem. I have owned mostly nVidia cards. I prefer ATi, but they always seem to fail to deliver whenever I am about to purchase a card. Just the luck of the draw, I guess.

Have you tried switching the rails that supply power to your card?
I'd like to see what (if anything) changes. Just an "experiment".
 

ArchAngel777

Diamond Member
Dec 24, 2000
Originally posted by: Keysplayr
Originally posted by: ArchAngel777
Originally posted by: Keysplayr
Originally posted by: thilan29
Hmm... thankfully, of all the nV and ATI cards I've owned, I haven't gotten any duds (knock on wood).

I've gotten 4 duds in my video card history.

5900XT
9700
9700pro
9500pro

This is my first card that has given me a problem. I have owned mostly nVidia cards. I prefer ATi, but they always seem to fail to deliver whenever I am about to purchase a card. Just the luck of the draw, I guess.

Have you tried switching the rails that supply power to your card?
I'd like to see what (if anything) changes. Just an "experiment".


Yep, sure did. I did that about 3 days ago. I also checked Antec's website to verify; they made the PSU pretty much dummy-proof, IMO, but I checked anyway. Plus, I essentially verified it was not a power issue by increasing the core clock, voltage (via EVGA's program) and shader clock while slightly downclocking the memory, and it works fine. So putting more power into the card - raising core, shader and voltage while dropping memory slightly (the memory drop is nowhere near enough to offset the core/shader/voltage increase) - is stable. I have it pinned down to either bad memory or a failing memory controller.
 

nitromullet

Diamond Member
Jan 7, 2004
Sorry, but I fail to see how someone's issues with their GTX 280 have anything to do with the power consumption of the single-PCB GTX 295. The notion that this was done to fix a supposed issue with GTX 280/285 cards is pure speculation.

The linked article claims that the new voltage regulators are cheaper than the ones previously used, then goes on to be impressed with the OC he got from his card. Perhaps it's a money-saving thing, or perhaps the OC is a result of increased voltage provided by the new regs, which is now possible because of the better thermal properties of the new card. Perhaps it's a bit of both...

Final overclocks for dual PCB GTX 295: 693 MHz core (20% overclock) and 1207 MHz Memory (21% overclock). http://www.techpowerup.com/rev...eForce_GTX_295/33.html
Final overclocks for single PCB GTX 295: 702 MHz core (22% overclock) and 1198 MHz Memory (19% overclock). http://www.techpowerup.com/rev...X_295_Platinum/32.html

....I'm honestly not really sure why W1zzard seemed surprised by the OC he got from the single-PCB GTX 295. Not that big of a difference, IMO. I'm gonna stick with the idea that it's a cost-saving thing.
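
As a sanity check, those percentages line up with the GTX 295's stock clocks (576 MHz core, 999 MHz memory). A quick Python check:

stock = {"core": 576, "memory": 999}  # GTX 295 reference clocks, MHz
final = {"dual-PCB core": 693, "dual-PCB memory": 1207,
         "single-PCB core": 702, "single-PCB memory": 1198}
for name, oc_mhz in final.items():
    base = stock["core"] if "core" in name else stock["memory"]
    print(f"{name}: {100 * (oc_mhz / base - 1):.1f}% overclock")
# dual-PCB core: 20.3%   / dual-PCB memory: 20.8%
# single-PCB core: 21.9% / single-PCB memory: 19.9%

Consistent with the 20/21/22/19% figures above; the two cards really do overclock almost identically.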
 

ArchAngel777

Diamond Member
Dec 24, 2000
Originally posted by: nitromullet
Sorry, but I fail to see how someone's issues with their GTX 280 have anything to do with the power consumption of the single-PCB GTX 295. The notion that this was done to fix a supposed issue with GTX 280/285 cards is pure speculation.

The linked article claims that the new voltage regulators are cheaper than the ones previously used, then goes on to be impressed with the OC he got from his card. Perhaps it's a money-saving thing, or perhaps the OC is a result of increased voltage provided by the new regs, which is now possible because of the better thermal properties of the new card. Perhaps it's a bit of both...

Final overclocks for dual PCB GTX 295: 693 MHz core (20% overclock) and 1207 MHz Memory (21% overclock). http://www.techpowerup.com/rev...eForce_GTX_295/33.html
Final overclocks for single PCB GTX 295: 702 MHz core (22% overclock) and 1198 MHz Memory (19% overclock). http://www.techpowerup.com/rev...X_295_Platinum/32.html

....I'm honestly not really sure why W1zzard seemed surprised by the OC he got from the single-PCB GTX 295. Not that big of a difference, IMO. I'm gonna stick with the idea that it's a cost-saving thing.


Or perhaps neither? By the way, an increase in power consumption means an increase in thermals, assuming the same heatsink and fan combination. No two ways around that one... So if the power consumption is greater, then... I think you get the point.
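
To put rough numbers on that: at steady state, die temperature sits at ambient plus dissipated power times the cooler's effective thermal resistance. A toy illustration in Python - the wattages and the 0.15 C/W figure are made up, only the scaling matters:

def die_temp_c(ambient_c, power_w, r_th_c_per_w):
    # Steady-state approximation: T_die = T_ambient + P * R_theta
    return ambient_c + power_w * r_th_c_per_w

print(die_temp_c(25, 220, 0.15))  # 58.0 C at a hypothetical 220 W load
print(die_temp_c(25, 242, 0.15))  # 61.3 C with ~10% more power, same cooler

With the same heatsink/fan, more watts always means a hotter chip; only a better cooler changes the slope.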

Edit ** By the way, it appears they increased the fan speed or the fan size on the single PCB, because while it consumes more power than the dual PCB, the chip itself runs cooler. That is only possible by dissipating the heat faster. That could also have been achieved on the dual PCB, BTW.

Edit 2 ** Ahh, just what I was looking for.

Temperatures are perfectly fine, I would have loved to see higher idle and slightly higher load temperatures in return for a quieter fan. Even the impressive overclock didn't push temperatures into an unsafe range.
 

nitromullet

Diamond Member
Jan 7, 2004
Originally posted by: ArchAngel777
Originally posted by: nitromullet
Sorry, but I fail to see how someone's issues with their GTX 280 have anything to do with the power consumption of the single-PCB GTX 295. The notion that this was done to fix a supposed issue with GTX 280/285 cards is pure speculation.

The linked article claims that the new voltage regulators are cheaper than the ones previously used, then goes on to be impressed with the OC he got from his card. Perhaps it's a money-saving thing, or perhaps the OC is a result of increased voltage provided by the new regs, which is now possible because of the better thermal properties of the new card. Perhaps it's a bit of both...

Final overclocks for dual PCB GTX 295: 693 MHz core (20% overclock) and 1207 MHz Memory (21% overclock). http://www.techpowerup.com/rev...eForce_GTX_295/33.html
Final overclocks for single PCB GTX 295: 702 MHz core (22% overclock) and 1198 MHz Memory (19% overclock). http://www.techpowerup.com/rev...X_295_Platinum/32.html

....I'm honestly not really sure why W1zzard seemed surprised by the OC he got from the single-PCB GTX 295. Not that big of a difference, IMO. I'm gonna stick with the idea that it's a cost-saving thing.


Or perhaps neither? By the way, an increase in power consumption means an increase in thermals, assuming the same heatsink and fan combination. No two ways around that one... So if the power consumption is greater, then... I think you get the point.

That's the point... we're not assuming the same heatsink and fan combination. The new cooling system on the single PCB card is very different from the cooling on the dual PCB card.

You're right though, perhaps it is neither reason... but, why the change then?

Originally posted by: ArchAngel777
Edit ** By the way, it appears they increased the fan speed or the fan size on the single PCB, because while it consumes more power than the dual PCB, the chip itself runs cooler. That is only possible by dissipating the heat faster. That could also have been achieved on the dual PCB, BTW.

Edit 2 ** Ahh, just what I was looking for.

Temperatures are perfectly fine, I would have loved to see higher idle and slightly higher load temperatures in return for a quieter fan. Even the impressive overclock didn't push temperatures into an unsafe range.

Yes, I saw all of that prior to my first post. That's exactly why I said what I did.
 

ArchAngel777

Diamond Member
Dec 24, 2000
Originally posted by: nitromullet
Originally posted by: ArchAngel777
Originally posted by: nitromullet
Sorry, but I fail to see how someone's issues with their GTX 280 have anything to do with the power consumption of the single-PCB GTX 295. The notion that this was done to fix a supposed issue with GTX 280/285 cards is pure speculation.

The linked article claims that the new voltage regulators are cheaper than the ones previously used, then goes on to be impressed with the OC he got from his card. Perhaps it's a money-saving thing, or perhaps the OC is a result of increased voltage provided by the new regs, which is now possible because of the better thermal properties of the new card. Perhaps it's a bit of both...

Final overclocks for dual PCB GTX 295: 693 MHz core (20% overclock) and 1207 MHz Memory (21% overclock). http://www.techpowerup.com/rev...eForce_GTX_295/33.html
Final overclocks for single PCB GTX 295: 702 MHz core (22% overclock) and 1198 MHz Memory (19% overclock). http://www.techpowerup.com/rev...X_295_Platinum/32.html

....I'm honestly not really sure why W1zzard seemed surprised by the OC he got from the single-PCB GTX 295. Not that big of a difference, IMO. I'm gonna stick with the idea that it's a cost-saving thing.


Or perhaps neither? By the way, an increase in power consumption means an increase in thermals, assuming the same heatsink and fan combination. No two ways around that one... So if the power consumption is greater, then... I think you get the point.

That's the point... we're not assuming the same heatsink and fan combination. The new cooling system on the single PCB card is very different from the cooling on the dual PCB card.

You're right though, perhaps it is neither reason... but, why the change then?

But the point is that they could have increased the fan speed on the dual PCB, increased the voltage, and done the same thing... But I agree, it was probably a cost-saving measure, IMO. Or maybe a design flaw? Really, all of our answers are speculation, as you said.

 

nitromullet

Diamond Member
Jan 7, 2004
Originally posted by: ArchAngel777
But I agree, it was probably a cost-saving measure, IMO. Or maybe a design flaw? Really, all of our answers are speculation, as you said.

I would doubt a design flaw. Changing the components isn't something that happens by accident. I'm sure that NV knew the single-PCB card draws more power before this article was published, and they probably knew it would prior to building it, based on the specs of the voltage regs they used.
 

ArchAngel777

Diamond Member
Dec 24, 2000
Originally posted by: nitromullet
Originally posted by: ArchAngel777
But I agree, it was probably a cost-saving measure, IMO. Or maybe a design flaw? Really, all of our answers are speculation, as you said.

I would doubt a design flaw. Changing the components isn't something that happens by accident. I'm sure that NV knew the single-PCB card draws more power before this article was published, and they probably knew it would prior to building it, based on the specs of the voltage regs they used.

You are probably right on that. So nVidia could have potentially had a better card on their hands but decided to take shortcuts. Am I understanding that right? Otherwise, the power issue I noted in my first post is about as plausible as any other suggestion. If they just wanted to give it more voltage - why? Just to increase the fan noise? Doubtful. Just to decrease costs? Possibly... Or maybe to increase stability? Possibly... I see more than one viable reason.
 

ArchAngel777

Diamond Member
Dec 24, 2000
Actually, I got to thinking about it: despite the voltage regulators being cheaper and no longer software-controlled (according to that article), the voltage can still be set before the card goes to production - right? If so, why didn't they tone it down before production? In other words, if they can still control how much voltage the card gets prior to design finalization, it raises the question: why did they increase it? I don't think they want to create more heat than they need to. They also don't want a Dustbuster fan on their card, right? I have to believe it was to increase stability. Well, that is my best speculation at any rate. :D We really have insufficient data.
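
For what it's worth, the reason extra voltage is so costly in heat terms is that CMOS switching power scales roughly with the square of the supply voltage. A toy Python sketch with hypothetical voltages, just to show the scaling:

def switching_power(c_eff, v_volts, f_hz):
    # Classic dynamic-power approximation for CMOS: P ~ C_eff * V^2 * f
    return c_eff * v_volts ** 2 * f_hz

p_low = switching_power(1.0, 1.05, 1.0)   # arbitrary C_eff and f, held fixed
p_high = switching_power(1.0, 1.15, 1.0)  # a hypothetical 0.10 V bump
print(f"{100 * (p_high / p_low - 1):.0f}% more heat")  # ~20%

So even a modest voltage bump is expensive in watts; they wouldn't do it without a reason, and stability is the obvious candidate.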
 

nitromullet

Diamond Member
Jan 7, 2004
Yeah, it is somewhat of a mystery. This is my guess... The choice of the new voltage regs was made on the basis of cost reduction, but the higher power draw (and presumably higher voltages) has to do with stability/performance. Perhaps the new regs aren't as efficient, and there isn't actually an increase in voltage, just an increase in consumption. I don't know. :)
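
That last possibility is easy to put in numbers: the power the card pulls upstream of the VRMs is the GPU load divided by regulator efficiency, so cheaper, slightly less efficient regs alone could explain a higher measured draw with no voltage change at all. The efficiencies below are assumed, purely for illustration:

def card_draw_w(gpu_load_w, vrm_efficiency):
    # Input power = output power / efficiency; the difference ends up as VRM heat.
    return gpu_load_w / vrm_efficiency

print(card_draw_w(240, 0.90))  # ~266.7 W with 90%-efficient regulators
print(card_draw_w(240, 0.85))  # ~282.4 W with 85%-efficient regulators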
 

Zap

Elite Member
Oct 13, 1999
Originally posted by: ArchAngel777
So nVidia could have potentially had a better card on their hands but decided to take shortcuts... Just to decrease costs?

That is normal behavior. NVIDIA almost always decreases the cost of a graphics card design over the lifespan of that product. For instance, the GTX 260 dropped from 14 layers to 10 layers to 8 layers, and lost the Volterra regulators it had at 14 layers. The GTX 285 lost 4 layers and is at 10 layers now. Heck, even older/lower-end cards suffer this fate. The latest 9800 GT (P360) is at 6 layers.

The end result is to keep profits (which are important for R&D) while allowing room for price moves (AKA lower prices). The dramatic cost reduction of the GTX 260 is what allowed it to go from a $450 product to a $180 product without losing its performance characteristics.

The single-PCB GTX 295 is probably cheaper and simpler to make. While I don't expect prices to drop below $500 for now, once these things start shipping in volume (which I expect will happen more readily than with the more complex dual-PCB version), I'm hoping prices will at least stay around $500. Anyone notice that the dual-PCB GTX 295 was often sold above NVIDIA's target $500 pricing? I often found prices at Newegg/Tiger/etc. to be $530-550 for the dual-PCB version. NVIDIA didn't charge more, and I know BFG didn't charge more, so who's making extra money off the fact that demand was outstripping supply, hmmmm?
 

ArchAngel777

Diamond Member
Dec 24, 2000
Originally posted by: Zap
Originally posted by: ArchAngel777
So nVidia could have potentially had a better card on their hands but decided to take shortcuts... Just to decrease costs?

That is normal behavior. NVIDIA almost always decreases the cost of a graphics card design over the lifespan of that product. For instance, the GTX 260 dropped from 14 layers to 10 layers to 8 layers, and lost the Volterra regulators it had at 14 layers. The GTX 285 lost 4 layers and is at 10 layers now. Heck, even older/lower-end cards suffer this fate. The latest 9800 GT (P360) is at 6 layers.

The end result is to keep profits (which are important for R&D) while allowing room for price moves (AKA lower prices). The dramatic cost reduction of the GTX 260 is what allowed it to go from a $450 product to a $180 product without losing its performance characteristics.

The single-PCB GTX 295 is probably cheaper and simpler to make. While I don't expect prices to drop below $500 for now, once these things start shipping in volume (which I expect will happen more readily than with the more complex dual-PCB version), I'm hoping prices will at least stay around $500. Anyone notice that the dual-PCB GTX 295 was often sold above NVIDIA's target $500 pricing? I often found prices at Newegg/Tiger/etc. to be $530-550 for the dual-PCB version. NVIDIA didn't charge more, and I know BFG didn't charge more, so who's making extra money off the fact that demand was outstripping supply, hmmmm?

Yep, I am glad we agree that it is normal behavior for most (if not all) companies.
 

Keysplayr

Elite Member
Jan 16, 2003
Originally posted by: ArchAngel777
Actually, I got to thinking about it: despite the voltage regulators being cheaper and no longer software-controlled (according to that article), the voltage can still be set before the card goes to production - right? If so, why didn't they tone it down before production? In other words, if they can still control how much voltage the card gets prior to design finalization, it raises the question: why did they increase it? I don't think they want to create more heat than they need to. They also don't want a Dustbuster fan on their card, right? I have to believe it was to increase stability. Well, that is my best speculation at any rate. :D We really have insufficient data.

This single PCB, IMHO, would undoubtedly have more layers and highly compacted traces to accommodate the 448-bit-wide bus for each core. It may take more power to accomplish this. I don't really know; it's just speculation.