Originally posted by: Keysplayr
You could go to any manufacturer's forums and find people with problems. That's primarily what those forums are for.
And if your cards fail, so what? That's what the warranties are for. You may get an upgrade out of it if they have no more GTX 280s.
:::crosses fingers for ya:::
Originally posted by: Snakexor
Originally posted by: Keysplayr
You could go to any manufacturer's forums and find people with problems. That's primarily what those forums are for.
And if your cards fail, so what? That's what the warranties are for. You may get an upgrade out of it if they have no more GTX 280s.
:::crosses fingers for ya:::
One can only expect such a "rose-colored" post from a member of an Nvidia focus group...
Why don't we let this play out and see if it is a legitimate problem... remember the 7800/7900 GT/GTX cards?
Originally posted by: thilan29
Hmm..thankfully from all the nV and ATI cards I've owned I haven't got any duds (knock on wood).
Originally posted by: Keysplayr
Originally posted by: thilan29
Hmm..thankfully from all the nV and ATI cards I've owned I haven't got any duds (knock on wood).
I've gotten 4 duds in my video card history.
5900XT
9700
9700pro
9500pro
Originally posted by: ArchAngel777
Originally posted by: Keysplayr
Originally posted by: thilan29
Hmm..thankfully from all the nV and ATI cards I've owned I haven't got any duds (knock on wood).
I've gotten 4 duds in my video card history.
5900XT
9700
9700pro
9500pro
This is my first card that has given me a problem. I have owned mostly nVidia cards. I prefer ATi but they always seem to fail to deliver when I am going to purchase a card. Just the luck of the draw I guess.
Originally posted by: Keysplayr
Originally posted by: ArchAngel777
Originally posted by: Keysplayr
Originally posted by: thilan29
Hmm..thankfully from all the nV and ATI cards I've owned I haven't got any duds (knock on wood).
I've gotten 4 duds in my video card history.
5900XT
9700
9700pro
9500pro
This is my first card that has given me a problem. I have owned mostly nVidia cards. I prefer ATi but they always seem to fail to deliver when I am going to purchase a card. Just the luck of the draw I guess.
Have you tried switching the rails that supply power to your card?
I'd like to see what (if anything) changes. Just an "experiment".
Originally posted by: nitromullet
Sorry, but I fail to see how someone's issues with their GTX 280 have anything to do with the power consumption of the single PCB GTX 295. The notion that this was done to fix a supposed issue with GTX 280/285 cards is pure speculation.
The linked article claims that the new voltage regulators are cheaper than the ones previously used, then he goes on to be impressed with the OC he got from his card. Perhaps it's a money-saving thing, or perhaps the OC is a result of increased voltage provided by the new regs, which is now possible because of the better thermal properties of the new card. Perhaps it's a bit of both...
Final overclocks for dual PCB GTX 295: 693 MHz core (20% overclock) and 1207 MHz Memory (21% overclock). http://www.techpowerup.com/rev...eForce_GTX_295/33.html
Final overclocks for single PCB GTX 295: 702 MHz core (22% overclock) and 1198 MHz Memory (19% overclock). http://www.techpowerup.com/rev...X_295_Platinum/32.html
...I'm honestly not really sure why W1zzard seemed surprised by the OC he got from the single PCB GTX 295. Not that big of a difference, IMO. I'm gonna stick with the idea that it's a cost-saving thing.
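For what it's worth, here's a quick sanity check of those percentages, assuming the GTX 295's reference clocks were 576 MHz core and 999 MHz memory (my assumption, not stated in the excerpts above):

```python
# Sanity-check the quoted overclock percentages against assumed
# GTX 295 reference clocks (576 MHz core, 999 MHz memory).
REF_CORE_MHZ = 576
REF_MEM_MHZ = 999

def oc_percent(achieved_mhz: float, stock_mhz: float) -> float:
    """Overclock expressed as a percentage above stock."""
    return (achieved_mhz / stock_mhz - 1) * 100

for label, core_mhz, mem_mhz in [("dual PCB", 693, 1207),
                                 ("single PCB", 702, 1198)]:
    print(f"{label}: core +{oc_percent(core_mhz, REF_CORE_MHZ):.1f}%, "
          f"memory +{oc_percent(mem_mhz, REF_MEM_MHZ):.1f}%")

# dual PCB: core +20.3%, memory +20.8%
# single PCB: core +21.9%, memory +19.9%
```

The numbers line up with the rounded figures in both reviews, so the two cards really are within a couple of percent of each other.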
Originally posted by: ArchAngel777
Originally posted by: nitromullet
Sorry, but I fail to see how someone's issues with their GTX 280 have anything to do with the power consumption of the single PCB GTX 295. The notion that this was done to fix a supposed issue with GTX 280/285 cards is pure speculation.
The linked article claims that the new voltage regulators are cheaper than the ones previously used, then he goes on to be impressed with the OC he got from his card. Perhaps it's a money-saving thing, or perhaps the OC is a result of increased voltage provided by the new regs, which is now possible because of the better thermal properties of the new card. Perhaps it's a bit of both...
Final overclocks for dual PCB GTX 295: 693 MHz core (20% overclock) and 1207 MHz Memory (21% overclock). http://www.techpowerup.com/rev...eForce_GTX_295/33.html
Final overclocks for single PCB GTX 295: 702 MHz core (22% overclock) and 1198 MHz Memory (19% overclock). http://www.techpowerup.com/rev...X_295_Platinum/32.html
...I'm honestly not really sure why W1zzard seemed surprised by the OC he got from the single PCB GTX 295. Not that big of a difference, IMO. I'm gonna stick with the idea that it's a cost-saving thing.
Or perhaps neither? By the way, an increase in power consumption means an increase in thermals assuming the same heatsink and fan combination. No two ways around that one... So if the power consumption is greater, then... I think you get the point.
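To put the same point in numbers, here's a minimal sketch using the usual steady-state model, die temperature = ambient + power × thermal resistance of the cooler; every figure below is made up for illustration, not a measurement of either GTX 295:

```python
# Steady-state thermal model: die_temp = ambient + power * theta,
# where theta is the cooler's thermal resistance in deg C per watt.
# All values are illustrative, not GTX 295 measurements.
def die_temp_c(ambient_c: float, power_w: float, theta_c_per_w: float) -> float:
    """Die temperature for a given board power and cooler."""
    return ambient_c + power_w * theta_c_per_w

AMBIENT_C = 25.0   # assumed in-case air temperature
THETA_OLD = 0.20   # assumed resistance of one fixed heatsink/fan combo

# Same cooler: more power always means a hotter chip.
print(die_temp_c(AMBIENT_C, 280, THETA_OLD))  # 81.0 C
print(die_temp_c(AMBIENT_C, 310, THETA_OLD))  # 87.0 C

# A better cooler (lower theta, e.g. a bigger or faster fan)
# can still run cooler despite the higher power draw.
print(die_temp_c(AMBIENT_C, 310, 0.17))       # 77.7 C
```

With the cooler held constant the conclusion is exactly as stated; the only way a higher-power card runs cooler is a lower thermal resistance, i.e. a better or faster cooler.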
Originally posted by: ArchAngel777
Edit ** By the way, it appears they increased the fan speed or the fan size on the single PCB, because while it's consuming more power than the dual PCB, the chip itself is running cooler. That is only possible by dissipating the heat faster. That could also have been achieved on the dual PCB, BTW.
Edit 2 ** Ahh, just what I was looking for.
Temperatures are perfectly fine, I would have loved to see higher idle and slightly higher load temperatures in return for a quieter fan. Even the impressive overclock didn't push temperatures into an unsafe range.
Originally posted by: nitromullet
Originally posted by: ArchAngel777
Originally posted by: nitromullet
Sorry, but I fail to see how someone's issues with their GTX 280 have anything to do with the power consumption of the single PCB GTX 295. The notion that this was done to fix a supposed issue with GTX 280/285 cards is pure speculation.
The linked article claims that the new voltage regulators are cheaper than the ones previously used, then he goes on to be impressed with the OC he got from his card. Perhaps it's a money-saving thing, or perhaps the OC is a result of increased voltage provided by the new regs, which is now possible because of the better thermal properties of the new card. Perhaps it's a bit of both...
Final overclocks for dual PCB GTX 295: 693 MHz core (20% overclock) and 1207 MHz Memory (21% overclock). http://www.techpowerup.com/rev...eForce_GTX_295/33.html
Final overclocks for single PCB GTX 295: 702 MHz core (22% overclock) and 1198 MHz Memory (19% overclock). http://www.techpowerup.com/rev...X_295_Platinum/32.html
...I'm honestly not really sure why W1zzard seemed surprised by the OC he got from the single PCB GTX 295. Not that big of a difference, IMO. I'm gonna stick with the idea that it's a cost-saving thing.
Or perhaps neither? By the way, an increase in power consumption means an increase in thermals assuming the same heatsink and fan combination. No two ways around that one... So if the power consumption is greater, then... I think you get the point.
That's the point... we're not assuming the same heatsink and fan combination. The new cooling system on the single PCB card is very different from the cooling on the dual PCB card.
You're right though, perhaps it is neither reason... but, why the change then?
Originally posted by: ArchAngel777
But, I agree, it was probably a cost-saving measure, IMO. Or maybe a design flaw? Really, all of our answers are speculation, as you said.
Originally posted by: nitromullet
Originally posted by: ArchAngel777
But, I agree, it was probably a cost-saving measure, IMO. Or maybe a design flaw? Really, all of our answers are speculation, as you said.
I would doubt a design flaw. Changing the components isn't something that happens by accident. I'm sure that NV knew that the single PCB card draws more power before this article was published, and they probably knew it would prior to building it, based on the specs of the voltage regs they used.
Originally posted by: ArchAngel777
So nVidia could have potentially had a better card on their hands but decided to take shortcuts... just to decrease costs?
Originally posted by: Zap
Originally posted by: ArchAngel777
So nVidia could have potentially had a better card on their hands but decided to take shortcuts... just to decrease costs?
That is normal behavior. NVIDIA almost always decreases the cost of a graphics card design over the lifespan of that product. For instance, the GTX 260 dropped from 14 layers to 10 layers to 8 layers, and lost the Volterras it had at 14 layers. The GTX 285 lost 4 layers and is at 10 layers now. Heck, even older/lower end cards suffer this fate. The latest 9800 GT (P360) is at 6 layers.
The goal is to maintain profits (which are important for R&D) while leaving room for price moves (AKA lower prices). The dramatic cost reduction of the GTX 260 is what allowed it to go from a $450 product to a $180 product without losing its performance characteristics.
The single PCB GTX 295 is probably cheaper and simpler to make. While I don't expect prices to drop below $500 for now, once these things start shipping in volume (and I expect more volume than the more complex dual PCB version), I'm hoping prices will at least stay around $500. Anyone notice that the dual PCB GTX 295 was often sold above NVIDIA's target $500 pricing? I often found prices at Newegg/Tiger/etc. to be $530-550 for the dual PCB version. NVIDIA didn't charge more and I know BFG didn't charge more, so who's making extra money off the fact that demand was outstripping supply, hmmmm?
Originally posted by: ArchAngel777
Actually, I got to thinking about it, and despite the voltage regulators being cheaper and no longer software controlled (according to that article), they can still be configured before the card goes into production, right? If that is the case, why didn't they tone down the voltage before production? In other words, if they can still control how much voltage the card gets prior to design finalization, it raises this question: why did they increase it? I don't think they want to create more heat than they need to. They also don't want a dust buster fan on their card, right? I have to believe it was to increase stability. Well, that is my best speculation at any rate. We really have insufficient data.
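One way to see why a voltage bump is never free: dynamic power in CMOS logic scales roughly with V² × f, so even a small increase in core voltage buys stability headroom at a disproportionate heat cost. The voltages below are hypothetical, not the GTX 295's actual values:

```python
# Rough CMOS dynamic power scaling: P ~ V^2 * f.
# Voltages are hypothetical examples, not real GTX 295 settings.
def power_scale(v_new: float, v_old: float,
                f_new: float = 1.0, f_old: float = 1.0) -> float:
    """Relative dynamic power after a voltage/clock change."""
    return (v_new / v_old) ** 2 * (f_new / f_old)

# A hypothetical bump from 1.05 V to 1.10 V at unchanged clocks:
scale = power_scale(v_new=1.10, v_old=1.05)
print(f"+{(scale - 1) * 100:.1f}% dynamic power")  # +9.8%
```

So if the new regulators really do feed the GPUs a bit more voltage, the extra heat would be a deliberate trade for stability margin, which fits the speculation above.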