blastingcap
Is this a poll, or yet another "first they came for my GPU voltage, then they sent us to FEMA conc. camps" thread?
I admit, I loled.
Is this a poll, or yet another "first they came for my GPU voltage, then they sent us to FEMA conc. camps" thread?
The poll is about how enthusiasts will feel (kinda, since again the OP is conflating overclocking and overvolting, which is intellectually dishonest).
The real world doesn't operate like that. This is like an argument in the console or smartphone forums over some function/feature that nobody really cares about, while all sides still sell a mST anyway.
nV has cash. AMD is losing badly in the GPU market as a whole, is always behind in the discrete GPU market, and is facing very large debt payments while its stock is toxic.
Honestly, this sub-forum paints such a fake picture of AMD's health. Please go read the CPU forum, where people don't just discuss OCing; they discuss the companies as a whole.
I overclock; I do not overvolt.
The gains from overvolting are not noticeable in any game. If you have to overclock so high to play a game that you need more voltage, you should probably turn down your settings.
The only thing I wish for is an adequate explanation. As I've mentioned before, we're all well aware that this is completely within the rights of nvidia, yet users do not have to like it.
I have MSI Lightning 680s, which I love, and I am still able to overvolt them with some tricks. It does look odd when enthusiasts have overvolted GPUs for several years (it was very, very common with Fermi) and then suddenly the course is reversed. Like I said, I love my cards and will still enjoy them; heck, I can get 1300 MHz with no overvolting. But I would really love for an explanation to be provided, something better than "because nvidia said so". That's the only thing on my wish list.
Is it because of warranty issues? Brand name? What? I would be much more at ease if someone came forward and said, "Hey guys, this is why we're doing this," and then it would be cool. I mean, I still would not like it, but I would understand it.
Here's the way I see it. A reference GTX 680 cannot handle overvolting well; we all know that the cooler and VRM are not up to the task. But why apply the same voltage methodology to a card like the MSI Lightning or EVGA Classified? These cards are specially equipped with hardware that is completely fine with handling the additional voltage! This is the main reason I do not understand the change. A reference card with voltage locked? Yeah, that's fine with me. But why apply the same methodology to cards with substantially better cooling and hardware?
Yeah, it sounds a lot better than just allowing everyone to bump up the voltage. What drove the OP to create this thread was his dissatisfaction stemming from a purchase based on a special-sample 7850.
I put "doesn't bother me"; the frame rate difference between an overvolted OC and just an OC isn't much to a gamer. Maybe it matters if you get into benchmarking?
Just my 2 cents.
It's terrible on several fronts: for the enthusiast buyer shelling out $500+ for a top-end card who loses the ability to tweak it, and for the mid-range enthusiast looking to get a more affordable card and tweak it to higher performance.
There is a clear advantage to having voltage control. Look at an AMD Radeon 7950 with voltage control: tweaking the voltage makes that $300 card as fast as or faster than a $500 GTX 680. Granted, the 680 is basically a red-lined and bandwidth-limited card regardless. There is not enough memory bandwidth on the stock card already, and overclocking it makes that more evident.
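For anyone curious about the rough math behind that claim, here is a back-of-envelope sketch (a minimal Python example; the clock figures and the 0.8 scaling factor are illustrative assumptions, not measured benchmarks):

```python
# Back-of-envelope estimate of performance headroom from a core overclock.
# All numbers below are illustrative assumptions, not benchmark data.

def relative_performance(base_clock_mhz, oc_clock_mhz, scaling=0.8):
    """Estimate relative performance from a core overclock.

    scaling < 1.0 reflects that frame rates rarely scale linearly with
    core clock (memory bandwidth, CPU limits, etc. get in the way).
    """
    clock_gain = (oc_clock_mhz - base_clock_mhz) / base_clock_mhz
    return 1.0 + clock_gain * scaling

# Hypothetical HD 7950: 800 MHz stock, ~1100 MHz with extra voltage.
print(f"7950 estimate: {relative_performance(800, 1100):.2f}x of stock")

# Hypothetical GTX 680: 1006 MHz stock, ~1150 MHz without voltage control.
print(f"680 estimate:  {relative_performance(1006, 1150):.2f}x of stock")
```

Even with a conservative scaling factor, a ~35% clock bump on the 7950 closes most of the gap to the 680, which is the crux of the price/performance argument.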
I think we'll see voltage control on the GK110 consumer card. Pretty sure the reason it's locked on GK104 is they are already redlining the chip to try to stay competitive and much more voltage would kill/degrade the cards.
NV pulled an IVB with the GTX 600 series by cutting corners everywhere, and I think removing voltage control reduced the chances of many failed parts, like the blown-up 570s/590s they had last round. Still, that shouldn't have mattered since, as Blackened23 stated, EVGA and MSI included bullet-proof components on the EVGA Classified and MSI Lightning parts to ensure that they can handle the added voltage with aplomb. NV could have been afraid that even users with regular 670/680 reference cards would start overclocking with voltage control in MSI AB, and then NV would have faced a PR disaster of high GPU failure rates.
Vote with your wallet if you don't like it.
Also, it's not entirely fair to compare a high-wattage part like GTX 480 vs GTX 680. The GTX 480 was built on 40nm and was rushed out, compared to GTX 580 (also on 40nm but with tamer wattage). The GTX 680 is a 28nm part with much lower wattage so it needs much less cooling. This is a fairer comparison of cards with similar TDP and shows that the GTX 670 has a downright pathetic heatsink: http://www.overclock.net/t/1256156/serious-cost-cutting-on-the-gtx-670s-heatsink
And as you noted, the stock GTX 680 cooling is pretty pathetic even considering the wattage difference between it and the GTX 480/580, and I think they went a little overboard with all their cost-cutting, from shrinking the PCB to weaker, cheaper cooling to skimping on memory bandwidth, etc. As was widely reported, and everybody should know this by now, the GK104 chip was originally intended to be midrange, not high-end.
And it shows.
Hopefully NV will allow voltage control in future GPUs that were intended to be high-end from the get-go.
Can you reword this or correct the grammar so I can understand your points, please? I'm not sure what you are getting at.