
GTX 780 rumors

As long as it's a real high-end card and not some mid-range POS (680), I will take it. Assuming I am able to modify voltage and clocks.
 
35% faster than the GTX580 is very disappointing as "high end". Usually a new generation+shrink provides 60+% performance increase like the 7970 GHz vs 6970.
 

a 680 GTX is as fast as a 590 GTX according to benches.

I would say that's fast, given one is a dual-GPU card.

I remember when GPU performance increases were 20% max, back in the 7800 GTX days.
 
That's only because the GTX 590 is basically SLI on a single card, and there may be some titles included in the ratings that don't scale (well). If scaling is good and no CPU bottleneck is present, the GTX 590 is 50-55% faster than the GTX 580. Additionally, the 590 is quite cut down compared to, for example, the 690, which is a decent dual-GPU card with ~90% more performance than the 680.

SGPU should be compared with SGPU, and there the 680 is undoubtedly only 35% faster than the 580:
http://www.techpowerup.com/reviews/AMD/Catalyst_12.11_Performance/23.html
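The averaging effect described above can be sketched in a few lines. All of the numbers here are hypothetical, chosen only to illustrate how a few non-scaling titles drag down a dual-GPU card's average speedup; they are not taken from the TechPowerUp data linked above.

```python
# Why a mixed-title benchmark average understates SLI gains.
# All figures below are made up for illustration.

def avg_speedup(per_title_speedups):
    """Mean speedup across titles (0.80 = +80% over a single GPU)."""
    return sum(per_title_speedups) / len(per_title_speedups)

# Suppose 3 titles scale well (+80%) and 2 don't scale at all (+0%):
speedups = [0.80, 0.80, 0.80, 0.0, 0.0]
print(f"{avg_speedup(speedups):.0%}")  # prints 48%
```

Even though the well-scaling titles show +80%, the overall rating lands near +50%, which is roughly the gap the post above describes between the GTX 590 and GTX 580.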

The 280 was a bigger improvement over the 8800 GTX, and the 480 a bigger improvement over the 280. Sure, this time power consumption went down instead of up. But that doesn't give me more fps 😉
 
Anyone know whether it's a permanent move from NVIDIA to prevent users from changing voltages?

I'm thinking of moving to NVIDIA after experiencing the famous Crossfire microstutter, but am less keen if I'll no longer be able to overclock very far.
 
Seeing that some 7970 (GHz) cards are already voltage locked as well, and that AMD copies much of what NVIDIA does, I would say voltage control might not be around in the future.
 


I think that depends on the AIB, because I haven't seen any reference boards that are voltage locked. It just sounds like the AIBs are removing voltage control from the non-reference models to save $$.
 
Seeing that some 7970 (GHz) are voltage locked as well already and that (INSERT BIT THAT I HAVE CHOSEN TO IGNORE), I would say it might not be relevant in the future.

Yeah ok, but IMHO the only reason to buy a GHz Edition in the first place is if you are either:
1) Afraid/unwilling to overclock, or
2) Painfully ignorant

My vanilla Sapphire 7970 OCs far beyond the GHz Edition's clockspeeds with only minor voltage increases. I see the GHz Edition as a marketing gimmick and a particularly inelegant move on AMD's part.

So other than the GHz Edition, are there any other cases of Southern Islands cards with locked voltages? I'm not aware of any...
 
I wasn't trying to flame, sorry. I just meant they copied turbo, for example, as well as adaptive vsync and the fps limiter via RadeonPro. So I thought it wasn't far-fetched that they could restrict overvolting too.

I definitely read about some partner cards that were locked; I would have to go back and check whether those were GHz Editions or not.
 
No harm done 🙂

Btw, RadeonPro is not created by, updated by, or otherwise affiliated with AMD. It's made by a dude somewhere. By all accounts he's freaking awesome, and AMD's driver team could use more people like him. I've even heard recent rumors about AMD tentatively sponsoring him. But a feature being added to that tool does not necessarily constitute AMD copying NVIDIA.
 

I think it really just depends on how they design their PCBs moving forward. The 6xx boards weren't exactly overbuilt. They work fine within spec, but NVIDIA was obviously nervous about users going too far past spec on them.
 

Hmmm, fair point, but users have always risked voiding their warranty and frying their cards by overclocking. I really don't like the way they're trying to limit what we can do with our cards out of spec. So what if I push my card too far and fry it? That's my own dumb fault, and I wouldn't hold it against the chip manufacturer.
 