
[H] GTX 780 Ti vs. R9 290X 4K Gaming

The downside is that it's grotesquely loud compared to aftermarket and even reference 780 Tis, and it has almost no reasonable overclocking headroom, so it gets crushed by quieter, much faster cards.

I know you are convinced that the 290X and the 290 have no OC headroom.

Why you've convinced yourself that these are the only 28 nm cards that won't reach 1200-1300 MHz on the core is a bit weird, though.

An image that was, for a while, often posted on these forums to show "absurd power consumption levels" is the following.

voltagetuning.jpg


AMD's Radeon R9 290X shows fantastic clock scaling with GPU voltage, better than on any GPU I've previously reviewed. The clocks do not show any signs of diminishing returns, which leads me to believe that the GPU could clock even higher with more voltage and cooling.
 
I know you are convinced that the 290X and the 290 have no OC headroom.

Why you've convinced yourself that these are the only 28 nm cards that won't reach 1200-1300 MHz on the core is a bit weird, though.

An image that was, for a while, often posted on these forums to show "absurd power consumption levels" is the following.

voltagetuning.jpg
Congratulations on picking up a 7 percent performance increase for an additional 225 watts of power consumption. Lol
 
The current generation of cards can't handle 4K, so it's all pointless anyway. You need to overclock the current flagship cards in CF or SLI just to reach 40 FPS. The next generation may see dual cards hitting 60 FPS at 4K, and the generation after that we may start seeing single cards become playable at 4K.

I think the monitors will become affordable well before 4K becomes a mainstream gaming resolution. They can scale down to 1080p, and will probably be able to run 240 Hz at that resolution (at the very minimum, 120 Hz). They'll have their place even if a lot of hardware still can't handle the full 4K res.
 
Especially on the multi-GPU front, with Mantle coming along...

http://www.google.com/translate?hl=.../13445/apu13-amd-mantle-premiers-details.html

hardware.fr said:
The application also gains the ability to take control of multi-GPU rendering and decide where each issued command runs. To that end, AMD has given Mantle access to the CrossFire compositing engine, inter-GPU data transfers, etc. This will allow multi-GPU modes that go beyond AFR and that adapt better, for example, to the use of GPU compute in games or to asymmetric multi-GPU systems, as is the case with an APU combined with a discrete GPU. For example, one can imagine the GPU handling the rendering while the APU handles the post-processing.
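The asymmetric split described in the quote can be sketched as a simple producer/consumer pipeline. This is plain Python illustrating the idea only, with made-up frame labels and queues standing in for the devices; it is not Mantle API code:

```python
# Conceptual sketch of asymmetric multi-GPU work splitting:
# the discrete GPU "renders" frames while the APU "post-processes" them.
from queue import Queue
from threading import Thread

frames = Queue()   # rendered frames waiting for post-processing
done = []          # finished frames, in order

def gpu_render(n_frames):
    for i in range(n_frames):
        frames.put(f"frame{i}")    # pretend the dGPU rendered a frame
    frames.put(None)               # sentinel: rendering finished

def apu_postprocess():
    while (frame := frames.get()) is not None:
        done.append(frame + "+fx")  # pretend the APU applied effects

worker = Thread(target=apu_postprocess)
worker.start()
gpu_render(3)
worker.join()
print(done)   # -> ['frame0+fx', 'frame1+fx', 'frame2+fx']
```

The point of the non-AFR modes described above is exactly this kind of division: instead of alternating whole frames between GPUs, each device takes the pipeline stage it is best suited for.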
 
Dang, with all this hate for team red I have the sudden urge to stick with them for the long haul, just to see them improve their products even more. I was waiting on Black Friday or Cyber Monday for a deal, because PhysX is my type of eye candy and I also just wanted to experience an Nvidia card, but now I don't know. What do you all think: a cheap Nvidia card, or wait for their high-end to drop a bit and run it with a Radeon card?
 
Congratulations on picking up a 7 percent performance increase for an additional 225 watts of power consumption. Lol

That's assuming he extensively tested the settings and didn't just overvolt for whatever clock he got due to time constraints.

We're talking about a 25% increase in voltage.
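As a rough sanity check on those numbers: dynamic power scales approximately with frequency times voltage squared, so a 25% voltage bump alone implies roughly a 56% power increase before any clock gain. A minimal sketch, assuming a ~290 W stock board power and a ~13% core overclock (both assumptions, not figures from the thread; leakage and fan power are ignored):

```python
# Rough estimate of why a 25% voltage increase costs so much power.
# Approximation: dynamic power ~ frequency * voltage^2.

def dynamic_power_ratio(freq_ratio, volt_ratio):
    """Estimated power multiplier from a clock/voltage change."""
    return freq_ratio * volt_ratio ** 2

baseline_watts = 290   # assumed stock board power for a 290X
freq_ratio = 1.13      # assumed ~13% core overclock
volt_ratio = 1.25      # the 25% voltage increase discussed above

ratio = dynamic_power_ratio(freq_ratio, volt_ratio)
extra = baseline_watts * (ratio - 1)
print(f"power multiplier ~{ratio:.2f}, extra ~{extra:.0f} W")
# -> power multiplier ~1.77, extra ~222 W
```

Under those assumptions the extra draw lands near the ~225 W figure quoted earlier, which is why a 25% overvolt is so expensive relative to the performance it buys.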
 
But now charging a 27% premium for a smidgen more performance and lower noise is something to be admired.

The default GTX 580 charged around a 35 percent premium over the default HD 6970 and offered a 12 percent difference overall at 1600p with 4x AA, and a 2 percent difference with 8x AA, based on Computerbase findings. Nothing has really changed, except both AMD and Nvidia are charging more for their single flagship SKUs.
 
The default GTX 580 charged around a 35 percent premium over the default HD 6970 and offered a 12 percent difference overall at 1600p with 4x AA, and a 2 percent difference with 8x AA, based on Computerbase findings. Nothing has really changed, except both AMD and Nvidia are charging more for their single flagship SKUs.

The 580's premium was pointed out at the time too, but GPU compute apparently justified it according to at least part of "the internet". Now there is even less of a difference in that regard. If Nvidia can fix their 4K multi-GPU issue in short order, there will also be a slim difference with regard to this thread's topic.
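The premium-versus-performance comparison above reduces to a quick perf-per-dollar check. A minimal sketch using the percentages quoted in the posts (the card names are just labels; the percentages are the posters' figures, not independently verified):

```python
# Perf-per-dollar comparison for a price premium vs. a performance gain.

def perf_per_dollar_ratio(perf_gain_pct, price_premium_pct):
    """How the pricier card's perf/$ compares to the cheaper one (1.0 = equal)."""
    return (1 + perf_gain_pct / 100) / (1 + price_premium_pct / 100)

# GTX 580 vs HD 6970: ~35% premium for ~12% more performance (4x AA figure)
print(round(perf_per_dollar_ratio(12, 35), 2))   # -> 0.83

# Same premium against the ~2% difference seen with 8x AA
print(round(perf_per_dollar_ratio(2, 35), 2))    # -> 0.76
```

In other words, at those quoted numbers the flagship delivered roughly 17-24% less performance per dollar than the cheaper card, which is the trade-off the posts are arguing about.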
 
Dang, with all this hate for team red I have the sudden urge to stick with them for the long haul, just to see them improve their products even more. I was waiting on Black Friday or Cyber Monday for a deal, because PhysX is my type of eye candy and I also just wanted to experience an Nvidia card, but now I don't know. What do you all think: a cheap Nvidia card, or wait for their high-end to drop a bit and run it with a Radeon card?

If PhysX is your thing (I can't get excited about it, no matter how hard I try), I'd suggest sticking to a cheap Nvidia card. I think none will have problems handling just PhysX. Although I suggest buying your AMD GPU first, playing with it, and if you can't get used to not having PhysX, buying an Nvidia too. I'd say you won't have a problem adjusting, though, especially if, like me, your previous system is just not up to snuff.
 