[H] GTX 780 Ti vs. R9 290X 4K Gaming


GaiaHunter

Diamond Member
Jul 13, 2008
3,697
397
126
There should not be a "quiet" or "performance" preset. AMD should combine both into one - nothing unreasonable about that.

Why is that a problem? What is the downside?

"OMG! I bought a 290X and it has a switch with 2 different presets! OMG! The dilemma! The horror!".
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Why is that a problem? What is the downside?

"OMG! I bought a 290X and it has a switch with 2 different presets! OMG! The dilemma! The horror!".

The downside is that it is grotesquely loud compared to aftermarket and even reference 780 Tis, and has almost no reasonable overclocking headroom, so it gets crushed by quieter, much faster cards.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
The interesting part to me is that the 780 Ti doesn't have the performance advantage that NV cards have had in the past, such as the 6970-580 gap, yet the price premium is large.

Ironically, the price premium percentage is smaller with the GTX 780 Ti, and some forget how competitive the stock HD 6970 was with the stock GTX 580 at launch, overall, at 1600p:

12 percent with 4x AA and 2 percent with 8x AA at 1600p:

http://www.computerbase.de/artikel/grafikkarten/2010/test-amd-radeon-hd-6970-und-hd-6950/26/
 

Will Robinson

Golden Member
Dec 19, 2009
1,408
0
0
Looking forward to the upcoming XFire V SLI results...
KYLE[H]
I have been sitting here playing with 290X in CrossFire for the last hour, which I installed in my personal box sitting under my desk. These are not "insanely loud." In fact, once I am gaming I do not notice these at all. Can you hear these cards in CrossFire? Sure, but they are far from distracting. I will do a write-up this week on the 290X CrossFire experience as moving from TITAN SLI. Actually that is why I am sitting here in the middle of the day playing games. YAY FOR ME!
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Reviewers at times do 180s.


Kyle Bennet said:
The GTX 480 is hot. I know many enthusiasts are not going to be satisfied with a video card that idles at 73C (and even hotter in most SLI setups) and runs 93C under pretty much any kind of load. There have to be long term implications to this and I would just suggest that you buy from a board builder that has a rock solid warranty and a reputation for great customer service if Fermi is for you.

Curious if there is the same temperature implication for the 290X and 290? Or are many enthusiasts not going to be satisfied with high temperatures!

HardOCP said:
Do not be alarmed at the 95c temperatures on the R9 290X. Remember, the video card is designed to run with the GPU that hot. It may be a shock to us, expecting much lower temperatures, but 95c really is OK for the card and GPU.
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
It seems [H] is going too soft :p but honestly, where are all the OC reviews? I remember people saying it's the OC numbers that matter, but it seems they have become awfully quiet all of a sudden.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Are you calling out Brent about his opinion on cards that you have no first hand experience with?


No, I'm simply stating that in his opinion it isn't too loud, while other reviewers have said otherwise, also in their opinion of course.
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
Well, to me it looks like a call-out but, eh, whatever. I too wonder why there has been a lack of OC numbers from reviewers.
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
No, I'm simply stating that in his opinion it isn't too loud, while other reviewers have said otherwise, also in their opinion of course.

Yes, but it seems the reviewers that have put it in a case are the ones that don't think it's too loud, while those that open-bench with the unit just ~1m away think it is too loud. Judging by the lack of "my card is making me deaf" 290/290X owner posts, I have to give the nod to the "not too loud" side until I have firsthand experience.

Which may be never, considering I have only had one reference card out of a half dozen in the last few years.

I'd expect anyone looking to game at 4K will either have a very nice sound system or headphones, making the case noise level less of a concern, or, if they are audiophiles in search of a pristine experience, an isolated water-cooled setup would be the only sensible approach.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Considering how important the fan threshold and temperature threshold are for dynamic clocking -- the higher the thresholds, the more performance -- to me, AMD intentionally traded higher temperatures and acoustics for performance.

Sure hope nVidia keeps their balanced default approach and the market doesn't reward higher out-of-box temperatures to eke out that extra showcase performance.

It will be interesting to see how the market reacts to this.
 

BrentJ

Member
Jul 17, 2003
135
6
76
www.hardocp.com
Still waiting for you to overclock something; seems a bit suspicious that you haven't overclocked any card since the 280X review.

It's been one launch review after another since then that I've had to work on. If you notice, most of the reviews lately have been video card related, by me. I haven't had time yet to get to it, we also have priorities that were pre-planned to work on before all these launches. Overclocking testing is planned, it's on the schedule. The time it takes to produce these evaluations is highly underestimated. I'll let you know when I gain super powers that allow me to operate above human speed.
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
Considering how important the fan threshold and temperature threshold are for dynamic clocking -- the higher the thresholds, the more performance -- to me, AMD intentionally traded higher temperatures and acoustics for performance.

Sure hope nVidia keeps their balanced default approach and the market doesn't reward higher out-of-box temperatures to eke out that extra showcase performance.

It will be interesting to see how the market reacts to this.

I'm not sure I would say NVidia took the balanced approach, as your checkbook balance shows the difference between the AMD purchase and the NVidia one.
 

chimaxi83

Diamond Member
May 18, 2003
5,457
63
101
Want firsthand experience with the sound levels of something? Watch a YouTube video! You too can be an expert!

:rolleyes:


It's been one launch review after another since then that I've had to work on. If you notice, most of the reviews lately have been video card related, by me. I haven't had time yet to get to it, we also have priorities that were pre-planned to work on before all these launches. Overclocking testing is planned, it's on the schedule. The time it takes to produce these evaluations is highly underestimated. I'll let you know when I gain super powers that allow me to operate above human speed.

Don't take the bait!
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
I'm not sure I would say NVidia took the balanced approach, as your checkbook balance shows the difference between the AMD purchase and the NVidia one.

Easy when my context was efficiency! Even with the GTX 480 there was a premium -- nothing new!
 

tolis626

Senior member
Aug 25, 2013
399
0
76
Better user experience is worth more money to a lot of people, oddly enough.

Last time I went with this thought in mind, it was when I bought the HTC Desire HD for a little over 600€ instead of the Samsung Galaxy S for 400€, and I regretted it sorely. Every time I remembered I gave 200€ extra for practically the same performance (which is what is important to me), I wanted to hit my head on the wall. I pitied the wall, so I didn't.

I think the whole noise issue is overblown. Yeah, it's a problem, but not as big as people make it out to be. My old PS3 (the original 60GB Fat edition) can be heard from space if I game on it for more than an hour. There is NO way the 290X can be worse than that. My only concerns with the reference design are that AIB cards will have better thermals, thus behave better when clocked high, and, for the most part, more robust circuitry, especially for power delivery. That they will be less noisy is just a nice bonus for me. Plus, I don't think that I will have a problem buying a second open-air 290X down the road and using it in CF in my HAF-X. That thing has tremendous airflow. More than 2 cards, though, and I have an oven, it seems.
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
The trouble is that you don't know what some people's usage of these things is. A while back I gave a card to a buddy and he never stopped complaining about how loud it was. I told him he was crazy. When I went to his house I soon realized what the problem was. He had the card in an itty-bitty case that looked like it was 10 years old, with no airflow, and he took the side cover off. The worst part is that the PC was on his desk, literally about 12 inches from his shoulder and ear.

My father puts his case in a desk cubby with the door closed. Hot box, anyone?
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,697
397
126
Reviewers at times do 180s.

Curious if there is the same temperature implication for the 290X and 290? Or are many enthusiasts not going to be satisfied with high temperatures!

Temperatures, noise, premiums...

But let's see what the conclusion of the review actually said.

GeForce GTX 480

The GeForce GTX 480 has more relevance to gaming, but at a higher cost compared to the competition. With the GeForce GTX 480 we saw some situations where it provided a superior gameplay experience compared to the Radeon HD 5870. In Metro 2033 it allowed us to use a higher AA setting, and in Bad Company 2 we could utilize 8X CSAA at 2560x1600. In other games, the GeForce GTX 480 did allow high levels of AA at high resolutions like 2560x1600.

The GTX 480 is quite simply not a "Radeon HD 5870 Killer." We don’t know if we were supposed to think it would be or not, but with the power consumption this beast requires, you would hope it would be providing a bit more performance than it is. We can’t say that any of the real world gameplay advantages blew us away compared to the experience of gaming on a Radeon HD 5870. The Radeon HD 5870 proved to allow high AA settings at 2560x1600 in many games, the same as the GTX 480, just not as high. The only game that clearly favors the GeForce GTX 480 is Metro 2033. (And we know that AMD still has its driver team looking over the final code release of the game and has not yet tweaked for it.) Even in BC2, 8X CSAA isn’t a huge improvement over 4X AA which the HD 5870 allowed at 2560x1600. What is the value to the gamer of being able to use 8XAA instead of 4XAA in Bad Company 2?

We are seeing some trades back and forth; in some games the GTX 480 is slightly faster, in other games the HD 5870 is slightly faster. There wasn’t anything that really stood out, other than Metro 2033. Once again, we have to factor in cost, power, and heat. The GeForce GTX 480 consumes more power than the HD 5870, produces more heat, and costs more money (which might also include the cost of a new PSU). It doesn’t seem like the best value with all this factored in. The only thing that "blew us away" was the heat coming out of the video card and the sound of the fan. If you have not taken a look and a listen to the videos on the previous page, you should.

And they even said the GTX 480 SLI was a reasonable value in terms of absolute performance.

GeForce GTX 480 SLI

More relevant still is the awesomeness of GeForce GTX 480 SLI performance. We simply were not disappointed in the performance that GTX 480 SLI delivered. We think a large part of that may be the fact that the GTX 480 benefits from a larger framebuffer per GPU. Still, the facts are it provided incredible performance at incredible AA settings at 2560x1600. We can make an educated guess that this configuration will equally impress when using multi-display gaming. NV Surround is not yet supported in the current driver release but we are expecting it within 30 days, hopefully. While the price of admission is going to be very high, 3x1 display NV Surround gaming looks as though it will possibly bring a "can of whoopass" to multi-display gaming. SLI will be required for NV Surround 3x1 gaming, but you will be able to use 3 DVI displays natively. No DisplayPort required.

We saw real-world benefits, which we think are related to the framebuffer, in Aliens vs. Predator. The Radeon HD 5970 was not able to play with 4X AA at 2560x1600, whereas GeForce GTX 480 SLI was. We also experienced awesome performance in Bad Company 2 with 16X CSAA and 2X TR SSAA that was much more than "playable." Even more impressive was DiRT 2, in which we were able to have 8X TR SSAA plus 8X CSAA enabled, for a truly breathtaking visual experience. And get a load of this: GeForce GTX 480 SLI allows Crysis Warhead to be playable at 2560x1600 4X AA/16X AF at all Enthusiast settings. Take that to the bank, GTX 480 SLI is the real deal.

The downsides to GTX 480 SLI follow the same pattern as the others, cost, power and thermals. GTX 480 SLI is going to set you back $1000 for the cards alone, and the power requirements are severe. GeForce GTX 480 SLI is very much an extravagant enthusiast only solution.

NVIDIA rules the schoolyard when it comes to multi-GPU scaling. CrossFire gets left with a black eye.

Then we have that second article that Sir Pauly quoted.

The Bottom Line
The GTX 480 is hot. I know many enthusiasts are not going to be satisfied with a video card that idles at 73C (and even hotter in most SLI setups) and runs 93C under pretty much any kind of load. There have to be long term implications to this and I would just suggest that you buy from a board builder that has a rock solid warranty and a reputation for great customer service if Fermi is for you. These cards are hot and can very much raise the temperature in the room you are using your setup in. Maybe NVIDIA can bundle a coupon for a new ceiling fan?

The GTX 480 is loud. The fans however do not become annoying unless they are spun up to very high RPM levels. Even SLI, while you can certainly hear it, is not distracting once you get used to it. Single card noise is manageable if you have a lower ambient temperature to start with and have good chassis airflow. No doubt this is one of the reasons we are seeing a "Fermi Certified" chassis from Thermaltake. If you have good airflow in your chassis, the sound is not annoying high pitched or bothersome. However If you are a person that cherishes silence, the GTX 480 is not for you.

Some people still avoid acknowledging that the GTX 480 was mostly criticized because it was considerably more expensive than the 5870, consumed twice the power (or around 100W more) and only offered something like 11% more performance at the time of release. In DX11 games it did offer more performance (around 20%).

Some time ago, charging a 10% premium at release for 15% more performance, while having more memory and consuming less power, was a crazy, absurd premium.

But now charging a 27% premium for a smidgen more performance and lower noise is something to be admired.

People really do 180s.
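The premium figures in this argument are easy to sanity-check. As a quick sketch, assuming the widely reported launch MSRPs of $549 for the R9 290X and $699 for the GTX 780 Ti (the thread itself never states the prices), the 27% figure works out:

```python
# Sketch: check the ~27% premium quoted above.
# Assumed launch MSRPs (not given in the thread): R9 290X $549, GTX 780 Ti $699.
def premium_pct(base_price: float, premium_price: float) -> float:
    """Percentage premium of the pricier card over the base card."""
    return (premium_price / base_price - 1) * 100

print(f"780 Ti over 290X: {premium_pct(549, 699):.0f}% premium")  # prints "780 Ti over 290X: 27% premium"
```

The same formula applied to whatever launch prices a poster has in mind is what produces the 10% vs. 27% comparison being argued over.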
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
Wrong thread!

I'll let the moderators moderate me, thanks... I may stray off-topic a little bit, but I believe my posts are loosely related and at minimum helpful for someone in the given situation.

Somebody looking for either a 780 or a 290 for 4K gaming may have missed a random thread about heat and noise and be curious how it pertains to them and their situation.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
I'll let the moderators moderate me, thanks... I may stray off-topic a little bit, but I believe my posts are loosely related and at minimum helpful for someone in the given situation.

Somebody looking for either a 780 or a 290 for 4K gaming may have missed a random thread about heat and noise and be curious how it pertains to them and their situation.

Actually, I posted a graph meant for another thread. When I said "Wrong thread!" that was my edit note removing the graph that wasn't intended for this thread.