So what's the verdict on the R9 290X?


djnsmith7

Platinum Member
Apr 13, 2004
2,612
1
0
I would say it depends on the group of people answering the question. For those of us who care about GPU temps, more will pass on it than buy it. From the looks of it, the 290X could hit temps 15°C higher than, say, a 7970 at both idle and load, and that's just too damn hot.

You're always going to have the group of inexperienced folks who just don't know any better or don't do much research before they buy, but they won't be the majority.

Overall, I don't think this card will be a success; its engineering faults have been highlighted by many early on. Even under water, I think the card will still run hotter than we'd want.
 

Teizo

Golden Member
Oct 28, 2010
1,271
31
91
Nvidia won't have to. They will have the better product with the more reliable name. Intel doesn't drop their prices to match AMD for the same reason. Aftermarket coolers may tame the 290x somewhat, but don't look for any major improvement from the ones that cost $20 more. The cards with the real high-end coolers are going to cost $50-100+ more. A non-crippled 290x at $600 is a far better value than the mess that is the OEM design at $550. With a 290x at $600, a 780ti would be an equal value at $650. I wouldn't pay $700 for a video card, but the state of the market would make even $700 a justifiable price for a 780ti.

To answer the original question, the 290x is currently in beta mode. I wouldn't pay for a beta product. Once the aftermarket fixes the bugs and delivers a usable product we will reevaluate it.
Off topic, your signature is hilarious :p

/Carry on everyone....
//PS...I even included a pic for you to drool over
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
The most flawed awesomeness you can imagine. Sounds awful, but that's what it is.

Very true.

Great performance but poor implementation. These things should never have been released with the reference cooler. You can see the thermal throttling in GPGPU benchmarks, where the R9 290X only marginally outperforms the 7970GE in Bitcoin mining, for example.
 

Subyman

Moderator, VC&G Forum
Mar 18, 2005
7,876
32
86
Depends on how many they had to sell the first day, seeing they sold out in less than an hour, and how many they plan on bringing back to market fast enough.

If they can't keep up with demand, gamers will turn toward the next best thing.

No they won't. The 780 has been out long enough for anyone that wanted one to get one.
 

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
Is it a great success?

Good card, good price, bad cooler.
Only buy the reference design if you are watercooling. Otherwise, wait a couple months for Asus and MSI to come out with their custom versions.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Verdict to be determined... wait till non-reference cooler 290X's come out and also for the GTX 780 Ti tests before buying, if you can wait.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
THE GOOD (before throttling)

Crysis 3 @ 2560x1440 - 53.8fps vs 52.2fps
http://www.anandtech.com/show/7457/the-radeon-r9-290x-review/12

BF3 @ 2560x1440 - 67.1fps vs 59.2fps
http://www.anandtech.com/show/7457/the-radeon-r9-290x-review/11

Metro: LL @ 2560x1440 - 58.5fps vs 49.5fps
http://www.anandtech.com/show/7457/the-radeon-r9-290x-review/8

Hitman @ 2560x1440 - 76.5fps vs 65.7fps
http://www.anandtech.com/show/7457/the-radeon-r9-290x-review/15



THE BAD

30% loss of performance from throttling
http://www.techpowerup.com/reviews/AMD/R9_290X/33.html

780 runs cooler - 81°C vs 93°C (load, Crysis 3)
780 runs quieter - 47.5dB vs 58.9dB (load, Crysis 3)
780 uses less power - 327W vs 425W (single-GPU load, Crysis 3) / 536W vs 727W (dual-GPU load, Crysis 3)
http://www.anandtech.com/show/7457/the-radeon-r9-290x-review/19

780 faster at tessellation - 106.6fps vs 91.1fps (Heaven)
http://www.pcper.com/reviews/Graphi...-4K-Preview-Testing/3DMark-and-Unigine-Heaven

780 SLI offers better frame pacing.
http://www.pcper.com/reviews/Graphi...-CrossFire-and-4K-Preview-Testing/Battlefield-
http://www.pcper.com/reviews/Graphi...90X-CrossFire-and-4K-Preview-Testing/Crysis-3
http://www.guru3d.com/articles_pages/radeon_r9_290x_crossfire_vs_sli_review_benchmarks,8.html
http://www.guru3d.com/articles_pages/radeon_r9_290x_crossfire_vs_sli_review_benchmarks,9.html
http://www.guru3d.com/articles_pages/radeon_r9_290x_crossfire_vs_sli_review_benchmarks,10.html



THE UGLY

single GPU - it is a toss-up.

multi GPU - still nVidia.
By the time you (1) go under water to control the throttling and the noise, (2) buy a larger PSU to feed the added current draw, and then factor in (3) the weaker tessellation performance and (4) the worse multi-GPU frame pacing,
nVidia simply wins by default.



-----



If success means almost catching up to nVidia - then yes.

How do you know it wasn't throttling when it was kicking NVidia's ass? From what I've read, it's beating out a 780/Titan more often than not even after the throttle kicks in. And the scaling from CF seems quite good actually. This post has NVidia enthusiast written all over it.

As far as the frame pacing is concerned, here's a quote from your own source.

The small spikes are not noticeable in-game whatsoever. Crossfire is rendering so fast that we are almost closing in on the monitor latency already.
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
The stock cooler is a letdown; besides that, it looks promising.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
A few observations (well more than a few)

1) Competition is always good

2) It runs hot, is a little power hungry (a minor gripe, really, at these performance levels), and is a little loud in its current form.

3) Crossfire is getting better but still not perfected. It should be noted that it is a far sight better than where we were at the last AMD GPU launch.

4) The pricing is decent at first glance. I don't think people looking at this level of card care much about value, but taken as a whole package I don't see much value there. By the whole package I mean the cooling, noise, etc. It all has to go together elegantly, IMO, and I'm not particularly picky on noise or power draw, but I am on cooling. I wish I hadn't bought the cards I'm running now and had waited longer for better options, given the cooling.

5) Overclocking room looks to be small in comparison to Nvidia's offerings right now (thermal throttling).

6) Anyone looking at this card should also wait to see two things: what the GTX 780 Ti brings in terms of performance, and whether prices adjust on the regular 780, pushing it into a position that might present better value. There is also the regular 290 from AMD, which might (though it is unknown) eat into the total value of the 290X part. It is possible for the 290 to perform pretty close to the 290X, similar to how an overclocked 670 was close to a 680, or how a 7950 could overclock well and be within a few percentage points of a 7970 for less money. That might be a better buy for people. We don't know yet, but if a regular 290 can generally be BIOS-flashed to work as a 290X without issues, we may have to reevaluate where the 290X fits in, especially if the 290 quickly gets aftermarket cooling solutions from Sapphire, Asus, MSI, etc.
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
Even if it's not a wild success in the short term, it's at least good news for the future. 64 ROPs and at least 4GB of RAM are the new standard for the high end. 20nm parts should be extremely interesting, as we're either going to get even beefier/faster parts or even more cost-effective ones (maybe a mix in between).
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
The good news is that whoever buys a 290X or 780 Ti will likely have the fastest single-GPU card for quite some time. I have a feeling they are squeezing the 28nm node about as hard as they can, and any significant performance increases still coming out of 28nm will simply not be cost effective for either brand, as die sizes will be too big and the power delivery circuitry will get more expensive. These are likely going to be the top-tier GPUs until 20nm.

Whatever the case, it will be interesting to see which camp can squeeze the most out of 28nm.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
I guess that depends on who you ask. There are mainly two types of people on forums: pro-AMD and pro-Nvidia.
One will say it is the best thing in the world. The other will say it is worthless and the most expensive space heater ever!
There is no middle ground most of the time.
If you don't care about a few dB more, the 290X delivers top performance and is $100 cheaper than the closest competition. If you want a silent card, wait for the custom designs/780 Ti to make a decision, or buy this: http://www.techpowerup.com/reviews/powercolor/hd_7850_scs3_passive/
;)
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
I'd say there is a middle ground most of the time. It's just that folks on the two extremes talk more and talk louder.
 
Feb 19, 2009
10,457
10
76
I guess that depends on who you ask. There are mainly two types of people on forums: pro-AMD and pro-Nvidia.
One will say it is the best thing in the world. The other will say it is worthless and the most expensive space heater ever!
There is no middle ground most of the time.

This is false. Many of us who are "pro AMD" have been as vocal or more so about the stupid cooler AMD keeps putting on its reference designs. As I've said, I for one see absolutely no reason to buy a reference R9 290X unless you go water. Running it on the stock HSF is just silly.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
This is false. Many of us who are "pro AMD" have been as vocal or more so about the stupid cooler AMD keeps putting on its reference designs. As I've said, I for one see absolutely no reason to buy a reference R9 290X unless you go water. Running it on the stock HSF is just silly.
I would think so but I've seen some people put up with ungodly sound because "more noise = faster, obviously." Reminds me of that idiotic Vantec Tornado fad a decade ago. Hopefully people speak with their wallets and wait for aftermarket solutions. I think the larger problem with the new cooler design is that it seems essential to run the fan at 100% to extract full performance from the chip even at stock speeds. Hopefully these things can be undervolted a bit for those wanting to cut down on the noise.
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
It's true that the 290X cooler is bad. It can't handle the Hawaii die's power consumption. That, in combination with the "boost" PowerTune is trying to deliver at 95°C, makes a GPU waterblock a necessity for this product.

Because of GPUs boosting and having turbo, there has never been a time when you'd see as much of a performance gain from a waterblock on a video card as right now with this card. The boost is unfortunately backwards compared to Nvidia's wonderful GPU Boost 2.0, and the cooler isn't even sufficient to let the card run at its default speed. See how it is a win-win? You get the 1200/1300/1400MHz water clocks and low temps from a 727-900MHz card that you thought was a 1000MHz card when you were looking at its benchmarks.

Water is like a two-for-one with the 290X. People with EK blocks and 290Xs are having a field day. It's an affordable combination too if you already have a watercooled system. The reviews are kind of meh: some say it's equal to or slower than a 780, and some say it beats the Titan and is the fastest video card. For games and stock-vs-stock reviews it kinda trails the Titan, IMO. Max overclocks are a different story, though. A $550 card + $140 EK block completely smashes a $1000 card + $140 block. Go to 8xAA, 1600p, multi-monitor, or 4K and the lead grows stronger.
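To put rough numbers on that (just quick arithmetic on the clocks and prices quoted above; the 1300MHz value is only the midpoint of the water-cooled range people are citing, not a measurement):

Code:
# Quick arithmetic on the clocks and prices quoted above. Nothing here is
# measured; 1300 MHz is just the midpoint of the 1200-1400 MHz water range.
throttled_mhz = 900    # roughly where the reference cooler lets the card sit
spec_mhz = 1000        # the advertised boost clock
water_mhz = 1300       # midpoint of the water-cooled overclocks mentioned above

print(f"vs throttled: +{(water_mhz / throttled_mhz - 1) * 100:.0f}% clock")  # ~+44%
print(f"vs spec:      +{(water_mhz / spec_mhz - 1) * 100:.0f}% clock")       # ~+30%

# Total cost of the two water-cooled setups being compared
print(f"290X + EK block:  ${550 + 140}")     # $690
print(f"Titan + EK block: ${1000 + 140}")    # $1140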
 

DarkKnightDude

Senior member
Mar 10, 2011
981
44
91
I think it's a great card with the worst stock cooler ever made. I wish they had priced it at $600 and added a decent cooler.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
The high end doesn't really hold any appeal for me anymore. The thought of spending $550 or $650 on a card that is probably <30% faster than my 7970 @ 1125MHz... I dunno, it just doesn't seem worth it.
 

UaVaj

Golden Member
Nov 16, 2012
1,546
0
76
How do you know it wasn't throttling when it was kicking NVidia's ass? From what I've read, it's beating out a 780/Titan more often than not even after the throttle kicks in. And the scaling from CF seems quite good actually. This post has NVidia enthusiast written all over it.

As far as the frame pacing is concerned, here's a quote from your own source.

As for the 290X beating the 780/Titan while throttling: that is an excellent question. Hopefully 290X owners can answer it soon.

Just keep in mind: (1) such benchmarks run only a few minutes, then the GPU gets a few minutes to cool down before the next benchmark, all on an open bench; versus (2) actual gaming, which goes on for hours on end without any cool-down, inside a case.

So for actual gaming, we can all agree GPU throttling needs to be kept in check.
A single GPU needs a better cooler, and multi-GPU needs to go under water.

----

As for frame pacing in multi-GPU:
all reviews clearly show that pushing 2560x1440 (3.7MP), SLI is better.
all reviews clearly show that pushing 4K (8.3MP), SLI is significantly better.

So for those considering multi-GPU, we can all agree that as more GPUs are added and resolution increases, the frame pacing gap will only magnify.
2 GPUs is a toss-up with nVidia slightly ahead. 3 GPUs is time to seriously consider nVidia. 4 GPUs is definitely nVidia.

----

As for fanboyism: couldn't care less which camp. Zero loyalty.
Had 7970 CrossFire first, only to be greeted with broken CrossFire.
Now on 680 tri-SLI, everything works as it should.

The upgrade path goes something like this:
290X (550x2) + water (500) + case delta (100) + PSU delta (50) = 1750
GTX 780 Ti (750x2) = 1500*

And that does not even account for the ongoing increase in electricity usage.

The math does not lie. nVidia is the overall simpler solution.
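For anyone curious what that electricity difference actually amounts to, here is a rough sketch using the dual-GPU Crysis 3 load numbers AnandTech measured (536W for 780 SLI vs 727W for 290X CF, posted earlier in the thread) as a stand-in for the two setups; the $0.12/kWh rate and 3 hours of gaming per day are assumptions, not figures from any review:

Code:
# Rough yearly electricity-cost sketch for the dual-GPU setups above.
# Assumed, not from the thread: $0.12/kWh and 3 hours of gaming per day.
rate_per_kwh = 0.12
hours_per_day = 3

def yearly_cost(load_watts):
    """Yearly cost of gaming at the given load power draw."""
    kwh_per_year = load_watts / 1000 * hours_per_day * 365
    return kwh_per_year * rate_per_kwh

sli = yearly_cost(536)   # 780 SLI Crysis 3 load draw (AnandTech), ~$70/yr
cf = yearly_cost(727)    # 290X CF Crysis 3 load draw (AnandTech), ~$96/yr
print(f"780 SLI: ${sli:.0f}/yr   290X CF: ${cf:.0f}/yr   delta: ${cf - sli:.0f}/yr")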

-----

At the end of the day, it is your money; spend it as you see fit.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
I do believe the PCPer review specifically stated they allowed the cards to warm up before they ran their tests, and it still matched the Titan's performance. You seem to be doing an awful lot of speculating against AMD for someone who isn't brand loyal.
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
As for the 290X beating the 780/Titan while throttling: that is an excellent question. Hopefully 290X owners can answer it soon.

Just keep in mind: (1) such benchmarks run only a few minutes, then the GPU gets a few minutes to cool down before the next benchmark, all on an open bench; versus (2) actual gaming, which goes on for hours on end without any cool-down, inside a case.

So for actual gaming, we can all agree GPU throttling needs to be kept in check.
A single GPU needs a better cooler, and multi-GPU needs to go under water.

----

As for frame pacing in multi-GPU:
all reviews clearly show that pushing 2560x1440 (3.7MP), SLI is better.
all reviews clearly show that pushing 4K (8.3MP), SLI is significantly better.

So for those considering multi-GPU, we can all agree that as more GPUs are added and resolution increases, the frame pacing gap will only magnify.
2 GPUs is a toss-up with nVidia slightly ahead. 3 GPUs is time to seriously consider nVidia. 4 GPUs is definitely nVidia.

----

As for fanboyism: couldn't care less which camp. Zero loyalty.
Had 7970 CrossFire first, only to be greeted with broken CrossFire.
Now on 680 tri-SLI, everything works as it should.

The upgrade path goes something like this:
290X (550x2) + water (500) + case delta (100) + PSU delta (50) = 1750
GTX 780 Ti (750x2) = 1500*

And that does not even account for the ongoing increase in electricity usage.

The math does not lie. nVidia is the overall simpler solution.

-----

At the end of the day, it is your money; spend it as you see fit.

Maybe you should read some reviews of the 290x before writing about it.

1. Several reviewers stated they warmed up the cards before benchmarking.

TPU
We then made sure the card was at constantly realistic long-term-use temperatures for our benchmarks.

Actually, please include sources, since you have so many claims. Until then I'll assume they are all as "true" as your first claim, which is indeed false. I'm not going to spend time debunking your claims; just include the sources so we can go see for ourselves.
 

Black Octagon

Golden Member
Dec 10, 2012
1,410
2
81
I'm waiting for aftermarket versions AND Mantle results before I draw my own conclusions, frankly.