[H] PowerColor PCS+ R9 290X Video Card Review

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
http://www.hardocp.com/article/2015/03/30/powercolor_pcs_r9_290x_video_card_review/1

Looks like [H] is doing a make up after the Asus Poseidon review. :D

The 290X does well here. The PCS+ is a very good aftermarket 290X. It's not the quietest, as shown in this review (the Sapphire Tri-X would have been better there), and not the cheapest (the XFX DD can be had for $290 AR), but it's still a very good cooler overall and reasonably priced.

PowerColor PCS+ R9 290X vs MSI GeForce GTX 970 4G

In this review these two video cards performed the closest to each other. The GTX 970 and PowerColor PCS+ R9 290X performed similarly using stock reference clock speeds in most games. Once we overclocked each video card, we saw some variation as to which provided a better gaming experience. Generally, the PowerColor PCS+ R9 290X with its max overclock out-performed the MSI GeForce GTX 970 4G using its highest overclock. That helped tilt the performance advantage to the PowerColor PCS+ R9 290X when overclocked.

Interesting that with both cards O/C'd the 290X was faster. That runs counter to the general consensus, with people believing that once O/C'd, Maxwell has the advantage.
 

.vodka

Golden Member
Dec 5, 2014
1,203
1,537
136
When going higher than 1080p, as tested in this review, the 980/970's advantage over the 290/290X shrinks or disappears. That 512-bit bus is sure doing its part here for the 290X.

It's interesting to see the system power consumption being almost the same between a 970 and a 290x, both OCed.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
When going higher than 1080p, as tested in this review, the 980/970's advantage over the 290/290X shrinks or disappears. That 512-bit bus is sure doing its part here for the 290X.

It's interesting to see the system power consumption being almost the same between a 970 and a 290x, both OCed.

I believe MSI uses a much higher TDP target on their Gamer series. Notice that the 970 actually uses more power than the reference 980. A bit off topic but that's why I believe that much of Maxwell's superior efficiency is due to software rather than an inherent hardware advantage. I think it's a continued refinement of their boost technology. AMD is definitely behind here and needs to put some work in.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Another review that doesn't jibe with their past reviews. As always, one half of the posters here will be happy and the other half will write them off. I'll be the first to say it: the 970's OC numbers are off. OC'd 970s are generally 95-100% the speed of a stock GTX 980. In this particular review, the OC'd 970 is all over the place in performance, but averages significantly below 95-100% of a stock GTX 980.

Anyways, next week's review will show a 290x being slower than a GTX 960 and the crowd here will flip sides as to how valid [H] is.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Another review that doesn't jibe with their past reviews. As always, one half of the posters here will be happy and the other half will write them off. I'll be the first to say it: the 970's OC numbers are off. OC'd 970s are generally 95-100% the speed of a stock GTX 980. In this particular review, the OC'd 970 is all over the place in performance, but averages significantly below 95-100% of a stock GTX 980.

Anyways, next week's review will show a 290x being slower than a GTX 960 and the crowd here will flip sides as to how valid [H] is.

As I said, "Makeup" for the Asus Poseidon review.

You are always going to have someone complain though, as you say. The last review the Poseidon was a golden sample, this one the 970 is a dog. Go figure. :D
 

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
I believe MSI uses a much higher TDP target on their Gamer series. Notice that the 970 actually uses more power than the reference 980. A bit off topic but that's why I believe that much of Maxwell's superior efficiency is due to software rather than an inherent hardware advantage. I think it's a continued refinement of their boost technology. AMD is definitely behind here and needs to put some work in.

I found with my Asus Strix 970 that simply enabling 120% power, my temps, which usually load at about 62°C, shoot right up to 68°C. I guess when the 970 is overclocked it uses quite a bit more power than usual. Not far off from what my old 770 did with a weaker cooler.

I like stock and dropping the power limit for older games; the card really loads much cooler at about 51°C, with performance still better than my 770. That works better than vsync/frame capping anyways. Clocks at full load hit about 1000-1114MHz.
 
Feb 19, 2009
10,457
10
76
The reason for the difference is this:

"For the PowerColor PCS+ R9 290X we are using Catalyst Omega 14.12. We recorded most of our data using this driver, however did use the latest 15.3 Beta dated 3/20/2015 for both Far Cry 4 and Dying Light."

The recent DL patch along with the beta drivers has boosted performance in that GW title massively for AMD GPUs. Same for FC4. It takes these GW devs many months after launch to make a patch so that AMD GPUs aren't so gimped.

Because their small sample of games makes [H]'s reviews very vulnerable to bias skews, AMD improving in two GW titles makes the R290X seem much more competitive.

But we already knew all that for a long time, R290X is within 10% of a 980 across many titles. That puts it ahead of the 970 by default. OC v OC, the 970 will catch up and potentially exceed it, due to the average 20% OC headroom vs 15% for the R290X.

It still doesn't make the R290X viable anywhere close to the 970 in pricing because OC v OC, the power usage gap is huge. R290X has to be at least $50 cheaper to make it worthwhile. Yes, even with the gimped 3.5gb vram on the 970.

ps. As soon as GTA V and Witcher 3 (GameWorks) lands in [H]'s small suite of games, you can bet the incessant whine regarding poor AMD performance or lack of CF support will return. -_-
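To put rough numbers on that OC-vs-OC argument, here's a quick back-of-the-envelope sketch using only the estimates quoted in this thread (290X within 10% of a stock 980, ~15% OC headroom for the 290X, and tviceman's figure of an OC'd 970 landing at 95-100% of a stock 980). Treat all of these as forum estimates, not benchmarks:

```python
# Back-of-the-envelope OC comparison. All percentages are forum estimates
# from this thread, not measured benchmarks. Stock GTX 980 = 100.
gtx980_stock = 100.0
r290x_stock = gtx980_stock * 0.90      # "R290X is within 10% of a 980"
r290x_oc = r290x_stock * 1.15          # ~15% average OC headroom claimed

print(f"R9 290X overclocked: {r290x_oc:.1f} (vs stock 980 = 100)")

# An OC'd 970 typically landing at 95-100% of a stock 980:
for gtx970_oc in (95.0, 100.0):
    gap = r290x_oc - gtx970_oc
    print(f"OC'd 970 at {gtx970_oc:.0f}: OC'd 290X ahead by {gap:.1f} points")
```

On these estimates the OC'd 290X stays slightly ahead in both cases, which lines up with what [H] measured here; the 970 would need its full ~20% headroom from an above-average stock baseline to flip the result.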
 

Abwx

Lifer
Apr 2, 2011
11,804
4,726
136
A bit off topic but that's why I believe that much of Maxwell's superior efficiency is due to software rather than an inherent hardware advantage. I think it's a continued refinement of their boost technology. AMD is definitely behind here and needs to put some work in.

I won't insist on this subject; I posted elsewhere my opinion that there is more than design at play. It's impossible that the same process would require the same voltage to run two different GPUs at frequencies that are 25-30% apart.

My opinion is that Nvidia paid TSMC a premium to have a custom process, exactly what Altera did with this foundry:

Process Techniques on 28HP

Custom low-leakage transistors (1)
Custom low bulk leakage (Ibulk) (1)
(1) Exclusively available and used by Altera only.

28% dynamic power reduction.

2012 Altera Corporation.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
The reason for the difference is this:

"For the PowerColor PCS+ R9 290X we are using Catalyst Omega 14.12. We recorded most of our data using this driver, however did use the latest 15.3 Beta dated 3/20/2015 for both Far Cry 4 and Dying Light."

The recent DL patch along with the beta drivers has boosted performance in that GW title massively for AMD GPUs. Same for FC4. It takes these GW devs many months after launch to make a patch so that AMD GPUs aren't so gimped.

Because their small sample of games makes [H]'s reviews very vulnerable to bias skews, AMD improving in two GW titles makes the R290X seem much more competitive.

But we already knew all that for a long time, R290X is within 10% of a 980 across many titles. That puts it ahead of the 970 by default. OC v OC, the 970 will catch up and potentially exceed it, due to the average 20% OC headroom vs 15% for the R290X.

It still doesn't make the R290X viable anywhere close to the 970 in pricing because OC v OC, the power usage gap is huge. R290X has to be at least $50 cheaper to make it worthwhile. Yes, even with the gimped 3.5gb vram on the 970.

ps. As soon as GTA V and Witcher 3 (GameWorks) lands in [H]'s small suite of games, you can bet the incessant whine regarding poor AMD performance or lack of CF support will return. -_-

That's your opinion. Personally I consider the 970 broken and not worth anywhere near as much as a 290X. Also, judging by recent events with nVidia's older cards, the 290X is going to age much more gracefully, maintaining its performance longer. To each his own though.

You can get much cheaper 290X's as well. Right now the XFX DD is $290AR @ Newegg. [H] never selects the best value 290's when comparing them.
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
The reason for the difference is this:

"For the PowerColor PCS+ R9 290X we are using Catalyst Omega 14.12. We recorded most of our data using this driver, however did use the latest 15.3 Beta dated 3/20/2015 for both Far Cry 4 and Dying Light."

The recent DL patch along with the beta drivers has boosted performance in that GW title massively for AMD GPUs. Same for FC4. It takes these GW devs many months after launch to make a patch so that AMD GPUs aren't so gimped.

Because their small sample of games makes [H]'s reviews very vulnerable to bias skews, AMD improving in two GW titles makes the R290X seem much more competitive.

But we already knew all that for a long time, R290X is within 10% of a 980 across many titles. That puts it ahead of the 970 by default. OC v OC, the 970 will catch up and potentially exceed it, due to the average 20% OC headroom vs 15% for the R290X.

It still doesn't make the R290X viable anywhere close to the 970 in pricing because OC v OC, the power usage gap is huge. R290X has to be at least $50 cheaper to make it worthwhile. Yes, even with the gimped 3.5gb vram on the 970.

ps. As soon as GTA V and Witcher 3 (GameWorks) lands in [H]'s small suite of games, you can bet the incessant whine regarding poor AMD performance or lack of CF support will return. -_-

Rubbish. I would take an R9 290X over a quirky GTX 970, which is already showing its limitations even at 1080p in games like Total War: Attila. The GTX 970 is going to be the most shortsighted investment a gamer can make. People were arguing that 2GB was enough for 1080p as late as 2013, when the consoles launched. Guess what: they were all wrong. The fact that the GTX 970 is already showing its limitations does not augur well for the future.

As for GameWorks, it's a program meant to harm AMD, no doubt about it. The fact that these GameWorks licensees are taking months to provide a good gaming experience on AMD GPUs is disgusting and downright unethical. :thumbsdown:

Another review that doesn't jibe with their past reviews. As always, one half of the posters here will be happy and the other half will write them off. I'll be the first to say it: the 970's OC numbers are off. OC'd 970s are generally 95-100% the speed of a stock GTX 980. In this particular review, the OC'd 970 is all over the place in performance, but averages significantly below 95-100% of a stock GTX 980.

Anyways, next week's review will show a 290x being slower than a GTX 960 and the crowd here will flip sides as to how valid [H] is.

Read the review properly. The 970 OC is 95% of GTX 980 performance in FC4, Crysis 3 and BF4. In Watch Dogs it's 96% and in Dying Light it's 92%. These numbers are what you would expect.

That's your opinion. Personally I consider the 970 broken and not worth anywhere near as much as a 290X. Also, judging by recent events with nVidia's older cards, the 290X is going to age much more gracefully, maintaining its performance longer. To each his own though.

You can get much cheaper 290X's as well. Right now the XFX DD is $290AR @ Newegg. [H] never selects the best value 290's when comparing them.

Well said. The R9 290X is a more consistent card and truly has aged very well. As for R9 290X deals, the Tri-X for $320 AR is a very good one; it has a better cooler than the XFX DD, which is selling for $290 AR.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
As for R9 290X deals, the Tri-X for $320 AR is a very good one; it has a better cooler than the XFX DD, which is selling for $290 AR.

Tri-X @ $320 would be my choice. Great cooling, AMD reference PCB (which is typically better than custom ones, despite what most people think), and quieter than the PCS+. If [H] had compared the Tri-X or even the DD, the 970 wouldn't have looked very good at all, with both of them being less expensive.
 

96Firebird

Diamond Member
Nov 8, 2010
5,736
329
126
If [H] had compared the Tri-X or even the DD, the 970 wouldn't have looked very good at all, with both of them being less expensive.

But then you could also get a cheaper 970 (Zotac going for $310 on Newegg, but I'd probably go with the MSI for $310AR) and still get a pretty good showing with the 970. Comparable to the Tri-X 290X, and slightly cheaper as well.

I'm surprised the 290X has gone up in price, since it fell after Maxwell 2 released.
 

Zanovar

Diamond Member
Jan 21, 2011
3,446
232
106
I've owned both cards; the Tri-X is the better cooler. The PCS+ 290 definitely needs a custom fan curve, as it's way too aggressive out of the box.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
But then you could also get a cheaper 970 (Zotac going for $310 on Newegg, but I'd probably go with the MSI for $310AR) and still get a pretty good showing with the 970. Comparable to the Tri-X 290X, and slightly cheaper as well.

Im surprised the 290X has gone up in price since it fell after Maxwell 2 released.

The MSI is $340AR. That's the one they used in the review.
 

96Firebird

Diamond Member
Nov 8, 2010
5,736
329
126
I was talking about this MSI. Hell, right now even the Gigabyte G1 Gaming 970 is cheaper than the Tri-X 290X. [H] probably used the MSI Gaming 970 because that is what they were given for launch reviews.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I was talking about this MSI.

That 970 is junk. Many users reported poor overclocking on it and even thermal / power throttling.

Gigabyte Windforce 970 for $320. The key bonus today is the TW3 game being thrown in, which effectively makes the 970 cheaper than an R9 290X. With this game, probably the most anticipated game of 2015 for many gamers, the 970 is going to be by far the preferred gaming choice. Also, knowing that the first half of 2015 is loaded with GW titles, it's a safer bet to buy a 970 for 2015. AMD needs to drop the R9 290X to $249-259 to sell against a GTX 970 + TW3 combo at $320.
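The bundle math is easy to sketch; how much the combo is "worth" depends entirely on what TW3 is worth to you (the ~$60 launch valuation below is my assumption, not from the thread):

```python
# Effective GTX 970 price once the bundled Witcher 3 is counted.
# The $320 card price is from the post above; the game's dollar value is
# subjective -- a buyer who would never buy TW3 should count it near $0.
CARD_PRICE = 320

def effective_price(card_price: int, game_value: int) -> int:
    """Card price minus whatever the bundled game is worth to the buyer."""
    return card_price - game_value

for tw3_value in (0, 30, 60):
    print(f"TW3 valued at ${tw3_value}: effective 970 price "
          f"${effective_price(CARD_PRICE, tw3_value)}")
```

At a full $60 valuation the 970 effectively costs $260, right in the $249-259 band the post argues AMD would need to hit with the 290X.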
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Focusing only on 290X vs 970 is myopic when the 290 is only 10% slower than the 290X yet significantly cheaper. (Not criticizing [H], as I understand they have to scope their articles; this is aimed rather at forum posters, who have no such constraint.) The real value choice is the 290, not the 970 or 290X. The calculus changes a little if you were planning on buying TW3 day one.
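Using prices quoted elsewhere in this thread (~$180 for an XFX 290 DD, $290 AR for the XFX DD 290X, $340 AR for the MSI 970) and the "290 is only ~10% slower than the 290X" figure, a rough perf-per-dollar sketch bears this out. The 970's relative-performance number below is my own rough placement (slightly under a 290X, per the review), not a benchmark:

```python
# Rough performance-per-dollar. 290X normalized to 100; prices are the
# deals quoted in this thread, and the 970's relative performance is an
# assumed rough placement, not a measured result.
cards = {
    "R9 290 (XFX DD, ~$180 AR)": (90.0, 180),   # "~10% slower than 290X"
    "R9 290X (XFX DD, $290 AR)": (100.0, 290),
    "GTX 970 (MSI, $340 AR)":    (97.0, 340),   # assumed placement
}

# Sort best value first: performance points per dollar spent.
for name, (perf, price) in sorted(cards.items(),
                                  key=lambda kv: kv[1][0] / kv[1][1],
                                  reverse=True):
    print(f"{name}: {perf / price * 100:.1f} perf points per $100")
```

On these numbers the plain 290 delivers roughly 50 points per $100 against ~35 for the 290X and under 30 for the 970, which is exactly the "real value choice" argument.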
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
That 970 is junk. Many users reported poor overclocking on it and even thermal / power throttling.

Gigabyte Windforce 970 for $320. The key bonus today is the TW3 game being thrown in, which effectively makes the 970 cheaper than an R9 290X. With this game, probably the most anticipated game of 2015 for many gamers, the 970 is going to be by far the preferred gaming choice. Also, knowing that the first half of 2015 is loaded with GW titles, it's a safer bet to buy a 970 for 2015. AMD needs to drop the R9 290X to $249-259 to sell against a GTX 970 + TW3 combo at $320.

I assume you are saying that based on your opinion of the masses, not just yours or mine? I don't believe you would take a 970 over a 290X at the same price, and I know I wouldn't. The 290X is a bit faster, has the superior uarch for the upcoming APIs, and kills the 970 in memory bandwidth, etc. That's before we consider the broken memory system in the 970.

Focusing only on 290X vs 970 is myopic when the 290 is only 10% slower than the 290X yet significantly cheaper. (Not criticizing [H], as I understand they have to scope their articles; this is aimed rather at forum posters, who have no such constraint.) The real value choice is the 290, not the 970 or 290X. The calculus changes a little if you were planning on buying TW3 day one.

We're doing it because the thread is about the review and that's what was in the review.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Yes, but that's weak. Just because [H] chose to exclude the 290 doesn't mean people talking about the best-value card (or "preferred gaming choice") should do so as well. [H] has different constraints in talking about things than forum posters do. Appealing to the [H] article, which has totally different circumstances surrounding the choice of cards, shuts out a legitimate point.

Bottom line: if we're discussing the best-value high-end video cards, excluding the 290 is excluding the best-value high-end card...
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
No denying the value of the 290. If I were to buy, I'd get the 290. Once O/C'd, you aren't going to be able to tell the difference in games between it and the other two, especially the 970, which is a bit slower than the 290X anyway.
 

WhoBeDaPlaya

Diamond Member
Sep 15, 2000
7,414
402
126
Check that 290; it might just unlock.
Both of my ~$180 XFX 290 DDs did, much to my surprise :)
 

guskline

Diamond Member
Apr 17, 2006
5,338
476
126
Very happy with my 2 Sapphire Tri-X 290s under water in CF!

Good review. Shows the 290X is solid. Sure, Nvidia is the leader, but AMD is much closer than some believe, especially at higher resolutions.

I "owned" a Gigabyte G1 GTX 970 for about a month and had it in my 3770K rig. It was slightly faster than the Tri-X, but when the memory allocation issue surfaced, I returned it and "downgraded" to another Sapphire Tri-X 290.

I'm interested to see how the release of the 390 will affect the attention AMD pays to the 290 series.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
It's amazing the difference of opinion between two authors on one website:

Grady McKinney - The PowerColor PCS + R9 290X averaged 59.2 FPS out-of-box in this game. This was right at our 60 FPS framerate target, and executed reliable gameplay that allowed us to adapt to each new scenario.
http://www.hardocp.com/article/2015/03/30/powercolor_pcs_r9_290x_video_card_review/7#

Brent Justice
- The R9 290X, even when overclocked insanely high, is just barely playable at 4X MSAA in this game averaging right at the 60 FPS mark.
http://www.hardocp.com/article/2015/03/03/asus_rog_poseidon_gtx_980_platinum_video_card_review/8#
 