Am I alone in thinking AMD's Hawaii GPU has aged really well?


Paul98

Diamond Member
Jan 31, 2010
Yeah, it's a great GPU; AMD really dropped the ball at release with that terrible stock cooler.
 
Feb 19, 2009
Their biggest flaw was the reference cooler, and it's a flaw that still haunts them.

On the face of it, a well-cooled R9 290X draws ~230 to 250W under gaming load. Very close to a 780 Ti.

Except the reference card ran hot, causing more leakage (higher peak load) and throttled clocks... so it was hot, loud, power-hungry, and slower.

Whoever decided to go with that reference cooler dealt a major blow to AMD's market share.
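To put rough numbers on that leakage point, here's a toy model in Python. Every constant below is an illustrative assumption, not measured Hawaii data; the point is just the shape of the effect, since leakage grows roughly exponentially with die temperature:

```python
# Toy model: same chip, same clocks, different cooler temperatures.
# All constants are illustrative assumptions, not measured Hawaii data.

def board_power(dynamic_w: float, leak_ref_w: float, temp_c: float,
                ref_temp_c: float = 65.0, doubling_c: float = 25.0) -> float:
    """Total power, assuming leakage roughly doubles every `doubling_c` degrees C."""
    leakage_w = leak_ref_w * 2 ** ((temp_c - ref_temp_c) / doubling_c)
    return dynamic_w + leakage_w

print(board_power(200, 40, temp_c=65))  # well-cooled card: ~240 W
print(board_power(200, 40, temp_c=95))  # reference cooler at its 95C target: ~292 W
```

Same clocks, but the hotter card draws noticeably more power, and once it hits its power/thermal limit it also throttles, which is how you end up hot, loud, power-hungry, and slower all at once.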
 

Teizo

Golden Member
Oct 28, 2010
They have leveraged the GCN architecture very well. Since they have the GPU in both consoles, and most games are console ports, this has worked to their advantage.
 

Glo.

Diamond Member
Apr 25, 2015
I think all of the AMD GCN cards aged pretty well.
 

guskline

Diamond Member
Apr 17, 2006
Running two EK water-cooled Sapphire Tri-X R9 290s in CF in my 4790K rig. Watercooling handles the biggest problem with the Hawaii GPU: heat!
 

IEC

Elite Member
Super Moderator
Jun 10, 2004
Still using 290s here. Waiting for 14nm GPUs to upgrade.
 

moonbogg

Lifer
Jan 8, 2011
guskline said:
Running two EK water-cooled Sapphire Tri-X R9 290s in CF in my 4790K rig. Watercooling handles the biggest problem with the Hawaii GPU: heat!

Those cards have some serious longevity, for sure. Pretty crazy, really. The extra VRAM over the GTX 700 series really paid off big-time for the AMD cards.
 

KaRLiToS

Golden Member
Jul 30, 2010
I bought 4x reference Sapphire R9 290X Battlefield 4 Edition cards in release week. I don't plan to buy other GPUs until big-die 14nm.

^^This is me justifying my purchase
 

2is

Diamond Member
Apr 8, 2012
Heck, I'm still sporting a 7970 in my 2nd gaming machine. It's aged a lot better than the 680s I was running in my main machine, thanks to that extra 1GB of VRAM.
 

Rvenger

Elite Member
Super Moderator
Video Cards
Apr 6, 2004
7970 in backup rig.

295x2 in main rig.

Love the cards, but the drivers for CrossFire are getting annoying. Great performance when everything is working, though. The 7970 is near-perfect driver-wise, being a single card. Dual-card setups need a lot of work.
 

StinkyPinky

Diamond Member
Jul 6, 2002
My 290X is great; I have no desire to upgrade it even though I've had it for quite some time.
 

PPB

Golden Member
Jul 5, 2013
Revisiting this thread and bringing in the discussion of GM206 vs. Pitcairn, we can also argue that Pitcairn XT aged really well.

The conclusion has to be that GCN as a whole aged really well performance-wise, and Pitcairn even perf/watt-wise. The concern is how Fiji matures over time; I think performance will improve somewhat, but the card seems severely bottlenecked in some tests. Probably the eventual move to 1440p will show Fiji in a better light.
 

USER8000

Golden Member
Jun 23, 2012
PPB said:
Revisiting this thread and bringing in the discussion of GM206 vs. Pitcairn, we can also argue that Pitcairn XT aged really well.

The conclusion has to be that GCN as a whole aged really well performance-wise, and Pitcairn even perf/watt-wise. The concern is how Fiji matures over time; I think performance will improve somewhat, but the card seems severely bottlenecked in some tests. Probably the eventual move to 1440p will show Fiji in a better light.

It does make me wonder whether the larger Polaris GPU being released will be a modified version of Fiji with some of the bottlenecks removed?
 

Techhog

Platinum Member
Sep 11, 2013
USER8000 said:
It does make me wonder whether the larger Polaris GPU being released will be a modified version of Fiji with some of the bottlenecks removed?

What does this mean? As in, just the same number of SPs, but with the revised architecture and on 14nm?
 

Paul98

Diamond Member
Jan 31, 2010
If anything, it should mean that they learned what to do and what not to do between Hawaii and Fiji.
 

Gikaseixas

Platinum Member
Jul 1, 2004
Hoping that Polaris continues this trend but, at the same time, has a less problematic release :)
 

MeldarthX

Golden Member
May 8, 2010
Fiji was a compromise: 20nm wasn't delivered, so they had to put something together. They know where the bottlenecks are in the design, and it allowed them to work the kinks out with HBM memory.

GCN has been a great design and extremely forward-thinking; only now are we really starting to see just how good it is.
 

Z15CAM

Platinum Member
Nov 20, 2010
I have two reference 290Xs with Elpida RAM in CF, with XSPC water blocks. They run under full load at 1130/1500 to 1180/1500 with, say, +0.143V, and temps never go above 45C with no throttling.

------------------------
ASUS P8Z68-V Pro Gen3 / i7 2700K @ 4.8GHz / 16GB Samsung MV-3V4G3D-US DDR3 @ 1866MHz 9-9-9-24 1T at 1.34V, with a QNIX 2560x1440 display running between 96 and 120Hz.
 

ultima_trev

Member
Nov 4, 2015
Hawaii sure does seem to be stretching its legs in newer titles. I'm always seeing threads on OCN about people being able to reach 1100/1600 without increasing the voltage, although mine only manages about 1030/1500... and upping the voltage in my case won't do.

In hindsight, I wish I had gotten either a GTX 970 or perhaps the R9 380X. The PowerColor PCS+ version I have has an insufferably loud cooler (the fans hit 90 percent or more by default) despite being a triple-fan solution, and it gets to over 90C if left unchecked. In order to keep it manageable:

- I have to lower the core voltage by 50mV, the aux by 25mV, and the power limit by 20%
- Decrease the core/memory to 950/1250 (R9 290 speeds)
- Set a framerate cap of 60 in Radeon Settings (40 in Witcher 3 or Crysis 3)
- Set a custom fan curve in MSI Afterburner (see the sketch below)

ASIC score in GPU-Z is 71%; maybe the silicon lottery has something to do with it?
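For anyone wondering what that last bullet actually does: a fan curve is just a piecewise-linear map from GPU temperature to fan duty. Here's a minimal sketch in Python; the curve points are hypothetical examples, not an actual Afterburner profile:

```python
# Minimal piecewise-linear fan curve, the kind you draw in MSI Afterburner.
# The (temp_C, fan_%) points are hypothetical examples.
CURVE = [(40, 25), (60, 40), (75, 60), (85, 80), (90, 100)]

def fan_speed(temp_c: float) -> float:
    """Interpolate fan duty (%) for a given GPU temperature (C)."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, s0), (t1, s1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # past the last point, pin at max

print(fan_speed(70))  # ~53% duty instead of the default 90%+
```

The whole point versus the default behaviour is the gradual ramp: the fans only approach full speed near the temperatures where the card would otherwise throttle.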
 

MongGrel

Lifer
Dec 3, 2013
Still just using a Tahiti myself, but the ASUS R9 280X DCUII TOP 3GB OC works pretty well; it's not pushing multiple screens in games.

But I guess it is getting old for any serious gaming these days.
 

3DVagabond

Lifer
Aug 10, 2009
ultima_trev said:
Hawaii sure does seem to be stretching its legs in newer titles. I'm always seeing threads on OCN about people being able to reach 1100/1600 without increasing the voltage, although mine only manages about 1030/1500... and upping the voltage in my case won't do.

In hindsight, I wish I had gotten either a GTX 970 or perhaps the R9 380X. The PowerColor PCS+ version I have has an insufferably loud cooler (the fans hit 90 percent or more by default) despite being a triple-fan solution, and it gets to over 90C if left unchecked. In order to keep it manageable:

- I have to lower the core voltage by 50mV, the aux by 25mV, and the power limit by 20%
- Decrease the core/memory to 950/1250 (R9 290 speeds)
- Set a framerate cap of 60 in Radeon Settings (40 in Witcher 3 or Crysis 3)
- Set a custom fan curve in MSI Afterburner

ASIC score in GPU-Z is 71%; maybe the silicon lottery has something to do with it?

If your card is running that loud and hot, there is an issue. Either the TIM needs to be replaced, the cooler is dirty, or you have really poor airflow in your case. Something's wrong, though.
 

h4rm0ny

Member
Apr 23, 2015
Actually, I think it aged normally. The only cards that didn't age normally were Kepler, because Nvidia's driver team packed up and moved into the Maxwell office, and they are getting ready to pack up again. Out with the old and in with the new, people. Nvidia should just charge its customers a subscription fee and mail them new GPUs each year. That's what it's like now. Nvidia GPUs are like magazines: read once, throw away.

Pretty much this. If you think about it, there has to be something else at play than the AMD card improving with age to keep up. :) NVIDIA has long gained an advantage by throwing huge amounts of development resources at making games run well on its hardware and persuading studios to incorporate NVIDIA technologies... all advantages for the NVIDIA customer. But when customers swap to the latest card, all that development effort is focused purely on the current generation and the old one is abandoned. So the older-generation cards suddenly drop in relative performance in current games.

It's not about AMD hardware so much as NVIDIA having a cycle of degradation for older cards.
 

caswow

Senior member
Sep 18, 2013
I am an Nvidia customer right now, and no, most people can't use Nvidia tech because it slows down your games. There are zero performance-enhancing technologies implemented in any of the GW games. My Kepler card is crippled beyond belief. When I bought my card it was neck and neck with the 7870 while priced similarly; now my 660 sits at the bottom, right with the 750 Ti, and the "new" 7870 still stands where it belongs.

The so-called Nvidia technologies benefit only people who bought $700 cards. I can't use PhysX or any other Nvidia "technology".
 
Feb 19, 2009
caswow said:
I am an Nvidia customer right now, and no, most people can't use Nvidia tech because it slows down your games. There are zero performance-enhancing technologies implemented in any of the GW games. My Kepler card is crippled beyond belief. When I bought my card it was neck and neck with the 7870 while priced similarly; now my 660 sits at the bottom, right with the 750 Ti, and the "new" 7870 still stands where it belongs.

The so-called Nvidia technologies benefit only people who bought $700 cards. I can't use PhysX or any other Nvidia "technology".

Technically, the GameWorks tech really only benefits people with 2x $700 GPUs who game at 1080p.

Because even at 1440p, when you crank up GW, performance tanks to unacceptable levels.

HairWorks in The Witcher 3 for the first few months was a prime example; minimum FPS were horrid. God rays in Far Cry 4 were another example.

I mean, just compare. GW on:

[benchmark graph: GW on]

GW off:

[benchmark graph: GW off]

Again and again...

The cinematic experience:

[benchmark graph: GW on]

GW off:

[benchmark graph: GW off]

Normal gamers with single GPUs don't benefit from it.

A more recent example: in The Division, NV PCSS makes shadows softer for a 20% performance loss. Thanks. Just for slightly blurrier, softer shadows.

Then, guess what, HBAO+ makes shadows darker and more defined... what?!