[WCCF] AMD Radeon R9 390X Pictured

Status
Not open for further replies.

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Very impressive! I don't know how anyone can look at this and not be impressed.
[Image: 56gPDvd.jpg]

Too bad they've got a crap R&D budget. :p /sarc.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Very impressive! I don't know how anyone can look at this and not be impressed.
Too bad they've got a crap R&D budget. :p /sarc.

The answer is simple: AMD did it first -- AIO CLC, a 17-20 cm PCB, and a >500 mm² die with HBM1, despite this being a flagship card, not a puny 350 mm² HBM2 card. These innovations are going to leave a mark on the GPU industry for the next decade or so. HBM2 will continue to scale, and maybe there will be an HBM3. I fully expect a lot more flagship cards to feature AIO CLC and more high-end gamers accepting this as the preferable solution. EVGA has a 980 Ti Hybrid, which means the 980 Hybrid must have sold well.

Major AIBs are jumping on the AIO CLC bandwagon as well. MSI and Corsair aren't standing still on this front.

Source
 

stuff_me_good

Senior member
Nov 2, 2013
206
35
91
Yes, seeing the recent announcements, the AIO CLC must be a really big fail, as so many here have been telling us. :rolleyes:
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,476
136
The answer is simple: AMD did it first -- AIO CLC, a 17-20 cm PCB, and a >500 mm² die with HBM1, despite this being a flagship card, not a puny 350 mm² HBM2 card. These innovations are going to leave a mark on the GPU industry for the next decade or so. HBM2 will continue to scale, and maybe there will be an HBM3. I fully expect a lot more flagship cards to feature AIO CLC and more high-end gamers accepting this as the preferable solution. EVGA has a 980 Ti Hybrid, which means the 980 Hybrid must have sold well.

Major AIBs are jumping on the AIO CLC bandwagon as well. MSI and Corsair aren't standing still on this front.

Source

That Fiji die is definitely >550 sq mm. In fact, I would not be surprised if it's as big as GM200 or even slightly bigger. Remember, this GPU will double up as a FirePro flagship in a few months. That's why I think the dual-link interposer makes a lot of sense. With 4-Hi stacks (1 GB capacity) and a dual-link interposer you get 4 x (2 x 1) = 8 GB. With 8-Hi stacks (2 GB) you can get 4 x (2 x 2) = 16 GB. AMD would surely not want to regress on VRAM capacity, especially for pro graphics, where they have started to make an impact and have been taking market share, going up to 25% from their historical mid-teens. With Fiji being uncontested for FP64 till well into 2017, AMD has a serious opportunity to take that market share up to even 40%. Remember, unless GP104 comes with FP64, the dual-GK210-based Tesla K80 will be the FP64 flagship till mid-2017.

http://www.anandtech.com/show/8729/nvidia-launches-tesla-k80-gk210-gpu
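The VRAM arithmetic above (stacks × links × per-stack capacity) can be sketched in a few lines of Python. Note the stack counts and per-stack capacities are the post's figures for a hypothetical dual-link interposer, not confirmed specs:

```python
def hbm_capacity_gb(stacks: int, gb_per_stack: int, links: int = 1) -> int:
    """Total VRAM in GB: stack sites x links per site x capacity per stack."""
    return stacks * links * gb_per_stack

# 4-Hi stacks (1 GB each) on a dual-link interposer: 4 x (2 x 1) = 8 GB
assert hbm_capacity_gb(4, 1, links=2) == 8
# 8-Hi stacks (2 GB each) on a dual-link interposer: 4 x (2 x 2) = 16 GB
assert hbm_capacity_gb(4, 2, links=2) == 16
# Single-link, 4-Hi (the plain HBM1 case): 4 x 1 = 4 GB
assert hbm_capacity_gb(4, 1) == 4
```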

I see serious trouble brewing for Nvidia if Fiji XT, aka Fury, takes the GPU crown decisively (20% faster). AMD will launch an onslaught even better than the HD 4800 - HD 5800 timeline. AMD will get to 14nm and HBM2 first easily, due to their invaluable experience with 28nm and HBM. Nvidia's 16nm products with HBM2 could arrive months later. We will not know the time gap, but I would not be surprised if it's 3-6 months. Then finally, when Zen-based APUs with HBM2 launch in mid-2017, the real brunt of the damage will start to be felt. :D

btw RS, could you please clear your messages, as I guess nobody is able to send you a PM. The message we get is "RussianSensation has exceeded their stored private messages quota and cannot accept further messages until they clear some space."
 
Last edited:

flopper

Senior member
Dec 16, 2005
739
19
76
I see serious trouble brewing for Nvidia if Fiji XT, aka Fury, takes the GPU crown decisively (20% faster). AMD will launch an onslaught even better than the HD 4800 - HD 5800 timeline. AMD will get to 14nm and HBM2 first easily, due to their invaluable experience with 28nm and HBM. Nvidia's 16nm products with HBM2 could arrive months later. We will not know the time gap, but I would not be surprised if it's 3-6 months. Then finally, when Zen-based APUs with HBM2 launch in mid-2017, the real brunt of the damage will start to be felt. :D

The AMD plan: Mantle is now a standard for the industry, via DX12, Vulkan, and Mantle itself.
The HBM route for GPU and APU.
Fury created for the DX12 train.

AIO water-cooling solutions will soon be the only viable option until the power draw of bigger cores comes way down, years from now. The 980 Ti isn't power-saving at all.
 
Feb 19, 2009
10,457
10
76
If they can save power, they haven't given us enough grunt on the high end. It's a shame that GPUs as strong as the Titan X struggle at 1080p in modern games. We need much, much more processing power.
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,476
136
If they can save power, they haven't given us enough grunt on the high end. It's a shame that GPUs as strong as the Titan X struggle at 1080p in modern games. We need much, much more processing power.

If a Titan X is struggling to max a game at 1080p, then the developer is definitely not doing a good job. Plain and simple. :D
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
If a Titan X is struggling to max a game at 1080p, then the developer is definitely not doing a good job. Plain and simple. :D

You do know that we are being played to keep us buying more? No matter how powerful GPUs get, they will find some feature/effect/setting to tank performance, forcing us to want moar!
 

Pinstripe

Member
Jun 17, 2014
197
12
81
You do know that we are being played to keep us buying more? No matter how powerful GPUs get, they will find some feature/effect/setting to tank performance, forcing us to want moar!

...mostly placebo effects, on top of that. But the sheep will always throw big money around to satisfy their inner egos by notching up that one extra Ultra setting.
 

Abwx

Lifer
Apr 2, 2011
12,028
4,990
136
One way to think of it is that the GloFo 28nm is like a quarter or a sixth node improvement over TSMC 28nm, but it's only coming online now.

I would put it well above a quarter, more like a half node in performance.

A lot of people here seem to be predicting that most of the AMD GPU line will switch over to GloFo for 300-series. I frankly don't see this happening, as making all those masks can't possibly be worth it. We might see a chip or two do the switch.

Don't know; what is sure is that the Kabini GPU part's perf/watt improved by 60-65% when transitioning from TSMC to GF, and I'm not sure that AMD did anything else than a simple porting.

Indeed, this chip was completely screwed by TSMC's half-node evolution, and retrospectively I'm also suspicious about Hawaii.
 

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
A lot of people here seem to be predicting that most of the AMD GPU line will switch over to GloFo for 300-series. I frankly don't see this happening, as making all those masks can't possibly be worth it. We might see a chip or two do the switch.

I expect to see Hawaii and Tonga moved to GloFo (becoming Grenada and Antigua, respectively). And I think there's a good chance there will be a new low-to-mid-range chip to replace the badly aged GCN 1.0 Pitcairn/Curacao.

Cape Verde won't be moved over; if they were going to do that, then Apple would already have gotten the updated version for the new MacBook Pro. And it's quite possible that Bonaire won't be moved over either; it's really close enough to Pitcairn that if they made a new GloFo chip with 1024-1280 shaders, then Bonaire at 896 would be superfluous.
 

96Firebird

Diamond Member
Nov 8, 2010
5,749
345
126
I hope you are doing it on purpose. :cool:

You've added so little to the topic at hand, I'm wondering why you even bothered to respond to me...

But I guess this is the type of response I should expect from someone who isn't smart enough to come up with a counterpoint. :thumbsup:
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,250
136
Very impressive! I don't know how anyone can look at this and not be impressed.
[Image: 56gPDvd.jpg]

Too bad they've got a crap R&D budget. :p /sarc.

It is impressive looking, for sure. If it has the performance to match, the reviews will be very interesting.

On another note... if you threw an Nvidia logo on there, we'd be hearing things like: final death blow to AMD, last nail in the coffin, etc.
 

Face2Face

Diamond Member
Jun 6, 2001
4,100
215
106
I fully expect a lot more flagship cards to feature AIO CLC and more high-end gamers accepting this as the preferable solution. EVGA has a 980 Ti Hybrid, which means the 980 Hybrid must have sold well.

I will say it's hard to go back to a stock open-air or blower-based cooler once you go the AIO CLC GPU route. I removed my NZXT G10 bracket for a couple of months because the Thermaltake Water 2.0 Extreme connected to it was too large for my case, and I had to rig it in order for it to work.

I recently bought a Corsair H55 AIO because it would fit better in my case, and I missed the much lower GPU temperatures and the lack of fan noise. I just put it back together last night: 1300 MHz+ below 60°C. It will be really hard for me to go back... Also note, the G10 works much better on custom cards with more phases, as VRM temps are not an issue.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
Yes, seeing the recent announcements, the AIO CLC must be a really big fail, as so many here have been telling us. :rolleyes:

Who exactly has said this?

It seems more that the debate is around whether the CLC was used because it was necessary to keep thermals/temps acceptable, or whether it will really let the card stretch its legs. Neither suggests it is a fail.

There is another argument about whether a CLC is ideal for someone's current case/build, but assuming there is a partial grain of truth in some of the leaked options, a non-CLC will be available. This is good for those who may not want a CLC (for a variety of reasons) or those who can't use one.

Best of both worlds.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
I will say it's hard to go back to a stock open-air or blower-based cooler once you go the AIO CLC GPU route. I removed my NZXT G10 bracket for a couple of months because the Thermaltake Water 2.0 Extreme connected to it was too large for my case, and I had to rig it in order for it to work.

I recently bought a Corsair H55 AIO because it would fit better in my case, and I missed the much lower GPU temperatures and the lack of fan noise. I just put it back together last night: 1300 MHz+ below 60°C. It will be really hard for me to go back... Also note, the G10 works much better on custom cards with more phases, as VRM temps are not an issue.

I do agree to a point. As long as we continue to have custom, well-built air versions as an alternative, that's good for everyone. I don't want to pay a premium for a CLC; if I want to use the card in a very small build or with an existing loop, it is a waste.

The 'H2O editions' are nice too, but often they are very difficult to actually find in stock (EVGA, case in point) and sometimes have a ridiculous premium attached. Getting a water-ready card that you just need to connect to a loop is really convenient, though... :p

All that said, nice, more compact CLC cards are a great thing for the GPU industry. If I am building for a friend, it's easier to plan for a CLC than for a huge-length card that necessitates a much larger case, in some situations. It's convenient, reliable, and really helps with thermals and noise.

Edit: Honestly, I wonder if we will reach a point where you can just order the GPU PCB and forget all the cooling? Give me stock speeds with great power phases and chokes. I will handle the performance tweaking and find a block/CLC/fan that I want to use...
 
Last edited:

njdevilsfan87

Platinum Member
Apr 19, 2007
2,349
270
126
I think 5 years from now, CLCs will be more popular than air coolers on GPUs. Air cooling has come a long way, but air-cooled GPUs are now as large and heavy as bricks. Air cooling has been maxed out in terms of power density (not total power) for a while now, too. CLCs raise the ceiling on both by a pretty decent amount.
 
Last edited:

stuff_me_good

Senior member
Nov 2, 2013
206
35
91
Who exactly has said this?

It seems more that the debate is around whether the CLC was used because it was necessary to keep thermals/temps acceptable, or whether it will really let the card stretch its legs. Neither suggests it is a fail.

There is another argument about whether a CLC is ideal for someone's current case/build, but assuming there is a partial grain of truth in some of the leaked options, a non-CLC will be available. This is good for those who may not want a CLC (for a variety of reasons) or those who can't use one.

Best of both worlds.
It's not like I keep some kind of record of who says what and where, but I remember, when the info about the water cooling was leaked, a few "enthusiasts" commenting something like "lol, fiji need water cooler to work lol", even though the leaked 300 W power consumption is no more than a 290X's, and we all know that with AIB air coolers it's perfectly fine.

And the rest, like you said, are wondering whether they can fit the cooler inside the case, which is quite obvious to me at least. If you can't be bothered to learn how to install the WC rad on the 120 mm exhaust hole, you don't even deserve to use one. Besides, it doesn't even have to be mounted on the exhaust hole; it can just push the air inside the case and let the case fans move the air out. This is how AIB air coolers work, so why should we be forced to use the WC differently?
 

Jacky60

Golden Member
Jan 3, 2010
1,123
0
0
I've bought on the basis of best performance for the price, and that means I've nearly always bought AMD. Also, Nvidia's cheating with AF/AA on the Ti 4600, where whole areas had no textures which then miraculously appeared fully textured with my 9700 Pro, plus the hyper-tessellated concrete blocks in Crysis 2, have made me lean toward the red team. If they say Fiji is the world's fastest GPU, I have a tendency to believe them, and I have money waiting for the next upgrade, which is looking like a Fiji X2 or Crossfire. I'm loyal to any manufacturer that doesn't try to screw me, and I fight for the little guy. I do hope AMD deliver and can't wait to see benchmarks, but it all seems so up in the air at the moment. I wish to God Samsung would buy AMD to give them the money to compete properly with Nvidia and Intel.
I reckon Fiji will be very close to the 295X2 in performance, with 8 GB memory, and if so will buy.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
so when the heck is this thing going to be reviewed?
Well, considering it hasn't been officially announced yet, that's the first question you want to ask...

Since you already know the announce day is June 16th, something we've known for god only knows how long at this point, you can safely assume it'll be after June 16th.
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,407
8,595
126
Well, considering it hasn't been officially announced yet, that's the first question you want to ask...

Since you already know the announce day is June 16th, something we've known for god only knows how long at this point, you can safely assume it'll be after June 16th.

well, no, i didn't know that. thanks!
 

thilanliyan

Lifer
Jun 21, 2005
12,085
2,281
126
I do agree to a point. As long as we continue to have custom, well-built air versions as an alternative, that's good for everyone. I don't want to pay a premium for a CLC; if I want to use the card in a very small build or with an existing loop, it is a waste.

Agreed... I always custom water-cool my cards, so I don't want to pay the added expense of a CLC. Also, considering I use a universal block, the non-reference air-cooled versions are perfect, as they usually have a separate VRM heatsink on them, so I can just reuse that.
 