[VC] AMD Radeon R9 390X WCE Speculation Thread


ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
After 2.5 years of 28nm GPUs, I'm waiting for an R9 380 or a cut-down GM200 with higher-than-GTX 980 performance at $300-350 max.
If that doesn't happen, I won't upgrade my GPU and won't recommend any of the new GPUs from either company.

You need a miracle for that. So pocket the money for another year or 2.
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,476
136
They can both have a $999 special-edition GPU, I don't care. But GTX 980 + 5-10% higher perf at $300-350 should be doable after 2.5 years of 28nm.

I think the R9 380X will be at least 10% faster than the GTX 980 for USD 400. This is how I see the chips falling:

R9 390X, 8 GB HBM, WCE - USD 800 (4096 sp) (slightly faster than Titan X)
R9 390, 8 GB HBM - USD 700 (3584-3840 sp)

The R9 390X die will be designed with 1/2-rate double-precision performance and will launch in a flagship FirePro version later this year.

The R9 380X will be a perf/watt- and cost-optimized die targeted specifically against GM204, aka the GTX 980. GlobalFoundries clearly mentions that for die sizes > 400 sq mm they designed a special bump termination called CRTM for a specific customer (likely AMD, for the R9 390X). They also mention that yields are challenging for the combination of a large die and a large interposer.

http://www.youtube.com/watch?v=po29B53bpic

For logic die sizes below 400 sq mm they have a cheaper, better-yielding bump termination. So the R9 380X would most likely be < 400 sq mm, which is quite possible given that an HBM memory controller on the GPU consumes much less die area than a 512-bit GDDR5 memory controller.

R9 380X, 4 GB HBM - USD 400 (3072 sp)
R9 380, 4 GB HBM - USD 330 (2816 sp)
R9 370X, 2 GB GDDR5 - USD 200 (1536 sp)
R9 370, 2 GB GDDR5 - USD 149 (1408 sp)
 

digitaldurandal

Golden Member
Dec 3, 2009
1,828
0
76
I think the R9 380X will be at least 10% faster than the GTX 980 for USD 400. This is how I see the chips falling:

R9 390X, 8 GB HBM, WCE - USD 800 (4096 sp) (slightly faster than Titan X)
R9 390, 8 GB HBM - USD 700 (3584-3840 sp)

The R9 390X die will be designed with 1/2-rate double-precision performance and will launch in a flagship FirePro version later this year.

The R9 380X will be a perf/watt- and cost-optimized die targeted specifically against GM204, aka the GTX 980. GlobalFoundries clearly mentions that for die sizes > 400 sq mm they designed a special bump termination called CRTM for a specific customer (likely AMD, for the R9 390X). They also mention that yields are challenging for the combination of a large die and a large interposer.

http://www.youtube.com/watch?v=po29B53bpic

For logic die sizes below 400 sq mm they have a cheaper, better-yielding bump termination. So the R9 380X would most likely be < 400 sq mm, which is quite possible given that an HBM memory controller on the GPU consumes much less die area than a 512-bit GDDR5 memory controller.

R9 380X, 4 GB HBM - USD 400 (3072 sp)
R9 380, 4 GB HBM - USD 330 (2816 sp)
R9 370X, 2 GB GDDR5 - USD 200 (1536 sp)
R9 370, 2 GB GDDR5 - USD 149 (1408 sp)

I hope you are right, but I definitely think you're on the optimistic side. That said, if the 380X is $400 for +10% over the 980, I will buy two immediately on launch.
 
Feb 19, 2009
10,457
10
76
I'm strongly considering AMD this round and I'm willing to wait in order to compare it with GM200. Yet I will miss TXAA and GPU-PhysX (Batman and Witcher 3). I hate not being able to turn on everything :D
And I hope AMD gets their CPU overhead down for GTA V. If NV is like 30-50% faster in CPU-bound benchmarks (open world...), it will be hard to switch to AMD.

I've been a long-time user of both AMD and NV GPUs, but GameWorks is the final straw. In particular, if Witcher 3 and GTA V (both AAA games I've been looking forward to for a long time) run like a dog on AMD GPUs, I'll have to fork over extra $ for the NV tax just to play the games I like without sacrificing quality.
 

MeldarthX

Golden Member
May 8, 2010
1,026
0
76
I've been a long-time user of both AMD and NV GPUs, but GameWorks is the final straw. In particular, if Witcher 3 and GTA V (both AAA games I've been looking forward to for a long time) run like a dog on AMD GPUs, I'll have to fork over extra $ for the NV tax just to play the games I like without sacrificing quality.

Seriously Silver, drop the GameWorks thing; it screws just as many NV users as it does AMD users; actually more. There hasn't been a single GameWorks release for PC that wasn't a completely broken mess for both sides.

Check the forums; people are holding back from buying GameWorks titles until a couple of patches are out, that's how bad its reputation is....

The only one I have some hope for is Witcher 3; the first two Witchers weren't exactly the best launches either, but they do take care of PC gamers.

GameWorks isn't something anyone looks forward to...
 

stuff_me_good

Senior member
Nov 2, 2013
206
35
91
So, some people are clearly willing to pay hundreds of dollars for only a couple of games? Just wondering, is there something wrong with your brain? It doesn't make any sense, and therefore you don't have the right to [complain] about the price of the product and power usage if the card runs quiet.

Profanity isn't allowed in the technical forums, and neither are personal attacks.
-- stahlhart
 
Last edited by a moderator:

nvgpu

Senior member
Sep 12, 2014
629
202
81
GameWorks games work fine. Don't mix up Ubisoft's poorly coded games with GameWorks.

[Gaming] Evolved games, meanwhile, hurt Nvidia users, period. We've seen that from the Tomb Raider mess and from other [Gaming] Evolved games which run worse on much faster, higher-performance Nvidia cards.
 
Last edited by a moderator:

itsmydamnation

Diamond Member
Feb 6, 2011
3,076
3,907
136
GameWorks games work fine. Don't mix up Ubisoft's poorly coded games with GameWorks.

Sabotage Evolved games, meanwhile, hurt Nvidia users, period. We've seen that from the Tomb Raider mess and from other Sabotage Evolved games which run worse on much faster, higher-performance Nvidia cards.

I dare you to try to add some more hyperbole. Tomb Raider runs fine on NV, and NV could even look at the source for things like TressFX because it's published on AMD's website, unlike GameWorks.

So, you know, keep fighting the good fight...
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
I've been a long-time user of both AMD and NV GPUs, but GameWorks is the final straw. In particular, if Witcher 3 and GTA V (both AAA games I've been looking forward to for a long time) run like a dog on AMD GPUs, I'll have to fork over extra $ for the NV tax just to play the games I like without sacrificing quality.

I don't care about that, I just want the most features and the best performance. "Politics and ethics" don't interest me in the slightest. There are other areas in life where I can do good.
 

itsmydamnation

Diamond Member
Feb 6, 2011
3,076
3,907
136
http://techreport.com/news/24463/nvidia-acknowledges-tomb-raider-performance-issues

Deny all you want, the bitter truth can't be hidden.

Sabotage Evolved games intentionally hurt Nvidia users.

Sigh, really? Funny then:
http://www.guru3d.com/articles-pages/gigabyte-geforce-gtx-980-g1-gaming-review,13.html

Wow, a game works poorly on either AMD or NV hardware at release... STOP THE PRESS!!!

We then come back to the points you completely ignored from my previous post... Keep up the emotive language, it really strengthens your well-considered argument. Can AMD run AA on Batman yet? :rolleyes:
 

Cloudfire777

Golden Member
Mar 24, 2013
1,787
95
91
1. 12 GB VRAM >>> Doesn't matter at all for gamers. For game developers and software engineers, absolutely.
2. CUDA >>> Doesn't matter at all for gamers. For developers and GPGPU users, absolutely.
3. NV ecosystem & "better drivers" >>> If I go with the R9 390X instead of the Titan X, I will definitely miss the regular driver updates from Nvidia. And GeForce Experience. Not gonna lie there. Nvidia has the edge here. But here's hoping AMD will do better in the future.
4. GameWorks >>> Nvidia has "its" games it gets an edge in from pushing GameWorks on developers. But AMD also has games it does better in. I'd say they more or less even each other out.
5. GPU PhysX >>> It's a nice feature, I'll give them that. But frankly, I'm not sure I will miss it. With a fast CPU I can at least dedicate some of its threads to PhysX.

There are a few; surely it will apply to many NV enthusiasts: even if the 390X ends up faster, they won't switch. I saw a few posts on other forums; one guy even claims he will never ditch NV for AMD unless his NV GPU burns his house down. That is what it would take for him to switch. I had to lol.
I sort of understand them. Why switch if they've had zero problems with a brand and money is no issue anyway?
There's more to it with the Titan X and 390X this time, though.

Yeah, my point is the chart is wrong, which makes it questionable.
The HBM slide along with the performance chart looks very legit to me.
Probably just a typo from AMD on the memory clock. It happens.

Really? I'd like to see AMD get back in the ring, but there's no way they will be able to charge $999 for a single-GPU card. Whether or not you or I agree, Nvidia is considered a more premium brand than AMD, so they can charge more for equivalent performance. It may be unfair, but it's the way things are. If AMD charges the same as Nvidia for similar performance, they won't get any sales. Especially if the AMD solution is hotter and more power-hungry. I hope this isn't the case, but if these leaks are legit, two 8-pin power connectors on a single-GPU reference board are not a good sign. If the R9 390X takes the performance crown by a non-negligible margin, AMD might be able to get $700 for it, if they're lucky. Otherwise, $599 at most.

Personally, I find CLC to be an ugly hack and wish it would go away. Air cooling with as few and as high-quality fans as possible is the way to go.
You have two GPUs:
One has massively more bandwidth with a brand-new memory type. The other one has 4 GB more VRAM.
Bandwidth is greatly needed at the resolutions these GPUs are meant for, i.e. 4K and such. Everything above 8 GB is overkill. 4-6 GB VRAM usage is where we are at now.

Then we have traditional air cooling on the Titan X, while the R9 390X gets custom-made water cooling from Cooler Master. The 295X2 had what temps, 40°C-ish? The 780 Ti, 70-80°C?

AMD clearly has the more premium product here, if you ask me.


After 2.5 years of 28nm GPUs, I'm waiting for an R9 380 or a cut-down GM200 with higher-than-GTX 980 performance at $300-350 max.
If that doesn't happen, I won't upgrade my GPU and won't recommend any of the new GPUs from either company.
Try 2016 for 16nm FinFET GPUs. Nvidia and AMD have spent a lot of money on getting better performance out of the same node, whereas earlier they relied on a smaller node. TSMC messed up. AMD/Nvidia had to scale up the chips: more silicon, HBM, water cooling, Titan-class cooling rather than the cheap midrange cooling you find on $300 cards. All of this costs money. AMD/Nvidia have to get that money somehow. That's where we come in.
 
Last edited:
Feb 19, 2009
10,457
10
76
So, some people are clearly willing to pay hundreds of dollars for only a couple of games? Just wondering, is there something wrong with your brain? It doesn't make any sense, and therefore you don't have the right to [complain] about the price of the product and power usage if the card runs quiet.

Are you kidding? Plenty of people fork out major $ for rigs just to play BF4 MP64 smoothly with no min-fps dips below 60 fps.

When it's a great game that you will be playing a lot, the entire purpose of the gaming rig you build is to give you the best experience/immersion in that game.

If I had to pay $799 for a 390X that struggles with GTA V or Witcher 3, that hurts, man.
 
Last edited by a moderator:

therealnickdanger

Senior member
Oct 26, 2005
987
2
0
Let's try to bridge the gap, shall we?

Will DX12 make many of these custom tools irrelevant? Most of these tools seem designed to overcome the draw-call limitations of DX11 and earlier. If game developers have near-metal access to all hardware, then they wouldn't need TressFX, FaceWorks, etc. to get the rendering performance they need, right? At that point, performance would rely 100% upon the raw performance of the GPUs.

/oversimplification
 

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
Let's try to bridge the gap, shall we?

Will DX12 make many of these custom tools irrelevant? Most of these tools seem designed to overcome the draw-call limitations of DX11 and earlier. If game developers have near-metal access to all hardware, then they wouldn't need TressFX, FaceWorks, etc. to get the rendering performance they need, right? At that point, performance would rely 100% upon the raw performance of the GPUs.

/oversimplification


Tools and middleware aren't the same thing. GameWorks is more for game devs to just plug in effects; note that devs are just doing integration, not designing from scratch.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Let's try to bridge the gap, shall we?

Will DX12 make many of these custom tools irrelevant? Most of these tools seem designed to overcome the draw-call limitations of DX11 and earlier. If game developers have near-metal access to all hardware, then they wouldn't need TressFX, FaceWorks, etc. to get the rendering performance they need, right? At that point, performance would rely 100% upon the raw performance of the GPUs.

/oversimplification

No, not even close. It's middleware; the API it runs on only matters insofar as it must align with the API the game uses generally. It's primarily there to save developers the time of coding the effects themselves. DX12 is going to make coding take more time, not less. That's the trade-off with low-level programming. Pre-packaged middleware like GameWorks, TressFX, UE4, SpeedTree, Wwise, etc. is going to become more commonplace, not less.
 
Mar 10, 2006
11,715
2,012
126
After 2.5 years of 28nm GPUs, I'm waiting for an R9 380 or a cut-down GM200 with higher-than-GTX 980 performance at $300-350 max.
If that doesn't happen, I won't upgrade my GPU and won't recommend any of the new GPUs from either company.

I hear ya, but at a die size of 601mm^2 for the GTX Titan X (per the leaked GPU-Z screenshots), no way you're going to be getting one of those for $300-$350 max.

At any rate, can't wait for some FinFET GPUs.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
I hear ya, but at a die size of 601mm^2 for the GTX Titan X (per the leaked GPU-Z screenshots), no way you're going to be getting one of those for $300-$350 max.

At any rate, can't wait for some FinFET GPUs.

We had a 520mm2 die at $349 with the GTX 570, 4+ years ago at 40nm.

28nm has already been in full production for 3 years; wafer prices should be very cheap by now. 500-600mm2 dies should not cost more than what 500-600mm2 dies cost at 40nm in late 2010.

Also, AMD already sells the 430mm2 Hawaii (R9 290 4GB) for $250. I would not be willing to spend more than $300-350 for a 400-450mm2 GPU at 28nm, even with 4GB of HBM, in 2015.
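
To put rough numbers on that cost argument, here is a minimal sketch using the standard dies-per-wafer approximation; the wafer price and yield figures are placeholder assumptions for illustration, not published numbers:

Code:
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Standard approximation for whole candidate dies on a round wafer."""
    r = wafer_diameter_mm / 2
    return (math.pi * r ** 2 / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(die_area_mm2, wafer_cost_usd, yield_fraction):
    """Wafer cost spread over the dies that actually work."""
    return wafer_cost_usd / (dies_per_wafer(die_area_mm2) * yield_fraction)

WAFER_COST = 5000   # USD per 300mm 28nm wafer -- assumed, illustrative only
YIELD = 0.6         # fraction of sellable dies -- assumed, illustrative only

for name, area in [("520mm2 (GTX 570-class)", 520), ("600mm2 (Titan X-class)", 600)]:
    print(f"{name}: ~{dies_per_wafer(area):.0f} candidate dies/wafer, "
          f"~${cost_per_good_die(area, WAFER_COST, YIELD):.0f} per good die")

Under those assumed figures the per-die cost gap between a 520mm2 and a 600mm2 chip is modest; the argument really hinges on how far 28nm wafer prices have fallen after three years in production.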
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
Wow! Where do you get this from? AMD's using heavily throttled cards for the comparison? Default and uber? Reference and uber? It was quiet and uber. AMD supplied the leaks? I suppose this could all make sense to you, but realize there is nothing leading you to believe this except your imagination.

We've only seen rumored performance from either of these cards, so either/both/neither could end up a disappointment. We should be excited about both of them and hope they both compete hard enough that neither manufacturer can take advantage.

The performance improvement we are seeing from both companies, considering there's no node shrink, is quite impressive. If these architectures had been released on time along with a shrink, we would likely have seen a greater-than-2x performance increase. It's been a long while since we've seen that. I'm impressed with what we are seeing from both of these companies.

And before you think I'm getting defensive, I'm not. I'm just a bit shocked at how you view these two cards. :)

Lol!!!
Well, I think you actually got one thing right here. It is 100% all speculation that I imagine in my head. I think that was already really clear but you can pat yourself on the back there for catching something so blatantly obvious.

By the way, it's not a lot different from how you are imagining that everything AMD is doing with this launch is right. Or that they are somehow delaying things so they can make sure they get it right this time, or whatever it is you think.

So when we speculate, it is always imagined.

There is a lot of my post you didn't seem to get, though. So to really keep it simple, the point of it is that maybe we shouldn't get overly optimistic about those charts.

It wasn't to trash AMD at all, nor was it to put down the 290X. As for the 290X, uber or not, the point is that the reference 290X's out-of-the-box performance is not equal to the vendor options that can be bought today. You might want to forget the original 290X and its very special dual BIOS, but it really happened. The reference 290X didn't ship with a random BIOS setting; they were shipped in "quiet" mode. If you wanted to run uber mode, the switch needed to be slid over. When I say default, I am just referring to this. The out-of-the-box reference 290X defaulted to quiet mode.

Since AMD launched the reference 290X and decided to water-cool the 390X, wouldn't they be very proud of the outcome of such a decision? And since these leaks say nothing about how quiet the 390X is, it appears they want to showcase its performance. And since they are comparing one generation to the previous one, why would you expect them to have used anything other than a reference 290X as a basis?

I cannot possibly say which version of the 290X AMD based the performance on in these slides, but I can say there is a real gap between the throttled-down reference 290X in its default "quiet" mode and the aftermarket editions that run at full speed at all times. This is undeniable. If we are speculating, it matters where that bar starts. So excuse me, I was just trying to gauge the charts and started thinking about it.

And your last mix-up I will address here. You are shocked at how I view these two cards. It's obvious from your speedy response to defend AMD that you totally misunderstood what I said. I never meant that somehow the Titan X will be great and awesome and the 390X just won't. Whatever you think my view is on these two cards, it is all in your head. If this leak is real, then AMD set out to rally their partners in the wake of another Nvidia card launch. The Titan X will be yet another card that outperforms the 290X, and AMD must be trying to soften the blow by having a 390X pep rally. I have absolutely no opinion on how the 390X will stack up against the Titan X. In my opinion, the only good thing about the Nvidia card is that it will launch very soon and will be even more powerful than what is currently out. The only bad thing about the 390X is that it won't be around for a while. That is it. That is how I feel. That is also why AMD might have an event with their partners: to soften the blow of yet another Nvidia launch when they have had nothing since the 980 launched 6 months ago.
If you read any more into it than that, it's your problem.

And if you read anything more into my talking about the performance, then that's on you. Because I never said the 390X won't be amazing. I never said it won't beat a Titan X. But I did warn against setting our expectations too high for a card yet to see the light of day. And just for the record, I never said the Titan X will be this awesome untouchable card. But it is a given that it will be more powerful than both the 290X and the 980, which is the only reason AMD might have wanted to throw a 390X party the day before the Titan X launches anyway.

It's not because the 390X will stink in comparison; it is because AMD has nothing right now, nothing 6 months after the 980, nothing in the wake of another major Nvidia launch.

I hope that AMD can launch the 390X very soon. I hope the 390X stomps the Titan X. I don't want to set the bar that high, but I really do hope they can at least catch the Titan X, and all the better if they stomp it. That would be a terrific outcome. It is the excitement the GPU market needs.
 

DownTheSky

Senior member
Apr 7, 2013
800
167
116
I'm afraid this launch will get over-hyped and leave people disappointed.

I don't know what to say. Based on the GPU specs and arch improvements alone, perf should be at least +45%. Then comes HBM. The only way they can screw this up is by delaying the launch too much.

If AMD were smart, they'd use this launch to gain market share and do their best to impress people in order to keep it. That means good drivers at launch and a heavy focus on drivers, with day-1 updates for every major game release, like Nvidia does.

If you go on gamer forums you'll see the mindset is very different from here. Gaming = Nvidia, and AMD gets trashed at every opportunity.
 
Last edited:

.vodka

Golden Member
Dec 5, 2014
1,203
1,538
136
So, in the Titan X thread there's a screenshot of AT's review stating a 33% average over the 980. No DP, only Titan in name. Disappointing. Well, they couldn't do much more on 28nm, but it is what it is. $1000... gimme a break.

The 390X performance slide has scores ranging from 50% to 65% over the 290X: 50, 55, 60, 65, which leaves us with a 57.5% average.

980 = 100 (our baseline)
290X = 90 (10% slower than the 980)

GM200 = 100*1.33 = 133
390X = 90*1.575 = 141.75

Overclocking aside (GM200 will probably overclock nicely and has some untapped headroom), stock to stock (considering the leaked clocks put the 390X at 1050 MHz), it looks like this one is AMD's on performance alone, if the seemingly genuine leaked slides are to be believed.

The thing now is, how much power will GM200 consume at 980 clocks...?
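
For what it's worth, the same back-of-the-envelope comparison in a few lines of Python; the 10% gap between the 980 and 290X and the slide percentages are the assumptions above, not measured data:

Code:
# Everything normalized to GTX 980 = 100, per the figures quoted above.
gtx_980 = 100.0
r9_290x = gtx_980 * 0.90                  # assumed ~10% slower than the 980

titan_x_uplift = 0.33                     # AT review average over the 980
r9_390x_uplift = sum([0.50, 0.55, 0.60, 0.65]) / 4   # leaked slide scores -> 0.575

titan_x = gtx_980 * (1 + titan_x_uplift)    # 133.0
r9_390x = r9_290x * (1 + r9_390x_uplift)    # 141.75

print(f"Titan X ~{titan_x:.1f}, R9 390X ~{r9_390x:.2f} (GTX 980 = 100)")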
 
Last edited: