[Rumor]R9 300 series will be manufactured in 20nm!


SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
This compares GM200 vs. Hawaii (W9100). The M6000 will have to compete with Fiji. I also wonder if the card will maintain its max boost clocks while doing compute. The 7 TFLOPS SP figure assumes 1.12GHz.
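For reference, that figure is just peak-throughput arithmetic: shader count x 2 FLOPS per clock (one FMA) x clock speed. A quick back-of-the-envelope sketch, assuming GM200's 3072 shaders (the shader count is not stated in the post above):

# Theoretical single-precision throughput, sketch only.
# Assumes GM200's 3072 shaders and 2 FLOPS per shader per clock (one FMA); clock taken from the post above.
shaders = 3072
flops_per_shader_per_clock = 2
clock_ghz = 1.12
tflops_sp = shaders * flops_per_shader_per_clock * clock_ghz / 1000.0
print(round(tflops_sp, 2))  # ~6.88, i.e. the quoted "7 TFLOPS"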

Besides, all I'm saying is that overall GCN is not inferior to Maxwell.
You don't seem to really disagree with that on any particular metric.

As far as sales go, that's nothing I was attempting to address. I'm sure more Quadro cards are sold. It's likely to stay that way until AMD can figure out how to pry the Quadros out of the workstations at Autodesk, etc... and get them to actually design their software with their cards included in the workflow, instead of just AMD optimizing drivers as best they can.

LOL, seriously, I thought GCN was an architecture, not a chip name! And I don't see GCN (Hawaii) anywhere near Maxwell in performance or efficiency. As for DP and OpenCL, NV has CUDA and a huge proportion of the professional market using it, while OpenCL still has bugs in the compilers from NV, AMD and Apple. That is why most professionals aren't moving to AMD cards and OpenCL.
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
I think it would be freaking wonderful if the 390X launched as a 20nm part.

I think most people would love to see that.

To tell you the truth, I would just love to see Fiji launch period.........
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
LOL, seriously, I thought GCN was an architecture, not a chip name! And I don't see GCN (Hawaii) anywhere near Maxwell in performance or efficiency. As for DP and OpenCL, NV has CUDA and a huge proportion of the professional market using it, while OpenCL still has bugs in the compilers from NV, AMD and Apple. That is why most professionals aren't moving to AMD cards and OpenCL.

CUDA has nothing to do with the relative performance of the two designs. Nor does its commercial success. And if you are talking games only for efficiency, then I agree. But that's the only place that it's more efficient. That doesn't make it overall superior.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
LOL, seriously, I thought GCN was an architecture, not a chip name! And I don't see GCN (Hawaii) anywhere near Maxwell in performance or efficiency.

Your post makes the very same mistake you accuse the person you are responding to of making. You just implied GCN architecture is worse than Maxwell architecture in performance and efficiency by comparing older generations of GCN to NV's newest Maxwell. Generation wise, GCN 1.0-1.1 was always meant to compete with Kepler, not Maxwell. Did HD7970/7970Ghz compete well with 680/770? Yes. Did R9 290/290X compete well with 780/780Ti? Yes. That means we have this:

HD7970/7970Ghz (GCN 1.0) vs. GTX680/770
R9 290/290X (GCN 1.1) vs. GTX780/780Ti
???????? vs. GTX970/980/Titan

What do we insert for ?????? The R9 300 series. AMD is behind, which is why the R9 290/290X by default are looking like Maxwell competitors, but in reality they were never intended to serve that role. People who have followed GPU generations clearly understand this. It's no wonder NV is walking all over GCN 1.0-1.1 in perf/watt.

The reason Maxwell is easily winning is that we haven't seen the corresponding GCN architecture from AMD, which was always meant to be the Maxwell competitor. How is this so hard for people to understand? How fair would it be to claim NV is finished/doomed/way behind if we just compared the HD5000 series to GTX200 or the HD7000 series to Fermi, without giving NV a chance to release its own corresponding generation? How short is your memory that you forgot NV was 6-9 months late with GTX460/470/480?!

So in reality the true competitor to Maxwell was always R9 300 series, not R9 200 series. Do you realize the comparison you just made? It's like saying GTX580 is crap because it's only as fast as an HD7870 but uses 2X the power. No **** Sherlock! That's because you aren't comparing like-for-like generations, just like today it's absurd to compare Maxwell to GCN 1.0-1.1 and make the statements people like you make in regard to Maxwell vs. GCN.

Comparing GTX970/980 (new gen mid-range) to R9 290X/780Ti (old gen flagship) is akin to comparing HD7870 (new gen mid-range) to a GTX580 (old gen flagship). You need to be able to compare like-for-like GPU generations to be able to make statements that GCN is not competitive with Maxwell in performance or perf/watt.

It's amazing how no one claimed that NV is doomed when HD7870 smashed GTX580 by 2X in perf/watt but today AMD is clearly doomed, yet they haven't even released the proper Maxwell competitor. :rolleyes:

[Chart: performance per watt at 1920x1080]
 

lopri

Elite Member
Jul 27, 2002
13,329
709
126
IIRC, TSMC publicly stated that 20nm brought a performance improvement of 15% compared to its 28nm. I don't remember whether they meant density, per-clock performance, or power reduction - but whatever it was, I don't think a 15% improvement will be enough to justify designing new GPUs on that node. After adding whatever new features AMD/NV plan to bring with next-gen GPUs, there will be no transistor budget left for the kind of performance increase that is often expected from generational products.

The "Enhanced" Cyclone performs barely 10% faster than the original Cyclone. And Apple had to disable one of the cores in their A8X. Qualcomm's S810 is an underperforming laughingstock. There are enough circumstantial evidences of TSMC's 20nm sucking. Apparently TSMC know this and there have been talks about 16FF, 16FF+, and even 10nm from TSMC and how they plan to bring those processes online quickly.

Under this cloud, the chance of 20nm GPU happening is slim to zilch.
 

Elfear

Diamond Member
May 30, 2004
7,169
829
126
Well, if my Korean guy is correct about 20nm, or if it's really 28nm, as long as they get the TDP down, either of the two processes is fine by me.

Expecting AMD to significantly reduce their TDP below 250W and still compete well with Titan X (250W TDP) is asking too much. AMD has some great engineers but I don't believe the 300 series will be significantly more efficient than Maxwell. If they hit parity with Maxwell, I'd say they're doing really well with a card that's not made for pure gaming.
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
Your post makes the very same mistake you accuse the person you are responding to of making. You just implied GCN architecture is worse than Maxwell architecture in performance and efficiency by comparing older generations of GCN to NV's newest Maxwell. Generation wise, GCN 1.0-1.1 was always meant to compete with Kepler, not Maxwell. Did HD7970/7970Ghz compete well with 680/770? Yes. Did R9 290/290X compete well with 780/780Ti? Yes. That means we have this:
Where did I compare older-gen GCN? I stipulated Hawaii, which is the current gen. 3D stated the names, not me!
Unreleased products don't count, therefore GCN compared to Maxwell is miles behind, period.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Expecting AMD to significantly reduce their TDP below 250W and still compete well with Titan X (250W TDP) is asking too much. AMD has some great engineers but I don't believe the 300 series will be significantly more efficient than Maxwell. If they hit parity with Maxwell, I'd say they're doing really well with a card that's not made for pure gaming.

^^^This^^^

If it's 1/2 DP and even comes close to Maxwell's gaming efficiency @ 1/32 DP, it will be very impressive. I don't think it's going to happen.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Where did I compare older-gen GCN? I stipulated Hawaii, which is the current gen. 3D stated the names, not me!
Unreleased products don't count, therefore GCN compared to Maxwell is miles behind, period.

Except in those workloads I stipulated which you then dismissed. Overall it's not miles better, or even any better. If I was buying a GPU for OpenCL development or DP workloads I'd be laughing at nVidia's latest gen products. Even perf/W, never mind perf/$.
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
CUDA has nothing to do with the relative performance of the two designs. Nor does its commercial success. And if you are talking games only for efficiency, then I agree. But that's the only place that it's more efficient. That doesn't make it overall superior.

Good point, well made!... However, OpenCL is far from a professional platform anyway. The only "firewhatever" users I know ditched the card and went back to CUDA because they lost too many hours to OpenCL issues, NV and AMD both.
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
Except in those workloads I stipulated which you then dismissed. Overall it's not miles better, or even any better. If I was buying a GPU for OpenCL development or DP workloads I'd be laughing at nVidia's latest gen products. Even perf/W, never mind perf/$.

Then you would be cursing it!
 

Vaporizer

Member
Apr 4, 2015
137
30
66
IIRC, TSMC publicly stated that 20nm brought a performance improvement of 15% compared to its 28nm. I don't remember whether they meant density, per-clock performance, or power reduction - but whatever it was, I don't think a 15% improvement will be enough to justify designing new GPUs on that node. After adding whatever new features AMD/NV plan to bring with next-gen GPUs, there will be no transistor budget left for the kind of performance increase that is often expected from generational products.

The "Enhanced" Cyclone performs barely 10% faster than the original Cyclone. And Apple had to disable one of the cores in their A8X. Qualcomm's S810 is an underperforming laughingstock. There are enough circumstantial evidences of TSMC's 20nm sucking. Apparently TSMC know this and there have been talks about 16FF, 16FF+, and even 10nm from TSMC and how they plan to bring those processes online quickly.

Under this cloud, the chance of 20nm GPU happening is slim to zilch.
If AMD were to use 20nm, it would be produced at GF, not TSMC. Therefore all your examples of a broken TSMC are not relevant.
 

Azix

Golden Member
Apr 18, 2014
1,438
67
91
Does AMD have a similar lead in single-precision compute? If they could exploit their architecture for gaming (gaming physics, etc.), that would be good. Simply getting things into games that would require better compute performance.

I guess they don't have a single-precision advantage when it comes to Maxwell. If it's not going to be beneficial for gaming, they really should cut it down.

I've always liked that their cards were 2-in-1, and it's obvious they offer more in their hardware than Nvidia does, but they need to exploit that. A standard for physics is way overdue.

It is also worth noting that although the 290/290X are older, they are still relevant against the Maxwell chips. Years from now the 290X might well be ahead of the 980 in performance as driver updates weaken. The 970 is beaten by its older competitor.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
It is also worth noting that although the 290/290X are older, they are still relevant against the Maxwell chips. Years from now the 290X might well be ahead of the 980 in performance as driver updates weaken. The 970 is beaten by its older competitor.

In a strict business sense, they aren't really relevant. They are selling at roughly half or less of their original prices! And you don't want people sticking with their purchases for their longevity, because that simply equates to a lost customer for the next gen or two.

On future performance, it's going to seriously expose the DX12 implementations from both camps. I think this will really determine who has the upper hand with future titles in the years to come.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Except in those workloads I stipulated which you then dismissed. Overall it's not miles better, or even any better. If I was buying a GPU for OpenCL development or DP workloads I'd be laughing at nVidia's latest gen products. Even perf/W, never mind perf/$.

Good luck ;)
 

alcoholbob

Diamond Member
May 24, 2005
6,390
470
126
Expecting AMD to significantly reduce their TDP below 250W and still compete well with Titan X (250W TDP) is asking too much. AMD has some great engineers but I don't believe the 300 series will be significantly more efficient than Maxwell. If they hit parity with Maxwell, I'd say they're doing really well with a card that's not made for pure gaming.

It makes sense why it was leaked as a water-cooled part all along. It might be about as fast as or faster than a Titan X, but needing a CLC for your baseline model means it's probably going to be a ~350W+ card.

This also means Nvidia is free to have their partners slap a CLC on a 6GB GM200, clock it up to 1500MHz or higher, and easily surpass the 390.

People who have their Titan Xs on water now already have that performance today.
 

Azix

Golden Member
Apr 18, 2014
1,438
67
91
In a strict business sense, they aren't really relevant. They are selling at roughly half or less of their original prices! And you don't want people sticking with their purchases for their longevity, because that simply equates to a lost customer for the next gen or two.

On future performance, it's going to seriously expose the DX12 implementations from both camps. I think this will really determine who has the upper hand with future titles in the years to come.

That's for the shareholders and CEOs to worry about. I'm thinking from a consumer's perspective.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
It makes sense why it was leaked as a water-cooled part all along. It might be about as fast as or faster than a Titan X, but needing a CLC for your baseline model means it's probably going to be a ~350W+ card.

This also means Nvidia is free to have their partners slap a CLC on a 6GB GM200, clock it up to 1500MHz or higher, and easily surpass the 390.

People who have their Titan Xs on water now already have that performance today.

Or AMD didn't see the point in investing the money in developing a blower when the AIO already existed and all they had to do was buy it.

Considering that the original Titan suffered from throttling and reviewers haven't had much good to say about the stock cooler on the Titan X, why would AMD want to invest anything in their own version?

People can try and spin the AIO as something negative, but the reality is it's simply a superior solution.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
It makes sense why it was leaked as a water-cooled part all along. It might be about as fast as or faster than a Titan X, but needing a CLC for your baseline model means it's probably going to be a ~350W+ card.

Not this again. You do NOT need an AIO CLC to dissipate even 500W of heat from a GPU. The reason to use an AIO CLC on a graphics card is not because it's needed, but because it's simply better than any air-cooled solution when it comes to 2 key metrics: performance and noise levels. There is no air cooler in the world that can compete with a 120mm AIO CLC as far as GPU cooling goes. It only makes sense that GPU makers move into the 21st century and offer us options for both air- and water-cooled warrantied solutions, especially since water is the better performer. If you don't want water in your system, there will be MSI Lightning, Sapphire Tri-X, Asus Strix versions, etc. It still amazes me to this day how conservative some PC gamers are and how they refuse to embrace options.

The inclusion of water cooling on a component in no way means it's required. If Intel shipped the Corsair H100i GTX with some SKUs of its X99 CPU series, would you say it's required, or would you say "Wow, thanks Intel for giving me the option of having a warrantied AIO CLC"?

This also means Nvidia is free to have their partners slap a CLC on a 6GB GM200, clock it up to 1500MHz or higher, and easily surpass the 390.

Great! More competition, even better performance. We win as consumers. :thumbsup: Nothing would be better than NV giving us 1.5Ghz GM200 6GB AIO CLC, all warrantied. If AMD forces NV to make such a card, why wouldn't gamers be excited? It's not about NV or AMD winning, but us winning, us, the gamers!

People who have their Titan Xs on water now already have that performance today.

That's great! If some people are willing to pay $1K for Titan X and $2K for 2x Titan Xs, there is no need to wait for R9 390X / GM200 6GB cards. Some people couldn't care less about GTX780 and 780Ti or R9 290/290X and simply bought a pair of the original $1000 Titans too. If price/performance and actual cash outlay of $1000 per GPU don't mean much to some consumer, by all means they are free to buy Titan X(s) and slap water blocks on them. Clearly some people on this forum do not understand that some of us have other hobbies outside of games and that means we won't spend $2000 on GPUs. The R9 390 series is not meant to get Titan owners to upgrade. It's about establishing new price/performance levels for more gamers, making that performance more accessible, forcing more competition with GM200 6GB, forcing price drops on 980 / possibly forcing NV to release faster cards like 960Ti, 970Ti, 980Ti, etc. If you are already on a 1.4Ghz Titan X, the R9 390X series is not for you. I don't know why this is surprising to you. If someone purchased a 6800U or a 7900GTX first, they wouldn't side-grade to an X850XT PE or X1950XTX in the same generation.

This reminds me of $1K original Titan owners who would be proud of having $550 R9 290X series performance for 9 months already. That's nice, someone 9 months later essentially could get 2x Titans by buying 2x R9 290Xs. That's how the market works and early adopters understand this. If R9 390X matches the Titan X or comes in at 95% of its performance for $699, Titan X owners won't care. And if in 18 months there is an NV/AMD card for $500 that beats the Titan X, Titan X owners again shouldn't care.
 
Feb 19, 2009
10,457
10
76
If R9 390X matches the Titan X or comes in at 95% of its performance for $699, Titan X owners won't care.

Well, let's face reality, it's very hard for AMD to swing loyal NV users into switching.

Even if 390X is 25% faster than Titan X at $799.

Reading some comments from Titan X owners on [H], OCN, NeoGAF, etc., is hilarious. Examples include: "Wouldn't touch AMD GPU even if NV raped my wife & burn down my house"... Classics like that are why NV can charge whatever the heck they want for GPUs. Including the $3,000 Titan Z that's slower & throttles badly.
 

DeathReborn

Platinum Member
Oct 11, 2005
2,786
789
136
HD7970/7970Ghz (GCN 1.0) vs. GTX680/770
R9 290/290X (GCN 1.1) vs. GTX780/780Ti
???????? vs. GTX970/980/Titan

Wouldn't your "table" mean this:

HD7970/7970Ghz (GCN 1.0) vs. GTX680/770 (GK104)
R9 290/290X (GCN 1.1) vs. GTX780/780Ti (GK110)
R9 285 (GCN 1.2) vs. GTX970/980/Titan (GM204/GM200)
 
Feb 19, 2009
10,457
10
76
Wouldn't your "table" mean this:

HD7970/7970Ghz (GCN 1.0) vs. GTX680/770 (GK104)
R9 290/290X (GCN 1.1) vs. GTX780/780Ti (GK110)
R9 285 (GCN 1.2) vs. GTX970/980/Titan (GM204/GM200)

Not unless R9 390/X is identical to GCN 1.2, which we know it won't be due to at least HBM; that would require an entirely new memory subsystem.

So you could say GCN 1.3 or GCN 2.0, or whatever AMD wants to call it, is the Maxwell competitor.
 
Mar 10, 2006
11,715
2,012
126
Well, let's face reality, it's very hard for AMD to swing loyal NV users into switching.

Even if 390X is 25% faster than Titan X at $799.

Reading some comments from Titan X owners on [H], OCN, NeoGAF, etc., is hilarious. Examples include: "Wouldn't touch AMD GPU even if NV raped my wife & burn down my house"... Classics like that are why NV can charge whatever the heck they want for GPUs. Including the $3,000 Titan Z that's slower & throttles badly.

Dude, I haven't bought an AMD card for my own use in years, but if 390X had been in the market at the same time as the Titan X, and the 390X was 25% faster and $200 cheaper, then I'd have gone with the R9 390X.

But the reality is, AMD didn't show up to the fight, and NVIDIA won by default. I think it was AMD's own people who stressed how important showing up to the fight is:

http://www.anandtech.com/show/2937