[BitsAndChips]390X ready for launch - AMD ironing out drivers - Computex launch

Page 27 - AnandTech Forums

raghu78

Diamond Member
Aug 23, 2012
4,093
1,476
136
Not sure why you are ignoring the evidence. The 4800 and 5800 series followed the 2900 and 3800 series. The GPU division increased market share to almost even and was making 10% profit margins DURING the recession, while nVidia was making losses and losing market share.

I have no idea how you can call that strategy a failure. Those profits were huge at that time.

Exactly. He fails to see things in the context of the overall GPU market, the general economy during the 2008-2009 recession, and the backdrop of the HD 2900XT debacle, when AMD's GPU market share took a real beating. AMD turned it around with the HD 4800 series and consolidated it with the HD 5000 series.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
imho,

The sweet spot strategy was a success: it turned around their graphics division, made profits, and took discrete leadership away from nVidia. AMD's execution and balanced, efficient GPUs put a lot of pressure on nVidia, who tried to bring the big dies to market first -- which is very difficult.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
I'll judge it as a non-biased gamer, thanks. I will say it again: coming very late with a massive die, a high TDP and an advantage in memory tech, yet without beating Titan X, is FAILURE. Utter FAILURE.

I don't buy my video cards to beat other video cards. I buy them to render pixels on my monitor. And I buy the card that will give me the highest FPS for what I have to spend.

What you're talking about is bragging rights. Basing your video card purchases simply to be king of the hill is the utter failure in my opinion.
 

therealnickdanger

Senior member
Oct 26, 2005
987
2
0
I don't buy my video cards to beat other video cards. I buy them to render pixels on my monitor. And I buy the card that will give me the highest FPS for what I have to spend.

This.

Like all marketing (GPUs, soft-drinks, politicians, etc.) the goal is to make you feel superior to the opposition - even if that opposition is competing in a different event altogether. Not unlike gluten-free advocates that suddenly become wise sages regarding economics. One has nothing to do with the other.

If the 390X is slower than the Titan X/980Ti and more expensive, then we have a problem. Otherwise, it will more than likely be priced appropriately to its performance level.
 
Feb 19, 2009
10,457
10
76
I don't buy my video cards to beat other video cards. I buy them to render pixels on my monitor. And I buy the card that will give me the highest FPS for what I have to spend.

What you're talking about is bragging rights. Basing your video card purchases simply to be king of the hill is the utter failure in my opinion.

It's not about bragging rights.

For people who want the best and are willing to pay for it, that's where the top of the market is. That's why NV can sell the 980 and Titan X at such inflated prices, and they're selling beyond expectations.

390X is AMD's biggest opportunity to steal the performance crown in a very long time. Look at how well AMD competed against GK110 with Hawaii. Much smaller die, same memory tech, matching & exceeding on performance.

For AMD to be unable to exceed GM200 with a huge 390X GPU plus the HBM advantage would be going backwards. That's a fail.
 
Feb 19, 2009
10,457
10
76
If the 390X is slower than the Titan X/980Ti and more expensive, then we have a problem. Otherwise, it will more than likely be priced appropriately to its performance level.

How low would the price have to be for it to be appropriate?

R290/X at $240/280 doesn't help them be profitable; going by the quarterly reports, it's actually a major loss.

If the 390X is 20% slower than Titan X (or 20% faster than the 980) as rumored, you can bet NV will put out a cut-down GM200 to put price pressure on it.

Imagine the cut-down 390 if that were the case. It would be only slightly faster than the 980, probably equal to custom 980 models. The 970/980 are small mid-range GPUs, and 224/256-bit bus PCBs are cheap. NV drops the 980 to $399.

Now AMD is forced to sell an expensive HBM + massive GPU combination for less than that (NV premium tax and all).

That said, I don't see much chance of this happening, because seeing how fast R290/X is compared to NV's big Kepler, there's almost zero chance of a large next-gen 390X + HBM being slower than GM200.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
For AMD to be unable to exceed GM200 with a huge 390X GPU plus the HBM advantage would be going backwards. That's a fail.
Again, if AMD can put out cards that compete favorably against Nvidia's offerings on a price:performance comparison, it's not a fail in my book. I don't really care which company makes the uber ultimate super mega fastest card. I care about which card will give me the most performance for my money when I feel the need to upgrade. Apparently I'm not the only person who thinks this way as TechPowerUP has dedicated a page in every video card review comparing "Performance per Dollar" of many recent video cards:

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan_X/31.html

If people want to spend 100% more to get 15% more performance, that's up to them. If I think they're idiots for doing so, that's up to me. Obviously their goal is to get every last FPS they can, price be damned. I prefer to get the best value I can for my money. It's simply a difference of priorities.
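That "Performance per Dollar" page is just a ratio table, and the idea is easy to make concrete. A minimal sketch, where the card names and all figures are made-up placeholders for illustration, not numbers from TechPowerUP or any review:

```python
# Performance-per-dollar ranking, the metric the review page tabulates.
# All card names and figures below are illustrative placeholders.
cards = {
    "Card A (flagship)":   {"price": 999, "perf": 100},
    "Card B (high-end)":   {"price": 549, "perf": 85},
    "Card C (sweet spot)": {"price": 329, "perf": 75},
}

# Rank by performance points per dollar, best value first.
ranked = sorted(cards.items(),
                key=lambda kv: kv[1]["perf"] / kv[1]["price"],
                reverse=True)
for name, c in ranked:
    print(f"{name}: {100 * c['perf'] / c['price']:.1f} perf per $100")
```

With numbers shaped like these, the cheapest card tops the value ranking even though it is the slowest in absolute terms, which is exactly the argument being made.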
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,250
136
How low would the price have to be for it to be appropriate?

R290/X at $240/280 doesn't help them be profitable; going by the quarterly reports, it's actually a major loss.

If the 390X is 20% slower than Titan X (or 20% faster than the 980) as rumored, you can bet NV will put out a cut-down GM200 to put price pressure on it.

Imagine the cut-down 390 if that were the case. It would be only slightly faster than the 980, probably equal to custom 980 models. The 970/980 are small mid-range GPUs, and 224/256-bit bus PCBs are cheap. NV drops the 980 to $399.

Now AMD is forced to sell an expensive HBM + massive GPU combination for less than that (NV premium tax and all).

That said, I don't see much chance of this happening, because seeing how fast R290/X is compared to NV's big Kepler, there's almost zero chance of a large next-gen 390X + HBM being slower than GM200.

Time will tell, I guess. Here's hoping the reviews are impartial, use a wide variety of games, and most importantly leave the past where it belongs.
 

B-Riz

Golden Member
Feb 15, 2011
1,595
765
136
How low would the price have to be for it to be appropriate?

R290/X at $240/280 doesn't help them be profitable; going by the quarterly reports, it's actually a major loss.

If the 390X is 20% slower than Titan X (or 20% faster than the 980) as rumored, you can bet NV will put out a cut-down GM200 to put price pressure on it.

Imagine the cut-down 390 if that were the case. It would be only slightly faster than the 980, probably equal to custom 980 models. The 970/980 are small mid-range GPUs, and 224/256-bit bus PCBs are cheap. NV drops the 980 to $399.

Now AMD is forced to sell an expensive HBM + massive GPU combination for less than that (NV premium tax and all).

That said, I don't see much chance of this happening, because seeing how fast R290/X is compared to NV's big Kepler, there's almost zero chance of a large next-gen 390X + HBM being slower than GM200.

Perhaps many people are waiting to see what the 300 series offers and don't want to buy a 200 series right now, but could buy a 200 series right now if they wanted to.

If you are in no hurry, and got priced out of upgrading by the mining craze that happened with the 290 / 290x launch (along with no aftermarket cooler options for a few months), why not wait and see how the 300 series performs and what the cost will be?
 

therealnickdanger

Senior member
Oct 26, 2005
987
2
0
R290/X at $240/280 doesn't help them be profitable; going by the quarterly reports, it's actually a major loss.

Can you provide me with direct evidence that AMD is selling the 290/X at a loss? Since AMD combines their CPU/APU and GPU sales in one report, I find it difficult to identify the exact silicon responsible for AMD's losses. Please share this knowledge with me.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Perhaps many people are waiting to see what the 300 series offers and don't want to buy a 200 series right now, but could buy a 200 series right now if they wanted to.

If you are in no hurry, and got priced out of upgrading by the mining craze that happened with the 290 / 290x launch (along with no aftermarket cooler options for a few months), why not wait and see how the 300 series performs and what the cost will be?

That's basically how I ended up with a GTX 780 Lightning. At the time of purchase it was cheaper than a stock R9 290.

And now that I'm back in the market to buy, it seems GTX 980 Ti will hit the scene first, by a few months.

As just one ant in this ant hill, AMD continues to miss out on selling to me. And to think I used only Radeons in my personal rig for the last 15 or so years. Woof.
 

B-Riz

Golden Member
Feb 15, 2011
1,595
765
136
That's basically how I ended up with a GTX 780 Lightning. At the time of purchase it was cheaper than a stock R9 290.

And now that I'm back in the market to buy, it seems GTX 980 Ti will hit the scene first, by a few months.

As just one ant in this ant hill, AMD continues to miss out on selling to me. And to think I used only Radeons in my personal rig for the last 15 or so years. Woof.

FWIW, I ran a GTX 780 ACX from Feb 2014 - Apr 2014; grabbed the 290 mostly for Mantle and 4GB.

The 780 worked great, but got the itch for the new toy.

The last nVidia card before that was an AGP 6600 GT; I think I had that for over 2 years!
 

tential

Diamond Member
May 13, 2008
7,348
642
121
I don't buy my video cards to beat other video cards. I buy them to render pixels on my monitor. And I buy the card that will give me the highest FPS for what I have to spend.

What you're talking about is bragging rights. Basing your video card purchases simply to be king of the hill is the utter failure in my opinion.

Amen.
Why I find the R9 290 so appealing right now is that it's far cheaper than the GTX 970 and puts out similar or better performance. What I find appealing about the R9 290X is that it usually beats the GTX 970 and is STILL cheaper.
Is the R9 290X better than its "competitor," the GTX 980? Debatable, most times not. But at the end of the day, I'm buying it to render pixels; I don't care.
If the 390X is $500-700 and is as fast as or slightly slower than the Titan X (or GTX 980Ti), then I'm OK with it. I'm looking for a price-competitive product, not an "I beat xyz product so I win!!!!"
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I don't buy my video cards to beat other video cards. I buy them to render pixels on my monitor. And I buy the card that will give me the highest FPS for what I have to spend.

What you're talking about is bragging rights. Basing your video card purchases simply to be king of the hill is the utter failure in my opinion.

:thumbsup: That's my view too. With GPUs advancing so fast and new price/performance levels being established every year ($400 R9 290 vs. $1K Titan, $330 970 vs. $699 780Ti, etc.), I don't see much point in spending hundreds of dollars extra for that last 15% increase in performance. That's why I thought cards like the GTX 570 and HD 6950 were more exciting that generation than the 580 and 6970. Similarly, I much prefer the HD 5850, R9 290 and GTX 970 to the 5870, R9 290X and 980.

A lot of people also don't account for the real financial impact for a gamer who resells older cards. As a hypothetical example, let's say I can find a deal on an R9 295X2 for $450; after selling my current cards, it'll cost me only $250 to get that upgrade. However, if I am looking at an $800-1000 card, it'll cost me $600-800 to upgrade, a huge difference! If the R9 295X2 (or some other card) gets me the performance increase I desire at 1080P, why would I spend $600-800 more for that last 15-20% that only shows up at 4K? That's just one example.
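The resale arithmetic in that hypothetical is worth spelling out. A minimal sketch; the $450 price is the hypothetical deal from the example, and the $200 resale value is implied by the stated $250 out-of-pocket cost:

```python
def net_upgrade_cost(new_card_price, resale_value):
    """Out-of-pocket cost of an upgrade after selling the old card(s)."""
    return new_card_price - resale_value

# Hypothetical deal from the post: $450 R9 295X2, with $200
# recovered by selling the current cards ($450 - $200 = $250 out of pocket).
print(net_upgrade_cost(450, 200))   # 250
# Versus an $800-1000 halo card with the same $200 resale value:
print(net_upgrade_cost(800, 200))   # 600
print(net_upgrade_cost(1000, 200))  # 800
```

The point of the comparison: the sticker prices differ by $350-550, but so does the effective upgrade cost, since the resale value is fixed.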

AMD isn't going to sell many 390Xs at $599 if it's slower than the Titan X. Why? Because the 980Ti is coming, as fast as or faster than the Titan X at ~$799. The gap will be a blowout. They'll be forced to sell the 390X for $399, and then the poor 390, also with HBM, at what, $299? lol.

Since you are speculating: we don't know how fast the 980Ti will end up being. What if we get an R9 390 non-X with 90% of Titan X's performance at $499, and an R9 390X with 97% of Titan X's performance for $599? So what if NV then has a card 10% faster than the Titan X for $799? All three of these cards would have their own market segments. You make it sound like going from $499-599 to $799 is pocket change, but what if someone is upgrading from dual cards to dual cards? All of a sudden your upgrade path becomes $1000-1200 vs. $1600. That's a big difference for that extra 15-20% performance.

Since we are discussing hypothetical scenarios, as you said: you sound 100% convinced that a card slower than a Titan X at $499 would be a failure, even with the 980Ti at $799. Hmm...

R9 390 nonX $500 = 90%
Titan X $1000 = 100%
980Ti $800 = 110% (22% faster than the R9 390 for 60% price increase!)
R9 390 nonX CF $1000 = 144%-150%
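The relative numbers in that hypothetical lineup are easy to sanity-check. A quick sketch; every figure here is the made-up positioning from the list above, not benchmark data:

```python
# Hypothetical positioning from the lineup above
# (performance indexed to Titan X = 100; not real benchmark numbers).
r390_perf, r390_price = 90, 500
gtx980ti_perf, gtx980ti_price = 110, 800

perf_gain = gtx980ti_perf / r390_perf - 1      # ~0.22 -> "22% faster"
price_gain = gtx980ti_price / r390_price - 1   # 0.60  -> "60% price increase"
print(f"980Ti vs. R9 390: +{perf_gain:.0%} performance for +{price_gain:.0%} price")
```

This is where the "22% faster for a 60% price increase" line in the list comes from.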

Would you pay $300 extra for a 22% increase in performance with a 980Ti? Would you get an $800 card over 2x R9 390s with the positioning I just outlined? I wouldn't. I'd rather get a single $500 card, or dual $500 cards, than that $800 card. Again, since we are just discussing hypotheticals, I think you have to admit there are too many variables/uncertainties in this equation. For starters, we also need to know the price/performance of the cut-down GM200 6GB. That card could end up being the killer high-end card of the generation, just like the 6800GT was! The vocal minority on these forums is way too obsessed with the market catering to the top 4% of PC gamers. The 6800GT was one of the best cards ever made, and it wasn't even in the top 6 fastest cards of its generation. This generation, it's certainly the 970 capturing that spotlight, not the 980 or the Titan X.

The X850 XT Platinum Edition, X850 XT, 6800 Ultra Extreme, X800 XT Platinum Edition, 6800 Ultra and X800 XT all beat the 6800GT, and yet the 6800GT was one of the best cards of that generation.

As far as your points about the 980 selling for $500-550 against a $300 R9 290X: that has almost everything to do with the image of the 290 series. It's pretty clear that NV gamers prefer the GTX 970 over the 980 by a wide margin, and the R9 290X offers even better price/perf and more VRAM than the 970 does. Therefore, imho, the market's current trend of skipping the 290X stems largely from a misunderstanding of the real-world power usage, noise levels, temperatures and performance available in after-market 290X versions. Right now the average Joe's perception of the 290 series is that all of them run hot and loud and have mediocre performance, based on what he remembers from launch. This view is reinforced by more recent reviews, which continue to use the reference blower 290X in all of their temperature, noise, power usage and overclocking sections. This coincidentally also explains why the GTX 960 is vastly outselling the 280X and 290, despite both of those cards being superior for gaming.
 

JSt0rm

Lifer
Sep 5, 2000
27,399
3,948
126
I jumped on the 290X because I don't want to pay even $500 for a GPU. I like the $300 price point every 3 years or so; the 290X fit nicely into that scheme and seems to be a nice upgrade from the 6970 2GB I was running before.
 

HurleyBird

Platinum Member
Apr 22, 2003
2,818
1,553
136
The sweet spot strategy was a success and turned around their graphics division, making profits, and did take discrete leadership away from nVidia. AMD's execution and balanced, efficient GPU's placed a lot of pressure based on nVidia tried to bring the big dies first to market -- and very difficult.

It's the opposite. RV770 was light years ahead of GT200 in perf/mm2 and double precision. Architecturally, Conroe vs. A64 comes to mind, and that's probably an understatement. AMD stupidly decided to take this enormous architectural advantage and squander it on a tiny 260mm2 chip, compared to Nvidia's mammoth 576mm2 GT200. A hypothetical 500+mm2 RV770 with 1600 shaders and a 384-512-bit GDDR5 bus would have resulted in the biggest beatdown in graphics history, worse than G80 vs. R600 or the 9000 series vs. FX. AMD could have gone for Nvidia's jugular, but because of poor management they wasted the opportunity.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
If people want to spend 100% more to get 15% more performance, that's up to them. If I think they're idiots for doing so, that's up to me. Obviously their goal is to get every last FPS they can, price be damned.

Ya, but the market gets FAR more irrational than that. Look at Titan Z reviews vs. R9 295X2 reviews online. The R9 295X2 was faster + cooler + quieter + half as expensive. Today the R9 295X2 is $650 on Amazon while the Titan Z is $3000, yet look at the user reviews:

Titan Z = 5 stars: 75%, 4 stars = 8%, 14% with 1-2 star reviews
XFX R9 295X2 = 5 stars: 67%, 4 stars = 12%, 15% with 1-2 star reviews

Having read zero professional reviews, you'd naturally assume from those ratings that the $3000 Titan Z was actually a better product for gaming than the $650 R9 295X2.

I prefer to get the best value I can for my money. It's simply a difference of priorities.

That's why I prefer price/performance: I feel it's the least biased metric in GPU selection. I also feel that a certain group of gamers on this forum, by hyping up the 980 and Titan X so much, are overshadowing the importance of the $150-400 GPU segments.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
It's the opposite. RV770 was light years ahead of GT200 in perf/mm2 and double precision. Architecturally, Conroe vs. A64 comes to mind, and that's probably an understatement. AMD stupidly decided to take this enormous architectural advantage and squander it on a tiny 260mm2 chip, compared to Nvidia's mammoth 576mm2 GT200. A hypothetical 500+mm2 RV770 with 1600 shaders and a 384-512-bit GDDR5 bus would have resulted in the biggest beatdown in graphics history, worse than G80 vs. R600 or the 9000 series vs. FX. AMD could have gone for Nvidia's jugular, but because of poor management they wasted the opportunity.

Pretty much. The HD 4890's die size was tiny compared to the GTX 275's, its PCB was smaller, and it was way more power efficient. The same people who bought Kepler and Maxwell bought GTX 275/280/285 cards. :whistle:

[Image: cards-bare.jpg]

[Image: power-load.gif]


AT's review of the HD 4890 vs. the GTX 275:

"At 1680 x 1050 and 1920 x 1200 the 4890 is nearly undefeated. At 2560 x 1600, it seems to be pretty much a wash between the two cards."

On the 285 vs. 4890:

"All but two. That's how many benchmarks in which our 1GHz/1.2GHz (core/mem) Radeon HD 4890 lead the stock NVIDIA GeForce GTX 285. These tests show that there is the potential for a 959 Million transistor AMD GPU to consistently outperform a 1.4 Billion transistor NVIDIA GPU in the same power envelope at 55nm with similar memory bandwidth." Oh snap, a smaller chip on a $250 256-bit-bus card was able to beat a $399 384-bit 1.4B-transistor NV chip. Damn, AMD must have had 80% market share during that era... oh wait!

The HD 4890 also cost about $225-250 on the market when the GTX 285 was $350-400, yet people still bought NV. So during that generation AMD led in price/performance, the 4890 OC beat the 275 OC, and the 4890 had more VRAM, superior perf/watt and superior perf/mm2, but people still bought NV. Today, TechReport's, TPU's and AT's readers who are ex-GTX 200/Fermi owners are in love with the perf/watt of the 750 Ti, 960 and 970, and openly bash the 290/290X's awesome price/perf. Lulz. :whistle:

That's why all these hyped-up 'pro-NV' metrics emphasized during Kepler and Maxwell, such as perf/mm2, perf/watt and overclocking, are just used to hide one's preference for a certain brand. No one who touts the perf/watt, efficiency and overclocking of Kepler and Maxwell would have bought the GTX 200 series, or worse yet Fermi, unless they are a hypocrite or don't value these metrics nearly as much as they claim!

And how can we forget one of the most power-inefficient cards ever made: the GTX 460, which needed to be max overclocked to keep up, at which point it used 100W+ more power than the 5850, despite launching 9 months later. Talk about an engineering flop that became a marketing success!

[Image: p460_power.png]



And this is the core difference between gamers who focus on price/perf and gamers who focus on all the other metrics: consistency. When you pick and choose which metrics matter each generation, you stand for nothing other than supporting your buying decision with whatever random metric happens to be winning that particular generation. With gamers who buy NV, you almost never know what the most important metric is in any given generation. One certainly has to wonder what the heck it was during the GTX 200 and Fermi days... :colbert: (About the only gamers who are justified are those who always buy the best, in which case NV's GTX 285 and GTX 480/580 did make sense, but everyone else?)
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
It's the opposite. RV770 was light years ahead of GT200 in perf/mm2 and double precision. Architecturally, Conroe vs. A64 comes to mind, and that's probably an understatement. AMD stupidly decided to take this enormous architectural advantage and squander it on a tiny 260mm2 chip, compared to Nvidia's mammoth 576mm2 GT200. A hypothetical 500+mm2 RV770 with 1600 shaders and a 384-512-bit GDDR5 bus would have resulted in the biggest beatdown in graphics history, worse than G80 vs. R600 or the 9000 series vs. FX. AMD could have gone for Nvidia's jugular, but because of poor management they wasted the opportunity.

imho,

The smaller die strategy was more about creating awareness to garner share while still keeping solid margins --- the smaller dies translated into an execution advantage for AMD -- what would have happened if the bigger dies had not been executed in a timely way? AMD is creating a bigger die now --- it's not easy.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126

:thumbsup: Of course there is nothing in here we haven't read before; it's just that most people in this thread who are spreading FUD about the R9 390X choose to ignore it, because it flies in the face of all the re-branding theories, and of the theory that the R9 390X will be a dual-GPU card:

"We have revealed in advance that the new AMD cards will be shown during Computex 2015 (link) and that AMD is ironing out the Catalyst drivers before the presentation (link). Now, I have learned that the high-end Radeon 390X (Fiji XT), a single-GPU card, can sport 8GB of HBM memory (so there will be two versions of this card, one with 4GB of HBM and one with 8GB of HBM).

Thanks to the joint effort of AMD and SK Hynix, the Fiji XT GPU can manage 8GB of HBM through the use of a new technology called “Dual Link Interposer”. This new interposer can double the quantity of usable HBM. We will know more about this technology during the next Computex."

imho,

The smaller die strategy was more about creating awareness to garner share while still keeping solid margins --- the smaller dies translated into an execution advantage for AMD -- what would have happened if the bigger dies had not been executed in a timely way? AMD is creating a bigger die now --- it's not easy.

I think AMD should have ditched DP for all post-HD 6000 products and made pure gaming cards like Maxwell. However, AMD chose to grow their FirePro market share and focus on the HPC/GPGPU market segments. Since they don't have enough money to design that many different ASICs, they decided to make a 2-in-1 gaming + HPC/GPGPU compute architecture, and GCN was born. The downside of that strategy is that while GCN is great at gaming, it gets heavily penalized in perf/watt due to the massive compute hardware it carries (i.e., those transistors can't be as readily turned off during gaming, and they take up extra space on the die). Just imagine if the R9 290X had almost no DP transistors: we would have had a sub-400mm2 chip trading blows with NV's 561mm2 780Ti at lower power usage.

Naturally, the task at hand for the R9 390X is of monumental proportions, since GM200 is the most DP-crippled chip, in percentage terms, that NV has made in the last 5 years. If AMD's engineers manage to make the R9 390X as fast as the Titan X at gaming in a 550mm2 die and still increase DP compute performance beyond the 290X's, it will be an extremely impressive engineering feat for one chip! IMO, it's far more impressive to have a 300W chip that's 2-in-1 than to make a 250W chip that's only good at games, SP compute and rendering. But this strategy is clearly way more complex, and for a small company saddled with debt like AMD, it's probably best to focus on making lean gaming chips until they have enough financial and human resources to tackle engineering designs of this magnitude.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
There may be a dual GPU offering -- utilizing full cores, a liquid cooled edition single and an air cooled single.
 

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
Ya, but the market gets FAR more irrational than that. Look at Titan Z reviews vs. R9 295X2 reviews online. The R9 295X2 was faster + cooler + quieter + half as expensive.

You can't compare a single-GPU card to a dual-GPU card. Dual cards are inferior because some games (and some non-gaming applications) can only use a single GPU, and virtually none have 100% scaling. Even when the frame rates are the same, the SLI/Crossfire experience is often inferior. Yes, XDMA helps some, but it's not nearly perfect.
 

biostud

Lifer
Feb 27, 2003
20,231
7,353
136
You can't compare a single-GPU card to a dual-GPU card. Dual cards are inferior because some games (and some non-gaming applications) can only use a single GPU, and virtually none have 100% scaling. Even when the frame rates are the same, the SLI/Crossfire experience is often inferior. Yes, XDMA helps some, but it's not nearly perfect.

Titan Z and 295x2 are both dual GPU cards....
 

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
NV's Maxwell chips throw FP64 under the bus, and that's why GM200 is 600 sq mm. Nvidia's increased performance efficiency (perf/sq mm) came at the cost of gutting FP64 performance. GM200's FP64 performance is a pitiable 200-220 GFLOPS. :D

http://www.anandtech.com/show/9096/nvidia-announces-quadro-m6000-quadro-vca-2015

On the contrary, AMD's next-gen flagship FirePros will keep up the tradition, with the same 1/2-rate FP64 as the Hawaii-based FirePro W9000. The flagship FirePros based on the next-gen GCN flagship will have a stunning 3.5 TFLOPS of FP64 performance, even accounting for lower clocks of 900 MHz to stay within a 250W TDP.

The next flagship Tesla product from Nvidia will not launch until Q2 2017, when GP100 launches. That's the reason why Nvidia launched the dual-GK210-based Tesla K80 in Nov 2014.

http://www.anandtech.com/show/8729/nvidia-launches-tesla-k80-gk210-gpu

Even assuming AMD launches their next-gen flagship FirePro in Q4 2015 or early Q1 2016, their single-GPU FP64 monster will face no competition in HPC for the whole of 2016.

That is an interesting point. Are there any figures indicating how well AMD's current FirePro products sell? Are the extra sales for FP64-specific applications going to be enough to make up for a shortfall in gaming revenue?

Perhaps a better question is why, if Fiji is a chip that excels more in FP64 than in gaming, AMD didn't just start off with the FirePro and bring out the consumer R9 390X later (if at all). This is what Nvidia did with GK110: Tesla cards first, then Titan, then the GTX 780 (Ti). The later GK210 revision is a pure compute chip that isn't going to be included in any gaming cards at all.

I do not expect Nvidia to have a smooth transition to 16/14nm, Pascal, NVLink and HBM. If they go straight to an all-out transition, without any 16/14nm Maxwell shrinks, they are taking on even more risk than with Fermi GF100. Let's see how this plays out. Nvidia's strength is GPU architecture and software. AMD's strength is engineering, and that's the reason AMD was first to GDDR5 and has now co-invented HBM along with Hynix and proposed it to JEDEC, which adopted it as the JESD235 standard in Oct 2013. AMD's efforts with HBM will pay off over the next 2 years, just as they did from the HD 4870 through the HD 5000 series.

I suspect that Nvidia is going to wind up using a de facto "tick-tock" development plan: new architecture on an existing node, followed by a die shrink to a new node, then repeat. This way, they avoid the issue you discuss where they have to develop a new architecture on a new and untested process. This would not have been feasible at one point for marketing reasons, but now that high-performance nodes are going to stagnate for 4 to 6 years apiece at a minimum, it becomes much more viable as a business plan.

As for AMD, I have mentioned before that I think the real reason they're trying out HBM is to get experience for their upcoming generation of 16nm FinFET Zen APUs. The biggest problem with their existing APU lineup has been the lack of memory bandwidth; discrete cards can use GDDR5, but motherboard manufacturers resist using it (they refused to do so with Kaveri, and that feature had to be scrapped), so putting memory on-package with HBM is the only way for APUs to be remotely competitive.