[BitsAndChips] 390X ready for launch - AMD ironing out drivers - Computex launch

Member since Feb 19, 2009
Pretty much. The HD4890's die size was tiny compared to the GTX275's, the PCB was smaller, and it was way more power efficient. The same people who bought Kepler and Maxwell bought GTX275/280/285 cards. :whistle:

I bought a 4870, 2 in fact.

I thought they were a damn bargain, way too cheap given the performance and power-use advantage, as well as the extra VRAM.

In fact, I would have bought them even if they were $100 higher in price. That was the failure during the 4800 and 5800 series. The 5800 series in particular, because NV was a no-show for 6 months with the 480 and 9 months with the 460 (which was slower and more power hungry, relying on comparisons of factory OC models vs. the reference 5800 series). They didn't need to go so low on prices and price-war NV. The 4800 series was dominating, so they came from a position of strength.

AMD could have priced them much higher and still sold well due to the lack of competition. They did not capitalize.
 

RussianSensation (Elite Member, joined Sep 5, 2003)
There may be a dual GPU offering -- utilizing full cores, a liquid-cooled edition single and an air-cooled single.

I don't doubt it; AMD has had one every generation since the HD4870X2. They already have the infrastructure and experience with a 500W R9 295X2. Whether it will use fully unlocked R9 390 or 390X cores, or cut-down/downclocked 390X cores, remains to be seen. I presume such a card will succeed the R9 295X2 at the $1,499 level.

In AMD's case, I'd try to pull off a major upset and shock by launching that card at $999-1099. I thought the HD5970 at $599 vs. the $369 HD5870 was one of the most successful strategies for its time. Imagine a card 40-50% faster than the Titan X at $999-1099 with an AIO CLC and a warranty.

You can't compare a single-GPU card to a dual-GPU card. Dual cards are inferior because some games (and some non-gaming applications) can only use a single GPU, and virtually none have 100% scaling. Even if the frame rates are the same, the SLI/Crossfire experience is often inferior. Yes, XDMA helps some, but it's not nearly perfect.

Titan Z is a dual-GPU card and competed directly against the R9 295X2, another dual-GPU card. Titan Z lost badly in nearly every metric. Today the Z sells for $3000 but R9 295X2 is $650, yet the Z is slower.



What made it worse is that the Titan Z thermal throttles by 30% with the stock BIOS/auto fan speed control, but I guess those 75% 5-star reviewers have no idea this is happening. :p

After 20 minutes of gaming, the Titan Z's clock speed dropped from 1000MHz to 706MHz.
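That works out to roughly the 30% figure above; a quick sketch of the arithmetic (both clock figures are from this post, not re-measured):

```python
# Titan Z throttling: 1000 MHz stock boost vs. 706 MHz after 20 minutes of gaming.
stock_mhz, throttled_mhz = 1000, 706

drop_pct = (stock_mhz - throttled_mhz) / stock_mhz * 100
print(f"Titan Z clock reduction: {drop_pct:.1f}%")  # -> 29.4%, i.e. roughly 30%
```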

[Image: Titanz.png (Titan Z clock speed under sustained load)]


In comparison the R9 295X2 was rock stable.

[Image: R9295X2-2-76.jpg (R9 295X2 clock speed under sustained load)]


Even if we ignore R9 295X2 for a second, the Titan Z was an unfinished and horribly executed design. How can you release something on the market that thermal throttles by up to 30% but costs $3000? :rolleyes:

Even NV's mouthpiece couldn't defend the turd that was the Titan Z:

"PC Perspective finally put the GTX Titan Z to the test and found that from a PC gamer's view, the card is way overpriced for the performance it offers. At both 2560x1440 and 3840x2160 (4K), the R9 295X2 offered higher and more consistent frame rates, sometimes by as much as 30%. The AMD card also only takes up two slots (though it does have a water cooling radiator to worry about) while the NVIDIA GTX Titan Z is a three-slot design."

My point in using the $3K Titan Z vs. the $650 R9 295X2 as an example was just to show that there are so many high-end gamers/users so loyal to a particular brand that AMD could produce a card that beats NV in 95% of all key metrics, price it at 1/5th the price, and they would still pick the thermal-throttling NV option. D:
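And "1/5th the price" is barely an exaggeration; a quick check using the Amazon prices quoted above:

```python
# Street prices from the post: R9 295X2 at $650 vs. Titan Z at $3000.
r9_295x2_usd, titan_z_usd = 650, 3000

ratio = r9_295x2_usd / titan_z_usd
print(f"R9 295X2 costs {ratio:.2f}x the Titan Z")  # -> 0.22x, just over 1/5th
```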
 
Member since Feb 19, 2009
Sorry, I thought he said Titan X. I've never even seen a Titan Z review - not sure where he found one.

Because NV knew it was DOA due to the R9 295X2 smackdown, they didn't seed review sites with a $3K product that was slower, with worse frame times, than a $1.5K product.

Still, it's not really relevant because dual-GPU cards are very, very niche.
 
Member since Feb 19, 2009
Since we are discussing hypothetical scenarios, as you said, you sound 100% convinced that a card slower than a Titan X at $499 would be a failure with the 980Ti at $799. Hmm...

R9 390 nonX $500 = 90%
Titan X $1000 = 100%
980Ti $800 = 110% (22% faster than the R9 390 for a 60% price increase!)
R9 390 nonX CF $1000 = 144%-150%

Would you pay $300 extra for a 22% increase in performance in a 980Ti? Would you get an $800 card over 2x R9 390s with the positioning I just outlined? I wouldn't.

You wouldn't, but as history shows, the majority of gamers would; hence the biggest marketshare gap ever.

There's no rationality behind why NV commands such a huge price premium and gamers still fork out for it besides this: it's the fastest.

Look at the 970 vs 980 scenario, before we knew of the 3.5GB crippled VRAM. We're talking nearly double the price for 15% more performance. There's no logic behind that except the same thing we keep seeing NV use for the premium: fastest. There's a large segment out there willing to pay ridiculous $ for the fastest GPU, no ifs or buts.

If the 390X ends up significantly faster than Titan X, I have little doubt that those same people will be willing to pay big $ for it. I honestly don't think it's a brand thing at the top; it's just pure performance lead.

What the market doesn't do is reward AMD for being slower and cheaper.
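To put numbers on the scenario quoted above, here is the perf-per-dollar math as a quick sketch. The prices and performance indices are the hypothetical ones from that post (indexed to Titan X = 100), not real benchmarks:

```python
# Hypothetical lineup from the quoted scenario (performance indexed to Titan X = 100).
cards = {
    "R9 390 nonX":    (500, 90),
    "Titan X":        (1000, 100),
    "980Ti":          (800, 110),
    "R9 390 nonX CF": (1000, 147),  # midpoint of the quoted 144-150% range
}

for name, (price_usd, perf) in cards.items():
    print(f"{name:15s} perf per $100: {perf / price_usd * 100:5.1f}")

# The 980Ti trade-off called out above: 110/90 = +22% performance
# for 800/500 = +60% price over the R9 390 nonX.
```

On raw perf per dollar the single and CF 390s win easily; the argument above is that the market pays the NV premium anyway.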
 

RussianSensation (Elite Member, joined Sep 5, 2003)
I bought a 4870, 2 in fact.

I thought they were a damn bargain, way too cheap given the performance and power-use advantage, as well as the extra VRAM.

In fact, I would have bought them even if they were $100 higher in price. That was the failure during the 4800 and 5800 series. The 5800 series in particular, because NV was a no-show for 6 months with the 480 and 9 months with the 460 (which was slower and more power hungry, relying on comparisons of factory OC models vs. the reference 5800 series). They didn't need to go so low on prices and price-war NV. The 4800 series was dominating, so they came from a position of strength.

AMD could have priced them much higher and still sold well due to the lack of competition. They did not capitalize.

I don't think management at AMD knew that Fermi was 6-9 months late. Do you honestly believe that if they had known the GTX460 wouldn't launch until June 2010, they would have released an HD5850 for $259? There is no way they knew this.

As far as the HD4000 series goes, I agree. However, putting AMD's strategy aside, I vividly remember cross-shopping the MSI HD4890 and the MSI GTX260 216 because they were priced $10-20 apart. I truly couldn't understand how a 4890 that traded blows with the 285 when overclocked was priced so close to the GTX260 216, and people were still choosing the 260 216/275 over the 4890. This was one of the few times I realized that a clear brand bias exists in the GPU industry (another was obviously when people bought the GeForce 5 over the 9700/9800 series).

I agree with you that because AMD priced the HD4000-5000-6000 series so low, it came as a shock to many when the 7970 launched at $549. However, NV has now basically given AMD a lot of headroom to price the R9 300 series by positioning the Titan X at $1K and the mid-range 980 at $550. Unfortunately, this likely means all of us will be paying more for GPUs moving forward.

If the 390X ends up significantly faster than Titan X, I have little doubt that those same people will be willing to pay big $ for it. I honestly don't think it's a brand thing at the top; it's just pure performance lead.

1. This is unlikely because the Titan X overclocks from a 1.075GHz base (in games this is probably closer to 1175-1185MHz for a reference card) to 1.45GHz. I think a Titan X OC will beat an R9 390X OC. You also have 12GB of VRAM, which is a selling point for users paying $1K for cards. They wouldn't "downgrade" to a 4GB-8GB HBM1 card.

2. I think there is strong loyalty at the top, much of it in favour of NV because NV has had the fastest single GPU since the 8800GTX days. All those gamers that went 8800GTX --> Titan X are not going to sell their NV cards and buy a hypothetical 10-15% faster R9 390X. They will instead wait for Pascal. That's why I think AMD still needs to focus on price/performance. Even if the R9 390X ties the Titan X, AMD should just price it at $599-649 and forget the Titan X. Go for the absolute price/performance kill with the R9 390 non-X at $499 by beating the 980 by 20-25%. Even if NV drops the 980 to $399, a $499 R9 390 nonX would still be worth buying, so this NV price cut wouldn't undermine the 390 nonX. Pricing the R9 390 nonX at $499 with 20-25% more performance than a 980 is going for the jugular in the $450-550 segment. I have less confidence that the R9 380/380X will be able to compete well against the 980, though.

3. You also say performance lead is what matters, but that doesn't explain how the 680 outsold the 7970/7970Ghz. The 7970Ghz won on day 1 and retained this lead until the Titan was released. Nearly every launch review shows the 7970Ghz beating the 680, and a 7970 OC beating a 680 OC in games, while having more VRAM and costing less than NV's 680 cards! Despite that, the 680 outsold the 7970 and 7970Ghz rather easily.

[Image: value-99th-2.gif (price vs. 99th-percentile performance chart)]


In hindsight, knowing how GK104 falls apart in modern games, things only got worse for the 680 2GB as time progressed. Today the 7970Ghz/280X is on the heels of a 780, leaving the 680 behind by 15-20% regularly.
 
Member since Feb 19, 2009
I don't think management at AMD knew that Fermi was 6-9 months late. Do you honestly believe that if they had known the GTX460 wouldn't launch until June 2010, they would have released an HD5850 for $259? There is no way they knew this.

Yes. Built at the same foundry, put together by many of the same AIBs. It's not difficult for professionals to get that info when even random internet leaks have been very accurate.

They blew their massive lead for a tiny 10% profit on revenue.

The market has shifted far beyond what you noted in the past: the bias has grown larger, and the NV premium tax has grown massively. A 390X that is slower will be hammered on pricing to the point where a huge die + HBM part won't be profitable. There's only one way to reverse that market perception trend: be faster.
 

RussianSensation (Elite Member, joined Sep 5, 2003)
The market has shifted far beyond what you noted in the past: the bias has grown larger, and the NV premium tax has grown massively. A 390X that is slower will be hammered on pricing to the point where a huge die + HBM part won't be profitable. There's only one way to reverse that market perception trend: be faster.

I don't agree. The R9 390X can be 50% faster than the Titan X, but if the rest of the R9 300 desktop and mobile cards are junk, it won't matter. It's NV's mobile stack and the GTX750/750Ti/960/970 that are doing the most damage to AMD, not the 980 and Titan X. A fast R9 390X doesn't address those lower $100-350 market segments at all. That's why I thought from the beginning that if AMD focuses primarily on the R9 390 series and neglects the rest of the line-up on desktop and mobile, they will struggle significantly for another 15-18 months. You realize that if the R9 370/370X/380/380X flop, the 960, 970, 980 and future refreshes and price drops on those NV cards will wreak havoc on AMD's market share? Honestly, how many people care about the $550+ GPU market the R9 390X will target? Not many.

brand bias exists everywhere.

It's usually far more balanced. BMW vs. Mercedes vs. Audi worldwide. Apple vs. Samsung phones worldwide. Buying GTX960 over a 50-60% faster AMD card is no longer simply brand bias - that's brand devotion.
 
Member since Feb 19, 2009
You also say performance lead is what matters, but that doesn't explain how the 680 outsold the 7970/7970Ghz. The 7970Ghz won on day 1 and retained this lead until the Titan was released. Nearly every launch review shows the 7970Ghz beating the 680, and a 7970 OC beating a 680 OC in games, while having more VRAM and costing less than NV's 680 cards! Despite that, the 680 outsold the 7970 and 7970Ghz rather easily.

1. Stock vs. stock, the 7970 lost. The 7970 GHz Ed won, but at a great cost in power efficiency, and the reference card that's used as the benchmark by most review sites? It's horrid: 1.25 vcore default, incredibly noisy, very power hungry. It did not matter that custom models had a much lower vcore and were cool, quiet and used less power. The damage was done, and it was done by AMD. At the launch of the GHz Ed and the 7950 Boost I said it was a mistake that would cost them dearly to go with 1.25 vcore + a crap reference blower and send that to reviewers.

2. OC vs. OC, the power use difference blew up. There's one thing about the 7900 series: left at stock with 1.1 vcore (my PCS+ 7950 had 1.07 vcore default!), they were very power efficient, as efficient as Kepler ever was, in fact. As soon as you pump vcore into them, boom, power use skyrockets. Kepler OC kept its efficiency; that was its key win.

People are very aware of that because efficiency is a key selling point for NV. It's also something you cannot ignore; when there's potentially a 100W difference, it matters a lot to many gamers, whether you feel that's rational or not.
 
Member since Feb 19, 2009
I don't agree. The R9 390X can be 50% faster than the Titan X, but if the rest of the R9 300 desktop and mobile cards are junk, it won't matter.

If the 390X is very fast, the rest of the lineup will improve quite a fair bit too. It's not all down to HBM; certainly the smaller cards with fewer shaders aren't bandwidth-restricted enough to *need* HBM.

The damage in the past few quarters is solely from the 970. Look at it on the Steam charts; it's up there at the top in share.
 

JSt0rm (Lifer, joined Sep 5, 2000)
It's usually far more balanced. BMW vs. Mercedes vs. Audi worldwide. Apple vs. Samsung phones worldwide. Buying GTX960 over a 50-60% faster AMD card is no longer simply brand bias - that's brand devotion.


But you are assuming most people do research into their purchases. I have a good friend who only buys whatever $200 nvidia card is available when he wants a new card. I told him to his face not to do that and he still said he always buys nvidia. He trusts he will get a good product at $200. And in general he does, but he also doesn't tweak much.
 

RussianSensation (Elite Member, joined Sep 5, 2003)
People are very aware of that because efficiency is a key selling point for NV. It's also something you cannot ignore; when there's potentially a 100W difference, it matters a lot to many gamers, whether you feel that's rational or not.

Most PC gamers aren't savvy enough to read reviews of after-market cards and only focus on reference card reviews at launch. This is one of the problems that also hurt the R9 290 series. It only highlights the ignorance that continues to prevail in the GPU industry. I think AMD should send only after-market, open-air-cooled cards and the R9 390 WCE for all launch reviews to address this BS. The extra power usage at the wall on the 7970 you talk about also doesn't account for the fact that you got 3GB of VRAM with it.

[Image: zpw-xbt.png (power consumption chart)]


680 2GB users can talk all day about perf/watt over the 7970, but today modern games cripple the 680 on both ends: raw GPU performance, and its major 2GB VRAM bottleneck.

In modern games today, a 1GHz HD7990 (ARES II) slaps the GTX690 sooooo hard that GK104s are in TKO mode. Of course former GTX680 2GB SLI owners are quiet and don't talk about this because they've moved on to 780Ti/Maxwell. hehe.

[Image: gamegpu.ru benchmark chart, Grand Theft Auto V at 2560 with MSAA]


So really, it's obvious once again that GK104 won because of marketing, and because reviewers covered up major flaws that would show up down the line. You and I pointed out those flaws continuously and we were right... yet major reviewers who get paid $ to write reviews ignored them, just like they sweep the 960's 2GB VRAM mess under the rug. I find this practice of not pointing out flaws that will become major problems in the near future a disservice to the readers, many of whom haven't been building PCs for 10-20 years and don't have the foresight to see that what they are buying will be obsolete quickly or require serious compromises in IQ.

Again, the marketing of perf/watt won out because professional reviewers didn't emphasize how 2GB of VRAM on such a high-end card would become a problem in 2.5 years and beyond. Given their job as GPU editors, this is unforgivable. When things are stacked that way, we ended up with a $399 after-market HD7970 1GHz card such as the Gigabyte Windforce 3X going against a $550-580 GTX680 4GB card. How come the reviewers never discussed things from that perspective when hyping up GK104's perf/watt? :cool:

Today it's impossible to put the HD7970/7970Ghz and the GTX680 2GB on a level playing field because the latter falls apart in way too many modern games. So while GK104 outsold the 7970 by miles, the market made the wrong decision and picked the wrong card as the winner. The 2GB VRAM bottleneck shows up so often now that we can clearly see the R9 280X nearly doubling the GTX670 in performance. Ouch.

[Image: gamegpu.ru benchmark chart, Mortal Kombat X at 2560]


Knowing this information, what does the average Joe do? Buys 3.5GB GTX970 and 2GB GTX960 over the $250 R9 290 4GB.

But you are assuming most people do research into their purchases. I have a good friend who only buys whatever $200 nvidia card is available when he wants a new card. I told him to his face not to do that and he still said he always buys nvidia. He trusts he will get a good product at $200. And in general he does, but he also doesn't tweak much.

I know, see above. If AMD priced an R9 390X 8GB at $300 with 3X the performance of a $200 GTX960 2GB, your friend would still buy NV, and millions of other PC gamers are like that too. I think that's the point Silver is making -- that AMD can't win on price/performance since most people will buy NV. Then you might as well make a faster card and price it at $699 to make $ off the unbiased portion of the market. I think that's a sound strategy too, but the number of people willing to buy $500-700 cards is too small for AMD to survive off the profits of those cards alone. That means, whether AMD likes it or not, unless they start putting proprietary, locked gaming features into games to entice people to buy their cards, they will have to compete on price/performance in the sub-$500 segments, because the loyal NV customer base will buy slower or more expensive cards just to have NV. As a result, AMD has no choice but to offer superior price/performance or change Gaming Evolved to 100% resemble GameWorks.
 

JDG1980 (Golden Member, joined Jul 18, 2013)
Ya, but the market gets FAR more irrational than that. Look at Titan Z reviews vs. R9 295X2 reviews online. The R9 295X2 was faster + cooler + quieter + half as expensive. Today the R9 295X2 is $650 on Amazon but the Titan Z is $3000, yet look at the user reviews:

Titan Z: 5 stars = 75%, 4 stars = 8%, 1-2 stars = 14%
XFX R9 295X2: 5 stars = 67%, 4 stars = 12%, 1-2 stars = 15%

If you read a total of 0 reviews, you'd naturally assume that the Titan Z at $3000 was actually a better product for gaming than the $650 R9 295X2.

Yes, if you buy a $3000 product by looking at the overall "star" ranking on Amazon and not reading any reviews, you deserve what you get.

If you actually read the reviews at that Titan Z Amazon link, you'll notice that most of them are jokes. In contrast, the R9 295 X2 reviews are genuine reviews.

I don't recall anyone on this board - even the diehard anti-AMD partisans - recommending the Titan Z for gaming. None of the major review sites even covered it, and I can't imagine that Nvidia sold too many of them. It might have made some sense to buy a Titan Z when they dropped the price to $1,499.99, if you specifically needed double-precision computing performance. But outside of that narrow use case, I don't think it was ever recommended by anyone. That's why I thought you were comparing the R9 295 X2 to the Titan X - the Titan Z was a total nonentity, barely a blip on the radar, and I'm surprised anyone would even bring it up now.
 

crisium (Platinum Member, joined Aug 19, 2001)
Another excellent analytical post by RS. Kepler doesn't get enough flak for outright sucking nowadays, especially 2GB cards.
 
Member since Feb 19, 2009
I know, see above. If AMD priced an R9 390X 8GB at $300 with 3X the performance of a $200 GTX960 2GB, your friend would still buy NV, and millions of other PC gamers are like that too. I think that's the point Silver is making -- that AMD can't win on price/performance since most people will buy NV. Then you might as well make a faster card and price it at $699 to make $ off the unbiased portion of the market.

I think that's a sound strategy too, but the number of people willing to buy $500-700 cards is too small for AMD to survive off the profits of those cards alone. That means, whether AMD likes it or not, unless they start putting proprietary, locked gaming features into games to entice people to buy their cards, they will have to compete on price/performance in the sub-$500 segments, because the loyal NV customer base will buy slower or more expensive cards just to have NV. As a result, AMD has no choice but to offer superior price/performance or change Gaming Evolved to 100% resemble GameWorks.

That's been my point. AMD isn't going to sway buyers and own marketshare on the level of NV by chasing "value". They can only do so if they own the top end with the fastest GPU. It will trickle down to the sheep below, who have blindly accepted "NV has the fastest GPU, therefore NV is better". If AMD is consistently faster for a few generations in a row, I guarantee you the brand recognition will change massively in a good way. Suddenly Radeons will be perceived as the best (whether or not it's true for the entire stack!) by the masses.

NV has held the performance crown for a long time and they fully deserve the premium tax; it's well earned! The 7970 Ghz vs. 680 was too close a call, a tiny win at a major cost in efficiency, so it was an overall loss as perceived by the market. We know now that GCN is a much more future-proof uArch than Kepler, but at the time, efficiency and the notion of NV = better reigned supreme.

The mass mentality is very simple. It's why cars, fashion and everything else can be marketed as "my top product is better than yours, therefore all my products are better than yours"; people buy that because they don't want to do their homework or research.

You know, when Fermi was late and used twice the power of the 5870 for 15% faster performance, we in the know called it for what it was: a failure. But you know how NV marketed it? "Fastest GPU in the world". Power consumption be damned, price be damned, lateness be damned. It's the fastest! Look at how quickly the marketshare AMD built with the 4800, and then a 6-month lead with the 5800, eroded once Fermi landed. The 460 was inferior by all metrics to the 5850, yet it was lauded as an awesome product and sold very well. Purely because the 480 was fastest; it helps with marketing everything else.

ATI had the same effect during their dominance at the top: their low-end stuff wasn't that good, but it sold well (I still recall RL friends who bought low-end ATI stuff cos "9700/9800 pwns"). The "Halo Effect" is very strong in marketing.
 

beginner99 (Diamond Member, joined Jun 2, 2009)
Most PC gamers aren't savvy enough to read reviews of after-market cards and only focus on reference card reviews at launch.

Most PC gamers have no clue about hardware at all. One of my co-workers is of that kind, and he insists on buying NV even though I told him it makes 0 sense on price/performance.
 

RussianSensation (Elite Member, joined Sep 5, 2003)

Forgot about those! They even had a 3850X2 with such primitive-looking heatsinks. Who was the genius who decided one of the 3850s should get a smaller heatsink? :ninja:

[Images: card_front.jpg, card_naked.jpg, card_heatsink.jpg (HD 3850 X2 card, bare PCB and heatsinks)]


http://www.pcper.com/reviews/Graphi...3850-X2-1GB-Review-Mainstream-Dual-GPU-Option

vs. the HD7990

[Image: 7990HSF.jpg (HD 7990 heatsink)]


:biggrin:

I am most amazed that what's holding back the R9 295X2's overclocking is not the single 120mm rad but that small VRM heatsink in the middle. If AMD repositions the power phases to the top of the PCB and makes the VRM heatsink 2-3X bigger, the R9 395X2 should be a better overclocker.

My favourite part is how a single 120mm rad kept dual 290Xs operating at 1018MHz at 70C and below. Maybe AMD will go all out with the R9 390X WCE edition and raise its clocks another 10-15% out of the factory? Holding back the WCE might be one way to tactically combat a 1200-1250MHz GM200 6GB card. :p

[Image: AMDRad_R9_WaterCooler_Product_Shot_Inner_Angles_10in300dpi_4c.jpg (AMD R9 water cooler product shot)]
 

RussianSensation (Elite Member, joined Sep 5, 2003)
These 14nm GPU rumours are making it extremely difficult to be 'excited' for R9 390 series/980Ti.

AMD to Skip 20 nm, Jump Straight to 14 nm with "Arctic Islands" GPU Family

"AMD's next-generation GPU family, which it plans to launch some time in 2016, codenamed "Arctic Islands," will see the company skip the 20 nanometer silicon fab process from 28 nm, and jump straight to 14 nm FinFET. Whether the company will stick with TSMC, which is seeing crippling hurdles to implement its 20 nm node for GPU vendors; or hire a new fab, remains to be seen. Intel and Samsung are currently the only fabs with 14 nm nodes that have attained production capacity. Intel is manufacturing its Core "Broadwell" CPUs, while Samsung is manufacturing its Exynos 7 (refresh) SoCs. Intel's joint-venture with Micron Technology, IMFlash, is manufacturing NAND flash chips on 14 nm.

Named after islands in the Arctic circle, and a possible hint at the low TDP of the chips, benefiting from 14 nm, "Arctic Islands" will be led by "Greenland," a large GPU that will implement the company's most advanced stream processor design, and implement HBM2 memory, which offers 57% higher memory bandwidth at just 48% the power consumption of GDDR5. Korean memory manufacturer SK Hynix is ready with its HBM2 chip designs."
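Taking the quoted HBM2 figures at face value (57% more bandwidth at 48% of GDDR5's power), the implied gain in bandwidth per watt is enormous. A quick sketch, normalizing GDDR5 to 1.0 (the normalization is mine; the two ratios are from the quote):

```python
# HBM2 vs. GDDR5, using only the relative figures quoted above.
gddr5_bw, gddr5_power = 1.0, 1.0     # GDDR5 baseline, normalized
hbm2_bw = gddr5_bw * 1.57            # "57% higher memory bandwidth"
hbm2_power = gddr5_power * 0.48      # "48% the power consumption"

gain = (hbm2_bw / hbm2_power) / (gddr5_bw / gddr5_power)
print(f"Bandwidth per watt vs. GDDR5: {gain:.2f}x")  # -> ~3.27x
```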

Looks like R9 390 series will be a very short-lived generation from AMD.
 

escrow4 (Diamond Member, joined Feb 4, 2013)
Assuming they fix their drivers. GTA V has Day 1 Nvidia WHQL drivers that actually work, not betas that you need just to bump your FPS up, otherwise you're gimped. And Omega drivers that conveniently appeared after you'd already played those games. And this year there's Witcher III and what else for a shiny new GPU?

 

AtenRa (Lifer, joined Feb 2, 2009)
These 14nm GPU rumours are making it extremely difficult to be 'excited' for R9 390 series/980Ti. [...] Looks like R9 390 series will be a very short-lived generation from AMD.

I don't see those Arctic Islands chips coming before Q3 2016. So the 390 will have a full year on the market, just like the old days.
 
Member since Feb 19, 2009
These 14nm GPU rumours are making it extremely difficult to be 'excited' for R9 390 series/980Ti. [...] Looks like R9 390 series will be a very short-lived generation from AMD.

I saw that coming early last year. Despite people claiming, based on AMD's conference calls, that they were going ahead with some products on 20nm, I knew it wasn't going to be for dGPUs. 20nm is SoC-focused, unsuitable.

Both AMD and NV are skipping 20nm, hence all the subsequent launches on 28nm. It's going to carry us until 14nm FF is ready. I don't think that will be ready for high-performance dGPUs in 2016, for one reason: early signs suggest yields are bad, and dGPUs tend to be very large dies, which would make the problem worse. TSMC is struggling and delayed too, so whenever they are ready, don't expect their yields to be good either.

We'll probably see big high-performance 14nm GPUs in 2017. Any 2016 dGPU part would be a very small low-end or notebook chip, priced high due to competition from every mobile SoC maker out there desperate for a new node. Because TSMC is out of the picture for the meantime, we've heard news of NV signing on with Samsung for 14nm production. Between Samsung & GF making chips for Apple, QC and MediaTek, desktop dGPUs are most likely last in line. There is one silver lining for AMD in all of this: they can bypass the queue due to their tight-knit relationship with GF (already confirmed in their conference call - $1B in 14nm wafers ordered from GF!). So in theory, they have a good chance at being first to 14nm for dGPUs.

Despite lacking a new node, dGPUs are continuing to advance at a good rate. If NV releases a full GM200 with 6GB VRAM, it's got the potential to be faster than the Titan X. It's up to AMD to compete hard, and hopefully the 390X lives up to the hype. If it does, we'll be fine for performance until 2017.
 

Cloudfire777 (Golden Member, joined Mar 24, 2013)
Bahaha, wccftech and their speculations...

14nm FinFET is LPP or LPM, i.e. Low Power, aimed at SoCs. Nvidia will make their Tegra there, but the process is not capable of manufacturing big GPUs, which require a different process. You will find Qualcomm, Altera, Apple etc. using 14nm, but AMD will NOT manufacture discrete GPUs there either.
The available process for that will be TSMC's 16nm FinFET in 2016.
 
Member since Feb 19, 2009
Considering Samsung's 14nm FF has no issue running at insane clock speeds, I think it will be just fine scaling up to drive lower-clocked dGPUs. The only problem is yield for larger dies. Hence, expensive. But if the performance is there to back it up, an expensive dGPU is a non-issue, considering the Titan X at $999 is selling beyond expectations!

ps. I suspect the days of a dedicated "HP" node solely for dGPUs are over. There's not enough $ involved to develop one, considering the market value of mobile, SoC, wireless and server ICs vastly dwarfs desktop dGPUs. This is why 20nm planar wasn't suitable and all lower nodes are focused on maximum efficiency and low power. TSMC is no different; they need to cater to the massive demands of the dominant market. Thus, PC dGPU designs will need to adapt.

Edit: http://semimd.com/blog/2014/04/17/globalfoundries-and-samsung-join-forces-on-14nm-finfets/
There are two separate 14nm FF processes from Samsung/GF, LPE and LPP; the latter is enhanced for performance, but it's LPE that's available already.
 