[BitsAndChips] 390X ready for launch - AMD ironing out drivers - Computex launch

Page 29 - AnandTech Forums

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
GloFo 14nm FF LPP is suitable for everything from low-power SoCs up to high-performance CPUs and GPUs. Unlike the old planar processes, with FinFETs there will be only a single process for every design/product, from low power up to very high frequency, high performance.
 

dacostafilipe

Senior member
Oct 10, 2013
808
314
136
Technical aspect aside, I'm baffled by how AMD handles the hype surrounding the 390X, as in not at all.

If the 970/980 continue to sell like they do now, who's going to buy the 390/390X? And I'm not even starting on the 980 Ti ...

AMD should start some kind of teasing ASAP!
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,476
136
Technical aspect aside, I'm baffled by how AMD handles the hype surrounding the 390X, as in not at all.

If the 970/980 continue to sell like they do now, who's going to buy the 390/390X? And I'm not even starting on the 980 Ti ...

AMD should start some kind of teasing ASAP!

This is an illogical statement. Every year there are consumers who upgrade their old GPUs. btw, most consumers are not impulse buyers and look for the right time to purchase a GPU, which means they wait for competing products from both GPU vendors so that they can get the best value for their money. Even buyers who want only an Nvidia GPU wait for AMD to launch new products so that Nvidia GPUs fall in price. Not everyone wants to pay USD 1000 for a high-end GPU. :biggrin:
 

Cloudfire777

Golden Member
Mar 24, 2013
1,787
95
91
Considering Samsung's 14nm FF has no issue running at insane clock speeds, I think it will scale up just fine to drive lower-clocked dGPUs. The only problem is yield for larger dies. Hence, expensive. But if the performance is there to back it up, an expensive dGPU is a non-issue, considering the Titan X at $999 is selling beyond expectations!
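The "yield for larger dies" point can be sketched with the classic Poisson yield model; the defect density below is a purely illustrative assumption, not a figure for any real process:

```python
import math

def poisson_yield(die_area_mm2, defects_per_cm2):
    """Fraction of defect-free dies under the simple Poisson yield model."""
    defects_per_mm2 = defects_per_cm2 / 100.0
    return math.exp(-die_area_mm2 * defects_per_mm2)

# Illustrative assumption: 0.2 defects/cm^2 on a maturing process.
small_soc = poisson_yield(100, 0.2)   # ~100 mm^2 mobile SoC  -> ~82% good dies
big_gpu = poisson_yield(600, 0.2)     # ~600 mm^2 high-end GPU -> ~30% good dies

print(f"100 mm^2: {small_soc:.1%}, 600 mm^2: {big_gpu:.1%}")
```

Because yield falls off exponentially with die area, a big GPU die costs disproportionately more per good chip, which is the "hence, expensive" part.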

ps. I suspect the days of a dedicated "HP" node solely for dGPUs are over. There's not enough $ involved to develop one, considering the market value of mobile, SoCs, wireless and server ICs vastly dwarfs desktop dGPUs. This is why 20nm planar wasn't suitable, and all lower nodes are focused on maximum efficiency and low power. TSMC is no different. They need to cater to the massive demands of the dominant market. Thus, PC dGPU designs will need to adapt.

Edit: http://semimd.com/blog/2014/04/17/globalfoundries-and-samsung-join-forces-on-14nm-finfets/
There are two separate 14nm FF processes from Samsung/GF, LPE and LPP; the latter is enhanced for performance, but it's LPE that's available already.

Low or high clocks, they are still low power, meaning around 5W tops.
There is zero possibility of manufacturing any desktop or mobile discrete GPU on 14nm FinFET from Samsung/GloFo.

GloFo 14nm FF LPP is suitable for everything from low-power SoCs up to high-performance CPUs and GPUs. Unlike the old planar processes, with FinFETs there will be only a single process for every design/product, from low power up to very high frequency, high performance.
Yeah right. Show your source for this, please.
You think they can use the same process for a 200W GPU with high voltage and a 3.5W SoC with lower voltage?
 
Last edited:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Technical aspect aside, I'm baffled by how AMD handles the hype surrounding the 390X, as in not at all.

If the 970/980 continue to sell like they do now, who's going to buy the 390/390X? And I'm not even starting on the 980 Ti ...

AMD should start some kind of teasing ASAP!

People who paid $1K for Titan-X and can sell them for more than the price of the 390X and pocket the difference. :p
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,476
136
Low or high clocks, they are still low power, meaning around 5W tops. There is zero possibility of manufacturing any desktop or mobile discrete GPU on 14nm FinFET from Samsung/GloFo.

Yeah right. Show your source for this, please.
You think they can use the same process for a 200W GPU with high voltage and a 3.5W SoC with lower voltage?

Before asking others for proof, you should do the same. Anyway, since you asked:

http://www.globalfoundries.com/technology-solutions/leading-edge-technology/14-lpe-lpp

14nm FinFET Technology
14LPE – Early time-to-market version with area and power benefits for mobility applications
14LPP – Enhanced version with higher performance and lower power; a full platform offering with MPW, IP enablement and wide application coverage

Immediate availability
PDK and DM available now for design starts
Silicon maturity on track at Fab 8, New York
MPWs starting 2014

Wide range of applications
Mobile and wireless – lower watts per GHz
Computer, network and storage – more performance per watt

It's clear that 14LPP is perfectly suitable for high-performance CPUs and GPUs. So stop spreading FUD. :D
 
Feb 19, 2009
10,457
10
76
Not to mention GF is closely knit with AMD because of their investors; they wouldn't invest billions to develop a node that is useless for making high-power CPUs/APUs and GPUs. It's not good for business.

Hence, there are two 14nm FF nodes, LPE & LPP.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Yeah right. Show your source for this, please.
You think they can use the same process for a 200W GPU with high voltage and a 3.5W SoC with lower voltage?

Click on the "14nm FinFET Technology Overview" video in the link below

http://www.globalfoundries.com/technology-solutions/leading-edge-technology/14-lpe-lpp


14nm FinFET Technology
14LPE – Early time-to-market version with area and power benefits for mobility applications
14LPP – Enhanced version with higher performance and lower power; a full platform offering with MPW, IP enablement and wide application coverage

And the Samsung/GloFo 14nm FF pdf

Even 14nm FF LPE is 20% faster with 30% lower power than 20nm. That means both AMD and NVIDIA could make a faster and more efficient GPU with 14nm LPE than with 28nm planar.
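Taking those two vendor numbers at face value, the compound perf/W gain works out like this (a back-of-envelope sketch, nothing more):

```python
# GloFo's claimed 14nm LPE gains over 20nm planar, taken at face value.
speed_gain = 1.20    # "20% faster"
power_factor = 0.70  # "30% lower power"

perf_per_watt_gain = speed_gain / power_factor  # ~1.71x
print(f"Perf/W vs 20nm: {perf_per_watt_gain:.2f}x")
```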
 

dacostafilipe

Senior member
Oct 10, 2013
808
314
136
... most consumers are not impulse buyers and look at the right time to purchase a GPU which means that they wait for competing products from both GPU vendors so that they can get the best value for their money ...

You are right, it's what people should do ... but if that were the case, the market share would certainly not look like it does today.
 

B-Riz

Golden Member
Feb 15, 2011
1,595
765
136
...As for AMD, I have mentioned before that I think the real reason they're testing out HBM is to get experience for their upcoming generation of Zen 16nm FinFET APUs. The biggest problem with their existing APU lineup has been the lack of memory bandwidth; discrete cards can use GDDR5, but motherboard manufacturers resist using this (they refused to do so with Kaveri and that feature had to be scrapped), so putting it on-die with HBM is the only way for APUs to be remotely competitive.

You hit the nail on the head.

I think 4K consoles are coming sooner than expected, with full backwards compatibility with today's consoles.

And who is going to power them? The company already in the current gen with bad@$$ GCN tech, or something random from Intel or nVidia?
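The bandwidth gap described in the quoted paragraph can be put in rough numbers; the configurations below are illustrative examples using published peak figures (dual-channel DDR3-2133, a 256-bit GDDR5 card at 5 GT/s, one first-generation HBM stack at 1 GT/s):

```python
def bandwidth_gbs(bus_width_bits, transfer_rate_gtps):
    """Peak memory bandwidth in GB/s: bus width (bits) x transfer rate (GT/s) / 8."""
    return bus_width_bits * transfer_rate_gtps / 8

ddr3_dual = bandwidth_gbs(128, 2.133)   # dual-channel DDR3-2133 -> ~34 GB/s
gddr5_256 = bandwidth_gbs(256, 5.0)     # 256-bit GDDR5 @ 5 GT/s -> 160 GB/s
hbm1_stack = bandwidth_gbs(1024, 1.0)   # one HBM1 stack @ 1 GT/s -> 128 GB/s

print(f"DDR3: {ddr3_dual:.0f}, GDDR5: {gddr5_256:.0f}, HBM1: {hbm1_stack:.0f} GB/s")
```

A single HBM stack roughly quadruples what an APU gets from dual-channel DDR3, which is the quoted post's point about on-package memory.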
 

S.H.O.D.A.N.

Senior member
Mar 22, 2014
205
0
41
I think 4K consoles are coming sooner than expected, with full backwards compatibility with today's consoles.

No, they aren't. Neither MS nor Sony is a hardware company. The machines serve a purpose in a larger, interconnected service strategy, and just because better tech came out doesn't mean anyone is going to spend the money on R&Ding a new box if the old one is perfectly serviceable. We might see the next gen come earlier than the current one did, but that's only because the previous gen was stretched so thin.

And let us not start on the whole backward compatibility thing. It's not how those companies do business.
 

Cloudfire777

Golden Member
Mar 24, 2013
1,787
95
91
Before asking others for proof you should do the same. Anyway since you asked

http://www.globalfoundries.com/technology-solutions/leading-edge-technology/14-lpe-lpp

14nm FinFET Technology
14LPE – Early time-to-market version with area and power benefits for mobility applications
14LPP – Enhanced version with higher performance and lower power; a full platform offering with MPW, IP enablement and wide application coverage

Immediate availability
PDK and DM available now for design starts
Silicon maturity on track at Fab 8, New York
MPWs starting 2014

Wide range of applications
Mobile and wireless – lower watts per GHz
Computer, network and storage – more performance per watt

Its clear that 14LPP is perfectly suitable for high performance CPUs and GPUs. So stop spreading FUD. :D

You are grasping at straws that don't even exist. Why is that a source when it doesn't even mean what you think it means?

Why do you think it's called Low Power?
I can give you examples of the applications they mean in the quote you are posting, ones that involve computer and mobile but are low power, since you can't:
Modems, WiFi cards, 3G cards, DDR4, flash memory, SoCs including Apple's Ax processors, ARM's Cortex, Samsung's Exynos, etc.

Those are LOW POWER devices that can be manufactured on Samsung/GloFo's 14nm process.

Not even AMD's upcoming APUs (Zen) can be produced on 14nm FinFETs with the current processes (LPE and LPP). They will either be manufactured on 20nm SOI from GloFo (if GloFo even manufactures it) or on 16nm FinFETs from TSMC. Intel is the only foundry that can do 14nm at high power, and there is zero chance that a fabless company like AMD can leapfrog and suddenly end many years of Intel being generations ahead in manufacturing. Or produce GPUs on 14nm FinFETs.

Stop reading wccftech's silly articles and get back to reality.
Wccftech mentions in the article that Nvidia will be producing GPUs at Samsung on 14nm FinFETs and uses that to argue that Pascal and Greenland will be 14nm too, but completely fails to see that it only involves Tegra from Nvidia.
 
Last edited:

raghu78

Diamond Member
Aug 23, 2012
4,093
1,476
136
You are grasping at straws that don't even exist. Why is that a source when it doesn't even mean what you think it means?

Why do you think it's called Low Power?
I can give you examples of the applications they mean in the quote you are posting, ones that involve computer and mobile but are low power, since you can't:
Modems, WiFi cards, 3G cards, DDR4, flash memory, SoCs including Apple's Ax processors, ARM's Cortex, Samsung's Exynos, etc.

Those are LOW POWER devices that can be manufactured on Samsung/GloFo's 14nm process.

Not even AMD's upcoming APUs (Zen) can be produced on 14nm FinFETs with the current processes (LPE and LPP). They will either be manufactured on 20nm SOI from GloFo (if GloFo even manufactures it) or on 16nm FinFETs from TSMC. Intel is the only foundry that can do 14nm at high power, and there is zero chance that a fabless company like AMD can leapfrog and suddenly end many years of Intel being generations ahead in manufacturing.

Stop reading wccftech's silly articles and get back to reality

You have not provided a shred of proof to back your rubbish statements, so who is grasping at straws here? btw, in the wide range of applications it mentions computer, network and storage, so that means it's not just mobile SoCs. If you don't have any proof, then just stop the FUD. :biggrin:
 

cbrunny

Diamond Member
Oct 12, 2007
6,791
406
126
So.... it is expected that the performance hierarchy will go as follows then:

1. Titan X
2. 980ti
3. 390X
4. 980 Metal
5. 980

Is this correct? I could understand that 2 and 3 might be interchangeable, or perhaps 3 and 4.
 

Cloudfire777

Golden Member
Mar 24, 2013
1,787
95
91
You have not provided a shred of proof to back your rubbish statements, so who is grasping at straws here? btw, in the wide range of applications it mentions computer, network and storage, so that means it's not just mobile SoCs. If you don't have any proof, then just stop the FUD. :biggrin:

I have lots of proof; it's called knowledge. Try googling it and educate yourself. I'm too lazy to look it up for you.

There are specifications posted about what is low power and what is not, too; you might want to look at those while you are at it.

So.... it is expected that the performance hierarchy will go as follows then:
1. Titan X
2. 980ti
3. 390X
4. 980 Metal
5. 980

Is this correct? I could understand that 2 and 3 might be interchangeable, or perhaps 3 and 4.
Number 4 has been pulled out of thin air. The article from Baidu spoke about the GTX 980 Ti only, not the GTX 980.
It said there will be reference models first, metal-enhanced versions later in September, plus a version with better cooling and faster clocks as well (the leaker couldn't say for sure that this version will come, but he may have been referring to water cooling, perhaps).

Also, the GTX 980 Ti is supposed to be faster than the GTX Titan X. That is what the recent rumor said, but it remains to be seen.
 
Last edited:

Pinstripe

Member
Jun 17, 2014
197
12
81
No, they aren't. Neither MS nor Sony is a hardware company. The machines serve a purpose in a larger, interconnected service strategy, and just because better tech came out doesn't mean anyone is going to spend the money on R&Ding a new box if the old one is perfectly serviceable. We might see the next gen come earlier than the current one did, but that's only because the previous gen was stretched so thin.

And let us not start on the whole backward compatibility thing. It's not how those companies do business.

Actually, Sony is very much a hardware company. And a PlayStation 4 "Ultra" makes a lot of sense. A 14nm APU with scaled-up GCN tech doesn't require that much R&D cost, and you can still use the same PS4 OS and API.

Developers could opt to develop hybrid games for the PS4/PS4 Ultra (the PS4 running at 30fps, the Ultra at 60fps with extra-high settings), or make PS4 Ultra exclusives altogether.

Of course, that's just a theory. But seeing how current PC hardware is already destroying "NextGen" console graphics/framerate, it's kinda hard for Sony to keep their userbase excited.
 

therealnickdanger

Senior member
Oct 26, 2005
987
2
0
I think 4K consoles are coming sooner than expected, with full backwards compatibility with today's consoles.

They technically can already do 4K gaming right now, albeit very simple games at 30Hz over HDMI 1.4 connections. If anything, we'll just see the standard hardware revision with new ports (HDMI 2.0a for 4K 60Hz), but don't expect to actually play any bleeding edge games at 4K. I doubt we'll see new, truly 4K-capable consoles from Sony/MS until 2020, after 4K has gone as mainstream as 1080p is now.
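For reference, the uncompressed pixel rates behind those limits (active pixels only; blanking intervals push the real link requirement higher):

```python
def video_gbps(width, height, hz, bits_per_pixel=24):
    """Uncompressed active-pixel data rate in Gbit/s (blanking excluded)."""
    return width * height * hz * bits_per_pixel / 1e9

uhd30 = video_gbps(3840, 2160, 30)  # ~6.0 Gbps
uhd60 = video_gbps(3840, 2160, 60)  # ~11.9 Gbps

# HDMI 1.4 carries roughly 8.16 Gbps of video data, HDMI 2.0 about 14.4 Gbps,
# so 8-bit 4K fits at 30Hz on 1.4 but needs 2.0 (or 4:2:0 subsampling) at 60Hz.
print(f"4K30: {uhd30:.2f} Gbps, 4K60: {uhd60:.2f} Gbps")
```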
 

cbrunny

Diamond Member
Oct 12, 2007
6,791
406
126
Actually, Sony is very much a hardware company. And a PlayStation 4 "Ultra" makes a lot of sense. A 14nm APU with scaled-up GCN tech doesn't require that much R&D cost, and you can still use the same PS4 OS and API.

Developers could opt to develop hybrid games for the PS4/PS4 Ultra (the PS4 running at 30fps, the Ultra at 60fps with extra-high settings), or make PS4 Ultra exclusives altogether.

Of course, that's just a theory. But seeing how current PC hardware is already destroying "NextGen" console graphics/framerate, it's kinda hard for Sony to keep their userbase excited.

If that happened, I'd be very much in favour of it. But I don't see it happening.

The retail price of the PS4U would have to be quite high - so high in fact that most people wouldn't buy it when they have the option of playing the same games at a lesser price.

Console gaming enthusiasts play lots of games and love it. PC gaming enthusiasts also include people who play lots of games and love it, plus entire groups that are just hardware enthusiasts, and combinations of the two. The PS4U would have to be targeted at the segment of the PC gaming group (aka people who want better-than-console graphics) that has cash, is OK with consoles, really likes to game, and doesn't want to do it on a PC for some reason even when given the option. As a proportion of the market, it would be quite small, making both development and production costs high. I can't see this being viable.

I just don't see PC enthusiasts shifting away from PC when the cost of consoles increases.
 

cbrunny

Diamond Member
Oct 12, 2007
6,791
406
126
Number 4 has been pulled out of thin air. The article from Baidu spoke about the GTX 980 Ti only, not the GTX 980.
It said there will be reference models first, metal-enhanced versions later in September, plus a version with better cooling and faster clocks as well (the leaker couldn't say for sure that this version will come, but he may have been referring to water cooling, perhaps).

Also, the GTX 980 Ti is supposed to be faster than the GTX Titan X. That is what the recent rumor said, but it remains to be seen.

Thanks, much appreciated. There is so much info here that it's hard to tell what is right and wrong, especially when some of it has been redacted and some not. If the 980 Ti is priced properly, and AMD continues to do nothing, I don't see how I can not buy it. lol, I've never actually said that before about a GPU!
 

B-Riz

Golden Member
Feb 15, 2011
1,595
765
136
Granted, I am wildly speculating, and 4K consoles will only be sensible when 4K TVs are cheap and plentiful.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Assuming they fix their drivers. GTA V has Day 1 Nvidia WHQL drivers that actually work, not betas that you need just to bump your FPS up, otherwise you are gimped.

As a gamer, why do I care if the drivers are WHQL or betas as long as they work? Are you suggesting AMD's GTA V drivers are buggy, when the R9 290X delivers better smoothness "inside the second" than a 980?

980 SLI bombs against R9 295X2 at 4K.

[Benchmark chart: R9 295X2 vs. GTX 980 SLI at 4K]


I am not even sure what point you were trying to make here other than to say WHQL drivers can be worse than Beta drivers? :confused:

And Omega drivers that conveniently appeared after you played those games.

People haven't used Omega drivers for Radeons in half a decade, if not more.

And this year there is Witcher III and what else for a shiny new GPU??

#1. You assume everyone is like you, buying $60 games on release date. You realize a lot of PC gamers do not do that? I can guarantee there are PC gamers who haven't beaten Far Cry 4, Dragon Age: Inquisition, Dying Light, Shadow of Mordor or AC Unity, either because they buy games at <$20, because they have a large backlog of other games, or because they are simply enjoying some other game(s) today. Didn't you say before that you regularly play 6-8 hours of PC games a day, every day? Now imagine a PC gamer who only has time to game 5-10 hours a week.

#2.

- Tom Clancy's Rainbow Six Siege
- Star Wars Battlefront
- Batman Arkham Knight
- Project CARS
- Metal Gear Solid V: The Phantom Pain
- The Witness
- Dead Island 2
- Warhammer 40000: Eternal Crusade
- possibly Just Cause 3
- AC Victory

#3. You have ignored three other major points. The first is that for many people who are entering their summer season, summer is the time for outdoor activities, house renovations, sports, travel, and family, as kids are on vacation. The average PC gamer today is pushing 35 years old. For example, I hardly play videogames in the summer. That means there is no rush to buy the new shiny GPU on the date it releases, regardless of what games are out. You ignore this large group of PC gamers who feel the same.

The second point is that we are still a year or more away from 14nm/16nm GPUs. That gives both AMD and NV plenty of time to sell the GM200 6GB and the various R9 300 series cards. The market is dynamic, and it doesn't just stop purchasing GPUs once the Titan X or 980 comes out.

The third point is back to school season, Skylake, Windows 10 and holiday Q4 2015. These 4 events will result in more GPU sales in Q3-Q4 than in Q1-2. It would have been better if R9 390 came out January 2015, but there are still major catalysts for GPU/PC sales on the horizon in the 2nd half of the year.

#4. You ignored the sub-$300 GPU market. Right now the offerings at $100-300 are extremely weak. The 750 and 960 series offer horrible price/performance, while the 200 series has an aging feature set and poor perf/watt. As a result, the R9 300 series can make a huge impact IF AMD has redesigned products in that segment and isn't just rebadging the old series.
 
Last edited:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
No, they aren't. Neither MS nor Sony is a hardware company. The machines serve a purpose in a larger, interconnected service strategy, and just because better tech came out doesn't mean anyone is going to spend the money on R&Ding a new box if the old one is perfectly serviceable. We might see the next gen come earlier than the current one did, but that's only because the previous gen was stretched so thin.

And let us not start on the whole backward compatibility thing. It's not how those companies do business.

He said he thinks that's what will happen. You, on the other hand, know what they are going to do. Are you employed by someone with knowledge, or do you have a source? Or are you just claiming to know something you don't?

Now that they are on x86 with GCN, there is no reason I can think of that they can't simply improve the hardware and keep it completely backwards compatible. I don't know for certain, but it makes sense.
 

B-Riz

Golden Member
Feb 15, 2011
1,595
765
136
RS is right.

#3: On point. I am 32, do not play as much as I used to, and have a lot of house work to do this spring / summer / fall.

#4: The new lower-end cards stink; I would rather get someone a used 7950 / 7970 if needed.

If the 390 is mega boss, I will pick one up a few months after release, so the mass-use-case bugs are ironed out.
 

Shehriazad

Senior member
Nov 3, 2014
555
2
46
According to some news I read on some site [here], AMD will have problems delivering a big number of 390X/390 cards due to SK Hynix having yield problems with HBM.

The same piece says that AMD is apparently gonna release the 395X2 at the SAME TIME as the 390X/390. They also say the 390X will only appear as a 4GB version.

But that just makes the entire news sound derpworthy ... not sure if someone else posted this already, but now it's here for sure. Again, nothing is certain, and I certainly don't know how trustworthy TweakTown is.