[BitsAndChips] 390X ready for launch - AMD ironing out drivers - Computex launch


jpiniero

Lifer
Oct 1, 2010
16,977
7,382
136
The only people who believe AMD will rebrand most of the mid-range/high-end lineup are those who like to think AMD is too stupid to see reality, that Tahiti, Pitcairn, Tonga and Hawaii are not competitive versus Maxwell... so they will push them out again to compete for the next cycle..

Yeah, really? Lisa cannot be that clueless.

There's not much point in releasing anything really new without HBM at this point since they need the power savings to be competitive. And HBM is going to be expensive enough that it can only be at the high end. At the same time OEMs want new products... so rebrands are going to be needed.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
The only people who believe AMD will rebrand most of the mid-range/high-end lineup are those who like to think AMD is too stupid to see reality, that Tahiti, Pitcairn, Tonga and Hawaii are not competitive versus Maxwell... so they will push them out again to compete for the next cycle..

It cannot be overstated that even with the same underlying GPU architecture, improvements can be achieved. Furthermore, we cannot discount the engineering work that goes on at the transistor level.

GTX580 vs. 480:

Even when NV released the full die GF110, they still managed to make changes under the hood despite the architecture itself being virtually identical:

"NVIDIA has ported over GF104’s faster FP16 (half-precision) texture filtering capabilities, giving GF110/GTX580 the ability to filter 4 FP16 pixels per clock, versus 2 on GF100/GTX480. GF110 can now do 64bit/FP16 filtering at full speed versus half-speed on GF100, and this is the first of the two major steps NVIDIA took to increase GF110’s performance over GF100’s performance on a clock-for-clock basis.

The other change ties in well with the company’s heavy focus on tessellation, with a revised Z-culling/rejection engine that will do a better job of throwing out pixels early, giving GF110/GTX580 more time to spend on rendering the pixels that will actually be seen. This is harder to quantify (and impossible for us to test), but NVIDIA puts this at another 8% performance improvement.

Thus the trick to making a good GPU is to use leaky transistors where you must, and use slower transistors elsewhere. This is exactly what NVIDIA did for GF100, where they primarily used 2 types of transistors differentiated in this manner. For GF110, NVIDIA included a 3rd type of transistor, which they describe as having “properties between the two previous ones”. Or in other words, NVIDIA began using a transistor that was leakier than a slow transistor, but not as leaky as the leakiest transistors in GF100. Again we don’t know which types of transistors were used where, but in using all 3 types NVIDIA ultimately was able to lower power consumption without needing to slow any parts of the chip down. In fact this is where virtually all of NVIDIA’s power savings come from, as NVIDIA only outright removed few if any transistors considering that GF110 retains all of GF100’s functionality."

http://www.anandtech.com/show/4008/nvidias-geforce-gtx-580/3

Despite being the full chip, GF110 actually had 200 million fewer transistors and a die size 10mm² smaller than GF100!

My position is simple: if the silicon coming off the wafers is unchanged, then it's a rebrand. ...
If AMD were to respin Tonga for GloFo instead of TSMC and got lower power consumption at the same clocks, then that wouldn't be a rebrand, because it would be different silicon.

Very good post! I agree.

From the same anandtech link.

Ya, that's really confusing then. Why would AMD create new codenames for the exact same product - Pitcairn?

All I gotta say is, if AMD re-brands everything with 0 new features, no perf/watt improvements, and no clock speed improvements, except for the R9 390/390X, they might as well launch nothing besides those 2 cards.

Let's take Pitcairn. R9 270X regularly sells for $140-150 US and it isn't even as popular as a $120-130 GTX750Ti despite a 43% 1080P performance advantage. So what exactly would AMD's plan be, to price it at $99? :D
 
Feb 19, 2009
10,457
10
76
There's not much point in releasing anything really new without HBM at this point since they need the power savings to be competitive. And HBM is going to be expensive enough that it can only be at the high end. At the same time OEMs want new products... so rebrands are going to be needed.

OEM-specific SKUs are low-end/entry-level.

People here are talking about putting Tonga & Hawaii up against Maxwell, again, even when they are clearly failing to compete, leading to a major erosion of market share.

You have to question the sanity of selling the R290/X SKUs at $240 when so few want to buy them even at such low prices, and of keeping that up until the next node... quite a big die, 4GB VRAM, a 512-bit bus PCB, 300W boards & hefty cooling, for $240? Where are the margins to make that worthwhile for their AIB partners?

Heck, I bet it's selling at a loss at that price, to clear inventory.
 

jpiniero

Lifer
Oct 1, 2010
16,977
7,382
136
You have to question the sanity of selling the R290/X SKUs at $240 when so few want to buy them even at such low prices, and of keeping that up until the next node... quite a big die, 4GB VRAM, a 512-bit bus PCB, 300W boards & hefty cooling, for $240? Where are the margins to make that worthwhile for their AIB partners?

There's not much AMD can do about it really.
 
Feb 19, 2009
10,457
10
76
There's not much AMD can do about it really.

Eating a small loss to clear inventory is fine, because those products are already produced, not selling them at all leads to a total loss.

What makes no sense is to produce MORE of these poorly selling products that are uncompetitive and unprofitable, to sell for another year or two. That is exactly what a "re-brand" of the current mid-range/high-end SKUs would be. It's total nonsense.
 

jpiniero

Lifer
Oct 1, 2010
16,977
7,382
136
What makes no sense is to produce MORE of these poorly selling products that are uncompetitive and unprofitable, to sell for another year or two. That is exactly what a "re-brand" of the current mid-range/high-end SKUs would be. It's total nonsense.

Except as I said... without the power savings of HBM the new product would also be uncompetitive. Plus, if 390/X really is Dual Tonga with HBM, what would they do to make a mid range product?
 
Feb 19, 2009
10,457
10
76
Except as I said... without the power savings of HBM the new product would also be uncompetitive. Plus, if 390/X really is Dual Tonga with HBM, what would they do to make a mid range product?

Nothing. If it's a straight rebrand, they may as well not sell anything and take a long holiday.

Don't forget, even R290 at $240 is not selling well against NV products, 960 and 970 in particular.

Also, if 390/X is Dual Tonga (LOL!), which would be slower than Dual Hawaii, i.e. the R295X2, AMD may as well pack up, sell out to Samsung and call it a day in the dGPU business.

See how realistic these scenarios are?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
You have to question the sanity of selling the R290/X SKUs at $240 when so few want to buy them even at such low prices, and of keeping that up until the next node... quite a big die, 4GB VRAM, a 512-bit bus PCB, 300W boards & hefty cooling, for $240? Where are the margins to make that worthwhile for their AIB partners?

Heck, I bet it's selling at a loss at that price, to clear inventory.

I am 100% convinced that the reputation of R9 290/290X is so tarnished (hot, loud, volcanoes, use as much power as a village in Africa) that even at $159 for R9 290 and $219 for R9 290X these cards would not sell against the competition. A 960 and even 750Ti would still outsell both of them at those prices.

Look, a good used after-market R9 290X is going for $200 USD ($250 CDN) and no one is buying it.

Here in Canada, the GTX 960 4GB is $310 CDN, but cards like the Asus DCUII R9 290 are $320 CDN with 52-107% more performance.

That's right, for only $10 you get 52% more performance at 1080P, 61% more at 1440P, and 107% more at 4K. Not selling well against the 960. :hmm:
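Just to spell that out, here's a rough back-of-the-envelope Python sketch of what those Canadian prices and the quoted performance deltas mean in performance per dollar (the percentages are the poster's figures, not new benchmarks):

```python
# Rough performance-per-dollar comparison using the CDN prices quoted above.
# The performance multipliers are the poster's figures (52%/61%/107% more than a GTX 960).

gtx960_price = 310  # GTX 960 4GB, $CDN
r9_290_price = 320  # Asus DCUII R9 290, $CDN

perf_vs_960 = {"1080p": 1.52, "1440p": 1.61, "4K": 2.07}

for res, perf in perf_vs_960.items():
    # Normalise both cards to performance per dollar, with the 960 as the 1.0 baseline.
    ratio = (perf / r9_290_price) / (1.0 / gtx960_price)
    print(f"{res}: the R9 290 delivers {ratio:.2f}x the performance per dollar of the 960")
```

On those numbers the R9 290 works out to roughly 1.5x to 2x the performance per dollar, depending on resolution.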

Also, if 390/X is Dual Tonga (LOL!), which would be slower than Dual Hawaii, i.e. the R295X2, AMD may as well pack up, sell out to Samsung and call it a day in the dGPU business.

See how realistic these scenarios are?

Don't forget the $280-300 R9 290X against a $515-550 980, or the $140 R9 270X that pummels a $120 750Ti by 43%. Not selling. The reputation of the entire R9 200 series is so bad that even on our forum you see people believing an R9 280 uses 250W of power all by itself (when in reality an entire max-overclocked Core i7 3770K @ 4.6GHz gaming rig uses 309W). Doesn't matter though, because since the R9 280 has a "marketing" TDP of 250W, it must use 250W of power in games, right, right.

[attached image]


AMD is going to need to really work with reviewers, because more or less the entire R9 200 series was thrown under the bus by the media. I mean look at it: the R9 280 was faster than a 760, had more VRAM at similar price levels and better overclocking headroom/scaling, and guess which card was favoured by reviewers and more popular? The 760! Later on it got so bad that for the price of a 760 one could purchase an R9 280X, while people still paid $150 more for a 770 4GB over the 280X. Facepalm, AMD marketing fail.

AMD isn't NV, they can't sell polished turds on marketing alone. Can you just imagine a review of a "brand new" re-branded R9 290X = 100% identical R9 380X for $399? If AMD really wants to waste a generation, might as well sell R9 290X for $179 for 18 months until 14nm to at least gain back market share. :D
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
You have to question the sanity of selling the R290/X SKUs at $240

Don't worry. The North American media will ignore that too. $40 more for 52-107% more performance and double the VRAM? Nah, forget it. Make a review full of GW titles, remove MSAA and all modern games that exceed/push 2GB of VRAM, limit game selection and voila - Gold AWARD. And when your results don't match any other credible site and especially European reviewers, just pretend they don't even exist.

We've reached a critical point in market share where the media refuses to criticize the wildly popular, market-dominant brand, and skews testing to hide problems in products, so that no matter what the most popular player makes, it receives an award. It's similar to media outlets like The Verge, The Wall Street Journal and Bloomberg Business being very reluctant to give an Apple product a bad review because of their readership and because they want to keep receiving review samples and invitations to events.

There is no way all these free prizes, free trips, invites to media events and favourable treatment in receiving review samples of the latest products aren't given in return for favourable reviews. Most of the North American GPU media is so far removed from being objective that we can count the number of credible, not-sold-out-to-marketing NA sites on one hand. This generation has been the most eye-opening because I don't think they're even ashamed to hide it anymore that they swing reviews whichever way the marketing dollars go, and specifically cater to the popular readership views on their site. After all, if most of your readers are strongly biased towards one brand and you start criticizing that brand and its products in reviews, you'll quickly start to lose the core readership/members on your site, which impacts your traffic and thus ad revenue.
 

richaron

Golden Member
Mar 27, 2012
1,357
329
136
The more I think about the 390X dual GPU theory, the less stupid it sounds (tho I'm still not subscribing to it).

AMD is years ahead regarding shared resources & heterogeneous hardware. And anyone who follows AMD GPUs believes Tonga (and newer, I suppose) has a certain amount of 'extra functionality' which we are yet to see; until we get HSA platforms and Mantle (Vulkan) titles... or until we pair them up?
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Also, if 390/X is Dual Tonga (LOL!) which would be slower than Dual Hawaii, ie. R295X2, AMD may as well pack up, sellout to Samsung and call it a day in dGPU business.

See how realistic these scenarios are?

You forget the 295X2 is 500W and requires hybrid cooling. 300W is the magic border if you actually want to sell cards in any real volume and earn on them.
 

Despoiler

Golden Member
Nov 10, 2007
1,968
773
136
You forget the 295X2 is 500W and requires hybrid cooling. 300W is the magic border if you actually want to sell cards in any real volume and earn on them.

It doesn't require hybrid cooling. That is just what they chose to go with. The Gigabyte G1 980 Windforce 3x coolers are good to 600W.
 

Shehriazad

Senior member
Nov 3, 2014
555
2
46
You forget the 295X2 is 500W and requires hybrid cooling. 300W is the magic border if you actually want to sell cards in any real volume and earn on them.

Air cooled 295x2 models actually exist...I'm not saying it's the best solution...but they're out there.
 

Cloudfire777

Golden Member
Mar 24, 2013
1,787
95
91
R9 390X may be dual dies under one die. Since the 390X has a total of 4096 shaders, it's probably dual Tonga, to get all the features it brings to the table over older GCN cards.

[attached image: AMD slide]


GloFo's 28SHP process and AMD's goal is to make dense chips, which is what they may have done with Tonga to make the 390X. Or it could be one massive chip with two normal Tongas.
Why would they do this over the R9 295X2?
- Technically it's still a single GPU - take the throne against Titan X and do comparisons against any single GPU.
- Make use of existing tech, which means much less money spent than making a brand new chip.
- Lower card cost than the 295X2 due to less silicon, fewer power phases and other components. The card might be smaller too.
- Keep the price down to a decent level, since HBM alone will drive the price up.
- Get the best out of HBM; it may be the only way of using ubump, which is mentioned in the slide above.

I already mentioned that dual Tonga at 1050MHz is pretty much dead on with what the AMD slide said about performance over the R9 290X.

(4096*1050)/(2816*1000) = 1.53 = +53%
Well, they won't scale perfectly like this, so maybe +40%. The rest is gained from HBM.

Compare against the 295X2:
(4096*1050)/(5632*1018) = 0.75
Compare against TechPowerUp's Titan X review:
120*0.75 = 90
Which is 10% slower than Titan X (90/100). Perhaps the 390X will do better than this, since it has HBM and the 295X2 doesn't, so it could match Titan X or beat it.
Either way, 10% slower than Titan X but priced at $700, for the reasons above, would be good enough to move many units.
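For anyone who wants to check the arithmetic, here is the same scaling estimate written out as a small Python sketch. It only uses the shader counts and clocks quoted above plus TechPowerUp's 295X2-at-~120%-of-Titan-X index; the linear shaders × clock scaling is just the rough assumption the post is already making:

```python
# Back-of-the-envelope scaling estimate (shaders * clock), assuming performance
# scales roughly linearly with raw shader throughput -- the same assumption as above.

def throughput(shaders, clock_mhz):
    """Relative shader throughput: shader count times clock speed."""
    return shaders * clock_mhz

dual_tonga = throughput(4096, 1050)   # rumoured 390X: 2x Tonga at 1050 MHz
r9_290x    = throughput(2816, 1000)   # R9 290X: 2816 shaders at 1000 MHz
r9_295x2   = throughput(5632, 1018)   # R9 295X2: 2x Hawaii at 1018 MHz

print(f"vs 290X : {dual_tonga / r9_290x:.2f}x")    # ~1.53x, i.e. +53% on paper
print(f"vs 295X2: {dual_tonga / r9_295x2:.2f}x")   # ~0.75x

# TechPowerUp's Titan X review puts the 295X2 at ~120 on a Titan X = 100 scale,
# so 120 * 0.75 ~= 90, i.e. roughly 10% behind Titan X before any HBM gains.
r9_295x2_index = 120
print(f"estimated 390X index: {r9_295x2_index * dual_tonga / r9_295x2:.0f}")   # ~90
```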
 
Feb 19, 2009
10,457
10
76
So you think they were happy with the Tonga turd, so they slap two of them together and that's revolutionary?

Lolworthy. It seems the delay is making some of you guys stark mad.

Tonga isn't anywhere close to being competitive versus Maxwell. AMD can't sell the $240 R290 because it's inefficient and too power hungry.

Do you think many people would buy a Tonga x2 that is slower than Titan X at 300W?

If that's the best AMD can do, they should quit.
 

Cloudfire777

Golden Member
Mar 24, 2013
1,787
95
91
So you think they were happy with the Tonga turd, so they slap two of them together and that's revolutionary?

Lolworthy. It seems the delay is making some of you guys stark mad.

Tonga isn't anywhere close to being competitive versus Maxwell. AMD can't sell the $240 R290 because it's inefficient and too power hungry.

Do you think many people would buy a Tonga x2 that is slower than Titan X at 300W?

If that's the best AMD can do, they should quit.

Try reading what I said again. Two of them on the same die + HBM + the features of Tonga (FreeSync etc.) might be the only way to get what they need out of HBM (like the picture shows regarding ubump, whatever that is).

And again, they can keep the price down vs the 295X2 with a card like this = competitive against Maxwell.
I just said a $700 price would mean they could sell a lot of cards, since Titan X costs $999. Efficiency is only part of the whole equation, which you should know, looking at the whole bloody AMD lineup since 2012 against Kepler and Maxwell.

Why do you think Nvidia is waiting before releasing the GTX 980 Ti despite it being ready for launch? It's because they want to release a cheaper GM200 to match, or at least get closer to, the 390X when it launches. It may even be a cut-down GM200 that lands at 80-90% of the GTX Titan X, like my calculation above shows the R9 390X might do.

Both AMD and Nvidia are bound by 28nm. I think the way forward for both, especially AMD with less funding, is to make GPUs with the least engineering cost until 16nm is here. That may involve using existing cores, like Tonga, and putting them under the same die.

I'm not saying it will happen, but the slide above certainly hints toward it.
The reason all the other 300-series cards are rebrands, and not cards with HBM, may be just that: a dual core under the same die is required, which costs money to make. They can't just use existing 200-series GPUs, slap on HBM and call it a day.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
I don't get why you think it will be dual Tonga. So many things point to a single large GPU, just like a lot of things point to the rest of the lineup being rebrands.
 

geoxile

Senior member
Sep 23, 2014
327
25
91
Unless AMD fixes all their crossfire problems they're not going to willingly put two GPUs on their next "single-GPU" flagship
 

jpiniero

Lifer
Oct 1, 2010
16,977
7,382
136
Unless AMD fixes all their crossfire problems they're not going to willingly put two GPUs on their next "single-GPU" flagship

Every review I've recently seen says that CrossfireX works much better than SLI now. Well, when it's enabled.
 

Shehriazad

Senior member
Nov 3, 2014
555
2
46
So even IF the 390X was just a "doubled" Tonga XT with HBM...wouldn't the performance still rock if it actually works like a single GPU?


I mean Tonga XT would be at least like what...10% faster than a 285 by default...right?
Now I'm not sure if it could actually work to make 2 GPUs work like ONE unit...but if it can...wouldn't we be looking at +50% performance vs a 290X either way?

I mean a 290X has 2816 Shaders...doubled Tonga XT would be 4096 shaders...that alone is already nearing 50% (yes yes, 45.4545%) "moar". Then in addition to that it has better color compression and some updated tech inside....and the HBM.





That would be the only "plausible" Tonga 390X that I could possibly understand...because just Tonga XT on internal crossfire makes 0 sense since dual GPUs don't always work...and you would essentially be buying a 285X (if such a product existed).


Not even AMD would be so foolish as to sell a new high-end card that would end up having less performance than its previous generation.



So let's round up the weird fears and rumors of those Tonga people and supposed leaks into data:

-"Doubled Tonga XT" single GPU (come on, double GPU is just a ridiculous claim, it would mean that it would be below the 290/290X in games that don't do Crossfire, stop being silly)
-45.4545~% moar shaders
-290X GPU base clock is 1000 MHz, 390X base clock supposedly starts at 1050... aftermarket models likely to be higher clocked
-8.6 teraflops
-GDDR5 vs HBM


So even IF we were looking at a doubled Tonga XT... why would that be bad? It could still have the magical 50% improvement over a 290X this way... and let's not forget that Tonga AND HBM both consume less power, leaving more room for just cranking up the clocks like mad.
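For what it's worth, the 45.4545% and 8.6 teraflops figures in that list fall straight out of the shader counts and the rumoured 1050 MHz clock (assuming the usual 2 FLOPs per shader per clock, i.e. one fused multiply-add) — a quick sketch:

```python
# Where the "+45.4545% moar shaders" and "8.6 teraflops" numbers come from,
# assuming 2 FLOPs per shader per clock (one fused multiply-add).

r9_290x_shaders = 2816
tonga_x2_shaders = 4096
clock_ghz = 1.05  # rumoured 390X base clock

extra_shaders = tonga_x2_shaders / r9_290x_shaders - 1
print(f"extra shaders vs 290X: {extra_shaders:.2%}")         # ~45.45%

tflops = tonga_x2_shaders * 2 * clock_ghz / 1000
print(f"single-precision throughput: {tflops:.2f} TFLOPS")   # ~8.60 TFLOPS
```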
 

Cloudfire777

Golden Member
Mar 24, 2013
1,787
95
91
Unless AMD fixes all their crossfire problems they're not going to willingly put two GPUs on their next "single-GPU" flagship
Look at the slide above. It's two dies on a single die. Communication between them is not through Crossfire or XDMA.
I'm not saying it's a dual GPU, but a single GPU with two Tonga cores. And there would be 100% scaling due to sharing the same internal memory.

At least a possibility, right?
 

RaulF

Senior member
Jan 18, 2008
844
1
81
Unless AMD fixes all their crossfire problems they're not going to willingly put two GPUs on their next "single-GPU" flagship


The only Crossfire problem is them not doing fast driver releases for newer games. Xfire works pretty well.
 

Subyman

Moderator, VC&G Forum
Mar 18, 2005
7,876
32
86
Would be revolutionary if they put two Tongas on the same chip and did something to where they wouldn't need crossfire for it to run.
 

destrekor

Lifer
Nov 18, 2005
28,799
359
126
Look at the slide above. It's two dies on a single die. Communication between them is not through Crossfire or XDMA.
I'm not saying it's a dual GPU, but a single GPU with two Tonga cores. And there would be 100% scaling due to sharing the same internal memory.

At least a possibility, right?

I think it is fair to say that it is a possibility -- although one I consider remote -- that the GPU could have a design that creates a "dual core" configuration. That would be similar to the first AMD dual core CPU, which was essentially two distinct CPU cores sharing a memory controller and interconnects with the system, linked together through a crossbar.

I don't expect that at all, because that approach still utilized a single die, and while the chip with the dual controllers for the HBM prototype is a single package on an interposer, I do not believe you could correctly call that a single die. Those two controllers, be they GPU dies or not, are entirely separate from one another, only linked together through the Interposer and TSVs. To coordinate queuing and shared memory pooling, one would imagine a need for controllers and caches and other bits to have direct die-level communication with all cores.

The image is a prototype, and almost surely does not reflect the design intentions of their GPUs. However, while I do not have the engineering expertise, perhaps it is possible that one large die could be split in two and the connections in the interposer do not reduce communication and control of the various elements, nor hamper synchronization.