[BitsAndChips]390X ready for launch - AMD ironing out drivers - Computex launch


geoxile

Senior member
Sep 23, 2014
327
25
91
Look at the slide above. It's two dies on a single package. Communication between them is not through Crossfire or XDMA.
I'm not saying it's a dual GPU, but a single GPU with two Tonga cores. And there will be 100% scaling due to sharing the same internal memory.

At least a possibility, right?

On an interposer, where the actual dies are still separate. So I don't see how they'll manage the two GPUs any differently than they would with a normal dual-GPU card.
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,250
136
On an interposer, where the actual dies are still separate. So I don't see how they'll manage the two GPUs any differently than they would with a normal dual-GPU card.


As long as we're just speculating... is there any reason to believe that the second core isn't just somewhat of a piggyback? As in more ROPs, etc. Any written rule that states all of a GPU needs to be on the same die/silicon?

Would be funny if AMD could throw a monkey wrench into GameWorks' spokes by doing something like the above to eliminate the need for patches, profiles, etc.
 

destrekor

Lifer
Nov 18, 2005
28,799
359
126
As long as we're just speculating... is there any reason to believe that the second core isn't just somewhat of a piggyback? As in more ROPs, etc. Any written rule that states all of a GPU needs to be on the same die/silicon?

Would be funny if AMD could throw a monkey wrench into GameWorks' spokes by doing something like the above to eliminate the need for patches, profiles, etc.

To be fair, there is little room for any potential "dual die" GPU: packages have a fairly standard size. If you go with multiple dies, each one has to be smaller. The other option is a single large die, which is what we typically see in the high-end market.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Where the heck is a 270X selling for $140? I'd grab one.

Newegg had the MSI R9 270X Hawk for $140 just days ago. I am surprised no one put it up in the Hot Deals section. You can get a Sapphire Dual-X 270X for $140 after a $10 promo and $20 MIR. However, I wouldn't buy it for $140 since the XFX R9 280 is $160. For $20 extra, you get 3GB of VRAM, higher overclocking headroom that gets you to HD7970GHz/770 levels of performance, and a lifetime warranty. A way better deal if you ask me. Way too many people passed up on the 7950/R9 280 without realizing that it scales well with overclocking, easily surpassing the 7970GHz and 680. Granted, at this point if you can wait a bit, I think it's a matter of time before we see fire-sale $200-210 R9 290s. The price/performance from a $140 R9 270X to a $215 R9 290 is linear, but you get 4GB of VRAM and 97% of a 970's performance.
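For anyone curious, the "linear" price/performance claim checks out roughly like this. The prices are from the post; the ~1.55x performance ratio for the 290 over the 270X is my assumption from typical review numbers, not a figure from the thread:

```python
# Price and rough relative performance of the two cards discussed above.
# The perf ratio (~1.55x for the R9 290 over the R9 270X) is an assumption.
cards = {
    "R9 270X": {"price": 140, "perf": 1.00},
    "R9 290":  {"price": 215, "perf": 1.55},
}

for name, c in cards.items():
    per_100 = c["perf"] / c["price"] * 100
    print(f"{name}: ${c['price']}, {per_100:.3f} perf per $100")

price_ratio = cards["R9 290"]["price"] / cards["R9 270X"]["price"]
perf_ratio = cards["R9 290"]["perf"] / cards["R9 270X"]["perf"]
print(f"price ratio {price_ratio:.2f}x vs perf ratio {perf_ratio:.2f}x")
```

With those assumptions the two cards land within about 1% of each other in perf per dollar, which is what "linear" means here.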

---

Wow, the theory behind dual Tongas is getting traction. Easily the most absurd theory of all time. Let me put it this way: in the UK an R9 295X2 sells for barely more than a 980 (500 pounds vs. 450-550 pounds), but it hardly has an impact on the market. In the US we also know that the 295X2 is not much more money than a 980 despite a 67% performance advantage at 4K. How in the world would a dual 2048 SP Tonga be able to compete with a 295X2? Ha, is this April 1st? Why wouldn't AMD just add HBM to dual 290s? No one is going to buy a $700 dual card slower than an R9 295X2, even if it used 200W less power, against an after-market GM200 6GB, because CF scaling isn't perfect in all games. Come on, think!

A single R9 290X is 2.08X faster than a 285 at 4K. That means if you put 2X Tongas together in CF, they would lose to a single R9 290X. Are we supposed to believe AMD will sell such a card for $700? Absurd!
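For anyone who wants to check that arithmetic, a quick sketch. The 2.08x figure is from the post above; the CF scaling percentages are my assumptions (CF never adds a perfect second GPU's worth of performance):

```python
# R9 290X is claimed to be 2.08x an R9 285 at 4K (figure from the post).
r9_290x_vs_285 = 2.08

# Assume the second Tonga adds 80-90% extra performance when CF works,
# which is a generous range for well-supported games.
for cf_scaling in (0.80, 0.85, 0.90):
    dual_tonga_vs_285 = 1.0 + cf_scaling
    verdict = "loses to" if dual_tonga_vs_285 < r9_290x_vs_285 else "beats"
    print(f"CF scaling {cf_scaling:.0%}: dual Tonga = {dual_tonga_vs_285:.2f}x "
          f"a 285, {verdict} a single 290X")
```

Even at 90% scaling the dual-Tonga pair lands at 1.90x a 285, still short of the single 290X's 2.08x.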



Even if we back off to 1440P, 290X smashes the 285 by 64%. AMD would never make a next gen flagship card comprised of last gen mid-range chips. Who even thought of this craziness?



Do some people here actually think before they post? What would Tonga XT do with 512-640GB/sec of memory bandwidth when CF doesn't work?! What a waste of resources. Holy cow, speculation fail. Sounds like a negative NV PR spin tactic to get people to upgrade to NV and not wait for the R9 390 series.
 
Last edited:

Shehriazad

Senior member
Nov 3, 2014
555
2
46
Wow, the theory behind dual Tongas is getting traction. Easily the most absurd theory of all time.

Just for reference... I agree with that.

But I said that there is only ONE idea that actually would be worth theorizing about.

A dual Tonga XT (for reference, XT doesn't even exist as a consumer GPU) that communicates and works like ONE GPU... so basically either 2 GPUs on 1 die, interconnected, or just a Tonga XT that has been blown up to 4096 shaders.

In which case it could easily be +50% the performance of a 290X. And if you then CF THOSE, you could get quite a nice product.


But again... I feel that this is highly unlikely... I think the only new use of Tonga is going to be an ACTUAL Tonga XT GPU that ends up being a 370X or something like it.
A Tonga XT 370X would be a fine product imho... especially since it wouldn't even be a rebrand, as it doesn't exist in the 200 series except for one "M" version used in some iMac.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
In which case it could easily be +50% the performance of a 290X. And if you then CF THOSE, you could get quite a nice product.

And when CF doesn't work (CF depends on software to split the frame rendering workload across the GPUs, via AFR or SFR), you end up with a card only as fast as an R9 280X. How exactly would you sell a card at $600-700 that loses to a $280 R9 290X? Secondly, the target market for $600-700 cards only cares about perf/watt when all else is similar. In this case a 1-year-old R9 295X2 would smash this mythical dual-Tonga chip at 1440P and 4K, and no amount of power saving could sell this dual-Tonga card against dual 290Xs, an R9 295X2 or 970 SLI. Instant fail.
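One way to sketch that argument is to weight the dual-chip card's performance by how often CF actually engages. Every number below is an illustrative assumption, not a benchmark:

```python
# Performance relative to a single Tonga XT (= 1.0). All figures assumed.
cf_scaling = 1.85   # dual Tonga when CF works
no_cf = 1.0         # CF fails: falls back to one chip, ~R9 280X class
r9_290x = 1.5       # single 290X vs single Tonga XT, rough guess

for p_cf_works in (1.0, 0.8, 0.6):
    # Expected performance = weighted average of the two outcomes.
    expected = p_cf_works * cf_scaling + (1 - p_cf_works) * no_cf
    print(f"CF engages {p_cf_works:.0%} of the time: "
          f"{expected:.2f}x vs a 290X's {r9_290x}x")
```

Under these assumptions, once CF only engages in ~60% of games the dual card is barely ahead of a plain 290X on average, and far behind it in the games where CF fails.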

The main reason someone pays $600-700 for a large single-chip card with performance near R9 290 CF is that you get consistent performance and minimum frame rates, while CF is not always consistent. Why do you think people get a 7970GHz over HD6970 CF, a GTX780Ti over GTX760 SLI, or a GTX580 over GTX460 SLI?

Again - since the R9 290 smokes the 285 in perf/watt and performance for not much more $, if AMD even considered going dual chips for the R9 390X, it would only make sense to use the 290/290X as the basis to add HBM to. Having a dual card with less performance than an R9 295X2 but a similar price, just because it uses 100-200W less power, is not a selling feature at all. The minute GM200 6GB comes out, no one would buy it.

The biggest elephant in the room: when CF doesn't work, you'll end up with 512-640GB/sec of very expensive and complex HBM bandwidth feeding a single Tonga XT. Why? At this point, take dual 290s, drop their voltage and clock speeds to 750-800MHz, reuse existing GDDR5, and they will at least match if not beat dual Tongas in performance with minimal investment in expensive HBM. And since AMD has piles of inventory of these R9 290 chips, they could have just downclocked/undervolted them a year ago if this was even the plan. Why spend millions of dollars working with Hynix on HBM just to lower power usage by 50W on dual Tongas, when you can drop power usage by 50-75W on each R9 290/290X chip with undervolting/underclocking? The theory is flawed on so many levels.
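The undervolting argument leans on the usual dynamic-power rule of thumb, power roughly proportional to frequency times voltage squared. A sketch with made-up clock/voltage numbers (none of these figures are from the post):

```python
def scaled_power(p0, f0, v0, f1, v1):
    """Estimate dynamic power after a clock/voltage change: P ~ f * V^2."""
    return p0 * (f1 / f0) * (v1 / v0) ** 2

# Hypothetical R9 290 operating points, purely illustrative:
# ~250W of GPU power at 947MHz/1.20V, dropped to 800MHz/1.05V.
new_power = scaled_power(250, 947, 1.20, 800, 1.05)
print(f"~{new_power:.0f}W after dropping to 800MHz at 1.05V")
```

With those assumed numbers the rule of thumb predicts a drop of roughly 90W per chip, in the same ballpark as the 50-75W savings claimed above.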

With 28nm so mature and cheap now, AMD has the capability to make a 530-550mm2 die. That's enough for a 3800-4000 SP/256 TMU chip. I guess the doom-and-gloom theorists have run out of things to say since GM200 6GB is still MIA, so pushing this new theory of combining last gen low-end chips in CF sounds like a sound strategy to undermine the R9 390. Even the slide itself says that design with dual controllers is a prototype. Someone blatantly re-labelled the controllers as "Dual GPUs" and then, like sheep, the forum went wild over this amateur Paint mod. Give me a break -- it clearly states controllers on that slide, not GPUs.

At some point VR-Zone had this floating around as an R9 390X. The amount of FUD being posted online for pure ad clicks/site visits is nuts.

390xmockup_zps8934411e.png
 
Last edited:

Shehriazad

Senior member
Nov 3, 2014
555
2
46

But a doubled 290X chip (as a SINGLE GPU)? That thing would have a TDP of 450+W on a SINGLE chip... that's just not manageable anymore... and now don't say the 295X2 does it too... that's a different design with 2 GPUs that are cooled separately.


I guess haters will have to get used to the idea that this GPU is actually a new chip... because that's the only way a 4096-shader GPU at 1050MHz STOCK makes sense... because I don't think any of the current designs have this capability without lots of reworking.
 

richaron

Golden Member
Mar 27, 2012
1,357
329
136
Newegg had the MSI R9 270X Hawk for $140 just days ago. I am surprised no one put it up in the Hot Deals section. You can get a Sapphire Dual-X 270X for $140 after a $10 promo and $20 MIR. However, I wouldn't buy it for $140 since the XFX R9 280 is $160. For $20 extra, you get 3GB of VRAM, higher overclocking headroom that gets you to HD7970GHz/770 levels of performance, and a lifetime warranty. A way better deal if you ask me. Way too many people passed up on the 7950/R9 280 without realizing that it scales well with overclocking, easily surpassing the 7970GHz and 680. Granted, at this point if you can wait a bit, I think it's a matter of time before we see fire-sale $200-210 R9 290s. The price/performance from a $140 R9 270X to a $215 R9 290 is linear, but you get 4GB of VRAM and 97% of a 970's performance.

---

Wow, the theory behind dual Tongas is getting traction. Easily the most absurd theory of all time. Let me put it this way: in the UK an R9 295X2 sells for barely more than a 980 (500 pounds vs. 450-550 pounds), but it hardly has an impact on the market*1. In the US we also know that the 295X2 is not much more money than a 980 despite a 67% performance advantage at 4K*2. How in the world would a dual 2048 SP Tonga be able to compete with a 295X2?*3 Ha, is this April 1st?*4 Why wouldn't AMD just add HBM to dual 290s?*5 No one is going to buy a $700 dual card slower than an R9 295X2, even if it used 200W less power, against an after-market GM200 6GB, because CF scaling isn't perfect in all games*6. Come on, think!*7

A single R9 290X is 2.08X faster than a 285 at 4K*8. That means if you put 2X Tongas together in CF, they would lose to a single R9 290X*9. Are we supposed to believe AMD will sell such a card for $700? Absurd!*10



Even if we back off to 1440P, 290X smashes the 285 by 64%. AMD would never make a next gen flagship card comprised of last gen mid-range chips*11. Who even thought of this craziness?*12


Do some people here actually think before they post?*13 What would Tonga XT do with 512-640GB/sec of memory bandwidth when CF doesn't work?!*14 What a waste of resources*15. Holy cow, speculation fail. Sounds like a negative NV PR spin tactic to get people to upgrade to NV and not wait for the R9 390 series.*16

1) UK prices for the 295X2 & 980 mean the 390 cannot be "dual" GPU?
2) US prices for the 295X2 & 980 @4K mean the 390 cannot be "dual" GPU?
3) Dual Tonga has the possibility of revolutionary memory management & ofc power consumption. Proof otherwise?
4) April 12th where I live.
5) 290s don't have appropriate resource sharing.
6) My/our question is not reliant on Crossfire.
7) If you can't understand #6 then I'm not sure how to explain it.
8) Arbitrary but kinda useful figure is kinda useful.
9) Never talked about Crossfire, nor a neutered Tonga.
10) Yes. Maybe. Mr Logic/Price/Cost?
11) How do you know?
12) People who thought of this? Can I count deliberate trolls or only the paid type?
13) I think before I post. Grammar and spell checking are last on my list, but if there's a troll attacking my posts I'll try to reply.
14) The question was never about Crossfire. But you seem a bit slow.
15) Sharing resources between multiple devices is a "waste"?
16) Seems there are a few people who consider the option of shared resources outside of "crossfire".

Pfft. Good forum member.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
But a doubled 290X chip (as a SINGLE GPU)? That thing would have a TDP of 450+W on a SINGLE chip... that's just not manageable anymore... and now don't say the 295X2 does it too... that's a different design with 2 GPUs that are cooled separately.

I am saying the R9 290/290X already have better perf/watt than the R9 285. Why would AMD start off with a less power-efficient chip to add HBM to? You can undervolt/underclock the R9 290/290X too. This doesn't solve all the other issues I talked about, such as CF scaling.

I guess haters will have to get used to the idea that this GPU is actually a new chip...because that's the only way a 4096 Shader GPU with 1050 Mhz STOCK makes sense....because I don't think any of the current designs have this capability without lots of reworking.

I wish we could turn back time and end up in the HD2900XT/3870 era. Non-stop doom and gloom, and then out of nowhere the 4850/4870/4890, followed by the 5850/5870. The irony is that the better the R9 390 series is, the more aggressively priced GM200 6GB will be. The more $ AMD makes and the more market share it gains with the 300 series, the more $ they'll have to design the 500-600 series. But I guess some people love paying $550 for mid-range next gen chips instead, to save on electricity.
 

iiiankiii

Senior member
Apr 4, 2008
759
47
91
If I recall, some people were saying the 290X would have to be a dual-GPU card to beat the original Titan. It was just not possible for AMD to beat Nvidia with a single-die GPU, they thought. Same FUD here.
 

jpiniero

Lifer
Oct 1, 2010
16,982
7,383
136
I am saying the R9 290/290X already have better perf/watt than the R9 285. Why would AMD start off with a less power-efficient chip to add HBM to? You can undervolt/underclock the R9 290/290X too. This doesn't solve all the other issues I talked about, such as CF scaling.

Tonga is more power efficient. In those benchmarks you showed, the 285 is suffering because of its 2GB of memory and lower memory bandwidth. Obviously that won't be a problem with HBM.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Unless AMD fixes all their Crossfire problems, they're not going to willingly put two GPUs on their next "single-GPU" flagship.

Crossfire was fixed a long time ago.

Every review I've recently seen says that CrossfireX works much better than SLI now. Well, when it's enabled.

True. This is AMD's problem though. You either have people who are ill-informed about Crossfire or FUD spreaders. nVidia needs to fix GameWorks, which breaks Crossfire. Unless, of course, it's on purpose. But nobody would support a company doing something like that to the public.
 

Cloudfire777

Golden Member
Mar 24, 2013
1,787
95
91
Several things here I'd like to point out:

1. Using 2x 290/290X dies would be extremely stupid and impossible. Why? Because A) it would interfere with the 295X2 performance-wise. B) The 290X die is 438mm2. Good luck fitting two of those on one interposer along with the TSVs and controllers. Yes, they will shrink the die if they remove the memory bus, but it will still be big. C) The TDP of one 290X is 290W. Two would be 580W, though probably lower due to 28nm SHP. Good luck building one package with a 500W TDP and cooling that off. Even water will have trouble with that.

Tonga is a much better fit on pretty much every count (thermals, power, performance, positioning against the 295X2, chip size) if they really go dual core.

2. Again, the two Tonga cores would not be connected through Crossfire. AMD should fire all their engineers if they couldn't connect two cores on the same package with an internal connection that goes directly from one core to the other. Whether it's through TSVs or through the L2 cache by routing crossbars I don't know, but if they can do it with CPU cores and an IGP, they can do it with dual GPUs as well. This should have zero performance hit.

3. HBM1 has a limitation of 4GB. HBM2 won't be ready until 2016. Still, AMD's slide says "up to 8GB" for the 390 WCE. A dual controller like the one shown in the dual-core picture, anyone...?
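Point 1's fitting-and-cooling objection works out numerically like this. The 438mm2 and 290W Hawaii figures are from the post; the Tonga figures (~359mm2, ~190W board power) are my rough assumptions:

```python
# Naive doubling of die area and TDP for the two candidate chips.
# Hawaii numbers are from the post; Tonga numbers are approximate.
hawaii = {"area_mm2": 438, "tdp_w": 290}  # R9 290X
tonga = {"area_mm2": 359, "tdp_w": 190}   # R9 285, assumed

for name, chip in (("dual Hawaii", hawaii), ("dual Tonga", tonga)):
    total_area = 2 * chip["area_mm2"]
    total_tdp = 2 * chip["tdp_w"]
    print(f"{name}: ~{total_area}mm2 of silicon, ~{total_tdp}W naive TDP")
```

Dual Hawaii comes out to roughly 876mm2 and 580W before any clock or voltage reduction, which is the "impossible to fit and cool" case; dual Tonga is meaningfully smaller and cooler on both counts.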
 
Last edited:

Rvenger

Elite Member <br> Super Moderator <br> Video Cards
Apr 6, 2004
6,283
5
81
I am interested to see what dual Tonga XT GPUs on HBM would actually do. I bet it might not be exactly Tonga but a new revision of Tonga with some added DX features, but if it is 4096 SP, then the alarm bell rings: 2x 2048 SP relates back to Tonga XT.

I mean, the last 2048 SP GPU AMD had was Tahiti, so the Tonga speculation isn't out of the realm of possibility. I doubt AMD is going straight to a big die, and if Cloudfire is correct about the dual GPUs on one die, that is a very different approach and I am intrigued to see how this would turn out.
 

el etro

Golden Member
Jul 21, 2013
1,584
14
81
Unless AMD fixes all their Crossfire problems, they're not going to willingly put two GPUs on their next "single-GPU" flagship.

The issue nowadays with AMD Crossfire is not poor technology; rather, it's the lack of punctual driver support so that CF players get the optimizations on time. Especially in GameWorks titles.
 

Cloudfire777

Golden Member
Mar 24, 2013
1,787
95
91
I am interested to see what dual Tonga XT GPUs on HBM would actually do. I bet it might not be exactly Tonga but a new revision of Tonga with some added DX features, but if it is 4096 SP, then the alarm bell rings: 2x 2048 SP relates back to Tonga XT.

I mean, the last 2048 SP GPU AMD had was Tahiti, so the Tonga speculation isn't out of the realm of possibility. I doubt AMD is going straight to a big die, and if Cloudfire is correct about the dual GPUs on one die, that is a very different approach and I am intrigued to see how this would turn out.

Should be pretty interesting actually. Even though it may use Tonga XT (with some changes, I bet), it's still a brand new GPU. With HBM :thumbsup:

If you think about it, AMD has never liked to do big dies like Nvidia. GM200 is a behemoth at 600mm2 and could only be built because Nvidia has the expertise to make it.
AMD may benefit more by using two smaller dies like Tonga instead and putting them under one package with a connection between them. With a TDP in the 300-400W area, which I think the 390X may have, the heat density also goes down with a dual-Tonga approach under a bigger chip, which makes it easier for AMD to cool it effectively.
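The heat-density argument can be made concrete as watts per mm2 of silicon. All the numbers below are illustrative assumptions in the spirit of the 300-400W guess above:

```python
def heat_density(tdp_w, area_mm2):
    """Average power density across the die, in W/mm^2."""
    return tdp_w / area_mm2

# Assumed: one big ~550mm2 die vs two ~359mm2 Tonga-class dies,
# both dissipating 350W total (illustrative figures only).
big_die = heat_density(350, 550)
dual_tonga = heat_density(350, 2 * 359)
print(f"single big die: {big_die:.2f} W/mm2, dual Tonga: {dual_tonga:.2f} W/mm2")
```

Spreading the same 350W over two dies drops the average power density by roughly a quarter in this sketch, which is the cooling advantage being claimed.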
 
Last edited:

njdevilsfan87

Platinum Member
Apr 19, 2007
2,342
265
126
2. Again, the two Tonga cores would not be connected through Crossfire. AMD should fire all their engineers if they couldnt connect two cores on the same die, with an internal connection that goes directly from one core to the other. If its through TSV or through L2 cache by routing crossbars I don`t know, but if they can do it with CPU cores and IGP, they can do it with dual GPU as well. This should have zero performance hit.

Everyone seems to have forgotten how the Q6600 was really just two E6600s sitting right next to each other.
 

Shehriazad

Senior member
Nov 3, 2014
555
2
46
Wouldn't actually using HBM's bandwidth in existing game titles involve driver tweaking? I mean, AMD and Nvidia have been known to completely rewrite shaders and the like for AAA titles in their drivers, and similar code-altering stuff.

I can't imagine that HBM itself would come with an insane amount of necessary driver tweaking... sure, some might be necessary... but essentially, for the software itself, it kind of works the same... just WAY faster.

The memory controller on the GPU will work in a different way... but that's a firmware thing and not a driver thing, eh? Once that is fixed via firmware... you're good to go... and I'm pretty sure this was one of the first things that was worked out x)
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
Don't worry. The North American media will ignore that too. $40 more for 52-107% more performance and double the VRAM? Nah, forget it. Make a review full of GW titles, remove MSAA and all modern games that exceed/push 2GB of VRAM, limit the game selection and voila - Gold AWARD. And when your results don't match any other credible site's, especially the European reviewers', just pretend they don't even exist.

We've reached a critical point in market share where the media refuses to criticize the wildly popular, market-dominant brand, and skews testing to hide problems in products, so that no matter what the most popular player makes, it receives an award. It's similar to media outlets like The Verge, the Wall Street Journal and Bloomberg Business being very reluctant to give an Apple product a bad review because of their readership, and because they want to keep receiving review samples/being invited to events.

There is no way all these free prizes, free trips, invites to media events, and favourable treatment in receiving review samples of the latest products aren't done in return for favourable reviews. Most of the North American GPU media is so far removed from being objective that we can count the number of credible/not-sold-out-to-marketing NA sites on 1 hand. This generation has been the most eye-opening because I don't think they are even ashamed to hide it anymore that they swing reviews whichever way the marketing dollars go, and specifically cater to the popular readership views on their site! After all, if most of your readers are strongly biased towards 1 brand and you start criticizing that brand/its products in reviews, you'll quickly start to lose the core readership/members on your site, which impacts your traffic and thus ad revenue.


You don't like the reviews, so they must be paid shills. I don't suppose you have any proof to back this up?
 

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
Everyone seems to have forgotten how the Q6600 was really just two E6600s sitting right next to each other.

And that was fine because software didn't have to be updated to reflect this - it was just a hardware detail that end users had no reason to care about. Currently, we aren't at that stage with GPUs. Even if they're on the same physical card, dual GPUs are treated separately and discretely by the system. Software needs to specifically support Crossfire/SLI to take advantage of them; some games and many productivity packages don't. As long as that's the case, users will always prefer one powerful GPU to a setup with two smaller GPUs, unless the dual setup is far cheaper.

Now, if AMD somehow figured out a way around this, then doubling up Tonga would be feasible. If the system sees it as a single, 4096 SP GPU with 100% compatibility for everything, then the user has no reason to care what the physical die layout looks like. But it absolutely has to be a hardware solution - anything requiring buy-in from developers is going to be a sure loser (see: HSA, Mantle, TrueAudio, etc.).
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
The only HBM1 sold atm is 1GB per stack at 1GHz.

HBM2 is coming in 2016, so HBM1 will be phased out rather quickly.
HBM2 is coming in 2016. So HBM1 will rather quickly be outphased.