[Rumor, TweakTown] AMD to launch next-gen Navi graphics cards at E3


Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Tom's Hardware is now reporting on a rumor that the 5500 XT GPU was produced by Samsung, not TSMC.

I don't put a lot of confidence in that claim, but if true it may explain why the 5500 was so damn late. AMD may have tried to get TSMC to make it, but when it became clear it would take too long they went to Samsung. Of course, AMD made that deal with Samsung on mobile GPUs, and the 5500 may well have been part of that deal from the beginning.

We know the rumor was wrong now, but even if it had been true, the reason would not be as you state. Because TSMC's and Samsung's fab technologies differ, a design cannot simply be moved from one fab to the other without changes. AMD would have had to intend for Samsung to build it from the start.

And honestly, it wasn't *that* late. The OEM version came out when expected. The retail version just had to wait to fulfill the initial OEM (Apple) requirements.
 

RetroZombie

Senior member
Nov 5, 2019
464
386
96
Well, the rumor could actually make sense even if it turns out to be false.
How is TSMC going to have the capacity to manufacture all the console chips for Sony and Microsoft? Those aren't 80 mm² like the Zen 2 chiplets.

The console chips are expected to have something like a Navi 10 (250 mm²), plus the CPU cores and other stuff will add at least 100 mm² to the total, so we are looking at roughly 350 mm² chip sizes.

Vega 20 was 331 mm² in 11/2018, which was a major achievement, since we have yet to see anything bigger than that from TSMC's 7 nm.
The consoles will be released in summer 2020? So it would make sense to have a second source for the chips.
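
A quick back-of-envelope sketch of that estimate (the 250 mm² Navi 10 die size is real; the ~100 mm² for the CPU cores and the rest is just my rough guess):

```python
# Rough console APU die-size estimate (not an official figure).
navi10_mm2 = 250          # known Navi 10 die size, GPU portion
cpu_and_uncore_mm2 = 100  # assumed: Zen 2 cores + IO + other (rough guess)

console_apu_mm2 = navi10_mm2 + cpu_and_uncore_mm2
print(f"Estimated console APU: ~{console_apu_mm2} mm^2")  # ~350 mm^2
```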
 

amrnuke

Golden Member
Apr 24, 2019
1,181
1,772
136
Well, the rumor could actually make sense even if it turns out to be false.
How is TSMC going to have the capacity to manufacture all the console chips for Sony and Microsoft? Those aren't 80 mm² like the Zen 2 chiplets.

The console chips are expected to have something like a Navi 10 (250 mm²), plus the CPU cores and other stuff will add at least 100 mm² to the total, so we are looking at roughly 350 mm² chip sizes.

Vega 20 was 331 mm² in 11/2018, which was a major achievement, since we have yet to see anything bigger than that from TSMC's 7 nm.
The consoles will be released in summer 2020? So it would make sense to have a second source for the chips.
There will be really good yields on all the components separately. Have you heard that they're going to put all of this on one die?

There are no Navi 10, Zen 2, or IOD supply issues other than the 3950X and Threadripper, which are pretty exclusively binned. Even the 3900X is becoming available pretty quickly. Let's say they just use a "3700X" chiplet, which is easy to find, for the CPU, and a 36 CU 5700 chip, also easy to find, for the GPU, plus those cheap GloFo IODs. If AMD smartly reuses those components across all lineups, I don't think there are going to be any significant production issues.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
At least Fudzilla is appropriately named for a person to come to the conclusion that its rumors should be taken with a pinch of salt.

If we could just get wccftech to rebrand as wtftech, they'd be on even footing in that regard.

Why does it even matter what they're called? Those unreliable sources keep getting dragged in here by anyone trying to "prove" a point, even if just a page back they were crying that those aren't reliable sources.
 

mohit9206

Golden Member
Jul 2, 2013
1,381
511
136
If true, then it's bad: 6 GB of VRAM on a $250 card in 2020 is going backwards, as AMD has been providing 8 GB at that price point since 2016. The cards would be outdated in just two years, as next-gen console ports will be hamstrung by the VRAM.
 

RetroZombie

Senior member
Nov 5, 2019
464
386
96
There will be really good yields on all the components separately.
You must be confusing things; AMD's products are not Sony's or Microsoft's products. I don't know what volume is needed for the consoles, maybe it's low, but the die is big enough to require capacity that TSMC might not have, and so Samsung would be needed to fulfill Sony's and Microsoft's needs (not AMD's).
 

Glo.

Diamond Member
Apr 25, 2015
5,930
4,991
136

1920 ALUs and 6 GB of VRAM, according to Igor Wallossek. Essentially 25% slower than the RX 5700 XT.

So we are looking at a Vega 56/GTX 1070 Ti competitor for the full SKU with 1920 ALUs and 6 GB of GDDR6, while the 5600 non-XT will most likely compete with the GTX 1660 Ti and have 1792 ALUs and 6 GB of GDDR6.
 

Ranulf

Platinum Member
Jul 18, 2001
2,898
2,561
136
So a repeat of the 1660 Ti insanity, just ten months or so later. So much wow, such innovation. If Nvidia gets more 1660 Supers in stock for $230-250, they win the round price-wise.
 

mohit9206

Golden Member
Jul 2, 2013
1,381
511
136
AMD is dead with this launch; they are downgrading themselves, going from 8 GB in 2016 to 6 GB in 2020. This is simply unbelievable. AMD is doing a good job of killing themselves; they don't need Nvidia to do it.
 

Glo.

Diamond Member
Apr 25, 2015
5,930
4,991
136

With 192-bit GDDR6 at 12 Gbps you get 288 GB/s.
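
For anyone checking where that number comes from, a minimal sketch of the math (the 12 Gbps data rate is from the VCZ leak quoted below):

```python
# Memory bandwidth = bus width in bytes * per-pin data rate.
bus_width_bits = 192
data_rate_gbps = 12  # 12 Gbps GDDR6, per the VCZ leak below

bandwidth_gbs = bus_width_bits / 8 * data_rate_gbps
print(f"{bandwidth_gbs:.0f} GB/s")  # 288 GB/s
```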

Effectively we are looking at Vega 56 to GTX 1070 Ti performance. If it's $270-280 it's a good option, provided AMD does not reduce prices on the 5700 series GPUs at CES.

Edit:
VCZ Twitter said:
🟡 6GB 192-bit 12 Gbps
🟡 new GPU, NOT Navi14
🟡 Might be: N10LE, N21, Ariel (no idea, just guessing).
🟡 Was told the GPU will also be used by sth else. Prob not a console as it lacks hw RT.
🟡 Was also told ASIC size will tell us a lot.

So it might be Navi 12, after all.

 

Glo.

Diamond Member
Apr 25, 2015
5,930
4,991
136
So the possibilities are:

32 CUs, 192-bit bus, 288 GB/s GDDR6 - Navi 12 die.
30 CUs, 192-bit bus, 288 GB/s GDDR6 - cut-down Navi 10 die.

I think I prefer the first option :p.

30-32 CUs require any RDNA GPU to have two Shader Engines, which means a 32 CU design would have 16 CUs per Shader Engine, compared to 20 CUs/SE in Navi 10 and 24 CUs/SE in Navi 14, so a Navi 12 GPU in this configuration might have the best performance scaling.
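
A quick sketch of that CU-per-SE math; the Navi 12 line is pure speculation from this thread, not a confirmed spec:

```python
# CUs per Shader Engine, using the figures from the post above.
configs = {
    "Navi 10": (40, 2),         # 40 CUs across 2 Shader Engines
    "Navi 14": (24, 1),         # 24 CUs in 1 Shader Engine
    "Navi 12 (guess)": (32, 2), # 32 CUs across 2 Shader Engines (speculation)
}
for name, (cus, ses) in configs.items():
    print(f"{name}: {cus // ses} CUs per SE")  # 20, 24, 16
```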

A second reason I like this option the most: the cut-down RX 5600 will then most likely have 30 CUs instead of 28.

Still, the performance targets do not change much with this latest information. We might be looking at anything between GTX 1660 Ti and Vega 64, performance-wise.
 

RetroZombie

Senior member
Nov 5, 2019
464
386
96
AMD is dead with this launch; they are downgrading themselves, going from 8 GB in 2016 to 6 GB in 2020. This is simply unbelievable. AMD is doing a good job of killing themselves; they don't need Nvidia to do it.
And yet I think they are doing the right thing.

AMD RX 580 8GB vs Nvidia 1060 6GB
Consumers/reviewers:
"Wow, Nvidia is so awesome: with 6 GB it has the same performance as AMD's 8 GB card and even consumes less power. Just buy the Nvidia card."

AMD RX 5600 6GB vs Nvidia 1660 6GB:
Consumers/reviewers:
"The AMD card has the same performance and features as the Nvidia card and costs $10 less. Just buy the AMD card."

No more money wasted on things nobody values.
 

mohit9206

Golden Member
Jul 2, 2013
1,381
511
136
And yet I think they are doing the right thing.

AMD RX 580 8GB vs Nvidia 1060 6GB
Consumers/reviewers:
"Wow, Nvidia is so awesome: with 6 GB it has the same performance as AMD's 8 GB card and even consumes less power. Just buy the Nvidia card."

AMD RX 5600 6GB vs Nvidia 1660 6GB:
Consumers/reviewers:
"The AMD card has the same performance and features as the Nvidia card and costs $10 less. Just buy the AMD card."

No more money wasted on things nobody values.
Well, back in 2016 8 GB of VRAM was not really necessary, but in 2020 that extra 2 GB is crucial: the difference between stuttery gameplay and a smooth experience once PS5 and Xbox Series X ports start coming to PC. Also, if the price difference is just $10, then the default choice is Nvidia, not AMD.
 

RetroZombie

Senior member
Nov 5, 2019
464
386
96
Because if the price difference is only $10, most people will default to Nvidia because they are a safe bet. Until AMD does something like Ryzen for graphics cards, I don't see the default choice changing for now.
But they will never do that; the graphics market is very different from the CPU market.
For them to achieve that, they would need something like 50% more processing units, more memory channels, and also more RAM than Nvidia. And all that at the same price? Just no.
 

mohit9206

Golden Member
Jul 2, 2013
1,381
511
136
But they will never do that; the graphics market is very different from the CPU market.
For them to achieve that, they would need something like 50% more processing units, more memory channels, and also more RAM than Nvidia. And all that at the same price? Just no.
Well, I am hoping that in a few months AMD is able to price the 5500 XT and 5600 XT where they ideally should be, which is $150 and $200 respectively. Right now I am willing to believe that AMD doesn't have room for lower prices due to GDDR6 and 7 nm costs, but hopefully around the middle of next year prices will be more reasonable.
 

Head1985

Golden Member
Jul 8, 2014
1,867
699
136
I don't know... today's GPU market is just so boring. A 6 GB card with 1070/1070 Ti performance three years later and slightly cheaper (AIB 5600 XTs will be $300, and the 1070 Ti was $400 two years ago). Just great... really great :rolleyes:
Wake me up when they release 2080 Ti performance for $350, just like the GTX 970 did.
 

Glo.

Diamond Member
Apr 25, 2015
5,930
4,991
136
I don't understand the outcry.

Last gen, the GTX 1060 6 GB cost up to $300 MSRP. GPUs like the GTX 1660 Ti and RX 5600 XT are segment replacements for it.

Going from GTX 1060 6 GB to GTX 1070 and 1070 Ti performance in one generation is a bad thing?

And yes, if the RX 5600 XT is $270-280 for PowerColor models, it is a day-one buy for me.
 

Glo.

Diamond Member
Apr 25, 2015
5,930
4,991
136
Because if the price difference is only $10, most people will default to Nvidia because they are a safe bet. Until AMD does something like Ryzen for graphics cards, I don't see the default choice changing for now.
If AMD is not concerned about it, why are you?
 

mohit9206

Golden Member
Jul 2, 2013
1,381
511
136
If AMD is not concerned about it, why are you?
Oh no, I'm not concerned about AMD; my last two cards have been Nvidia. I do kind of want my next card to be AMD, but that's looking tough now with the direction AMD is going. Hopefully the pricing is more sensible by the middle of next year.
 

Glo.

Diamond Member
Apr 25, 2015
5,930
4,991
136
I don't know... today's GPU market is just so boring. A 6 GB card with 1070/1070 Ti performance three years later and slightly cheaper (AIB 5600 XTs will be $300, and the 1070 Ti was $400 two years ago). Just great... really great :rolleyes:
Wake me up when they release 2080 Ti performance for $350, just like the GTX 970 did.
The RX 5700 XTs from Sapphire and PowerColor were $10 over MSRP for custom models.

So why would it be $300 when the MSRP will be $270-280?
 

insertcarehere

Senior member
Jan 17, 2013
712
701
136
I don't understand the outcry.

Last gen, the GTX 1060 6 GB cost up to $300 MSRP. GPUs like the GTX 1660 Ti and RX 5600 XT are segment replacements for it.

Going from GTX 1060 6 GB to GTX 1070 and 1070 Ti performance in one generation is a bad thing?

And yes, if the RX 5600 XT is $270-280 for PowerColor models, it is a day-one buy for me.

Even ignoring the fact that the 1060 6 GB was $300 MSRP only for the Founders Edition card (the actual MSRP was $250), it launched 3.5 years (and counting) ago. Am I supposed to be impressed with a 60-ish% improvement in performance in that timeframe?

The 1060 itself matched the GTX 980 in performance despite launching less than two years after it. That was progress.
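
For scale, a rough sketch of the annualized rate that a 60% gain over 3.5 years works out to:

```python
# Compound annual performance growth implied by "60% in 3.5 years".
total_gain = 1.60
years = 3.5
annual = total_gain ** (1 / years) - 1
print(f"{annual:.1%} per year")  # about 14% per year
```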
 

lifeblood

Senior member
Oct 17, 2001
999
88
91
I really hope they come out with both a 6 GB and a 12 GB model of the RX 5600. I'd be interested to see the performance difference between the two at 1440p. It's pretty clear that at 1080p you need a >4 GB card to get the best performance (I kinda feel bad for those people who bought the 1060 3GB). I'm going to bet that at 1440p the minimum you'll want is 8 GB.

Now I wonder if 8 GB is enough at 4K? I bet not. I wonder if Navi 20, or whatever Nvidia comes out with next, will have either a 384-bit bus with 12 GB of VRAM or a 256-bit bus with 16 GB of VRAM?
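
Those two pairings fall straight out of GDDR6 packaging: each chip has a 32-bit interface, so the bus width fixes the chip count. A minimal sketch, assuming standard 1 GB or 2 GB chip densities and ignoring clamshell mode:

```python
# Bus width fixes the chip count; chip density then fixes the capacity options.
def vram_options_gb(bus_width_bits):
    chips = bus_width_bits // 32          # GDDR6 chips are 32 bits wide
    return [chips * density for density in (1, 2)]  # 1 GB and 2 GB chips

for bus in (384, 256):
    print(f"{bus}-bit bus -> {bus // 32} chips -> {vram_options_gb(bus)} GB")
# 384-bit -> 12 or 24 GB; 256-bit -> 8 or 16 GB
```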
 

Zstream

Diamond Member
Oct 24, 2005
3,395
277
136
I really hope they come out with both a 6 GB and a 12 GB model of the RX 5600. I'd be interested to see the performance difference between the two at 1440p. It's pretty clear that at 1080p you need a >4 GB card to get the best performance (I kinda feel bad for those people who bought the 1060 3GB). I'm going to bet that at 1440p the minimum you'll want is 8 GB.

Now I wonder if 8 GB is enough at 4K? I bet not. I wonder if Navi 20, or whatever Nvidia comes out with next, will have either a 384-bit bus with 12 GB of VRAM or a 256-bit bus with 16 GB of VRAM?

Just turn down the textures and you won't need 8GB.
 