
Question Speculation: RDNA2 + CDNA Architectures thread


TESKATLIPOKA

Senior member
May 1, 2020
441
425
96
Navi 14 with the full 24 CUs and 16 Gbps GDDR6 is capable of going against the 3050 Ti, let alone a GPU with more TFLOPs (1024 ALUs at 3 GHz are better for gaming than 1536 ALUs at 1.9 GHz).
On what site did they test that?

I think the 3050 Ti should be comparable to a 1660 Super or 1660 Ti, and both of them are ~20% faster than a standard RX 5500 XT 4GB. Link
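For reference, the TFLOPs comparison in the quote works out like this. A quick Python sketch; the ALU counts and clocks are the hypothetical configurations from the post, and FP32 throughput is assumed to be ALUs × 2 ops/clock (one FMA) × frequency:

```python
def fp32_tflops(alus, ghz):
    """FP32 throughput, assuming each ALU retires one FMA (2 ops) per clock."""
    return alus * 2 * ghz / 1000.0

n24 = fp32_tflops(1024, 3.0)  # hypothetical N24-style part: 1024 ALUs at 3 GHz
n14 = fp32_tflops(1536, 1.9)  # full Navi 14: 1536 ALUs at 1.9 GHz
print(f"N24: {n24:.2f} TFLOPs, full N14: {n14:.2f} TFLOPs")
print(f"N24 advantage on paper: {(n24 / n14 - 1) * 100:.0f}%")
```

That gives ~6.1 vs ~5.8 TFLOPs, i.e. the 1024-ALU-at-3-GHz config is only ~5% ahead on paper, which matches the ~5% figure mentioned later in the thread.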
 

Glo.

Diamond Member
Apr 25, 2015
4,826
3,443
136
Where was that tested?
Well, kinda. The 60W 3050 Ti laptop version with the full die enabled is slower than the 80W GTX 1660 Ti laptop version.

If you scale the differences between laptop and desktop GPUs with the same dies but higher power envelopes, you see that you have to add 20-30% performance on desktop.

The desktop GTX 1660 Ti is just 15% faster than Navi 14. Adding the two missing CUs from the full die and 16 Gbps GDDR6 would get it to similar performance as a desktop 3050 Ti.


Here you can see that the 1660 Super is on average 11% faster than the RX 5500 XT. The difference between the 1660 Super and the 1660 Ti is marginal.

So yeah, a full-die N14 is capable of going against the 3050 Ti, if AMD wants it to.
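The back-of-the-envelope scaling above can be sketched in Python. The percentages are the rough figures from this post, not measurements:

```python
# Rough figures from the discussion above (assumptions, not benchmarks)
lead_1660ti_over_n14 = 0.15  # desktop 1660 Ti lead over RX 5500 XT (22 CU N14)
cu_gain = 24 / 22 - 1        # enabling the 2 disabled CUs: ~9% more shaders
bw_gain = 16 / 14 - 1        # 14 -> 16 Gbps GDDR6: ~14% more bandwidth

print(f"CU gain: {cu_gain:.1%}, bandwidth gain: {bw_gain:.1%}")
# If performance scales with the lesser of the two gains, a full-die N14
# lands within a few percent of the desktop 1660 Ti / 3050 Ti class.
gap = (1 + lead_1660ti_over_n14) / (1 + min(cu_gain, bw_gain)) - 1
print(f"Remaining gap to the 1660 Ti: {gap:.1%}")
```

Under those assumptions the remaining gap shrinks to roughly 5%, which is why the full die plus faster memory plausibly closes the distance.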
 

TESKATLIPOKA

Senior member
May 1, 2020
441
425
96
As you said, it could end up a bit faster than the 3050 Ti.

If it can be clocked at 2 GHz and stay within 35-45W, then it should be capable of going even against the 45W mobile 3050 Ti. Not bad considering it's a smaller chip.
 

Glo.

Diamond Member
Apr 25, 2015
4,826
3,443
136
As you said, it could end up a bit faster than the 3050 Ti.

If it can be clocked at 2 GHz and stay within 45W, then it should be capable of going even against the 45W mobile 3050 Ti.
This is what you've said:
At 3GHz It should be capable to go against 3050Ti.
To which I replied:
Navi 14 with the full 24 CUs and 16 Gbps GDDR6 is capable of going against the 3050 Ti, let alone a GPU with more TFLOPs (1024 ALUs at 3 GHz are better for gaming than 1536 ALUs at 1.9 GHz).
I did not say it would end up faster. I said it is capable of going against it. Those are two completely different things.

N14 with 24 CUs clocked at 1.9 GHz and with 16 Gbps GDDR6 should land around the 1660 Super-1660 Ti level. The same as the 3050 Ti.

Yes, it will be much less efficient. But who would care in this day and age, if it offered 8 GB of VRAM and was cheap?
 

TESKATLIPOKA

Senior member
May 1, 2020
441
425
96
Navi 14 with the full 24 CUs and 16 Gbps GDDR6 is capable of going against the 3050 Ti, let alone a GPU with more TFLOPs (1024 ALUs at 3 GHz are better for gaming than 1536 ALUs at 1.9 GHz).
You didn't say it directly, true.
I assumed from your reply that Navi 14 has the same performance as the RTX 3050 Ti and that N24 at 3 GHz is ~5% faster than full Navi 14 at 1.9 GHz; that's why I said what I said.

P.S. The bad thing about N24 is the 4 GB VRAM limit.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
3,308
1,857
136
Man, both manufacturers are really dragging out this generation's launch.

We still haven't gotten reviews on anything slower than a 6700 XT / 3060, and we're already 9 months out from launch. My memory might be failing me, but when was the last time it took this long to roll out a full top-to-bottom generation?
 

beginner99

Diamond Member
Jun 2, 2009
4,846
1,233
136
Man, both manufacturers are really dragging out this generation's launch.

We still haven't gotten reviews on anything slower than a 6700 XT / 3060, and we're already 9 months out from launch. My memory might be failing me, but when was the last time it took this long to roll out a full top-to-bottom generation?
It's really puzzling on NV's side. AMD simply lacks capacity, but NV is basically Samsung's only customer; I thought their main advantage there would be volume. Either Samsung has a tiny fab or yields are still that bad.
 

psolord

Golden Member
Sep 16, 2009
1,347
433
136
It's really puzzling on NV's side. AMD simply lacks capacity, but NV is basically Samsung's only customer; I thought their main advantage there would be volume. Either Samsung has a tiny fab or yields are still that bad.

Or many fab workers of different tiers, top to bottom, died from Covid and they are not saying anything. Why would they?
 

TESKATLIPOKA

Senior member
May 1, 2020
441
425
96
AMD Radeon PRO W6600M
The same parameters as the standard version, including TFLOPs, but TBP was reduced to only 90W!
Link

P.S. Even the memory interface is still listed as 4096-bit. :D
 

Mopetar

Diamond Member
Jan 31, 2011
6,133
2,958
136
Or many fab workers of different tiers, top to bottom, died from Covid and they are not saying anything. Why would they?
That's quite doubtful. South Korea has had fewer than 2,000 deaths from COVID. Fab workers are probably among the least likely to get it, since they already work in environments that require clean-room suits.

The Samsung process was always reported as being less mature than the TSMC one and there were even a few rumors that Nvidia was trying to use a 7nm node but couldn't get it working and had to go with Samsung's 8nm node instead. Samsung also uses their own fabs for their own chips, so they're essentially competing with Nvidia for wafers. Combine those two factors and it's not too surprising that Nvidia would be supply constrained as well.
 

gdansk

Senior member
Feb 8, 2011
691
387
136
I'm doubtful Samsung's yields are bad. Look at Nvidia's revenue lately. They are shipping a lot of very big chips. It just seems demand outpaces all supply, be it Samsung or TSMC.
 

insertcarehere

Senior member
Jan 17, 2013
401
261
136
Big mistake on AMD's part if true...

https://videocardz.com/newz/amd-radeon-rx-6600m-navi-23-mobile-gpu-gets-tested

[Chinese reviewer] also made a good point that both the Radeon RX 6600M and RX 6700M require two separate motherboard designs, which will greatly increase the required R&D spending for OEMs. In comparison, NVIDIA's whole series ranging from the GTX 1650 to the RTX 3080 only requires two board designs, the reviewer notes.
Tough to get widespread OEM adoption if every GPU model in a laptop SKU needs its own separate MB design while Nvidia has widespread commonality.
 
Last edited:

GodisanAtheist

Diamond Member
Nov 16, 2006
3,308
1,857
136

leoneazzurro

Senior member
Jul 26, 2016
415
548
136
Frankly, the claim seems to be a truckload of bullshit. First, because it's very likely that the 6700M and 6800M will share a MB design and the 6600M will have its own, which means AMD too will have two MB designs for its complete lineup. That makes sense if you remember these are literally two chips, the same as Nvidia's mobile 3060-3080 range. Maybe they cannot reuse older designs, but that is because there were practically none.
From various sources it seems mobile Radeon being slowed down is more a question of other, "political", reasons.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
3,308
1,857
136
I know this is just a fluff piece, but figured I'd throw it out there since it is *something*

 

TESKATLIPOKA

Senior member
May 1, 2020
441
425
96
I know this is just a fluff piece, but figured I'd throw it out there since it is *something*

I still find it weird that only 28 CU models have been unveiled so far (one mobile and one workstation).
The boost for the workstation model is very impressive at 2900 MHz, but according to Bondrewd it's heavily binned and who knows what its average clock speed is; for the desktop part I expect lower clocks.
According to the Chinese leaks mentioned at videocardz, the 28 CU 6600M loses against the mobile 3060. If this is true, the full 32 CU model would have been a much better fit for mobile. On the other hand, 128-bit GDDR6 and 32MB of Infinity Cache are not enough for 1440p, but gaming laptops mainly use 1080p displays, so unless you want to use an external monitor it shouldn't be a big issue.

What I can't understand is how the mobile RTX 3060 can have higher performance within the same TGP, or in other words be more power efficient.


I have to wonder if this leak is really accurate.
 

TESKATLIPOKA

Senior member
May 1, 2020
441
425
96
The Ampere uarch isn't inefficient at all, the problem is desktop GPUs are pushed to the absolute limit of freq/voltage scaling and GDDR6X is a disaster.
I never considered Ampere inefficient, but RDNA2 should be more efficient thanks to IF, a better process, and no GDDR6X.
GDDR6X is not very good, true, but the RTX 3060, 3060 Ti and 3070 don't have it, and the mobile variants' clocks are not that much lower, depending on TGP.
RTX 3060 mobile TGP specs:
60-65W RTX 3060 -> 1382 MHz (boost)
80-85W RTX 3060 -> 1525 MHz (boost)
80-95W RTX 3060 -> 1525 MHz (boost)
90-95W RTX 3060 -> 1630 MHz (boost)
115-130W RTX 3060 -> 1802 MHz (boost)

170W RTX 3060 desktop -> 1780 MHz (boost), official data

So the 95W mobile version has only an ~8.5% lower boost clock than the desktop model; that's not a big difference.

The weird thing is that, according to rumors, N23 should have a lower TBP (135-150W) than the RTX 3060 (170W) and be clocked at ~2.4-2.5 GHz.
Now, the RX 6600M is a cut-down 28 CU N23 with a 2177 MHz boost, which would mean its clock is ~9-13% lower than the not-yet-released desktop part's.
Why would cutting 35-50W mean a bigger relative decrease in clock speed for RDNA2 than cutting 75W does for the RTX 3060? This doesn't make any sense to me.

P.S. Let's not forget that TGP (GPU + memory) on a desktop card is lower than the TDP (NVIDIA) or TBP (AMD).
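To make the comparison explicit, here is the clock-drop arithmetic behind those percentages. A quick sketch; the 2.4-2.5 GHz desktop N23 clock is the rumoured figure, not an official spec:

```python
def clock_drop_pct(mobile_mhz, desktop_mhz):
    """Relative boost-clock reduction of the mobile part vs the desktop part."""
    return (1 - mobile_mhz / desktop_mhz) * 100

# RTX 3060: 90-95W mobile boost (1630 MHz) vs 170W desktop boost (1780 MHz)
print(f"RTX 3060 mobile: {clock_drop_pct(1630, 1780):.1f}% lower")
# RX 6600M (2177 MHz boost) vs rumoured desktop N23 at 2.4-2.5 GHz
print(f"RX 6600M: {clock_drop_pct(2177, 2400):.1f}-{clock_drop_pct(2177, 2500):.1f}% lower")
```

That yields ~8.4% for the RTX 3060 and ~9-13% for the RX 6600M, which is where the puzzlement comes from: the bigger power cut on the Nvidia side produces the smaller relative clock drop.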
 

leoneazzurro

Senior member
Jul 26, 2016
415
548
136
Mobile is quite difficult to judge. You cannot look only at the nominal power limit; you also have to consider the actual cooling system and so on. That means if you don't compare two otherwise identical laptops that differ only in GPU, you may get quite misleading results.
 

TESKATLIPOKA

Senior member
May 1, 2020
441
425
96
True, but we don't have anything better for now, which is sad considering it should have been released in 1H 2021 and ended up as only a paper launch.
 

leoneazzurro

Senior member
Jul 26, 2016
415
548
136
If you supplied the chips in time but your OEM "partner" does not launch the product because of external pressures, well...
 

TESKATLIPOKA

Senior member
May 1, 2020
441
425
96
If you supplied the chips in time but your OEM "partner" does not launch the product because of external pressures, well...
We don't know if it's true or greatly exaggerated.
I really want to see how Nvidia is capable of forcing the big OEMs (HP, Dell, ASUS, Acer and Lenovo) not to launch a competing product.
It's not like Nvidia's products are that much better in gaming.
 

leoneazzurro

Senior member
Jul 26, 2016
415
548
136
We don't know, but multiple unrelated sources are reporting it, and when you have 80% of the discrete GPU market, it is easy to twist arms. And this has nothing to do with being better in gaming or not. After all, Nvidia already made a similar attempt in the past with their GeForce Partner Program, and that was out in the open.
 
