AMD 7600 reviews


DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,559
20,838
146
I think the Sony handheld is going to flop. It's NOT a console. It's just a dedicated WiFi streaming device for your PS5.


Sony has had the hot hand. We will see how it plays out. You are on record that it will flop. Assign a number to what you consider a flop. Also, where in my post did I write console? That's right, I didn't. You need to put away your jump-to-conclusions mat. I wrote exactly what I intended. Some handhelds are considered PCs, others consoles, others remote-play devices, but all fall under the category of handheld.
 

gdansk

Platinum Member
Feb 8, 2011
2,141
2,663
136
I agree that there is sharing of costs that includes not only dGPUs and consoles but also iGPUs. But none of them pays for all of it, and the costs then have to be amortized across all of the products that use the shared technologies.
I just feel like reiterating my thought here. I caveated it: by accounting principles it doesn't work out that way, but in practice RDNA development continues because of Sony.

What's the largest revenue source of any product containing RDNA IP? Oberon. It alone had more revenue in Q1 than N31 and N21-N24 combined.
By that accounting, N31-N33 are doomed never to recoup all of the RDNA3 costs. They will never make money at any margin AMD can achieve with their mediocre performance. But Sony is the reason AMD won't kill RDNA despite RDNA3 never paying for itself. Sony is ultimately the cash cow that keeps AMD working on RDNA IP even when some iterations lose money because they are not adopted in consoles.
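To make the amortization point concrete, here is a toy sketch of allocating a shared R&D cost by revenue share (all figures invented for illustration; none are AMD financials):

```python
# Toy illustration: amortizing a shared RDNA R&D cost across product
# lines by revenue share. All numbers are invented, not AMD financials.

rdna_rnd_cost = 1_000  # shared R&D cost, arbitrary money units

revenue = {
    "console SoCs (e.g. Oberon)": 1_800,
    "dGPUs (N31-N33)": 700,
    "iGPUs": 500,
}

total = sum(revenue.values())
for product, rev in revenue.items():
    share = rev / total
    print(f"{product}: {share:.0%} of revenue, "
          f"carries {share * rdna_rnd_cost:.0f} of the shared cost")
```

Allocated this way, the console line carries most of the shared cost, which is the sense in which Sony bankrolls RDNA even though, on paper, every product line books a slice.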
 
Last edited:
  • Like
Reactions: Tlh97

Ranulf

Platinum Member
Jul 18, 2001
2,364
1,218
136
No, I think you just don't know what a high price is, what a neutral price is, and what a low (aggressive) price is.

You think $269 is a price increase, while in reality it is a price cut.

It is a price increase over what the 6600/6650 XT cards have sold for over the last several months, for little performance gain. At $300 it isn't very impressive against their own products.

HUB has a Q&A video up that talks about the last-minute price drop on the 7600, cued to the 20-minute mark:

 

Heartbreaker

Diamond Member
Apr 3, 2006
4,228
5,228
136
Sony has had the hot hand. We will see how it plays out. You are on record that it will flop. Assign a number to what you consider a flop. Also, where in my post did I write console? That's right, I didn't. You need to put away your jump-to-conclusions mat. I wrote exactly what I intended. Some handhelds are considered PCs, others consoles, others remote-play devices, but all fall under the category of handheld.

I mentioned that it's not a console, because that's the main reason I think it's going to fail. Plus you didn't mention it, so I didn't know if you were aware it was only a streaming device.

I don't have an exact number for a failure. I just expect to be reading, in the year or two following launch, that it's a flop, followed by it quietly disappearing from the market.

This all depends on the early info that it is only a streamer. If it has some light gaming capability separate from the PS5, that would make success more likely IMO, but if it's just a streaming device for the PS5, that's just lame.
 
  • Like
Reactions: GodisanAtheist

Mopetar

Diamond Member
Jan 31, 2011
7,866
6,095
136
I haven't seen any evidence that 4 GB chips with a 32-bit bus are coming, just 4 GB chips with a 64-bit bus, which means that you need fewer chips but cannot make do with a smaller bus (although you can of course still clamshell).

It's not here yet, but we'll definitely see it in the not too distant future. Eventually the DRAM manufacturers will start using EUV and they'll be able to make 4 GB chips without issue.

At that point we might start to see cards with only a 96-bit bus as you could get 12 GB of VRAM. As long as there's sufficient cache on the die, it won't need terribly high bandwidth to compensate.
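As a quick sanity check on that arithmetic, here is a small sketch, assuming each GDDR chip exposes a 32-bit interface (the usual GDDR6 arrangement):

```python
# VRAM capacity from bus width and per-chip capacity, assuming each
# GDDR chip has a 32-bit interface (the usual GDDR6 arrangement).

def vram_gb(bus_width_bits: int, chip_gb: int, clamshell: bool = False) -> int:
    chips = bus_width_bits // 32      # one chip per 32-bit channel
    if clamshell:
        chips *= 2                    # two chips share each channel
    return chips * chip_gb

print(vram_gb(96, 4))                  # 12 GB: the 96-bit case above
print(vram_gb(96, 2))                  # 6 GB with today's 2 GB chips
print(vram_gb(96, 2, clamshell=True))  # 12 GB via clamshell instead
```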
 

Aapje

Golden Member
Mar 21, 2022
1,394
1,881
106
It's not here yet, but we'll definitely see it in the not too distant future. Eventually the DRAM manufacturers will start using EUV and they'll be able to make 4 GB chips without issue.
I don't know. Cache supposedly no longer scales with the new nodes, and cache is memory.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,228
5,228
136
It's not here yet, but we'll definitely see it in the not too distant future. Eventually the DRAM manufacturers will start using EUV and they'll be able to make 4 GB chips without issue.

At that point we might start to see cards with only a 96-bit bus as you could get 12 GB of VRAM. As long as there's sufficient cache on the die, it won't need terribly high bandwidth to compensate.

DRAM is having significant scaling challenges because each cell contains a capacitor, and those have been a scaling problem for a while now. Going to EUV won't help.
 
  • Like
Reactions: Tlh97 and Joe NYC

coercitiv

Diamond Member
Jan 24, 2014
6,218
12,003
136
Looks like some people will need an adapter:
AMD says the reference cards with the backplate design flaw will not be sold. A revised reference card will be available for purchase in the coming weeks.
 

Aapje

Golden Member
Mar 21, 2022
1,394
1,881
106
MLID claims that the 7600 is selling above expectations, while the 4060 Ti is not selling at all. I'm not sure how meaningful this is with the 4060 not being out yet. I already thought that the 4060 might sell quite well, and this just makes me believe that it's a bit more likely.
 

Mopetar

Diamond Member
Jan 31, 2011
7,866
6,095
136
The 4060 Ti not selling much at all is hardly surprising. It's not a step up for most previous-generation gamers, and even if it were, those people aren't likely to spend $400.

Anyone else who might have considered buying one is waiting for the 16 GB model to drop. Even if it's an extra $100, at least it's not obsolete out of the box due to VRAM limitations.
 
  • Like
Reactions: Tlh97 and blckgrffn

Rigg

Senior member
May 6, 2020
472
976
106
6600 is not selling by bucketloads at $179
6600 XT is not selling by bucketloads at $214
6650 XT is not selling by bucketloads at $219
(my MicroCenter prices)

But somehow, the 7600 would sell by bucketloads, if only the price were $30 less.

Well, here is a forecast: at some point in its life, the 7600 will get a $30 discount, it will still not sell by bucketloads, and your argument will go up in smoke.
This is such a bad-faith "argument". These prices were adjusted the day the 7600 became available. They were all $20-30 higher earlier in the week.

How many units have they sold since setting these prices? How many units constitutes a bucketload?
 

Rigg

Senior member
May 6, 2020
472
976
106
7600 cards are part of the notebook solution AMD is offering to OEMs. So AMD is already making them; why not sell them for desktop, which is, after all, AMD's other business.
I'm well aware that they are selling these in laptops. Dell, HP, Lenovo, etc. also make gaming desktops with dGPUs. AMD has made OEM-exclusive CPU SKUs for these companies in the recent past; the 5700G and 5600G were initially exclusive to OEMs. A similar scenario with a GPU wouldn't be totally unprecedented. OEMs really like having parts to sell with current-gen model numbers. They could have sold this GPU for desktop without releasing it to the DIY card market initially. YouTubers would have bought and reviewed it anyway. This would have served as a perfect way to gauge what price to sell it at to the DIY market once RDNA 2 cards have mostly dried up.

I don't think AMD or AIBs would make money restarting N23 production and selling for $200 and under. Likewise, AMD or AIBs would not make money selling Navi 33 (with only a slightly lower cost to make) for $200 and under.

What really is pointless (for a company that is not a monopoly) is making a product with the intent to sell it at a loss. Ask Intel; even Intel can't do it anymore.
Stop pretending that you have some grand insight into the costs associated with these cards. You don't. We know they can sell N23 for $200-$250, which means they can sell N33 for that kind of money. IMO the notion that anyone involved is losing money on these cards is laughably absurd. I think y'all are delusional on that front.

How are you ever going to have any competition if buyers (like you) insist that companies competing with NVidia can't make any profit?
I insist that a product (that almost certainly costs no more to make than the previous-gen model it's replacing) should sell at or near the price that the previous-gen model sells for. At the end of the day I think the market will dictate that the 7600 is a $200-$230 GPU. We just might have to wait a while for that to become reality. Nvidia can charge more money because they are currently making a better all-around product than AMD. If AMD wants to compete they need to fix that. Just because Nvidia doesn't care that their MSRPs are ridiculous doesn't mean AMD gets to charge more money for offering the same gen-on-gen performance. Make a good product and you can (within reason) make a good profit.
 
Last edited:

Hans Gruber

Platinum Member
Dec 23, 2006
2,140
1,089
136
I used to have to remind the young kids that AMD made terrible CPUs and good GPUs. This was before Ryzen. They thought Nvidia was the best and everything else was garbage.

Now we have AMD making consistently second-best GPUs and pricing them close to Nvidia. What is worse, the 4060 Ti is a total joke. We are in unprecedented territory.

AMD needs to go back to RDNA 1 and figure out where things went right. Pricing was perfect for the value they offered. The sad part is AMD just wanted to see what their new 7nm silicon could do on RDNA 1. It was thought that RDNA 2 would be where they would give Nvidia a run for their money. RDNA 1 was a test/dry run that produced excellent results based on price.

Another question I have: why is the 7600 on 7nm (enhanced) silicon in 2023? Apple is releasing 3nm silicon later this year. 5nm is not new; AMD is cheaping out on silicon and the pricing does not reflect this. Only the 7900 XT/XTX use 5nm. The rest use what they call 6nm, but it's really enhanced 7nm silicon.

Let's put things into context. The 7600 is using 7nm silicon in 2023, almost 4 years after RDNA 1 debuted on 7nm silicon in July 2019. The 4060 Ti is based on enhanced 5nm silicon, which makes the 4060 Ti even more pathetic.

A lot of you guys like to justify profit margins. You shouldn't; we live in a capitalistic society. It's not our problem if a company wants to maintain ridiculous margins. Is it unfair to ask that AMD GPUs be made using just the basic 5nm process? TSMC 5nm silicon debuted 31 months ago. There is no real uplift in raster performance from 5nm silicon; the benefit is 30% energy efficiency vs. 7nm.
 

USER8000

Golden Member
Jun 23, 2012
1,542
780
136
Another question I have: why is the 7600 on 7nm (enhanced) silicon in 2023? Apple is releasing 3nm silicon later this year. 5nm is not new; AMD is cheaping out on silicon and the pricing does not reflect this. Only the 7900 XT/XTX use 5nm. The rest use what they call 6nm, but it's really enhanced 7nm silicon.

Limited 5nm capacity?
 

Hans Gruber

Platinum Member
Dec 23, 2006
2,140
1,089
136
Limited 5nm capacity?
Years ago Dr. Su said that AMD was not going to be on the absolute cutting-edge process due to cost. That's not a bad strategy. This was when Zen 2 was released on 7nm.

There is no 5nm shortage. I am trying to figure out how many different iterations of 5nm TSMC has. Nvidia is on the cutting-edge 5nm, which they call 4nm. The basic iteration is just plain 5nm, with no enhancements/optimizations. There is no longer the uplift in performance you got with previous nodes; moving forward it's all power savings, and the power saving is significant. Performance does scale better (higher clock rates), so there is added performance, just not the clock-for-clock gain of earlier node switches like 7nm to 5nm.
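To put rough numbers on that trade-off, here is a back-of-the-envelope sketch (the 30% figure echoes the efficiency claim above; the voltage-tracks-frequency assumption is mine and is optimistic near the voltage floor):

```python
# Back-of-the-envelope node trade-off: dynamic power P ~ f * V^2, and
# assuming voltage scales roughly linearly with frequency, P ~ f^3.
# The 30% saving is the iso-performance figure quoted above.

power_saving_iso_perf = 0.30  # ~30% less power at the same clocks

# Spending the whole old power budget on the new node instead:
# (1 - s) * (f_new / f_old)^3 = 1  =>  f_new / f_old = (1 / (1 - s))^(1/3)
freq_gain = (1 / (1 - power_saving_iso_perf)) ** (1 / 3) - 1
print(f"~{freq_gain:.0%} higher clocks at the same power")  # ~13%
```

So the same node step can be marketed as either a ~30% power cut or a low-teens clock bump, which matches the "higher clock rates, but no clock-for-clock gain" picture.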

It would be nice to see the RDNA 2 6800 XT on 5nm silicon.

From the reviews that I have read, the 4060 Ti is really the 3060 Ti with DLSS 3 and 30% power savings from the node shrink from the 30 series to the 40 series.
 

maddie

Diamond Member
Jul 18, 2010
4,757
4,712
136
Performance does scale better (higher clock rates), so there is added performance, just not the clock-for-clock gain of earlier node switches like 7nm to 5nm.
This was always so. Density increases allowed more transistors, which could then be used for iso-frequency performance increases.
 

jpiniero

Lifer
Oct 1, 2010
14,643
5,272
136
AMD was going to make N33 a chiplet design like N32 and N31, but even with (presumably too rosy) ASP projections, they didn't like the numbers.

Maybe the better question to ask is why they bothered, or why they didn't just shrink N23 as is.
 
Last edited:

maddie

Diamond Member
Jul 18, 2010
4,757
4,712
136
AMD was going to make N33 a chiplet design like N32 and N31, but even with (presumably too rosy) ASP projections, they didn't like the numbers.

Maybe the better question to ask is why they bothered, or why they didn't just shrink N23 as is.
Where is your statement coming from? Name your sources.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,228
5,228
136
AMD was going to make N33 a chiplet design like N32 and N31, but even with (presumably too rosy) ASP projections, they didn't like the numbers.

I doubt they had to think about a chiplet-based N33 for more than a minute.

The savings from chiplets shrink as your overall die area shrinks, and the extra packaging isn't free, so it would probably end up costing more to do a chiplet-based N33 than to keep it monolithic. The big win for chiplets is at larger sizes.
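A rough way to see why (toy model: a simple exponential yield curve plus a fixed packaging adder; every number here is invented, not AMD's actual cost data):

```python
import math

# Toy monolithic-vs-chiplet cost comparison. Wafer cost, defect density,
# die split and packaging adder are all invented for illustration.

WAFER_COST = 10_000       # $ per wafer (hypothetical)
WAFER_AREA = 70_000       # usable mm^2 on a 300 mm wafer (approx.)
DEFECT_DENSITY = 0.001    # defects per mm^2 (hypothetical)
PACKAGING_ADDER = 15      # $ extra to assemble the chiplet package

def cost_per_good_die(area_mm2: float) -> float:
    """Cost of one working die under a simple Poisson yield model."""
    yield_rate = math.exp(-DEFECT_DENSITY * area_mm2)
    dies_per_wafer = WAFER_AREA / area_mm2
    return WAFER_COST / (dies_per_wafer * yield_rate)

for total_area in (200, 500):  # N33-sized vs N31-sized, roughly
    mono = cost_per_good_die(total_area)
    # split: one 60% compute die plus two 20% cache/memory dies
    chiplet = (cost_per_good_die(0.6 * total_area)
               + 2 * cost_per_good_die(0.2 * total_area)
               + PACKAGING_ADDER)
    print(f"{total_area} mm^2: monolithic ${mono:.0f}, chiplet ${chiplet:.0f}")
```

With these toy numbers the split loses on the small die (~$35 vs ~$46) and wins on the big one (~$118 vs ~$104): the yield savings grow with die area while the packaging adder stays fixed.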

Maybe the better question to ask is why they bothered, or why they didn't just shrink N23 as is.

N7 to N6 doesn't really seem worth doing a shrink on.

The real question is why not just rebrand N23? My guess: they thought they would get more out of the new CU architecture changes in RDNA 3.
 

maddie

Diamond Member
Jul 18, 2010
4,757
4,712
136
Too lazy. But specifically mentioned the packaging costs.
What you are mistaking for planning is what all development programs do: you explore several designs and then choose the one that best fits your needs. The fact that AMD investigated a chiplet version of N33 in no way means that they planned to produce one.