
AMD 7600 reviews

Page 2
Well, the RDNA3 architectural changes (namely dual FP32 and higher clocks) were supposed to be substantial by themselves.

As it stands, dual FP32 doesn't seem to do anything, and without a process advantage it doesn't even seem to clock much higher on desktop than RDNA2 can.

The N33 in the 7600 seems to be overvolted. When TPU overclocked their card, they maxed the core slider at 3 GHz, maxed the memory slider, and still managed to reduce the voltage from 1.2 V to 1.05 V. Average clocks under those settings were around 2.9 GHz with 2.4 GHz memory, and they were limited by the sliders, so there may be a bit more headroom in well-binned N33 dies. OTOH, well-binned N33 dies are probably better off in laptops.
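As a rough sanity check on why that undervolt matters, dynamic power scales roughly with V²·f, so dropping from 1.2 V to 1.05 V at the same clock should cut dynamic power by roughly a quarter. A back-of-the-envelope sketch (first-order model only; static power and real board behavior are ignored):

```python
# First-order dynamic power scaling: P ~ C * V^2 * f.
# The proportionality constant cancels when taking a ratio,
# so only the before/after voltages and clocks are needed.
def dynamic_power_ratio(v_new, v_old, f_new=1.0, f_old=1.0):
    """Relative dynamic power after a voltage/clock change."""
    return (v_new / v_old) ** 2 * (f_new / f_old)

# Voltages from the TPU undervolt mentioned above, clocks held equal.
ratio = dynamic_power_ratio(1.05, 1.2)
print(f"Dynamic power at 1.05 V vs 1.2 V: {ratio:.1%} (~{1 - ratio:.0%} savings)")
```

This is only the dynamic component, so the real-world savings on a full card will be smaller, but it shows why the stock 1.2 V looks generous.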
 
I still think that the go-chiplet-crazy strategy was far too risky. A 6nm monolithic Navi32 able to perform close to the 6800, with 12GB or better still 16GB, at maybe $350 would have sold very well IMO.

How are chiplets the problem? N33 is monolithic and it performs the same as N23. That is the problem, not chiplets.

The problem is that RDNA 3 performs about the same as RDNA 2. Not chiplets.

I see zero evidence that chiplets are causing issues.
 
If I remember correctly, N33 has the same register file size as N2x, while N31 (and supposedly N32) have 50% more. This might contribute to N33 having roughly the same performance as N23.

Nevertheless, pretty terrible.
 
How are chiplets the problem? N33 is monolithic and it performs the same as N23. That is the problem, not chiplets.

The problem is that RDNA 3 performs about the same as RDNA 2. Not chiplets.

I see zero evidence that chiplets are causing issues.
Yeah, it sure does look like RDNA3 is the main issue here. Improvements are so underwhelming that it's pretty clear that something is not working properly. Honestly, I'm not sure why they even bothered with this...
 
How are chiplets the problem? N33 is monolithic and it performs the same as N23. That is the problem, not chiplets.

The problem is that RDNA 3 performs about the same as RDNA 2. Not chiplets.

I see zero evidence that chiplets are causing issues.
That is my point: N33 and N32 should both have been monoliths. At least a monolithic N32 would have the advantage of a lower BOM. Since Navi33 is about 15% smaller than Navi23, a monolithic N32 on 6nm might have been around 290mm² - a size where yields should still have been good - ergo chiplets don't really save anything, especially once packaging costs and the higher power overhead are taken into account.

If N32 ever gets released we might be able to compare, but to me chiplets make a lot of sense for bigger parts (and Navi31 was too small IMO) and far less for smaller parts. N32 just seems less risky as a monolith to me. With TSMC charging less for 6nm, an optimised RDNA3 port of N22 to 6nm - selling for less, with a bit better perf/watt, improved codec and display output, plus re-worked RT - sounds like a very sensible part.
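The "yields should still have been good at ~290mm²" intuition can be sanity-checked with the standard Poisson die-yield model, Y = exp(-D·A). The defect density below is an illustrative assumption for a mature node, not a published TSMC figure, and the N33 area is the commonly reported ~204 mm²:

```python
import math

def poisson_yield(area_mm2, defects_per_cm2):
    """Poisson die-yield model: Y = exp(-D * A), with A in cm^2."""
    area_cm2 = area_mm2 / 100.0
    return math.exp(-defects_per_cm2 * area_cm2)

# Assumed defect density for a mature 6nm-class node (illustrative only).
D = 0.07  # defects per cm^2

for name, area in [("N33 (~204 mm^2)", 204),
                   ("hypothetical 6nm monolithic N32 (~290 mm^2)", 290)]:
    print(f"{name}: ~{poisson_yield(area, D):.0%} yield")
```

Under that assumption the yield gap between 204mm² and 290mm² is only a few percentage points, which supports the argument that a monolith of that size is not a yield disaster.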
 
That is my point: N33 and N32 should both have been monoliths. At least a monolithic N32 would have the advantage of a lower BOM. Since Navi33 is about 15% smaller than Navi23, a monolithic N32 on 6nm might have been around 290mm² - a size where yields should still have been good - ergo chiplets don't really save anything, especially once packaging costs and the higher power overhead are taken into account.

If N32 ever gets released we might be able to compare, but to me chiplets make a lot of sense for bigger parts (and Navi31 was too small IMO) and far less for smaller parts. N32 just seems less risky as a monolith to me. With TSMC charging less for 6nm, an optimised RDNA3 port of N22 to 6nm - selling for less, with a bit better perf/watt, improved codec and display output, plus re-worked RT - sounds like a very sensible part.

The chiplets in N31 and N32 are a stepping stone. A bit like the evolution from Zen to Zen 2. Also you can see where it is going with something like MI300.

At some point you might end up with a compute die that has 2 SEs in it, and AMD can mix 1, 2, 3, or however many of that same building block to make bigger and bigger GPUs. That is where chiplets will really show their BOM advantage, but it needs to start somewhere.
 
It's nearly 50% more power usage compared to my bench 6600s, which according to the AMD overlay top out at 100W out of the box.
My 6600XT maxes out at 130W. This card is the 💩 I expected. Good to see reviewers suddenly "discovering" that the 6700 still exists, at least here in the U.S., and costs the same money. That's why I said this card was pointless when I saw the specs. It is AMD, so there might be a little fine wine in the tank, but nothing is going to save this turd.
 
As much as we hate Nvidia for their trash 4000 series, this pretty clearly shows AMD isn't our friend either. Probably never see another legit midrange release like the RX 480 8GB for $240 again.
As far as I can tell, this forum has raged at every card released this gen because the price/performance isn't good enough (other than perhaps the 4090) and given it a do-not-buy stamp.

The problem is every card from both AMD and Nvidia fits into that same pricing structure. You can't treat every card like it's the dud outlier. At some point you just have to accept that's how much they all cost these days and get on with it.
 
My 6600XT maxes out at 130W. This card is the 💩 I expected. Good to see reviewers suddenly "discovering" that the 6700 still exists, at least here in the U.S., and costs the same money. That's why I said this card was pointless when I saw the specs. It is AMD, so there might be a little fine wine in the tank, but nothing is going to save this turd.

Seems to get an extra 10% through an easy OC, which probably reduces power usage as well.

Just max all the sliders and keep lowering the voltage until it becomes unstable. Easy peasy. Seems to max out at around 2.9 GHz with 2.4 GHz memory (which, given they are 20 Gbps chips, is an underclock). The strict limits do suggest there might be an N33-based 7600XT coming, ideally with 16GB, but let's wait and see.
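The "keep lowering the voltage until it becomes unstable" procedure is essentially a linear search. A minimal sketch, where `run_stability_test` is a hypothetical stand-in for an actual stress test (a benchmark loop, a rendering workload, etc.):

```python
def find_min_stable_voltage(v_start, v_floor, step, run_stability_test):
    """Step the voltage down until the stability test fails,
    then return the last voltage that passed (None if even the
    starting voltage is unstable)."""
    v = v_start
    last_stable = None
    while v >= v_floor:
        if run_stability_test(v):
            last_stable = v
            v = round(v - step, 3)  # round to avoid float drift
        else:
            break
    return last_stable

# Toy example: pretend the card happens to be stable down to 1.05 V.
stable = find_min_stable_voltage(1.20, 0.90, 0.025, lambda v: v >= 1.05)
print(stable)  # 1.05
```

In practice you would also want to back off one step after the first failure and re-validate with a longer test, since marginal instability can take hours to show up.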
 
No we don't. We can refuse to acquiesce and not buy this garbage gen.
Especially with new old stock of RDNA2 still sitting on shelves. It's unclear if they will even discontinue it soon, because it's on essentially the same silicon as RDNA3.

At this point they should have made an actual "6600" die on TSMC 6nm priced at $200 MSRP, done a similar shrink of the 6750 XT priced at $300, and called it a day. This is a lot of work for products that don't move the needle, and this one won't really move it even at lower prices because of how much power it uses.

The 6600 is so great at $200 - low power, compact, quiet & capable.
 
As far as I can tell, this forum has raged at every card released this gen because the price/performance isn't good enough (other than perhaps the 4090) and given it a do-not-buy stamp.

The problem is every card from both AMD and Nvidia fits into that same pricing structure. You can't treat every card like it's the dud outlier. At some point you just have to accept that's how much they all cost these days and get on with it.

Sure. I can get on and buy a console instead.

dGPUs are a luxury I can live without, there are other alternatives. If I am going to get ripped off being in the market I will just exit it.
 
As far as I can tell, this forum has raged at every card released this gen because the price/performance isn't good enough (other than perhaps the 4090) and given it a do-not-buy stamp.

The problem is every card from both AMD and Nvidia fits into that same pricing structure. You can't treat every card like it's the dud outlier. At some point you just have to accept that's how much they all cost these days and get on with it.
Yeah, it's a crap generation the whole way around, just like the last time a crypto boom shot GPU prices to the moon in 2018 and raised the price baseline for the successive gen: the $150 GTX x50 tier became the GTX 1660 Super at $230, the $250 GTX x60 tier became the RTX 2060 at $350, the x80 tier that was $240 (RX 480) became the RX 5700 XT at $400, etc. It's why I bought last year, knowing the new gen was going to be a repeat of the trash value we saw in 2019 after the 2018 crypto boom.
 
My 6600XT maxes out at 130W. This card is the 💩 I expected. Good to see reviewers suddenly "discovering" the 6700 still exist, at least here in the U.S. and cost the same money. That's why I said this card was pointless when I saw the specs. It is AMD, so there might be a little fine wine in the tank, but nothing is going to save this turd.

Not anymore. Newegg appears to be running out of 6700s, but they do have one left. It is $304.
 
As far as I can tell, this forum has raged at every card released this gen because the price/performance isn't good enough (other than perhaps the 4090) and given it a do-not-buy stamp.

The problem is every card from both AMD and Nvidia fits into that same pricing structure. You can't treat every card like it's the dud outlier. At some point you just have to accept that's how much they all cost these days and get on with it.
This is the exact mentality nVidia and AMD want the consumer to have. And if the consumer continues to purchase GPUs with this lack of generational improvement and rising cost they will continue to milk every last penny from the consumer's backside. At some point the consumer has to take a stand.
 
That is my point: N33 and N32 should both have been monoliths. At least a monolith N32 would have the advantage of a lower BOM . As Navi33 is a about 15% smaller than Navi23, so a monolith N32 on 6nm might have been around 290mm² - a size where yields should still have been good - ergo chiplets don't really save anything especially once packaging costs and the higher power overhead are take into account.

If N32 ever gets released we might be able to compare, but to me chiplets make a lot of sense for bigger parts (and Navi31 was too small IMO) but far less for smaller parts. N32 just seems less risky as a monolith to me. With TSMC charging less for 6nm, an optimised RDNA3 port of N22 to 6nm able to sell for less with a bit better perf/watt, improved codec and display output plus re-worked RT sounds like a very sensible part.
This doesn't sound sensible at all.
What you want for N32 is the same thing AMD did with N33, and we know from reviews it's a flop: basically just a port of N22 to 6nm using the RDNA3 architecture.
There is no point in such a GPU when we already have N22.

A 5nm N32 with 60 CUs, 64MB IC, and 256-bit 16GB GDDR6 sounds much better to me than a 6nm version with 40 CUs, 96MB IC, and 192-bit 12GB GDDR6.
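For the two bus configurations being compared, peak memory bandwidth is simply bus width × per-pin data rate ÷ 8. The 18 Gbps speed below is an assumption for illustration; the post doesn't specify the memory speed for either config:

```python
def gddr6_bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: (bus width in bits / 8) * per-pin Gbps."""
    return bus_width_bits / 8 * data_rate_gbps

# Assuming 18 Gbps GDDR6 on both hypothetical N32 configs (illustrative).
for name, bus in [("5nm N32, 256-bit", 256), ("6nm N32, 192-bit", 192)]:
    print(f"{name}: {gddr6_bandwidth_gb_s(bus, 18):.0f} GB/s")
```

The 256-bit config has a third more raw bandwidth at the same memory speed, which is part of why it can get away with the smaller 64MB Infinity Cache.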
 