
Speculation: RDNA3 + CDNA2 Architectures Thread

Nvidia's power draw is ludicrous, never mind the power connector debacle.

Given the huge performance advantage that RTX 4000 delivers (especially at 4K), the huge power draw may be necessary.

People want to have their cake and eat it too, I understand that. But if that kind of performance were easy to achieve at lower power draw, then Nvidia would have done it, never mind AMD.
 
Because they didn't go for ludicrous power consumption, performance may seem a bit underwhelming; that is the impression I got. Actual clocks are nothing like "the rumors".
At least it'll be possible to overclock reasonably, right?
At 350 watts the RTX 4090 loses about 5% compared to when it’s not power capped. The RTX 4090 is also very power efficient for the performance it delivers.

Hopefully, AIBs will overclock RDNA3
 
So where will N32 land in performance then?

The 7900XT has about 43 TFLOPS, which is quite a cut from the 7900XTX, but I presume that with a 10% lower price, performance might not be that far off; maybe 15% less, give or take, otherwise the pricing gap would be larger (or it is being used to push people to buy the XTX instead).

So N32 at, say, 2.5 GHz would be 38.4 TFLOPS, but with less bandwidth and cache than the 7900XT. I still think 30% more than the 6900XT is on the cards for full N32, but it does mean that the 7700XT is probably close to 6950XT performance, maybe +10%, and then the 7600XT might be more like the 6800 / 6800XT at 1080p rather than the rumored 6900XT perf.
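For anyone checking the math above: peak FP32 throughput is just ALU count × 2 ops per clock (counting an FMA as two operations) × clock speed. A minimal sketch, assuming the rumored RDNA3 shader configurations (84 CUs × 128 dual-issue FP32 lanes for the 7900XT, 60 CUs × 128 for full N32; these counts are assumptions, not confirmed specs):

```python
# Rough peak-FP32 estimate used in the post above:
#   TFLOPS = FP32 lanes x 2 ops/clock (FMA) x clock (GHz) / 1000
# Lane counts below are rumored RDNA3 configs (assumption, not confirmed).

def tflops(fp32_lanes: int, clock_ghz: float) -> float:
    """Peak FP32 TFLOPS, counting a fused multiply-add as 2 ops."""
    return fp32_lanes * 2 * clock_ghz / 1000

# 7900XT: 84 CUs x 128 dual-issue FP32 lanes = 10752, at ~2.0 GHz boost
print(round(tflops(10752, 2.0), 1))  # ~43.0, the figure quoted above

# Full N32: 60 CUs x 128 = 7680 lanes, at a hypothetical 2.5 GHz
print(round(tflops(7680, 2.5), 1))   # 38.4, as computed in the post
```

Note this is a theoretical peak; with RDNA3's dual-issue being compiler-dependent, delivered game performance won't scale linearly with these numbers.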
 
So do you reckon AMD's pricing will force Nvidia to lower its pricing a notch down the road? Based on specs it seems the XTX might be on par with the 4090, just like the 6900 was with the 3090... The price difference, though, is now at 700 USD; that's quite a lot. Even if Nvidia's stuff is usable outside of gaming, which is the only reason I am locked to them. Unfortunately.
 
At 350 watts the RTX 4090 loses about 5% compared to when it’s not power capped. The RTX 4090 is also very power efficient for the performance it delivers.

Hopefully, AIBs will overclock RDNA3

Is that true for RT? TPU tested power draw in raster and in RT, and with RT on it consumes a lot more power than in pure raster. Cutting the power draw may not make much difference raster-wise, but it will probably make a bigger RT difference.
 
At 350 watts the RTX 4090 loses about 5% compared to when it’s not power capped. The RTX 4090 is also very power efficient for the performance it delivers.

Hopefully, AIBs will overclock RDNA3

"But with RT on?"

Since RT performance is all that matters, and its power usage is super swol during RT usage scenarios, are you sure that's all that it loses? Seems like internet magic math to me.
 
Feeling better and better about this 6800xt purchase...

I admit that, given what we had seen on paper in terms of specs, I was anticipating a bit more. I get that doing the first chiplet GPU is going to have some growing pains, and the whole effort did feel like AMD would have liked another 6 months to dial everything in. Granted, given some of the issues NV is having, it feels like this whole gen could have used 6 more months in the oven, but oh well.

After basically catching NV with the RDNA 2 in all but RT, I was honestly expecting AMD to try and go for the kill at this point. 50-70% gains gen over gen ain't *bad*, but NV managed to pull off 70-90% gains (and what did it cost them, everything) on not even their top end part.

I do wonder if AMD has a mid-gen refresh part in the wings that leverages the chiplet design a bit more, or the rumored v-cache part (which simply might not exist), or who knows what.

Anyhow, will be nice to see these things in more of a head to head match-up with NV parts and what the new "stack" looks like.

Any thoughts on how this is going to shake out down to stuff like the N33 die?
 
Also, the flagship AMD card is not an 8K card at all for AAA games at 60 FPS. This is what Nvidia did with the 3090. Come on, marketing.
 
When are the reviews coming out? At the $999 price, my guess is the XTX is 70% of the 4090's raster perf and 50% of its RT perf. Which is not that bad considering it slurps just 355W.
 
Honestly, AMD somehow made the 4090 look more attractive at $1500. For 50% more you get a better cooler, CUDA, DLSS, looking like >25% better raster (at 350W), and looking like >2.5x better RT.
It's $1600. And we'll see about raster performance. I expect 4090 is about 25% up at stock configuration. Better is subjective as I don't want a new power supply nor case. Nor do I want to normalize $1600 450W GPUs even if it has some cool features.

Then I remember Nvidia wants $1200 for the AD103. Maybe AD103 is better than its specs let on...
 
It's $1600. And we'll see about raster performance. I expect 4090 is about 25% up at stock configuration. Better is subjective as I don't want a new power supply nor case. Nor do I want to normalize $1600 450W GPUs even if it has some cool features.

If AMD's numbers shown hold true, the 4090 should only be about 10% faster than the 7900XTX in raster.
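The thread keeps switching between "X% of the other card's performance" and "Y% faster", and the two aren't symmetric, which is where some of these estimates diverge. A quick sketch of the conversion (pure arithmetic, no hardware data assumed):

```python
# Converting "card A delivers fraction p of card B's performance"
# into "B is f% faster than A". The two framings are not symmetric:
# 90% of B's perf means B is ~11% faster, not 10%.

def faster_by(fraction_of_b: float) -> float:
    """If A delivers fraction_of_b of B's performance, B is this much faster."""
    return 1 / fraction_of_b - 1

print(round(faster_by(0.70) * 100))  # 43: "70% of the 4090" -> 4090 ~43% faster
print(round(faster_by(0.90) * 100))  # 11: "~90% of its perf" -> ~11% faster
```

So the "70% of the 4090" guess above and the "only about 10% faster" reading here are quite far apart, not minor variations of the same estimate.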
 
Performance seems a bit disappointing because they said "fastest gaming card". But I forgot, they'll release a card with two GCDs; with that one they have the opportunity to get on top. Or won't that be enough?
 