When AMD prices too close to Nvidia based on raster performance, everyone screams that they need to lower prices because their RT and feature set can't compete... "they'll never gain market share", "disrupt the market", etc etc.
AMD come in undercutting Nvidia significantly:
"it must be even slower than the 4080 in rasterization"
"it should be renamed and dropped to $949"
You guys are funny.
As for the chip/architecture itself: the only thing that's "wrong" is the RT performance. Yet everyone's fixated on the clock speeds not being through the roof and it not beating the s**t out of the 4090 (even at a mere 355W), and therefore it must have been botched. It's a Fermi, it's an R520...
Hello? Since when is a 50% increase in perf/watt and a 60% increase in performance over a predecessor "botched"?
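Those two figures even hang together arithmetically. Perf/watt is just performance divided by power, so the claimed gains imply only a modest power increase gen-on-gen. A quick sanity check (the 1.6x and 1.5x ratios are the figures quoted above; the implied power delta is derived, not a measured value):

```python
# Sanity check: do +60% performance and +50% perf/watt cohere?
perf_gain = 1.60           # ~60% more performance than the predecessor
perf_per_watt_gain = 1.50  # ~50% better performance per watt

# perf/watt = perf / power  =>  power ratio = perf ratio / (perf/watt ratio)
implied_power_ratio = perf_gain / perf_per_watt_gain

print(f"Implied board-power increase: {(implied_power_ratio - 1) * 100:.0f}%")
```

So the numbers imply roughly a 7% bump in power draw for a 60% performance gain, which is the opposite of a botched design.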
It's still a huge uplift over RDNA2 at the end of the day. It's also the first-gen chiplet architecture, which has no doubt presented a host of challenges, and wouldn't come without some compromise.
Comparing to Nvidia's gen-on-gen: they've gone from an inferior Samsung 8nm process to a superior custom '4nm' process, so you can't draw any parallels there either. It was always going to be a challenge to maintain the status quo with Nvidia this gen because of that fact.
Bit of a reality check, people. Raster perf and perf/watt are looking fine. Not amazing, not matching the random rumors started by morons, sure, but all things in the real world considered... fine.
RT... yeah, it'd be interesting to discuss the "whys" around this, because regardless of what personal importance you put on it, it's becoming more and more heavily weighted in reviews. I can't tell whether AMD consciously gave it a low priority with the nature of the changes made, or whether its performance is unusually low for the resources on tap. I'm struggling with this a bit as I don't fully understand the bottlenecks.