Isn't Zen 3 in the CPU performance lead lately? It might be annoying for Nvidia to use AMD CPUs in reference testing, but more FPS is more FPS? Heck, AMD used Intel CPUs when showcasing their GPUs during the Bulldozer era, no big deal.
With CPU core counts increasing, there is actually more horsepower to extract from CPUs. Nvidia was already doing this kind of multithreaded processing back in the DX11 era.
Too much drama. I like both DLSS and RT, and I support NV for pushing technology forward instead of making the safe choice of more and more classic rasterization performance.
My point was that no matter what they could bench with, Intel sells the most x86 CPUs, so Ampere is going to be paired with Intel a lot. Maybe even in the vast majority of cases once we count prebuilt PCs (basically the only way to buy RTX cards right now)... And you have to make sure you perform well on the most widely used brand, right? It would “come out” if you could use value CPUs with the competition and still get solid benchmark results.
In other words, if you have a GPU that needs a high-performance CPU to perform, and the main supplier of CPUs (Intel) has been busy counting their money and buying back stock for years and years instead of pushing the performance envelope, wouldn’t you see that dependence as a risk? One that might severely limit the upside of your highest-priced, highest-margin SKUs?
As Mopetar says, and I am taking some liberties here, if Ampere is really a server/AI part masquerading as a GPU for the masses as well, it's likely that the hardware scheduler (or whatever is limiting performance with lower-powered CPUs) is a non-factor for those workloads, or something that makes it a non-issue for their highest-margin install base. For gamers, though, it's wasted silicon.
That's the part that is relevant to the main issue this thread was created to discuss.
Secondarily, it is my opinion that there is a ton of silicon on the vast majority of Ampere cards sold to gamers like us that is unlikely to see heavy service in their relevant lifetimes. Given that RT effects are likely going to target the biggest install base (consoles), the PC hardware of a generation or two from now should be able to handle them exceptionally well without scaling, IMO. We will just have to see how it plays out. And hopefully, with crypto GPU mining behind us, prices should be affordable if Nvidia and AMD want to move new cards in volume, because the earth should be littered with RDNA 1 cards, low-priced 3060s, and cards like them going for cheap as they get liquidated. That holds even if the next generation is just a GPU refresh for the miners (worst-case scenario, IMO).
To me it plays out differently if MS updates the Series X in 2-3 years with 2-4x the GPU/RT horsepower. I don’t know what the uptake rate would be on that (sign me up, though), but it would naturally push a sliding scale of RT effects into more titles (to support Xbox Series S through SX through SXXX) sooner rather than later, given the relative parity of the SX and PS5 now.
DLSS and RT drama aside, it’s really not that big of a deal, because gamer adoption is relatively low: so many Ampere GPUs sold so far are being used for compute. I would love to know the ratio of gaming to mining, and if an Ampere owner has a card, how many they have. I’d bet decent money most of the cards are in the hands of a small minority of the owning population. Different topic, but I typed it out, so I am leaving it for later. Imagine the install base if it was 1 card sold to 1 gamer. What a paradise it would be! 😂