
Discussion: Nvidia Blackwell in Q1-2025

I'd assume it's some dream perf? 5060 = 7800 XT? 5070 = 7900 XTX? 🤡
Could just be looking at some ideal RT benchmark. The 5070 has 80% of the shaders a 4070 Ti does, so it's unlikely to perform as well based on what we've seen so far. Those charts also group the 7900 XTX with the 4070 Ti instead of the 4070 Ti Super or 4080, though, so they're obviously not grouping strictly by raster performance.
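For what it's worth, the "80% of the shaders" figure checks out if you plug in the commonly reported CUDA core counts. Treat the numbers below as assumptions from published spec sheets, not anything stated in the post:

```python
# Commonly reported CUDA core counts (assumed figures):
cores = {
    "RTX 5070": 6144,
    "RTX 4070 Ti": 7680,
}

# Ratio of 5070 shaders to 4070 Ti shaders.
ratio = cores["RTX 5070"] / cores["RTX 4070 Ti"]
print(f"5070 / 4070 Ti shader ratio: {ratio:.0%}")  # prints "80%"
```

Of course, shader count alone ignores clocks and architectural changes, so it's only a rough ceiling on relative performance.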
 
No one shopping for a 5090 is going to settle for a 9070 XT; they will wait until they can get the 5090. Someone interested in the 5070 or 5070 Ti, however, is far more likely to cross-shop AMD if they can't find the Nvidia product in stock.
Wrongo!
I was going to get a 5090 or 5080, but if supply remains scarce and AMD delivers on perf/price and is actually available, I'll buy a 9070 XT instead.
My 3070 Ti is too old, and I needs me a replacement!
 
If you still have doubts that gamers have become second-class customers for Nvidia, then the terrible state of the drivers should squash them. Pepperidge Farm probably remembers when Nvidia drivers were this bad, but I don't. WTF is going on, are the code monkeys letting ChatGPT write the drivers for them? 😛
 
Yeah ok... but come on... who would put Fortnite into performance mode with a 5080? That's what I do with my RX 6600 via Thunderbolt and a laptop to get 60 Hz at 1080p 🙂

Latency is king in competitive multiplayer games, where lower is always better. Not to mention that performance mode usually strips out a lot of stuff you don't want in a competitive match anyway, like depth of field, fog, and motion blur.
 
Hmm, maybe those rumors that Nvidia wasn't working so hard with game devs last year and not keeping up with the upscaling/RT tech support due to an AI focus were true. Nah, must have been devs selling out to a competitor and harming gamers as a whole.
 
Most of the competitive Fortnite players use performance mode. Even with high end GPUs.

I've watched competitive Overwatch and now Rivals streamers run at low settings to make sure they are hitting 300+ fps, especially if they have a 240 Hz monitor. Overwatch these days can probably hit 400-500 fps on some high-end setups.
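The logic behind chasing fps well past the monitor's refresh rate is just frame-time arithmetic: each extra frame per second shrinks how stale the newest frame can be when the display refreshes. A minimal sketch:

```python
def frame_time_ms(fps: float) -> float:
    """Time to render one frame, in milliseconds."""
    return 1000.0 / fps

# At 300 fps the newest frame is at most ~3.3 ms old at each refresh,
# versus ~16.7 ms at 60 fps, even on the same 240 Hz panel.
for fps in (60, 144, 240, 300, 500):
    print(f"{fps:>4} fps -> {frame_time_ms(fps):5.2f} ms per frame")
```

This ignores the rest of the input-to-photon pipeline (input sampling, display scanout), so it's a lower bound on latency savings, not a full latency model.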
 
It reminds me of the crypto days. Didn't AIBs start upping their prices when scalping went nuts?
Less so. Some got caught reselling on eBay, and the move to new models (LHR cards, 3070 Ti, 3080 12GB & Ti, 3090 Ti) let them reset the value proposition. They also prioritized higher-margin SKUs.
I don't remember them outright cranking up the price of the same model by hundreds on their own webstore though.
 
Even 7900 XTX are sold out online in the US. We're in another GPU shortage here, I think. But there are too many possible causes (tariff panic buying, channel clearing, disappointing 5000 series) to know if it'll last.
I think in AMD's case it's probably a combination of those factors. I would imagine they were hoping to clear stock for the 9070/XT anyway.
 
Yeah, only one cause: Nvidia and AMD don't care about gamers; they will use almost all of their silicon for AI chips.
They are launching new products pretending they care, but they don't.
 
Nvidia using 5nm is a black mark on their launch. They should have increased the price. It is embarrassing that a company as rich as Nvidia is cutting corners and sabotaging themselves. The overclocking headroom of these cards shows they are being held back by 5nm. This type of cheaping out is what led to the downfall of Intel.

It's crazy that these things are still on 5nm.
 
Many parts slated for 2024 were originally designed for N3B; only Apple and Intel ultimately used it.
So it's possible Blackwell was a backport. We already know Zen 5 was.
 