This is with DLSS 3 enabled? We don't even know how these actually perform, then. All vague hype and BS benchmarks along with horrible prices. Nice job, Nvidia.

The Valhalla numbers are telling..
Pretty sure that will be the real average performance gain over the 3090 Ti.
1.5x
That's because the company isn't focused on gaming anymore. Just look at their revenue split: HPC/AI overtook gaming revenue this year, and it's still growing. The moment Nvidia stops hyping up AI, its stock price falls. There's so much expected growth baked into the current stock price, and it's all predicated on Nvidia keeping its lead on the AI front. Hence why EVERY presentation needs to be stuffed to the gills with AI buzzwords.

What the hell is this presentation? It feels like an investor slide deck. Literally 4 minutes spent on the new video cards as they pertain to video games, and it's been over 50 minutes on robotics/datacenter/car computers and other abstract things. Am I missing something or am I crazy?
This time I think it's to clear inventory and will drop in time. AMD might price lower, but not by much; AMD wants a higher average selling price too.

Price is a gut punch. I literally felt my stomach sink when I saw them. If AMD follows suit, I don't think I can be an enthusiast anymore.
What the hell is this presentation? It feels like an investor slide deck. Literally 4 minutes spent on the new video cards as they pertain to video games, and it's been over 50 minutes on robotics/datacenter/car computers and other abstract things. Am I missing something or am I crazy?
Get with the times: interpolate!

Am I missing something or am I crazy?
Valhalla is an AMD-sponsored title that only has FSR, no DLSS. So the 4090 is indeed probably 50% faster than the 3090 Ti in pure rasterized games.

This is with DLSS 3 enabled? We don't even know how these actually perform, then. All vague hype and BS benchmarks along with horrible prices. Nice job, Nvidia.
100% agree. This pricing is just preliminary, and as usual the early adopters will have to pay up a bit.

This time I think it's to clear inventory and will drop in time. AMD might price lower, but not by much; AMD wants a higher average selling price too.
Either way it pays to wait and see what AMD does.
To make it even clearer:

The Valhalla numbers are telling..
Pretty sure that will be the real average performance gain over the 3090 Ti.
1.5x

OR, when Nvidia advertised that Lovelace can do 90 TFLOPs, that was for the full AD102 die.
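Rough back-of-envelope math for where a ~90 TFLOPs figure could come from (the core counts and clocks below are my assumptions, not numbers from the keynote):

```python
# FP32 throughput is conventionally quoted as shader count x 2 FLOPs/clock (FMA) x clock.
def fp32_tflops(cuda_cores: int, boost_ghz: float) -> float:
    return cuda_cores * 2 * boost_ghz / 1000.0

# Assumed full AD102 (18432 CUDA cores) at ~2.5 GHz vs. the cut-down 4090
# (16384 cores, ~2.52 GHz boost). Values are illustrative.
print(f"Full AD102: ~{fp32_tflops(18432, 2.50):.0f} TFLOPs")  # ~92
print(f"RTX 4090:  ~{fp32_tflops(16384, 2.52):.0f} TFLOPs")   # ~83
```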



RTX Remix-enabled modding is the thing I'm most excited about. This looks really promising.
What the hell is this presentation? It feels like an investor slide deck.

Beyond GeForce! Nvidia has a big datacenter business.
Exactly right. The same stunt was pulled by older TVs pretending to have high refresh rates when they were really just 60 Hz with frame interpolation. Some specifically had a game mode which bypassed this garbage to stop the extra input lag.

So it sounds like DLSS 3.0 is just there to falsely jack up frame rates by creating frames.
The Valhalla numbers are telling..
Pretty sure that will be the real average performance gain over the 3090 Ti.
1.5x
This is what we get this generation:

Agreed. The first 3 (RE: Village, AC: Valhalla, Div 2) are NOT on the DLSS 3 games list, so these are real performance gains.
The rest of the games are on the DLSS 3 list, and are thus potentially using the extra interpolated frames from DLSS 3.
The last 3 are a combination of improved RT HW and frame interpolation.
So it's clear they DID get major power savings by going from Samsung 8nm to TSMC 4N, but these top-end models have cranked the power to maximize gains. Looking at their power graph:
We don't have a Y axis, but if you walk it back a bit, a 3080 level of performance looks to be possible at 180-200 watts. And remember, a 3080 is STILL overkill for anyone gaming at the most popular resolution, 1080p. My 3070 pushes most games at 1080p, 240 Hz no problem. I'm excited to see mobile Ada Lovelace GPUs.
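For what it's worth, the "walk it back" estimate is just reading a target off a perf-vs-power curve. A minimal sketch of that, with made-up curve points since the slide has no Y axis:

```python
import numpy as np

# Hypothetical perf-vs-power points eyeballed off a curve shaped like Nvidia's slide
# (placeholder values, not real data). Performance is normalized so 1.0 = RTX 3080.
power_w  = np.array([150, 200, 250, 300, 350, 450])
rel_perf = np.array([0.70, 1.05, 1.30, 1.45, 1.55, 1.65])

# "Walking it back": find the power where the curve crosses 3080-level performance.
est_power = np.interp(1.0, rel_perf, power_w)
print(f"~{est_power:.0f} W for 3080-class performance (illustrative only)")
```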
lol @ imaginary frames. The main issue is that the input lag will still be based on the key frames, since those are the only ones being generated at the whim of what the user is doing. In theory, Nvidia could generate as many intermediate frames as they'd like, so if there is zero motion Nvidia could have advertised something like a 10x fps boost if the GPU had enough horsepower to do it. Of course, no one would want to play a game at a key frame rate of 10 fps artificially boosted to 100 fps (i.e. 90% of your frames aren't key frames) because the input lag would be atrocious. Leave it to Nvidia marketing to push for higher frame rates when it will now take even more technical know-how to discern that not all frames are created equal.

Exactly right. The same stunt was pulled by older TVs pretending to have high refresh rates when they were really just 60 Hz with frame interpolation. Some specifically had a game mode which bypassed this garbage to stop the extra input lag.
But I mean if customers have accepted imaginary resolutions ("4K" DLSS) as a feature, the next logical step is imaginary frames. "AI"...anal idiocy more like it.
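Putting the 10 fps boosted to 100 fps example above into concrete numbers (just the arithmetic, nothing DLSS-specific):

```python
# Interpolation multiplies the *displayed* frame rate, but inputs are only sampled
# when a real (key) frame is rendered, so that slice of the latency doesn't shrink.
def frame_intervals_ms(key_fps: float, interp_factor: int) -> tuple[float, float]:
    displayed_fps = key_fps * interp_factor
    return 1000.0 / key_fps, 1000.0 / displayed_fps

key_ms, shown_ms = frame_intervals_ms(key_fps=10, interp_factor=10)
print(f"New frame on screen every {shown_ms:.0f} ms (looks like 100 fps)")
print(f"Input reflected only every {key_ms:.0f} ms (still feels like 10 fps)")
```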
And a $900 furnace with only 12GB...LMFAO.
The only possible redemption here is OoOE (out-of-order execution), as long as it transparently works at the hardware level like it does on CPUs and doesn't require driver shader substitution.
But I mean if customers have accepted imaginary resolutions ("4K" DLSS) as a feature, the next logical step is imaginary frames. "AI"...anal idiocy more like it.
This is what we get this generation:
3090 Ti beating the 4080 in raster games
First xx80 Nvidia GPU with a 192-bit bus, true innovation from Jensen 😆 (for $899 at that!)
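For context on why the 192-bit bus stings, memory bandwidth is just bus width times effective data rate. A quick sketch, assuming 21 Gbps GDDR6X on both cards:

```python
# Bandwidth (GB/s) = (bus width in bits / 8 bits per byte) x effective data rate (Gbps per pin)
def mem_bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

print(f"4080 12GB (192-bit): {mem_bandwidth_gb_s(192, 21):.0f} GB/s")  # 504
print(f"3090 Ti   (384-bit): {mem_bandwidth_gb_s(384, 21):.0f} GB/s")  # 1008
```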