I can see it now, "DLSS4.0, we take an 8k native path traced image and with the magic of AI, turn it into a blurry artifact ridden mess and render it at 1000fps!"
My apologies if this was already posted (it's new product season, so AT Forums is more active), but looks like Nvidia gave some more info regarding DLSS 3.0:
NVIDIA shows GeForce RTX 4090 running 2850 MHz GPU clock at stock settings - VideoCardz.com
NVIDIA RTX 4090 2.85 GHz at stock, less power use with DLSS3: "At yesterday's NVIDIA Editors Day for RTX 40 series, the company demonstrated its RTX 4090 graphics card in a short gameplay demo. Wccftech, who took part in this press briefing, posted screenshots and videos of what was shown..." (videocardz.com)
View attachment 67963
Latency sees a reduction going from what Nvidia calls Native to DLSS 3.0, but who knows if that's just from not enabling Reflex for Native and turning it on for DLSS 3.0 to make it look like you get a reduction. Also, where is DLSS 2 in this comparison? Those of us with Ampere won't be able to take advantage of DLSS 3, so I'd like to know how much of a benefit comes from DLSS 3.0 alone, specifically from the Frame Generation feature. If the DLSS 2 results look exactly like the DLSS 3 results except that the frame rates are simply halved, then we all know what's going on: the higher frame rate comes purely from inserting those additional "fake" frames.
If you do the numbers, you get ~70% of the fps of the real 1440p rate (170/2). Obviously DLSS is running at a lower resolution, therefore latency will be low.
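To put rough numbers on the halving argument (the 170 fps figure is just the one quoted above, and the clean one-generated-frame-per-rendered-frame split is an assumption on my part):

```python
# Back-of-the-envelope: displayed vs. internally rendered frame rate with
# DLSS 3 style frame generation (one generated frame per rendered frame).

displayed_fps = 170            # the figure quoted above
gen_per_rendered = 1           # assumption: 1 generated frame per real frame

rendered_fps = displayed_fps / (1 + gen_per_rendered)
print(f"Displayed: {displayed_fps} fps ({1000 / displayed_fps:.1f} ms per frame)")
print(f"Rendered:  {rendered_fps:.0f} fps ({1000 / rendered_fps:.1f} ms per frame)")
# Input is only sampled for the rendered frames, so responsiveness tracks
# the ~85 fps figure, not the ~170 fps the counter shows.
```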
Our fault really, for accepting the crappy visuals of Minecraft and similar games and making them hugely popular.
I know LTT typically tries to keep companies happy (for obvious reasons), but this video really sounds like an apologist's video. Sure, he mentions how the 4080 12GB should probably have been a 4070, but he goes on in several cases about how we should be happy that the cards are as cheap as they are?!
I wouldn't mind a little increase here and there, but it's like they're working their way down the stack, bumping things to the tier above's pricing. First it was the 80Ti models jumping to Titan prices, then the 80 models jumping to 80Ti, and now the renamed 4070 jumping 2 tiers in price. What's next, a 4060 next year for $550?

That was really pathetic. Saying Nvidia's not being greedy, that EVGA leaving the business is proof margins are tight. Yeah, margins for AIBs sound tight. Not margins for Nvidia, with the ridiculous price inflation of their cards ever since Turing.
What's next, a 4060 next year for $550?
Yes.
Some devious marketing here. You will get the same latency with DLSS 2, without the motion errors induced by sudden changes.
At the current prices it seems like the 4070 will be like $700-750, so a $550-600 4060 would be just about right on.
At some point virtual frame generation will still be limited by the underlying base frame rate, no? Even if DLSS 4.0 was super duper good at frame generation and could project 9 virtual frames for every 1 key frame, your base frame rate needs to be sufficiently high or else your inputs won't get polled frequently enough. It's not as straightforward as cranking up all the graphics settings, ending up with a base frame rate of 10 fps, and then justifying it via frame generation to boost it to 100 fps. It will be a bad gameplay experience. The only way I can see them fixing this is if DLSS also takes in user input in between rendered frames so that the motion vectors are constantly adjusted on the fly. This is basically what is already done in VR with timewarp techniques... Ah crap, knowing that this technique already exists but is not mass-marketed leads me to think this will be the direction they take, and then everyone will think Nvidia invented it, like RT...

DLSS 4 will probably be DLSS 3 done right, just as DLSS 2 was DLSS 1 done right. Regardless of current deficiencies, virtual frame generation is the future and will be perfected eventually. Open question of whether Ada even gets DLSS 4 support given the precedent set by DLSS 3, though.
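For anyone curious what "takes in user input in between rendered frames" could look like, here's a toy timewarp-style sketch. It's how VR reprojection behaves in spirit, not how DLSS 3 actually works, and every name and number in it is made up:

```python
import random

DISPLAY_HZ = 120           # display refresh rate
RENDER_HZ = 30             # base rate the engine can actually render at
STEP = DISPLAY_HZ // RENDER_HZ

def poll_input():
    """Stand-in for reading the latest mouse delta each display refresh."""
    return random.uniform(-1.0, 1.0)

def render_scene(camera_yaw):
    """Stand-in for the expensive full render, done at the base rate only."""
    return {"yaw": camera_yaw, "pixels": "..."}

def reproject(frame, camera_yaw):
    """Timewarp-style step: reuse the last rendered frame, but shift it to
    the camera orientation taken from the newest input sample."""
    return {"yaw": camera_yaw, "pixels": frame["pixels"], "reprojected": True}

camera_yaw = 0.0
last_frame = render_scene(camera_yaw)
presented = []

for refresh in range(DISPLAY_HZ):                # simulate one second
    camera_yaw += poll_input()                   # input sampled every refresh
    if refresh % STEP == 0:
        last_frame = render_scene(camera_yaw)    # 30 real frames per second
        presented.append(last_frame)
    else:
        presented.append(reproject(last_frame, camera_yaw))  # 90 cheap frames

reprojected = sum(1 for f in presented if f.get("reprojected"))
print(f"{len(presented) - reprojected} rendered, {reprojected} reprojected, "
      f"but all {len(presented)} frames used fresh input")
```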
Given the huge gap between the 4080s, they could be only minor rip-off prices: $600-650 for the 4070.
I found some extra Nvidia slides regarding DLSS 3... Looks like there is only a small reduction in latency going from DLSS 2 to DLSS 3, even with frame generation enabled, solely because Reflex is integrated into DLSS 3. If it weren't, I suspect the latency between DLSS 2 and DLSS 3 would be identical:
View attachment 67973
View attachment 67974
View attachment 67975
View attachment 67976
You will actually get better latency with DLSS 2, since DLSS 3 does not grant a 100% speedup outside of certain fringe 100% CPU limited scenarios.
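A crude way to model why (the numbers and the half-frame hold-back are my assumptions, not anything from Nvidia's slides): interpolation needs the next rendered frame in hand before it can show the in-between one, and generating frames isn't free, so the rendered rate drops a little versus DLSS 2 alone.

```python
# Toy latency comparison; every number here is illustrative.
# Assumptions: (1) running frame generation costs some GPU time, so the
# internally rendered rate drops a bit versus DLSS 2 alone, and (2) the
# interpolation holds each rendered frame back ~half a rendered-frame
# interval so the generated frame can be shown in between.

def frametime_ms(fps: float) -> float:
    return 1000.0 / fps

dlss2_fps = 90.0                         # rendered fps with upscaling only
fg_cost = 0.90                           # assumed frame-generation overhead
dlss3_fps = dlss2_fps * fg_cost          # what actually gets rendered
dlss3_displayed = dlss3_fps * 2          # what the fps counter shows

dlss2_latency = frametime_ms(dlss2_fps)
dlss3_latency = frametime_ms(dlss3_fps) * 1.5   # one frame time + hold-back

print(f"DLSS 2: {dlss2_fps:.0f} fps rendered, ~{dlss2_latency:.1f} ms")
print(f"DLSS 3: {dlss3_displayed:.0f} fps displayed ({dlss3_fps:.0f} rendered), "
      f"~{dlss3_latency:.1f} ms")
```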
At some point virtual frame generation will still be limited by the underlying base frame rate, no?
No. It would need much tighter integration, but there's no reason you can't switch off just the rendering loop for virtual frames and leave everything else intact. Or, depending on your engine, you might run a subset of the game loop or store inputs in a buffer and reverse predict. And there are definitely other integration approaches I haven't thought of. Heck, there's no fundamental reason you can't completely decouple rendering, except that game engines don't tend to be built that way today. Might make more sense in a future where virtual frame insertion is the norm though.
When I think of "perfected" virtual frame insertion the first thing that comes to mind is a major reduction in artifacting and second, an increase to performance to the point where the cost to base framerate is light-to-negligible. It should be possible to insert an arbitrary number of virtual frames, and also to shut off frame generation entirely based on context. It should be possible to integrate loosely, where the game loop is shut down during virtual frames, or tightly, where the engine can still run during virtual frames but won't render.
This is the future.
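To make the loose vs. tight integration idea concrete, here's a rough sketch of what a "tightly integrated" loop could look like. None of this is how any shipping engine or DLSS actually hooks in; it's just the shape of the argument: input and game logic keep ticking every display frame, the expensive render only runs on key frames, an arbitrary number of generated frames fill the gaps, and generation can be switched off by context.

```python
# Rough sketch of a "tightly integrated" loop: input and game logic run
# every display frame, the expensive render runs only on key frames, and
# the frames in between are generated. All of this structure is hypothetical.

VIRTUAL_PER_KEY = 3                      # arbitrary N generated frames per rendered one

class Game:
    def __init__(self):
        self.frame_index = 0
        self.generation_enabled = True   # could be toggled off by context (menus, cutscenes)
        self.last_key_frame = None

    def poll_input(self):
        pass                        # read mouse/controller state

    def update(self, dt):
        pass                        # simulation, animation, UI -- runs every frame

    def render(self):
        return "rendered frame"     # full-cost render

    def generate(self, key_frame):
        return "generated frame"    # cheap interpolated/extrapolated frame

    def tick(self, dt):
        self.poll_input()           # responsiveness is tied to this, not to render()
        self.update(dt)
        is_key = self.frame_index % (VIRTUAL_PER_KEY + 1) == 0
        if is_key or not self.generation_enabled:
            self.last_key_frame = self.render()
            out = self.last_key_frame
        else:
            out = self.generate(self.last_key_frame)
        self.frame_index += 1
        return out

game = Game()
frames = [game.tick(dt=1 / 240) for _ in range(240)]   # one simulated second at 240 Hz
print(frames.count("rendered frame"), "rendered,", frames.count("generated frame"), "generated")
```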
It's not the future. It sounds like you are drinking the NVidia Kool-aid.
The reactivity can NEVER be better than the real base framerate.
It's very disingenuous to sell this as a real frame rate like NVidia is doing.
By thinking in terms of a "real base framerate" you're missing the picture altogether. Games today mostly run 1-to-1 logic and rendering, but it's not a fundamental constraint.
Exactly, but don't give ideas to NVidia. Otherwise it will be called GSYNC 3.0. You could literally add this functionality to the monitor, just like it's in most TVs, which should make it clear that the real base frame rate is the only thing that matters.
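Half joking, but the TV comparison is easy to make concrete. A strawman of display-side interpolation with no motion vectors and no knowledge of the game (pure illustration, not what G-Sync, DLSS, or any actual TV chip does):

```python
import numpy as np

def interpolate(frame_a: np.ndarray, frame_b: np.ndarray, t: float) -> np.ndarray:
    """Blind linear blend between two already-displayed frames, t in [0, 1].
    The display has no motion vectors and, crucially, no access to input,
    so responsiveness is unchanged no matter how many in-between frames it shows."""
    a = frame_a.astype(np.float32)
    b = frame_b.astype(np.float32)
    return ((1.0 - t) * a + t * b).astype(frame_a.dtype)

prev_frame = np.zeros((720, 1280, 3), dtype=np.uint8)       # dummy dark frame
next_frame = np.full((720, 1280, 3), 255, dtype=np.uint8)   # dummy bright frame
halfway = interpolate(prev_frame, next_frame, 0.5)          # the "fake" frame
print(halfway.shape, halfway[0, 0])                         # (720, 1280, 3) [127 127 127]
```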