A followup on the Big DLSS vs FSR comparison. This time seeking to answer the "is it better than Native" question.
The answer is sometimes...
You could argue the newest DLSS is better than native on average. It's only by including old versions in older games that it becomes a tie. Even then there is a fix, as you can update the DLSS version in that game if you really care, and there are programs that exist to make this easy. Pretty impressive considering the amount of nerd rage that still exists on this forum against using it.
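For reference, "updating the DLSS in that game" is just a file swap: DLSS titles ship an nvngx_dlss.dll that can be replaced with a newer build, and tools like DLSS Swapper automate exactly this. A minimal sketch of the manual version, using a throwaway directory and stand-in files rather than a real game folder or real DLLs:

```python
# Sketch of the manual DLSS DLL swap that tools like DLSS Swapper automate.
# Uses a throwaway directory; in practice game_dir would be the game's
# install folder, which ships its own nvngx_dlss.dll.
import shutil
import tempfile
from pathlib import Path

game_dir = Path(tempfile.mkdtemp())
(game_dir / "nvngx_dlss.dll").write_text("dlss 2.3")  # stand-in for the shipped DLL
newer_dll = game_dir / "newer_nvngx_dlss.dll"
newer_dll.write_text("dlss 3.7")                      # stand-in for a newer DLL

# Back up the original, then drop the newer DLL in under the expected name.
shutil.copy2(game_dir / "nvngx_dlss.dll", game_dir / "nvngx_dlss.dll.bak")
shutil.copy2(newer_dll, game_dir / "nvngx_dlss.dll")

print((game_dir / "nvngx_dlss.dll").read_text())      # dlss 3.7
print((game_dir / "nvngx_dlss.dll.bak").read_text())  # dlss 2.3
```

The backup step matters: some games or anti-cheat systems reject a swapped DLL, so you want a way back to the shipped version.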
If DLSS looks better than "native+TAA" then this means that the game is really badly optimized, particularly in the TAA algorithm.
I think a lot of the "nerd" rage is really NVidia rage, which is very common on this forum.
A lot of NVIDIA's wounds lately have been self-inflicted; it's like they're trying to piss people off. You can't blame people for getting upset with them. Even brand loyalty only lasts so long. Call it NVIDIA rage if you want, but NVIDIA only has to look in the mirror to find out why.
It seems almost entirely because NVidia charges more than AMD.
But the reality is everyone in the lead charges more. When the AMD Ryzen 5600X beat Intel at gaming for the first time, they basically shelved the cheaper 5600 model until Intel came back with a competitive product.
That effectively meant the price of a 6-core AMD CPU went from $200 to $300 in one generation (a 50% increase), and not a peep from the same crowd that wants to lynch NVidia when a GPU goes from $500 to $600 (a 20% increase).
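For anyone who wants to check the math on those two jumps:

```python
# Percent increases for the price jumps cited above.
def pct_increase(old: float, new: float) -> float:
    return (new - old) * 100 / old

print(pct_increase(200, 300))  # 50.0 -- 6-core AMD CPU, $200 -> $300
print(pct_increase(500, 600))  # 20.0 -- GPU, $500 -> $600
```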
None of these companies are your buddy. They will ALL price to maximize their profits, and without effective competition, they will ALL take fatter margins. ALL of them.
It's fine to grumble about pricing, but when you just view everything through sour grapes, then your opinions are questionable.
DLSS is very good technology. Often simultaneously delivering improved performance and visuals.
That's not true and you know it. RDNA 3 is hardly a value proposition. People are unhappy with NVIDIA largely because of their VRAM shenanigans. The naming nonsense is just the cherry on top. In an odd way, if NVIDIA hadn't been forced to "unlaunch" the 4080 12GB that wouldn't have been a thing.
4070 Ti -> 4080 12GB
4070 -> 4070 Ti
If the 4070 was the 4070 Ti at $600 there would probably be less noise. At least then the names would make sense, even if the performance didn't.
As for the 5600, not sure where you're getting that. Zen 3 was selling so well that the non-X models didn't come out until, it looks like, eight months after the launch of Zen 3. By then there was ample supply. I think you are remembering wrong.
Seriously, the renaming of 4070 Ti isn't something anyone who doesn't already have a massive chip on their shoulder about NVidia would get mad about.
It makes Nvidia look kind of lame, something to laugh at, not to get mad about.
I'm not remembering wrong. AMD finally beat Intel, and they could finally charge more, so they did.
Intel was ahead in gaming when the 3600 and 3600X came out together at $200 and $250.
Zen 3 finally beat Intel at gaming, so 5600X came out alone at $300. A massive increase for 6 cores.
It was only after Intel came back with 12th gen and AMD lost the upper hand that the better-value 5600 was released.
5600X was released Nov 2020, 5600 was released April 2022. A lot more than 8 months.
While there is nothing wrong with charging a premium when you have the top product, it's silly to pretend that only Intel and NVidia do this.
All companies want to be selling the most premium product, so they can charge the biggest margin.
there is nothing wrong with charging a premium when you have the top product
I have no problem saying AMD will do it too. I don't know why you would assume otherwise. You said the forum has "NVIDIA rage". Do you think the same of them too?
I think the best GPU ever was the 8800GT. I ran that until it literally died. Only card I've ever done that with. Doesn't mean I can't be unhappy with what NVIDIA is currently doing.
Because the overwhelming thing I see people raging at NVidia about is pricing. People unironically call them "Ngreedia", which is right up there with calling Microsoft "M$".
People act as if the only reason GPUs cost more today than they did 5 or 10 years ago is NVidia charging ever greater margins.
Corporations all attempt to maximize profits. It's what they do. If AMD was leading in GPUs, they would be charging NVidia prices, and if NVidia was behind, they would be charging AMD pricing.
It's just that NVidia has been in front so long that some people seem to develop a complex about them that colors their view of everything NVidia does.
So I think DLSS gets dumped on more than it deserves because of that.
That's not to say NVidia doesn't do some scummy things, but I just hate those things without letting them color my view of everything they do. They bring a lot of interesting technology to market that should be viewed on its own merits.
I used an ATI 9700 Pro until it died (which made me very unhappy - great card), then I bought an 8800GT.
My 8800 GT is still the only dGPU in the house, and it still runs with tens of thousands of hours on it. Recently built a new PC after ~15 years, and I'm finally looking for a new GPU.
The 9700 Pro is pretty much tied with the 8800GT for the best ever IMHO. I give the 8800GT the edge for how long I used it. I had a 9800 Pro and loved it. The only reason I had to get rid of it was because I was moving to PCIe.
I'm surprised you still have an 8800GT. With how active you are on this forum, I figured you were some (modern) hardcore gamer. I get it, for years now I haven't cared for games that have been coming out.
Don't take the bait. There's a far more productive discussion to be had here, and it starts by acknowledging both the strengths and weaknesses of upscalers. Personally I'm inclined to say that DLSS is worth enabling when there's a benefit to be had (for most people there is), and this includes compensating for a poor AA implementation, saving power when playing with a frame cap, or locking in a fluid FPS while keeping some desired IQ settings at a higher level.
I think I said this before, but DLSS suffered greatly at launch not only due to technical limitations and weak IQ results, but also because it was used as a crutch for RT. It was pushed out half-baked to take the RT performance hit, and it took a while for the tech to stand on its own feet as people and media began to use it independently of RT effects. Nowadays it is a good tool to prolong the life of a card or simply enable higher graphical settings on a system which is very close to fluid performance but still needs a bit of a push to lock in the desired FPS.
Sure, let's just dismiss this under... "nerd rage".
compensating for a poor AA implementation
Not sure about that. I'm not on board with taking "a poor AA implementation" as an acceptable baseline...
I think you read too much into my comment. We still judge games based on IQ, and AA implementation is a big part of that.
Don't take the bait. There's a far more productive discussion to be had here, and it starts by acknowledging both the strengths and weaknesses of upscalers. Personally I'm inclined to say that DLSS is worth enabling when there's a benefit to be had (for most people there is), and this includes compensating for a poor AA implementation, saving power when playing with a frame cap, or locking in a fluid FPS while keeping some desired IQ settings at a higher level.
I think I said this before, but DLSS suffered greatly at launch not only due to technical limitations and weak IQ results, but also because it was used as a crutch for RT. It was pushed out half-baked to take the RT performance hit, and it took a while for the tech to stand on its own feet as people and media began to use it independently of RT effects. Nowadays it is a good tool to prolong the life of a card or simply enable higher graphical settings on a system which is very close to fluid performance but still needs a bit of a push to lock in the desired FPS.
Sure, let's just dismiss this under... "nerd rage".
Where is this rage?
* insert Principal Skinner meme *
... DLSS is worth enabling when there's a benefit to be had (for most people there is), and this includes ... saving power when playing with a frame cap,
Does anybody have experience with frame generation turned on with limited frame rate?
In my limited experience, I found DLSS with frame generation to work properly only if the frame rate is uncapped. When the cap is lower than what DLSS with frame generation could deliver, it does not work.
Has anybody else experienced this or am I doing something wrong?
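One plausible explanation, as a toy model rather than documented driver behavior (the doubling ratio and the throttling rule here are my assumptions): frame generation presents roughly one generated frame per rendered frame, so an external FPS cap below the generated rate forces the render rate down to about half the cap, and input latency tracks that halved render rate.

```python
# Toy model (an assumption, not how the driver actually schedules frames):
# frame generation presents ~2x the rendered rate, one generated frame per
# rendered frame. Under an external cap, the renderer is throttled so that
# 2 * render_fps <= cap, so latency tracks the halved render rate.
def fg_rates(cap_fps: float, uncapped_render_fps: float):
    presented = min(cap_fps, 2 * uncapped_render_fps)
    rendered = presented / 2  # every other presented frame is a real frame
    return presented, rendered

print(fg_rates(cap_fps=60, uncapped_render_fps=80))   # (60, 30.0): input feels like 30 fps
print(fg_rates(cap_fps=240, uncapped_render_fps=80))  # (160, 80.0): cap not binding
```

If something like this is what's happening, it would explain why capped frame generation feels broken even when the cap itself is reached.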