Question 4080 Reviews


Heartbreaker

Diamond Member
Apr 3, 2006
4,223
5,225
136
So in your experience, it's better to play with DLSS enabled because it's generally less of an irritating experience for you?

If so, that's a solid value add and I'd imagine you'd want to keep it enabled.

I don't have DLSS on my old GPU; I'm just going by various videos I have seen. But overall, I expect I'd just leave it on where available.
 

Aapje

Golden Member
Mar 21, 2022
1,355
1,821
106
So are we saying that broadly speaking DLSS improves visual fidelity and generally speaking games are better with it on than off?

I would say that it depends on the game and your personal preferences, but the highest-quality temporal upscaling can be preferable to native.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
That was an old test with FSR 2.1. FSR 2.2 is the current version.

Here is a test of DLSS 2.4 vs FSR 2.2 (both just released).

In short, DLSS has less shimmering during movement. FSR has better anti-aliasing, as DLSS has a pixelated look.

FSR has better anti-aliasing?! Are you serious? Look at the power lines in the screenshot! Jaggies everywhere. And in the video, FSR shows vaseline blur on the trees.

Both have ghosting, but it's more noticeable with FSR. Game performance was very close on both. Considering the head start Nvidia had, AMD has made up a lot of ground. But really, anybody using DLSS or FSR doesn't care about visual quality, they only care about increased frame rates.

Not really. FPS has diminishing returns. The only time I've used DLSS with my RTX 4090 is with A Plague Tale Requiem and it's set to quality mode, which looks really good I must say.
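For context on what "quality mode" means here, the internal render resolutions can be sketched in a few lines. The scale factors below are the commonly cited per-axis DLSS 2 defaults; treat them as assumptions, since individual games can expose different ratios:

```python
# Commonly cited per-axis DLSS 2 render-scale factors (assumed defaults;
# individual games may expose different ratios).
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple:
    """Resolution the game renders internally before DLSS upscales to output."""
    scale = DLSS_SCALE[mode]
    return round(out_w * scale), round(out_h * scale)

# Quality mode at 4K renders at 1440p internally, then upscales to 2160p.
print(internal_resolution(3840, 2160, "Quality"))  # (2560, 1440)
```

So at 4K output, quality mode is shading roughly 44% of the pixels of native rendering, which is where the big frame-rate wins come from.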
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Again, FSR 3 is most likely not going to be an upscaling technology, just like DLSS 3 isn't. You keep conflating temporal upscaling with interpolation, even though they are completely different technologies.

Both upscaling and interpolation have their pros and cons, so I'm sure that AMD will use their new AI accelerators to improve the upscaling portion of FSR.

You argued that DLSS 3 will improve the image quality compared to DLSS 2, even though it has already been released and we know that it isn't the case. DLSS 3 produces way more artifacts than both DLSS 2 and FSR 2, yet you don't comment on that at all, even though you critique the far smaller differences between DLSS 2 and FSR 2. Also, input lag will be worse than simply running at half the FPS, where every frame is actually based on new input.

I dare you to quote me where I said DLSS 3 will improve image quality compared to DLSS 2. I don't recall saying that anywhere. In fact, I did a word search on the entire 2nd page for wherever I mentioned DLSS, and nowhere do I even mention DLSS 3. I personally have used it only once, and I didn't detect any image quality or input lag issues, but then Nvidia Reflex was also turned on at the time.

But speaking of DLSS 3 and input lag, a guy over at Guru3D tested the latest driver and claimed it reduced the driver overhead for DLSS 3, which of course helps with input lag and overall performance. People should know by now never to underestimate Nvidia in matters like these, because they are investing tons of resources into these types of technologies, and the input lag is going to be reduced over time.

Guru3d DLSS 3 overhead

You continuously exaggerate the benefits of Nvidia features and downplay AMD's features...

And I could say the same about you, that you continuously exaggerate how much progress AMD has made, while downplaying Nvidia's lead.

But AMD can still add extra ray accelerators to the CU's.

And they are doing just that, but it's not the same thing as dedicated RT hardware that Nvidia uses. AMD's approach is more hybridized.

I don't agree on the clarity, but the other two are a bit worse on FSR 2. However, I dispute that this is a big difference.

Yep, a lot of this stuff is subjective to be sure.

Again, you don't call out the far bigger artifacting with DLSS 3, which suggests to me that you are rather biased.

Again, there you go piping off about DLSS 3 when I haven't even mentioned it in this thread until now.

ML is not necessarily better than manual coding. There is nothing that ML can do that a coder couldn't do in theory, although it may be harder to implement or even unfeasible due to how much effort it takes. However, there are advantages to hand coding as well, like tunability.

ML dramatically accelerates how quickly developers can implement algorithms to improve AI-based upscaling and RT acceleration. It's a big factor behind how RT and upscaling performance and quality have improved by leaps and bounds in such a short period of time, relatively speaking.

XeSS actually runs way worse on non-Intel GPUs. XeSS on non-Intel is way worse than FSR 2.

I know it runs and looks worse on non-Intel GPUs, but when it's run on an Intel Arc GPU, it's the closest DLSS alternative we have. And Intel is just getting started. Hopefully their next GPU is more competitive on the high end.
 

Hans Gruber

Platinum Member
Dec 23, 2006
2,130
1,088
136
There is already a lot of blowback online over the 4080 pricing. Crypto is gone and PC sales are way down, but the price of GPUs seems to be increasing. It's going to take 3-5 months before Nvidia starts feeling the pain. Look at Intel: there are AMD cults for Ryzen, all because of Intel's greed for more than a decade. The only cure for such a sickness is cheap hardware.
 

Aapje

Golden Member
Mar 21, 2022
1,355
1,821
106
And I could say the same about you, that you continuously exaggerate how much progress AMD has made, while downplaying Nvidia's lead.

That's just false, because I do criticize AMD tech as well, while you consistently show huge bias, including seeing 'vaseline' blurriness that no one else seems to see.

ML dramatically accelerates how quickly developers can implement algorithms to improve AI-based upscaling and RT acceleration. It's a big factor behind how RT and upscaling performance and quality have improved by leaps and bounds in such a short period of time, relatively speaking.

I didn't know that you are part of both the Nvidia and AMD developer teams and have inside information on how many man-hours each company spent on this. In any case, FSR's quality has also improved by leaps and bounds.

You seem to have this magical belief in ML, but Nvidia's advantage could be explained by them having a substantial head start. Nvidia also has a lead in ray tracing, but you can't attribute that lead to ML.

I know it runs and looks worse on non-Intel GPUs, but when it's run on an Intel Arc GPU, it's the closest DLSS alternative we have. And Intel is just getting started. Hopefully their next GPU is more competitive on the high end.

Of course, the difference between XeSS on Intel & DLSS vs FSR 2 is that the former use vendor-specific hardware, while AMD provides a solution that works about equally well on all hardware. Nvidia has a habit of leaving owners of previous-gen cards out in the cold, while AMD doesn't.
 

Tup3x

Senior member
Dec 31, 2016
954
937
136
I wouldn't use FSR 2 (not sure if any of them use the newest version) in any game that I have. It doesn't look very good in motion for various reasons; anti-aliasing in particular takes a pretty serious hit. DLSS and XeSS handle subpixel detail better too. FSR 2 isn't currently as good as DLSS or XeSS (on Arc), but it works on older hardware and should be better than most in-house temporal upscaling methods.
 

scineram

Senior member
Nov 1, 2020
361
283
106
There is already a lot of blowback online over the 4080 pricing. Crypto is gone and PC sales are way down, but the price of GPUs seems to be increasing. It's going to take 3-5 months before Nvidia starts feeling the pain. Look at Intel: there are AMD cults for Ryzen, all because of Intel's greed for more than a decade. The only cure for such a sickness is cheap hardware.
They just need to stop buying GeForce. Like they did with Turing, and it worked. Simple as.
 

tajoh111

Senior member
Mar 28, 2005
298
312
136
They just need to stop buying GeForce. Like they did with Turing, and it worked. Simple as.

It should be obvious the reason why Nvidia priced the RTX 4080 as they did.

Considering the prices of the RTX 4090 and RTX 4080, the performance difference between them, and the fact that the RTX 4090 is the flagship, Nvidia would normally prefer to price the 4080 lower, as they typically know how to get good reviews for the RTX xx80-series card.

Looking at their third-quarter results, they have over 4 billion dollars in inventory, which is over 1.5 billion higher than last year.

For the current results, they submitted a 700 million dollar inventory write-down, which is marked as a loss. This is the depreciation of unsold product as a result of current market conditions and competition.

Imagine the damage a $799 RTX 4080 would do to the rest of their inventory.

The RTX 3090 Ti would have to drop to $600 and everything else would need to follow. That $700 million loss turns into well over a billion, maybe over two billion, because all of a sudden most of Nvidia's gaming inventory is worth half as much.

Delayed logistics and Nvidia's lack of foresight mirror AMD's mistakes during previous mining booms. They have literal boatloads of Ampere cards, and any extra profit made from selling more Lovelace cards would be overshadowed by the loss from the inventory write-down on Ampere.
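The write-down argument above is simple arithmetic. Here is a hedged sketch using only the post's own hypotheticals; the $3B gaming share of the reported ~$4B inventory is an assumption for illustration, not an actual figure from Nvidia's books:

```python
# Back-of-the-envelope sketch of the inventory write-down argument.
# All figures are the post's hypotheticals, not actual accounting data.
def writedown_loss(inventory_value: float, price_cut: float) -> float:
    """Loss booked if inventory is revalued after an across-the-board price cut."""
    return inventory_value * price_cut

gaming_inventory = 3.0e9  # assumed gaming share of the ~$4B reported inventory
print(f"${writedown_loss(gaming_inventory, 0.5) / 1e9:.1f}B")  # $1.5B
```

Under those assumptions, halving prices across the stack roughly doubles the $700M write-down already taken, which is the core of the argument for keeping the 4080 priced high.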
 
Aug 16, 2021
134
96
61
There is already a lot of blowback online over the 4080 pricing. Crypto is gone, PC sales are way down but the price of GPU's seems to be increasing. It's going to take 3-5 months before Nvidia starts feeling the pain. Look at Intel, there are AMD cults for Ryzen all because of the greed of Intel for more than a decade. The only cure to such a sickness, cheap hardware cures all.
Maybe, but sadly I notice a trend of everything going up in price when it comes to computer hardware. It seems to me that computer building is becoming an increasingly premium market, and the low-margin options are OEM machines. So basically you can build your own computer, but if money is tight, Dell is your answer. And there hasn't been cheap hardware for years, so it might be past the point of no return.
 

blckgrffn

Diamond Member
May 1, 2003
9,117
3,044
136
www.teamjuchems.com
Maybe, but sadly I notice a trend of everything going up in price when it comes to computer hardware. It seems to me that computer building is becoming an increasingly premium market, and the low-margin options are OEM machines. So basically you can build your own computer, but if money is tight, Dell is your answer. And there hasn't been cheap hardware for years, so it might be past the point of no return.

It's been like this for maybe 15 years or more. I got out of building budget PCs in like 2005 because the margin was about zero, and they were high-failure-rate, angry-customer-call-generating pits of time debt with very little profit or satisfaction/fun on my part. For those who didn't want a gaming PC, I transitioned to charging a flat rate for consulting and helping them buy a budget Compaq or Dell or whatever, adding aftermarket memory, doing a fresh no-BS install of Windows, and advising them not to call me for support without expecting to pay for it. Gaming PCs were built to my spec, and if they didn't have the budget, it didn't happen.

I did that at the same clip as building, made more money, and had happier customers. Then, as now, it was easy to provide $100 of value just by advising them of a current sale or letting them know that memory was cheaper to add; the value prop was super simple.

With regard to these GPUs, inflation and the lack of easy silicon wins dating back to the long stay on 28nm make it less and less of a surprise. Look at a 5900 Ultra and compare it to current cards. Its size and power usage were similar to what, a 3060? No surprise they cost the same. Call that "normalizing insanity" or whatever you will, but we need to acknowledge that the loss of easy density increases has crazy knock-on effects, and that a 4090 has way more going on than flagships from a decade ago. Heck, Nvidia probably made more money, especially with regard to inflation, from the Titan X than they are making on the 4090 now. :/
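The inflation point above is easy to make concrete with a CPI adjustment. The sketch below assumes a $499 launch price for the GeForce FX 5900 Ultra and approximate US CPI-U annual averages; the exact figures are approximations, so treat the output as a ballpark:

```python
# Rough CPI adjustment of a 2003 flagship launch price into 2022 dollars.
def inflation_adjust(price: float, cpi_then: float, cpi_now: float) -> float:
    """Scale a historical price by the ratio of consumer price indices."""
    return price * cpi_now / cpi_then

# GeForce FX 5900 Ultra launched around $499 in 2003; approximate US CPI-U
# annual averages: ~184 (2003) vs ~293 (2022).
print(round(inflation_adjust(499, 184.0, 293.0)))  # ~795
```

In other words, a 2003 flagship already cost the better part of $800 in today's money, which supports the comparison to a midrange card like the 3060 rather than to a 4090.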
 

maddie

Diamond Member
Jul 18, 2010
4,738
4,666
136
Maybe, but sadly I notice a trend of everything going up in price when it comes to computer hardware. It seems to me that computer building is becoming an increasingly premium market, and the low-margin options are OEM machines. So basically you can build your own computer, but if money is tight, Dell is your answer. And there hasn't been cheap hardware for years, so it might be past the point of no return.
That trend is ending now. Look at the posts claiming a ~25% drop in Zen 4 pricing. SSDs, RAM, and monitors are also dropping.
 
Aug 16, 2021
134
96
61
That trend is ending now. Look at the posts claiming a ~25% drop in Zen 4 pricing. SSDs, RAM, and monitors are also dropping.
I don't see those drops anywhere at all. Any GPU above the RX 6500 XT is overpriced, local retailers are literally selling brand new Pentium 4s and Core 2 Duos, motherboards doubled in price in two years, even H610 e-waste, and low-end GPUs literally died. The argument about Zen 4 being cheaper is lukewarm at best, because it is massively overpriced for what it is. Most affordable SSDs are QLC or TLC only; affordable MLC SSDs died. RAM admittedly came down to pre-2019 prices. Monitors kind of stagnated, with only higher-end models launching at even higher prices; a basic 1080p60 monitor still costs the same as it did in 2014. I still remember times of cheap quads like the Athlon 760K for less than 100 EUR and cards like the 7790 at around 130 EUR, running games at 1080p 60 fps on medium-high settings. Today you need nearly 70% more cash for that same result, and that means settling for an RX 6500 XT on a Gen 3 board or CPU. 600-700 EUR is not reasonable for an entry-level computer; that's actually higher than minimum wage. Even an APU system is ~400-500 EUR.
 

maddie

Diamond Member
Jul 18, 2010
4,738
4,666
136
I don't see those drops anywhere at all. Any GPU above the RX 6500 XT is overpriced, local retailers are literally selling brand new Pentium 4s and Core 2 Duos, motherboards doubled in price in two years, even H610 e-waste, and low-end GPUs literally died. The argument about Zen 4 being cheaper is lukewarm at best, because it is massively overpriced for what it is. Most affordable SSDs are QLC or TLC only; affordable MLC SSDs died. RAM admittedly came down to pre-2019 prices. Monitors kind of stagnated, with only higher-end models launching at even higher prices; a basic 1080p60 monitor still costs the same as it did in 2014. I still remember times of cheap quads like the Athlon 760K for less than 100 EUR and cards like the 7790 at around 130 EUR, running games at 1080p 60 fps on medium-high settings. Today you need nearly 70% more cash for that same result, and that means settling for an RX 6500 XT on a Gen 3 board or CPU. 600-700 EUR is not reasonable for an entry-level computer; that's actually higher than minimum wage. Even an APU system is ~400-500 EUR.
I forgot where you live. I'm in the Caribbean but tied to the US$. Most currencies worldwide are falling against the dollar, and that explains a lot of the differing situations.
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
Some Microcenters didn't even sell half of the 100 4080s they got on launch day, according to the guy below. Many of those sold ended up on eBay and other marketplaces, and I think the scalpers are finding it harder than they expected to screw people over even harder than Nvidia already did with the outrageous price of this piece of garbage. I wouldn't buy it unless it was $700, but that's just me, I'm sure. If the price drops to even $1000, I'm sure all the gamers will instantly club their mother over the head and steal her credit card to buy one.

 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
I don't see those drops anywhere at all. Any GPU above the RX 6500 XT is overpriced, local retailers are literally selling brand new Pentium 4s and Core 2 Duos, motherboards doubled in price in two years, even H610 e-waste, and low-end GPUs literally died. The argument about Zen 4 being cheaper is lukewarm at best, because it is massively overpriced for what it is. Most affordable SSDs are QLC or TLC only; affordable MLC SSDs died. RAM admittedly came down to pre-2019 prices. Monitors kind of stagnated, with only higher-end models launching at even higher prices; a basic 1080p60 monitor still costs the same as it did in 2014. I still remember times of cheap quads like the Athlon 760K for less than 100 EUR and cards like the 7790 at around 130 EUR, running games at 1080p 60 fps on medium-high settings. Today you need nearly 70% more cash for that same result, and that means settling for an RX 6500 XT on a Gen 3 board or CPU. 600-700 EUR is not reasonable for an entry-level computer; that's actually higher than minimum wage. Even an APU system is ~400-500 EUR.

Where do you live? I am assuming the EU based on your monetary comparisons. I can look at pretty much every retailer here in the US and they all have 6000-series Radeons for WAY under MSRP. I am seeing 6700 XTs for as low as $370 new.

It sounds like what you are seeing is local to the EU region, as it does not represent what we have here in the states. I literally just bought a brand new Ryzen 5800X3D for $360 with free shipping. Yes, it's a previous-gen chip, but that's a great price to upgrade my current system without needing a new motherboard.
 

coercitiv

Diamond Member
Jan 24, 2014
6,172
11,792
136
It sounds like what you are seeing is local to the EU region
It's not local to the EU; there's plenty of cheap hardware available right now in my local Eastern European market. CPUs, motherboards, RAM, and good SSDs can all be had for reasonable prices. The only exception is GPUs, and we all know why that is: we're still feeling the economic ripples of the mining boom.
 

Triloby

Senior member
Mar 18, 2016
585
273
136
It's a good thing NVIDIA has enough brain cells to realize that they would've been outright blasted by reviewers if they hadn't "unlaunched" the "4080 12GB". If you're already spending more than a grand on a GPU, you might as well go all-out and get the 4090 in that regard. Even then, good luck with that, considering that they're producing far less Lovelace stock in order to get rid of all their Ampere stock.

Considering the huge difference in specs between AD102 and AD103, I'd imagine that a "4080 Ti" would see a much bigger performance uplift over the 4080 than the 3080 Ti did over the 3080.
 
Aug 16, 2021
134
96
61
Where do you live? I am assuming the EU based on your monetary comparisons. I can look at pretty much every retailer here in the US and they all have 6000-series Radeons for WAY under MSRP. I am seeing 6700 XTs for as low as $370 new.

It sounds like what you are seeing is local to the EU region, as it does not represent what we have here in the states. I literally just bought a brand new Ryzen 5800X3D for $360 with free shipping. Yes, it's a previous-gen chip, but that's a great price to upgrade my current system without needing a new motherboard.
I just Googled how much the cheapest 6700 XT is here in Lithuania: it's 466 EUR for the Asus Dual model. That's more or less what a GTX 680 used to cost, and this is an upper-mid-tier card, not a high-end card. The 5800X3D is minimum 378 EUR with no free shipping. The average wage in Lithuania is 3 times lower, annual inflation is 24%, and my observations in an Excel sheet show that food rises in price around 15-20% per month.
I can say that in 2018, after the recovery from the mining crash, prices of GTX 10 series and Coffee Lake chips were more or less sane. BTW, nearly 400 EUR for an 8-core chip in 2022 is still an awful price. The FX 8320 was 160 EUR, even the Ryzen 1700 was 270 EUR, hell, the i7 10700KF was like 280 EUR.
 

maddie

Diamond Member
Jul 18, 2010
4,738
4,666
136
I just Googled how much the cheapest 6700 XT is here in Lithuania: it's 466 EUR for the Asus Dual model. That's more or less what a GTX 680 used to cost, and this is an upper-mid-tier card, not a high-end card. The 5800X3D is minimum 378 EUR with no free shipping. The average wage in Lithuania is 3 times lower, annual inflation is 24%, and my observations in an Excel sheet show that food rises in price around 15-20% per month.
I can say that in 2018, after the recovery from the mining crash, prices of GTX 10 series and Coffee Lake chips were more or less sane. BTW, nearly 400 EUR for an 8-core chip in 2022 is still an awful price. The FX 8320 was 160 EUR, even the Ryzen 1700 was 270 EUR, hell, the i7 10700KF was like 280 EUR.
food rises in price around 15-20% per month.

Are you making a mistake? That's over 500% annually. 1.15 compounded monthly.
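The compounding behind that figure can be sanity-checked in a few lines, assuming a flat monthly rate:

```python
# A flat monthly inflation rate compounds multiplicatively over 12 months.
def price_multiplier(monthly_rate: float, months: int = 12) -> float:
    """How many times more expensive goods are after `months` of inflation."""
    return (1 + monthly_rate) ** months

for rate in (0.15, 0.20):
    m = price_multiplier(rate)
    print(f"{rate:.0%}/month -> x{m:.2f} in a year ({m - 1:.0%} increase)")
# 15%/month compounds to ~5.35x prices in a year; 20%/month to ~8.92x.
```

So a sustained 15-20% monthly rise would multiply food prices by roughly 5x to 9x over a year, far beyond any reported annual figure.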
 
Aug 16, 2021
134
96
61
food rises in price around 15-20% per month.

Are you making a mistake? That's over 500% annually. 1.15 compounded monthly.
It seems that quite a lot of food rose by 2-3 times over the year, so I was mistaken, but the situation isn't close to good. The lowest rise in prices over the year is probably 40-60%, for sugar, some cheapo milk, cookies. And that's not the whole story, because you also need gas to prepare food on the stove, and gas prices went up by 3-4 times; to get that food to the kitchen, petrol/diesel prices are now 70-80% higher than last year. I wouldn't be surprised if the overall cost of living really became 3-4 times higher.