Question 4080 Reviews


Timmah!

Golden Member
Jul 24, 2010
The price cut probably won't come that quickly. My guess would be a cut to $999.

My guess is that Nvidia's lesson here won't be to price the next x80 card lower than $1,200, but to price the x90 even higher, maybe straight to $2,000, to create a bigger gulf between the two and make the x80 look like a better deal.
 

Tup3x

Senior member
Dec 31, 2016
FYI power consumption is at direct odds with RT in games. It may not seem that way because we tend to compare uncapped performance between raster and RT enabled settings, but enabling RT in a controlled performance environment can highlight a big power penalty.

Case in point: I ran Cyberpunk 2077 with a 75 FPS cap, then switched on ray tracing. Reported GPU power jumped from 130-150W to 250-270W for the same net result of 75 FPS. Obviously, this wasn't the RT computation alone, but also raster performance being pushed harder to compensate. To illustrate: if you have a fixed 75 FPS, thus a ~13ms frame time allocation, and RT comes in "stealing" 4ms, then the GPU must perform raster ops in 9ms instead, therefore pushing the equivalent of 110+ FPS. Energy usage goes up a lot even if the RT power cost is zero.
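That frame-time arithmetic can be sketched in a few lines (the 75 FPS cap and 4 ms RT cost are the post's example numbers, not measurements):

```python
# Fixed-FPS frame budget: if RT "steals" part of the frame time,
# the raster pipeline must run fast enough to fit in what remains.

def equivalent_raster_fps(fps_cap: float, rt_ms: float) -> float:
    frame_budget_ms = 1000.0 / fps_cap   # total time available per frame
    raster_ms = frame_budget_ms - rt_ms  # time left over for raster work
    return 1000.0 / raster_ms            # FPS the raster path must sustain

print(round(equivalent_raster_fps(75, 4), 1))  # prints 107.1
```

With the post's rounded 13 ms / 9 ms figures the same formula gives about 111 FPS, hence the "110+ FPS" claim.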

I think people should take a moment and be honest with themselves about the performance/power balance they consider acceptable. Just like with CPUs recently, it's not just the vendor that dictates power, but also the user. If you think of all the levers you have available to customize your experience (power caps, FPS caps of all flavors, detail settings, DLSS/FSR, RT, etc.), I would argue that power consumption is ultimately under your control as long as the vendor designed a good product. The only real problem is cost; that parameter is very adverse to user customization :p
Well, RT needs lots of computing power, which in turn needs lots of electrical power. Currently, when you turn on RT it uses the RT cores that would otherwise sit idle (and the shader computation is pushed even harder too). Maybe they'll eventually figure out some kind of breakthrough in making RT more efficient.

Also, they are pushing these cards past the efficiency sweet spot. I wish they would release power-efficient versions of GPUs too (using lower-quality chips, like the Ryzen 7 5700X vs. the Ryzen 7 5800X), obviously with more appropriate coolers to keep costs down. Then AIBs should scrap these non-OC and OC versions (OC SKUs are pure scams anyway).
 

Heartbreaker

Diamond Member
Apr 3, 2006
AMD got away with their price increase more because of the economy and supply/demand than because they had the better chips for several months.

They had the upper hand and they exploited it. Plain and simple. Same as every other company.

If you think any company is charging less than they can, because they want to be nice, you are living in a dream world.
 

moonbogg

Lifer
Jan 8, 2011
I watched the RDNA3 presentation and that was enough to show me it wouldn't even come close to the RTX 4090, so I bit the bullet and bought one for several hundred dollars above MSRP.

I hate paying well above MSRP, but honestly, the GPU is extremely impressive in how it performs and how cool it runs, so I have been very happy with it. It will automatically boost to 2.8 GHz while still running amazingly cool.

If I was in the market I'd buy a 4090 too. There's nothing else to consider if spending $1000 or more. I'd just throw some more money at it and get the best card as well. No way would I spend $1000 on anything other than the best if I'm already going that high in price. The 4080 would stay dead to me at $900 as well because $900 msrp means $1100 AIB anyway.
I used to spend $1200-$1500 on SLI GPUs because they gave more performance than any other single card solution. No way in hell would I have saved like $500 and bought any single GPU for $1000 instead. Not a chance. At that point it's the best or go home. I'd have to spend like $700 instead to choose a cheaper option. It has to be like WAY the hell cheaper, not just a couple hundred cheaper.
 

Ranulf

Platinum Member
Jul 18, 2001
They had the upper hand and they exploited it. Plain and simple. Same as every other company.

If you think any company is charging less than they can, because they want to be nice, you are living in a dream world.

Sure, but what was the long-term effect? It didn't help their image much, and after a year and a half they've had to gut their prices; no one wants to pay elevated prices for the new chips because there is both competition and little real price/perf gain.

Then again, after seeing how first-gen Ryzen played out, I only buy AMD at the end of a generation cycle, when stuff is on sale.
 

Carfax83

Diamond Member
Nov 1, 2010
Well, RT needs lots of computing power, which in turn needs lots of electrical power. Currently, when you turn on RT it uses the RT cores that would otherwise sit idle (and the shader computation is pushed even harder too). Maybe they'll eventually figure out some kind of breakthrough in making RT more efficient.

This is a noteworthy point. I sometimes wonder whether people expect these massive GPUs to run on pixie dust or something.

That's why I said earlier that you have to look at the RTX 4090's power consumption with some context: the card is extremely capable and powerful, and is doing things no other card has ever done, especially at 4K.

That said, you can easily limit the power consumption of these GPUs by using upscaling or framerate limiting.
 

Carfax83

Diamond Member
Nov 1, 2010
If I was in the market I'd buy a 4090 too. There's nothing else to consider if spending $1000 or more. I'd just throw some more money at it and get the best card as well. No way would I spend $1000 on anything other than the best if I'm already going that high in price. The 4080 would stay dead to me at $900 as well because $900 msrp means $1100 AIB anyway.
I used to spend $1200-$1500 on SLI GPUs because they gave more performance than any other single card solution. No way in hell would I have saved like $500 and bought any single GPU for $1000 instead. Not a chance. At that point it's the best or go home. I'd have to spend like $700 instead to choose a cheaper option. It has to be like WAY the hell cheaper, not just a couple hundred cheaper.

To me an RTX 4090 is only worth buying if you're going to go all in on a 4K gaming setup. Once I bought my 4K monitor, I was locked into getting an RTX 4090 to ensure great performance at 4K native without resorting to upscaling unless I wanted to. If you still have a 2K monitor or less, then I definitely don't recommend getting an RTX 4090.
 

Dribble

Platinum Member
Aug 9, 2005
My guess is that Nvidia's lesson here won't be to price the next x80 card lower than $1,200, but to price the x90 even higher, maybe straight to $2,000, to create a bigger gulf between the two and make the x80 look like a better deal.
They would have sold the 4080s if they had been released before the 4090, so in reality it's the 4090 that truly killed the 4080's sales. Like others said, there are no value buyers above $1,000, so nearly all of the 4080's potential buyers unsurprisingly got a 4090 instead. IMO there is room for one card above $1,000.
 

Timorous

Golden Member
Oct 27, 2008
They would have sold the 4080s if they had been released before the 4090, so in reality it's the 4090 that truly killed the 4080's sales. Like others said, there are no value buyers above $1,000, so nearly all of the 4080's potential buyers unsurprisingly got a 4090 instead. IMO there is room for one card above $1,000.

NV got the strategy wrong.

Don't bother with a 4090 at all. Make the AD102 part a Titan with the Titan drivers and price it at $2,500. If NV were feeling generous they could give it 48 GB of RAM, but that would depend on how it impacts Quadro sales, so maybe they wouldn't.

Then the 4080 at $1,200 would offer great relative value. The people who still want the absolute best would choose the Titan, but even those willing to spend $2K might baulk at spending 100% more for just 33% more performance.

This would also leave plenty of room for a $1,600 4080Ti to launch with Titan performance at some point next year.
 

Heartbreaker

Diamond Member
Apr 3, 2006
No surprise, but it looks like the 4080 is selling poorly while the 4090 keeps selling out. This is pretty much expected: once you are over $1,000 for a GPU you probably aren't looking at a value buyer, and for them a couple hundred more to have the "best" GPU probably makes more sense than saving 20% and settling for second best.

While I am a value buyer (never paid more than $300 for a GPU), if I suddenly found myself with a bunch of excess cash to the point that I was thinking of $1000+ GPU, I'd just go straight to the top.


HWUB is confirming this. $1,200 is a hard sell for a card that isn't the best on the market. It may be one of NVidia's worst launches (in terms of sales) in a LONG time.
 
Jul 27, 2020
4080 isn't selling well because 3090/3090Ti/6900XT/6950XT users are not doing their math. If they factor in the electricity cost they would save by upgrading to the 4080, they would see that the 4080 is excellent value for THEM.

If I was the chief of electricity crimes division of Interpol, I would seize ALL 4090/3090/6900/6950 cards and recycle them for parts.

Feel free to hate me for my unpopular opinion :)
 

Stuka87

Diamond Member
Dec 10, 2010
4080 isn't selling well because 3090/3090Ti/6900XT/6950XT users are not doing their math. If they factor in the electricity cost they would save by upgrading to the 4080, they would see that the 4080 is excellent value for THEM.

If I was the chief of electricity crimes division of Interpol, I would seize ALL 4090/3090/6900/6950 cards and recycle them for parts.

Feel free to hate me for my unpopular opinion :)

RTX 4080: ~304W
RX 6900: ~305W
RX 6950: ~391W
RTX 4090: ~411W
(values taken from TPU)

Not sure the 6900 should be included in that ban. Surely that extra 1W is within testing tolerances.
 
Jul 27, 2020
RTX 4080: ~304W
RX 6900: ~305W
RX 6950: ~391W
RTX 4090: ~411W
(values taken from TPU)

Not sure the 6900 should be included in that ban. Surely that extra 1W is within testing tolerances.
I was thinking more of the maximum/raytracing power graphs. Plus, the 4080 makes transient spikes almost a non-issue.

4080 is a lovely card. Unfortunately, still too big for me.
 

adamge

Member
Aug 15, 2022
If you don't care about what things cost and just want the fastest thing in the world, then that narrows it down to a single card...

While I could easily buy an RTX 4090, my biggest issue with it (price aside) is the power consumption. Personally, I think 250 W max should be plenty, and ideally even lower. The RTX 4080 is almost acceptable but not optimal (same for the 7900 XT/XTX). It also bothers me quite a bit that NVIDIA didn't include DP 2.0/2.1 ports when even Intel Arc cards have them.

But since I do care about value, I just can't justify the price, especially for the RTX 4080. It's insane and nobody should be buying it; it sends the wrong signal.

I think you could buy it, underclock/undervolt it to 250 W, and have the most performance at 250 W of any solution on the market.

Edit: Also the most efficiency at 250W of any solution on the market.
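A cap like that doesn't strictly require underclocking tools; on NVIDIA cards the stock `nvidia-smi` CLI can set a board power limit. A sketch only: the 250 W target is the poster's figure, the supported range varies by card and BIOS, and the commands need admin rights.

```shell
# Check the supported power-limit range for the card first
nvidia-smi -q -d POWER

# Enable persistence mode so the setting survives the driver unloading
sudo nvidia-smi -pm 1

# Cap board power to 250 W (reverts on reboot unless reapplied)
sudo nvidia-smi -pl 250
```

Unlike a manual undervolt, this leaves the voltage/frequency curve alone and simply lets the card boost as high as it can within the cap.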
 

Aikouka

Lifer
Nov 27, 2001
To my knowledge those are reserved for you when you activate the offer; these were up for anyone to buy. I have not signed up yet because I'd like to time it to the 4090s coming back in stock. I don't plan on using it, instead opting for a 7900 XT or XTX; the 4090 will be used in something else, resold, or gifted. I expect 4090s to come back into stock around March, when the export ban takes effect on March 1, 2023; Nvidia is allowed to continue shipping orders taken before that point until September 1, 2023. There was a rumor a while back that 4090 production was suspended to focus on wafers for China exports before that March 1st date. There are still 4090 dies in the wild that have yet to be put on cards. Nvidia is controlling the flow of these cards to offload their 30-series cards onto consumers, who will no doubt buy up the scraps left. Nvidia gets to sit on and control what AIBs can make and sell, all while taking orders and then having months to fulfill them.

Well, I thought it was interesting because Best Buy kept showing that nothing was in stock, yet I was provided an offer to buy something. I looked into the offer, and yes... going to Best Buy without the link showed absolutely no 4080 stock, but with the link, I was able to buy any 4080 that I wanted. In fact, clicking the filter option to pick up that day in my local store showed that I could get two models (FE and a Gigabyte one) the same day.
 

amenx

Diamond Member
Dec 17, 2004
The pricing of the 4080 was an obvious play to move existing 30-series stock.
Not sure why that would be the case anymore with three weeks left until the release of the 7900 cards, which are not far off in price from the existing 30 series. The 4080 has not even remotely led me to consider a 30-series card. I am very likely going for a 7900, even if the 4080 comes down to $800-$999.
 

scineram

Senior member
Nov 1, 2020
Not sure why that would be the case anymore with three weeks left until the release of the 7900 cards, which are not far off in price from the existing 30 series. The 4080 has not even remotely led me to consider a 30-series card. I am very likely going for a 7900, even if the 4080 comes down to $800-$999.
You are one of the 30% who would at least entertain buying Radeon. Jensen focuses on the 70%.
 

amenx

Diamond Member
Dec 17, 2004
You are one of the 30% who would at least entertain buying Radeon. Jensen focuses on the 70%.
Not really. For a long time I have been solely Nvidia (mostly xx80 cards). The 4090 went up 10%, which is OK by me, but not the ridiculous 42% rise on the 4080. Jensen's presumptive "they will buy it with a smile on their faces" is too much of an insult to me.
 

AnandThenMan

Diamond Member
Nov 11, 2004
4080 isn't selling well because 3090/3090Ti/6900XT/6950XT users are not doing their math. If they factor in the electricity cost they would save by upgrading to the 4080, they would see that the 4080 is excellent value for THEM.

If I was the chief of electricity crimes division of Interpol, I would seize ALL 4090/3090/6900/6950 cards and recycle them for parts.

Feel free to hate me for my unpopular opinion :)
How long do you have to own a 4080 before it starts to save you money? Because math.
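For illustration only, a back-of-the-envelope version of that math. The $1,200 card price, 100 W average draw difference, and $0.15/kWh rate are all assumed figures, not numbers from the thread:

```python
# Break-even time for "the 4080 pays for itself in electricity":
# hours of gaming needed before the power savings cover the purchase price.

def break_even_hours(card_cost_usd: float, watts_saved: float, usd_per_kwh: float) -> float:
    usd_saved_per_hour = (watts_saved / 1000.0) * usd_per_kwh  # kWh saved * rate
    return card_cost_usd / usd_saved_per_hour

hours = break_even_hours(1200, 100, 0.15)  # ~80,000 hours
print(f"{hours:,.0f} hours to break even")
```

At 20 hours of gaming a week that works out to roughly 77 years, which is presumably the point of the question.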