Nvidia 3090 reviews thread


BFG10K

Lifer
Aug 14, 2000
Written:


Video:

 

MrTeal

Diamond Member
Dec 7, 2003
It's the Titan rebranded as GeForce: 10% faster, double the VRAM, and double the price. I read it mentioned somewhere, and it's most likely right, that Nvidia couldn't sell the mid-sized die as a flagship this gen. So we get the big die for the 3090/3080, but Nvidia still wants to charge $1,200 USD! :D

The 3070 probably isn't even consistently as fast as the 2080 Ti.
Is it though?
[Benchmark chart]
From LTT's review.

It's getting into Titan levels of pricing, but it is driver-constrained and certainly doesn't offer Titan levels of performance in some professional apps. If you're an engineer looking for a budget Quadro for modeling, this isn't the card for you in the way previous-gen Titans were.
 

Fallen Kell

Diamond Member
Oct 9, 1999
I mean, it is shocking that in 4K gaming the 3090 will only be about 10-15% faster. In value terms, this is the worst gaming-marketed card in Nvidia's history. Has it ever happened before that a card cost double the previous step down and offered only 10% more gaming performance? Even with 24GB of RAM, which will be useless for gaming for the next few years, the 3090 should not have cost more than $1,200 even with the halo-card tax. I hope no one buys this for gaming, even those who can afford one. Rather buy a 3080 and donate the rest to charity.
How is this "shocking"? The difference in RAM has nothing to do with performance at this level. Games are already optimized for their 4K textures on cards that have only 6-8GB of RAM, so having 10, 24, or 100GB of RAM will make no difference in running the game. Where it will make a difference is in GPU-compute workloads like AI research and neural-network training, but those have nothing to do with gaming performance. Again, I think too many people treat the 3090 as just another product, as if Nvidia had shifted its whole lineup around so that the XX90 is what the XX80 used to be, when what really happened is that the XX90 = Titan. It is a workstation card first and foremost, but the marketing guys realized they might sell more by renaming it: plenty of people seem to need the absolute best product on the market in their gaming computer, so by changing the name the card becomes a "gaming" card rather than a workstation card, and Nvidia can move more units at a very nice profit. It is marketing genius if you ask me.

So, back to how you are surprised: look at the performance difference between the Titan and the XX80 over the last 3-4 generations. Depending on the situation, it was always between 10-25%. So why be surprised that a card with only 20% more CUDA cores, running at slower boost speeds and most likely power- and cooling-limited, provides only 10-20% more performance than the card with 20% fewer CUDA cores? I just don't know what in the world you were thinking if you expected more.
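As a sanity check on that core-count argument, here is a minimal back-of-envelope sketch in Python. The shader counts and boost clocks are the published Founders Edition specs; treating performance as shaders times clock is a deliberate oversimplification, not a claim about how the cards actually behave:

```python
# Back-of-envelope: ideal 3090-vs-3080 uplift from shader count x boost clock.
# Published Founders Edition specs; real cards clock lower under power and
# thermal limits, hence the smaller gains seen in reviews.
cards = {
    "RTX 3080": {"cores": 8704, "boost_mhz": 1710},
    "RTX 3090": {"cores": 10496, "boost_mhz": 1695},
}

def peak_proxy(card):
    """Naive throughput proxy: shader count times boost clock."""
    return card["cores"] * card["boost_mhz"]

uplift = peak_proxy(cards["RTX 3090"]) / peak_proxy(cards["RTX 3080"]) - 1
print(f"Ideal uplift: {uplift:.1%}")  # ~19.5%, before power limits bite
```

Even the ideal number is under 20%, so the 10-15% seen in 4K reviews is about where the power budget would be expected to land it.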
 

AtenRa

Lifer
Feb 2, 2009
Is it though?
[Benchmark chart]
From LTT's review.

It's getting into Titan levels of pricing, but it is driver-constrained and certainly doesn't offer Titan levels of performance in some professional apps. If you're an engineer looking for a budget Quadro for modeling, this isn't the card for you in the way previous-gen Titans were.

Heh, it's a TITAN for some and not a TITAN for others :p

As I said before, the worst card release of the last 10-15 years.

For those that haven't seen the LTT video, below is the official NVIDIA response on the SPEC performance:

[Screenshot: NVIDIA's official response]
 

Grooveriding

Diamond Member
Dec 25, 2008
Is it though?
[Benchmark chart]
From LTT's review.

It's getting into Titan levels of pricing, but it is driver-constrained and certainly doesn't offer Titan levels of performance in some professional apps. If you're an engineer looking for a budget Quadro for modeling, this isn't the card for you in the way previous-gen Titans were.

Eh, even worse then. It will still be popular and sell out for a long time; it's the fastest card there is. The 3080 is going to remain nearly impossible to get your hands on now. Everyone knows that with an overclock they have a stock 3090; that's going to drive buyers down from the 3090 and into the 3080, I bet.
 

Dribble

Platinum Member
Aug 9, 2005
A lot of fake nerd rage about cost, but we all know that doesn't matter as long as it's the fastest. Those who can easily spend $1,500 on a hobby will buy it even if it's only a little faster than the $700 card, particularly as SLI is dead, so there's no point buying 2x3080. Hence it'll be a success, unless AMD can dethrone it with Navi 2...
 

Heartbreaker

Diamond Member
Apr 3, 2006
This guy usually tests laptops. He didn't get a 3080 yet, so he compared to the 3090, which leads me to think that for marketing purposes Nvidia should have led with the 3090 reviews. He sees 47% gains over the 2080 Ti at 4K, which looks nice with no 3080 in the mix. Still crazy expensive, but I expect there would have been less vitriol without the 3080 to compare against. Then, when the 3080 reviews landed, people would have been pleasantly surprised by how close to the 3090 it was.

 

swilli89

Golden Member
Mar 23, 2010
Or, with Dennard scaling largely dead, we are just seeing the power requirements of running 28 billion transistors at a 1.7GHz boost clock.

The A100 is 54 billion transistors at 1400MHz in a 250 watt TDP...


GA102 on TSMC would have been an absolute monster, easily under 250 watts.
 

Heartbreaker

Diamond Member
Apr 3, 2006
The A100 is 54 billion transistors at 1400MHz in a 250 watt TDP...

GA102 on TSMC would have been an absolute monster, easily under 250 watts.

The A100 is power-limited to 250W on PCIe and loses performance; otherwise it's 400 watts at 1400MHz.

1400 MHz is ridiculously slower, and much lower down the power curve, than the GA102's 1700 MHz. These relationships are anything but linear: you probably cut close to half the power by lowering the clock that much.
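A toy model makes that concrete. This is only a sketch: the cubic scaling assumption (voltage tracking frequency in the DVFS region, so P ~ f^3) is mine, not something from the posts above:

```python
# Toy CMOS dynamic-power model: P ~ C * f * V^2. Assumption (not from the
# thread): in the DVFS region voltage scales roughly linearly with
# frequency, so power scales roughly with f cubed.
def relative_power(f_new_mhz, f_ref_mhz):
    """Power at f_new relative to f_ref under the f^3 approximation."""
    return (f_new_mhz / f_ref_mhz) ** 3

print(f"1400 MHz vs 1700 MHz: {relative_power(1400, 1700):.0%} of the power")
# -> ~56%, i.e. close to half, from an ~18% clock reduction
```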
 

lobz

Platinum Member
Feb 10, 2017
What were you expecting? Nvidia themselves explicitly marketed it as a Titan replacement, hence the huge amount of fast memory and the support for multi-GPU. Titans have never been good value for money, from the very beginning, so why would that change now?
I'm pretty sure you haven't seen the 3090 presentation from Jensen Huang, otherwise you wouldn't be posting this nonsense.
 

lobz

Platinum Member
Feb 10, 2017
How is this "shocking"? The difference in RAM has nothing to do with performance at this level. Games are already optimized for their 4K textures on cards that have only 6-8GB of RAM, so having 10, 24, or 100GB of RAM will make no difference in running the game. Where it will make a difference is in GPU-compute workloads like AI research and neural-network training, but those have nothing to do with gaming performance. Again, I think too many people treat the 3090 as just another product, as if Nvidia had shifted its whole lineup around so that the XX90 is what the XX80 used to be, when what really happened is that the XX90 = Titan. It is a workstation card first and foremost, but the marketing guys realized they might sell more by renaming it: plenty of people seem to need the absolute best product on the market in their gaming computer, so by changing the name the card becomes a "gaming" card rather than a workstation card, and Nvidia can move more units at a very nice profit. It is marketing genius if you ask me.

So, back to how you are surprised: look at the performance difference between the Titan and the XX80 over the last 3-4 generations. Depending on the situation, it was always between 10-25%. So why be surprised that a card with only 20% more CUDA cores, running at slower boost speeds and most likely power- and cooling-limited, provides only 10-20% more performance than the card with 20% fewer CUDA cores? I just don't know what in the world you were thinking if you expected more.
I'm pretty sure he means it's shocking that they're asking at least 1.5 grand for it.
 

KompuKare

Golden Member
Jul 28, 2009
Hell, the FE thermals are amazing thanks to the 2.1kg card weight.

AIB cards have far worse thermals, but their cards are lighter, at 1.3kg and 1.5kg.
If all that weight is only acting as a big mass for thermal inertia, I wonder what a review after an hour of heat soak would look like.
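For illustration, here is a toy lumped-capacitance model of heat soak. All the numbers (power draw, thermal resistance, aluminium heat capacity, cooler masses) are made up for the example, not measured from any card:

```python
import math

# Toy lumped-capacitance heat-soak model (illustrative numbers only):
#   C * dT/dt = P - (T - T_amb) / R
# A heavier cooler (bigger C) warms up more slowly, but the steady state,
# T_amb + P * R, depends only on thermal resistance R, not on mass.

def temp_at(t_s, mass_kg, r_k_per_w=0.12, p_w=350.0, t_amb=25.0):
    """Cooler temperature after t_s seconds of constant load."""
    c = mass_kg * 900.0  # ~900 J/(kg*K), roughly aluminium
    t_steady = t_amb + p_w * r_k_per_w
    return t_steady - (t_steady - t_amb) * math.exp(-t_s / (r_k_per_w * c))

for mass in (1.3, 2.1):  # lighter AIB cooler vs the heavy FE one
    readings = [round(temp_at(t, mass), 1) for t in (60, 600, 3600)]
    print(f"{mass} kg cooler at 1/10/60 min: {readings} C")
```

On these made-up numbers, both coolers are fully heat-soaked well within the hour; the extra mass mostly changes how the first few minutes of a benchmark run look, which is exactly why a heat-soaked retest would be interesting.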
 

Fallen Kell

Diamond Member
Oct 9, 1999
So this is a Titan with the Quadro optimizations removed. I assume we will see the same GPU released as a Titan for something like $2,500, with the full die enabled and the Quadro optimizations present.
Don't expect much additional performance, though. The "full" die has only 256 more shading units and 12 more TMUs (+2.44%). The only other thing that could be tweaked is clock speeds, which from the looks of the reviews are mostly power- and cooling-constrained (so it would take custom water cooling to really make a difference).
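For reference, the arithmetic behind that +2.44%; the shader counts are the published GA102 configurations:

```python
# Full GA102 (84 SMs, 10752 shaders) vs the 3090's cut-down
# configuration (82 SMs, 10496 shaders).
ga102_full = 10752
rtx3090 = 10496

extra = ga102_full - rtx3090
print(f"{extra} more shaders = +{extra / rtx3090:.2%}")  # 256 more = +2.44%
```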

That said, this is all based on the stock 3080/3090 firmware. It is possible Nvidia is holding something back in the firmware and could extract additional performance by changing parameters alongside changes to the board's power delivery. But we probably won't know until AMD launches and we see what Nvidia has held in reserve.
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
If @Markfw can get his hands on one, he'll be running Folding@home on it 24/7, which would push the cooler to its limits.
Well, I slept in this morning. By 6:45 nothing of any brand was available. I really want EVGA, since I trust them a lot.
 

Midwayman

Diamond Member
Jan 28, 2000
I would say that even 8K TVs are completely unrealistic at normal viewing distances (10-14') unless the screen is so massive it's impractical for typical home use. 8K, IMO, is the biggest marketing rip-off I've yet seen for real-world usage. It's simply milking the "moar is better" idea to lure in the clueless masses.

I really don't see the point of 8K for home use. 4K was marginal enough at the display sizes and distances normally used at home. Honestly, the only reason I want to upgrade my 1080p plasma is to get HDR support and higher refresh rates, if I even hook a PC or console up to it. The only upgrade from 4K I'd be interested in is 21:9 or 32:9 4K displays. Even that is iffy: on my 1440p 32:9 I can't imagine using smaller text than it already displays, so more pixels would just make fonts look smoother.
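The viewing-distance argument is easy to check with a rough acuity calculation. The ~60 pixels-per-degree figure is the standard 20/20 rule of thumb, and the 65" screen at 10 feet is my example, not a number from the posts:

```python
import math

# Rough check of the 8K-at-home claim. Rule of thumb: a 20/20 eye
# resolves about 60 pixels per degree of visual angle.
def pixels_per_degree(h_pixels, diag_in, dist_in):
    """Horizontal pixels per degree for a 16:9 screen."""
    width_in = diag_in * 16 / math.hypot(16, 9)
    h_angle_deg = math.degrees(2 * math.atan(width_in / (2 * dist_in)))
    return h_pixels / h_angle_deg

for name, h_px in (("4K", 3840), ("8K", 7680)):
    ppd = pixels_per_degree(h_px, diag_in=65, dist_in=10 * 12)  # 10 feet
    print(f'{name} on a 65" set at 10 ft: {ppd:.0f} px/degree')
# 4K already lands around 145 px/degree, far past the ~60 the eye
# resolves, so the extra 8K pixels are invisible at that distance.
```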
 

traderjay

Senior member
Sep 24, 2015
Lined up this morning at 10 AM EST and managed to get first in line for when the card arrives (Strix 3090).

A very well-designed card that runs cool and quiet!
 

GodisanAtheist

Diamond Member
Nov 16, 2006
This guy usually tests laptops. He didn't get a 3080 yet, so he compared to the 3090, which leads me to think that for marketing purposes Nvidia should have led with the 3090 reviews. He sees 47% gains over the 2080 Ti at 4K, which looks nice with no 3080 in the mix. Still crazy expensive, but I expect there would have been less vitriol without the 3080 to compare against. Then, when the 3080 reviews landed, people would have been pleasantly surprised by how close to the 3090 it was.


-Maybe so, but then you launch a $700 3080 that is only slightly slower a month later and watch people's heads explode in anger.

NV was stuck between a rock and a hard place on this one.
 

loki1944

Member
Apr 23, 2020
A lot of fake nerd rage about cost, but we all know that doesn't matter as long as it's the fastest. Those who can easily spend $1,500 on a hobby will buy it even if it's only a little faster than the $700 card, particularly as SLI is dead, so there's no point buying 2x3080. Hence it'll be a success, unless AMD can dethrone it with Navi 2...

It matters to most gamers with more sense than money.
 

Qwertilot

Golden Member
Nov 28, 2013
-Maybe so, but then you launch a $700 3080 that is only slightly slower a month later and watch people's heads explode in anger.

NV was stuck between a rock and a hard place on this one.

Or you launch a somewhat slower 3080 and people complain about the large gap, sandbagging etc :)

Not that we need to feel sorry for NV. They're doing very nicely for themselves.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
Or you launch a somewhat slower 3080 and people complain about the large gap, sandbagging etc :)

Not that we need to feel sorry for NV. They're doing very nicely for themselves.

- Indeed, given the people willing to sell their firstborn children for these cards on release, then throwing a fit when there aren't enough to go around.