1080Ti still shows us how bad the 2080Ti/2080 was


BFG10K

Lifer
Aug 14, 2000
22,709
2,971
126
A timely article:



The 1080Ti had huge gains in specs and performance over the 980Ti for basically the same price, at a much smaller die size.

In contrast, it's horrifically shocking how overpriced and underperforming the 2080Ti was relative to the 1080Ti, especially given the overall failure of ray-tracing/DLSS. An absolute turd of a card.

Part 2 is up (2080):


You can clearly see 8GB is not enough in Doom Eternal at 4K, even though the 2080 is otherwise more than fast enough to handle it. A frametime graph there would show noticeable stutter.
 
Last edited:

Golgatha

Lifer
Jul 18, 2003
12,651
1,514
126
Those folks who got a 1080Ti right at launch got a hell of a value. I got mine just about when GPU prices started really spiking (for about $775), and for going on three years now, it's held up far better than I expected.

Same here. I just recently moved my 1080 Ti to a system I helped my son build; I bought it at launch for $700 + tax. Best GPU purchase so far in terms of longevity. In contrast, I replaced it with a used 2080 Super, which likely took a $100-200 hit in value within a month due to the 3000 series announcement :D
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
Yep, my 1080Ti still performs like a new card. I can't think of a more legendary GPU. The 8800GT was incredible value, almost matching the 8800GTX at half the price. It was insanity. It didn't last 3-4 years though. What GPU is more legendary than the 1080Ti? I can't think of one, but if they released a 3080Ti at $800 or less, I think we would have another legendary card. However, they just replaced the X80 card at $700. I've got a feeling the Ti replacement will still be over $1000, just like the 2080Ti was. People are basically fine with it because we've been starved of a good GPU release for over 3 years, so at this point the 3080 looks better than it would otherwise.
 

Saylick

Diamond Member
Sep 10, 2012
3,157
6,374
136
What GPU is more legendary than the 1080Ti?
You know, as much as I like making fun of the R9 290X and how hot and loud it ran, subsequent releases of that die with AIB cooling and FineWine Technology gave Hawaii some staying power. Probably not as legit as the 1080 Ti but decent.

I have no doubt in my mind that if Nvidia made a 1080 Ti Mk2 that was essentially Pascal on 12nm but scaled up to 2080 Ti die size and given the faster memory, it would have been a better performer than the 2080 Ti. Like, TU102 is 50% larger than GP102 and that's with all the extra RTX fluff. Just strip that out and slap on even more SMs in there and bam, you got yourself 50% more performance vs Turing's measly 25-30%. 😁
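Napkin math on that, using the commonly cited die sizes and full-die core counts (approximate public figures, so treat this as an illustration rather than a prediction):

```python
# Back-of-envelope only: approximate public die sizes and full-die core counts.
GP102_AREA_MM2 = 471    # Pascal GP102 (the 1080 Ti ships a cut-down 3584-core config)
GP102_CORES = 3840      # full GP102
TU102_AREA_MM2 = 754    # Turing TU102 (the 2080 Ti ships a cut-down 4352-core config)

area_ratio = TU102_AREA_MM2 / GP102_AREA_MM2
hypothetical_cores = GP102_CORES * area_ratio  # if all of that area went to Pascal-style SMs

print(f"TU102 is ~{(area_ratio - 1) * 100:.0f}% larger than GP102")
print(f"A pure-raster GP102 scaled to TU102 size could hold ~{hypothetical_cores:.0f} Pascal cores")
print(f"vs the 1080 Ti's 3584 -> ~{(hypothetical_cores / 3584 - 1) * 100:.0f}% more shader hardware")
```

Clocks, power, and memory bandwidth would gate how much of that turns into actual FPS, so it's an upper bound, not a forecast.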
 

coercitiv

Diamond Member
Jan 24, 2014
6,199
11,895
136
I have no doubt in my mind that if Nvidia made a 1080 Ti Mk2 that was essentially Pascal on 12nm but scaled up to 2080 Ti die size and given the faster memory, it would have been a better performer than the 2080 Ti. Like, TU102 is 50% larger than GP102 and that's with all the extra RTX fluff. Just strip that out and slap on even more SMs in there and bam, you got yourself 50% more performance vs Turing's measly 25-30%. 😁
The gamble Nvidia made was offsetting the transistor and performance cost of RT with the help of DLSS. It is a very tempting path to take, doing more with the same transistor count by leveraging machine learning, but also a narrow path to success as it also requires wide developer adoption of a proprietary tech. Turing faced all the problems, from slow adoption to lackluster DLSS performance/quality in the first iteration. Ampere has a much easier mission from this PoV.

There's an argument to be made here that RT adoption was abrupt, and that it may have been better to start with a narrower scope and build from there. The problem with the narrow-scope approach was the competition: it would have given everyone else enough time to adapt. Nvidia tried to get ahead, and this year we'll see if they make it happen.

Strictly from a consumer perspective, people who bought Turing for the RTX essentially made a down payment on everyone's Ampere :)
 
Last edited:

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
I have no doubt in my mind that if Nvidia made a 1080 Ti Mk2 that was essentially Pascal on 12nm but scaled up to 2080 Ti die size and given the faster memory, it would have been a better performer than the 2080 Ti. Like, TU102 is 50% larger than GP102 and that's with all the extra RTX fluff. Just strip that out and slap on even more SMs in there and bam, you got yourself 50% more performance vs Turing's measly 25-30%. 😁

I don't know if we would have gotten better performance. Part of the reason I suspect they went for ray tracing is that perf/watt wouldn't have improved much, so without it, it would have been a GTX 2080 Ti with a smaller die, still the same ~30% improvement, and priced at $699.

Using ray tracing is an attempt to get around the lack of a perf/watt increase. It does something totally different anyway.

Perf/watt improvement has slowed drastically in recent years, mostly because we're up against cooling and power limits, but the increase in power consumption for peak performance continues.
 

JoeRambo

Golden Member
Jun 13, 2013
1,814
2,105
136
The 1080 Ti is the stuff of legends for sure. I'm 100% certain NV knows that too, and the 320-bit memory config on the 3080 is a double play: it increases the gap to the 3090 AND makes fellow 1080 Ti (352-bit, 11GB) owners consider the 3090 due to the memory config.
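For reference, rough bandwidth numbers from the spec-sheet bus widths and memory speeds (a quick sketch, using the commonly listed figures):

```python
# Bandwidth (GB/s) = bus width (bits) / 8 * per-pin data rate (Gbps)
cards = {
    "1080 Ti (352-bit, 11 Gbps GDDR5X)":   (352, 11.0),
    "2080    (256-bit, 14 Gbps GDDR6)":    (256, 14.0),
    "3080    (320-bit, 19 Gbps GDDR6X)":   (320, 19.0),
    "3090    (384-bit, 19.5 Gbps GDDR6X)": (384, 19.5),
}

for name, (bus_bits, gbps) in cards.items():
    print(f"{name}: {bus_bits / 8 * gbps:.0f} GB/s")
```

Bandwidth-wise the 3080 is well ahead of the 1080 Ti despite the narrower bus; it's the 11GB → 10GB capacity step that gives 1080 Ti owners pause, which is exactly the nudge toward the 3090.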

I have been rocking a 1080 Ti undervolted @ 2GHz for all these years and it is still performant even on a 34" ultrawide. The 20XX generation had some reliability problems at the start, and even if I had the money for the purchase, the motivation simply wasn't there. I wasn't really interested in a 1080p RTX experience, and DLSS 1.0 wasn't relevant for ultrawide either.

Things are completely different with the 30XX gen now: DLSS 2.0 is simply too good to miss, and the raw performance is there.

The only problem is money: compared to the 1080 Ti days, I can (way more) easily afford a 3090, even if a 3080 is more than enough (I've never seen any game even approach 10GB at 34" ultrawide res), but damn, I want another long-lasting card.
 

EliteRetard

Diamond Member
Mar 6, 2006
6,490
1,021
136
Seconds after posting, this video came up on my YT feed:
*snip*

There's a small bit at the end that I think most people are missing.

Many people are acting like Nvidia is some kind of saint for the "low prices" on RTX 3000.
The prices match RTX 2000, which was priced much higher than usual compared to the last 10+ years.
The only reason the 3000 series looks good is that the 2000 series was terrible. That's not a good thing.

Past-generation prices were lower, and I think RTX 3000 prices should come down as well (rough math on the premiums after the list):

3080 (X80 tier): was $550, now $700, should be $600
3070 (X70 tier): was $350, now $500, should be $400
3060 (X60 tier): was $250, now $350, should be $250
3050 (X50 tier): was $150, now $250, should be $150
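Working out those premiums, using the numbers above:

```python
# Premium of the current (RTX 3000) MSRPs over the older typical tier prices,
# using the figures from the list above.
tiers = {
    "3080 (X80 tier)": (550, 700),
    "3070 (X70 tier)": (350, 500),
    "3060 (X60 tier)": (250, 350),
    "3050 (X50 tier)": (150, 250),
}

for name, (old, new) in tiers.items():
    print(f"{name}: ${old} -> ${new}  (+${new - old}, +{(new / old - 1) * 100:.0f}%)")
```

That's a 27-67% premium per tier, biggest at the bottom of the stack.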
 

MrTeal

Diamond Member
Dec 7, 2003
3,569
1,699
136
You know, as much as I like making fun of the R9 290X and how hot and loud it ran, subsequent releases of that die with AIB cooling and FineWine Technology gave Hawaii some staying power. Probably not as legit as the 1080 Ti but decent.
It's got to be close, especially the 290. You got close to 780 Ti performance at half the price, and Nvidia didn't really answer it for a year until the 970 came out. Even then Hawaii was within spitting distance of Maxwell until the 980 Ti came out a couple years later. That GPU had legs, especially if you got one at firesale prices during a mining dip.
 

biostud

Lifer
Feb 27, 2003
18,249
4,760
136
Basically because a lot of the die is used for tensor cores and ray tracing instead of raw graphics power. One could argue that the 20xx benchmarks should have had DLSS 2.0 enabled in the games that support it, as it is one of the key features of the 20xx cards and would make the figures look far better. Ray tracing, as we all know, hits performance a lot, but it's still way better than on the 10xx cards. The 20xx cards were a necessary step for the 30xx cards to take off.
 

voodoo7817

Member
Oct 22, 2006
193
0
76
It's got to be close, especially the 290. You got close to 780 Ti performance at half the price, and Nvidia didn't really answer it for a year until the 970 came out. Even then Hawaii was within spitting distance of Maxwell until the 980 Ti came out a couple years later. That GPU had legs, especially if you got one at firesale prices during a mining dip.

I'm still chugging along with my 290, which I got for $255 back in spring 2015. It has served me extremely well, and only when I upgraded to a 1600p ultrawide last year did I really start to feel the upgrade itch. Even today, since I don't play competitive FPS, basically everything is at least playable. I feel like my patience has really paid off, as it looks like we've finally reached the point where the "true mid-range" of less than $300 offers a worthwhile price/performance upgrade, at least in my opinion. Whether it's the 3060 from Nvidia or the 6700 from AMD, I'm looking forward to finally upgrading my GPU again.
 

eek2121

Platinum Member
Aug 2, 2005
2,930
4,026
136
I don't know if we would have gotten better performance. Part of the reason I suspect they went for ray tracing is that perf/watt wouldn't have improved much, so without it, it would have been a GTX 2080 Ti with a smaller die, still the same ~30% improvement, and priced at $699.

Using ray tracing is an attempt to get around the lack of a perf/watt increase. It does something totally different anyway.

Perf/watt improvement has slowed drastically in recent years, mostly because we're up against cooling and power limits, but the increase in power consumption for peak performance continues.

Not really. NVIDIA could have skipped RT altogether and delivered a monster of a card. However, RT is necessary. It is the future, and it was the future long before NVIDIA added it to their cards. Eventually, traditional rasterization will be done away with.