> I remember 290 crossfire was some great bang for buck back in the day.

I just remembered tinkering with them to get them working properly in games that didn't support Crossfire officially. That was somehow fun. All these cards now are just not the same, especially with the BIOS locks. The 12GB 3080 was a complete money grab; they should have made it a 20GB 3080, then the price would have been somewhat justified.
> There are current games where even a 3090 will struggle at 1440p or 4K. If the goal is only 60 FPS, this may be workable, but there are many monitors with higher refresh rates, and a single 3090/6900 XT would struggle to keep up. I think they should bring back SLI and CF. It isn't perfect, but gives us more options. The only problem right now is the availability of many cards.

To some degree you can adjust for resolution, since even back when SLI/Crossfire were popular we had a situation similar to today's. Instead of 1080p, 1440p, and 4K being the primary resolutions, though, everything was shifted down a tier: 1080p (or the 16:10 equivalent, since that was more common then) was the middle resolution.
Here are some of the benchmark results from the AT review of the 4870, which show why SLI was necessary. Even at the middle resolution of 1920x1200, the top cards struggled to hit what would be considered a minimum acceptable frame rate today.
![AnandTech 4870 benchmark chart]()
And that's just for games like Oblivion, which weren't the most demanding titles. If you look at the results they have for Crysis, 1920x1200 is the top resolution that's benchmarked.
![AnandTech Crysis benchmark chart]()
Today we don't have a situation where a top-end card like a 3080 or 6800 XT needs SLI/Crossfire because it can't manage 60 FPS at a resolution like 1440p in many titles. There aren't even a lot of games where that's true at 4K. There are even a few games these cards run well enough that most people are unlikely to own a monitor capable of displaying all of the frames, since the refresh rate isn't high enough; a monitor that could would often be even more expensive than that top-end GPU.
And there were even cases where SLI/Crossfire still wasn't good enough; some games were just that demanding. Here are results from a different AT review showing Metro 2033 at 1920x1200, where outside of the most powerful GPUs it's just not possible to hit 60 FPS even with an SLI or Crossfire setup. Move up to 2560x1600 and even a pair of 580s in SLI only manage 35.5 FPS. Those cards retailed for $500 when they came out, which is about $650 today when adjusting for inflation.
![AnandTech Metro 2033 benchmark chart]()
So in a decade, the resolution that was the effective maximum for many games, even when running SLI/Crossfire, has become the lowest common resolution, and the top-end resolution (4K) has roughly 4x the pixels of that. The MSRP of a 3080 wasn't too far off what a 580 cost in today's dollars, and that single card delivers more than acceptable performance at that high-end resolution.
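To put rough numbers on that pixel-count comparison, here's a quick sketch (assuming 1920x1200 as the old effective maximum, 1080p as today's floor, and 4K as today's high end; the ~1.3x inflation factor is the one implied by the $500-to-$650 figure above):

```python
# Pixel counts for the resolutions discussed above.
resolutions = {
    "1920x1200": 1920 * 1200,  # old 16:10 effective maximum
    "1920x1080": 1920 * 1080,  # today's lowest common resolution
    "2560x1440": 2560 * 1440,  # today's middle tier
    "3840x2160": 3840 * 2160,  # 4K, today's high end
}

for name, px in resolutions.items():
    ratio = px / resolutions["1920x1200"]
    print(f"{name}: {px:,} pixels ({ratio:.2f}x the old maximum)")

# 4K is 3.6x the pixels of 1920x1200 (and exactly 4x the pixels of 1080p),
# i.e. "roughly 4x" either way.
# Inflation check on the GTX 580's $500 MSRP, using the ~1.3x cumulative
# factor implied by the ~$650 figure in the post.
print(f"$500 then is about ${500 * 1.3:.0f} now")
```

The exact multiplier depends on which baseline you pick, but it lands near 4x either way.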
> I think they should bring back SLI and CF. It isn't perfect, but gives us more options. The only problem right now is the availability of many cards.
SLI is nigh impossible with DX12/Vulkan
> Do any sane people actually buy graphics cards at these insanely inflated prices?

Depends what your time is worth and what you've got now. If you make a good amount, value your free time, and want to spend it playing games at high res, then you'll do what it takes to complete that task. If you pay double what a card is worth but get to play your games with full enjoyment for a year, then it's not really a big deal. Or, if you have free electricity and mine during its downtime, you can make up some of the extra $.
> or if you have free electricity and mine during its downtime you can make up some of the extra $?
$1000 is just 3x eating out.
> Not yet. Not for most people.

Well, I don't call McDonalds eating out ;( Sometimes I can get away with a $150 dinner if I share the main course with my wife. I can't remember the most expensive card I've purchased, but it was most likely $550 for a 1080 Ti many years ago. If I could pay double for double the performance, I'd still be happy.
> well I don't call McDonalds eating out ;(

Most Americans like their casual dining experiences where you get out the door with a ticket of around $50 (plus tip). You know: Olive Garden, Chilis, Applebees, places like that. If dGPUs were only $150, most folks wouldn't be complaining. There are very, very, very few people who pay $150 or more for one meal.

Again, one meal and eating out are different things to me. I can eat a meal for about $3, but going out is something else (I don't consider Applebees going out; that's more a waste of $, or just a regular meal). Either way, you get what you pay for. We haven't had a decent $150 CPU in 10 years; a used 7990? was as close as you could get.
> Again, one meal and eating out are different things to me. I can eat a meal for about $3, but going out is something else (I don't consider Applebees going out; that's more a waste of $, or just a regular meal). Either way, you get what you pay for. We haven't had a decent $150 CPU in 10 years; a used 7990? was as close as you could get.
Not MSRP, but there were loads of RX 480/470/570/580s for under $150 before the mining boom.
As for CPUs, the Ryzen 2600/2700 were frequently sub-$150 after the 3000 series released.
The 10700K went on sale for $150 for a few weeks at Microcenter a few months back.
You'd be hard-pressed to find people who agree that spending $1000 on three two-person meals in 2022 isn't Mr. Moneybags territory.
> we havnt had a decent 150$ cpu in 10 years a used 7990? was close as you could get.

Uh...

> Uh...

Yes, you quoted a mistype (autocorrect or something). The 7990 is not a CPU, it's a GPU. Sorry for my mistake.
> I don't consider less than 120FPS to be acceptable.

Which games are those? I pulled up a TPU review of the 3090, since they've got a large set of games, and the only thing it didn't hit 60 FPS for at 4K was Control. At 1440p the lowest FPS is in Anno 1800 at 67.9 FPS, but it's clearly CPU-bottlenecked since it gets basically the same FPS regardless of resolution.
I checked the 3090 review at Tom's and it's above 60 FPS in all nine games they tested at 4K. The only possible concern is the 99th-percentile frame rate (the fact that we now bother to care about this at all further proves my point, I think) dipping below 60 FPS in Metro Exodus.
The Gamers Nexus review has 8K results (4x the pixels of 4K) for the 3090, where it often only scrapes past 30 FPS across the six titles tested. The only games where it doesn't hit 60 FPS at 4K are Total War: Three Kingdoms and the fully ray-traced Quake II RTX, if you care to count that.
Hardware Unboxed unfortunately doesn't show graphs for each individual game, but their 3080 numbers can be used as a point of comparison. The only game there where it wouldn't hit 60 FPS at 4K is Microsoft Flight Simulator 2020.
Maybe there are other games out there, but after looking through a few reviews there were only three games where the 3090 struggled at 4K, and even then it generally managed around 50 FPS. I suppose you could turn on RTX and really tax the card, but then the counter-argument is that you can just use DLSS (if available) and basically eliminate the FPS issues entirely.
Eh? I thought multi GPU was supposed to be a major DX12 feature? Well maybe not "major" since practically nobody uses it, but . . .
Nah. Older APIs like DX11 were all abstractions: drivers would interpret those abstractions and decide how the work would map onto the actual hardware. That's why driver support was generally needed for SLI, but it could be done if the effort was put in.
Meanwhile, the new APIs work far more like consoles: everything you do is very explicit.
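As a loose illustration of the difference (a toy Python sketch, not a real graphics API; all class names here are made up): under the implicit model the driver hid multiple GPUs behind one device and silently alternated frames between them (AFR), while under the explicit model the application owns every adapter and must distribute the work itself, which is why multi-GPU support now has to be built into each game rather than patched in via a driver profile.

```python
# Toy model only: contrasts driver-managed (implicit) vs app-managed
# (explicit) multi-GPU frame distribution. No real graphics API involved.

class GPU:
    def __init__(self, name):
        self.name = name
        self.frames_rendered = []

    def render(self, frame):
        self.frames_rendered.append(frame)

# Implicit (DX11-era SLI/Crossfire): the app sees ONE device; the driver
# silently alternates frames between the physical GPUs (AFR).
class ImplicitDevice:
    def __init__(self, gpus):
        self._gpus = gpus
        self._next = 0

    def render(self, frame):
        gpu = self._gpus[self._next]          # driver's decision, hidden
        self._next = (self._next + 1) % len(self._gpus)
        gpu.render(frame)

gpus = [GPU("GPU0"), GPU("GPU1")]
device = ImplicitDevice(gpus)
for frame in range(6):
    device.render(frame)   # app code looks identical to single-GPU code

# Explicit (DX12/Vulkan-style): the app enumerates every adapter itself
# and must decide, per frame or per workload, which GPU does what.
gpus2 = [GPU("GPU0"), GPU("GPU1")]
for frame in range(6):
    gpus2[frame % 2].render(frame)  # the game engine owns this logic
```

Both loops end up alternating frames the same way; the difference is who wrote the scheduling logic, the driver or the game.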
> Eh? I thought multi GPU was supposed to be a major DX12 feature? Well maybe not "major" since practically nobody uses it, but . . .
Explicit Multi-GPU with DirectX 12 – Control, Freedom, New Possibilities (developer.nvidia.com)
Explicit like that?