What would happen if AMD and Nvidia took turns being in the lead?


Elfear

Diamond Member
May 30, 2004
7,165
824
126
25x16 was always an extremely niche resolution and by no means representative of enthusiast setups. 19x12 was the enthusiast resolution before the move to 16:9. Not only was 25x16 extremely difficult to run, but the screens were also incredibly expensive. Today 4K is very difficult to run, but at least the screens justify their price, so we are already seeing wider adoption of 4K than 25x16 ever had.

Just like today, 1440p is certainly the most logical resolution for comparing high-end cards, not 4K. Once we are comparing SLI/CF setups, then it makes sense to test such resolutions.

Fair enough. 1080p was probably more representative of even the higher-end gamers 6 years ago, much like 1440p is today. The TPU results I posted shouldn't shift meaningfully at 1080p. My main point was that using the "All Resolutions" graph didn't make sense because 1024x768 and 1280x1024 were mixed into those averages.

As far as dual GPUs on a single card are concerned, am I wrong in remembering that there were bigger compatibility hassles than with actual SLI/CF because you couldn't merely turn off one GPU? Can you also point out which dual-GPU card had a price/performance advantage over buying two cards? Keep in mind the resale value of two separate GPUs will be MUCH higher than a dual-GPU card's.

For people who had the room, dual cards were almost always the better option. But that doesn't get around the fact that there was a lot of talk back in the day about the fastest card, and the fact that I was replying to Happy Medium's post where he specifically said fastest "card" and not GPU.


The Fury X is not in the same performance class as a 1070. AIB 1070s are about 20% faster than a Fury X, with still more room to overclock and 100% more VRAM. The Fury X is basically its own category.

Meh. The reference 1070 is only 10% faster, and the average AIB card is 15-16% faster. That's basically the same as GTX 280 vs 4870, 5870 vs 480, 6970 vs 580, and so on. For high-end cards, 10-15% is within the same competitive tier. There may be some very highly overclocked AIB 1070s that are closer to 20%, but those don't represent the average 1070.

[Attached chart: perfrel_2560_1440.png (relative performance at 2560x1440)]
 

Thinker_145

Senior member
Apr 19, 2016
609
58
91
Fair enough. 1080p was probably more representative of even the higher-end gamers 6 years ago, much like 1440p is today. The TPU results I posted shouldn't shift meaningfully at 1080p. My main point was that using the "All Resolutions" graph didn't make sense because 1024x768 and 1280x1024 were mixed into those averages.



For people who had the room, dual cards were almost always the better option. But that doesn't get around the fact that there was a lot of talk back in the day about the fastest card, and the fact that I was replying to Happy Medium's post where he specifically said fastest "card" and not GPU.




Meh. The reference 1070 is only 10% faster, and the average AIB card is 15-16% faster. That's basically the same as GTX 280 vs 4870, 5870 vs 480, 6970 vs 580, and so on. For high-end cards, 10-15% is within the same competitive tier. There may be some very highly overclocked AIB 1070s that are closer to 20%, but those don't represent the average 1070.

[Attached chart: perfrel_2560_1440.png (relative performance at 2560x1440)]

Yes, 1050p was the most popular resolution at one time, and most of those gamers upgraded to 1080p, which then became the de facto resolution. For a few years 1080p and 1200p were both considered enthusiast level.

In the example you posted there is a 16% difference, and the clocks on that card are bottom of the barrel for 1070 AIB cards. That will easily go to about 25% when we do a max-OC vs max-OC comparison, since that is not affected at all by what type of 1070 you have (apart from the Micron vs Samsung RAM thing). Now factor in 100% more VRAM and tell me again that we are still in the same performance class?

There is a bigger difference between a max-OC 1070 and a max-OC Fury X than between a max-OC 1070 and a max-OC 1080.

Looking at reference vs reference performance when one of the cards in question has almost no room left for improvement is just not right at all. Both of the Fury cards are highly overrated simply because they are the equivalent of a premium AIB card yet end up being compared to reference cards in many of the benchmarks.

So me using one of the fastest AIB cards against the Fury X is not unfair at all. The fastest 1070 still has more OC headroom than a Fury X.
 

OatisCampbell

Senior member
Jun 26, 2013
302
83
101
I remember when ATi launched the 9700 and totally dominated and......

Ummm..yeah.

I've got a better thread:

What if NVIDIA and AMD went back to new gens every 18 months?

or

What if AMD and NVIDIA went back to 18-month refreshes, and Rendition, S3, Matrox, PowerVR, and 3dfx came back with competitive products?

This thread shows just how sad it is getting. For years we've been reduced to talking about "What if?" and "What was."

I wish AMD would just launch the da** Vega so I could buy one.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
You kinda messed those facts up... it was the contrary of what you said... every Nvidia card in your comparison was faster than its AMD counterpart.

As an owner of both a 680 and a 7970, he did not mess that up. Initially the 680 was the faster (and cheaper) card, but that didn't last long. Driver updates along with the GHz edition put the 7970 pretty firmly in the lead. Fast forward to today, and that lead is quite a bit larger, particularly if you're comparing it to the 2GB 680. I had a pair in SLI, one I gave to a friend in desperate need of an upgrade, the other I keep around to toss in random boxes when I'm tinkering. My 7970 however is still on "active duty" and gets a fair amount of usage in another part of the house.
 

OatisCampbell

Senior member
Jun 26, 2013
302
83
101
As an owner of both a 680 and a 7970, he did not mess that up. Initially the 680 was the faster (and cheaper) card, but that didn't last long. Driver updates along with the GHz edition put the 7970 pretty firmly in the lead. Fast forward to today, and that lead is quite a bit larger, particularly if you're comparing it to the 2GB 680. I had a pair in SLI, one I gave to a friend in desperate need of an upgrade, the other I keep around to toss in random boxes when I'm tinkering. My 7970 however is still on "active duty" and gets a fair amount of usage in another part of the house.

Yeah... but the only reason is that 7970s have more VRAM (2GB vs 3GB). It has nothing to do with driver updates, and the 7970 was not "firmly in the lead." (And we're already seeing the same with the Fury X and its 4GB of VRAM.)

https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_980/26.html

Over two years after the 680 launch, the 7970 GHz is only 3% above the 680 at 1600p and tied at 1080p. Where the GHz edition 7970 beat the 680 was on value: with the handful of bundled games and a much lower price, it was a no-brainer choice. (Loved that buy back in the day, and my 290.)

Your post still speaks to what I consider the problem of computer gaming these days:

How many times, for how many YEARS, has the 7970 vs 680 debate been discussed?

It has to be that way now because there are only 1.75 players in the market these days, and even the "1" (NVIDIA) has stretched their development time so far that it's more frustrating than anything when they do "gift" us with a release.

AMD's releases are mostly re-brands lately, and good lord, Fury X was a year and a half ago now and Vega is predicted for Q2... next year. :-(
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
Yeah... but the only reason is that 7970s have more VRAM (2GB vs 3GB). It has nothing to do with driver updates, and the 7970 was not "firmly in the lead." (And we're already seeing the same with the Fury X and its 4GB of VRAM.)

https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_980/26.html

Over two years after the 680 launch, the 7970 GHz is only 3% above the 680 at 1600p and tied at 1080p. Where the GHz edition 7970 beat the 680 was on value: with the handful of bundled games and a much lower price, it was a no-brainer choice. (Loved that buy back in the day, and my 290.)

Your post still speaks to what I consider the problem of computer gaming these days:

How many times, for how many YEARS, has the 7970 vs 680 debate been discussed?

It has to be that way now because there are only 1.75 players in the market these days, and even the "1" (NVIDIA) has stretched their development time so far that it's more frustrating than anything when they do "gift" us with a release.

AMD's releases are mostly re-brands lately, and good lord, Fury X was a year and a half ago now and Vega is predicted for Q2... next year. :-(

It had plenty to do with driver updates AND VRAM differences. There were several driver updates for the 7900 series that significantly increased its performance when it first came out. The VRAM advantage played itself out a bit later in the life cycle, and beyond.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
It had plenty to do with driver updates AND VRAM differences. There were several driver updates for the 7900 series that significantly increased its performance when it first came out. The VRAM advantage played itself out a bit later in the life cycle, and beyond.
And to add to that, a change in game development also went on to give the 7970 an advantage. Developers' change in direction favored the capabilities of the 7970's uarch over the 680's.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
AMD's releases are mostly re-brands lately and good lord, Fury X was a year and a half ago now and Vega predicted for Q2....next year. :-(

This is where I find myself this year. As someone who upgrades annually, AMD basically told me to go with Nvidia. Hopefully Vega puts AMD back into a spot where I normally shop; otherwise, I'll have to skip 'em for another year.


@Topic: Back in the day when AMD and NV used to exchange hits, they got sued for price fixing, haha. I personally don't see AMD/NV competing like they used to. I see what is happening in the CPU market basically being replicated with GPUs. AMD will have gems at specific price tiers, but NV will dominate the top end.

Which is fine; AMD doesn't need the crown to stay healthy. But boy, are these forums (and many other "enthusiast" ones) going to be hard to read.