Five Generations of Radeons compared!


ctk1981

Golden Member
Aug 17, 2001
1,464
1
81
The two I mentioned are actually still being used. My dad doesn't play games but it's in his rig. My son has an older PC built just to play old games on Steam and it uses the 4890.

I know if I dig around I can find my 9500 Pro and an original ATI Radeon 32MB.....and a modded 8500 Pro. Why the hell do I even keep this stuff? lol
 

Plimogz

Senior member
Oct 3, 2009
678
0
71
I have a vintage ATI All In Wonder X800XT [...]
[...]I still have an AGP computer that works perfect and I just gave away another Radeon AGP video card so I'll probably keep the old AIW as backup in case the card I have goes bad.
AGP ATI AIW X800XT? Cool card; I remember wanting one a long time ago to replace a 9800 Pro AIW which had drowned in a flood.

***

So, which do you think history will remember as the better card for its time: 5870 or 7970? I mean the 5870 has some impressive credentials to its credit, but those 3GB Tahiti parts are aging pretty darn well.
 
Feb 19, 2009
10,457
10
76
We can see that with each generation AMD is getting closer to matching NV's top dog:

"For a brief comparison between the Radeon and GeForce cards before we wrap up, the HD 5870 was 25% slower than the GTX 480, the 6970 was 15% slower than the GTX 580, the 7970 was able to match the GTX 680 with the GHz Edition being 8% faster, and while the R9 290X stood evenly with GTX 780 Ti, it was 15% slower than the GTX 980."

They caught up with the 7970, but that's not as impressive because GK104 was a smaller chip. What's impressive is the R9 290X finally matching NV's top dog, the Titan and 780 Ti, while being a smaller chip. At 4K it demolishes them, particularly in CF vs. SLI. This is the first time in a long while that AMD/ATI has managed such a feat.

Maxwell is next-gen, smacking down everything before it in perf and perf/W.

Looking forward to AMD's R9 380X response!
 

SPBHM

Diamond Member
Sep 12, 2012
5,066
418
126
AGP ATI AIW X800XT? Cool card; I remember wanting one a long time ago to replace a 9800 Pro AIW which had drowned in a flood.

***

So, which do you think history will remember as the better card for its time: 5870 or 7970? I mean the 5870 has some impressive credentials to its credit, but those 3GB Tahiti parts are aging pretty darn well.

The problem with the 7970 is that the initial price was not very aggressive, it was $550 or something. That was fine because it was faster than the 580, but... not like the 5870, which was $379 and only beaten by NV after 5 months or so!?

But the 7970 is aging better.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
^ but in the hands of overclockers, the 7970 led or at least tied from day 1. Only a few cards like a max-overclocked MSI Lightning 680 could approach or beat a 1.2GHz 7970. Once driver updates kicked in, the 7970 GHz beat the 680 by June 2012 and held that lead until Titan in Feb 2013. The 5870 only had the lead for 6 months.

As a gaming card, the 7970 is way better. It overclocked better than a 5870, came with 3GB of VRAM while the 5870 became VRAM-bottlenecked quicker, and in modern titles the 7970 is beating the 680:
http://gamegpu.ru/action-/-fps-/-tps/evolve-beta-test-gpu.html

The 7970 also came out near the peak of the mining years, when it produced a lot of valuable bitcoins, essentially making thousands of dollars. But those who mined early on with a 5870 at lower difficulty and held on to the coins to flip them later also made good $.

I would say both are amazing cards for different reasons, but a 7970 @ 1.15-1.175GHz today is still relatively quick, for 1080p at least. Three years after the 5870 came out, it was starting to show its age with VRAM and tessellation bottlenecks.
 
Last edited:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
^ but in the hands of overclockers, the 7970 led or at least tied from day 1. Only a few cards like a max-overclocked MSI Lightning 680 could approach or beat a 1.2GHz 7970. Once driver updates kicked in, the 7970 GHz beat the 680 by June 2012 and held that lead until Titan in Feb 2013. The 5870 only had the lead for 6 months.

As a gaming card, the 7970 is way better. It overclocked better than a 5870, came with 3GB of VRAM while the 5870 became VRAM-bottlenecked quicker, and in modern titles the 7970 is beating the 680:
http://gamegpu.ru/action-/-fps-/-tps/evolve-beta-test-gpu.html

The 7970 also came out near the peak of the mining years, when it produced a lot of valuable bitcoins, essentially making thousands of dollars. But those who mined early on with a 5870 at lower difficulty and held on to the coins to flip them later also made good $.

I would say both are amazing cards for different reasons, but a 7970 @ 1.15-1.175GHz today is still relatively quick, for 1080p at least. Three years after the 5870 came out, it was starting to show its age with VRAM and tessellation bottlenecks.

I'll add Tahiti's compute capability (1/3 dp) which is still tremendous today.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I'll add Tahiti's compute capability (1/3 dp) which is still tremendous today.

Ya, forgot about that.

It's also impressive that in the GPU-killer titles the 7970/7970 GHz are more or less doubling the 5870.


[Attached benchmark charts: Crysis, Dragon Age]


If one paid enough attention back in early 2012, the HD 7970/7970 GHz duo had incredible performance gains over the 6970 in the more demanding titles like Crysis 2.

[Attached benchmark charts: Crysis 2, Metro 2033, Shogun 2, BF3 at 1920x1200; Alan Wake at 2560x1600]


Seems to me the 7970 got too much negative reception for its poor reference cooler and for being unable to show its true potential with so many CPU-limited titles at launch. Glimpses of that potential were visible in the games linked above, and the lead only grew! Amazing card to me.
 
Last edited:

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
The two I mentioned are actually still being used. My dad doesn't play games but it's in his rig. My son has an older PC built just to play old games on Steam and it uses the 4890.

I know if I dig around I can find my 9500 Pro and an original ATI Radeon 32MB.....and a modded 8500 Pro. Why the hell do I even keep this stuff? lol

Was the mod done to raise the voltage? I had the Radeon 8500 Pro and I remember that it had 8 pipelines, all of them already enabled unlike the vanilla 9500, so was the mod done for voltage control? What cooling did you use? Results? Mine topped out at 300MHz, which wasn't very impressive considering the stock frequency was 275MHz, but still close to 10%, and unfortunately all of my cards overclocked by about that amount, all of them, while I had CPUs overclocked by over 50-70% many times. Athlon XP 1700+, XP 2500+, Core 2 Duo E6400, i5 750, 2500K, 2600K, 5820K... All of them overclocked very well and I didn't win any silicon lottery; all of them overclocked no better than the average overclock for that chip. I never had any luck overclocking graphics cards. Why are graphics cards clocked so close to their potential?
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
What's impressive is the R9 290X finally matching NV's top dog, the Titan and 780 Ti, while being a smaller chip. At 4K it demolishes them, particularly in CF vs. SLI. This is the first time in a long while that AMD/ATI has managed such a feat.

Wait, what? At 4K 290X (uber mode) and 780 Ti/Titan Black are basically equals.
http://ht4u.net/reviews/2014/nvidia_geforce_gtx_titan_black_6gb_review/index39.php
http://www.computerbase.de/2014-03/nvidia-geforce-gtx-titan-black-test/5/
http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_980_SLI/20.html

What do people gain by exaggerating?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126

I think he meant "demolishing" in the context of price/performance. Take 2 after-market 290s and 2 after-market 780s at 4K. The first setup would have run you $700-750 around summer 2014, but the 2nd setup was $1300-1400 and wasn't faster on average at 4K. I would call that demolishing as well. You shouldn't have a situation where one GPU setup that costs close to double can't win, which is exactly the case with 780Ti SLI vs. after-market 290s at 4K.
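Just to make that arithmetic explicit, here is a purely illustrative perf-per-dollar sketch using the price ranges quoted above and assuming, as the charts linked below suggest, that the two setups average roughly the same 4K frame rate (the frame-rate number itself is a placeholder; only the ratio matters):

```python
# Illustrative only: prices are the ranges quoted in this thread (summer 2014),
# and the frame rate is a placeholder since the claim is that both setups
# perform about the same at 4K; only the ratio between them matters.
fps_4k = 60.0
setups = {
    "2x after-market R9 290":       (700 + 750) / 2,    # ~$700-750
    "$1300-1400 GeForce SLI setup": (1300 + 1400) / 2,  # ~$1300-1400
}
for name, price in setups.items():
    print(f"{name}: ${price:.0f} -> {fps_4k / price * 100:.2f} fps per $100")
# At equal frame rates, the cheaper setup delivers roughly 1.8x the performance per dollar.
```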


$1300-1400 780Ti SLI
$700-800 R9 290 CF

http://www.sweclockers.com/recension/18944-nvidia-geforce-gtx-titan-z/16#pagehead

He also has a point about after-market 290 vs. 780Ti at high resolutions. One year after the 780Ti's launch, you can buy 290X/780Ti-level performance in an after-market $250-275 R9 290 or a $330-350 GTX 970. Look how long it took to get 7970 GHz level of performance at $250 vs. how quickly one could get 780Ti-level performance after its launch.

In hindsight, the 780Ti turned out to be nothing more than a bragging-rights card for a short period of time, with zero future-proofing over an after-market 290 despite being priced at retail close to double for most of its life. I would say it's pretty much a given that those who bought a 780Ti or two aren't concerned about value, but since many of us don't care about bragging rights alone and are not made of $, I would be very upset if I bought a 780Ti for $650-700 and then by December 2014 an after-market R9 290 is just $250 with more or less identical high-res performance. Another reason why so many of us ripped the Titan: 1.5 years after it launched, one could buy nearly 3x Titan performance in after-market 290s for the price of a single Titan! :eek:

It's fair to say there is a top-tier market segment that couldn't care less about price and will pay hundreds of dollars more for 10-15% more performance that in the end makes no difference in future games. The rest of us are better off buying the 2nd-best AMD/NV card and upgrading with the $ saved. That's why to me, buying a pair of flagship $700 cards at launch only makes sense if you can afford to do that nearly every gen.

I mean what future-proofing was there in going with 580s over 570s, or 680s over 670s, or 780Tis over 780 GHz editions? In hindsight, none. Those 570/670 SLI users could have taken the savings and moved on to 970s by now, meaning the extra performance the 580 had was kinda irrelevant by the time next-gen games launched that crippled both the 570 and the 580. Now there are some exceptions, like if you can buy cards cheap through a contact at NV/AMD/AIB, or if you travel a lot to some country where you can buy flagship cards for way less than at home (say you live in Brazil but travel to the US every 2-3 years), or vice versa (say you live in the US but travel to Brazil to visit friends every 2-3 years, where you could offload a 780Ti for $600 US today I bet). I guess if gaming is your primary hobby, you are a high income earner, or you're single with no family/kids/relatives that need financial support, then it's not that expensive to buy and resell $1400 cards every 2 years. Still, that doesn't change the fact that 780Ti and 780Ti SLI were simply awful value setups against after-market 290/290 CF for most. :)
 
Last edited:

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
I think he was merely talking performance, not value, so your whole post is quite irrelevant to my argument.
Also remember, when looking at performance (especially in mGPU configs), the influence of default vs. maxed-out mode (quiet, uber, PT/TT). I'm a bit tired of pointing this out - most reviews test AMD in uber mode but Nvidia at default, which skews results in AMD's favor.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
My Titan left at default was like 15% slower due to throttling, which was trivial to eliminate on air.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I think he was merely talking performance, not value, so your whole post is quite irrelevant to my argument.
Also remember, when looking at performance (especially in mGPU configs), the influence of default vs. maxed-out mode (quiet, uber, PT/TT). I'm a bit tired of pointing this out - most reviews test AMD in uber mode but Nvidia at default, which skews results in AMD's favor.

There is no double standard here since NV's 780Ti doesn't throttle at default, but you need Uber mode on the 290X to get 1GHz clocks to stick. This isn't giving AMD any inherent advantage, it's just getting stock after-market 290 performance from a reference 290X. If anything, reviewers understate the 290X's performance by using Uber mode on reference 290X cards, because after-market 290X cards are even faster. Therefore, the comparison is valid.

It's almost pointless to talk about performance and not talk about price. I am not sure why you would ignore that metric unless $1400 to you is the same as $700. 780Ti was an overpriced card for most of us. I mean today you can buy a 295X2 for $660, a setup I already linked to you that's faster at 4K than 780Ti SLI.

Exactly. My two Titans behaved the same, maybe not 15% but 10% or so.

Do you have proof that a reference 780Ti throttles under load? What about HardOCP that tests cards at their max? What about LinusTechTips that tests all cards max overclocked at 4K? Look, a 980 max overclocked can barely beat a max overclocked 290X at 4K today. You think 780Ti and 780Ti SLI can beat 290X/290X CF on average at 4K? That's not happening.
http://www.youtube.com/watch?v=rMsYRo7X8EU

The point still stands: after-market 290s at $700-750 demolished $2000 Titans/Titan Blacks and $1400 780Ti SLI at 4K in terms of overall value, unless $1400-2000 is just as affordable to you as a $750 setup. During the lifetime of the Titans/780Ti, the 290 CF owner has $700-1300 in his pocket with 4K performance at least as good. Today, the 290s would win because NV basically stopped optimizing drivers for Kepler.

No need to defend a $1000 Titan when today it loses to a $250 R9 290 and a $330 970. Since the 290 has 64 ROPs, at 4K 290s in CF would beat dual Titans for $550 as of now. If you look back at those Sweclockers scores, even if you raise Titan SLI performance by 10-15%, it still wouldn't beat dual after-market 290s in CF = reference 290X CF.

[Attached 2560x1600 benchmark charts]
 
Last edited:
Feb 19, 2009
10,457
10
76
Early on, [H] clearly showed R9 290X in CF (reference or custom) demolishing 780 and 780Ti SLI. It just scales better at 4K with less stutter.

The 980 fixed whatever problem Kepler had at 4K & SLI, though.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Early on, [H] clearly showed R9 290X in CF (reference or custom) demolishing 780 and 780Ti SLI. It just scales better at 4K with less stutter.

The 980 fixed whatever problem Kepler had at 4K & SLI, though.

Not only do most sites show 780Ti / Titan SLI slower at 4K, but the frame delivery is also worse than 290s in CF.

Frame Rate Consistency and Scaling


"We experienced something with SLI we aren't use to at 4K gaming. We experienced some inconsistent frames, some low efficiency and poor SLI scaling. We were used to seeing this on AMD GPUs until AMD fixed their issues. AMD implemented a technology called Frame Pacing, and ultimately went the hardware route with XDMA on the AMD Radeon R9 290/X.



At the end of the day, what we find is that GeForce GTX 980 SLI performance is left wanting at 4K, not because Maxwell isn't fast, but because the current implementation of SLI is more inconsistent and less efficient compared to AMD's XDMA technology on the AMD Radeon R9 290X. This is a case of aging SLI actually hindering very capable GPUs. SLI needs an upgrade, it needs to evolve.



We do not think the true potential of the GeForce GTX 980 GPUs are being exploited with current 4K SLI gaming. It is being held back from its full potential. If GTX 980 could be fully and efficiently tapped, two GTX 980 GPUs have the potential to offer a better gameplay experience.

AMD hit NVIDIA hard with this new XDMA technology. Everyone was expecting NVIDIA would strike back with Maxwell by offering its own evolved SLI technology. However, it did not for this generation. That may end up biting NVIDIA in the butt as far as 4K gaming goes in the future."


http://www.hardocp.com/article/2014..._980_sli_4k_video_card_review/11#.VL5WMS7Ccc4
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
Choose either 4K performance or price/performance. 4K gamers aren't price-conscious and price-conscious gamers don't game on 4K monitors. AMD proponents always compare 4K performance even though almost no one has such a monitor, and NV proponents are even more ridiculous concentrating on 1080p. Most high-end gamers have 2560x1440, and so do I, and I'm as interested in 4K gaming as in 1080p performance. Very few people care about 4K performance, yet I see comparisons at that resolution more often than at the resolution that 100X more people use, the one actually most popular amongst high-end gamers, which is 2560x1440. People already know that Kepler and 4K don't go well together, and they don't care, because the people who do had already moved on from Kepler a long time ago, when the 980 launched.

Not only do most sites show 780Ti / Titan SLI slower at 4K, but the frame delivery is also worse than 290s in CF.

Frame Rate Consistency and Scaling

Yeah, keep harping on 4K performance. Almost no one who cared about 4K chose an NV solution! Better start ranting over poor driver optimisation in new games on Kepler, allowing the 290X to almost catch up to the 780Ti when it really should NOT. People actually care about that. 4K performance is irrelevant for Kepler owners just as much as for owners of 7970s in CF, which is even worse for 4K than Kepler, but no one cares, because 4K is for people who are on the bleeding edge of technology and both Tahiti and Kepler are old news.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Choose either 4K performance or price/performance. 4K gamers aren't price-conscious and price-conscious gamers don't game on 4K monitors.

Even when 290/290X CF is faster and cheaper?

Sure, I am not trying to argue that 4K performance is the end-all-be-all today. Titan SLI can't beat after-market 290 CF even at 1440P, so the point is still valid.


Today R9 290 CF costs $500-550 and it's at least as fast as dual Titans at 1440P or 1600P. From the day the GTX 780 launched, and later when the R9 290 launched, the Titan became what it was from day 1 - a rip-off for gaming. It may be a good card for semi-professionals/renderers/CUDA developers/scientific apps for university students, but as a gaming card it's an overpriced product.

A single 295X2 is $660 US, runs cooler and quieter than Titan SLI, and is faster too at your resolution. Even better, 970 SLI overclocked would trounce the Titans at 1440P while using way less power, for $660.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
Why do you think people who paid 1K per graphics card really care so much about the price? Most don't. When people were buying Titans there was no R9 290X, no 780Ti, and usually no 780, so the comparison between those cards is strange; no one buys Titans now. It's been on the market for almost two years!

BTW, Titans weren't so bad; a lot of people got them to 1.5GHz, which is 50% over stock, while the R9 290 overclocks like garbage. Unfortunately my Titans aren't so good, still they are 20% over stock and I'm still testing their limits on water.
 
Last edited:

Rvenger

Elite Member / Super Moderator / Video Cards
Apr 6, 2004
6,283
5
81
I agree.
PC benchmarks in general do need to include more of the older series.

The 8800GT is still minimum for many games. I'd love seeing more benchmarks testing down to the minimum specs.



I think they go as far back as the driver set allows them to. If they have to install different drivers across the cards, the review would be somewhat inaccurate.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
There is no double standard here since NV's 780Ti doesn't throttle at default, but you need Uber mode on the 290X to get 1GHz clocks to stick.

Wrong. Throttling is the wrong way to look at this. AMD and NV cards both cannot reach their maximum clock potential at stock settings. The buzzword here is "maximum clock potential". That is not throttling. NV doesn't throttle because the cards don't fall below the guaranteed base clock. AMD doesn't throttle because they add the moniker "up to". It may seem like nitpicking, but it is important to understand these mechanisms.
NV: fixed base clock, everything above depends on thermals. The clock ceiling depends on ASIC quality (some Titans, for example, can go up to 993 MHz, others a bin higher to 1006).

For example 780 Ti:
http://www.computerbase.de/2013-11/nvidia-geforce-gtx-780-ti-vs-gtx-titan-test/3/
6% difference in 1600p, probably similar for 4K
Titan Black:
http://www.computerbase.de/2014-03/nvidia-geforce-gtx-titan-black-test/5/
6% difference in 4K.

Thus if a reviewer tests NV cards at stock and AMD cards in uber mode, they give an unfair advantage to AMD. Why do you think AMD added uber mode in the first place? To avoid exactly the problem of not reaching maximum clocks and losing out a few percentage points in benchmarks. They saw the issue with NV boost and reacted accordingly and obviously cleverly since no one seems to have a problem with the testing practices that arose from it.
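To make the distinction concrete, here is a toy sketch of the two clocking models described above; all numbers, temperatures and drop percentages are invented for illustration, and real GPU Boost / PowerTune firmware is far more complex than this:

```python
# Toy model of the two boost philosophies discussed above. Every value here is
# made up for illustration; it is not how the actual firmware behaves.

def nv_style_clock(temp_c, base_mhz=875, max_boost_mhz=1006, temp_target=80):
    """Guaranteed base clock; boosts opportunistically above it while thermals allow."""
    if temp_c >= temp_target:
        return base_mhz  # never advertised below base, hence "no throttling"
    headroom = (temp_target - temp_c) / temp_target
    return min(max_boost_mhz, base_mhz + int(2 * headroom * (max_boost_mhz - base_mhz)))

def amd_style_clock(temp_c, up_to_mhz=1000, temp_limit=95, uber_mode=False):
    """'Up to' clock; can sag below the headline figure under the default fan cap.
    Uber mode raises the ceiling so the advertised clock sticks more often."""
    effective_limit = temp_limit if uber_mode else temp_limit - 10
    return up_to_mhz if temp_c < effective_limit else int(up_to_mhz * 0.85)

for t in (70, 85, 94):
    print(t, nv_style_clock(t), amd_style_clock(t), amd_style_clock(t, uber_mode=True))
```

Run warm (85C and above in this toy model) and the NV-style card settles at its base clock while the AMD-style card only holds its "up to" clock in uber mode - which is exactly why testing AMD in uber but NV at stock compares one card near its ceiling against one that is not.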

This isn't giving AMD any inherent advantage, it's just getting stock after-market 290 performance from a reference 290X. If anything, reviewers understate the 290X's performance by using Uber mode on reference 290X cards, because after-market 290X cards are even faster. Therefore, the comparison is valid.

Yes it is, see above. And of course you can test after-market cards, but if you test reference cards, at least do it right and fairly.

It's almost pointless to talk about performance and not talk about price. I am not sure why you would ignore that metric unless $1400 to you is the same as $700. 780Ti was an overpriced card for most of us. I mean today you can buy a 295X2 for $660, a setup I already linked to you that's faster at 4K than 780Ti SLI.

No, it is not, from a technical perspective. Sometimes people are interested in that, too, you know?
And no, the 295X2 is not always faster than 780 Ti SLI. See the clock issue above. Computerbase tested both setups, and especially in SLI the cards clock rather low without NV's "uber" mode; the difference is a whopping 14% in their testing:
http://www.computerbase.de/2014-04/amd-radeon-r9-295x2-benchmark-test/5/

It is pointless to discuss "faster" or "slower" without considering the boundary conditions. It hasn't been that easy ever since the inception of boost, and you should know it.

Do you have proof that a reference 780Ti throttles under load? What about HardOCP that tests cards at their max? What about LinusTechTips that tests all cards max overclocked at 4K? Look, a 980 max overclocked can barely beat a max overclocked 290X at 4K today. You think 780Ti and 780Ti SLI can beat 290X/290X CF on average at 4K? That's not happening.
http://www.youtube.com/watch?v=rMsYRo7X8EU

See my links above. Forget overclocking, that adds a whole load of complexity on top. Some samples overclock better, some worse, and then there's the different cooling capacity of the coolers...

The point still stands: after-market 290s at $700-750 demolished $2000 Titans/Titan Blacks and $1400 780Ti SLI at 4K in terms of overall value, unless $1400-2000 is just as affordable to you as a $750 setup. During the lifetime of the Titans/780Ti, the 290 CF owner has $700-1300 in his pocket with 4K performance at least as good. Today, the 290s would win because NV basically stopped optimizing drivers for Kepler.

No need to defend a $1000 Titan when today it loses to a $250 R9 290 and a $330 970. Since the 290 has 64 ROPs, at 4K 290s in CF would beat dual Titans for $550 as of now. If you look back at those Sweclockers scores, even if you raise Titan SLI performance by 10-15%, it still wouldn't beat dual after-market 290s in CF = reference 290X CF.

Price...yawn. Doesn't interest me in a technical discussion.
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
Ya, forgot about that.

It's also impressive that in the GPU-killer titles the 7970/7970 GHz are more or less doubling the 5870.


If one paid enough attention back in early 2012, the HD 7970/7970 GHz duo had incredible performance gains over the 6970 in the more demanding titles like Crysis 2.

Dat GCN......I didn't expect to pretty much double my framerates across the board going from the 5850 to the R9 270 (7870). That's more tessellation and a better architecture for you :D
 
Last edited: