Clash of the Titans: AMD, NVIDIA, and ASUS flagship graphics card competition

csbin

Senior member
Feb 4, 2013
907
611
136
http://gamegpu.ru/test-video-cards/titany-i-olimpijtsy-test-gpu.html



HD 7970 GE: 1160/7160 MHz
HD 7990: 1100/6500 MHz
ARES II: 1200/7200 MHz
GTX 680: 1150/7000 MHz
GTX 690: 1020/7000 MHz
GTX Titan: 1180/6700 MHz

Nvidia GeForce / ION Driver Release 320.20

AMD Catalyst 13.5



3DMark 11


[benchmark chart: 3d11.jpg]



3DMark 13


[benchmark chart: 3d.jpg]



Metro 2033


[benchmark chart: m%202560.jpg]




Total War: Shogun 2


[benchmark chart: tw%202560.jpg]



The Witcher 2: Assassins of Kings


[benchmark chart: witcher%202560.jpg]




Far Cry 3


[benchmark chart: fc%203%202560.jpg]



Crysis 3


[benchmark chart: c3%202560.jpg]



Tomb Raider


[benchmark chart: tr%202560.jpg]



Bioshock Infinite

[benchmark chart: bi%202560.jpg]




Results summary


[summary chart: uq%202560.jpg]
 
Feb 19, 2009
10,457
10
76
All this shows is that gaming at high res still brings top GPUs or multi-GPU setups to their knees... and people keep throwing around the mantra that more GPU power is not required. Pfft!
 

Fx1

Golden Member
Aug 22, 2012
1,215
5
81
2560x1600 is way too hard given that you can't do 60fps even on a 7990!

1440p looks as big as you really want to go; otherwise you need a $1k GPU just to get 40-50 fps!
 
Last edited:

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
Thanks for the benches, missed these over there. Nice to see these comparative runs using a Titan with a large overclock.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Did I miss something, or wasn't nVidia handily beating AMD in Bioshock? Did 13.5 bring improvements for AMD on that title?
 

DarkKnightDude

Senior member
Mar 10, 2011
981
44
91
Isn't it kinda misleading to compare the cards at the same clocks? I'm under the impression that GCN and Kepler are different?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Isn't it kinda misleading to compare the cards at the same clocks? I'm under the impression that GCN and Kepler are different?

They are not comparing them at the same clocks. The NV clock rates are base GPU rates, not the max boost rates. When you see a GTX680 @ 1150MHz, that's the base clock; add boost and you are probably looking at the card operating at 1230-1250MHz. Secondly, the review overclocked their samples to the max. They were not comparing GPUs on a per-clock basis.

Goes to show a single Titan is in no man's land as I said before -- not fast enough for 1600P on its own - really need 2 of those bad boys. Fingers crossed 20nm flagship Maxwell = Ares II OC. That would be a nice upgrade.
 
Last edited:

Jacky60

Golden Member
Jan 3, 2010
1,123
0
0
It does show how even with older games (Witcher 2 / Metro 2033), modern GPUs and CPUs are struggling, as well as with the newer games. AMD/Nvidia need to make bigger, faster improvements than the pedestrian progress they've made of late.
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
The 7970 is baffling to me. At times it competes with Titan, and the next second it loses to a GTX 680. Could there be any performance the driver team hasn't brought to the surface yet?

Also, that Ares II... it's $1200, isn't it? If they fix the CrossFire drivers, that thing is a better buy than the other $1000+ GPUs if you ask me.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Did I miss something, or wasn't nVidia handily beating AMD in Bioshock? Did 13.5 bring improvements for AMD on that title?

I have not benched, but the game is running better for me with 13.5 than previous releases. I am just watching my fraps numbers.
 

Madpacket

Platinum Member
Nov 15, 2005
2,068
326
126
Wonder how this would look if you measured frame times rather than frame rates.
 

FalseChristian

Diamond Member
Jan 7, 2002
3,322
0
71
Weird. I was getting 1100 fps in the first test, then by the last test I was down to 8.2 fps where the two were fighting.

I guess I must come to the realization that my GTX 460 1GB SLI setup is long in the tooth. I'll wait for the GTX 760 ?3GB? and grab two of those. I'm not impressed with the GTX 660.
 
Last edited:

Will Robinson

Golden Member
Dec 19, 2009
1,408
0
0
*Bookmarked thread for dismantling claims by..ahem....various members here insisting on which cards own the high end market.*^_^
Nice post OP....tho I suspect Sonty and a few others won't believe/like/understand them.:thumbsup:
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Goes to show a single Titan is in no man's land as I said before -- not fast enough for 1600P on its own - really need 2 of those bad boys. Fingers crossed 20nm flagship Maxwell = Ares II OC. That would be a nice upgrade.

Since we're talking about AFR, neither the 7990 nor the 690 is even playable at those settings in several of the titles.

The only real option at that res and with those settings is dual Titan; the summary section would require tri Titan, and quad doesn't scale enough to matter.

All this shows is that for $1k you don't get anywhere; you need to invest more money.

30-40 fps is unplayable in MGPU, that's alternating frames rendered at 15 to 20 fps.
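To put numbers on the AFR point (an illustrative sketch; the function name is made up, not from any benchmark tool):

```python
def afr_per_gpu_fps(reported_fps: float, num_gpus: int = 2) -> float:
    """In alternate-frame rendering (AFR), each GPU renders every
    num_gpus-th frame, so each card's individual cadence is the
    reported frame rate divided by the GPU count."""
    return reported_fps / num_gpus

# A dual-GPU card (7990/690) reporting 30-40 fps paces each GPU at 15-20 fps:
print(afr_per_gpu_fps(30))  # 15.0
print(afr_per_gpu_fps(40))  # 20.0
```

This is a best-case view; uneven frame delivery (micro-stutter) makes the effective rate feel even lower.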
 

Rvenger

Elite Member <br> Super Moderator <br> Video Cards
Apr 6, 2004
6,283
5
81
The 7970 is baffling to me. At times it competes with Titan, and the next second it loses to a GTX 680. Could there be any performance the driver team hasn't brought to the surface yet?

Also, that Ares II... it's $1200, isn't it? If they fix the CrossFire drivers, that thing is a better buy than the other $1000+ GPUs if you ask me.


A regular 7970 should be 10% faster than a 680, and a 7970 GHz 15-20% faster... if the drivers were properly coded/optimized. Nvidia always has the better optimizations when it comes to software. It seems AMD needs the better hardware to make up for the lackluster driver optimizations.

I am sure the driver team over at Nvidia has a prototype 7xxx-series driver and is laughing at AMD for hindering its own cards' capabilities, LOL.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Since we're talking about AFR, neither the 7990 nor the 690 is even playable at those settings in several of the titles.

30-40 fps is unplayable in MGPU, that's alternating frames rendered at 15 to 20 fps.

I don't necessarily disagree. My point is that this measly 35% or even 45% increase on the Titan OC is a drop in the bucket for handling demanding games that choke an HD7970GE OC / GTX680 OC. We need GPUs 75-100% faster to make a real dent in playability, or at least Titan-level performance at a reasonable price, to move the market forward. Otherwise, you more or less have to buy 2+ Titans, which is 2 grand. The 20nm generation can't get here fast enough. Once games like Witcher 3 and other next-gen titles come out, Titan and everything else this gen will be a complete paperweight for 1600P and possibly 1080P.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Heh, first time I've ever heard of 35-45% called measly at the top end.

It's a dog chasing its tail: you get new GPUs twice as fast, and you get games twice as hard to max...

I think you're being a bit sensational with the paperweight thought, often you can reduce one or two settings to get twice the performance. Just dropping MSAA in some of those titles shown would be enough to nearly double your performance.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Heh, first time I've ever heard of 35-45% called measly at the top end.

It's a dog chasing its tail: you get new GPUs twice as fast, and you get games twice as hard to max...

I think you're being a bit sensational with the paperweight thought, often you can reduce one or two settings to get twice the performance. Just dropping MSAA in some of those titles shown would be enough to nearly double your performance.
Yep, even a modest GTX 660 Ti can run all games on nearly max settings at 1080p by using common sense. Just leaving off MSAA allows me to stay above 60 in most cases. Even in Crysis 3, I have every setting on very high except shadows on medium, and with SMAA I get 40-50 fps average. Not bad for the best-looking game out there. And funny enough, it's been my CPU that can't keep me above 60fps in a few games.
 

Gikaseixas

Platinum Member
Jul 1, 2004
2,836
218
106
I am sure the driver team over at Nvidia has a prototype 7xxx-series driver and is laughing at AMD for hindering its own cards' capabilities, LOL.

A possible assumption, but a real push is being made right now, so the situation might change soon. Driver improvements have taken a higher priority within the AMD camp.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Heh, first time I've ever heard of 35-45% called measly at the top end.

Context - Historical

HD4870/4890 --> HD5870/6970 (60-70% in modern games) --> HD7970 OC (70-80% in modern games). Each incremental upgrade cost very little out of pocket after selling the previous AMD card. Similar story on the NV side: 8800GTX --> GTX280 --> GTX480/580. A very reasonable upgrade path out of pocket. Even 580 to 680 was reasonable since it was $499 for a 35% increase.

vs.

GTX680 / 7970GE --> Titan (35%) for a whopping $600+. :whistle: It's not about the Titan but what this means for the GPU industry as a whole. If we gamers accept this crap, then $1000 is the new flagship price.

Context - Current Gaming trends

It's what toyota says. If you are gaming at 1080P or similar, you drop AA a bit and you are good to go with an HD7950 OC / 670 / 680. If you are hitting 50-60 fps on those cards at 1080P, you have a good gaming experience. The Titan's 35% increase won't get you to 120 fps for 120Hz 1080P monitors, making it a waste at 1080P given the $1000 asking price. For 1600P, the number of pixels rises 98%, but the Titan is only 35% faster, say 45% overclocked. That's not enough, which means if you upgrade from 1080P + HD7970 OC to 1600P + Titan OC, you are going to suffer a major performance hit or have to reduce visuals significantly.
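The pixel arithmetic behind that 98% figure is easy to verify (a quick illustrative calculation, not part of the original post):

```python
# Pixel counts at the two resolutions being compared.
pixels_1080p = 1920 * 1080   # 2,073,600 pixels
pixels_1600p = 2560 * 1600   # 4,096,000 pixels

increase = pixels_1600p / pixels_1080p - 1
print(f"{increase:.0%} more pixels")  # 98% more pixels
```

So the workload roughly doubles while the card is only ~35-45% faster, which is the mismatch being described.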

It's not about the 35-45% performance increase being measly, but the context within which the Titan's performance sits now. Price-wise, it's a failure. In terms of readiness for next-gen games, it's also a failure. We are on the cusp of the PS4/720 with next-gen titles that will increase demands by orders of magnitude. Once those titles ship, we will be looking at 20nm GPUs.

It's a dog chasing its tail: you get new GPUs twice as fast, and you get games twice as hard to max...

Once games twice as hard come out, then buy a GPU twice as fast for them, because chances are 20nm $550 GPUs will offer Titan's performance for half the price, or we'll have GPUs 30-40% faster than the Titan by the time those games hit. This is exactly like wasting $ on a GTX280 over a 4870, or a GTX480/580 over a 5870/6970. In hindsight, in all 3 of those cases it would have been a lot smarter to save the $140-150 on each of those upgrades, go for a slightly slower card, and reinvest the savings into a new gen that was 60-75% faster. Titan's individual performance is not fast enough for next-gen titles or 1600P. That means it only makes sense if you get 2-3 of them within the context of its current performance in games like Crysis 3, Tomb Raider, etc. Otherwise, the same scenario will play out for the 4th time, where the savings from not getting the Titan would buy you 60-75% more performance over a GTX680/7970GE once reinvested into 20nm GPUs. That's $600 extra towards 20nm GPUs being wasted on 95% console ports and the 3-4 demanding titles that cream Titan anyway.

I think you're being a bit sensational with the paperweight thought, often you can reduce one or two settings to get twice the performance. Just dropping MSAA in some of those titles shown would be enough to nearly double your performance.

It's not that it's going to become a paperweight in literal terms. I am saying you are overpaying for a 35% performance increase at today's prices when next-gen games are not here. Looking at the few demanding games that are out, the Titan is not fast enough, which signals that it will get creamed by next-gen titles. GameGPU's results are foreshadowing this already. You need HD7990 OC or GTX690 OC level of performance from a single GPU for next-gen games. The HD7970 came out the 1st week of January 2012. It's been 1.5 years since 28nm tech launched. We are likely past the midpoint towards 20nm tech, especially from AMD. Think about the idea of spending $1,000 for a 35% performance increase over a 1.5-year-old GPU. Now look at HD4870 OC vs. HD3870X2, HD5870/6970 OC vs. HD4870X2, or HD7970 OC vs. HD6990. Why wouldn't the same scenario play out once 20nm flagship cards launch - i.e., 20nm HD9970 OC ~ HD7990?
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
*Bookmarked thread for dismantling claims by..ahem....various members here insisting on which cards own the high end market.*^_^

The goal posts shift very quickly around here. I've been away, and suddenly the GTX680 is neck and neck with the HD7970GE? On what planet, exactly, and why do the GTX680's lower VRAM and higher price continue to be ignored? If we narrow it down to the last 3-4 AAA games that choke the GTX680/7970GE and forget all the other games where the 680 gets leveled, then sure. It's interesting how all those other titles where the 680 is slower, or non-mainstream titles outside the most popular games of the last 3 months, don't count anymore, but the titles where the 680 keeps up are "most relevant." So the only performance that matters now is in the last 3-4 mainstream games?
http://forums.anandtech.com/showpost.php?p=34997954&postcount=121

April 30, 2013 - PC Perspective - Frame Rating: High End GPUs Benchmarked
"The Radeon HD 7970 GHz Edition was the second best card and considering you can find it for less than half the price, it makes a compelling case at beating out the GTX Titan for 4K bragging rights. It performed better than the GTX 680 2GB and GTX 680 4GB in our testing which follows the results we have seen at 2560x1440 previously."

Let me get this straight: the cheapest 680 4GB card is $515 on Newegg, and it loses to a $410 HD7970GE and has a worse game bundle. This forum never seems to acknowledge how overpriced NV cards have been this generation, but at almost any opportunity brings up the HD7970's launch price of $549 from 1.5 years ago. Only for a limited time, when the GTX670 and 680 delivered superior value hands down over the 7950/7970, were they worth talking about for single-GPU setups. The GTX680 4GB should cost $375 at most since it's slower than a $410 card. NV marketing FTW.
 
Last edited:

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
You said the increase was measly, not the perf/$, which I wouldn't have disagreed with; but Titan is nearly 100% faster than the 580, and a large increase over the $500 mid-range 680.

You bought a 100% markup mid-range product, but using the same logic can't justify a 100% markup high-end product?

You aren't making any sense. If Titan isn't enough for 1600p, then why has the 7970 been pitched since day 1 as a 1600p product? You linked 1600p benchmarks relentlessly against the 680 to show the 7970 in a better light.

Price wise it's a total win for Nvidia, high margin products are what every company wants and that's exactly what Titan is.

What you're saying about Titan is true for every other card, why buy a 7850 for $250 when you can get a 8850 for $250 with 60-100% more performance, with better efficiency? Why buy anything at all in this industry, it's all outdated sooner rather than later?

Not fast enough for the settings, but neither are the dual cards. So what's next, no cards are fast enough for these games or Titan is fast enough with proper settings? You're concerned about turning down settings with Titan, but ignoring the fact that every other card has to as well. That said I would agree with you, Titan is a MGPU card with higher than 1600p in mind, and that doesn't mean 3x1600p with unnecessary AA modes in deferred games.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
You said the increase was measly, not the perf/$, which I wouldn't have disagreed with; but Titan is nearly 100% faster than the 580, and a large increase over the $500 mid-range 680.

Why are you ignoring the existence of the GTX680? The GTX280 was $649 and dropped to $499 one month later; 7 months later the GTX285 came out for $349; 2 months later the HD4890 came out with GTX280-level performance for $259; 6 months later the HD5850 came out for $269, besting the GTX285, while the HD5870 crushed the 285 for $370; then 6 months later the GTX470 came out for $350. That's a healthy GPU market trading blows back and forth, with price drops and/or continuous performance increases.

vs.

Today's stagnating GPU market from both AMD and NV.

The HD7970 came out at $549, then 2.5 months later the GTX680 at $499, then 3 months later the HD7970GE for $470 at retail. Then from June 21, 2012 until now, there have been almost no price drops on 7970GE/680 cards. Terrible. For a gamer, this generation is one of the worst in a long time. Bitcoin mining is the only thing that made it bearable.

February 2011 - I got my HD6950 for $230 and it unlocked into a 6970.
May 2013 - a GTX660Ti on a very good sale is $205, but it only offers 24-25% more performance, maybe 35% more when overclocked. How are you not seeing this?

In 2 years and 3 months, there is almost nothing you can get that's 75-90% faster than an unlocked HD6950 for $230. You need to pay ~$290 for a 7950 and overclock it.

If Titan isn't enough for 1600p, then why has the 7970 been pitched since day 1 as a 1600p product? You linked 1600p benchmarks relentlessly against the 680 to show the 7970 in a better light.

I already stated this before: in 2012, for people who were against multiple GPUs and specifically wanted to game on 1440P/1600P monitors, our forum often recommended the HD7970 / 7970 OC because it was the fastest solution. That doesn't mean the HD7970 was a fast enough solution. No one ever stated that an HD7970 OC is fast enough for 2013-2014 games at that resolution. In fact, a lot of us recommended the GTX670 for those users since they could add a 2nd one later, as we already anticipated that future games would level an HD7970 OC at 1440P. I guess you missed the notes that came with the recommendations for the last 1.5 years regarding high-res monitor gaming.

Price wise it's a total win for Nvidia, high margin products are what every company wants and that's exactly what Titan is.

I don't work for AMD, Intel or NV. I want good value, not $ for NV's or AMD's shareholders. I am not blaming NV here. NV found a market niche that's willing to pay $1000 for GPUs. I just didn't realize there were that many people who would pay $600 more for a 35% performance increase.

What you're saying about Titan is true for every other card, why buy a 7850 for $250 when you can get a 8850 for $250 with 60-100% more performance, with better efficiency? Why buy anything at all in this industry, it's all outdated sooner rather than later?

Buying a $250 card that becomes outdated is a $250 cash outlay, not $1,000. How are these two even remotely comparable, unless to you $250 and $1,000 are the same thing? There are certain times when it's better to buy a GPU. In the example you provided, if a gamer could only afford a $250 GPU, they should have purchased the HD7850 2GB for $250 in February 2012 instead of waiting 15-18 months to buy that GPU for $180 in 2013. The opportunity cost of waiting 15-18 months is not worth saving $70 if they could already afford $250 in Feb 2012. At a certain point on the technology curve, if you pay attention to GPUs, it becomes evident that certain GPUs are overpriced. For example, we knew the HD7970 was overpriced at launch since NV hadn't shown its hand yet, and we knew it carried the early-adopter price premium. Many commented on it. We also knew that starting September 2011, the GTX580 at $450 was overpriced since rumors of the impending HD7970 launch were around the corner. It's up to the consumer to sift through the information and make informed decisions.

The Titan came out 1.5 years after 28nm GPUs launched. If it had come out for $1,000 around January 2012, that would be totally different. You cannot rationally justify the Titan's price against the HD7970/GTX680, since all of those GPUs are also overpriced today on the price/performance (Moore's law) technology curve. In July 2012, the HD7970GE was $470, the HD7950 was $320, the GTX670 was about $350-380, and the GTX680 was $470-500. Fast forward to May 2013 and the 7950 is selling for $280-320, the HD7970GE for $410, the GTX680 for $450-470, and the GTX670 for $360-380. That makes all of the HD7950/7970/7970GE/670/680 overpriced today. By waiting nearly 12 months, the gamer would have saved only $40-50 on these cards. Since GPUs get cheaper over time for a similar level of performance, and we are not seeing this with current 28nm prices, GPU prices are at a point of stagnation - i.e., the worst time to buy a GPU. This should not happen in a GPU market with rational consumers, because consumers are paying 2012 prices for 1-1.5-year-old 28nm tech in May 2013. Why would someone do that, especially in the summer months when a lot of people travel and are outdoors, reducing the hours devoted to gaming? Think about it: HD7970GE after-market cards were $450 in the summer of 2012, and almost 12 months later they are still $410. That's not irrational to you?

Not fast enough for the settings, but neither are the dual cards. So what's next, no cards are fast enough for these games or Titan is fast enough with proper settings?

The dual cards would be fast enough if a single GPU were capable of putting down those frames without CF/SLI. In those benches, they are putting down 30-40% more performance than the Titan, which is in the 75-90% range over the GTX680/7970GE I was talking about.

You're concerned about turning down settings with Titan, but ignoring the fact that every other card has to as well.

That's why in May 2013 the GTX690/7990/Titan are all overpriced: the GTX690 was available for $1000 over a year ago. Think of it this way: if we expect GPU performance to increase 70-80% every 18-24 months, then in 12 months the price of the GTX690 has to either fall by 35-40%, or a GPU with 35-40% higher performance needs to come out at $1000. Neither of these happened, which means that level of performance for $1000 is overpriced.
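That depreciation argument can be sketched as a simple model (illustrative only: the 75% gain per 21-month cycle is an assumed midpoint of the "70-80% every 18-24 months" figure, and compounding makes the implied 12-month drop come out around 27%, a bit below the post's 35-40% linear estimate):

```python
def fair_price(launch_price: float, months: float,
               gain: float = 0.75, cycle_months: float = 21.0) -> float:
    """Expected price of a fixed level of GPU performance after `months`,
    assuming performance per dollar improves by `gain` every
    `cycle_months` (a compounding, Moore's-law-style curve)."""
    cycles = months / cycle_months
    return launch_price / (1 + gain) ** cycles

# A $1000 GTX 690 twelve months after launch under these assumptions:
print(round(fair_price(1000, 12)))  # 726
```

Either the price falls toward that level, or a proportionally faster card has to arrive at the old price; when neither happens, the market has stagnated.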

In summary: on average, GPUs have never been more overpriced than they are in 2013. Performance has not increased at all since June 2012 at the sub-$600 level. The Titan doesn't even count since it's <1% of the market. Normally we get either price drops or 15-20% refresh bumps. These prices were acceptable in 2012 since that was the first half of the 28nm generation. By mid-2013, these prices are no longer acceptable if we look at the price/performance technology curve, since we are deep into the 2nd half of the 28nm generation and prices have barely moved in 12 months. Unacceptable. Maybe the GTX700 series will change this.
 
Last edited: