Initial experience going from 295X2 quadfire to 980Ti SLI at 4K


Tapoer

Member
May 10, 2015
It is understandable that they have a short time to review new cards. What is not understandable is that, with so little new tech to review nowadays, tech sites should have plenty of time to do more in-depth reviews; they choose not to, and they lose clicks and viewers.

They lose in the end, and so do we.
 

RussianSensation

Elite Member
Sep 5, 2003
So, basically what you're saying is "4GB is not enough for 4K on AMD hardware."

Not exactly. Jacky60 tends to buy 3-4 flagship cards, in which case I'd go 980Ti. My point was that with extra GPU horsepower, one could start to benefit from >4GB of VRAM. However, it's not possible to conclude that 6GB of VRAM benefits 980Ti SLI vs. 295X2 CF, since we aren't even testing two similarly performing setups in the first place. I would still get 980Ti SLI today over Fury X CF, since the 980Ti has 20%+ overclocking headroom and the 6GB of VRAM is a bonus.

Are they freaking serious? $650 is way too expensive... but people still buy the 3% faster Titan X for $1000.

I have no interest in either company right now.

We get hit twice -- a lot of PC games out now are broken/unoptimized and have bugs/glitches: Assassin's Creed Unity, Dragon Age Inquisition, The Witcher 3, Batman AK, Project CARS, etc. Almost every AAA game released in recent years, going as far back as BF4, has shipped broken. I almost don't bother buying $60 games anymore.

The second hit is for us Canadians: Fury X is $830 + tax and after-market 980Ti cards are $840 + tax, which basically means a $1000 GPU. Essentially, from our point of view, flagship GPU prices have increased ~66% since HD7970 days ($499 CDN x 1.13 tax = $564 CDN vs. $829 CDN x 1.13 tax = $937 for Fury X), but the state of PC graphics and the quality of AAA games is disastrous. With your setup, I'd skip this generation entirely. HBM2 should bring 8GB of VRAM and probably a 60-80% increase in performance over Fury X/980Ti.
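For anyone checking that math, here is the same calculation spelled out (a quick sketch using only the prices quoted above):

```python
# Sanity check of the flagship price comparison above (prices in CAD).
TAX = 1.13  # 13% sales tax, as quoted

hd7970 = 499 * TAX   # ~564 after tax
fury_x = 829 * TAX   # ~937 after tax

increase = (fury_x / hd7970 - 1) * 100
print(f"HD 7970: ${hd7970:.0f}  Fury X: ${fury_x:.0f}  increase: {increase:.0f}%")
# -> HD 7970: $564  Fury X: $937  increase: 66%
```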

Pretty much this, but you know certain people will latch on to this because there's nothing else left for them after the Fury disaster.

Fury isn't as good as the 980Ti, but it could be worse - like buying a reference $1000 Titan X that runs hot and loud and overclocks worse than the best after-market 980Ti cards, which are 10-11% faster out of the box with zero overclocking. Fury X still provides 95.5% of the performance of a Titan X + AIO CLC at 4K for $450 less. For only $300 more than a reference Titan X, Fury X CF is probably 60-70% faster at 4K. While Fury X didn't live up to expectations due to voltage locking and poor overclocking headroom, at least it wasn't a rip-off. The few sites that tested Fury X CF even show it beating Titan X SLI at 4K. :biggrin:

To get TX SLI to run cool and quiet basically requires after-market cooling - add $100 per card but they would still only outperform 980Ti SLI by 3-4% at most while costing $800 more (EVGA Classified 980Ti SLI = $1400 vs. $2000 TX SLI + $200 for EVGA AIO CLC kits). At least Fury X has some merit in small miniITX cases while Titan X = e-peen / flushing $ down the toilet for gaming vs. after-market 980Tis.

He actually never even mentioned Battlefield. That was, as far as I can tell, a typo: RS was talking about Arma and ended by mentioning Battlefield, which launched the conversation in that direction.

Ya, exactly. I wasn't even discussing BF4 or IQ. I was discussing how the battlefield draw distance in Arma 3 could impact VRAM usage (for example, moving the draw/viewing distance from 500 meters to 1.5 km).
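As a back-of-envelope illustration of why that matters (my own sketch, not RS's numbers): the streamed terrain and object data scale roughly with the visible area, which grows with the square of the view distance.

```python
# Rough scaling of streamed world data with view distance.
# Visible area grows with the square of the distance; actual VRAM
# growth depends on LOD falloff, so treat this as an upper bound.
near_km, far_km = 0.5, 1.5
area_ratio = (far_km / near_km) ** 2
print(f"{far_km} km vs {near_km} km view distance -> ~{area_ratio:.0f}x the area to stream")
# -> ~9x the area to stream
```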
 

Jacky60

Golden Member
Jan 3, 2010
http://forums.overclockers.co.uk/showthread.php?t=18679909

The GTA5 and Dirt Rally videos best show what I've experienced comparing the 980Ti to 295X2 quadfire. IMO the Fury X (and in my case 295X2) image quality seems sharper/less blurred. The spectators in Dirt Rally are a prime example: with Titan X they're blurred, whereas with Fury X they're sharper. In Arma 3 it's not nearly as pronounced, and as I play that far more than anything else I'll probably keep the 980Ti SLI. I'm still tempted to return them and see how Fury X crossfire works. It's also pretty sad that hardware review sites can't compete with home broadcast enthusiasts any more and don't bother to do these crucial side-by-side comparison tests. There also seems to be a theme in this thread that the Nvidia control panel automatically reduces quality settings. That said, I prefer the Batman Titan X image quality, but in Project CARS I could see more reflections from the Fury X car in the race, and that's a GW title.
 

Flapdrol1337

Golden Member
May 21, 2014
Hm, there really is something funky going on with that GTA5 video.

It looks like some over-the-top motion blur is running on the Nvidia side; every time I pause the video on Nvidia, the cars ghost, while on AMD they don't (see the tearline in the wheel; it's moving). Either that, or Gregster botched his capture.

Anyway, see if you can disable motion blur.
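If anyone wants to go beyond pausing YouTube, a rough OpenCV sketch like this (mine, with hypothetical filenames for locally saved clips) can put a number on the smearing:

```python
import cv2

def mean_frame_diff(path, max_frames=300):
    """Average absolute difference between consecutive frames.

    Heavy motion blur smears detail across frames, so consecutive
    frames look more alike and this number tends to come out lower
    than it does for a sharp capture of the same scene.
    """
    cap = cv2.VideoCapture(path)
    ok, prev = cap.read()
    if not ok:
        raise IOError(f"could not read {path}")
    prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    total, n = 0.0, 0
    while n < max_frames:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        total += cv2.absdiff(gray, prev).mean()
        prev = gray
        n += 1
    cap.release()
    return total / max(n, 1)

# Hypothetical filenames -- substitute your own captures.
print("NV :", mean_frame_diff("gta5_titanx.mp4"))
print("AMD:", mean_frame_diff("gta5_furyx.mp4"))
```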
 

amenx

Diamond Member
Dec 17, 2004
Another thing (I mentioned it in another thread): I think Nvidia handles some in-game settings poorly, namely bloom. I checked the overclockers thread and saw gregster's washed-out vids of the TX. They looked absolutely horrible! I would chuck out all my NV cards if they looked anything remotely like that. THAT said, I do not like the way bloom is implemented in some games with NV cards, namely FC4, Skyrim and Witcher 3. They give a bit of a washed-out effect, so I disable bloom. Still, nothing like gregster's vids! Wtf is he doing? Turning down the gamma? I would not play any game that looked like that.

edit: Ah! gregster took down those washed-out vids on p2 of his thread and replaced them with proper ones!

http://forums.overclockers.co.uk/showthread.php?t=18679909&page=6
 

Gregster1970

Member
Mar 25, 2013
www.youtube.com
Guys, sorry for all the confusion, but I am quite new to this recording-with-a-capture-card malarkey. I have it running completely fair and square now. I had problems with the secondary monitor and what settings it was using, and in BF4 I had AF not working, but now I have the colours set perfectly and the drivers set to "Prefer max quality", so it is now a complete apples-to-apples test.

The capture card is an Avermedia LGP Lite, which sadly only does 1080p 30 fps, but if things work out well I will invest in a better one :)

I hope this clears things up a bit and sorry again for the confusion.

https://www.youtube.com/watch?v=-gaT5CGafuQ

I have redone a few of them :)
 

PrincessFrosty

Platinum Member
Feb 13, 2008
www.frostyhacks.blogspot.com
Anyone who thinks 4GB of RAM is 'enough' for 4K is mistaken IMHO; I've been using 4.4GB of VRAM without changing any settings already today.

Running what applications? I could write an application right now that requires 349034GB of VRAM, and no card ever made could run it.

Let's be more specific: certain games (of the hundreds of thousands of games that exist) can exceed 4GB of VRAM when all settings are maxed out.

This whole 4K discussion is starting to get muddied by bad arguments. As someone who has been running 4K on a single GTX 980 for a number of weeks now, I can say that outside a few killer apps like GTAV, my huge library of games runs really well and doesn't consume more than 4GB of VRAM. My Steam library is over 500 games now, and I think you'd struggle to find more than five of them that use over 4GB.

We need to be fairer and clearer about what we're saying, and careful about how we judge overall gaming experience from a few metrics. Saying 4GB isn't enough for 4K is a highly conditional statement which is simply not true in >99.9% of cases.
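FWIW, this is easy to measure rather than argue about: nvidia-smi reports used VRAM and can be polled while a game runs. A minimal sketch (assumes an NVIDIA card with nvidia-smi on the PATH; the one-minute window and one-second interval are arbitrary):

```python
import subprocess
import time

# Poll dedicated VRAM usage once per second via nvidia-smi.
# Run the game windowed/borderless and watch the peak value.
peak = 0
for _ in range(60):
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used",
         "--format=csv,noheader,nounits"])
    used = int(out.decode().split()[0])  # MiB, first GPU
    peak = max(peak, used)
    print(f"used: {used} MiB (peak: {peak} MiB)")
    time.sleep(1)
```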
 

Jacky60

Golden Member
Jan 3, 2010

I agree that 4GB is enough in the vast majority of games, and I accept 4GB is ordinarily enough, but it's only just enough, it isn't always enough, and it won't be enough in the near future with newer, more demanding games. I'd simply say that if you want a good 4K gaming experience, more than 4GB of memory is desirable going forward. Clearly for older games it's not going to be an issue. In Arma 3 and GTA5, two of the most demanding games available today, having more than 4GB at 4K allows, in my experience, higher image quality settings: e.g. 2xMSAA in GTA5 and significantly greater draw distance in Arma 3 (rough cost sketched below). The image quality settings for GTA5 aren't by default as good for Nvidia as they are for AMD imo, but with fiddling one can achieve parity. Yes, I could turn the settings down, but having to do so when the raw GPU power is there but the cards are limited by memory is irritating.
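To put a rough number on what 2xMSAA alone asks for at 4K, here's a back-of-envelope sketch (mine; it ignores driver overhead, compression and deferred-engine G-buffers, so treat it as a floor):

```python
# Back-of-envelope VRAM cost of the main 2xMSAA render targets at 4K.
w, h, samples = 3840, 2160, 2
bpp_color, bpp_depth = 4, 4            # RGBA8 and a 4-byte depth/stencil

color   = w * h * samples * bpp_color  # multisampled colour buffer
depth   = w * h * samples * bpp_depth  # multisampled depth/stencil buffer
resolve = w * h * bpp_color            # single-sample resolve target

total_mb = (color + depth + resolve) / 2**20
print(f"~{total_mb:.0f} MB for the main render targets alone")
# -> ~158 MB; deferred engines add several more full-screen targets
```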
 

RaulF

Senior member
Jan 18, 2008
I would hope that going from 4 gpus to 2 would be a better experience.

Kind of hard to argue that.
 

Jacky60

Golden Member
Jan 3, 2010
I have to say it's a vastly better experience, but the annoying thing is I was hoping for better support and have reluctantly realised that there's clearly a driver-team 'sweet spot' of one or two GPUs, and forget the rest. I hope never to bother going quad again: it worked OK but never shone as well as the hardware should have. I guess very few enthusiasts do it, so it isn't worth their while.
 

therealnickdanger

Senior member
Oct 26, 2005
It's been like 5 years since people last noticed obviously lower image quality on Nvidia's side.

Interesting. I recently upgraded my GTX 470 machine to a GTX 970, and the loss of detail at the same resolution and settings is noticeable in games that I play a lot (too much, probably). I also moved to a newer driver, so it could be that as well. Things that used to be sharp and clear now seem slightly more blurry. All of my global control panel settings are the same, and my in-game settings are the same. I'm not sure what to make of it other than that NVIDIA is cutting corners in some way, or the driver is messed up somehow. Ultimately the increase in framerate is worth more to me than the lost detail, but it is irritating.

When I have some free time, I'm going to get some comparison shots between both cards using both drivers and see if I can figure it out.
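When you do, a simple pixel-diff script makes subtle blurring much easier to spot than eyeballing (a sketch; the filenames are hypothetical, and it assumes both shots were taken at the same resolution and settings):

```python
import numpy as np
from PIL import Image

# Load two screenshots taken at identical settings and resolution.
a = np.asarray(Image.open("gtx470_shot.png").convert("RGB"), dtype=np.int16)
b = np.asarray(Image.open("gtx970_shot.png").convert("RGB"), dtype=np.int16)

diff = np.abs(a - b)
print("mean per-channel difference:", diff.mean())

# Save an amplified difference image: blur/filtering changes show up
# as haloes around edges rather than as uniform noise.
Image.fromarray(np.uint8(np.clip(diff * 8, 0, 255))).save("diff_x8.png")
```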