GeForce GTX 460 SLI vs. Radeon HD 5970: Two Against One


Scali

Banned
Dec 3, 2004
2,495
0
0
Many, many people on these forums and beyond very happily use 24", 27" or 30" LCDs as their daily computing screen, and I can't imagine you'll find any of them that will tell you that it's too big to be practical. Most people I know who use engineering/design programs professionally have two big, high resolution screens, and swear by them for productivity and ease of multi-tasking (and want larger/higher resolution ones at that!).

I think you're missing the point here...
With engineering work, you don't need to oversee the entire screen at once, so bigger is always better... instead of maximizing your windows, you can just make them 'large enough' and arrange them however you see fit.
With gaming or watching movies, you DO want to oversee everything, else you're missing important details.
At work, my screens can't be big enough... but at home, I don't want an uber-big screen, because I wouldn't be able to play games or watch movies when sitting at the desk.
That's because at work, I just need to work. At home I need a 'multi-purpose' setup.

I think you've created a bit of a straw-man by branching off into arguing that high resolution doesn't necessarily look more realistic.

I don't do straw-men.
What I said is a very simple and obvious point: you have to prioritize different ways of improving visual quality.

High resolution can only look as good as the material it is receiving from your video card, of course, but sticking to lower resolutions and screen sizes doesn't change that in any way, shape or form, and looks worse to boot ;)

Since TV and movies (the most realistic content we have) also stick to 1080p at this point, I don't think it's unreasonable for games to stick to 1080p and concentrate on delivering more realistic content instead.
 

Dark Shroud

Golden Member
Mar 26, 2010
1,576
1
0
Well, the point is about where the extra processing power of future GPUs is to go.
More power will mean better FPS. It's up to the developers to find the right balance.
I pretty much agree here, I've even seen some work on anti-virus software using GPGPU to speed up scans.

http://gpgpu.org/2010/07/04/gravity-antivirus-engine

The bezels will always be 'in the way'...

I've heard talk of the idea of using Gorilla Glass for both the monitor screen & the bezels.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Textures are actually a poor man's solution to fake detail. What we REALLY want is procedural texture generation and tessellation, not texture maps.
I'm not sure about the higher resolutions, really. I think that has lower priority than better geometry detail... What's the point of having super-sharp faceted geometry?
What looks more realistic? A real movie at SD resolution, or a game like Crysis at 2560x1600?
Baby steps ;). But I do agree.
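
For anyone wondering what 'procedural' means in practice, here's a toy sketch (purely illustrative, not from any actual engine): the pattern is computed per pixel from a formula instead of being sampled from a stored image, so it never runs out of resolution the way a texture map does.

Code:
import math

def checker(u, v, scale=8.0):
    """Toy procedural checkerboard: the value is computed from (u, v), no stored image."""
    return 1.0 if (int(u * scale) + int(v * scale)) % 2 == 0 else 0.0

def marble(u, v):
    """Toy 'marble' stripe pattern from a sine of the coordinates; zoom in as far
    as you like and it never pixelates, unlike a fixed-resolution texture map."""
    return 0.5 + 0.5 * math.sin(20.0 * u + 4.0 * math.sin(15.0 * v))

# Evaluate the pattern at any (u, v) in [0, 1); the detail is generated, not stored.
print(checker(0.1, 0.7), round(marble(0.1, 0.7), 3))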
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
I think you're missing the point here...
With engineering work, you don't need to oversee the entire screen at once, so bigger is always better... instead of maximizing your windows, you can just make them 'large enough' and arrange them however you see fit.
With gaming or watching movies, you DO want to oversee everything, else you're missing important details.
At work, my screens can't be big enough... but at home, I don't want an uber-big screen, because I wouldn't be able to play games or watch movies when sitting at the desk.
That's because at work, I just need to work. At home I need a 'multi-purpose' setup.

I don't do straw-men.
What I said is a very simple and obvious point: you have to prioritize different ways of improving visual quality.

Since TV and movies (the most realistic content we have) also stick to 1080p at this point, I don't think it's unreasonable for games to stick to 1080p and concentrate on delivering more realistic content instead.

I don't think I am missing the point at all (but it clearly suits you to belittle me), since those people (and many 'enthusiasts' on here) have equally large screens at home, for gaming, watching movies and general desktop use (so email, excel, word, net browsing). If people really shared your concerns about the general utility of large screens, they wouldn't use them, buy them and want them.

If you want gaming to be immersive, the last thing I believe you want is to be able to see everything in one postage-stamp like box in front of you. You want to almost fall into it, to have to look around like you do in the real world, imo at least. That's why nvidia and ati are both so keen on multi-monitor gaming... I think we will have to disagree here ;)

I also don't think there is any link between resolution and content, as I said earlier. It just so happens that things look better at higher resolutions (not sometimes, but always), but that doesn't detract from content development at all imho, it's not about choosing one or the other. There's no need to prioritise by (imo) artificially hamstringing your gaming and computing experience by sticking to smaller, lower-resolution screens or larger, lower-DPI screens. I'm no developer, but I can't imagine that supporting miscellaneous resolutions occupies much of game developers' time?

If someone comes out with a game engine that looks great at 640x480, good on them, but it will look infinitely better at 19x12 or 25x16, and if current hardware can't manage those resolutions, then we'll just need faster video cards. That's what we do as a species, constantly innovate, and that's why we're alpha dogs here on Earth ;)
 

Scali

Banned
Dec 3, 2004
2,495
0
0
I don't think I am missing the point at all (but it clearly suits you to belittle me), since those people (and many 'enthusiasts' on here) have equally large screens at home, for gaming, watching movies and general desktop use (so email, excel, word, net browsing). If people really shared your concerns about the general utility of large screens, they wouldn't use them, buy them and want them.

I don't think those people REALLY use their system in a general way then...
I mean, for a bit of surfing and emailing, maybe a TV-like setup is okay... but I cannot imagine doing any prolonged text processing or development on a large screen, sitting a few metres away. I'd rather just grab a laptop then.

If you want gaming to be immersive, the last thing I believe you want is to be able to see everything in one postage-stamp like box in front of you. You want to almost fall into it, to have to look around like you do in the real world, imo at least.

I don't think current games are really suited to that. They feel rather 'claustrophobic' if you play on a screen that is too large. Missing small details in your peripheral vision means instant death in many cases.

That's why nvidia and ati are both so keen on multi-monitor gaming... I think we will have to disagree here ;)

I don't think nVidia is keen on multi-monitor gaming. I think they're just doing it in response to ATi.
And I think ATi is mainly doing it to try and give people a reason to buy faster GPUs, since higher framerates alone aren't cutting it anymore.

I also don't think there is any link between resolution and content, as I said earlier.

I never said there was...
Have you ever seen the demo Lapsuus on Amiga?
It runs in 160x100 resolution, yet it has this really 'realistic' feel to it because of the soft look it has, due to some nifty post-processing.
At the time, people were stunned by it. That is exactly what I mean: the resolution isn't the most important factor. Geometry detail and post-processing give you that 'soft', 'real' movie-look much more than super-sharp, high-res antialiased rendering does.

It just so happens that things look better at higher resolutions (not sometimes, but always), but that doesn't detract from content development at all imho, it's not about choosing one or the other.

The problem is that you can spend each GPU cycle only once.
So it's mutually exclusive. You either increase the geometry detail, OR you increase the resolution/AA.
It's always a balancing act.
But I think the problem here is that I'm thinking as a developer (what content to design, and what shading/processing to implement), while you're thinking as a consumer. You don't think in terms of control, because as a consumer, you don't have any.
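
To put rough numbers on the 'spend each cycle only once' point, here's a back-of-the-envelope sketch. Every figure in it is invented for illustration; none of it is measured from a real GPU or game.

Code:
# Hypothetical per-frame budget at 60 fps: ~16.7 ms of GPU time, spendable once.
FRAME_BUDGET_MS = 1000.0 / 60.0

def frame_cost_ms(geometry_ms, shading_ms_per_mpixel, width, height, ssaa=1):
    """Toy cost model: geometry cost is fixed per frame, shading cost scales with
    the number of pixels actually shaded (resolution times any supersampling)."""
    mpixels = (width * height * ssaa) / 1e6
    return geometry_ms + shading_ms_per_mpixel * mpixels

# Spend the budget on resolution/AA...
print(frame_cost_ms(4.0, 3.0, 2560, 1600, ssaa=2))  # ~28.6 ms -> over the 16.7 ms budget
# ...or on heavier geometry at a lower resolution.
print(frame_cost_ms(8.0, 3.0, 1680, 1050))          # ~13.3 ms -> within budget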
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
I just hope that those missing features aren't PhysX and CUDA for God sakes. :rolleyes:
Don't forget 3D :)

It's funny that ATI fans have been dismissing these features for years, and yet they are still here, with new stuff still coming out that supports them. It's like people who own an old car dismissing air conditioning, a radio and seats. That's some brand loyalty for you. :rolleyes:

Although you should go and read the thread about NVIDIA disabling PhysX when an ATI card is present. If nobody cared about it (especially ATI fans), that thread would not have gone on and on and on.

AMD could launch a card that can cook your food, wash your clothes and serve you as a wife, and it would never be enough for you, so please don't lie to yourself :p
That's just it. They don't do this or any other significant feature. Do they support ambient occlusion yet? Transparency AA? I don't call these significant features but do they even have these yet? Did they finally get AA in Batman Arkham Asylum? Or the other games they dropped the ball on? Do they support custom profiles yet for Crossfire?

I'm not an NVIDIA fan over brand loyalty. It's because I love gaming, I love new technology and ATI just does not seem to offer enough support for either.
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
I pretty much agree here, I've even seen some work on anti-virus software using GPGPU to speed up scans.

http://gpgpu.org/2010/07/04/gravity-antivirus-engine

I've heard talk of the idea of using Gorilla Glass for both the monitor screen & the bezels.

I have never understood why people are concerned that you might have brutal amounts of number-crunching power sitting idle except when you are gaming, as long as it powers down properly. Sure, it's a resource that's there to be tapped, but that's not a 'problem', it's an opportunity. In terms of gaming, as has been mentioned earlier, there's no real danger that it's going to be sitting idle: we haven't even properly mastered Crysis on a single GPU, Metro 2033 is brutal, and the Heaven/Lost Planet 2 demos give a good taste of gaming with tessellation cranked up.

Look back historically and games have always stepped up to utilise the available hardware (or, in the case of Far Cry and Crysis as two more recent-ish examples, have preceded the hardware).

As to the bezel issue, it only makes sense that it's something that can be 'solved' if there's a market for it, and given the increasing affordability of monitors and of cards that can drive multiple monitors in games, I see no reason why that demand won't eventually deliver us an effectively 'bezel-less' monitor, and sooner rather than later I would wager :)

Opportunities, not problems ;)
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
snip

The problem is that you can spend each GPU cycle only once.
So it's mutually exclusive. You either increase the geometry detail, OR you increase the resolution/AA.
It's always a balancing act.
But I think the problem here is that I'm thinking as a developer (what content to design, and what shading/processing to implement), while you're thinking as a consumer. You don't think in terms of control, because as a consumer, you don't have any.

Personal opinions aside, I think we may be coming at this from different angles on the resolution issues.

I agree that a higher resolution comes at the expense of detail, but that's where faster hardware comes into the picture, because people will immediately want to run a great-looking engine at a higher resolution. We can have our cake and eat it too; it's not one or the other (developers are only limited by the hardware and the code they can write, and AMD/Nvidia will offer a pretty regular jump in horsepower that developers and consumers can do what they like with).

Remember, the consumer adds the AA/AF, not the developer. Nothing to stop a developer writing the most amazing code ever that looks great at a low res, but hardware advances will then allow us to run it with AA/AF at higher resolution over time, and it will look better :)

You say it's mutually exclusive. I say that's only true in a static hardware performance world, and the way I see it, hardware performance is dynamic and on a healthily upward trend looking forward :)
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Remember, the consumer adds the AA/AF, not the developer. Nothing to stop a developer writing the most amazing code ever that looks great at a low res, but hardware advances will then allow us to run it with AA/AF at higher resolution over time, and it will look better :)

I don't know about you, but Crysis was the first game in years that I ran without AA/AF. But because Crysis is designed the way it is, it STILL looked great that way, and I STILL considered it the best-looking game ever.
I think that's the point I'm driving at: those developers had spent a lot of resources on better shading, more geometry detail, better physics and nice post-processing. This is what made it look better, despite the fact that hardware at the time couldn't handle running it with AA/AF.

You say it's mutually exclusive. I say that's only true in a static hardware performance world, and the way I see it, hardware performance is dynamic and on a healthily upward trend looking forward :)

No, it's always going to be mutually exclusive. As I say, you can spend each cycle only once. Your budget may increase, but still you can spend it only once.
No matter how rich you are, you can spend your money only once.
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
I don't know about you, but Crysis was the first game in years that I ran without AA/AF. But because Crysis is designed the way it is, it STILL looked great that way, and I STILL considered it the best-looking game ever.
I think that's the point I'm driving at: those developers had spent a lot of resources on better shading, more geometry detail, better physics and nice post-processing. This is what made it look better, despite the fact that hardware at the time couldn't handle running it with AA/AF.

:thumbsup:

I for one would be happy if that happened every generation, especially since I tend to play games a few years after they come out (the cheapskate's way of enjoying it better than it could run on high-end cards at the time) :)
 

Grooveriding

Diamond Member
Dec 25, 2008
9,110
1,260
126
I generally have to agree with BFG here on monitor size and resolution looking forward.

I have a 19x12 24" monitor, and I'd suggest that your desk is pretty tiny if you couldn't easily fit a 27" or 30" monitor on what would, for all practical intents and purposes, amount to the totality of people's desks out there :eek:

I think 1080p may become a 'standard' gaming resolution (although I think it's entirely inferior to 19x12 for desktop usage), but that won't stop 25x16 and so on and so forth from becoming 'enthusiast' gaming standards over the next few years (at least partly because anything over 24" at 1080p starts looking pretty ordinary when you are sitting as close as you do to game on a computer, and I certainly don't see people suddenly being satisfied with 24" screens as the upper end of a 'common' screen size going forward), if you want a crazy prediction from me ;)

I forgot to mention another, more mundane aspect of higher resolutions: they help you be more productive, particularly while multi-tasking. A higher resolution makes the same screen size more useful (to a point, of course), makes everything look better in both 2D and 3D, and then of course you need a better graphics card to drive games at your native resolution ;) That's the circle of life :)

That and I think it's utter nonsense to pretend that current graphics cards are more than enough and that we have somehow run out of more taxing game engines, even at 1080p.

There's plenty of life beyond 4xAA, for starters, and that's before you start talking about new game engines, tessellation, etc.

Agreed.

Higher resolutions, along with things like super-sampling, tessellation, super-high-resolution textures and so forth, are the ways that 3D gaming will improve image quality drastically. Stereoscopic 3D is mostly a fad, like it was in the 70s; it has some application in movies, but for gaming it's a neat gimmick to try out a few times.

Physx is utter rubbish, a bloated pig of a feature meant to sell more video cards because it is so taxing and delivers hardly anything to the gamer. These are things that will hold back true image quality progress much the way consoles do.

There is more to the experience of going from 1920x1200 to 2560x1600 than just a 'bigger screen'. The drastic increase in resolution translates to a big increase in image quality. I guess you have to try it to see it. There are more pixels, a denser image. Also, 30" monitors tend to be some of the best-quality panels around, so that also contributes.
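
For what it's worth, the raw pixel counts back that up; this is just simple arithmetic, not anything from the review or the thread.

Code:
pixels_1920x1200 = 1920 * 1200   # 2,304,000 pixels
pixels_2560x1600 = 2560 * 1600   # 4,096,000 pixels
print(pixels_2560x1600 / pixels_1920x1200)  # ~1.78, i.e. roughly 78% more pixels to fill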

This is why I am not a big fan of the multi-monitor thing. It sort of defeats the virtues of an ultra high resolution big monitor, by brute-forcing your way to high resolutions.

Now, a single, super-large panoramic monitor would be amazing, but I've only seen one of those and it was major $$$$$.

So, bring on 6870 imo, because a good 50% boost over 480 SLI in 6870 CF would be just fine for high resolution gaming and super-sampling AA, and increased tesselation performance. I would like to see nvidia catch up in gpu development and release a next gen in tandem with ATI. It would be nice to get my gpus cheaper, but it looks like nvidia is going to be pushing refreshes again like they were doing with GTX 285s while 5870s were flying off the shelves.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
The 460 is a terrific card, no doubt about that.

The 470 and 480 are not even worth considering in my book. I respect your opinion, and this is just my personal opinion. You are in a fairly cool place, and I live in the southern part of the Midwest, where summers are very hot. I have a 5750 and I can feel my room warming up when gaming. I don't want to be in the same room where 470s and 480s are dumping hot air at 85°C or more into my room.

Makes sense. I agree with you on the warm climate point. I recommended a 5850 over a 470 to someone in another thread who lives in a warmer climate.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Don't forget 3D :)

It's funny that ATI fans have been dismissing these features for years, and yet they are still here, with new stuff still coming out that supports them. It's like people who own an old car dismissing air conditioning, a radio and seats. That's some brand loyalty for you. :rolleyes:

Although you should go and read the thread about NVIDIA disabling PhysX when an ATI card is present. If nobody cared about it (especially ATI fans), that thread would not have gone on and on and on.


That's just it. They don't do this or any other significant feature. Do they support ambient occlusion yet? Transparency AA? I don't call these significant features but do they even have these yet? Did they finally get AA in Batman Arkham Asylum? Or the other games they dropped the ball on? Do they support custom profiles yet for Crossfire?

I'm not an NVIDIA fan over brand loyalty. It's because I love gaming, I love new technology and ATI just does not seem to offer enough support for either.

Please don't talk about brand loyalty here. I usually recommend nVidia cards without issues; you are just an nVidia shill, period, and your fanboyism can be seen from outer space. I've never seen you recommend AMD hardware. Even when AMD launched the HD 5x00 series and nVidia had nothing to compete with, you kept posting threads like the "3DMark Vantage highest score record", comparing GTX 285 SLI performance against the HD 5870 and claiming that AMD hardware offers nothing. In the end you just lied to yourself and the thread got closed due to your trolling and false statements (you used a score that was set way before the HD 5x00 series debuted, and someone here proved you wrong with an HD 5870 CF setup that scored almost twice as high!!). Hahaha, that was hilarious.

PhysX is still here, but it isn't making any real difference or impact in the PC gaming scene overall. It is just a gimmick: developers take the bribe, implement cheap physics effects and move on, much like the half-assed multi-monitor implementation known as Surround View. I like the PhysX idea, but its implementation is pathetic; besides Batman AA, I haven't seen another game that uses it properly and helps in the immersion department. Ambient occlusion is a feature that uses the shaders to calculate it, and ATi hardware can do it too!! (AMD would need to implement it, but it doesn't make much sense since it changes the way the developer wants you to see and feel the game.)

Welcome to 2010: the ATi HD 5x00 series has super-sampling, which means that alpha textures get anti-aliased too!! AFAIK, even my outdated video card has some sort of anti-aliasing that works for alpha textures, and it really works great; it's called Adaptive Multi-Sampling.

About Batman AA, you keep repeating the same blatant lie. Since day 1, AMD hardware has supported anti-aliasing in Batman AA by forcing it through the CCC, which handles the jaggies better than nVidia's cheap implementation in the built-in game menu, which uses some sort of selective edge anti-aliasing, hence the faster performance.

CrossfireX profiles? You can download the latest version individually!! You cannot tweak them, though (I wish you could), but you seem to intentionally forget that the RadeonPRO tool lets you tweak and force Crossfire and anti-aliasing support, and it works like a champ. (Singularity doesn't support anti-aliasing through the CCC and doesn't scale well with Crossfire; it's the worst-looking game that uses the UE3 engine, so it won't tax the GPU enough.) I was able to use 16x AA with AAA, never dipped below 75 fps and used more than 85 percent of each GPU; RadeonPRO is a nice tool. Try harder next time with your continuous ATi bashing, kid. :)
 

scooterlibby

Senior member
Feb 28, 2009
752
0
0
That was a good post, evolucion. I miss that brief, Wreckage-free lull we had for a bit.

On to the topic at hand. Once they come out with a dual-chip, single-PCB SLI card (or even a dual-GPU version with the extra GF104 cores enabled - even better!) I would probably switch back to Nvidia and get a second card for PhysX. My 5970 has been pretty unstable and I would not mind giving PhysX a second look.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
The new flagship? Cool, what's the card's official name, what's the official release date and, more importantly, what are the specs and performance numbers?
If you can, please give us some links to this official information as well.
In fact, just start a new thread.
Thanks.

Plenty of talk about it at B3D, but I wouldn't want to predict it before Xmas. It may turn into a Fermi-like disaster.

edit: http://forums.anandtech.com/showthread.php?t=2099738 - there actually already is a thread. So they can make the before-Xmas prediction there.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
I am confused then as to why no one here plays in 3D. Plenty of nVidia hardware.
 

busydude

Diamond Member
Feb 5, 2010
8,793
5
76
I am confused then as to why no one here plays in 3D. Plenty of nVidia hardware.

Because you take a performance hit of ~50%; add other factors like DX11 and PhysX in new games and it all adds up to abysmal performance numbers.
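
The ~50% figure follows from the way stereoscopic 3D is usually rendered: each frame is drawn twice, once per eye, so a GPU-bound game roughly halves its framerate. This is a simplified model that ignores any work shared between the two views.

Code:
# Simplified: stereo 3D renders two views per frame, so GPU-bound framerate ~halves.
mono_fps = 60.0
views_per_frame = 2
print(mono_fps / views_per_frame)  # ~30 fps, i.e. the ~50% hit mentioned above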
 

Powermoloch

Lifer
Jul 5, 2005
10,084
4
76
In all seriousness, the majority of PC gamers usually play at low to high resolutions such as 1280x1024 / 1440x900 / 720p / 1680x1050 / 1080p. Heck, a GTS 250/4850 is just good enough to play games at these resolutions with a few quality adjustments.

Besides, the 5970 was a great card for its time and still is! People who shelled out the cash for a 5970 a year ago can still play games without any problems, and it's worth every penny when you look at the investment and the time played. When the GTX 460 came out, of course it made more sense as the better bargain. But let's not forget that ATi released their generation sooner than Nvidia.

I can go either green or red (I've owned an FX 5200, X850 XT PE AGP, 6200 AGP, HD 3650, HD 4650 and 7600 GT OC, then my current card). I have both because they're both good companies that have supplied all kinds of cards. Both have their ups and downs, but it comes down to the PC user's personal preference and especially their wallet.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
To be fair, 30" LCDs, or at least monitors that can support 25x16, are still very, very expensive. About a $1,000+ investment. In your case, you should go SLI if you want the highest available AA and detail levels at that res. You have a single GTX 480, which is still going to have trouble at that res with all the bells and whistles turned on. Probably not the case as often at 19x10.
The question asked was “where to from 1080p with 4xAA”, and I answered.

A 30” display is one of the best investments for gaming I’ve ever made, by far. A $400 graphics card can be obsolete within six months, but this thing will last years and still remain top of the line. Also, I’d rather have a single 30” 2560x1600 display as opposed to any multi-monitor gimmicks.

And yes, I need all the performance I can get for my performance and IQ targets, and stuff like PhysX and Tessellation significantly erodes that performance while offering little to no justifiable benefit in actual gaming.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
1080p is fine for me, and probably the majority of computer users. It's mostly a practical thing. Monitors with resolutions higher than 1080p tend to be too large to comfortably fit on a regular desk (more like TV style, which requires something other than a regular desktop setup).
My 30” display is about 60 cm away from me but I’ve no trouble seeing the entire contents through a combination of direct and peripheral vision. There’s also a vast difference between 1920x1080 and 2560x1600 in terms of image quality.

I think especially with AA, we run into the law of diminishing returns.
For regular MSAA that’s quite true, but regular MSAA has not been enough for years, especially in many modern games that exhibit shader and texture aliasing. SSAA is needed there and requires much more processing power, but also provides massive gains to IQ in-game.
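
A rough way to see why SSAA costs so much more than MSAA (a simplified model that ignores bandwidth and other real-world factors): MSAA multiplies only the coverage/depth samples and still shades roughly one fragment per pixel, while SSAA shades every sample.

Code:
def relative_shading_work(width, height, samples, supersample):
    """Simplified: SSAA shades every sample; MSAA shades ~1 fragment per pixel."""
    pixels = width * height
    return pixels * (samples if supersample else 1)

base = relative_shading_work(2560, 1600, 1, False)
print(relative_shading_work(2560, 1600, 4, False) / base)  # 4x MSAA: ~1x shading work
print(relative_shading_work(2560, 1600, 4, True) / base)   # 4x SSAA: ~4x shading work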

A real movie at SD resolution, or a game like Crysis at 2560x1600?
This is an invalid comparison given pre-rendered movies always have the advantage of offline IQ enhancements not feasible in real-time.

Also Crysis has horrific shader and texture aliasing and doesn’t look especially good as a result. The textures are quite bland too, and the indoor lighting & shadowing looks quite inferior to titles like Stalker or Metro 2033.

I’d far rather have Far Cry 1 style graphics at 2560x1600 with SSAA than Crysis at 720p with MSAA. The latter would look like total ass in comparison because it’d be a blurry shimmerfest.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
I agree with most of your comments, BFG. 30" @ 2560x1600 completely changes the gameplay experience, and I wouldn't give it up for any of the other technologies offered by either AMD or NVIDIA.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
This is an invalid comparison given pre-rendered movies always have the advantage of offline IQ enhancements not feasible in real-time.

I don't think that's invalid. The point is: there are plenty of IQ enhancements that we don't have *yet* in real-time, which are more important than resolution.

I’d far rather have Far Cry 1 style graphics at 2560x1600 with SSAA than Crysis at 720p with MSAA. The latter would look like total ass in comparison because it’d be a blurry shimmerfest.

Apparently we don't share the same opinion on what makes something 'realistic'.
I would much prefer the added geometry detail, animation, physics and AI over Far Cry, at the cost of a slightly lower resolution and a bit of aliasing.

Far Cry at 2560x1600 with SSAA would just be a very clinically rendered world that isn't very realistic. Reminds me too much of the raytracing crowd... "Look at this beautiful collection of perfectly round spheres on a checkerboard that I've rendered!". Yea okay... but we don't have all that much in the form of spherical objects and checkerboards in the real world. So it may be cleanly rendered, but it's not a realistic 'world'.
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
26,106
15,232
136
Scali, can I compliment you on one of the best-phrased "we agree to disagree" posts I have ever seen from you!

Sorry this is OT, but it is very relevant.