Further investigation could be useful.
See if it's NV cutting corners or not. I wouldn't be surprised either way.
Lots of NV defense and skepticism. Would you be so defensive when looking at another GPU company?
With comments like that, it's obvious your funny old CRT monitor has a higher IQ than you. :whistle:
Indeed it should be looked into. Perhaps Nvidia is cheating with lighting, perhaps it's a LoD issue (2GB vs 3GB), perhaps it's AMD cheating with DOF, or perhaps it's nothing.
Wouldn't not seeing it be the point?
Huh?
DOF is like smearing vaseline on a camera lens: you're simply distorting image quality at the cost of performance. It's there for a more cinematic experience, like TXAA.
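To illustrate the "vaseline" analogy: depth of field is essentially a depth-dependent blur, which is why it trades detail for a cinematic look. This is a toy 1-D sketch of my own (not anything from the game or either vendor's driver), with a made-up `dof_blur` helper that box-blurs only the out-of-focus pixels:

```python
def dof_blur(image, depth, focal_depth, threshold=0.2):
    """Toy 1-D depth of field: blur pixels far from the focal plane."""
    out = []
    for i, (px, d) in enumerate(zip(image, depth)):
        if abs(d - focal_depth) <= threshold:
            out.append(px)  # in focus: keep full detail
        else:
            # out of focus: average with neighbours, smearing detail away
            lo, hi = max(0, i - 1), min(len(image), i + 2)
            out.append(sum(image[lo:hi]) / (hi - lo))
    return out

# A sharp edge in the out-of-focus region gets smeared,
# while the in-focus region is untouched.
image = [0.0, 0.0, 1.0, 1.0, 0.5, 0.5]
depth = [0.1, 0.1, 0.1, 0.9, 0.9, 0.9]
print(dof_blur(image, depth, focal_depth=0.1))
```

The point being: whatever detail falls outside the focal plane is gone by design, so comparing screenshot sharpness in DOF'd regions says more about the effect than about either card.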
That said, I don't see missing lighting with the Nvidia cards; what I see is lower image quality and a lot less saturation from the lights. In dimly lit areas this seems to cause a loss of definition, though I can't say if it's actually a loss, or if it's something you wouldn't see IRL. With AMD every light is extremely vibrant; even in dark areas it's like the sun is out.
But there are some low-light areas that seem to have a lower LOD, either through DOF distortion or simply because it's not bright enough to actually see details.
It really depends on what the game's artists were going for: with AMD it looks like a vibrant city with high-power bulbs everywhere, while on the Nvidia system it has more of an earthy tone, and the lights themselves aren't nearly as powerful.
SLI might not be perfect, but what are you talking about?
http://www.pcper.com/reviews/Graphi...ils-Capture-based-Graphics-Performance-Test-4
Keep in mind, driver issues do happen, but the CrossFire issue is more of a general one.
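For anyone unclear on what capture-based testing like PCPer's actually measures: average FPS can look fine while frame pacing is all over the place. A crude sketch of the idea (my own illustration, not PCPer's actual method) is to look at how much consecutive frame times swing, since alternating short/long frames is what reads as micro-stutter:

```python
def frame_times_ms(timestamps_ms):
    """Deltas between consecutive frame timestamps, in ms."""
    return [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]

def stutter_score(timestamps_ms):
    """Mean absolute change between consecutive frame times.
    0 = perfectly even pacing; larger = more visible stutter."""
    ft = frame_times_ms(timestamps_ms)
    return sum(abs(b - a) for a, b in zip(ft, ft[1:])) / (len(ft) - 1)

# Both runs average ~60 FPS, but the second alternates 8 ms / 24 ms frames.
smooth = [0, 16, 32, 48, 64, 80]   # even 16 ms pacing
jerky = [0, 8, 32, 40, 64, 72]     # alternating short/long frames

print(stutter_score(smooth))  # 0.0
print(stutter_score(jerky))   # 16.0
```

That's why an FPS counter alone never showed the CF problem: both runs above report the same framerate.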
@amenx
Thx for the pics. It looks like the lighting issue is nothing more than a time-of-day thing, as evidenced in the first 2 screenshots.
Since when did Fermi have latency issues, outside the texture thrashing that occurred later when people were trying to compare cards with the same performance but 750MB more VRAM?
CF has always been broken, it's not an issue with just 7xxx series.
It's kind of sad when you can look at one company with across-the-board problems and make excuses based on one game the competition didn't do great in.
"Oh yeah, but that one time, in that one game, they had a problem too!!!"
Does anyone remember the Far Cry 3 comparison where ambient occlusion on AMD and Nvidia hardware was discussed? Although some claimed that Nvidia's cards would over-darken some spots, in those same screenshots the AMD cards had a clear loss of texture detail: graffiti was clearly visible in the Nvidia shot, while in the AMD shot it was smeared into the wood panel it was on. Likewise, on the brick structures on the roof, cracks were well defined in the Nvidia shot and almost invisible in the AMD shot. Note the flesh-tone difference in the screenshot below as well; the Nvidia shot clearly has more red in it.
This too could be attributed to the dynamic weather used in the game (time of day; an overcast sky is clearly visible in one shot vs the other). We also don't know the details of the systems used to capture these shots. Was there a different setting somewhere that didn't give a level playing field in the drivers? Too many variables to draw conclusions. The positioning is hardly exact either: just look at the proximity of the tree in relation to the player, and the grass. Is the FOV different, or is the player actually standing closer to the structure, and is that what changes the details from clearly defined to slightly smeared?
[screenshot]
I guess my point is that we cannot really know what is causing these differences for sure and we don't know what the original intent of the artists and programmers was for that particular scene with that particular weather and time of day. Was it intended to be like one shot or like the other? Is the color off on purpose or is it incorrectly rendered by one card?
Oh, that was it, they had insufficient memory? Makes me wonder how you managed to play any games on your 470s for so long (plus the added latency of SLI).
6xxx crossfire was fine. No worse than the GTX5xx SLI.
My own screenshots of the same places as in the other pics posted, at different times of night/day. This is with my 660 Ti:
[six screenshots]
P.S. The waviness of the store shutter door in the 2nd pic is an optical illusion. Click to enlarge the pic and it goes away.
By using proper settings, of course, but also by having 250MB more memory than the 1GB cards you're comparing.
The slight increase in latency was offset by the fact that it wasn't a stuttering mess like the Radeon cards. It was, however, a trivial increase next to the power of the vsync warriors.
We can argue until we're blue in the face; however, the perception of Radeon has gotten worse this generation, not better.
I don't really see this conversation going anywhere. Anyone who thinks CF is on the level of SLI (which, btw, is far from perfect) after all of this needs to take another look.
I don't think that was true either: http://www.tomshardware.com/reviews/radeon-geforce-stutter-crossfire,2995-5.html
Don't get me wrong: I would not recommend CFX right now to anyone who isn't willing to fool around with RadeonPro, but 6xxx CFX was just as good as SLI. The 570 had the same issue, and the 650 Ti/Boost also has it. For the 680/670 vs 7970/7950, the stutter is game-dependent. 7xxx was perfectly smooth throughout its lifetime apart from a month or two where they had issues. nV cards have been having issues on and off for a few years now, but nobody calls them out on it.
I figured you to be a little more impartial than that.