Image quality: which is better?

Xarick

Golden Member
May 17, 2006
I read the article on PCPer about FCAT. In it they included many videos showing frame drops, tearing, etc. However, the one thing I noticed was a difference in image quality, part of it color...
So the questions: which has better image quality?
Which has better color?
Which one is the most accurate?

Here is a video for comparison:
http://www.youtube.com/watch?feature=player_embedded&v=E-XMmdZyrtk
http://www.youtube.com/watch?v=lY0iFojEKRU&feature=player_embedded
http://www.youtube.com/watch?v=oV3eAkXqykQ&feature=player_embedded

Discuss.
 

bystander36

Diamond Member
Apr 1, 2013
I can't see a difference. If you play through one in full screen and then jump back to the beginning on the other card, it looks slightly different, but that is because of where it is being recorded. When they are shown side by side, they look the same, apart from the obvious fact that they are not EXACTLY following the same movements.

In the Skyrim video, everything was exactly the same.

The Sleeping Dogs video definitely looked like it was shot at a different time of day or with different settings.

Crysis 3 looks the same as well.

It seems the Sleeping Dogs video was run at different settings, something is different on the cards, or it was a time-of-day thing. That was the only one where I saw a difference, though.
 

Arkadrel

Diamond Member
Oct 19, 2010
Skyrim: the 7950 looks smoother, but has more screen tearing.
Sleeping Dogs: the 660 Ti looks smoother.
Battlefield 3: the 7950 looks smoother, but has more screen tearing.


Image quality: which is better?
This isn't a CF vs SLI thing?


Go to the Battlefield 3 video, select 1080p, and pause 0:10 in.
Look at the wall.

The brightness/ambience Nvidia has means detail is lost on the walls (the cracks).
(Thin dark lines in a very bright area = washed out.)

On the other hand, sometimes it feels like there's more life to Nvidia's colours.
AMD has better darks, though; sometimes things get washed out on Nvidia's side (if it's a light area).

Look at the crispness of the wall textures; they look sharper on the 7970.

If an area of the game is really dark, colours showing better sometimes means you see something instead of just "blackness", which AMD would give you. A loss of detail (you can't see anything).

If an area of the game is really light, Nvidia's colours will sometimes be washed out.
A loss of detail.

It depends on the artist's rendering and our own tinkering with colours/lighting etc.
As for which is better: the differences are minimal, and probably subjective.

*** I'm not sure the guy who ran the tests used the same settings on both cards.
It looks like the 7970 is running higher AA or something.

Edit:

It's weird, because in Sleeping Dogs the roles are reversed. Nvidia's looks darker there (at first, inside the room). Once outside, it's much, much darker on the AMD card, but I suspect that's a time-of-day difference.
 

Xarick

Golden Member
May 17, 2006
I believe the colors are adjustable?
I am curious about out-of-the-box settings.

I see a large difference in Skyrim's colors.

AMD has darker darks, which in some games I found a detriment, so I had to turn up the gamma. On Nvidia I drop the gamma... :)

I noticed when I switched that Nvidia's default brightness is higher, so I dropped it a notch. I actually dropped it on my monitor, because I had turned the monitor's brightness up since my 5850 was quite a bit darker.
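For what it's worth, the gamma sliders being described apply (roughly) a power-law remap of each channel value. A minimal sketch of the idea in Python; the exact curve each driver uses is an assumption here, this just shows why raising gamma lifts the darks:

```python
def apply_gamma(value, gamma):
    """Remap an 8-bit channel value with a power-law gamma curve.

    gamma > 1.0 brightens mid-tones; gamma < 1.0 darkens them.
    Black (0) and white (255) are unchanged either way.
    """
    normalized = value / 255.0
    return round((normalized ** (1.0 / gamma)) * 255)

# A dark mid-tone responds strongly to the slider:
print(apply_gamma(64, 1.0))  # unchanged: 64
print(apply_gamma(64, 1.2))  # lifted above 64
print(apply_gamma(64, 0.8))  # pushed below 64
```

This is why a card whose defaults render darker (as described for the 5850) can largely be matched by nudging gamma or brightness on the other card.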
 

geniusloci

Member
Mar 6, 2012
AMD is better in Windows, OOB. I've noted this previously elsewhere, which I'll paste here:

Nvidia's video quality in Windows is terrible. YouTube, HD MKVs, you name it. It is massively inferior in every way to AMD's. I have a hard time believing anyone cannot see this in a side-by-side comparison.

Likewise, the 2D desktop is inferior. Unfortunately, a good many people don't notice. This might only be at default color settings, but out of the box, with the correct monitor profile, Nvidia lacks the punch and lively presentation of AMD's offerings.

It might be possible to adjust this out with a stringent calibration, but I've never been able to do it by eye. I've never been able to make an Nvidia setup look remotely as pleasing by eyeballing it. It surprises me that more people can't see this, but not everyone has the same visual acuity.

Along these same lines, text on Nvidia-driven systems has a distinct blur or fringe to it, regardless of how ClearType is set (I spent a very long time tweaking it). It should be noted that this is apparently subjective, and some people will notice it more than others, as stated in the ClearType wiki:
According to MSDN website,[5] Microsoft acknowledges that "[t]ext that is rendered with ClearType can also appear significantly different when viewed by individuals with varying levels of color sensitivity. Some individuals can detect slight differences in color better than others." This opinion is shared[6] by the font designer Thomas Phinney, program manager for fonts and core technologies at Adobe Systems:[7] "There is also considerable variation between individuals in their sensitivity to color fringing. Some people just notice it and are bothered by it a lot more than others."

I do notice it, and unfortunately it is annoying as hell. Apparently even text is widely accelerated now, and there is something Nvidia is doing differently from AMD in the process, resulting in a different image in Windows. I will say I've not noticed any similar effect in any Linux distribution.

I don't know what factors play a role beyond these, or whether it's a combination of them; I only know I see a difference and wish I did not. From the people I've talked to over the last couple of years, it might be 1 in every 25 or so who can make this distinction. I suspect it's compounded by the large number of men with various color-blindness problems, who would never be able to detect it.

Edit: I should note I'm running a 6-bit e-IPS Dell U2412M. Not the absolute greatest out there, but a generally pleasing monitor to look at.
 

f1sherman

Platinum Member
Apr 5, 2011
AMD is better in Windows, OOB. I've noted this previously elsewhere, which I'll paste here:

Nvidia's video quality in Windows is terrible. YouTube, HD MKVs, you name it. It is massively inferior in every way to AMD's. I have a hard time believing anyone cannot see this in a side-by-side comparison.


Okie...
Now put your 23-post reputation where your mouth is :)
What are the differences between these pics? Which one is Nvidia/AMD?

One of these has a distinct blur or fringe to it. So for you the answer should be obvious, no?

c13omsyo.png
c23ams7i.png
 

SirPauly

Diamond Member
Apr 28, 2009
Personally, I always liked ATI's color and richness, which may be down to the default dynamic contrast; with modest tweaking you can get rich color, depending on the lighting in the room, using DV or third-party tools like SweetFX. This is subjective.

What I always appreciated was quality AA, flexibility, filtering, smooth mip-map transitions, and tools and features to improve image quality based on one's subjective taste, tolerance and threshold level.
 

Arkadrel

Diamond Member
Oct 19, 2010
In the grand scheme of things... I've got to admit the (colour) differences are so small it's not really an issue.

It used to be that AMD cards had "flickering" textures, a lot more than Nvidia had.
By now, with the 7xxx series, that's as good as gone.

AMD still has more screen tearing than Nvidia, at least judging by CF vs SLI.
That's something I believe I read they were going to fix at some point, too
(it has something to do with drivers, I believe).


@Fisherman

I'd swear there is a colour difference between those.

The blues/purples look different,
as do the reds (especially when darkened).

Look at the 2nd row (on the left-hand side), 2 down and 10 across (that darkened red pixel?).
They are not the same on both sides; one side clearly has a more ambient red.

Same thing (red) with 4 down, 9 across.
Same thing (red) with 10 down, 13 across.

I can't tell the greens apart between the pictures, but I notice the reds and blues; there are small differences.

I like the one on the right-hand side more (which one was that?),
because the greyed-out/toned-down pixels are sharper, more lively.
It defines the "dark" area that makes up the letter better.


One of these has a distinct blur or fringe to it.
It's the one on the left, right?
 

TakeNoPrisoners

Platinum Member
Jun 3, 2011
Anyone who says there is a difference in colors, or whatever you guys are arguing about, is a raging fanboy. There is no other explanation; you guys can't support your claims with facts. What I just stated is a fact, however.

When I had my 5850 and upgraded to my GTX 680 on the exact same monitor, I noticed no differences at all, well, besides my FPS going through the roof.

AMD and Nvidia are equal in this regard.
 

Xarick

Golden Member
May 17, 2006
Funny, TakeNo, I also upgraded from a 5850, but to a 670, and I'd say Nvidia has a higher brightness by default.
 

bystander36

Diamond Member
Apr 1, 2013
It's the one on the left, right?

He was being sarcastic. There is no difference, and the article he pulled it from found no difference either.

The blur that the poster in question mentioned is likely ClearType, or maybe he uses an analog monitor.
 

geniusloci

Member
Mar 6, 2012
Some guy studied desktop image quality and disagrees with your experience.

http://hardforum.com/showpost.php?p=1038753026&postcount=1

I personally found upping digital vibrance was all I needed to make it okay.

As stated by the guy at Adobe, not everyone sees the same thing. He can disagree all he wants. The "digital is digital" argument is bullshit.

"Upping" digital vibrance does nothing but blow out certain colors.
 

f1sherman

Platinum Member
Apr 5, 2011
@Arkadrel
You'd be my go-to image aficionado if Xarick hadn't posted the answer already.
Left: NV. Right: AMD.

As for the color difference:

chart-1.png
 

bystander36

Diamond Member
Apr 1, 2013
As stated by the guy at Adobe, not everyone sees the same thing. He can disagree all he wants. The "digital is digital" argument is bullshit.

"Upping" digital vibrance does nothing but blow out certain colors.

The article that was posted prior covers only the desktop, and you complained about both desktop and gaming differences. There may be some very slight differences in game, but not at the desktop.

Anyway, here is a game review on image quality. It is older, from when the 7000 series was new. THG found AMD's image quality inferior at the time, but the problem was fixed in later driver updates after they learned of a defect in the texture handling.

http://www.tomshardware.com/reviews/image-quality-driver-optimization-graphics,3173-8.html
 

Arkadrel

Diamond Member
Oct 19, 2010
He was being sarcastic. There is no difference, and the article he pulled it from found no difference either.

The blur that the poster in question mentioned is likely ClearType, or maybe he uses an analog monitor.


Maybe he was.
But I'd swear that if you took those two pictures he posted into a program (if there is a "match picture X against picture Y" tool) and compared them, you wouldn't find them 100% identical.

Maybe it's just him using a camera zoomed in a bazillion times (camera fault?), or maybe it's different monitors (the pixels in each area aren't 100% identical?), or different analog vs digital ports, or something.

Those pictures aren't 100% identical.
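For what it's worth, that kind of pixel-exact comparison is easy to script. A minimal sketch in pure Python, with tiny synthetic images standing in for the posted screenshots (a real check would load the actual PNGs with an image library such as Pillow):

```python
# Pixel-exact comparison of two images, each represented as rows of
# (R, G, B) tuples. Any nonzero per-pixel delta means "not identical".

def diff_pixels(img_a, img_b):
    """Return a list of (x, y, delta) for every pixel that differs."""
    diffs = []
    for y, (row_a, row_b) in enumerate(zip(img_a, img_b)):
        for x, (pa, pb) in enumerate(zip(row_a, row_b)):
            delta = sum(abs(ca - cb) for ca, cb in zip(pa, pb))
            if delta:
                diffs.append((x, y, delta))
    return diffs

# Two 2x2 "screenshots" that differ by 2 units of green in one pixel:
img_a = [[(10, 20, 30), (40, 50, 60)],
         [(70, 80, 90), (100, 110, 120)]]
img_b = [[(10, 20, 30), (40, 52, 60)],
         [(70, 80, 90), (100, 110, 120)]]

print(diff_pixels(img_a, img_b))  # → [(1, 0, 2)]: one differing pixel
```

Of course, a nonzero diff between two camera photos only proves the photos differ, not that the cards rendered differently; lossless screen captures would be needed to settle it.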
 

bystander36

Diamond Member
Apr 1, 2013
Maybe he was.
But I'd swear that if you took those two pictures he posted into a program and compared them, you wouldn't find them 100% identical.

What kind of monitor do you have? TN monitors often have some color shift, which may be affecting what you see, as the images are at different locations on the screen.
 

geniusloci

Member
Mar 6, 2012
Okie...
Now put your 23-post reputation where your mouth is :)
What are the differences between these pics? Which one is Nvidia/AMD?

One of these has a distinct blur or fringe to it. So for you the answer should be obvious, no?

The one on the left is Nvidia; the right is AMD. If you can't see that the left image has visibly darker color to the pixels, you have no more need to post, because I can see it. I take it these are supposedly calibrated to exacting matches, right?

Again: I don't have to prove anything; Microsoft and Adobe BOTH acknowledge that such a thing exists.

Edit: so I went back up, read the last few comments, and got vindication from multiple people in multiple ways.
 

Arkadrel

Diamond Member
Oct 19, 2010
@Arkadrel
You'd be my go-to image aficionado if Xarick hadn't posted the answer already.
Left: NV. Right: AMD.

As for the color difference:



The blues/purples look different,
as do the reds (especially when darkened).
:colbert: hahaha, epic. Tiny differences, lmao.

Anyway, you can see differences in the 3D rendering in games between the two.

I'm not obsessed enough to notice a difference in 2D text unless the picture is blown up, like that one was, and I'm actively looking for differences.
 

geniusloci

Member
Mar 6, 2012
:colbert: hahaha, epic. Tiny differences, lmao.

Annnnd I restate:

According to MSDN website,[5] Microsoft acknowledges that "[t]ext that is rendered with ClearType can also appear significantly different when viewed by individuals with varying levels of color sensitivity. Some individuals can detect slight differences in color better than others." This opinion is shared[6] by the font designer Thomas Phinney, program manager for fonts and core technologies at Adobe Systems:[7] "There is also considerable variation between individuals in their sensitivity to color fringing. Some people just notice it and are bothered by it a lot more than others."

Just because YOU can't see the difference doesn't mean it doesn't affect other people. You might note that I even stated that a decent number of men might never notice these kinds of things due to various color-blindness problems.
 

bystander36

Diamond Member
Apr 1, 2013
The differences in the chart above are very, very small, something that takes at least two decimal places to show. And which is actually better, or more accurate, is another question altogether.
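If the chart is reporting per-color error, the usual metric is delta-E: the distance between a measured color and its reference in L*a*b* space. A sketch of the simplest (CIE76) version, with made-up readings since the chart itself isn't reproduced here; values below roughly 1.0 are generally taken to be invisible:

```python
import math

def delta_e76(lab_1, lab_2):
    """CIE76 color difference: Euclidean distance between two
    (L*, a*, b*) triples. Roughly, delta-E < 1 is imperceptible."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab_1, lab_2)))

# Hypothetical readings of the same red patch on two cards; the tiny
# result illustrates a difference needing two decimal places to show.
card_a = (53.20, 80.10, 67.20)
card_b = (53.23, 80.07, 67.24)
print(round(delta_e76(card_a, card_b), 2))  # → 0.06
```

A delta-E that small is far below the threshold of perception, which supports the point that whichever card "wins" such a chart, nobody could see it.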
 

f1sherman

Platinum Member
Apr 5, 2011
The one on the left is Nvidia; the right is AMD. If you can't see that the left image has visibly darker color to the pixels, you have no more need to post, because I can see it. I take it these are supposedly calibrated to exacting matches, right?

Again: I don't have to prove anything; Microsoft and Adobe BOTH acknowledge that such a thing exists.

No, but you need to realize that there are such things as monitor refresh rates and camera shutter speed.
Also, each pixel is lit by separate R, G and B subpixels, and there is little that Nvidia or AMD can do about that.
 

Arkadrel

Diamond Member
Oct 19, 2010
@Geniusloci

I quoted myself stating that I perceived differences in those two pictures.
I noticed it in the blues and reds.

Fisherman later posted that there were indeed tiny differences,
in both the blues and reds.

On the other hand, I don't believe I would have noticed it if I hadn't been actively looking for differences,
AND if the picture hadn't been blown up (in size) to make it easier to see.

Maybe you actually can see a difference at regular size between the two...
I'm not sure I can, or that it would even bother me if I could.
 

bystander36

Diamond Member
Apr 1, 2013
@Geniusloci

I quoted myself stating that I perceived differences in those two pictures.
I noticed it in the blues and reds.


To be completely fair, the picture on the left appeared to be slightly out of focus compared to the right. That is just a camera-focus issue and has nothing to do with what was displayed on the monitor. However, because of that focus issue, how much color difference is actually there can't really be measured accurately this way.