[PCPER] Visual Effect of Vsync: test it yourself

Can you tell the difference?

  • Yes, in all cases

  • Yes, but not in all cases

  • No, they all look the same to me


Results are only viewable after voting.

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
It took me about five seconds after the intro to identify which was which in every case. The difference was immediately apparent.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
YouTube is useless for this because it only displays 30 fps AFAIK, and I'm too lazy to download his video. But the right side looks much smoother than the left. That could be the engine, though. Battlefield 3 is a mess without running vsync or using the game's built-in gametime.maxvariablefps console setting to cap FPS.

Without one of those it never feels smooth.
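
For anyone who wants to try that cap: as far as I know you can either enter it in BF3's in-game console or put it in a user.cfg next to bf3.exe, with whatever value suits your setup, e.g.

    gametime.maxvariablefps 60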
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Thanks. I could definitely tell. I also preferred the flat 30 fps to the standard vsync version.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Another issue, which a video can't capture, is latency: part of the problem with v-sync is latency, but you don't experience it from a video; you have to be controlling the action yourself. You can at least see the visual differences between 30 and 60 FPS, and perhaps tearing, but the latency is lost without actually playing the game.
 

Dankk

Diamond Member
Jul 7, 2008
5,558
25
91
Another issue, which a video can't capture, is latency: part of the problem with v-sync is latency, but you don't experience it from a video; you have to be controlling the action yourself. You can at least see the visual differences between 30 and 60 FPS, and perhaps tearing, but the latency is lost without actually playing the game.

Yup. Latency is far more annoying to me than screen-tearing, and in many games, Vsync introduces a crippling amount of latency to the point where it's better turned off.
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
Unless the latency is ungodly awful... I'd prefer a tiny bit of that to screen tears.
I guess it depends though; latency is annoying as f*** in FPS games.

YouTube is useless for this because it only displays 30 fps AFAIK, and I'm too lazy to download his video.
^ this.

It becomes a matter of which had more screen tears,
since both are running at 30 fps thanks to YouTube.

The 30 fps vs Vsync comparison is really telling, though.
It looks much better with Vsync on (less jerky, fewer screen tears).
It's noticeable at full speed, but really apparent when they slow the playback down to 50%, etc.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Yup. Latency is far more annoying to me than screen-tearing, and in many games, Vsync introduces a crippling amount of latency to the point where it's better turned off.

This for me. I always have vsync off. Tearing is only minor to my eyes, but it is there. Sometimes more noticeable than others.
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
Some reps from AMD told me at a meeting during GDC that there were no animation issues once you enabled Vsync and that the constant shift between 16 ms and 33 ms frame times was imperceptible. I think our videos here have quite definitely discounted that theory.

And of course, enabling Vsync in a situation where the frame rate does NOT dip below 60 FPS, as we did with our reference videos used in this article, will not result in animation inconsistencies.

This guy is obviously on nvidia's payroll....

Translation: "A single AMD card can't possibly hope to maintain over 60fps and runs like crap with Vsync on." "In situations where we can easily maintain over 60fps (wink wink Titan SLI TROLOLOLOL) Vsync appears to result in perfectly fine animations."

The fact that he fails to even test CrossFire 7970s with vsync, and then goes on to claim that situations where you can maintain greater than 60 fps result in perfect animation, makes me want to reach through my computer screen and shake the shill out of this guy. So, for those who didn't know, $2,200 worth of GPUs still outperforms $380 worth; film at 11.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
I think you're not understanding what's being said. Without adaptive vsync or triple buffering, the moment AMD drops below 60 fps it goes directly to 30... The point is that 30 isn't smooth at all.

According to AMD there is no difference in smoothness between 30 and 60 fps, nor will any of its users notice constant flips between the two...


Then again, maybe I'm the one who isn't understanding!
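
To illustrate why it snaps straight to 30, here's a rough Python sketch (my own toy numbers, assuming a 60 Hz display and plain double-buffered vsync with no triple buffering or adaptive vsync):

    import math

    REFRESH_MS = 1000.0 / 60.0  # one refresh interval at 60 Hz, ~16.7 ms

    def displayed_frame_time(render_ms):
        # With double buffering, a finished frame has to wait for the next
        # refresh, so its on-screen time rounds up to a whole number of
        # refresh intervals.
        return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

    for render_ms in (15.0, 17.0, 25.0, 33.0):
        shown = displayed_frame_time(render_ms)
        print(f"render {render_ms:4.1f} ms -> shown {shown:4.1f} ms (~{1000.0 / shown:.0f} fps)")

Miss the 16.7 ms budget by even a little (17 ms) and the frame is held for two refreshes, i.e. you're at 30 fps for that frame; there's nothing in between 60 and 30 without triple buffering or adaptive vsync.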
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
I think you're not understanding what's being said. Without adaptive vsync or triple buffering, the moment AMD drops below 60 fps it goes directly to 30... The point is that 30 isn't smooth at all.

According to AMD there is no difference in smoothness between 30 and 60 fps, nor will any of its users notice constant flips between the two...


Then again, maybe I'm the one who isn't understanding!

I just view it as a passive-aggressive way of him stating that AMD's cards can't maintain >60 fps and that the experience suffers. He then goes on to say that on their reference setup, which I believe he stated earlier was Titan SLI, they didn't have the problem. It just comes off as a subtle jab at AMD, and unprofessional writing if you ask me.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
This guy is obviously on nvidia's payroll....

Translation: "A single AMD card can't possibly hope to maintain over 60fps and runs like crap with Vsync on." "In situations where we can easily maintain over 60fps (wink wink Titan SLI TROLOLOLOL) Vsync appears to result in perfectly fine animations."

The fact that he fails to even test CrossFire 7970s with vsync, and then goes on to claim that situations where you can maintain greater than 60 fps result in perfect animation, makes me want to reach through my computer screen and shake the shill out of this guy. So, for those who didn't know, $2,200 worth of GPUs still outperforms $380 worth; film at 11.

PCPer is currently batting 0 for 3 on credibility with their shoddy research methods and nvidia involvement. I still want to see future reviews from other sites looking at all this, both with and without the FCAT tool, which are likely coming soon with some of the new cards on the horizon.

It will be helpful to illuminate some of the shadows PCPer casts with their poor research practices, and to see whether there are any inconsistencies in their data and conclusions when contrasted against other sites that generally appear more impartial.

That is, sites that don't do marketing videos with nvidia marketing reps to go with their reviews, don't release data while working in tandem with nvidia against their competitor without revealing that they are working together, and don't use language that at times sounds more like a garden-variety nvidia fanboy than a (hopefully) impartial reviewer.

There is undoubtedly something to CF not being as robust as SLI; a lot of sites have commented on this. But PCPer is the only one making outrageous blanket statements while sharing an ice cream with nvidia.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I just view it as a passive-aggressive way of him stating that AMD's cards can't maintain >60 fps and that the experience suffers. He then goes on to say that on their reference setup, which I believe he stated earlier was Titan SLI, they didn't have the problem. It just comes off as a subtle jab at AMD, and unprofessional writing if you ask me.

It seems to me that you're being very defensive.

The point wasn't whether AMD can't maintain over 60 FPS; it was that when they don't maintain over 60 FPS, this is what happens. He made it clear that the same thing happens on AMD and Nvidia cards with v-sync on. He even used real-world examples so that you can see there are times you won't maintain 60 FPS.

No setup is always over 60 FPS in all games. Even if you set everything to low, there are games that are CPU-bottlenecked below 60. This was about those instances.

Of course he is only looking at that condition. There are other conditions where v-sync isn't optimal either, such as the minimum latency and extra input sampling that competitive FPS gamers shoot for.
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
It seems to me that you're being very defensive.

The point wasn't whether AMD can't maintain over 60 FPS; it was that when they don't maintain over 60 FPS, this is what happens. He made it clear that the same thing happens on AMD and Nvidia cards with v-sync on. He even used real-world examples so that you can see there are times you won't maintain 60 FPS.

No setup is always over 60 FPS in all games. Even if you set everything to low, there are games that are CPU-bottlenecked below 60. This was about those instances.

Of course he is only looking at that condition. There are other conditions where v-sync isn't optimal either, such as the minimum latency and extra input sampling that competitive FPS gamers shoot for.

Yes, I'm defending sound scientific practices. He calls CrossFire crap. People claim it works perfectly with Vsync. He says, OK, I'll test vsync. He says AMD cards still run like crap even with vsync when they drop below 60 fps. He then goes on to say that the nvidia SLI system that maintained over 60 fps had no problems with vsync on. All while never testing CrossFire with vsync, which was the entire purpose of the article. Seems SUPER legit.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Yes, I'm defending sound scientific practices. He calls CrossFire crap. People claim it works perfectly with Vsync. He says, OK, I'll test vsync. He says AMD cards still run like crap even with vsync when they drop below 60 fps. He then goes on to say that the nvidia SLI system that maintained over 60 fps had no problems with vsync on. All while never testing CrossFire with vsync, which was the entire purpose of the article. Seems SUPER legit.

Wait a second. You misunderstood the dual SLI setup's role in the test. He just wanted to create two reference scenarios. The Titan SLI setup is there to build a system that can run with v-sync on and maintain 60 FPS 100% of the time in his test. This is the baseline machine, the example of perfectly maintained 60 FPS. I suppose he could have taken the label off the system so people wouldn't focus on what is powering it, but that is what people often ask for.

He also ran another system at a locked 30 FPS with v-sync, which never varied. This is the second baseline machine, showing exactly what smooth 30 FPS looks like with no variation in frame times.

Then he ran the 7970 on its own in situations where it cannot maintain 60 FPS. This shows what v-sync looks like when you do not maintain consistently spaced 30 or 60 FPS, i.e. what happens when frame delivery times flip between 16.7 ms and 33.3 ms over and over.

I suppose, to not offend anyone, they could have used a single 670 or 680 instead of the 7970. As they said, it wouldn't matter which brand of card was used; it results in the same delivery of frames.

It seems to me that you saw the test systems and just assumed what they were trying to explain, rather than reading it.
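
To put some toy numbers on that flipping case (my own arithmetic, not PCPer's data): if delivery strictly alternates between 16.7 ms and 33.3 ms, an FPS counter averages it out to roughly 40 FPS, even though every individual frame is either a 60 FPS frame or a 30 FPS frame, and that alternation is exactly the stutter the videos are meant to show:

    # Toy comparison of the three scenarios in the videos (illustrative numbers only).
    scenarios = {
        "steady 60 FPS":  [1000 / 60.0] * 8,
        "steady 30 FPS":  [1000 / 30.0] * 8,
        "vsync flipping": [1000 / 60.0, 1000 / 30.0] * 4,  # 16.7 / 33.3 ms alternating
    }

    for name, frame_times in scenarios.items():
        avg = sum(frame_times) / len(frame_times)
        swing = max(frame_times) - min(frame_times)
        print(f"{name:14s}  avg {avg:4.1f} ms (~{1000 / avg:.0f} fps), "
              f"frame-to-frame swing {swing:4.1f} ms")

The averages all look respectable (60, 30, ~40), but only the third case has a 16.7 ms swing between consecutive frames, and that swing is what you actually perceive as judder.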
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
It doesn't matter what gets posted, the partisan ridiculousness starts. This topic has nothing to do with AMD versus Nvidia; it is simply a question of whether you can see the difference between the three scenarios.

Stay on topic please.
 

psolord

Platinum Member
Sep 16, 2009
2,142
1,265
136
I've been gaming since the Amstrad CPC 6128 and, after all these years of moving from home computers to PCs, gaming with vsync at 60 fps is a must for me. Neither no-vsync nor 30 fps is an option for me.

My CRT days are gone since the first proper LCDs came out, but I remember back in the day that vsync did not matter too much for me. Something changed and I am not sure if it's me or the way LCDs work.

I am pretty sure a 120 Hz solution would provide an even better gaming experience, but that will not happen for me until large HDTVs with such capabilities plus 3D come out.
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
It doesn't matter what gets posted, the partisan ridiculousness starts. This topic has nothing to do with AMD versus Nvidia; it is simply a question of whether you can see the difference between the three scenarios.

Stay on topic please.

If that were the case, then Ryan Shrout should not have disclosed what hardware was running the videos. He just so happened to paint AMD in a negative light once again.
 

Imouto

Golden Member
Jul 6, 2011
1,241
2
81
PCPer is utterly obnoxious at this point.

I'm just wondering when multi-GPU solutions started to be the main selling point for graphics cards. I really wish Steam included this in its hardware survey.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
PCPer is utterly obnoxious at this point.

I'm just wondering when multi-GPU solutions started to be the main selling point for graphics cards. I really wish Steam included this in its hardware survey.

Maybe around the time $300 2560x1440 monitors were discovered? I really don't know. I would love to have a single-GPU card that handles that resolution as well as my SLI setup does, but outside of a $1,000 Titan it doesn't exist.
 

deathBOB

Senior member
Dec 2, 2007
569
239
116
If that were the case, then Ryan Shrout should not have disclosed what hardware was running the videos. He just so happened to paint AMD in a negative light once again.

This is the fourth paragraph:

To be 100% clear - the issues with Vsync and animation smoothness are not limited to AMD graphics cards or even multi-GPU configurations. The situations we are demonstrating here present themselves equally on AMD and NVIDIA platforms and with single or dual card configurations, as long as all other parameters are met. Our goal today is only to compare a typical Vsync situation from either vendor to a reference result at 60 FPS and at 30 FPS; not to compare AMD against NVIDIA!!

The emphasis is from the original. Seems pretty clear to me.
 