Please elaborate, before I report your post as a useless ad hominem?
Vsync fixes runt frames since there are no runt frames when the frames are synced with the refresh rate.
This is a fact.
You are wrong.
End of discussion
UserCP
Edit Ignore List
Add user to ignore list
(the troll's user name that I am not allowed to single out due to forum rules)
Okay
A lot of us have followed these steps, and I suggest you do so too if you can't stop yourself from feeding it.
If that were really the case, review sites would not be expending so much energy on investigating this issue, and AMD would not be developing driver updates specifically to address the issue. Are you saying they're all devoting so much time to this just to justify their jobs? Or maybe they just aren't as good as you at thinking outside the box eh?
Dumb question - when you enable vsync on a 120Hz monitor, it limits the FPS to 120, right? So why do people play without vsync? I have to enable it because tearing gets annoying on a 60Hz monitor. Just curious.
Input lag comes with Vsync.
Yes, it locks it at 120 FPS, unless your video card can't sustain that.
I agree screen tearing gets extremely annoying which is primarily why I use vsync too.
EDIT: Doesn't input lag depend on your mouse? Not sure.
Hmmm.. I may try a couple of FPS games and see if I can notice a difference. Crysis 3 is the only game I play without vsync; for some reason it caps my FPS at 30 when I enable it..
You CANNOT display runt frames when Vsync is turned on. FACT!
Input lag comes with Vsync.
And you also don't see another evil, which is tearing. You just have to frame limit to do away with input lag, which we already know is a valid remedy.
Once you do all that, the single number of FRAMES PER SECOND that the 7990 churns out is very much a valid metric. And the card is definitely faster than a 690 and Titan.
Of course that doesn't mean it's necessarily the better choice once power and noise are taken into account. For example, in a multi-monitor configuration, Titan blows both cards AWAY. Not only does it have a much larger frame buffer (which helps tremendously at huge resolutions), but its multi-monitor idle power consumption is nothing short of amazing.
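For anyone wondering what the "frame limit" remedy mentioned above actually does, here is a minimal sketch in Python. It is only an illustration of the pacing idea (real limiters live in tools like RadeonPro or in-engine caps); the function name `limit_fps` and the 59.0 default are my own choices, not anything from the thread:

```python
import time

def limit_fps(frame_count: int, target_fps: float = 59.0) -> list:
    """Run frame_count dummy frames, never letting a frame start
    before its scheduled slot, so frames are paced evenly."""
    frame_time = 1.0 / target_fps
    timestamps = []
    next_slot = time.perf_counter()
    for _ in range(frame_count):
        now = time.perf_counter()
        if now < next_slot:
            time.sleep(next_slot - now)  # wait for this frame's slot
        timestamps.append(time.perf_counter())
        # ... the actual frame would be rendered here ...
        next_slot += frame_time
    return timestamps
```

Capping just below the refresh rate keeps the GPU from racing ahead of the display, so vsync no longer queues up finished frames, which is why frame limiting removes most of the input lag.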
Jaydip wrote:
This is the "minor" negative to vsync.
Normal vsync drops to 1/2 the refresh rate (30 fps) when it dips below 60 fps. So you bounce between 30 and 60 a lot if your GPU cannot maintain >60 fps. That's the "major" negative: it causes stutter and nausea in gamers sensitive to motion sickness. FFXI once made me throw up, because that game is so CPU-bottlenecked in some scenes that the frame rate would fluctuate depending on where my character's camera pointed.
In the old days, we cured this by a) playing with triple buffering (not always possible with some games), and b) tuning the in-game graphics settings so they don't cause frequent dips below 60 fps.
Option a) adds an extra frame of latency, which for some is unacceptable. To me, it's a non-issue, because I don't feel ~10-20 ms is a HUGE OMFGBBQ winning advantage. Obviously, to some, it's a big deal.
Option b) requires a bit of tweaking and time, which, sadly, the new generation of PC gamers has neither the patience nor the mindset to bother with.
Today, NV already cures these issues by default in their drivers: adaptive vsync fixes them for gamers who like to play with vsync, and built-in frame-time metering covers gamers who don't want vsync BUT obviously don't mind the extra frame-time latency. This is the key point, as any metering of frame times adds latency to the overall rendering pipeline. How much latency? No reviewer has investigated this thoroughly.
Until recently, AMD didn't care about this, since they felt gamers prefer the lowest latency at the expense of stuttering. Soon they will add the option in CCC: smooth gameplay with more latency, or vice versa. But in the meantime, Radeon Pro outright fixes frame-time issues on Radeons.
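The half-refresh drop described above comes down to simple arithmetic: under double-buffered vsync a finished frame can only be shown on a refresh boundary, so the frame interval rounds up to a whole number of refresh periods. A rough back-of-the-envelope model (the function name and numbers are illustrative, not from any driver):

```python
import math

def vsynced_fps(render_ms: float, refresh_hz: float = 60.0) -> float:
    """Effective frame rate under double-buffered vsync: the frame
    interval rounds up to a whole number of refresh periods."""
    refresh_ms = 1000.0 / refresh_hz
    periods = max(1, math.ceil(render_ms / refresh_ms))
    return refresh_hz / periods

# A frame that takes even slightly longer than ~16.7 ms misses the
# refresh and waits for the next one, halving the rate to 30 fps.
# Triple buffering avoids the wait, at the cost of a frame of latency.
```

So a GPU averaging 50 fps unsynced (20 ms frames) shows 30 fps with plain vsync, which is exactly the 30-60 bouncing described above.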
Jaydip wrote:
"Dude I believe you have reported so many posts by now mods have probably put you on their ignore list" :biggrin:
Man that made me lol...best comment I've seen here in a long time. :thumbsup:
I love how he says he has Groove on ignore on page 2 but then posts right after him.:biggrin:
Reviews have already said that Vsync fixes it. Even PCPer has stated that vsync does away with all stuttering and runt frames by forcing the engine to display frames only when the monitor refreshes.
This whole storm in a teacup is all about non-vsync performance. It always has been.
You CANNOT display runt frames when Vsync is turned on. FACT!
If that were really the case, review sites would not be expending so much energy on investigating this issue, and AMD would not be developing driver updates specifically to address the issue. Are you saying they're all devoting so much time to this just to justify their jobs? Or maybe they just aren't as good as you at thinking outside the box eh?
Dude, give it a rest. Unless you own AMD cards, why do you care whether AMD improves their drivers or not?
That's a very good point for me, given that I lean heavily towards nVidia and am nVidia-centric at this time.
PC Perspective did an article on it (the new drivers). It is a dramatic improvement for sure. Good to see.
http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-AMD-Improves-CrossFire-Prototype-Driver
Just out of curiosity, since you are and have been a loyal NV user: what perf/$ deficit would it take for you to consider switching to Radeons?
Back when the 5850 was out, the price gap compared to NV cards was ridiculous. Yet many users still bought NV cards, which not only cost heaps more but were also much worse in perf/W.
Is there a point that would make you reconsider? Me, I switch to whichever vendor offers the best bang for the buck each generation, or whenever I want a new rig. Loyalty to a corporation is akin to being brainwashed IMO; they are all out to deprive you of as much of your hard-earned $$ as possible, so why in all logic would you devote yourself to their cause? I've used AMD CPUs, and now I've been with Intel for the past several generations. I've used Rendition, 3dfx, ATI, NV, and gone back again to AMD. When the next gen hits, if NV is ever gracious enough to offer good value-for-money products instead of gouging their customers, I will happily buy their GPUs.
Okay, please note I'm not trying to stir any pots, just trying to get some clarification.
I was under the impression that the "runt" frames were creating an artificial boost to CFX numbers. PCPer went as far as saying the second card was "useless", since removing the runt frames reduced the total FPS by almost half.
So, does this new driver actually remove the runt frames? The card doesn't seem to have taken an FPS hit, which would rule out the notion that the runt frames were artificially ballooning the numbers, or...
What does it actually do? Are the frames now either full frames, or just big enough that they no longer get flagged as runts?