bystander36
Diamond Member
- Apr 1, 2013
There is screen tearing. I personally will use adaptive v-sync most of the time, but those looking for the most responsive feel should leave it off and live with the tearing.
So if you're on a 120Hz monitor and you have a GPU setup that is powerful enough, how do you prevent it from going over 120fps with v-sync off?
Someone else pointed out that a lot of people with 120Hz monitors play older games to achieve said frame rates.
Older games will easily pump out 200fps if you let them.
So is there no visible screen tearing if you're at 200fps on a 120Hz monitor with v-sync off?
I can't imagine wanting to put a 50 inch TV on my desk. Surround at least gave me peripheral vision, but I struggle to focus on all of a 24" screen, let alone one with a diagonal twice that. Seeing as today's monitors are fairly close to the "retina" standard of PPD, it doesn't look like we really need 4x as many pixels in the same space, more like +40%. This TV standard is likely going to stay in big TVs for quite a while, so the 30fps limit isn't going to be that big of a problem.
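That "+40%" figure roughly checks out with a pixels-per-degree estimate. Below is a sketch (assumptions: a 24" 16:9 1080p panel viewed from about 24 inches, and "retina" taken as ~60 pixels per degree, i.e. one pixel per arcminute):

```python
import math

def pixels_per_degree(h_pixels, diag_inches, aspect=(16, 9), view_dist_inches=24.0):
    """Approximate horizontal angular resolution at the screen center."""
    # Physical width from the diagonal and aspect ratio.
    w_ratio, h_ratio = aspect
    width = diag_inches * w_ratio / math.hypot(w_ratio, h_ratio)
    # Horizontal field of view in degrees, then pixels per degree.
    fov = 2 * math.degrees(math.atan(width / 2 / view_dist_inches))
    return h_pixels / fov

# A 24" 1920x1080 monitor viewed from ~24" away lands around 40 px/deg.
ppd = pixels_per_degree(1920, 24)

# Fraction more pixels (per axis) needed to reach ~60 px/deg:
increase = 60 / ppd - 1   # roughly +0.4 to +0.5
```

So per axis you'd want on the order of 40-50% more pixels, not the 2x per axis (4x total) that 4K-in-the-same-size implies.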
Hey, hey, some of us need an HDMI port for audio
DP can carry audio too. Unfortunately, a DP cable won't fit in your receiver's HDMI ports.
You asked "Who actually plays with Vsync disabled??" And I told you: people with 120Hz monitors who maintain more than 60fps without going past 120fps. Such people (and I am one of them) do not NEED vsync because there's no screen tearing problem to be solved... it's not too hard to get one's head around.
People don't get their head around it because it's factually wrong. You'll get frame tearing above 120fps, and you'll get frame tearing below 120fps. The monitor will display whatever gets haphazardly thrown into the framebuffer regardless of the framerate. Granted, the lower the framerate, the better the chance of fewer overlapping frames in the framebuffer, but the problem still exists.
The tearing has less time to be displayed on a 120Hz monitor, so the jumps between frames are much less noticeable. That's the reason for the myth that there is no tearing on 120Hz displays.
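The "tearing happens at any framerate" point can be sketched with a toy model (an assumption-laden simplification: frames complete at exactly 1/fps, scanout is uniform, and every buffer swap that lands mid-scanout is counted as a tear; real GPU frame times jitter):

```python
def tear_count(fps, refresh_hz, refreshes=120):
    """Count buffer swaps that land while the display is mid-scanout.
    Toy model: fixed frame interval, uniform top-to-bottom scanout."""
    frame_interval = 1.0 / fps
    scan_time = 1.0 / refresh_hz
    tears = 0
    next_frame = frame_interval
    for r in range(refreshes):
        start = r * scan_time
        # Every frame completing during this refresh swaps the buffer
        # mid-frame, producing a tear line somewhere on screen.
        while next_frame < start + scan_time:
            tears += 1
            next_frame += frame_interval
    return tears

# 200 fps on a 120 Hz monitor: most refreshes contain at least one swap.
high = tear_count(200, 120)
# 90 fps on the same monitor still tears -- fewer swaps, but they
# still land mid-scanout.
low = tear_count(90, 120)
```

The model agrees with both posts above: capping fps below the refresh rate reduces how many tears occur per second, but it doesn't eliminate them; only syncing swaps to vblank does.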
I can't imagine wanting to put a 50 inch TV on my desk.
I can. Can't tell you what I'll be watching on it, though.
So all current DisplayPort 1.2 equipped video cards (like the Titan) can output 3840×2160 @ 60Hz, correct?
So it makes me wonder why they don't just add a DisplayPort to displays like the Seiki. Or are the electronics in the panel not capable of handling over 30Hz?
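A rough link-budget check shows why 4K tops out at 30Hz over HDMI 1.4 but fits at 60Hz over DP 1.2 (the effective data rates below are the published figures after 8b/10b coding; the ~10% blanking overhead is an approximation for reduced-blanking timings):

```python
def video_gbps(width, height, hz, bpp=24, blanking=1.10):
    """Approximate uncompressed video bandwidth in Gbit/s.
    blanking=1.10 assumes ~10% overhead for reduced-blanking timings."""
    return width * height * hz * bpp * blanking / 1e9

four_k_30 = video_gbps(3840, 2160, 30)   # ~6.6 Gbit/s
four_k_60 = video_gbps(3840, 2160, 60)   # ~13.1 Gbit/s

HDMI_1_4 = 8.16   # Gbit/s effective (10.2 raw, 8b/10b coding)
DP_1_2 = 17.28    # Gbit/s effective (21.6 raw over 4 HBR2 lanes)

# 4K30 fits within HDMI 1.4; 4K60 exceeds it but fits within DP 1.2.
```

So regardless of the DRM question, a 30Hz-over-HDMI panel isn't bandwidth-starved by the standard at 30Hz; it's 60Hz that demands DP 1.2 (or, later, HDMI 2.0).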
DP doesn't have the DRM that the industry loves; that's why you are stuck with HDMI at 30Hz and don't have a 60Hz DP solution.
Interesting, because my video card is hooked up to my Dell U2711 over DP and the Nvidia control panel shows it as being HDCP compatible.