Go Back   AnandTech Forums > Hardware and Technology > Video Cards and Graphics

Old 07-25-2013, 09:23 PM   #51
Callsign_Vega
Junior Member
 
Join Date: Jul 2013
Posts: 19
Default

For those who were curious about the XL2420TE in non-LightBoost mode, I can confirm its lowest brightness setting with zero PWM is in fact lower than on the two other monitors I have here. I have decided to keep one of the XL2420TEs for its zero-PWM feature and use it as a side monitor to my 3x1 portrait XL2720T surround setup. If you want a non-LightBoost 120-144 Hz monitor that has a wide brightness adjustment and zero PWM flicker, the XL2420TE is the one to get (well, the only one). I personally love LightBoost strobing and zero PWM on my eyes versus regular PWM strobing.

I would normally have stuck with my Dell U2413, but I found a nasty bug: if you have a surround setup at 120 Hz and a 60 Hz monitor on the side, it doesn't work properly and the "feel" of the 120 Hz setup drops down to 60 Hz. So if you have a 120 Hz surround setup and want to run a 4th screen, you must match it with a 120 Hz screen.
Callsign_Vega is offline   Reply With Quote
Old 07-25-2013, 10:08 PM   #52
Mark Rejhon
Senior Member
 
Mark Rejhon's Avatar
 
Join Date: Dec 2012
Posts: 271
Default

Quote:
Originally Posted by Callsign_Vega View Post
For those who were curious about the XL2420TE in non-LightBoost mode, I can confirm its lowest brightness setting with zero PWM is in fact lower than on the two other monitors I have here.
That's good news -- could be the redeeming quality of the XL2420TE, because I've had a few complaints that many 120Hz monitors are way too bright, even at Brightness=0%. And even if you don't get eye strain from PWM, being able to lower brightness without seeing ugly PWM motion artifacts is a good benefit for people who prefer the non-LightBoost image.

i.e. It's like a more expensive VG248QE that adds these specific perks.
Mark Rejhon is offline   Reply With Quote
Old 07-29-2013, 08:35 PM   #53
thujone
Golden Member
 
Join Date: Jun 2003
Posts: 1,114
Default

so do the ips/pls 27" 2560x1440 panels that overclock look as good as the 24" models with respect to motion blur? or are they at least close?


i've got the itch to look at something new lol
__________________
heat|ebay|email
thujone is offline   Reply With Quote
Old 07-29-2013, 10:44 PM   #54
Elfear
VC&G Moderator
 
Elfear's Avatar
 
Join Date: May 2004
Posts: 6,115
Default

Quote:
Originally Posted by thujone View Post
so do the ips/pls 27" 2560x1440 panels that overclock look as good as the 24" models with respect to motion blur? or are they at least close?


i've got the itch to look at something new lol
Look at post #35 to get an idea of the motion blur of an overclocked IPS monitor and a 24" Lightboost screen.
__________________
4770k@4.7Ghz | Maximus VI Hero | 2x290@1150/1450 | 16GB DDR3 | Custom H20
Elfear is online now   Reply With Quote
Old 07-30-2013, 12:59 PM   #55
Mark Rejhon
Senior Member
 
Mark Rejhon's Avatar
 
Join Date: Dec 2012
Posts: 271
Default

Quote:
Originally Posted by thujone View Post
so do the ips/pls 27" 2560x1440 panels that overclock look as good as the 24" models with respect to motion blur? or are they at least close?
No. The 120Hz overclock helps some, but not nearly as much as LightBoost.

60Hz -- baseline
120Hz -- 50% less motion blur (2x sharper motion)
144Hz -- 60% less motion blur (2.4x sharper motion)
120Hz LightBoost at 100% -- has 85% less motion blur (7x sharper motion)
120Hz LightBoost at 10% -- has 92% less motion blur (12x sharper motion)

The IPS overclocks resemble a regular TN 120Hz monitor with LightBoost turned off (50% less motion blur) in terms of motion blur, and actually have slightly more blur due to IPS' slower response speed. So they're closer to about 40% less motion blur. Definitely not the order-of-magnitude improvement in motion clarity that LightBoost is able to give during perfect framerate=Hz motion.

IPS / PLS overclocks do have better color quality and higher resolution, so that's another pro. You just can't get the best-of-all-worlds in terms of motion blur, resolution, and color quality. Unless you buy two monitors (a 1440p 120Hz overclockable monitor, and a LightBoost monitor!)
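The percentages above follow directly from display persistence (how long each frame stays visible: 1/refresh for sample-and-hold, strobe length for LightBoost). A back-of-the-envelope sketch -- note the LightBoost strobe lengths here are assumed round figures, not official specs:

```python
# Motion blur scales with persistence: 1/refresh for sample-and-hold LCDs,
# strobe length for strobed backlights. Strobe lengths are rough assumptions.
BASELINE_MS = 1000 / 60  # 60 Hz sample-and-hold: ~16.7 ms per frame

modes = {
    "120Hz": 1000 / 120,
    "144Hz": 1000 / 144,
    "120Hz LightBoost 100%": 2.4,  # assumed strobe length (ms)
    "120Hz LightBoost 10%": 1.4,   # assumed strobe length (ms)
}

for name, persistence_ms in modes.items():
    less_blur = 1 - persistence_ms / BASELINE_MS   # reduction vs 60 Hz
    sharper = BASELINE_MS / persistence_ms          # "x times sharper motion"
    print(f"{name}: {less_blur:.0%} less motion blur ({sharper:.1f}x sharper)")
```

With those assumed strobe lengths, the script reproduces the 50% / ~60% / ~85% / ~92% figures quoted above.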
Mark Rejhon is offline   Reply With Quote
Old 07-30-2013, 02:42 PM   #56
jackstar7
Diamond Member
 
jackstar7's Avatar
 
Join Date: Jun 2009
Posts: 4,142
Default

Hey Mark, I saw you discussing a line of Sony TVs with a LB-lite tech. Will those be added to the list? Or are you waiting for some more data?
__________________
ASUS P8Z77-WS + 3770K@4.1@1.07v + 4x4GB 1833 + Sapphire 7990 + 2x512gbM4 + X-Fi Forte
in an FT02 with an X-1250w + X270OC+XL2720T + Realforce 87U-Silent
jackstar7 is offline   Reply With Quote
Old 07-31-2013, 03:23 PM   #57
Anarchist420
Diamond Member
 
Join Date: Feb 2010
Posts: 8,239
Default

My BenQ GW2760HS does up to 100Hz at 1600x900 (within the pixel clock limit; I'm not going to try overclocking). I simply changed the monitor's native resolution to 1600x900 at 100 Hz (with ToastyX's tool) and I can play games at 100Hz at low resolutions; GPU scaling/centering still works perfectly when I use 1280x800, 1280x720, and 1024x768 or lower.
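For anyone checking whether a custom mode like that fits under a pixel-clock limit, the arithmetic is just total pixels times refresh rate. A minimal sketch -- the blanking figures here are assumed reduced-blanking-style values for illustration, not ToastyX's actual timings:

```python
# Pixel clock = horizontal total x vertical total x refresh rate.
# h_blank / v_blank defaults are assumed reduced-blanking-style values.
def pixel_clock_mhz(h_active, v_active, refresh_hz, h_blank=160, v_blank=30):
    h_total = h_active + h_blank
    v_total = v_active + v_blank
    return h_total * v_total * refresh_hz / 1e6

print(pixel_clock_mhz(1600, 900, 100))  # ~163.7 MHz for 1600x900@100Hz
print(pixel_clock_mhz(1920, 1080, 60))  # ~138.5 MHz, near the familiar 1080p RB clock
```

Both results land comfortably under a typical 165 MHz single-link DVI limit, which is consistent with the mode working without an overclock.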
Anarchist420 is offline   Reply With Quote
Old 08-02-2013, 03:17 AM   #58
Braxos
Member
 
Braxos's Avatar
 
Join Date: May 2013
Location: Germany
Posts: 77
Default

Guys, a question: since I upgraded my GPU to a 770 DC2OC, I now want to upgrade my monitor from a Samsung BX2335 to ???????????

There is the problem: I can't make up my mind between IPS/PLS and TN 120hz/144hz.

First I will need it for gaming and some web browsing.

CPU: i7 2600K (no OC at the moment, since my H100i and PSU had problems, so everything was running at stock just so I could have a running PC). I will OC it to 4.5 or 4.8 depending on temps; both are tested working speeds, but that was in winter in Germany.
MB asus p8p67 deluxe
Gpu asus 770 dc2oc may buy a second one later if needed
Ram 16gb corsair 1600
Samsung 830 256gb
H100i


Since the monitor I use is a 1080p @60hz, I wanted to feel like I did 10 years ago in the CRT days. But high resolution AND high Hz can't be had together without sacrificing 1k euro for 1440p@85hz, and even then the input lag is a killer for games. The past week I've been using the PC on my TV (46ES8090) but couldn't play much.

So now I was thinking about those 2 monitors.

Both are Asus: one TN with 144hz (VG278HE), the other a PLS (PB278Q). I know they're different kinds of displays, but on one side you have a high hz/fps panel that can take your gaming experience (BF3 / BF4 pre-ordered / CS:GO, all casual) to a different level, and on the other side, at 27 inch, the extra 1.6M pixels would give you much better picture quality, and it can do 76hz. But the concern is that the low pixel density at 27 inch might be noticeable on the VG278HE, while on the other side we have the lag/blur concern in fast games on the PB278Q.

So what do you think, guys?


P.S. I am not a patient man, so after a 3rd return I go berserk. In the past month I had four AX860i's until I bought an AX1200i, since every AX860i had a problem: one had fan noise, three were reporting as an AX760i in Link 2. After paying 80 euros in delivery fees (I was ordering the new one before returning the other), I made it to four because after the first two I was on vacation for a month. So please no China/Korea models, since flashlighting is, in my opinion, something that really affects you, and of course don't forget transport damage. So returns within 1-2 days only, i.e. amazon.de or domestic e-shops.
Braxos is offline   Reply With Quote
Old 08-02-2013, 06:59 AM   #59
Gryz
Senior Member
 
Gryz's Avatar
 
Join Date: Aug 2010
Posts: 399
Default

Obviously this zero-motion-blur has attracted quite a bit of attention. And users seem very positive about the improvement. However, it is still an unofficial feature. Everything has a strong do-it-yourself smell.

Do we know if manufacturers have picked up on this? Are there plans to release new monitors with the electronics and drivers and everything, that will be marketed (and supported) as zero-motion-blur monitors?
Gryz is offline   Reply With Quote
Old 08-02-2013, 08:40 PM   #60
Mark Rejhon
Senior Member
 
Mark Rejhon's Avatar
 
Join Date: Dec 2012
Posts: 271
Default

Quote:
Originally Posted by Gryz View Post
Obviously this zero-motion-blur has attracted quite a bit of attention. And users seem very positive about the improvement. However, it is still an unofficial feature. Everything has a strong do-it-yourself smell.
Yes, nVidia force-bundled motion blur elimination with 3D Vision, and it was only recently that the user community force-unbundled this LightBoost benefit away from 3D Vision. It is now finally as easy as running ToastyX Strobelight, and hitting a hotkey (Control+Alt+Plus to turn ON lightboost, Control+Alt+Minus to turn OFF lightboost).

Quote:
Originally Posted by Gryz View Post
Do we know if manufacturers have picked up on this? Are there plans to release new monitors with the electronics and drivers and everything, that will be marketed (and supported) as zero-motion-blur monitors?
There's been quite some interest behind the scenes already; Sony has already figured it out as well, while a few other manufacturers (e.g. Eizo) are experimenting with it. It may take time since R&D can take years, and nVidia has a huge market lead. The LightBoost "hack" was discovered only 8 months ago, and easy LightBoost (at the touch of a keypress) is less than 2 months old. It is rapidly becoming easier and easier (without nVidia's further involvement). Give it time.

Momentum is increasing. Valve Software (Michael Abrash: blog about motion blur in VR) and id Software (John Carmack: QuakeCon keynote comments on motion blur) will help increase the pressure for more options to shorten display persistence on the market. (Michael Abrash also complimented me here in this comment.) I would expect more movers and shakers to make noise, too.

Quote:
Originally Posted by jackstar7 View Post
Hey Mark, I saw you discussing a line of Sony TVs with a LB-lite tech. Will those be added to the list? Or are you waiting for some more data?
Alas, they are only 60Hz. Yes, that's the new Sony Motionflow Impulse technology, the Sony strobe backlight similar to LightBoost, and it works in Game Mode (good for console and HTPC gaming). It flickers a lot, though, since it's only a 60Hz strobe. But it's the world's first low-lag Motionflow mode, and that's a godsend for console/HTPC gamers who can play at 60fps.

Supported Sony HDTVs with this "LightBoost"-style Motionflow Impulse mode: HX920 Series, HX923 Series, HX925 Series, HX929 Series, XBR-55HX950, XBR-65HX950, KDL-47W802A (Budget), KDL-55W802A (Budget), KDL-55W900A, W905A Series, XBR-55X900A (4K Ultra), XBR-65X900A (4K Ultra)

I've added these to the bottom of the OP, because 120Hz users are highly interested in motion clarity and will be interested to discover Sony's equivalent of LightBoost.

Last edited by Mark Rejhon; 08-02-2013 at 09:06 PM.
Mark Rejhon is offline   Reply With Quote
Old 08-02-2013, 09:02 PM   #61
Mark Rejhon
Senior Member
 
Mark Rejhon's Avatar
 
Join Date: Dec 2012
Posts: 271
Default

Quote:
Originally Posted by Braxos View Post
I wanted to feel like I did 10 years ago in the CRT days.
Your "CRT" and "Geforce 770" mention is screaming for 120Hz.
I make three recommendations:
(1) ASUS VG278H (best for color inside LightBoost, after calibration)
(2) QNIX QX2710 (2560x1440 120Hz PLS overclock for only $300)
(3) Both of the above. Find room on your desk for both.

You said CRT. So I'd lean towards the ASUS VG278H if you want to relive your CRT days. Just see the testimonials from former CRT users who happily replaced Sony W900's with LightBoost.
Mark Rejhon is offline   Reply With Quote
Old 08-04-2013, 02:54 PM   #62
BoFox
Senior Member
 
BoFox's Avatar
 
Join Date: May 2008
Posts: 689
Default

Quote:
Originally Posted by Mark Rejhon View Post
Your "CRT" and "Geforce 770" mention is screaming for 120Hz.
I make three recommendations:
(1) ASUS VG278H (best for color inside LightBoost, after calibration)
(2) QNIX QX2710 (2560x1440 120Hz PLS overclock for only $300)
(3) Both of the above. Find room on your desk for both.

You said CRT. So I'd lean towards the ASUS VG278H if you want to relive your CRT days. Just see the testimonials from former CRT users who happily replaced Sony W900's with LightBoost.
Wow, awesome!! Awesome thread, BTW!

My two FW900's are approaching/exceeding 10 years of age at this moment.. the heat that they give off during the summer.. ARGH!!
__________________
Man, zillions of years!

What is this thing right now?
Tell me, just what is this thing right now?
BoFox is offline   Reply With Quote
Old 08-05-2013, 12:29 AM   #63
hans030390
Diamond Member
 
hans030390's Avatar
 
Join Date: Feb 2005
Posts: 7,318
Default

Do keep in mind that Lightboost's benefits are, IMO, negated by judder if you can't match the refresh rate you have the monitor set at (100, 110, or 120Hz). The clarity benefit is still there, but all motion becomes very jittery and frankly uncomfortable to look at. You might be surprised how many games have difficulties running at 100+ fps, even with somewhat lowered settings. It's also important to note that some games cap you at 60fps, so you might need to dig into the ini files to adjust some settings. This is true for every UE3 game I've played.

But when you can match your refresh rate...Incredible.
__________________
"What the heck are you gonna do if you're on a picnic and have an ice cream, and the ants crawl on the ice cream? What are you gonna do? You're gonna eat the ants, because it's made out of protein. For your health!" - Dr. Steve Brule
hans030390 is offline   Reply With Quote
Old 08-05-2013, 08:22 AM   #64
Mark Rejhon
Senior Member
 
Mark Rejhon's Avatar
 
Join Date: Dec 2012
Posts: 271
Default

Quote:
Originally Posted by hans030390 View Post
Do keep in mind that Lightboost's benefits are, IMO, negated by judder if you can't match the refresh rate you have the monitor set at (100, 110, or 120Hz).
This can be very true for both CRT's and LightBoost.
Motion blur can hide microstutters. Tiny stutters/judder are much easier to see on CRT's and LightBoost, due to the complete elimination of perceptible motion blur.

Quote:
But when you can match your refresh rate...Incredible.
And that can be true too. LightBoost sings the best when you run at framerate=Hz. If you've never seen framerate=Hz with LightBoost, then load www.testufo.com/#test=photo in the Chrome browser with LightBoost enabled (if it does not play smoothly, check that everything in chrome://gpu is enabled) -- you will instantly see the benefits of LightBoost on that page. Zero motion blur. Fast pans as perfectly sharp as a stationary image, just like a CRT. You get this motion quality in games (during turns/strafes/pans/zoom-bys) only if you have enough GPU to pull off framerate=Hz.

The new ToastyX Strobelight utility makes it easy to turn on/off LightBoost via a hotkey (Ctrl+Alt+Plus and Ctrl+Alt+Minus).

TIP: Where possible, try using Adaptive VSYNC with LightBoost, because it will help achieve the framerate=Hz operation that LightBoost looks most beautiful at. Turn VSYNC OFF in your games, but in your nVidia drivers, enable Adaptive VSYNC. This will let you aim for framerate=Hz operation without the input lag of VSYNC ON. Then tweak the game settings until the game doesn't drop below framerate=Hz too often (e.g. use FXAA instead of FSAA). Upgrade your GPU if necessary.

Last edited by Mark Rejhon; 08-05-2013 at 08:35 AM.
Mark Rejhon is offline   Reply With Quote
Old 08-05-2013, 12:05 PM   #65
Aikouka
Lifer
 
Aikouka's Avatar
 
Join Date: Nov 2001
Location: Alabama
Posts: 24,304
Default

Quote:
Originally Posted by Mark Rejhon View Post
The new ToastyX Strobelight utility makes it easy to turn on/off LightBoost via a hotkey (Ctrl+Alt+Plus and Ctrl+Alt+Minus).
I have a VG278H, which I recall being an "easier" monitor to tinker with. So, would this software still be recommended for an even easier time? I've been tempted to try this out, but I just never got around to it.
Aikouka is online now   Reply With Quote
Old 08-06-2013, 12:59 AM   #66
hans030390
Diamond Member
 
hans030390's Avatar
 
Join Date: Feb 2005
Posts: 7,318
Default

Quote:
Originally Posted by Mark Rejhon View Post
TIP: Where possible, try using Adaptive VSYNC with LightBoost, because it will help do the job of doing framerate=Hz that LightBoost looks most beautiful at. Turn VSYNC OFF in your games, but in your nVidia drivers, enable Adaptive VSYNC. This will ensure you aim for framerate=Hz operation, without the input lag of VSYNC ON. Then tweak the game settings until the games doesn't have too many drops below framerate=Hz operation (e.g. use FXAA instead of FSAA). Upgrade your GPU if necessary.
But I can't stand tearing, and I've noticed it even when the frame rate is below the refresh rate. I use D3DOverrider with v-sync and triple buffering. It works for most games, and I've used fraps before to verify that games don't lock the framerate at lower divisions of the refresh rate. But, on the other hand, I've heard that D3DO really isn't a good tool to use. I dunno...maybe my setup isn't optimal.

I also have never been able to notice a difference in input lag with v-sync on, which I find odd...
__________________
"What the heck are you gonna do if you're on a picnic and have an ice cream, and the ants crawl on the ice cream? What are you gonna do? You're gonna eat the ants, because it's made out of protein. For your health!" - Dr. Steve Brule
hans030390 is offline   Reply With Quote
Old 08-06-2013, 09:03 PM   #67
Mark Rejhon
Senior Member
 
Mark Rejhon's Avatar
 
Join Date: Dec 2012
Posts: 271
Default

Quote:
Originally Posted by hans030390 View Post
But I can't stand tearing
Then use VSYNC ON. LightBoost looks pretty with VSYNC ON too, if you can sustain it without many frame drops, especially with the triple-buffering tweaks. The input lag of VSYNC ON is mainly relevant for online gaming, since the person with lower input lag will likely react a few milliseconds before you do (e.g. shoot you before you shoot them). Lower lag gives a competitive advantage even when you can't feel it.

That said, LightBoost motion quality does look really good with VSYNC ON (e.g. VSYNC ON motion looks like www.testufo.com/#test=photo -- that is how smooth turning/strafing actually can become with zero stutters on a powerful GPU in many FPS shooters when you enable VSYNC ON). But VSYNC ON, as we know, isn't always good for competitive gaming.
Mark Rejhon is offline   Reply With Quote
Old 08-07-2013, 03:26 AM   #68
hans030390
Diamond Member
 
hans030390's Avatar
 
Join Date: Feb 2005
Posts: 7,318
Default

I guess I will continue on with how I have it setup, then! Though I will have to experiment with adaptive v-sync, because I honestly haven't tried it yet. I started using v-sync + TB with D3DO long before Nvidia released that functionality and have stuck with it because it works so well, IME, if you have the rig for it.

I've always felt that network lag is generally so high compared to input lag with v-sync on that a tiny bit of input lag won't really make much of a difference in the end (assuming you have a powerful rig that can run near the 120Hz refresh rate most of the time). I've tried it both on (with TB) and off and never noticed a difference or a competitive advantage. Perhaps that's not something I'm sensitive to, unlike micro-stuttering, motion blur, refresh rate, smooth motion, etc. Those things I will pick up on in a heartbeat.

Plus with my new apartment and internet service, I've noticed a dramatic drop in network lag in several games (ex: League of Legends now averages around 50-60ms rather than 80-100ms, the latter of which is fairly common, IME). That might be more due to proximity to servers in the region, though. Either way, for those games, I'm sure that does more for a competitive advantage than disabling v-sync will for me.

Other than that, none of the other games I play require immediate input response.

Also, at the big LAN wars/parties I've been to (one was nearly 500 people, others were ~200), the vast majority of players don't have rigs nearly as powerful as mine. Don't have 120Hz monitors. Don't even really know what v-sync is. I'm not sure if this is indicative of the general online PC gaming population or not, but even with v-sync, I think my setup already gives me more of a competitive edge compared to the average online gamer. I know it is NOT indicative of players you'd see on hardware forums such as AT. I'm also not a super hardcore competitive online gamer, so I don't get matched up too often with those really good players. My skill in most games holds me back more than input lag ever will.

Once I start getting any sort of tearing or less than smooth gameplay, that kills my competitive advantage...mostly because it annoys me and starts to hurt my head after a while. Tearing to me invokes nearly the same feeling as stuttering, low FPS, etc.

BTW, I don't play LoL anymore, but it's a perfect game for Lightboost. Really easy to push beyond 120FPS, and the additional smoothness and lack of motion blur really helps, especially during crazy battles. Also just tried Torchlight 2 with it last night...also a great game for Lightboost.

On a slightly different topic, while I love the ToastyX Strobelight tool, I hate having to always change my ICC profiles when I switch back and forth for games that don't have the FPS for Lightboost. The VG248QE needs them, and the settings are quite different with LB on/off.
__________________
"What the heck are you gonna do if you're on a picnic and have an ice cream, and the ants crawl on the ice cream? What are you gonna do? You're gonna eat the ants, because it's made out of protein. For your health!" - Dr. Steve Brule

Last edited by hans030390; 08-07-2013 at 03:46 AM.
hans030390 is offline   Reply With Quote
Old 08-07-2013, 11:21 AM   #69
Mark Rejhon
Senior Member
 
Mark Rejhon's Avatar
 
Join Date: Dec 2012
Posts: 271
Default

Quote:
Originally Posted by hans030390 View Post
I've always felt that network lag is generally so high compared to input lag with v-sync on that a tiny bit of input lag
This is generally true in the common leagues, but it can become an issue in pro leagues, where so many high-end gamers are equally matched in reaction times that lower input lag means more opportunity to shoot first before the enemy does. Input lag adds to your reaction time: e.g. 15ms of input lag increases your effective reaction time by an additional 15ms.

So if two high-end professional gamers shoot at each other at the same time, the one with less input lag will win that frag more often -- even if lower input lag only wins you an extra 2 frags out of 100, it can win you the game -- like winning a sprint race between equally matched racers by mere milliseconds.

Game styles have often varied between LCD users and CRT users over the history of FPS gaming. A pro gamer explained to me that some tactics don't need motion blur improvements -- e.g. staring at the crosshairs and strafing linearly in big areas, letting far-away enemies slowly pass the crosshairs before doing a precise one-shot frag. This technique is more motion-blur-independent than close-distance circle strafing (a tactic more common from the CRT days), which creates a lot more motion blur and is more challenging to pull off on a non-LightBoost LCD than on a CRT.


Quote:
Once I start getting any sort of tearing or less than smooth gameplay, that kills my competitive advantage...mostly because it annoys me and starts to hurt my head after a while. Tearing to me invokes nearly the same feeling as stuttering, low FPS, etc.
For perfect LightBoost zero motion blur, the perfection of VSYNC ON at framerate=Hz (zero slowdowns) can never be beat, on CRTs or LightBoost. If input lag isn't holding you back but motion blur is, then that's the ideal setup if you don't like the bulk of a CRT.

Adaptive VSYNC looks exactly like VSYNC ON (zero tearing) whenever framerates are high (e.g. 120fps@120Hz), because Adaptive VSYNC is simply the graphics driver intelligently timing VSYNC OFF frames to steer the tearline just off the edge of the screen. No extra buffer layers are needed, since it's simply VSYNC OFF with intelligent driver steering of the tearline. However, tearing will show up whenever the framerate drops below Hz, because the tearline won't be offscreen at those times. Adaptive VSYNC is a good compromise for people who love VSYNC ON motion fluidity but hate VSYNC ON input lag, and whose graphics cards can keep 120fps@120Hz most of the time (e.g. 90%+ of the time). It eliminates the sudden halving of framerate that occurs with VSYNC ON, which can create sudden increases in input lag that throw off competitive players. So it's a great compromise setting for LightBoost users.

Adaptive VSYNC is like an input-lag-reduced VSYNC ON that simply behaves differently when the framerate slows. What happens is that busy moments (lots of shooting, explosions, and enemies) can cause framerate drops that make the tearing show up again; but from 120fps you'd get brief moments of "95fps + tearing" instead of brief moments of "sudden halving to 60fps + sudden extra input lag". So VSYNC ON versus Adaptive VSYNC is a "pick your poison". Some games such as Team Fortress 2 can run at a perfectly sustained 120fps@120Hz at all times if you've got a fast system, so VSYNC ON and Adaptive VSYNC would look indistinguishable (except that Adaptive VSYNC often has slightly less input lag due to the elimination of an extra buffer layer, since you're still turning VSYNC OFF in the game while enabling Adaptive in the graphics driver to steer the tearline off-screen).

I should write a Blur Busters VSYNC article sometime, because of the contradiction that LightBoost benefits VSYNC ON (dramatic LightBoost improvement) far more than VSYNC OFF (better competitive input lag), making the little-known Adaptive VSYNC technique a far more useful compromise than some people realize.
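The "pick your poison" trade-off described above can be expressed as a toy decision rule. This is a deliberately simplified model for illustration, not nVidia's actual driver implementation:

```python
# Toy model of Adaptive VSYNC: synchronize when the frame was rendered in
# time, and flip immediately (accepting a brief tear) when it wasn't --
# instead of waiting a whole refresh like plain VSYNC ON would, which is
# what causes the sudden halving to 60fps and the input-lag spike.
REFRESH_MS = 1000 / 120  # 120 Hz refresh interval, ~8.33 ms

def present_mode(render_time_ms):
    if render_time_ms <= REFRESH_MS:
        return "synced"     # behaves like VSYNC ON: no tearing
    return "immediate"      # behaves like VSYNC OFF: brief tearing, no 60fps drop

print(present_mode(7.0))   # fast frame (~143 fps capable)
print(present_mode(10.5))  # slow frame (~95 fps)
```

In this model, a 10.5 ms frame produces "95fps + tearing" rather than stalling to the next refresh, matching the behaviour described in the post.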

Last edited by Mark Rejhon; 08-07-2013 at 11:33 AM.
Mark Rejhon is offline   Reply With Quote
Old 08-10-2013, 03:24 AM   #70
Shamrock
Senior Member
 
Join Date: Oct 1999
Posts: 830
Default

Quote:
Originally Posted by Mark Rejhon View Post
Your "CRT" and "Geforce 770" mention is screaming for 120Hz.
I make three recommendations:
(1) ASUS VG278H (best for color inside LightBoost, after calibration)
(2) QNIX QX2710 (2560x1440 120Hz PLS overclock for only $300)
(3) Both of the above. Find room on your desk for both.

You said CRT. So I'd lean towards the ASUS VG278H if you want to relive your CRT days. Just see the testimonials from former CRT users who happily replaced Sony W900's with LightBoost.
You recommend the QNIX, but won't it have motion blur, being an 8ms-response monitor?
__________________
I Protest that These forums get some kind of St. Patrick's Day Icon available!
Shamrock is offline   Reply With Quote
Old 08-10-2013, 11:48 PM   #71
Callsign_Vega
Junior Member
 
Join Date: Jul 2013
Posts: 19
Default

All 2560x1440 monitors have bad motion blur.
Callsign_Vega is offline   Reply With Quote
Old 08-13-2013, 10:25 PM   #72
thujone
Golden Member
 
Join Date: Jun 2003
Posts: 1,114
Default

still going back and forth constantly between a VG248QE and a qx2710. interestingly enough... you can get the qx2710 on newegg through a reseller in s. korea now. it's still $100+ more than what it will cost you on ebay though.


honestly for the current newegg price the VG248QE is pretty hard to pass up. the other thing is it will fit on my desk whereas the 27" would probably have to be wall mounted.
__________________
heat|ebay|email
thujone is offline   Reply With Quote
Old 08-13-2013, 11:20 PM   #73
Mark Rejhon
Senior Member
 
Mark Rejhon's Avatar
 
Join Date: Dec 2012
Posts: 271
Default

Quote:
Originally Posted by Shamrock View Post
You recommend the QNIX, but won't it have motion blur, being an 8ms-response monitor?
8ms 120Hz can still have less motion blur than 2ms 60Hz.
(If you're talking about pixel transition speed, rather than strobe length.)

The LCD pixel transition speed does not dictate the whole motion blur story; the time each frame stays visible is now the dominant determinant of motion blur on modern LCDs (the sample-and-hold effect). That's why LightBoost's strobing helps so much, and why a CRT at 60fps@60Hz has less motion blur than an LCD at 120fps@120Hz. All 60Hz LCDs effectively have 16.7ms of sample-and-hold motion blur, and all 120Hz LCDs effectively have 8.3ms.



The LightBoost monitor (at optimized settings) has 6x less motion blur than the QNIX QX2710; e.g. in situations where the motion blur trail was about 6 pixels long, the blur becomes only 1 pixel long for the same motion. See Photos: 60Hz vs 120Hz vs LightBoost.

So, if you hate motion blur more than losing resolution, definitely get LightBoost over the QNIX. Alternatively, if you prefer better colors and higher resolution, get the QNIX, but you won't get perfectly sharp fast pans. Fast pans such as www.testufo.com/#test=photo viewed in the Chrome browser won't be crystal-sharp except on CRT and LightBoost displays.
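The 6-pixels-versus-1-pixel comparison above is just pan speed multiplied by persistence, per the sample-and-hold model. A quick sketch -- the pan speed and strobe length used here are illustrative assumptions:

```python
# Blur trail length (px) ~= pan speed (px/s) x persistence (s),
# following the sample-and-hold model described in the post.
def blur_trail_px(speed_px_per_s, persistence_ms):
    return speed_px_per_s * persistence_ms / 1000

PAN = 960  # px/s, an assumed fast-pan speed
print(blur_trail_px(PAN, 1000 / 120))  # ~8 px trail on a 120Hz sample-and-hold LCD
print(blur_trail_px(PAN, 1.4))         # ~1.3 px trail with a short strobe
```

At that pan speed the 120Hz sample-and-hold trail is about 6x longer than the short-strobe trail, consistent with the "6 pixels down to 1 pixel" comparison above.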

Last edited by Mark Rejhon; 08-13-2013 at 11:25 PM.
Mark Rejhon is offline   Reply With Quote
Old 08-19-2013, 08:53 AM   #74
bwanaaa
Senior Member
 
Join Date: Dec 2002
Posts: 677
Default

Are you the same mark rejhon who wrote theatertek? You sound as knowledgeable and helpful.
__________________
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
sorry i dont have references for this basic knowledge at my fingertips, but i did read it somewhere and dont remember the source.
bwanaaa is offline   Reply With Quote
Old 08-19-2013, 11:18 AM   #75
Mark Rejhon
Senior Member
 
Mark Rejhon's Avatar
 
Join Date: Dec 2012
Posts: 271
Default

Quote:
Originally Posted by bwanaaa View Post
Are you the same mark rejhon who wrote theatertek? You sound as knowledgeable and helpful.
Yes, I used to work in the home theater industry. I worked with the Holo3DGraph, the TAW ROCK (2003 Editor's Choice in Stereophile Guide To Home Theater), and KeyDigital.com's LEEZA, and yes, I had a contract with Andrew of TheaterTek Inc. too (he's the one who created the TheaterTek player).

I used to own an NEC XG135LC CRT projector, which supported 1080p@60Hz back in 1999 (and probably 120Hz too; no graphics card I had could output that). Imagine a wall-sized Sony GDM-W900 CRT.

Last edited by Mark Rejhon; 08-19-2013 at 11:20 AM.
Mark Rejhon is offline   Reply With Quote
Powered by vBulletin® Version 3.8.7
Copyright ©2000 - 2014, vBulletin Solutions, Inc.