I am going to provide a very contrarian view.
First, while CRTs are excellent at eliminating motion blur, traditional CRTs are poor with respect to the stroboscopic effect of finite frame rates. We want to improve beyond CRT. That's why we want a strobeless/impulseless/flickerless method of eliminating motion blur.
And, Blur Busters exists because "nonsense" is not in our vocabulary. The word "nonsense" is a four letter word.
In the last 10 years, the improvements in LCD capabilities have surprised many. The problem is that a lot of innovations have not yet converged into the same display: the extreme brightness (LCD was the first to achieve 10,000 nits, though not in a low-persistence display), the perfect blacks (multithousand-element FALD LCDs), and the zero blur without compromises (good lagless scanning-backlight systems, or future 1000Hz systems).
I have visited conventions and seen LCDs that looked better than OLEDs, and vice versa. There are prototype panels that really look amazing. I have feasted my eyes on thousands of LCDs. Do not judge LCD by only looking at sub-$1000 retail displays.
Even LCD is a valid contender for 1000Hz, alongside OLED, MicroLED, etc. There are some technology bottlenecks with OLED that make LCD a little easier (in some ways) to push to 1000Hz by 2030 -- but full-color MicroLED may bypass it all, and provide superior pixel response.
Perhaps a 1000Hz OLED will be better, but a 1000Hz LCD can have a pixel response fast enough to be clearer than a 750Hz OLED. GtG does not necessarily HAVE to be perfect; it just has to (barely) keep up with the refresh rate race, and there are currently engineering paths to make LCDs faster. Some laboratory LCDs have microsecond pixel response (blue-phase LCDs), more than ten times faster than today's LCDs.
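The "keeping up" criterion above is simple arithmetic: a GtG transition keeps up with a refresh rate when it finishes within one refresh cycle. A minimal sketch (the helper names are my own illustration, not an official Blur Busters formula):

```python
# Does a panel's GtG pixel response "keep up" with a given refresh rate?
# It keeps up when the transition completes within one refresh cycle.

def refresh_interval_ms(hz: float) -> float:
    """Duration of one refresh cycle in milliseconds."""
    return 1000.0 / hz

def gtg_keeps_up(gtg_ms: float, hz: float) -> bool:
    """True if the pixel transition finishes inside one refresh cycle."""
    return gtg_ms <= refresh_interval_ms(hz)

print(gtg_keeps_up(1.0, 1000))    # True: a 1 ms GtG barely fits 1 ms cycles
print(gtg_keeps_up(0.001, 1000))  # True: microsecond blue-phase has headroom
print(gtg_keeps_up(4.0, 1000))    # False: a 4 ms GtG can't keep up at 1000 Hz
```

This is why microsecond-class laboratory panels matter for the 1000Hz race: at 1000Hz the entire refresh cycle is only 1 ms, so today's ~1 ms GtG panels are right at the limit.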
We are inclusive and keep an open mind. A mandatory Blur Busters philosophy. We exist because we keep an open mind.
Over 40 years ago, when we wore the early LCD wristwatches of the 1970s, we never dreamed LCDs could become the beautiful wall-hanging displays of today.
At Blur Busters, the word "nonsense" applied to refresh rates is considered a non-scientific luddite word when proper research and study are needed. After all, there are LCD technologies with microsecond pixel responses -- possibly in a different panel format than TN, IPS, or VA -- that already exist in the laboratory. When one has seen thousands of LCDs -- retail, laboratory, conventions, samples, and more -- one tends to gain a more open mind.
Sure, OLED and MicroLED may make it first, and I keep an open mind to that. But right now, it's 50-50 odds that LCD is the first one to reach 1000Hz in a panel-based desktop display.
AddictFPS wrote: ↑
17 May 2020, 05:20
LCD strobing adds input lag, so it is not recommended for competitive play, where players want the lowest-input-lag settings.
The strobe penalty in a good high-Hz strobed monitor is only half a refresh cycle.
1/240 sec ≈ 4.2 ms. Half of that is ≈ 2.1 ms.
The goal, if you use strobing in esports, is this: the improved human reaction time from strobing for some gaming tactics (you will react ~2 ms faster) can sometimes outweigh the strobe latency penalty.
But it depends on your game, on motion-dependent gaming tactics, and on other error margins such as microstutter interfering with blur-reduction benefits. You might not benefit from strobing when keeping a stationary gaze at the crosshairs, but benefit more from strobing when eye-tracking moving targets.
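The half-refresh-cycle arithmetic above can be sketched in a few lines (the helper name is my own, purely illustrative):

```python
# A well-implemented strobe backlight adds roughly half a refresh cycle
# of lag at screen centre.

def strobe_lag_ms(hz: float) -> float:
    """Approximate added strobe lag: half of one refresh cycle."""
    return (1000.0 / hz) / 2.0

print(round(strobe_lag_ms(240), 2))  # ~2.08 ms at 240 Hz
print(round(strobe_lag_ms(120), 2))  # ~4.17 ms at 120 Hz
```

Note how the penalty shrinks as the strobed refresh rate rises, which is why high-Hz strobing keeps the lag cost within human reaction-time error margins.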
AddictFPS wrote: ↑
17 May 2020, 05:20
Professional gamers never play leagues with strobe mode On.
prosettings.net shows that a lot of players use DyAc because of its tiny strobe penalty. Not all of them, but a significant share consider the strobe penalty small enough to keep DyAc turned on. That's why low-lag strobe backlights such as DyAc are used more often by competitive gamers: some players turn DyAc off, but many leave it on.
AddictFPS wrote: ↑
17 May 2020, 05:20
For instance, with the ViewSonic XG270, if you play non-strobed at 240FPS/Hz with VSync on, each pixel is refreshed every ~4.17 ms, and when it is refreshed you see the color start changing immediately -- no increase over the panel's default input lag. Unfortunately, no LCD can make this change near-instantly, so there is always smearing due to slow response times; some GtG transitions take longer than ~4.17 ms, which adds even more smearing than GtG completing within the refresh interval -- along with the motion blur from the LCD's sample-and-hold behavior.
To other readers wanting more information about pixel response, see Pixel Response FAQ: GtG Versus MPRT, to understand how GtG is measured.
Since some of the words written here may be misunderstood by new forum readers, here's a graph:
Humans will react to the first photons of the GtG transition long before GtG100%. GtG10% is already a dark gray in a black-to-white transition, and GtG50% is a middle gray. Sure, we prefer instant GtG, but the lag stopwatch (from a human reaction time perspective) doesn't stop at GtG100%.
The motion blur can be a bigger problem than GtG sometimes, though.
AddictFPS wrote: ↑
17 May 2020, 05:20
If the XG270 runs at 120FPS/Hz strobed (PureXP+ on, Ultra setting, ~10% duty / ~0.5 ms MPRT), the input lag comes from the wait time until the LCD finishes changing color. Internally the XG270 keeps working at 240Hz, but now when a pixel is refreshed you can't see it, because it is hidden while the backlight is off. This is exactly the cause of strobing input lag: it is not slow electronics, it is how the strobing technique works.
It's true there is strobe lag, but the strobe lag penalty is only half a refresh cycle. If you want less strobe lag, use 224Hz. It's a bit more crosstalky, but it still produces useful esports-quality reaction-time improvements for some gaming tactics, for purposes such as aim stabilizing.
Did you know BenQ's strobing implementation, DyAc, stands for Dynamic Accuracy, and Gigabyte's AORUS strobing implementation is called Aim Stabilizer? The motion-blur-reduction benefit of trying to re-aim a shaky automatic gun can outweigh the strobe lag in many games.
Reduced blur creates reaction-time improvements that outweigh the tiny ~2ms strobe lag penalty.
It is only ~2ms at screen centre, given the half-time of scanout (see High Speed Videos of Display Scanout): the time differential between the pixel refreshing in darkness and the strobe flash, as seen in High Speed Video of LightBoost.
Sure, some strobe modes (like LightBoost) have WAY more lag, because they buffer longer to do more processing (advanced strobe-optimized overdrive, etc), but that's not really necessary when you have a fast scanout and fast enough pixel response.
Some older modes such as LightBoost were much, much laggier. LightBoost does Y-axis-optimized overdrive, with a different overdrive setting per scanline (pixel row), because of the differing time differentials between the panel scanout (in the dark) and the global strobe flash (visible). This was more important when GtG was slower and you strobed closer to max Hz. But it is simpler to reduce strobe crosstalk via a large VBI (Large Vertical Totals), eliminating the need for such advanced algorithms.
AddictFPS wrote: ↑
17 May 2020, 05:20
The only way to fix this is to make a gaming monitor with a near-instant response time panel -- OLED now, and in the future MicroLED and its cheaper cousin QDEL -- and use BFI (Black Frame Insertion) to reduce motion blur. For instance, on the LG CX 48" 4K 120Hz OLED TV with BFI, the panel internally works at 240Hz like the XG270, but thanks to the very fast OLED response time, BFI (OLED Motion Pro) works differently from LCD strobing.
I agree, but the reality is that LCD will continue to be around for decades. LCDs will continue to improve and will remain a horse in the race for a long time to come, because there are amazing LCD improvements yet to come.
Also, 1000Hz can make strobing unnecessary.
Real life does not strobe to reduce motion blur.
Also, CRT impulses, LCD strobing, and OLED strobing all create stroboscopic artifacts.
To emulate real life and reduce display motion blur without strobing requires full persistence (no black periods) to simultaneously be low persistence: 1 ms persistence with no black periods in between requires filling the whole second with 1 ms frames -- 1000fps at 1000Hz -- in order to have 1 ms MPRT without strobing.
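The persistence arithmetic above is a one-line relationship: with strobeless sample-and-hold, each frame is visible for its entire frametime, so MPRT equals the frame duration. A small sketch (helper name is my own illustration):

```python
# Strobeless (sample-and-hold) displays: MPRT persistence == frametime,
# so the frame rate needed for a target MPRT is simply 1000 / MPRT(ms).

def required_fps_for_mprt(mprt_ms: float) -> float:
    """Frame rate needed so each frame is visible for only mprt_ms,
    with no black periods in between (strobeless blur reduction)."""
    return 1000.0 / mprt_ms

print(required_fps_for_mprt(1.0))  # 1000.0 -> 1000 fps at 1000 Hz
print(required_fps_for_mprt(0.5))  # 2000.0 -> 2000 fps at 2000 Hz
```

This is the core reason the refresh rate race continues: halving strobeless motion blur always means doubling the frame rate and refresh rate.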
AddictFPS wrote: ↑
17 May 2020, 05:20
When the CX refreshes a pixel, the user can see the color change immediately -- no hide time! This internal 240Hz OLED rolling scan with a 120FPS/Hz BFI signal does not add input lag by its way of working; the input lag is the same as playing 120FPS/Hz without BFI. But unfortunately the electronics of this TV add too much input lag -- 22ms with BFI on -- so it is not recommended for competitive play. That is far more than any LCD gaming monitor with strobing.
The good news is well-developed strobing implementations only add half a refresh cycle lag.
Proof: BenQ ZOWIE DyAc.
Also, ViewSonic XG270 strobing at 224Hz and 240Hz adds less than 3ms of lag at screen centre, so it's well within the error margins where improved human reaction time outweighs strobe lag. Some gaming tactics (e.g. a low-altitude high-speed helicopter flyby over camouflaged enemies, trying to find a camouflaged flying ball in Rocket League while turning your car around, or a very pan-heavy MOBA where you want to see things while panning) can produce measurable reaction-time improvements -- provided other strobe-benefit-killing weak links are fixed (e.g. a 1600dpi mouse, etc).
AddictFPS wrote: ↑
17 May 2020, 05:20
But a specifically designed OLED BFI gaming monitor with sub-1ms input-lag electronics is the way to go. Unfortunately, no panel on the market has these specs. QDEL and MicroLED are faster than OLED, but will take longer to arrive. These panels fix OLED's burn-in issue.
Agreed -- but that's idealism.
We welcome LCD to 1000Hz too. All horses are fair game in this refresh rate race.
AddictFPS wrote: ↑
17 May 2020, 05:20
Right now the market is focused on 240-360Hz LCDs, but again, input lag appears when you set 120FPS/Hz strobing; the improvement is in lower crosstalk. There should be a line of panel-manufacturing research working on OLED gaming BFI.
There are some issues with OLED, including something called the Talbot-Plateau law. Outsourced light (e.g. a backlight) is easier to make bright than a direct light source (e.g. OLED pixels). The problem is that the tiny microwires can't deliver enough current to the pixels to output enough nits.
On some tubes, a CRT phosphor dot can emit over 100,000 nits for an instant. We can't strobe OLED that bright. But the world's first 10,000-nit HDR flat panel is an LCD -- I saw Sony's 10,000-nit prototype HDR LCD. That would be delicious for strobing: a 90% persistence reduction while still delivering 1,000 nits of HDR.
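The brightness trade-off above is straightforward duty-cycle arithmetic: strobing at 10% duty cuts persistence by 90%, but average brightness also drops to 10% of peak. A minimal sketch (helper name is my own illustration):

```python
# Strobing trade-off: average brightness = peak brightness x duty cycle,
# while persistence is reduced by (1 - duty cycle).

def strobed_average_nits(peak_nits: float, duty_cycle: float) -> float:
    """Average brightness of a strobed display at a given duty cycle."""
    return peak_nits * duty_cycle

# A 10,000-nit panel strobed at 10% duty still averages 1,000 nits HDR,
# with a 90% persistence (motion blur) reduction.
print(strobed_average_nits(10_000, 0.10))  # 1000.0
```

This is why extreme peak brightness matters so much for strobing: the brighter the panel, the more persistence you can remove while keeping a usable picture.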
The plain fact is, LCD has a huge advantage in its ability to use outsourced light. LEDs can be stadium-bright, and you can focus all of that (e.g. water-cooled light sources, prisms, etc) through an LCD.
Now, that said, the great news is MicroLED is able to output a light cannon -- there are MicroLEDs capable of momentarily matching CRT brightness in lumens-per-dot. This will produce some amazing persistence reductions when the time comes.
But ultimately, by the time this comes, BFI will just be garbage. BFI is not real life. Real life doesn't strobe. Real life doesn't flicker. Just eliminate motion blur strobelessly, using a superior artifactless lagless frame rate amplification technology that looks perfect (zero soap opera effect). Flicker is a humankind band-aid. Black frame insertion is a humankind band-aid. Phosphor impulsing is a humankind band-aid. We have to impulse a display only because it's not possible to emulate analog motion yet. But we're now getting closer (via ultrahigh refresh rates). Once you've feasted your eyes on a prototype 1000fps @ 1000Hz, it's even more amazing than CRT. Blurless sample-hold. Strobeless ULMB. Lagless ULMB. Goodbye BFI, goodbye phosphor, goodbye strobing, good riddance! But it will take a long time (a decade or two) before such technology becomes mainstream.
Blur Busters is a strobing fan -- we were born because of strobing (LightBoost FTW!) -- but strobing is not the Holy Grail of humankind.
AddictFPS wrote: ↑
17 May 2020, 05:20
Research will fix burn-in and increase brightness. OLED at 120FPS/Hz with 50% BFI gets near the feel of old CRT monitors at the same FPS/Hz with VSync on, and improves on LCD strobing by adding no input lag and no crosstalk. Motion blur can be reduced further if OLEDs get brighter.
Brightness-wise, we should bypass OLED and go straight to MicroLED.
AddictFPS wrote: ↑
17 May 2020, 05:20
In my opinion this is the way to go, but panel manufacturers don't do it. Again LCD, again backlights, again non-pure blacks, again low contrast, again backlight bleeding, again slow response times. It is obvious they are pacing gaming-monitor evolution from an economic point of view. It is understandable: users always want the best tech available, but businesses want to extract as much money as possible from these tech transitions, so they will surely squeeze LCD to the limit before changing to the next panel tech.
LCD can achieve perfect blacks with multithousand-element FALD backlights. I've seen some LCDs that look better than OLEDs. Have you travelled to dozens of conventions? Have you flown over the Atlantic Ocean to visit display manufacturers? Have you flown over the Pacific Ocean to visit display manufacturers? I have done all of the above. I've feasted my eyes on thousands of displays in Blur Busters' 8 years of existence. Take my word: LCD is a horse in the blacks race too.
I can now buy a machine-manufactured ribbon of 300 LEDs on 5 meters for only $10 off Alibaba.
They already make machines that manufacture Jumbotron modules -- 32x32 RGB discrete-LED panels, the raw screen material of Jumbotrons -- which now cost about ~$200 per square meter in China (thanks to assembly-line manufacture of 32x32 and 64x64 RGB LED modules).
Shortly, they will make machines that produce ultra-cheap FALD MicroLED arrays for LCDs. I believe that by 2025, the average $400-$500 gaming monitor (in 2020 dollars, before inflation) will commonly include a cheap FALD MicroLED sheet. Technology won't be cheap enough to make full 1080p MicroLED displays, but it will soon be cheap enough to make 2000-LED FALD sheets for only a few dollars apiece by the mid-2020s.
This finally brings perfect-looking LCD blacks (without the visible haloing of low FALD counts) to inexpensive $500 gaming monitors, unlike the extra-expensive home theater displays that used to be the sole holders of FALD technology. Sure, a few below-the-noisefloor photons leak through, but that's noisefloor stuff, like OLED banding (a common complaint), so OLED-vs-LCD will have well-balanced pros and cons for a long time to come.
This is only one example of a breakthrough technology. I have seen multiple other breakthroughs that guarantee LCD stays alive for another 20, even 30 years, so I'm mic-dropping the "LCD is dead" debate. Yes, LCD will die a slow death when cheap retina-resolution MicroLED arrives (superior to OLED and LCD, and eventually capable of beating CRT), but it won't be the sudden death that many claim.
AddictFPS wrote: ↑
17 May 2020, 05:20
But they take a risk: if one OLED TV manufacturer like LG launches a 120Hz BFI mode with 1ms input lag as an exclusive gaming mode, gamers who want a motion-blur-free gaming screen will buy the TV instead, to get the CRT/Plasma feel on a PC desktop. A 48" 4K 120Hz panel can be used as a native 24" FHD 120Hz display with GPU integer scaling, so it is perfectly usable for PC desktop if it is wall-mounted and the user has enough space for it -- and money.
On LG, the OLED BFI at 120Hz is 8ms of motion blur, much more motion-blurry than LCD strobing. I've not yet seen an OLED strobe as well as an LCD, unfortunately. Even the best OLED strobing I've seen -- the Oculus Rift VR headset -- is 2ms persistence, which is about 4x more motion-blurry than PureXP Ultra on the ViewSonic XG270. There is a problem/bottleneck partially caused by the Talbot-Plateau law, combined with the increased noise margin (noisy/grainy OLED blacks, dark shades, banding) that gets amplified by OLED strobing. Pushing the needle on LCD and OLED reveals different technology bottlenecks. (LCD GtG becomes an easier horse to solve in some ways... to a certain extent.)
AddictFPS wrote: ↑
17 May 2020, 05:20
If LG launches an OLED panel with extremely low input lag, it will break open the gaming market from a motion-blur-reduction point of view (no crosstalk, no input lag), and the rest of the gaming panel manufacturers will be late to that next generation. They should be thinking about doing something before this occurs, but only JOLED is in this race against LG, and so far only with a 22" 4K 60Hz non-gaming OLED monitor.
At this point, we're into idealistic speculation. The fact of the matter is that real life is sometimes disappointing, and we have to keep the door open to all technologies: LCD, OLED, MicroLED.
AddictFPS wrote: ↑
17 May 2020, 05:20
The race to a 1000Hz display with LCD is nonsense [Mod EDIT: Nope -- not nonsense]: response times can't fit into the refresh cycle, so non-strobed has smearing, and 120FPS/Hz strobed can be crosstalk-free but increases input lag, and gamers don't want more input lag, so we need to go straight to OLED.
And, "nonsense" is a verboten word on Blur Busters. Long time readers know how we mythbust this kind of stuff.
One huge problem that has hurt LCD is the "race to the bottom". Some aspects of LCD are worse than 20 years ago. People don't want to pay $5000 for an LCD that looks better than an OLED.
Back in year 2001, the IBM T221 (the world's first 4K LCD) did not have much of an IPS glow issue. But today's cheaply made LCDs have more nonuniformities, and so on.
The bottom line is that you cannot judge a book by its cover after working mainly with sub-$1000 LCDs. If you travelled the world to display conventions, display prototypes, and laboratories, you would understand better what I am talking about.
I understand why many people make assumptions on LCD limitations because we judge by the $500 monitor. But the fact is CRT-quality blacks are already achieved in LCDs. Just not at the $500 level.
Just like how Blur Busters helped prove to the world that motion blur can be eliminated on LCDs, back in the day when people thought LCD motion blur could never be fixed -- that's how Blur Busters was born. Back in the hobby days, the Blur Busters website was www.scanningbacklight.com, and I still have a very old scanning backlight FAQ from the original 2012 site.
However, a lot has changed since, and it will become much easier/cheaper to do FALD LCDs in gaming monitors in the coming years, as a stopgap before full-RGB MicroLED slowly goes cheap/commodity/retina (~10-20 years). We'll be able to enjoy great CRT blacks without blooming artifacts much sooner.
While approximately half of your post had good (albeit idealistic) points, the word "nonsense" is blasphemy to Blur Busters' existence.
MicroLED is pretty much the ultimate goal but it's going to be a long time before it's affordable/cheap.