VOGONS


First post, by Ozzuneoj

Rank: l33t

I'm creating this thread to address some posts that were made in another thread recently regarding LCD/LED motion blur. Regardless of a monitor's specs, significant blurring is an unavoidable fact of the technology behind any kind of display that keeps the image on the screen between refreshes (this is known as "sample and hold"). Your eyes will cause the image to blur on ANY display that does not blank the image between frames (LCD, LED, OLED, any of them...). The response time is not the problem... in fact, it isn't even the screen doing the blurring. It is the sample-and-hold method of displaying the image that causes your eyes to see blurring. This is why things like LightBoost, BenQ Blur Reduction and Nvidia's ULMB (Ultra Low Motion Blur, which is available on G-Sync monitors but cannot be used at the same time as G-Sync) exist on more expensive monitors.

The only way to eliminate this problem is to either have a 600Hz+ refresh rate (and frame rate) with less than 1ms response time, or to blank the screen between frames. The latter is obviously far easier to achieve, but it isn't perfect and has many downsides. At this point in time, it is the best option available aside from using a CRT, which does this naturally (since it only draws the image briefly before blanking out) without most of the downsides of forcing a sample-and-hold display to do it. It's amazing just how fast our eyes expect things to be moving... you actually have to trick them to not see blurring on a sub-600Hz sample-and-hold display!
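To put rough numbers on this, here's a quick back-of-the-envelope sketch (my own illustration in Python; the tracking speed and persistence values are assumed for the example, not measured from any particular monitor):

```python
# Back-of-the-envelope: while your eye tracks a moving object, a frame
# that stays lit for `persistence_ms` gets smeared across the retina by
# (tracking speed) x (hold time). All numbers below are illustrative.

def blur_px(speed_px_per_s: float, persistence_ms: float) -> float:
    """Perceived smear in pixels for an eye-tracked moving object."""
    return speed_px_per_s * persistence_ms / 1000.0

speed = 960.0  # e.g. an object crossing a 1920px screen in two seconds

for label, persistence_ms in [
    ("60Hz sample-and-hold", 16.7),  # frame held for the whole refresh
    ("120Hz sample-and-hold", 8.3),
    ("strobed backlight", 2.0),      # typical blur-reduction pulse width
    ("CRT phosphor", 1.0),           # rough figure, see the reply below
]:
    print(f"{label:>22}: ~{blur_px(speed, persistence_ms):4.1f} px of smear")
```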

Look at the tests on this site for some really cool examples:
http://www.testufo.com/#test=eyetracking&pattern=stars
Focus on the stationary UFO... you see sharp lines around it (though probably somewhat blurry if you are using a slower display technology like an IPS, PVA or VA screen). Now if you look at the moving UFO, the lines are massively blurry and the entire image changes. To me, the entire area with lines is nearly solid grey...

Since the vast majority of "very old games" would have been played on a CRT of some kind (either a TV or a monitor), it goes without saying that the experience on an LCD (without blur reduction) will be different from what you remember. Imagine things like detailed background textures/images in games scrolling by quickly. Brick walls, tile floors, etc. Rather than seeing the image as it was intended on a CRT, you will at best see a blurred version, and if the details are fine and broken up by large portions of flat color, you may not see the texture at all until the motion stops.

The slightly frustrating thing about it is that unless you have a CRT or a blur-reducing LCD of some kind, you have no way to see what a monitor that really "passes" the UFO test looks like, so it can be hard to picture the difference if it's been a long time since you last used a CRT.

Personally, I own an old high-end CRT (HP P1230... a rebranded Mitsubishi Diamond Pro 2070SB) which has been getting flaky for several years, so I finally broke down and got a refurbished BenQ XL2720Z for my main system (the CRT will be used exclusively for older systems). This particular BenQ is known for having the best blur-reduction features available while having decent color reproduction and viewing angles (for a TN LCD/LED screen). The best part is that it can handle blur reduction at 60Hz, which means my old systems can have CRT-like motion at their native refresh rates. Sure, I'd love an IPS screen or the option to go higher than 1080p, but unless someone starts producing CRTs again, we're stuck with the "superior" but inferior technologies that are currently available. I could have spent $700 on a G-Sync IPS display that would still have motion blur... and I likely would have wanted to use my 11 year old CRT instead for many things. 😊

On the above tests, I see the same thing as on any other LCD/LED/OLED screen when Blur Reduction is disabled. When I turn on Blur Reduction (or use my CRT), the grey lines stay sharp, no matter what I look at on the screen.

I highly recommend that everyone interested in the topic spend some time at this site:
http://www.blurbusters.com/faq/lcd-motion-artifacts/

Now for some blitting from the back buffer.

Reply 1 of 23, by Deep Thought

Rank: Newbie

TL;DR:
Image persistence = motion blur
Response time = panel ghosting/streaking

Even if you have 0.0001ms response times, there will still be 16.7ms of motion blur resulting from image persistence if your display holds the image for the full duration of a frame at 60Hz.
If your display flickers like a CRT, persistence (hold time) is far lower. An average CRT might only have a persistence of 1ms.

To match an average CRT, that means you need 1000Hz (and FPS to match).
600Hz would be quite a bit worse.
But our displays are now much higher resolution, so we really need far shorter persistence than that. Something more like 0.1ms would be closer to ideal.
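To make the arithmetic explicit, here's a minimal sketch (an illustration only; it just assumes a sample-and-hold display holds each frame for its full refresh interval):

```python
# A flicker-free (sample-and-hold) display holds each frame for the whole
# refresh interval, so persistence = 1000ms / refresh rate. To reach a
# target persistence WITHOUT strobing, you need the reciprocal rate
# (and a frame rate to match).

for target_ms in (16.7, 1.0, 0.1):
    needed_hz = 1000.0 / target_ms
    print(f"{target_ms:>4} ms persistence -> {needed_hz:>5.0f} Hz sample-and-hold")

# 16.7 ms ->    60 Hz (an ordinary LCD)
#  1.0 ms ->  1000 Hz (matches the 'average CRT' figure above)
#  0.1 ms -> 10000 Hz (the 'closer to ideal' figure above)
```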
The problem with low-persistence displays is that they flicker a lot at low framerates and you lose a lot of brightness.

I do hope that one day we have displays which combine the advantages of modern flat panels with the motion handling of CRT displays.

Something I will point out is that G-Sync is not about reducing motion blur.
G-Sync is about eliminating latency, screen tearing, and stuttering. And it does a very good job of it.

I would like to see a mode that enables G-Sync to work at the same time as ULMB because then you would be able to play old games that run at a fixed framerate without V-Sync latency and without motion blur.
I doubt NVIDIA will enable this as an option, though, because people will be stupid enough to try to use it with modern 3D games, where fluctuating framerates will cause the screen to change in brightness and flicker more or less as the framerate changes, or they will complain that it flickers too much at low framerates.

Reply 2 of 23, by nforce4max

Rank: l33t

Give early LCDs a try and you will really appreciate modern LCD monitors, even when it comes to retro use. I've got one LCD monitor from around '98 or '99 and it is the pickiest monitor I have ever owned.

On a far away planet reading your posts in the year 10,191.

Reply 3 of 23, by Ozzuneoj

Rank: l33t

Thanks for the input and for the correction on the 600Hz thing. I was getting it confused with something else. I've looked it up again and 1000-2000Hz would be needed to achieve CRT-like motion clarity at high resolutions on an LCD/OLED screen.

So far the only display technology I have heard of that is available and being improved that can bring back the capabilities of a CRT is laser projection. I guess it does a pretty nice job, but I've never seen it myself, and at this point I think resolution and expense are the main issues with it. Most likely it requires some projection distance too, which brings us back to larger displays, or at least ones that need a lot of space around them.

A 2000Hz OLED screen could do it too, but unless your games are running at 2000Hz, you'll still see persistence.

And yes, variable refresh rate is probably a very cool thing to experience, but it simply can't work with low-persistence displays at this point. Considering the advantages of just having higher frame rates, I find low persistence to be the more valuable of the two technologies at this point, which is why I chose the BenQ screen rather than saving up for a G-Sync model.

To keep all of this on the subject of "old games" (since I put it in Marvin for that purpose), playing emulated 2D games on an LCD is a good way to see the shortcomings of modern displays. Just watch anything with fine, pixely details. If you have access to a CRT of any kind, just try it and you'll be amazed at the terrible loss in motion clarity we've grown accustomed to over the past 15 years.

Yes, LCDs have improved, but when we've got people splitting hairs over 1Hz tone differences in MIDI music and the specific way that 3dfx cards dithered 16-bit color graphics in Glide, you have to acknowledge that gaming on anything but a low-persistence display is a gigantic step backward from what we had years ago when our old games were new.

Now for some blitting from the back buffer.

Reply 4 of 23, by Standard Def Steve

Rank: Oldbie

What about plasma? I've noticed that plasma TVs have a very CRT-like flicker, mostly noticeable when displaying a completely white screen.

94 MHz NEC VR4300 | SGI Reality CoPro | 8MB RDRAM | Each game gets its own SSD - nooice!

Reply 5 of 23, by Deep Thought

Rank: Newbie
Ozzuneoj wrote:

So far the only display technology I have heard of that is available and being improved that can bring back the capabilities of a CRT is laser projection. I guess it does a pretty nice job, but I've never seen it myself, and at this point I think resolution and expense are the main issues with it. Most likely it requires some projection distance too, which brings us back to larger displays, or at least ones that need a lot of space around them.

The problem with laser projection is that physical deflection using mirrors is orders of magnitude slower than the electromagnetic deflection of an electron beam in a CRT.
There is also the difficulty of scanning the image in straight horizontal lines.

Ozzuneoj wrote:

A 2000Hz OLED screen could do it too, but unless your games are running at 2000Hz, you'll still see persistence.

The annoying thing is that a lot of this problem is just software.
You could build an OLED display that lets you specify the number of lines held lit, from 1 up to the panel's full vertical resolution, and that would be a way to do a minimal-latency low-persistence mode.
As you reduce the number of lines being held on-screen at once, you could drive the panel harder to compensate for the loss of brightness, since the overall power consumption will be lower.
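Here's a small sketch of that idea (an illustration only; the panel size, refresh rate, and the assumption that brightness scales linearly with drive level are all simplifications):

```python
# Rolling-scan sketch: if only `lines_held` rows are lit at any instant,
# each row is lit for that fraction of the refresh period, and the panel
# must be driven proportionally brighter to keep the same average output.
# (Assumes brightness scales linearly with drive level -- a simplification.)

def rolling_scan(lines_held: int, total_lines: int = 2160,
                 refresh_hz: float = 60.0):
    duty = lines_held / total_lines
    persistence_ms = duty * 1000.0 / refresh_hz
    drive_boost = 1.0 / duty  # drive level needed for equal brightness
    return persistence_ms, drive_boost

for held in (2160, 216, 22):  # full hold, ~10% window, ~1% window
    p_ms, boost = rolling_scan(held)
    print(f"{held:>4} lines held: {p_ms:6.2f} ms persistence, "
          f"{boost:5.1f}x drive level needed")
```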

Ozzuneoj wrote:

And yes, variable refresh rate is probably a very cool thing to experience, but it simply can't work with low-persistence displays at this point. Considering the advantages of just having higher frame rates, I find low persistence to be the more valuable of the two technologies at this point, which is why I chose the BenQ screen rather than saving up for a G-Sync model.

Depends what you play. G-Sync is very nice to have for modern games with their highly variable framerates, or old games that run at non-standard framerates that most displays won't sync to now.

Ozzuneoj wrote:

To keep all of this on the subject of "old games" (since I put it in Marvin for that purpose), playing emulated 2D games on an LCD is a good way to see the shortcomings of modern displays. Just watch anything with fine, pixely details. If you have access to a CRT of any kind, just try it and you'll be amazed at the terrible loss in motion clarity we've grown accustomed to over the past 15 years.

I agree wholeheartedly. It's really disappointing that the only way to get good motion performance in many games is to dig up an old CRT. Even the best of these low-persistence LCDs are far from ideal.

Standard Def Steve wrote:

What about plasma? I've noticed that plasma TVs have a very CRT-like flicker, mostly noticeable when displaying a completely white screen.

Plasmas do flicker, but they usually have a duty cycle of about 40-50% while CRTs are more like 5%.
So they're better than a flicker-free LCD or OLED, but noticeably worse than a CRT.

Plasmas also have some very undesirable properties due to the response time of the phosphors used, and the fact that their pixels can only be switched on or off instead of displaying intermediate intensities.
They make up their shades by driving the display at rates like 600Hz and using temporal dithering, which means that the image can break up into separate color images, similar to the problems that DLP displays had.
I find plasmas very uncomfortable to watch, but have no problem with CRT flicker.
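Putting those duty-cycle figures side by side (a rough sketch of mine; the figures are the approximate ones quoted above, and real persistence varies with picture content and model):

```python
# Rough effective persistence per 60Hz frame = duty cycle * frame time.
# Duty-cycle figures are the approximate ones quoted above.

FRAME_MS = 1000.0 / 60.0

for display, duty in [("sample-and-hold LCD/OLED", 1.00),
                      ("plasma (~45% duty)", 0.45),
                      ("CRT (~5% duty)", 0.05)]:
    print(f"{display:>25}: ~{duty * FRAME_MS:5.2f} ms lit per frame")
```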

Reply 6 of 23, by Standard Def Steve

Rank: Oldbie
Deep Thought wrote:
Standard Def Steve wrote:

What about plasma? I've noticed that plasma TVs have a very CRT-like flicker, mostly noticeable when displaying a completely white screen.

Plasmas do flicker, but they usually have a duty cycle of about 40-50% while CRTs are more like 5%.
So they're better than a flicker-free LCD or OLED, but noticeably worse than a CRT.

Plasmas also have some very undesirable properties due to the response time of the phosphors used, and the fact that their pixels can only be switched on or off instead of displaying intermediate intensities.
They make up their shades by driving the display at rates like 600Hz and using temporal dithering, which means that the image can break up into separate color images, similar to the problems that DLP displays had.
I find plasmas very uncomfortable to watch, but have no problem with CRT flicker.

OMG. I knew I was seeing some minor rainbow effect on my F8500 whenever I'd move my eyes across the screen.
Very interesting info, thanks. 😀

94 MHz NEC VR4300 | SGI Reality CoPro | 8MB RDRAM | Each game gets its own SSD - nooice!

Reply 7 of 23, by PhilsComputerLab

Rank: l33t++

Could you do a retro-type review of your BenQ XL2720Z?

Basically looking at anything that relates to old games, like 4:3 aspect ratio modes and refresh rate support. I've been looking at this screen for a long time; I think it can do 100+ FPS at 1024 x 768 over VGA, but nobody ever tests these things in reviews 😀

YouTube, Facebook, Website

Reply 8 of 23, by Ozzuneoj

Rank: l33t
PhilsComputerLab wrote:

Could you do a retro-type review of your BenQ XL2720Z?

Basically looking at anything that relates to old games, like 4:3 aspect ratio modes and refresh rate support. I've been looking at this screen for a long time; I think it can do 100+ FPS at 1024 x 768 over VGA, but nobody ever tests these things in reviews 😀

I will try to do this when I get time. 😀

I will say, I do very much appreciate the 4:3 modes. It has ones that emulate a 17" or 19" CRT... however, on that same note, I find it odd that it has a 21" screen mode for 5:4 resolutions but not for 4:3, as there is plenty of space to make a 4:3 image larger than a 19" screen. Also, in many situations it seems to disable "aspect" mode, which would otherwise give me quite a huge 4:3 screen. Coming from a 21" CRT (probably 19-20" viewable), I would appreciate having that extra bit of screen size back, especially since this monitor sits back quite a bit farther than my CRT did. Playing older games at 640x480 or lower seems to work fine aside from having to choose the smaller 19" display size. I may still be able to figure something out to fix that, though.
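On the size question, the geometry does back this up; here's a quick check of mine, using nothing but the panel's 27" diagonal and 16:9 shape:

```python
# How big can a 4:3 image be on a 27" 16:9 panel? Pure geometry.
from math import hypot

def panel_dimensions(diagonal_in: float, w: int, h: int):
    """Width and height in inches of a panel with aspect ratio w:h."""
    scale = diagonal_in / hypot(w, h)
    return w * scale, h * scale

panel_w, panel_h = panel_dimensions(27.0, 16, 9)
print(f"panel area: {panel_w:.1f}\" x {panel_h:.1f}\"")  # ~23.5" x ~13.2"

# The largest 4:3 image uses the panel's full height:
img_h = panel_h
img_w = img_h * 4 / 3
print(f"4:3 at full height: {img_w:.1f}\" x {img_h:.1f}\" "
      f"= {hypot(img_w, img_h):.1f}\" diagonal")
# ~22.1" diagonal -- comfortably bigger than the 19" emulation mode.
```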

It is pretty awesome to have little to no motion blur, no matter what input or source I'm using. As mentioned before, I don't believe there are any other monitors that can use any method of blur reduction at 60Hz (which is necessary for old games and systems). This flexibility gives the monitor a very CRT-like feel. The only problem is strobe crosstalk, which can be tinkered with using software or hidden factory menus... and it's an unavoidable problem with any screen that uses a strobing backlight to reduce image persistence. After toying with it a bit, it is easy to ignore. If I got used to having two faint black lines on my old aperture grille CRTs, I can live with some crosstalk artifacts to completely get rid of image persistence. 😀 Still, if the crosstalk artifacts bother you and you find that image persistence doesn't bother you as much at super low resolutions, you can just use a different preset with Blur Reduction disabled to get rid of the crosstalk.

Now for some blitting from the back buffer.

Reply 9 of 23, by candle_86

Rank: l33t

Is the blurring really all that tragic? I see it as free AA, which CRTs don't give you. If you can perceive the blur, you get free AA in-game.

Reply 10 of 23, by Putas

Rank: Oldbie

And the AA gets even "better" at non-native resolutions. I don't see a way to exploit that potential while staying agnostic to what is being rendered.
It is not at all tragic for 99% of people. Some will get paranoid over the delayed response; very few are fast enough to be affected.
But I agree the blurriness has to go.

Reply 11 of 23, by Scali

Rank: l33t

One advantage of LCDs over CRTs that I don't see mentioned much, if at all, is that the pixels don't move.
On CRTs, scanlines will stretch horizontally depending on the brightness of the pixels, so you will get some distortion. LCDs don't have this at all.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 12 of 23, by Jo22

Rank: l33t++

That's exactly what I was excited about when LCD monitors were the new thing in town.
Pros:
- lower power draw
- lightweight, thinner
- flicker-free at 60Hz (no more messing with refresh rates)
- "no" radiation
- less eye strain
- perfect pixels
- better for the environment (less poisonous parts inside)
- little heat generation (you could safely put an LCD near a wall)
- less RF noise
- can't be damaged by wrong video modes/demoscene stuff (no more broken flyback transformers)
- longer life span (ok, in practice things were different)
- the coolness factor

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 13 of 23, by Deep Thought

Rank: Newbie

I don't know that I'd call CRTs "blurry".
They generally have a sharper appearance than an LCD running at a non-native resolution, for example.
I'd say that CRT images are smooth because of the Gaussian beam profile, where pixels are made up of nice smooth circles, rather than hard-edged squares.
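A toy sketch of that difference (my own illustration; the Gaussian spot width is an arbitrary assumption) showing the intensity profile of one lit pixel, as a hard-edged LCD square versus a CRT-like Gaussian spot:

```python
# One bright "pixel" sampled across a 4-pixel-wide strip: an LCD draws a
# hard-edged square, a CRT beam paints a smooth Gaussian spot.
import numpy as np

x = np.linspace(-2.0, 2.0, 36)  # position in pixel widths, pixel at 0

lcd = ((x >= -0.5) & (x < 0.5)).astype(float)  # hard-edged square profile
sigma = 0.45                                   # assumed beam spot width
crt = np.exp(-x**2 / (2 * sigma**2))           # smooth Gaussian falloff

for label, profile in (("LCD", lcd), ("CRT", crt)):
    bar = "".join(" .:-=+*#"[int(v * 7.999)] for v in profile)
    print(f"{label}: |{bar}|")
```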

Scali wrote:

One advantage of LCDs over CRT that I don't see mentioned much, if at all, is that the pixels don't move.
On CRTs, scanlines will stretch horizontally depending on the brightness of the pixels. So you will get some distortion. LCDs don't have this at all.

I wouldn't necessarily call that an advantage if you're playing old games.
The scanlines and "bloom" that you get when displaying low-res games on a CRT have an "anti-aliasing" effect.

For high-res, modern games, flat panels are absolutely preferable as far as image sharpness/resolution is concerned.

Jo22 wrote:

That's exactly what I was excited about when LCD monitors were the new thing in town.
Pros:
- flicker-free at 60Hz (no more messing with refresh rates)

Being flicker-free is why these displays have so much motion blur.
Most of your other concerns are either unfounded or overblown. Power consumption was actually not a problem with most CRTs. CCFL LCDs got just as warm.

Reply 14 of 23, by SPBHM

Rank: Oldbie

The technique to reduce blur on LCDs that started with the 3D Vision monitors is really interesting, but being tied just to Nvidia kills it for me. Is this BenQ blur reduction method independent of driver support, i.e. handled entirely on the monitor side? That would be a nice thing to have, especially if it works over a VGA connection.

Reply 15 of 23, by Ozzuneoj

Rank: l33t
SPBHM wrote:

The technique to reduce blur on LCDs that started with the 3D Vision monitors is really interesting, but being tied just to Nvidia kills it for me. Is this BenQ blur reduction method independent of driver support, i.e. handled entirely on the monitor side? That would be a nice thing to have, especially if it works over a VGA connection.

Yes, as I said, I can use it on my retro systems at 60Hz, which is what makes it special compared to other LCDs that have low-persistence modes.

There are some expensive TVs that do it as well, and they have no connection to Nvidia.

I believe Nvidia's LightBoost just sort of got monitor manufacturers to start thinking about the technology. BenQ Blur Reduction is done on the monitor itself.

Now for some blitting from the back buffer.

Reply 16 of 23, by Jo22

Rank: l33t++
Deep Thought wrote:
Jo22 wrote:

That's exactly what I was excited about when LCD monitors were the new thing in town.
Pros:
- flicker-free at 60Hz (no more messing with refresh rates)

Being flicker-free is why these displays have so much motion blur.
Most of your other concerns are either unfounded or overblown. Power consumption was actually not a problem with most CRTs. CCFL LCDs got just as warm.

1) Cool, I think we should be happy about this. 😀

2) I guess I should be grateful that only "most" of my other concerns are either unfounded or overblown... 😁

3) I disagree; my experience was different. I got a 20" computer monitor a long time ago and its power draw was about 100 watts.
Maybe that was because it was a very old professional model. I think it was previously used for computed tomography or CAD.
Can't remember anymore. We got this thing in the early 90s. Anyway, a TFT of about the same size tops out at about 40 watts nowadays.
And those models with LED backlights should be even more power efficient now.
I'm not saying that it is always this way, but I think CRTs and LCDs can differ significantly in this regard.

4) Can't confirm this. My 15" LCD monitor, a NEC LCD 1550ME, gets barely warm after hours of continuous operation.
Heat generation is lower than that of my external HDD or my 14" portable TV (CRT). It's quite an old model with a CCFL backlight (2002).
The only other thing comparable in terms of heat generation is my old 9" monochrome monitor, which I use for my Hercules card(s).
It does, however, only contain one electron gun instead of three, so I don't know whether this is a fair comparison or not (besides the differences in size).

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 17 of 23, by SquallStrife

Rank: l33t
Deep Thought wrote:

Sorry, but that looks dreadful! 🙁

VogonsDrivers.com | Link | News Thread

Reply 18 of 23, by Deep Thought

Rank: Newbie
Jo22 wrote:

3) I disagree; my experience was different. I got a 20" computer monitor a long time ago and its power draw was about 100 watts.
Maybe that was because it was a very old professional model. I think it was previously used for computed tomography or CAD.
Can't remember anymore. We got this thing in the early 90s. Anyway, a TFT of about the same size tops out at about 40 watts nowadays.
And those models with LED backlights should be even more power efficient now.
I'm not saying that it is always this way, but I think CRTs and LCDs can differ significantly in this regard.

The wattage on the back of a CRT was its maximum power draw, which it would only pull when displaying an all-white screen at maximum contrast.
Average power draw was much lower, since it varied depending on the picture content.
The latest LED-backlit LCDs are indeed more efficient, but there actually wasn't much between CRTs and early LCDs in my experience. Manufacturers were being intentionally misleading about it.

Jo22 wrote:

4) Can't confirm this. My 15" LCD monitor, a NEC LCD 1550ME, gets barely warm after hours of continuous operation.
Heat generation is lower than that of my external HDD or my 14" portable TV (CRT). It's quite an old model with a CCFL backlight (2002).
The only other thing comparable in terms of heat generation is my old 9" monochrome monitor, which I use for my Hercules card(s).
It does, however, only contain one electron gun instead of three, so I don't know whether this is a fair comparison or not (besides the differences in size).

I think that being a 15" display skews the results somewhat. Larger LCDs used to put out a decent amount of heat.

Reply 19 of 23, by h-a-l-9000

Rank: DOSBox Author

> but there actually wasn't much between CRTs and early LCDs
Got any real measurements to back up your statement? Since LCDs have less volume, they can emit less heat. If they were consuming the same power as a CRT, they would be getting quite hot.

The 15-inch CRTs were pulling 60-70 watts (measured; it depends a lot on the beam current, and thus on the amount of light currently being shown). A late 22.5-inch monster was around 120 watts.

A 19-inch 4:3 CCFL LCD from around 2005 that I have here takes ~25 watts.

> Most of your other concerns are either unfounded or overblown.
Do you care to give a reason why they are invalid? For me these are valid concerns. It rather appears you are blinded by love for CRTs or something.

1+1=10