VOGONS

Old EGA or Tandy Monitor Emulation

Reply 20 of 27, by GPDP

Rank Newbie

Indeed, the interlacing code was only added later. To clarify, the shot I posted was made with that version of the shader, albeit with a small code change to disable the interlacing.

Anyway, I first became aware of this phenomenon when I ported this shader for use with the OpenGL2 plugin for Playstation emulators (namely ePSXe and PCSX-R), which supports GLSL shaders, similar to bsnes. It's kind of a rough port, since you have to specify a window resolution for the dot mask emulation to work, but it works. However, the OpenGL2 plugin allows you to set separate X and Y internal resolution "levels", and I noticed raising the internal Y resolution above native made the scanlines disappear, just like with the version we're currently discussing. From playing with the shader code, I found out that doubling the Y value of the rubyTextureSize uniform alone also does this (doubling it on the X axis results in less horizontal blur), although it does not correspond with a rise in internal resolution, of course.

So yeah, merely doubling the texture size alone does not produce the desired effect. Something else has to be done.
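For what it's worth, the effect can be sketched numerically. Below is a rough Python model of the crt-geom-style vertical coordinate math (ratio_scale = v * textureSize - 0.5, uv_ratio = fract(ratio_scale)), with a simple Gaussian standing in for the real beam weight function; the function names and the sigma value are illustrative, not taken from the actual shader.

```python
import math

def beam(d, sigma=0.3):
    # Stand-in Gaussian beam profile, peaked at d == 0.
    # (The real shader uses a fancier weight function.)
    return math.exp(-(d / sigma) ** 2)

def row_brightness(j, out_h, tex_h):
    # crt-geom-style vertical coordinate math, sampled at the
    # centre of output row j:
    #   ratio_scale = v * textureSize - 0.5
    #   uv_ratio    = fract(ratio_scale)
    ratio_scale = (j + 0.5) / out_h * tex_h - 0.5
    uv = ratio_scale - math.floor(ratio_scale)  # fract()
    # contributions from the scanline above and the one below
    return beam(uv) + beam(1.0 - uv)

# 400-line source on an 800-line output, normal texture size: every row
# is the same distance from a beam peak -> flat, darker image, no scanlines
print([round(row_brightness(j, 800, 400), 3) for j in range(4)])

# same output with the texture size doubled on Y: every row now lands
# exactly on a beam peak -> uniformly bright, scanlines gone either way
print([round(row_brightness(j, 800, 800), 3) for j in range(4)])
```

Either way the rows come out uniform, which is consistent with doubling the texture size alone not bringing the scanlines back.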

Reply 21 of 27, by gulikoza

Rank Oldbie

To make testing easier, I've moved to CRT-simple and removed all code except the scanline emulation. To fix the scanlines at 1280x800, uv_ratio needs to be divided by 2.0 as well. With low-res images the texture size is doubled; with high-res it is not. Unfortunately this breaks 1920x1200: half of the image is darker than the other half, so I guess the pixels are not centered somewhere. I'll have to rethink the math behind this, as doubling the texture size and halving the distance between scanlines does not seem like the optimal approach...
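A quick numeric way to see why the halved uv_ratio brings the scanlines back at 2x scale. This is a toy Python model with a stand-in Gaussian beam, not the shader's exact weight function:

```python
import math

def beam(d, sigma=0.3):
    # stand-in Gaussian beam profile, peaked at d == 0 (illustrative)
    return math.exp(-(d / sigma) ** 2)

def row_brightness(j, out_h, tex_h, halve_uv=False):
    # vertical coordinate math as in CRT-simple:
    #   ratio_scale = v * textureSize - 0.5
    #   uv_ratio    = fract(ratio_scale)
    ratio_scale = (j + 0.5) / out_h * tex_h - 0.5
    uv = ratio_scale - math.floor(ratio_scale)  # fract()
    if halve_uv:
        uv /= 2.0                               # the proposed fix
    return beam(uv) + beam(1.0 - uv)

# 400-line source at 2x (think 1280x800): the two output rows of each
# source line sit at uv = 0.75 and 0.25, equally far from a beam peak,
# so both come out equally dark -> no visible scanlines
print([round(row_brightness(j, 800, 400), 3) for j in range(2)])

# with uv_ratio / 2.0 the rows become uv = 0.375 and 0.125: one row is
# pulled close to the beam peak, the other stays dark -> scanlines return
print([round(row_brightness(j, 800, 400, halve_uv=True), 3) for j in range(2)])
```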

Attachments

  • Filename
    CRT-scanline.D3D.zip
    File size
    2.27 KiB
    Downloads
    126 downloads
    File license
    Fair use/fair dealing exception

http://www.si-gamer.net/gulikoza

Reply 22 of 27, by GPDP

Rank Newbie

Well, it works like you said. I get line-doubled scanlines both at the command prompt and in a low-res game like Prince of Persia. I tried adding the change to the shader you shared before, but then the scanlines disappear at the command prompt, while they are present in the game as intended. Perhaps I missed another change, but so far this is looking good to me.

Still, keep us posted on any further improvements.

Edit: Nevermind, just spotted the line-double code. Can't believe I missed it.

I'm gonna port this back to the GLSL versions. I am particularly looking forward to using this for PS1.

Edit 2: Actually, I'm noticing that just from adding the doublescan code, it looks a bit blurrier than it should.

Reply 23 of 27, by GPDP

Rank Newbie

I did a quick port of the code for use with PS1 emulators, and I think it turned out really well. I made some adjustments to the code which I think helped the issue with blurriness. I know it's GLSL, but perhaps you can get some ideas from it.

Attachments

  • Filename
    CRT-Monitor.rar
    File size
    4.55 KiB
    Downloads
    131 downloads
    File license
    Fair use/fair dealing exception

Reply 24 of 27, by gulikoza

Rank Oldbie

What adjustments? 😀

Anyway, this perhaps is not the best thread, but since the discussion is already here...

I've spent some more time testing and figuring out the shader. The trick why:

uv_ratio = fract(ratio_scale) / 2.0;

works for 1280x800 is that normally the brightest part of the scanline is in the middle of the pixel (a Gaussian function). That middle falls exactly in between the two pixels drawn, so they are both the same distance from the peak of the beam and both darker than they should be. Dividing uv_ratio by 2.0 shifts the peak of the beam to the upper pixel. This also changes the pattern at 1920x1200, where the original algorithm will draw:

dark
peak
dark
dark
peak
dark

to:

peak
lighter
dark
peak
lighter
dark

While this pattern might not be mathematically correct (the peak is next to the darkest line), it does produce thinner scanlines (which look more like a CRT to me) and might be a good compromise with the limited number of pixels available.

The problem is that this will sometimes produce a diagonal stripe across the image where the two triangles meet. I can't figure out why this happens... I've gone over all the code and it looks fine to me; the triangles are drawn using a triangle strip, so they share the common vertices. It is not shader related, as I have now noticed it is there even without the shader (set DOSBox to 1920x1200 fullscreen and see how the 'l' in 'Welcome' and the 'p' in 'keymapper' are broken). From my extensive debugging I can only conclude that the triangles are interpolated differently and the coordinates won't match due to precision errors. I've tried passing pre-transformed vertices and using a triangle fan... OpenGL probably does not have this problem, as it can draw a GL_QUAD primitive.
As a workaround I've added coordinate rounding in the shader:

float2 ratio_scale = round((xy * doublescan * SourceDims - 0.5) * 12.0) / 12.0;

With these changes, the image is no longer darker with the shader compared to original (except 1280x800 where it will be slightly darker) and not broken diagonally.
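The two patterns above can be reproduced with a toy Python model of the same vertical math. The Gaussian stands in for the real beam weight function and the sigma value is illustrative, but the labels come out exactly as described:

```python
import math

def beam(d, sigma=0.3):
    # stand-in Gaussian beam profile peaked at d == 0; the real shader's
    # weight function differs, but the pattern comes out the same
    return math.exp(-(d / sigma) ** 2)

def row_brightness(j, scale, halve_uv=False):
    # vertical math per output row j at an integer scale factor:
    #   ratio_scale = v * textureSize - 0.5
    #   uv_ratio    = fract(ratio_scale)
    ratio_scale = (j + 0.5) / scale - 0.5
    uv = ratio_scale - math.floor(ratio_scale)
    if halve_uv:
        uv /= 2.0
    return beam(uv) + beam(1.0 - uv)

def label(w):
    return "peak" if w > 0.9 else ("lighter" if w > 0.5 else "dark")

# 3x scale (400 source lines on a 1200-line output), original algorithm:
print([label(row_brightness(j, 3)) for j in range(3)])
# -> ['dark', 'peak', 'dark']

# with uv_ratio / 2.0, the same three rows become:
print([label(row_brightness(j, 3, halve_uv=True)) for j in range(3)])
# -> ['dark', 'peak', 'lighter'], i.e. peak/lighter/dark repeating
```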

Attachments

  • Filename
    CRT-scanline.D3D.zip
    File size
    2.46 KiB
    Downloads
    111 downloads
    File license
    Fair use/fair dealing exception

http://www.si-gamer.net/gulikoza

Reply 25 of 27, by GPDP

Rank Newbie

Ah, good to see you're still around! I was hoping you could help me with an issue I'm still having with this shader.

See, what I want to do is not only get one-pixel-wide scanlines, but also maintain just a tiny bit of horizontal blur. Thing is, as it stands, while I can get the shader to blur a bit, it only seems to blur to the left, and not to the right. A proper blur should work on both sides evenly.

I can get the desired effect on RetroArch, but no matter what I try, I cannot get it properly working on DOSBox. Thing is, the only way I get it on RetroArch is by taking advantage of its shader spec's outscale attributes, not through actual shader code. I can approximate it using the code you have provided, but again, it results in inconsistent blurring.

I have attached two pictures, one showing the desired behavior through RetroArch, and what I get trying to replicate it on DOSBox, along with the code used to achieve it. From the looks of it, it appears the DOSBox approximation shifts the image one pixel down and one pixel to the right for some reason. Could that have something to do with it?

Attachments

Reply 26 of 27, by gulikoza

Rank Oldbie

With these modifications, the troublesome part is:

float2 uv_ratio = frac(ratio_scale) / 2.0;

This shifts the peak of the "beam", but it also affects the neighboring pixel's influence, as 1 - uv_ratio will always be > 0.5. I don't like that; perhaps I'll try to find a better way. You could, however, try dividing uv_ratio by float2(1.0, 2.0) so it only affects the y component.
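A toy illustration (not the actual shader code) of why the scalar /2.0 skews the horizontal blur: with uv.x also halved, the fractional part never exceeds 0.5, so the interpolation always leans toward the same horizontal neighbour, which matches the one-sided blur described above.

```python
def x_weights(uvx):
    # linear interpolation weights for the two neighbouring texels along x
    return (round(1.0 - uvx, 3), round(uvx, 3))

# scalar division: even at uv.x = 0.9 the "far" neighbour never wins,
# so the blur only ever spreads to one side
print(x_weights(0.9 / 2.0))   # (0.55, 0.45)

# float2(1.0, 2.0): x is untouched, so the weights swing symmetrically again
print(x_weights(0.9))         # (0.1, 0.9)
```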

http://www.si-gamer.net/gulikoza

Reply 27 of 27, by GPDP

Rank Newbie

Dude, that did the trick. Holy crap, thank you so much! In fact, that addition may have made it slightly better than the code I was using on RetroArch. This rocks so much. I'm going to port this over to both RetroArch and Pete's OpenGL2 spec. I'll make sure to credit you, of course. Once again, thank you.

Edit: Ok, that may have been a bit too much praise, since it still shifts the image downwards by one pixel, resulting in slightly more vertical blur, but it's hardly noticeable, so it still looks great. Other than that one minute detail, this is basically what I have been wanting to accomplish with a shader for a good while.