VOGONS


First post, by bZbZbZ

Rank: Member

So for a while now I've been using my 19" CRT as my preferred retro gaming monitor, primarily at 1280 x 1024 @ 85 Hz. The monitor also supports 1600 x 1200 @ 75 Hz, but to my eye 85 Hz looks noticeably smoother than 75 Hz.

Lately I've experimented with 640 x 480 and 800 x 600 both at 120 Hz. These are spectacular in some situations such as:

  • Games that are locked to 60 fps, which is many of them, seem to play flawlessly at 120 Hz, which is of course an integer multiple of 60. Plus I get to avoid the eyestrain (I don't know how I survived 60 Hz on CRTs back in the day).
  • Games with unlocked framerates that can actually run at up to 120 fps (eg UT99)

Yesterday, I tried an experiment: 1024 x 768 @ 120 Hz. And... it worked?

I'm running Windows 10 on an Ivy Bridge + GeForce 980, with the CRT connected through native VGA (DVI-I port). I've used Custom Resolution Utility in the past, but at this time I'm just using the nVidia Control Panel to set up custom resolutions. In fact, the nVidia tool gives me the following if I leave the timing settings on "automatic":

My monitor's manual seems to suggest that 1024 x 768 is only supported at up to 116 Hz with a maximum horizontal rating of 96 kHz. I do see that as a result of a 1-line front porch and some other settings that I don't fully understand, the nVidia tool is keeping my horizontal refresh rate very close to 96 kHz...

So my questions to you folks:

  1. Do you think that this setting is "safe" for the monitor? Should I reduce some settings (horizontal front porch??) to bring the horizontal refresh below 96.00 kHz?
  2. If I use CRU to delete all refresh rate permutations of 640 x 480, 800 x 600, and 1024 x 768 EXCEPT FOR 120 Hz, which will force 120 Hz whenever those three resolutions are selected by any application... is that crazy? I run modern flat panels at 120 Hz even if they support 144 Hz, and I consider it "the answer" for most everything (eg 30fps YouTube videos, 24fps movies, 60fps console emulation) because 120 is an integer multiple of 30, 24, and 60 (see the sketch below)...
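To make the integer-multiple argument concrete, here is a minimal Python sketch (the cadence helper is hypothetical, purely for illustration) that counts how many refresh cycles each content frame stays on screen:

```python
# An even cadence means smooth motion; an uneven one means visible judder.
def cadence(refresh_hz, content_fps, frames=6):
    shows = []                    # refresh cycles spent on each content frame
    next_frame_time = 0.0
    frame = -1
    for cycle in range(int(refresh_hz * frames / content_fps) + 1):
        t = cycle / refresh_hz
        if t >= next_frame_time:  # a new content frame is due this cycle
            frame += 1
            next_frame_time = (frame + 1) / content_fps
            shows.append(1)
        else:
            shows[-1] += 1        # repeat the previous frame
    return shows[:frames]

print(cadence(120, 60))  # [2, 2, 2, 2, 2, 2] -> every frame doubled, even
print(cadence(120, 30))  # [4, 4, 4, 4, 4, 4] -> quadrupled, even
print(cadence(120, 24))  # [5, 5, 5, 5, 5, 5] -> even
print(cadence(75, 60))   # [2, 1, 1, 1, 2, 1] -> uneven cadence, judder
```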

Attachment: screenshot of the nVidia custom resolution timing settings (image no longer available)

Last edited by bZbZbZ on 2024-08-20, 02:45. Edited 2 times in total.

Reply 1 of 10, by Joseph_Joestar

Rank: l33t++

Be careful of games that have issues when running at more than 60 FPS. This usually affects console ports; Splinter Cell and the Star Wars KOTOR games are notable examples. Check the PC Gaming Wiki for each game that you intend to play at high refresh rates.

On the plus side, you can cap the games to 60 FPS while still using a 120 Hz refresh rate. This can be done from the Nvidia driver panel or via third-party apps.

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Core 2 Duo E8600 / Foxconn P35AX-S / X800 / Audigy2 ZS
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi

Reply 2 of 10, by badmojo

Rank: l33t

My understanding is that running a CRT at high refresh rates is effectively overclocking its components, which means more heat and will ultimately reduce their lifespan. Same as any overclock.

That said, I still do it to mine. It's much easier on the eyes, and my gaming sessions are short and infrequent - I've been using the same monitor for 10 years and it still works great.

Life? Don't talk to me about life.

Reply 4 of 10, by Tiido

Rank: l33t

The monitor will certainly make more heat as you are using it at a higher scan rate; the losses per scan operation stay the same, but you have more of these operations per second.

Higher framerates are more or less the same to the monitor as higher resolutions. The blur at high resolutions and/or refresh rates is a result of the limits of video path bandwidth, and not all monitors are made equal in that regard. Some have a higher bandwidth video path than others and will be able to provide a sharp image at higher refresh rates and/or resolutions. Sometimes this is specified, but not always.

Now as far as the 96 kHz goes, you can shave a few lines off the porch timings and it should then fit within the limit, but you only need to bother if the monitor actually refuses to display the resolution. If it doesn't, you don't really have to do anything more. Still, it is beneficial to reduce the porches as much as possible, because:

A) It utilizes video signal bandwidth better, which plays into the blur aspect, since the pixels effectively get longer and thus don't need as much bandwidth to display properly (see the sketch below).
B) The video card has effectively fewer pixels to output, which reduces the VRAM bandwidth needed just to display the image and can increase your in-game frame rate a little.
C) You can reach higher framerates on the monitor within the line rate limitations, if stars are being chased.
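As a rough illustration of point A, here is a quick Python sketch using the 1376 → 1280 horizontal totals that come up later in this thread, with the line rate held constant at the ~96.96 kHz figure from the posts below:

```python
# Fewer total pixels per line at the same line rate -> lower pixel clock,
# so each pixel lasts longer and needs less video bandwidth to stay sharp.
line_rate_hz = 96_960  # horizontal scan rate, held constant

for h_total in (1376, 1280):
    pixel_clock_hz = line_rate_hz * h_total
    pixel_time_ns = 1e9 / pixel_clock_hz
    print(f"h_total={h_total}: {pixel_clock_hz / 1e6:.1f} MHz pixel clock, "
          f"{pixel_time_ns:.2f} ns per pixel")

# h_total=1376: 133.4 MHz pixel clock, 7.50 ns per pixel
# h_total=1280: 124.1 MHz pixel clock, 8.06 ns per pixel
```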

T-04YBSC, a new YMF71x based sound card & Official VOGONS thread about it
Newly made 4MB 60ns 30pin SIMMs ~
what are you reading? you won't understand it anyway 😜

Reply 5 of 10, by bZbZbZ

Rank: Member

Thanks for the replies. I'm using RTSS to cap framerates in games (the Scanline Sync feature, which works in Windows 10 but not XP, is amazing). Yes, 1600 x 1200 (75 Hz or 60 Hz) is slightly softer on this monitor. 1024 x 768 at 120 Hz is amazingly sharp and fluid.

So I edited the custom resolution and changed the Timing from Automatic to Manual.
I decreased the Horizontal Front Porch from 48 to 16.
I decreased the Horizontal Sync Width from 96 to 32.
I decreased the Total Pixels from 1376 to 1280 (96 being the sum of the reductions to the front porch and sync width; my intention was to leave the back porch unchanged).
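A quick arithmetic check of that intention (a sketch; the 1024 active pixels come from the resolution itself):

```python
# back porch = total pixels - active pixels - front porch - sync width
old_back_porch = 1376 - 1024 - 48 - 96   # = 208
new_back_porch = 1280 - 1024 - 16 - 32   # = 208, unchanged as intended
print(old_back_porch, new_back_porch)
```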

Despite the 96px decrease in total horizontal width, the stated refresh rate stays at exactly 96.96 kHz.
Testing and applying the settings results in crystal clear 1024 x 768 at 120 Hz, just as flawless as the Automatic settings. I can't tell the difference.

Any idea what the changes actually did? Is 1280 total horizontal pixels in any way better than 1376?

Reply 6 of 10, by Tiido

Rank: l33t

Decreasing the horizontal timings should have made the line rate go up, since you were effectively making the lines shorter (and it should have triggered the monitor's out-of-range message, etc.), but it appears nothing changed.

What does the monitor's OSD say about the input resolution? Most tell the horizontal and vertical rates, and sometimes a resolution when it matches something they know. I now vaguely recall that on my GTX970 I had to jump through a hoop to get a custom resolution going before it actually output the resolution I wanted, instead of scaling it into something the video card thought the monitor could do (i.e. 2560x1600 was wanted but it actually output 1280 x 800 instead for some reason).

The image in the first post appears not to work anymore. I vaguely recall a 133 MHz pixel clock, and with 1376 pixels per line that gives a 96.65697... kHz line rate, which checks out; 1280 pixels per line would give 103.90625 kHz, which should be out of range.
Pixel clock / pixels per line = Line rate
Line rate / Lines per frame = Frame rate
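Plugging the thread's numbers into those formulas (a sketch; the exact pixel clock is assumed to be ~133.42 MHz, which a tool would display rounded as 133 MHz):

```python
pixel_clock = 133_416_960  # Hz; assumed exact value behind the displayed "133 MHz"
h_total = 1376             # total pixels per line
v_total = 808              # total lines per frame

line_rate = pixel_clock / h_total   # 96,960 Hz = 96.96 kHz
frame_rate = line_rate / v_total    # 120.0 Hz
print(f"{line_rate / 1e3:.2f} kHz, {frame_rate:.2f} Hz")
```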

T-04YBSC, a new YMF71x based sound card & Official VOGONS thread about it
Newly made 4MB 60ns 30pin SIMMs ~
what are you reading? you won't understand it anyway 😜

Reply 7 of 10, by bZbZbZ

Rank: Member

Thank-you so much for explaining the math to me. I have a much better understanding of how this works now.

You're right: the Automatic settings used a Pixel Clock of 133 MHz combined with a horizontal refresh of 96.96 kHz, which, based on 808 total vertical lines, yields a 120 Hz vertical refresh rate.

From the formulas, if I demand a 120 Hz vertical refresh and there are 808 vertical lines, the horizontal refresh rate MUST be 96.96 kHz (120 * 808 = 96,960). When I reduced the total pixels per line from 1376 to 1280, the Pixel Clock decreased from 133 MHz to 124 MHz.

Furthermore, I have now done an additional test where I decreased the vertical total (lines per frame) from 808 to 800. I suspect that this decreases the vertical back porch (not directly editable in the tool) by 8 lines. I did this with the aim of mathematically forcing the horizontal refresh rate to 96.00 kHz. It has the side effect of further decreasing the Pixel Clock to 122.8 MHz. This tested fine, just as clear as the other trials.
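Running the same formulas on the 800-line variant confirms those numbers (a quick sketch):

```python
frame_rate = 120   # Hz, the target vertical refresh
v_total = 800      # lines per frame, after trimming 8 lines
h_total = 1280     # total pixels per line, from the earlier edit

line_rate = frame_rate * v_total    # 96,000 Hz = 96.00 kHz, within the 96 kHz spec
pixel_clock = line_rate * h_total   # 122,880,000 Hz = 122.88 MHz (~the 122.8 reported)
print(f"{line_rate / 1e3:.2f} kHz, {pixel_clock / 1e6:.2f} MHz")
```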

As an aside... when I fool around with the Custom Resolution tool (and look at CRU), I see that the highest officially supported resolutions (1280 x 1024 @ 85 Hz, 1600 x 1200 @ 75 Hz) seem to specify back porches in excess of 200 px... which ends up pushing the pixel clock beyond 150 MHz. Pixel clock is not mentioned anywhere in the monitor's user manual. I'm not sure what pixel clock is "safe" for the monitor, or if I've been running it at >150 MHz for all these years anyway...

Reply 9 of 10, by Tiido

Rank: l33t

Pixel clock doesn't matter to the monitor at all; it only cares about the sync timings that are derived from it.

The video card works off that pixel clock though. The video timings are generated from the pixel clock (and can be derived the other way around too), but the PLL that generates it has precision limits: it cannot produce arbitrary clocks, only steps of some sort, so you may not be able to get the exact timings you want. For a CRT this doesn't matter very much, as long as the resulting timings are within its operating limits.

The porches are as big as they are at higher resolutions because of retrace limitations. When the beam has reached the right edge of the screen, it needs to go back to the left side, and this takes a fixed amount of time. The porch+sync period is when that happens; when this time is too short, you will begin to see compression and other weird things at the image edges, because the retrace isn't yet complete but new image data is already due to be displayed.

Different monitors have different retrace speeds; some are faster than others, and sometimes it is mentioned in the specs too, usually in microseconds.
Retrace time in µs = (1 / Pixel Clock in MHz) * Porch+Sync pixels
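For example, with the reduced 1024 x 768 timing from earlier in the thread (a sketch using those numbers):

```python
pixel_clock_mhz = 122.88          # MHz
porch_plus_sync = 16 + 32 + 208   # front porch + sync width + back porch, in pixels

retrace_us = (1 / pixel_clock_mhz) * porch_plus_sync
print(f"{retrace_us:.2f} us")     # ~2.08 us available for horizontal retrace
```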

T-04YBSC, a new YMF71x based sound card & Official VOGONS thread about it
Newly made 4MB 60ns 30pin SIMMs ~
what are you reading? you won't understand it anyway 😜

Reply 10 of 10, by bZbZbZ

Rank: Member

So instead of relying on the nVidia control panel, I tried using Custom Resolution Utility again.

I unchecked all the "established resolutions" and deleted all existing custom and standard resolutions.

I added the following standard resolutions only:
1024 x 768 120Hz
640 x 400 70Hz
640 x 480 120Hz
800 x 600 120Hz
1280 x 1024 85Hz
1600 x 1200 75Hz

Since 1024 x 768 120Hz is at the top of the list, Windows thinks it's the native resolution. I set the Windows 10 desktop to this resolution.

The goal is that any time the monitor is set to 640x480, 800x600, or 1024x768, the refresh rate will be 120Hz. Any game that is locked to 60fps or 30fps should run with frame doubling or quadrupling. Any game with an unlocked framerate should run at 120fps given enough hardware. I use RTSS to limit framerate (either full refresh or half refresh) on a per-game basis, usually using Scanline Sync with VSYNC off.

So far, everything I've tried that has a 60fps frame cap seems to run just fine at 120 Hz:

  • PCem running Windows 95 seems to run great, and Win95 games (which all run at 1024 x 768 or below, so the monitor never leaves 120 Hz) stutter less than they did when I was running the monitor at 85 Hz.
  • PCSX2 runs awesome, as PS2 games generally are 60fps. I like to run PCSX2 at 2x internal resolution but downsampled to 640 x 480 monitor resolution, de-interlaced (or in progressive mode for games that support 480p).
  • Zelda 64 Recompiled looks great at 640 x 480 120Hz and runs at 120fps
  • Modded versions of Need For Speed Underground 2 and UT2004 look great at 1024 x 768 120Hz and run at 120fps

Still haven't tried, but intend to soon:

  • Many games in DOSBox which are 640 x 480 or 640 x 400
  • Dolphin Emulator