VOGONS


First post, by DrLucienSanchez

Rank Newbie

Hi

I currently run a Sun Microsystems 16" CRT monitor (PN17J0) and usually game at 1024x768 at 85Hz, but after running some demos that defaulted to 640x480, it offered an optional refresh rate of 150Hz, and Windows 98 display properties lets me go anywhere from 60-150Hz at this resolution.

I prefer gaming at this resolution, and the higher the refresh the better, so I plan on staying at 150Hz if I can. The monitor was purchased new old stock 2.5 years ago, but I'm concerned about any additional wear and tear at this high refresh rate, despite the low resolution.

So just asking really: will running this long term put stress/strain on the electron guns or caps, or should I lower it to 120Hz?

Episode One Racer at 640x480 150Hz, the motion clarity and picture is amazing, but I don't really want to play Russian roulette with the monitor.

Classic rig - MS6156 Ver 1.0 Bx7 Slot 1 motherboard - Pentium II Deschutes 400MHz, 320MB PC100 RAM, 20GB SATA Toshiba 2.5" via IDE/SATA converter, Intel i740 8MB AGP, Sun Microsystems 16" CRT monitor (PN17J0)

Reply 1 of 6, by Joseph_Joestar

Rank l33t

Your best bet is to dig up the manual of that monitor and check if those refresh rates are officially supported.

For example, the manual for my 17" Samsung SyncMaster states that it supports 640x480 @ 160Hz and 800x600 @ 120 Hz. I've been using it with those refresh rates for years, both back in the mid 2000s, and now for retro gaming.

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Athlon64 3400+ / Asus K8V-MX / 5900XT / Audigy2
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi

Reply 2 of 6, by Tiido

Rank l33t

A higher refresh rate is the same to the monitor as running at a higher resolution. The deflection system runs with bigger losses, so power consumption is higher and with it the heat generation, which is what ultimately makes some parts (such as capacitors that are sensitive to temperature) fail sooner.

For the kinescope itself, only the number of power-on hours matters (when the cathodes are in use). The cathodes wear out over time and their emission drops, leading to a dimmer image (since the beam carries fewer electrons) and eventually an image that can no longer be focused well, since the center of the cathode area has been depleted, only the larger edge area is left, and the built-in focus mechanism can no longer form a fine spot. A dim, soft image is the sign of a worn tube that will never be as good as it used to be.

T-04YBSC, a new YMF71x based sound card & Official VOGONS thread about it
Newly made 4MB 60ns 30pin SIMMs ~
what are you reading? you won't understand it anyway 😜

Reply 3 of 6, by DrLucienSanchez

Rank Newbie

Thanks for the info, very informative!

I can't find a manual for this, but have found some specifications - https://shrubbery.net/~heas/sun-feh-2_1/Devic … T.html#365-1417

Resolution 640x480 to 1280x1024, although I can go as high as 1600x1200 at 60Hz. The supported resolutions section gives a max vertical frequency of 85Hz, so previously I have run this at 1024x768 at 85Hz, so that would be accurate.

However, this part here -

Horizontal Scan: 30 kHz - 85 kHz
Vertical Scan: 50 Hz - 160 Hz

Correct me if I'm wrong, but as the max vertical scan is 160Hz, would this indicate support up to 160Hz and be within the specification?

Yes, it does feel warmer to the touch at the ventilation holes at the top of the monitor, so I may settle at no more than 120Hz, but my god, it looks and feels amazing, especially after I've been using a 75Hz 1440p monitor!
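For what it's worth, the spec-sheet check can be sketched in Python. This is only a ballpark: the total-line estimate below assumes ~10% blanking overhead rather than real timings, and the ranges are the ones quoted above.

```python
# Published ranges from the spec sheet above.
H_RANGE_KHZ = (30.0, 85.0)   # horizontal scan
V_RANGE_HZ = (50.0, 160.0)   # vertical scan

def mode_in_spec(visible_lines: int, refresh_hz: float, blanking: float = 1.10):
    """Return (estimated horizontal kHz, True if both ranges are satisfied).

    The ~10% blanking allowance is a rule-of-thumb assumption, not a real timing.
    """
    h_khz = visible_lines * blanking * refresh_hz / 1000.0
    h_ok = H_RANGE_KHZ[0] <= h_khz <= H_RANGE_KHZ[1]
    v_ok = V_RANGE_HZ[0] <= refresh_hz <= V_RANGE_HZ[1]
    return h_khz, h_ok and v_ok

print(mode_in_spec(480, 150))  # ~79.2 kHz, True: inside both ranges
print(mode_in_spec(480, 165))  # False: 165Hz exceeds the 160Hz vertical limit
```

So 640x480 at 150Hz estimates to roughly 79 kHz horizontal, inside the 30-85 kHz range, with the vertical rate also under the 160Hz ceiling.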

Classic rig - MS6156 Ver 1.0 Bx7 Slot 1 motherboard - Pentium II Deschutes 400MHz, 320MB PC100 RAM, 20GB SATA Toshiba 2.5" via IDE/SATA converter, Intel i740 8MB AGP, Sun Microsystems 16" CRT monitor (PN17J0)

Reply 5 of 6, by mkarcher

Rank l33t

Regarding wear on the deflection electronics, the main factor is the horizontal scan rate. The amount of energy dissipated in the deflection circuit is approximately constant per scanline. So if you scan twice as fast, the circuit dissipates the same amount of energy twice as often, and produces twice as much heat. Regarding the tube itself, Tiido is correct in pointing out that the scan rate is irrelevant. On the other hand, it is likely not just power-on hours that matters, but also the brightness of the picture. A higher beam current (brighter picture) puts more stress on the phosphor ("burn in") and the cathode.

Vertical refresh rates basically don't matter.

The advice to stay away from the highest rates to get a sharper image mainly depends on the pixel clock. At lower resolutions, bandwidth limits are not an issue at all.

To prolong the life of the tube, stay clear of maximum brightness unless needed. Obviously, vertical refresh rate, horizontal scan rate and pixel clock are related to each other, depending on the resolution. Wear at 640x480 at 150Hz is likely similar to wear at 1280x1024 at 70Hz (approximately twice as many lines at half the refresh rate means around the same number of lines per second). 1024x768 at 85Hz should be in the same ballpark, perhaps slightly less demanding. So if you feel comfortable running 1024x768 at 85Hz (typically around 69kHz horizontal scan rate), 640x480 at 140Hz should be fine, too. 150Hz might raise the horizontal scan rate to around 75kHz, which is also used for 1600x1200 at 60Hz, so close to the maximum of your monitor. You might want to stay slightly below that to reduce stress and wear.
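The comparison above can be sketched numerically. A minimal Python sketch, assuming a ~10% blanking allowance instead of exact VESA timings:

```python
# Approximate "lines per second" (i.e. horizontal scan rate) per mode.
# Deflection heat is roughly proportional to this figure.
# The 10% blanking overhead is a rule-of-thumb assumption, not a real timing.

def approx_scan_khz(visible_lines: int, refresh_hz: float, blanking: float = 1.10) -> float:
    """Ballpark horizontal scan rate in kHz for a given mode."""
    return visible_lines * blanking * refresh_hz / 1000.0

modes = [
    ("640x480",   480, 150),
    ("1024x768",  768,  85),
    ("1280x1024", 1024, 70),
    ("1600x1200", 1200, 60),
]
for name, lines, hz in modes:
    print(f"{name} @ {hz}Hz -> ~{approx_scan_khz(lines, hz):.0f} kHz")
```

All four modes land in the same rough 72-79 kHz band, which is the point: 640x480 at 150Hz stresses the deflection circuit about as much as 1280x1024 at 70Hz or 1600x1200 at 60Hz.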

You can likely check your monitor's OSD to find out the actual horizontal scan rate used by your card.

Reply 6 of 6, by Tiido

Rank l33t

I totally forgot that absolute brightness makes a difference to lifetime too, as well as potential burn-in. I have seen several monitors with a taskbar imprint burned into the bottom of the screen, and I have started to use the "hide taskbar" option in Windows after noticing a faint imprint of it starting to form; I also turn down the contrast setting in normal use unless I really need things to "pop". For burn-in, what matters is the absorbed energy in any given phosphor spot, so static elements such as the taskbar and the window edges of a maximized window have the highest potential for burn-in over prolonged use. Nowadays it has become popular to use dark themes, which benefit OLEDs and your retinas; they also benefit kinescopes (and reduce power use on non-LCD displays).

As far as line deflection goes: since the inductances are fixed, and they limit the current build-up and thus the scan speed for a given voltage, some magic is needed to make the scan happen faster at higher resolutions. With increasing frequency there are increasing losses from skin effect (especially in the linearity coil, or coils in better monitors, switched by a relay between some resolutions to maintain linearity) and, to a lesser extent, core losses. A combination of changing the deflection supply voltage and some magic in the EW correction is used to keep the scan roughly screen-wide as the frequency changes, but losses definitely increase substantially with rising frequency. Doubling the scan frequency will more than double the energy use, and it keeps climbing with frequency due to the loss mechanisms. Maybe one day, with superconducting wire that can handle this stuff, it will be possible to dramatically minimize these sorts of losses 🤣

A quick and dirty way to get an idea of the ballpark line frequency for a given resolution is to multiply the line count by the frame rate, and for something a bit closer to reality add ~10% more to account for blanking (GTF is the term to search for if something more accurate is desired). This works backwards too: if you know your monitor's maximum line rate, you get the maximum frame rate by dividing that number by the total number of lines for the resolution. This ignores any limits the monitor might impose; e.g. my Nokia has a 110kHz max line rate but only allows a 150Hz max frame rate, even though the deflection system itself is capable of 209Hz at 640x480 (525 lines with blanking included).
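That rule of thumb as a small Python sketch, both forwards and backwards (the ~10% blanking factor is the approximation described above; use GTF for exact timings):

```python
def line_freq_khz(visible_lines: int, frame_rate_hz: float, blanking: float = 1.10) -> float:
    """Ballpark horizontal line rate: lines x frame rate, plus ~10% for blanking."""
    return visible_lines * blanking * frame_rate_hz / 1000.0

def max_frame_rate_hz(max_line_khz: float, visible_lines: int, blanking: float = 1.10) -> float:
    """Working backwards: the maximum refresh a given line rate allows."""
    return max_line_khz * 1000.0 / (visible_lines * blanking)

print(line_freq_khz(480, 150))      # ~79 kHz for 640x480 @ 150Hz
print(max_frame_rate_hz(110, 480))  # ~208Hz: close to the 209Hz Nokia figure
```

The small difference from the 209Hz figure comes from using the 10% rule (528 total lines) instead of the standard 525-line total.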

The image itself will be better at lower pixel clocks, due to the limits of the cathode amplifiers themselves. They will not be able to pass whatever is fed to them, and as the pixel clock increases the image will get softer and softer horizontally as the input bandwidth approaches or goes past the bandwidth of the amplifier itself. Sometimes there are modifications one can do to improve video amp bandwidth; in the case of that Nokia, I got rid of some EMI filters and modified a few other things, and as a result got significantly better output at the highest resolutions and/or frame rates.

T-04YBSC, a new YMF71x based sound card & Official VOGONS thread about it
Newly made 4MB 60ns 30pin SIMMs ~
what are you reading? you won't understand it anyway 😜