VOGONS


First post, by eddman

User metadata
Rank Oldbie

How and why were certain refresh rates chosen, especially in the DOS era, e.g. 70, 72, 85, etc.? They look rather arbitrary; why not 74 or 86 instead? Are there any technical reasons hardware developers ended up with those specific values?

EDIT: I'm not asking why there aren't more of them. I'm asking why those few, specific values were selected.

Last edited by eddman on 2025-07-16, 23:00. Edited 2 times in total.

Reply 1 of 7, by onethirdxcubed

User metadata
Rank Newbie

The main reasons for non-60 Hz refresh rates were to reduce flicker (above 60 Hz) or to fit within memory bandwidth limitations (below 60 Hz, or interlaced, which was terrible to look at).

Also, for VGA they wanted to keep the horizontal line rate fixed for non-multisync monitors, and to minimize the number of fixed-frequency crystal oscillators needed for different resolutions. Some early Super VGA cards had six or more crystal oscillators on them. These were later replaced by programmable PLLs, which could be set to any needed frequency.
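
For a rough idea of how one programmable clock synthesizer replaces a bank of crystals, here is a minimal Python sketch. The f_out = f_ref * N / (M * 2^P) structure and the 14.318 MHz reference are typical of clock chips of that era, but the divider ranges here are illustrative and not taken from any specific datasheet.

# Sketch: pick PLL divider values to approximate a target pixel clock.
# Assumes a synthesizer of the form f_out = F_REF * N / (M * 2^P).
F_REF = 14.318e6  # common reference crystal

def best_pll(target_hz):
    # Brute-force small divider ranges for the combination closest to the target.
    best = None
    for p in range(4):                      # post-divider: 1, 2, 4, 8
        for m in range(3, 33):              # reference divider
            for n in range(3, 130):         # feedback divider
                f = F_REF * n / (m * (1 << p))
                err = abs(f - target_hz)
                if best is None or err < best[0]:
                    best = (err, n, m, p, f)
    return best

for target in (25.175e6, 28.322e6, 36.0e6, 40.0e6):
    err, n, m, p, f = best_pll(target)
    print(f"target {target/1e6:6.3f} MHz -> N={n:3d} M={m:2d} P={p} "
          f"gives {f/1e6:6.3f} MHz (off by {err/1e3:.1f} kHz)")

Real chips fix particular register ranges and encodings, but the idea is the same: one reference crystal plus a few small divider values per mode instead of a dedicated oscillator.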

Reply 2 of 7, by auron

User metadata
Rank Oldbie

72 hz seems like an oddball mode in that in the VESA modes its hfreq and pixel clock are actually the same as or higher than at 75 hz; maybe this mode accommodates certain early video hardware or monitors. in other terms 72 hz is useful for ideal 24 hz movie playback, but it seems quite unlikely that this was the thinking here.

85 hz was common later on, but hardly worth mentioning as a standard, no more than 75. these were just there to alleviate flicker. why not 80 hz? perhaps they wanted to cut down a bit on the VESA mode list, since with the higher color modes and increasing resolutions it would reach quite a few entries, or maybe they were confident that monitors could already do 85 hz by the time those VBE 1.0 cards hit the market. the 75/100 hz modes are useful though, as a number of games (GTA 2, diablo 2) run with 25 fps caps by default.
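
to make the multiple-of-the-cap point concrete, here's a tiny python check (my own illustration, not from any spec) of which common refresh rates divide evenly by typical frame caps:

# a frame cap only paces cleanly when the refresh rate is an integer
# multiple of the cap, so every frame is shown for the same number of scans.
caps = (24, 25, 30, 35)
refreshes = (60, 70, 72, 75, 85, 100, 120, 144)

for cap in caps:
    clean = [hz for hz in refreshes if hz % cap == 0]
    print(f"{cap} fps cap -> even pacing at: {clean if clean else 'none of these'}")

which is why 72 pairs with 24, 75/100 pair with 25, and 70 pairs with a 35 fps cap.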

also, 1024x768 at 256 colors, 43.5 hz interlaced, was a thing with the 8514.

Reply 3 of 7, by jakethompson1

User metadata
Rank Oldbie

The largest CRT I ever had was 19". Would 85 Hz have been of more interest on extremely large ones like 21"?

Reply 4 of 7, by auron

User metadata
Rank Oldbie

it's probably more about content than size; the biggest difference is when looking at a white screen, which would have been quite relevant for word documents or spreadsheets. but for most game/movie content i would say that 60 hz is sufficient, and in fact the PAL standard had to make do with only 50 hz.

VBE 3.0 cards give the option to force higher refresh rates in DOS games, but i rarely find myself actually bothering with that.

Reply 5 of 7, by eddman

User metadata
Rank Oldbie
auron wrote on 2025-07-16, 22:29:

85 hz was common later on, but hardly worth mentioning as a standard, no more than 75. these were just there to alleviate flicker. why not 80 hz? perhaps they wanted to cut down a bit on the VESA mode list, since with the higher color modes and increasing resolutions it would reach quite a few entries, or maybe they were confident that monitors could already do 85 hz by the time those VBE 1.0 cards hit the market.

I understand not having too many refresh rates as it would just bloat the standards, but I don't understand why they chose what they chose. Why go with 75 and not, say, 74 instead?

auron wrote on 2025-07-16, 22:29:

the 75/100 hz modes are useful though, as a number of games (GTA 2, diablo 2) run with 25 fps caps by default.

I actually wanted to mention this. Were software developers basing the fps on the available display refresh rates, or was it the opposite, where the hardware devs based it on common software fps?

Reply 6 of 7, by auron

User metadata
Rank Oldbie
eddman wrote on 2025-07-16, 22:58:
auron wrote on 2025-07-16, 22:29:

85 hz was common later on, but hardly worth mentioning as a standard, no more than 75. these were just there to alleviate flicker. why not 80 hz? perhaps they wanted to cut down a bit on the VESA mode list, since with the higher color modes and increasing resolutions it would reach quite a few entries, or maybe they were confident that monitors could already do 85 hz by the time those VBE 1.0 cards hit the market.

I understand not having too many refresh rates as it would just bloat the standards, but I don't understand why they chose what they chose. Why go with 75 and not, say, 74 instead?

would you rather see 74/87/102 in a drop-down list, or 75/85/100? with how quickly things progressed, the most obvious explanation is that past 72 hz they just went with easy-to-read numbers rather than technically motivated ones, with 144 going back to the aforementioned 24 hz, and 120/240 hz being particular sweet spots since they accommodate both 60 and 30 fps content as well.

1936x1089 didn't quite catch on either, despite being a proper 16:9 resolution as well...

eddman wrote on 2025-07-16, 22:58:
auron wrote on 2025-07-16, 22:29:

the 75/100 hz modes are useful though, as a number of games (GTA 2, diablo 2) run with 25 fps caps by default.

I actually wanted to mention this. Were software developers basing the fps on the available display refresh rates, or was it the opposite, where the hardware devs based it on common software fps?

for the two mentioned cases, it's possible they bet on 75 hz being a common default refresh rate that would offer less flicker than running at 30 fps and targeting 60 hz instead (120 hz would be possible in theory but uncommon at the time), and in the former case there is a PAL-region developer and console lineage as well. this is conjecture though; who knows if they actually cared about motion smoothness in that way - the jazz2 developers certainly didn't seem to, for one.

as another example, doom capping to 35 fps (half of VGA's 70 hz) was definitely a good move. why not run at the full 70 fps? maybe it allowed them to simplify a few things in the code, and even high-end machines in 93-94 didn't reliably attain a constant 35 fps anyway. going into win95, you would expect devs not to cap their game logic to a certain fps, since unlike DOS you couldn't know what the user was running at, although there were definitely games that still expected 30 fps max and would run too fast above that.
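
for what it's worth, the usual shape of such a cap is just a fixed-timestep loop; here's a minimal python sketch of the idea (a generic illustration, not taken from doom's actual code):

import time

TICK_RATE = 35            # logic ticks per second, like the 35 fps cap above
TICK_LEN = 1.0 / TICK_RATE

def run(duration=1.0):
    start = last = time.perf_counter()
    ticks = 0
    while (now := time.perf_counter()) - start < duration:
        # advance game logic in fixed steps until it has caught up with real time
        while now - last >= TICK_LEN:
            ticks += 1
            last += TICK_LEN
        time.sleep(0.001)  # render / idle until the next tick is due
    print(f"{ticks} logic ticks in {duration:.1f} s (~{ticks / duration:.0f} per second)")

run()

tie the logic to real time like this and the game runs at the same speed on any machine fast enough; run the logic once per frame with no cap instead, and it speeds up with the hardware - which is exactly the win95-era problem mentioned above.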

Reply 7 of 7, by jakethompson1

User metadata
Rank Oldbie
eddman wrote on 2025-07-16, 22:58:

I understand not having too many refresh rates as it would just bloat the standards, but I don't understand why they chose what they chose. Why go with 75 and not, say, 74 instead?

I only understand a bit of this because of messing with old XFree86 recently and the video timing howto (https://tldp.org/HOWTO/XFree86-Video-Timings- … OWTO/specs.html)

The relevant numbers in a Modeline are HTotal and VTotal. These are not just the displayed resolution but include the extra time spent on non-displayed "pixels": the blanking interval, front porch, back porch, sync pulses, etc.
Basically, old cards only have a discrete set of dot clocks (in MHz).
The horizontal frequency is the dot clock divided by HTotal, and old monitors can often only handle one, or a small set, of discrete horizontal frequencies (such as 31.5 kHz, 35.15 kHz, and 35.5 kHz per xf86config) too.
HTotal has to be a multiple of 8 (usually?).

So a standard VGA card has only 25.175 and 28.32 MHz dot clocks on board.
That gives two possible HTotal values that keep HTotal × 31.5 kHz ≈ dot clock: 800 and 900.

So XFree86's default 640x480 @ 60 Hz uses a dot clock of 25.175 MHz and an HTotal×VTotal of 800x525; that maintains 25175000/800 = 31468.75 Hz HorizSync and produces 25175000/800/525 = 59.94 Hz VertRefresh.
720x400 text mode and friends go along with the 28.32 MHz clock.
800x600 @ 56 Hz uses a dot clock of 36 MHz and maintains a HorizSync of 35.15 kHz, with an HTotal×VTotal of 1024x625 producing 56.25 Hz. It looks like 35.5 kHz went with 1024x768 @ 87 Hz interlaced.
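
To double-check those figures, here is a quick Python snippet (my own, using only the numbers above) that derives HorizSync and VertRefresh from the dot clock and the totals:

# HorizSync   = dot_clock / HTotal
# VertRefresh = dot_clock / (HTotal * VTotal)
modes = [
    ("640x480 @ 60", 25.175e6,  800, 525),
    ("800x600 @ 56", 36.000e6, 1024, 625),
]

for name, dot_clock, htotal, vtotal in modes:
    hsync = dot_clock / htotal
    vrefresh = hsync / vtotal
    print(f"{name}: HorizSync = {hsync / 1e3:.2f} kHz, VertRefresh = {vrefresh:.2f} Hz")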

It's interesting that the displayed resolution plus the padding (i.e. HTotal) often ends up being the displayed width of the next resolution up (640→800, 800→1024)... not sure if there was some circuitry reason for that.
Back to the question: in this restricted environment - where the dot clock and the ratio of the dot clock to HTotal are constrained to discrete values, and VTotal correspondingly has to sit within some restricted range to maintain the desired aspect ratio and blanking/porch/sync timings - some refresh rate naturally falls out. So the question is: did they pick a desired refresh rate and work backward, or pick the dot clock and HorizSync rate and then determine what refresh rates were possible? Were particular dot clocks or HorizSync rates desirable for physics or economics reasons, like the availability of oscillators at that frequency?
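
As a rough illustration of the "work backward" direction, here is a small brute-force search (my own sketch, using only the constraints mentioned above, with made-up tolerances) over HTotal multiples of 8 for the two standard VGA dot clocks, pinning HorizSync near 31.5 kHz and aiming for 60 Hz:

# For each dot clock, keep only HTotal values (multiples of 8) whose resulting
# HorizSync stays near the monitor's fixed 31.5 kHz, then pick the whole-line
# VTotal closest to a 60 Hz target. Tolerances are illustrative guesses; real
# modes also have to fit specific sync-pulse and porch widths, which this ignores.
DOT_CLOCKS = (25.175e6, 28.322e6)           # standard VGA pixel clocks
HSYNC_TARGET, HSYNC_TOL = 31_500.0, 350.0   # Hz
TARGET_REFRESH = 60.0                       # Hz

for clock in DOT_CLOCKS:
    for htotal in range(648, 1200, 8):
        hsync = clock / htotal
        if abs(hsync - HSYNC_TARGET) > HSYNC_TOL:
            continue
        vtotal = round(hsync / TARGET_REFRESH)   # closest whole line count
        refresh = hsync / vtotal
        print(f"{clock / 1e6:6.3f} MHz  HTotal={htotal}  VTotal={vtotal}"
              f"  ->  {hsync / 1e3:.2f} kHz, {refresh:.2f} Hz")

Several near-60 Hz combinations fall out for each clock, and the canonical 800x525 timing (59.94 Hz) is right next door; the final pick also has to leave room for the exact sync and porch widths the monitor expects, which the sketch above doesn't model.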

I'm not sure what dot clocks or horiz sync frequencies they may have wanted to avoid either; for example, the Cirrus 542x data book suggests avoiding frequencies that are close to the memory frequency for interference reasons, and if you are close to the memory frequency, there is an option to just use the memory frequency as the dot clock rather than trying to generate a second one.