VOGONS


First post, by eddman

Rank: Oldbie

How and why were certain refresh rates chosen, especially in the DOS era, e.g. 70, 72, 85, etc.? They look rather arbitrary; why not 74 or 86 instead? Are there any technical reasons hardware developers ended up with those specific values?

EDIT: I'm not asking why there aren't more of them. I'm asking why those few, specific values were selected.

Last edited by eddman on 2025-07-16, 23:00. Edited 2 times in total.

Reply 1 of 12, by onethirdxcubed

Rank: Newbie

The main reasons for non-60 Hz refresh rates were to reduce flicker (above 60 Hz) or to fit within memory bandwidth limitations (below 60 Hz, or interlaced, which was terrible to look at).

Also, for VGA, they wanted to keep the horizontal line rate fixed for non-multisync monitors, and to minimize the number of fixed-frequency crystal oscillators needed for different resolutions. Some early Super VGA cards had six or more crystal oscillators on them. These were later replaced by programmable PLLs, which could be set to any needed frequency.
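
To illustrate the crystal-vs-PLL point, here is a minimal Python sketch of how a programmable clock synthesizer can approximate any dot clock from a single reference crystal; the 14.318 MHz reference and the divider ranges are illustrative assumptions, not any particular chip's registers.

# One reference crystal plus a PLL replaces a board full of fixed oscillators.
# Output frequency is f_ref * N / (M * P); we brute-force the dividers that
# land closest to the requested dot clock. Divider ranges are made up.
F_REF = 14_318_180  # Hz; the common PC reference crystal (4x NTSC colorburst)

def best_dividers(target_hz):
    best = None
    for m in range(1, 64):          # reference divider
        for n in range(1, 256):     # feedback multiplier
            for p in (1, 2, 4, 8):  # post-divider
                f = F_REF * n / (m * p)
                err = abs(f - target_hz)
                if best is None or err < best[0]:
                    best = (err, n, m, p, f)
    return best

err, n, m, p, f = best_dividers(25_175_000)  # VGA's 640x480 dot clock
print(f"N={n} M={m} P={p} -> {f / 1e6:.4f} MHz (off by {err:.0f} Hz)")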

Reply 2 of 12, by auron

Rank: Oldbie

72 hz seems like an oddball mode in that its hfreq and pixel clock in the VESA modes are actually the same or higher compared to 75 hz; maybe this mode accommodates certain early video hardware or monitors. on another note, 72 hz is useful for ideal 24 hz movie playback (72 = 3 x 24), but it seems quite unlikely that this was thought of here.

85 hz was common later on, but hardly worth mentioning as a standard, no more than 75. these were just to alleviate flicker. why not 80 hz? perhaps they wanted to cut down a bit on the VESA modes list, as with the higher color modes and increasing resolutions it would reach quite a few entries, or maybe they were confident that monitors could already do 85 hz by the time those VBE 1.0 cards hit the market. the 75/100 hz modes are useful though, as a number of games (GTA 2, diablo 2) run with 25 fps caps by default.
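
to make the 25 fps point concrete, a cap only paces evenly when the refresh rate is an integer multiple of it - a throwaway python check, purely illustrative:

# even pacing needs refresh % cap == 0, i.e. every game frame is shown for
# the same whole number of vertical retraces
for refresh in (60, 70, 72, 75, 85, 100):
    for cap in (24, 25, 30, 35):
        if refresh % cap == 0:
            print(f"{cap} fps on {refresh} hz: {refresh // cap} retraces per frame")

note that 85 hz divides evenly by none of the common caps, fitting the idea that it was purely a flicker number.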

also, 1024x768 at 256 colors, 43.5 hz interlaced, was a thing with the 8514.

Reply 3 of 12, by jakethompson1

Rank: Oldbie

The largest CRT I ever had was 19". Would 85 Hz have been of more interest on extremely large ones like 21"?

Reply 4 of 12, by auron

Rank: Oldbie

it's probably more about content than size; the biggest difference is when looking at a white screen, which would have been quite relevant for word documents or spreadsheets. but for most game/movie content i would say that 60 hz is sufficient, and in fact the PAL standard had to make do with only 50 hz.

VBE 3.0 cards give the option to force higher refresh rates in DOS games, but i rarely find myself actually bothering with that.

Reply 5 of 12, by eddman

Rank: Oldbie
auron wrote on 2025-07-16, 22:29:

85 hz was common later on, but hardly worth mentioning as a standard, no more than 75. these were just to alleviate flicker. why not 80 hz? perhaps they wanted to cut down a bit on the VESA modes list, as with the higher color modes and increasing resolutions it would reach quite a few entries, or maybe they were confident that monitors could already do 85 hz by the time those VBE 1.0 cards hit the market.

I understand not having too many refresh rates as it would just bloat the standards, but I don't understand why they chose what they chose. Why go with 75 and not, say, 74 instead?

auron wrote on 2025-07-16, 22:29:

the 75/100 hz modes are useful though, as a number of games (GTA 2, diablo 2) run with 25 fps caps by default.

I actually wanted to mention this. Were software developers basing the fps on the available display refresh rates, or was it the opposite, where the hardware devs based it on common software fps?

Reply 6 of 12, by auron

Rank: Oldbie
eddman wrote on 2025-07-16, 22:58:
auron wrote on 2025-07-16, 22:29:

85 hz was common later on, but hardly worth mentioning as a standard, no more than 75. these were just to alleviate flicker. why not 80 hz? perhaps they wanted to cut down a bit on the VESA modes list, as with the higher color modes and increasing resolutions it would reach quite a few entries, or maybe they were confident that monitors could already do 85 hz by the time those VBE 1.0 cards hit the market.

I understand not having too many refresh rates as it would just bloat the standards, but I don't understand why they chose what they chose. Why go with 75 and not, say, 74 instead?

would you rather see 74/87/102 in a drop-down list, or 75/85/100? with how quickly things progressed, the most obvious explanation is that past 72 hz they just went with easy-to-read numbers rather than technical reasons, with 144 going back to the aforementioned 24 hz (144 = 6 x 24) and 120/240 hz being particular sweet spots, also accommodating both 60 and 30 fps content.

1936x1089 didn't quite catch on either, despite being a proper 16:9 resolution...

eddman wrote on 2025-07-16, 22:58:
auron wrote on 2025-07-16, 22:29:

the 75/100 hz modes are useful though, as a number of games (GTA 2, diablo 2) run with 25 fps caps by default.

I actually wanted to mention this. Were software developers basing the fps on the available display refresh rates, or was it the opposite, where the hardware devs based it on common software fps?

for the two mentioned cases, it's possible they bet on 75 hz being a common default refresh rate that would offer less flicker than running at 30 fps and targeting 60 hz instead (120 hz would be possible in theory but was uncommon at the time), and in the former case there is a PAL-region developer console lineage as well. this is conjecture though; who knows if they actually cared about motion smoothness in that way - the jazz2 developers certainly didn't seem to, for one.

as another example, doom capping to 35 fps (half of VGA's 70 hz) was definitely a good move. why not run at the full 70 fps? maybe it allowed them to simplify a few things in the code, and even high-end machines in 93-94 didn't reliably attain a constant 35 fps anyway. otherwise, entering the win95 era you would expect devs not to cap their game logic to a certain fps, since unlike DOS you couldn't know what the user was running at, although there were definitely games that still expected 30 fps max and would run too fast above that.
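
doom's actual source ties its logic to 35 hz "tics"; this is just a generic python sketch of that fixed-timestep idea, not doom's code - the names are made up for illustration:

import time

TICRATE = 35  # logic steps per second - half of VGA's 70 hz refresh

def run(update, render, total_tics=TICRATE):
    # fixed-timestep loop: logic advances in whole 1/35 s "tics", so game
    # speed stays constant regardless of how fast frames can be rendered
    start = time.monotonic()
    done = 0
    while done < total_tics:
        due = int((time.monotonic() - start) * TICRATE)  # tics owed so far
        while done < min(due, total_tics):
            update()   # one deterministic 35 hz tic
            done += 1
        render()       # draw as often as wall time allows

tics = []
run(update=lambda: tics.append(1), render=lambda: None)
print(len(tics))  # 35 after one second of simulated time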

Reply 7 of 12, by jakethompson1

Rank: Oldbie
eddman wrote on 2025-07-16, 22:58:

I understand not having too many refresh rates as it would just bloat the standards, but I don't understand why they chose what they chose. Why go with 75 and not, say, 74 instead?

I only understand a bit of this because of messing with old XFree86 recently and the video timing howto (https://tldp.org/HOWTO/XFree86-Video-Timings- … OWTO/specs.html)

The relevant numbers in a Modeline are HTotal and VTotal. These are not just the resolution but include extra time for non-displayed "pixels": the blanking interval, front porch, back porch, sync pulses, etc.
Basically old cards only have a discrete set of dot-clocks (in MHz).
The horizontal frequency is the dot clock divided by HTotal, and old monitors can often only handle one or a small set of discrete ones (such as 31.5 kHz, 35.15 kHz, and 35.5 kHz per xf86config) too.
HTotal has to be a multiple of 8 (usually?).

So a standard VGA card has only 25.175 and 28.32 MHz dot clocks on the card.
That gives two possible HTotal values that maintain HTotal x 31.5 kHz = dot clock: 800 and 900.

So XFree86's default 640x480 @ 60 Hz uses a dot clock of 25.175 MHz and an HTotal x VTotal of 800x525; that maintains 25175000/800 = 31468.75 Hz HorizSync and produces 25175000/800/525 = 59.94 Hz VertRefresh.
720x400 text mode and friends go along with the 28.32 MHz clock.
800x600 @ 56 Hz uses a dot clock of 36 MHz and maintains a HorizSync of 35.15 kHz, with an HTotal x VTotal of 1024x625 producing 56.25 Hz. Looks like 35.5 kHz went with 1024x768 @ 87 Hz interlaced.
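
As a quick sanity check of that arithmetic, plain division in Python reproduces all three modes (the 900x449 totals for 720x400 are the standard VGA text-mode values):

# HorizSync = dot clock / HTotal; VertRefresh = HorizSync / VTotal
modes = [
    ("640x480 @ 60", 25_175_000,  800, 525),
    ("720x400 @ 70", 28_322_000,  900, 449),
    ("800x600 @ 56", 36_000_000, 1024, 625),
]
for name, dotclock, htotal, vtotal in modes:
    hsync = dotclock / htotal
    print(f"{name}: HorizSync {hsync / 1000:.2f} kHz, "
          f"VertRefresh {hsync / vtotal:.2f} Hz")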

It's interesting that the padding added to the displayed resolution to obtain HTotal often makes it land on the displayed width of the next resolution up... not sure if there was some circuitry reason for that.
Back to the question: in this restricted environment, where the dot clock and the ratio of the dot clock to HTotal are constrained to discrete values, and VTotal correspondingly has to sit within some restricted range to maintain the desired aspect ratio and blanking/porch/sync timings, some refresh rate naturally falls out. So the question is: did they pick a desired refresh rate and work backward, or pick the dot clock and HorizSync rate and then determine what refresh rates were possible? Were particular dot clocks or HorizSync rates desirable for physics or economics reasons, like the availability of oscillators at that frequency?

I'm not sure what dot clocks or horizontal sync frequencies they may have wanted to avoid either; for example, the Cirrus 542x data book suggests avoiding frequencies close to the memory frequency for interference reasons, and if you are close to the memory frequency, there is an option to just use it as the dot clock rather than trying to generate a second, nearby one.

Reply 8 of 12, by 7F20

Rank: Member
eddman wrote on 2025-07-16, 22:58:

I understand not having too many refresh rates as it would just bloat the standards, but I don't understand why they chose what they chose. Why go with 75 and not, say, 74 instead?

In general, there were some pretty big constraints around the available bandwidth and pixel clocks back in those days, and they also had to make the refresh rates work with the rest of the display timing (visible resolution, porches, sync pulse). But when you are talking 75 vs 74, that's likely just a choice they made because it had to be made and it felt like a nice increment. It's not impossible that 75 looked cleaner with the math; you can always go play with a modeline calculator and see the actual effects of altering small values of a resolution:

https://tomverbeure.github.io/video_timings_calculator

https://arachnoid.com/modelines/index.html

I think if you screw around with those a bit, you'll get a better understanding of just how much a small change can make, and that there were so, so many different hardware considerations that things sometimes chose themselves; but when they didn't, it was really just a decision.
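
In the same spirit as those calculators, here is a trivial Python sketch of my own (not either site's method) showing how little the required dot clock moves between 74 and 75 Hz once the totals are fixed:

# with blanking totals held fixed, dot clock scales linearly with refresh:
# dotclock = HTotal * VTotal * refresh (totals are the standard 640x480 ones)
HTOTAL, VTOTAL = 800, 525
for refresh in (74, 75):
    print(f"{refresh} Hz needs a {HTOTAL * VTOTAL * refresh / 1e6:.2f} MHz dot clock")
# 74 vs 75 Hz differ by only ~0.42 MHz, and 75 Hz happens to land on an
# exactly round 31.50 MHz - which may be part of why it "looked cleaner"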

Reply 9 of 12, by rmay635703

Rank: Oldbie
eddman wrote on 2025-07-16, 21:43:

How and why were certain refresh rates chosen, especially in the DOS era, e.g. 70, 72, 85, etc.? They look rather arbitrary; why not 74 or 86 instead? Are there any technical reasons hardware developers ended up with those specific values?

EDIT: I'm not asking why there aren't more of them. I'm asking why those few, specific values were selected.

Go back to early Macintosh screens and there were all sorts of strange refresh rates between 60 and 70 Hz.

72 Hz was a very common upgrade in the DOS era, even being BIOS-selectable on certain early-90s DOS machines.

70 Hz was an artifact of the fixed-frequency nature of the original VGA screen (aka fewer scanlines for text means a higher refresh rate for the same bandwidth).
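
The arithmetic behind that, as a one-line-per-mode Python check (the 525 and 449 line totals are the standard VGA ones):

# fixed-frequency VGA: refresh = horizontal line rate / total lines per frame
HSYNC = 31_469  # Hz, the single horizontal rate of original VGA
print(f"640x480 graphics: {HSYNC / 525:.2f} Hz")  # ~59.94
print(f"720x400 text:     {HSYNC / 449:.2f} Hz")  # ~70.09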

Memory bandwidth limitations and marketing were the main drivers of what the refresh rates were.
As an example, 87 Hz interlaced was very common from 1985 to 1997 as a result of bandwidth limitations (and it was an early industry standard, via the IBM 8514).
Likewise, 56 Hz at 800x600 was universal on old multisync cards and screens solely because of the bandwidth of commodity components.

75 Hz and above were just because, and aligned to marketing.

Reply 10 of 12, by keenmaster486

Rank: l33t

>call crystal manufacturer
>say "do you have 70Hz crystals"
>"sorry sir all we have is 72Hz crystals"
>uh I guess that's fine

World's foremost 486 enjoyer.

Reply 11 of 12, by SquallStrife

Rank: l33t

60Hz came from CGA being NTSC TV-adjacent, and we just rolled with it.

70Hz on VGA was a clever way to marry up legacy compatible video modes with fixed scan 31kHz monitors.

Everything after that is basically arbitrary, and hasn't mattered for a long time. More = smoother motion.

VogonsDrivers.com | Link | News Thread

Reply 12 of 12, by myne

Rank: Oldbie

Yep.

Gotta remember TV came first.
Early home computers almost all plugged directly into TVs.
Monitors were built by... You guessed it! TV manufacturers.

Almost everything in human existence has some form of limitations for fitting into a legacy environment.

Trains are the width they are because Roman roads, and especially bridges, were the width they were; and the bridges were the width they were because two-abreast teams of mules/horses/oxen were more controllable and practical than single file or more than two abreast.

So the more or less global standard for trains is more or less directly linked to the width of 2 fat oxen.

I built:
Convert old ASUS ASC boardviews to KICAD PCB!
Re: A comprehensive guide to install and play MechWarrior 2 on new versions on Windows.
Dos+Windows 3.11+tcp+vbe_svga auto-install iso template
Script to backup Win9x\ME drivers from a working install
Re: The thing no one asked for: KICAD 440bx reference schematic