First post, by MattRocks
- Real-world evidence is expected to refine conclusions.
- If you’re looking for a quick definitive answer, this thread may frustrate you.
Note: In historical discussions, evidence is assembled from records and patterns rather than controlled experiments.
Laptops in the 1990s all used LCD screens.
By the mid-1990s it was not unusual for reviewers to expect laptops to run real games. I found this in a 1999 review of the Asus F7400:
"if you ever have any urge to kill something and watch it splat when you are in the train then get this notebook."
Until about 1996, reviewers said 30 fps was good enough; from then until about 1998, they said 60 fps was good enough.
Why? Had human brains changed, or had computer screens changed?
It turns out the screens changed: LCD pixels responded slowly enough that mid-1990s panels stopped looking better above about 30 fps, and late-1990s panels stopped looking better above about 60 fps.
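As a back-of-envelope sketch of that point: once the pixels take longer to settle than one frame lasts, extra frames just smear together. The response times below are illustrative guesses chosen to line up with the ~30 and ~60 fps ceilings, not spec-sheet figures for any real panel.

```python
# Rough illustration: the highest frame rate a panel can usefully show is the
# rate whose frame period still exceeds the pixel response time.
def max_useful_fps(response_time_ms: float) -> float:
    """Frame rate at which one frame period equals the pixel response time."""
    return 1000.0 / response_time_ms

# Assumed, illustrative response times (not measured era specs):
for label, rt_ms in [("slow mid-90s panel, ~33 ms", 33.0),
                     ("faster late-90s panel, ~16 ms", 16.0)]:
    print(f"{label}: ceiling around {max_useful_fps(rt_ms):.0f} fps")
```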
Another point is that software rendering was really important to game studios: major titles like Unreal and Motorhead ran smoothly in software at 30 to 60 fps, with hardware acceleration optional.
Was that because laptops didn't have 3Dfx cards?
CRT users could run at much higher fps, but higher frame rates often broke game physics and created exploits, which suggests those games were never designed to be casually run at those speeds.
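To illustrate why frame rate can change physics outcomes (a generic toy example, not any specific game's code): if a game integrates movement once per rendered frame, the result depends on the frame length, so running far above the intended frame rate shifts jump heights and speeds. The 270/800 figures below are arbitrary placeholder values.

```python
def jump_peak(fps: int, jump_speed: float = 270.0, gravity: float = 800.0) -> float:
    """Euler-integrate a jump with dt = 1/fps; the peak height drifts with frame rate."""
    dt = 1.0 / fps
    y, vy, peak = 0.0, jump_speed, 0.0
    while y >= 0.0:
        y += vy * dt        # position advanced once per frame
        vy -= gravity * dt  # gravity applied once per frame
        peak = max(peak, y)
    return peak

for fps in (30, 60, 125, 333):
    print(f"{fps:>3} fps -> jump peaks at {jump_peak(fps):.2f} units")
```

Real engines hit this through different paths (rounding, fixed-point math, per-frame friction), but the general effect is the same: outside the frame rates the designers actually tested at, the simulation stops behaving as intended.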
It seems to me that most 1990s PC games may have been designed around the most limiting displays that mattered - and those displays were not CRTs.
