VOGONS


Reply 80 of 161, by chronoreverse

User metadata
Rank Newbie
RetroPCCupboard wrote on 2026-01-23, 18:16:

Is this thread supposed to be "Rage Bait"? If so, then mission accomplished! If not, then poor OP is taking a beating for trying to start a discussion.

A well-deserved beating, considering the poster's condescending tone while also being wrong a lot of the time and making assertions without basis. I hate that the current state of the world has me wondering whether it's partly LLM-assisted rage bait.

Reply 81 of 161, by MattRocks

User metadata
Rank Member
RetroPCCupboard wrote on 2026-01-23, 18:16:

Is this thread supposed to be "Rage Bait"? If so, then mission accomplished! If not, then poor OP is taking a beating for trying to start a discussion.

The LCD thread topic has been derailed, but "elucidating" the experiences is going well.

No, no - not rage bait! The thread title clearly indicates I absolutely expected my LCD angle to be resisted. The hardcore gamers around me at the time resisted LCDs too, but I had one and nobody complained when I took it to LAN parties. That LCD was not aspirational. The image was nothing like an Iiyama Pro CRT, but it was fine, and people used squishy LCDs a lot more than retro communities like to admit.

I know the non-work desktop-replacement laptops were real: I had one, and the hardcore gamers around me at the time knew exactly what it was. It was just an entry-level PC in a laptop form factor, and as good as any other entry-level PC for playing games. Not a gamer PC, just a non-work PC that could play games. Games just worked. I am genuinely, sadly surprised that the retro community has forgotten them.

Most of what I have said is straight from memories.

And, I made a mistake on the PS1's capacity to be updated, but that's fine - nobody is perfect 😉

Reply 82 of 161, by MattRocks

User metadata
Rank Member
BitWrangler wrote on 2026-01-23, 17:30:

The way in which the early PlayStation was PC-like is in how its CPU, graphics and memory were arranged, such that development (which Sony was also more open with) became more like PC development, rather than the very closed, our-way-or-the-highway development which was the norm at Nintendo and Sega. The PC-ness of having an operating system, driver upgrades and persistent storage didn't come along until later. While this openness may have attracted a lot of developers in the 90s, they found themselves walled in again in the 2000s through different methods, like the online store model and increasingly sophisticated encryption and authentication.

Conceptually, independent devs were broken out of Sega/Nintendo jail and allowed to run free, until Sega and Nintendo suffered from a lack of titles and the devs were rounded up and imprisoned in new compounds. They were somewhat "free" on the PC until the 2010s, when Microsoft began to put more limits and conditions on what was and wasn't allowed to run on Win10++. Hence breakout attempts like the Steam Deck. However, I think we need to see hardware costs plummet before the freer, more open platform cycle repeats, as locked-in but subsidised hardware is going to look more attractive in the short to mid term.

Thank you for sharing. I agree mostly, but maybe the app lockdown started with Windows 8? I have one of Microsoft's Win8 laptops and I can install practically nothing on it.

Sega was in decline long before any of that. You could see in the mid-90s that things were going badly for them. My gut tells me it comes down to trust, and trust is historically earned. People stopped buying the Dreamcast when the PS2 was announced - they didn't even wait for the PS2 to ship! That tells me that Sony earned trust, Sega squandered trust, and it happened the generation before.

Reply 83 of 161, by st31276a

User metadata
Rank Member

What is a laptop other than a lower powered, more restricted desktop with a shitty screen?

Games in the 90’s were written to run “at all” on what was out there. The fact that 3d stuff worked at all back then… it is a work of art.

Nobody I know played on lcd’s (basically laptop screens at the time) in the 90’s because it was a smudgefest.

Nobody bought a laptop as a first and only computer back then either. You had a desktop and got the laptop because you needed something on the go and accepted the compromises that came with it. If you played the latest games, you obviously chose the desktop because it was much faster and nicer to operate for hours on end.

Framerates increased as faster hardware became available.

Reply 84 of 161, by BitWrangler

User metadata
Rank l33t++

The inflection point came in about 1992/1993, when a laptop as the only computer began to make sense for some. But some professions - journalists etc. who were constantly on the go - began to prefer laptops as soon as they existed, and maybe saw little point in a desktop they never sat at. For text work, you'd maybe prefer to stare at a laptop for hours; CRTs were extremely hard on the eyes.

Multimedia lagged behind on the laptop class and didn't really take off until about 1996, hence the huge lack of 486 machines with a sound card or CD-ROM - they're easier to find as slow Pentiums. Around 1995/1996 the workstation class had begun to take shape: something like $10k could get you a Pentium Pro 200 in a laptop (just about - I think that was the base price, and it could be configged up to $20k), though not off the shelf at CompUSA or RadioShack. At a component manufacturer I interned at in 1993, the execs were set up with Compaq laptops and a permanently wired docking station on their desks - no desktop PC.

Anyway, it's all a matter of degree. Ordinary families were probably finding it hard to scrape up the "7-year-old used car" price of a branded desktop setup, never mind the "current model off lease, or cheapest new car" kind of price that laptops went for until prices started coming down from the mid-90s. So for every hundred people ahead of the curve there were millions with lower budgets, requirements and expectations.

Unicorn herding operations are proceeding, but all the totes of hens teeth and barrels of rocking horse poop give them plenty of hiding spots.

Reply 85 of 161, by st31276a

User metadata
Rank Member

A good crt at 85Hz was nice to look at for me.

The thing about the early lcd’s was the ridiculous viewing angles and the way the colours distorted when you did not aim your face’s normal vector exactly right. That was quite tiring.

Interesting take on text work on the go. I’m from deep rural and was in school in the 90’s - laptops were not things you saw everywhere.

My father had a Twinhead DX2-50 with 4MB with a dual scan colour screen. We later upgraded it to 8MB to run win95. That was an on the go machine, he always worked on the DX4-100 back at home on the crt in those days.

Reply 86 of 161, by MattRocks

User metadata
Rank Member
st31276a wrote on 2026-01-24, 20:44:

A good crt at 85Hz was nice to look at for me.

The thing about the early lcd’s was the ridiculous viewing angles and the way the colours distorted when you did not aim your face’s normal vector exactly right. That was quite tiring.

Interesting take on text work on the go. I’m from deep rural and was in school in the 90’s - laptops were not things you saw everywhere.

My father had a Twinhead DX2-50 with 4MB with a dual scan colour screen. We later upgraded it to 8MB to run win95. That was an on the go machine, he always worked on the DX4-100 back at home on the crt in those days.

Before Internet gaming, multiplayer on LCDs was impossible because two players cannot physically occupy the same space directly in front of one small screen. In those days we had LCDs on single-player handheld game units, on the back of airplane seats (chess or movies), on our calculators and watches, etc. Laptops were rare. LCDs were specialised.

The Internet changed things because, from that moment, multiplayer was one head fixated directly in front of one screen. The problems were then more nuanced: slow pixel response (visible ghosting), motion blur, inconsistent backlighting, and cost. Those barriers were coming down during the mid-90s, and mainstream affordability changed in the late 1990s.

By 1999 it was less about technical limitation and more about social expectation. Reputations linger. Dads would say something disparaging but ten years out of date, such as that 120 Hz reduces eye strain or that LCDs have limited viewing angles. Hardcore competitive gamers could muster true technical reasons why high-end CRTs gave them an edge (bright colours, flat glass, anti-glare coatings, and high refresh rates), but they were a small minority among mass-market gamers. And when competitive gamers were talking about a CRT gaming advantage, they were talking about an iiyama Pro (or similar) - not a cheap mass-market CRT with curved glass, screen glare, flicker, and eye strain!

Logically, the LCD had to be working fine before social expectations changed - and that's simply repeating a basic rule that shows up everywhere in tech adoption! If it was socially acceptable in 2001, then by definition the real-world technical inflection had already come and gone.

For me, in the late '90s, I was playing on CRTs too much - my eyes were twitchy and painful. It felt like a nerve above my cheeks was permanently and continuously flexing, and my eyes were dry. An optician explained this was caused by too much screen time; in other words, I was suffering from self-inflicted stupidity. Changing my gaming habits was hard, but my physical need to recover was non-negotiable, so I simply could not justify buying another CRT no matter how much peers insisted the iiyama was the best value for gaming.

So, I wasn't an early adopter out of cleverness, I was an early-adopter out of stupidity! But, what I immediately discovered is that my cheap LCD was perfectly usable for games. I played well enough and my eyes recovered. It was actually less suitable for work because of dead pixels.

Reply 87 of 161, by MattRocks

User metadata
Rank Member

Unfortunately the 1990s sits in an awkward era before Internet archives matured. The primary source evidence below is from 2003, not 1999.

For another thread, @zuldan pulled out this December 3rd, 2003 broadcast of a LAN party. It shows LCDs and laptops mixed in among CRTs and ATX towers, picked up as background noise without commentary, suggesting that by 2003 their presence was expected.

I dated the footage to 2003 from the lottery ball numbers in the news ticker. Archived footage: https://www.youtube.com/watch?v=J1LF4GScs7U

Last edited by MattRocks on 2026-01-29, 23:19. Edited 1 time in total.

Reply 88 of 161, by Putas

User metadata
Rank Oldbie

Some opted for them because they were easier to carry. Also, the 5:4 format made them ill-suited for computer entertainment.

Reply 89 of 161, by Jo22

User metadata
Rank l33t++
MattRocks wrote on 2026-01-24, 21:16:

For me, in the late '90s, I was playing on CRTs too much - my eyes were twitchy and painful. It felt like having a nerve above my cheeks was permanently and continuously flexing, and my eyes were dry. An optician explained this was caused by too much screen time. In other words I was suffering from self-inflicted stupidity. Changing my gaming habits was hard, but my physical need to medically recover was non-negotiable so I simply could not afford to buy another CRT no matter how much peers insisted the iiyama was the best value for gaming.

I think I can relate to that.
I've had dry eyes after sitting in front of my 14" IBM PS/2 monitor for too long.
It also made me feel dizzy, I think.

Interestingly, playing NES or SNES on my Commodore 1702 monitor (PAL) didn't cause that much eye pain.
Which is strange, because both use progressive scan (VGA is non-interlaced, the consoles only use half the lines to avoid interlacing).

The only notable differences are viewing distance, 50 Hz vs 60 Hz (70 Hz), and lighting (light bulbs flickering at 50 Hz).
Maybe it was the conflict of having 50 Hz light bulbs and a 60 Hz monitor?
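Here's a rough back-of-envelope way to look at that guess - purely illustrative numbers, since whether a beat is actually visible depends on the lamp type, the phosphor and the viewer:

```python
# Lamps on 50 Hz mains modulate their light at roughly twice the mains
# frequency (~100 Hz). If the monitor's refresh isn't locked to that,
# the two rates can produce a slow visible "beat". Illustration only.
lamp_flicker_hz = 2 * 50  # ~100 Hz light modulation on 50 Hz mains

for refresh_hz in (50, 60, 70, 85):
    # Compare the lamp flicker against the refresh rate and its 2nd harmonic;
    # the smallest difference is the slowest (most noticeable) candidate beat.
    beats = sorted(abs(lamp_flicker_hz - n * refresh_hz) for n in (1, 2))
    print(f"{refresh_hz:2d} Hz screen under ~100 Hz lighting -> "
          f"beat candidates {beats} Hz (0 = locked to the mains)")

# A 50 Hz screen on 50 Hz mains comes out locked (0 Hz beat), while a 60 or
# 70 Hz VGA monitor leaves a low-frequency difference flickering away.
```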

Years later, my hearing became overly sensitive and I couldn't tolerate much noise anymore.
It went so far that I had to build a power-efficient, fan-less PC (fan-less PSU, even).

MattRocks wrote on 2026-01-29, 17:33:

Unfortunately the 1990s sits in an awkward era before Internet archives matured. The primary source evidence below is from 2003, not 1999.

The Wayback Machine opened in 2001 or so, I think.
So it makes sense that before that date there weren't many volunteer contributors yet, I guess.

But still, there's hope. Pictures or videos taken on VHS by amateurs might surface eventually.

Also, years before YouTube, there had been video tutorials/reviews/ads put on cover CDs (PC/video game magazines).
These CDs date back to the early 90s, when AVI (Video for Windows) and MOV (QuickTime) files were still common.
Average video resolution was 160x120 to 320x240 pixels back then.

Here are some examples:
https://m.youtube.com/@PCPlayerforever/search … =%20CD%20Player

MattRocks wrote on 2026-01-29, 17:33:

I dated the footage to 2003 from the lottery ball numbers in the news ticker. Archived footage: https://www.youtube.com/watch?v=J1LF4GScs7U

Ah yes, I remember 2003! That was one year before XP SP2 came out.
Most people I knew were still using Windows 98SE and I had to convince them to give XP a chance. 😁

Btw, here's an old YouTube video I do remember watching.
It shows a grayscale LCD monitor from 1990.

https://www.youtube.com/watch?v=pJvqnpMT8RM

I also remember that back in the 90s there were individuals who converted LCD panels from used/broken laptops into stand-alone VGA monitors!
The idea of having a slim, flicker-free PC monitor still seemed novel to some.
So there definitely was some interest in using flat screens early on.

Especially since working on DOS/Windows 3.x in the standard VGA mode of 640x480 pels at 60 Hz wasn't the most pleasant.
That was before PCI VGA cards and VESA VBE were very common,
before users could run a DOS utility to change the refresh rate to 85 Hz or something (for DOS/system-wide, not just for Windows through graphics drivers).

At the time, many of the simpler 14" VGA monitors had a limited number of resolutions/frequencies they could sync on.
They had no on-screen display or microcontroller yet, just knobs for adjustments.
So 640x480 at 60 Hz / 800x600 at 56 Hz / 1024x768 at 43 Hz (interlaced) were the typical choices.
Other resolutions required manual re-adjustment every time on such basic monitors.
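As a small aside, the refresh rates of those fixed modes fall straight out of the dot clock and the total (visible plus blanking) scan dimensions: refresh = pixel clock / (h_total x v_total). A rough sketch, with the commonly cited timing figures quoted from memory:

```python
# Approximate standard timings (from memory) for the three "safe" modes.
modes = [
    # name,                           pixel clock (Hz), h_total, v_total
    ("640x480 @ 60 Hz (VGA)",         25_175_000,        800,     525),
    ("800x600 @ 56 Hz (VESA)",        36_000_000,       1024,     625),
    ("1024x768 @ 43 Hz (interlaced)", 44_900_000,       1264,     817),
]

for name, pixel_clock, h_total, v_total in modes:
    refresh = pixel_clock / (h_total * v_total)  # frames per second
    print(f"{name:32s} -> {refresh:5.2f} Hz")

# Prints roughly 59.94, 56.25 and 43.5 Hz - which is why those three were
# the usual choices on fixed-frequency 14" monitors with no OSD to help.
```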

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 90 of 161, by theelf

User metadata
Rank Oldbie

I use CRT at 60hz always, never feel nothing in my eyes.... and i did heavy programming in 90s. In fact, hercules 50hz was totally fine to me, i use a green screen for text until first 2000s, never feel any difference when i use LCD beside the resolution and horrible screen angle of lcd

Never understand the people hz stuff, my eyes are perfect and for me 50+hz are totally fine to work for long hours, i dont care tft or crt even now im getting old

Reply 91 of 161, by MattRocks

User metadata
Rank Member
Putas wrote on 2026-01-29, 20:07:

Some opted for them because they were easier to carry. Also, the 5:4 format made them ill-suited for computer entertainment.

You need to expand on that because 5:4 is not specific to any technology; desktops were equipped with 5:4 before laptops.

The Hz topic is really interesting because the analog section of the signal path is long (from the DACs in the computer to the phosphors and light at the screen surface), meaning more real-world interference. The mains power frequency could be 50 Hz (EU) or 60 Hz (US), this could sync with or disrupt the analog section, and the actual real-world frequency varies by street. The phosphors in a CRT were engineered to decay over a length of time so that they would blend into the next update, and that length of time might be well under a millisecond or up to tens of milliseconds, depending on the phosphor. So one screen in one region in one year could affect a person's eyes differently to another screen in another region in another year. And that's before considering that each set of eyes is genetically distinct.
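To make the decay point concrete, here's a rough sketch comparing the frame period at a given refresh rate against a few ballpark persistence figures - the persistence numbers are purely illustrative, real values vary a lot by phosphor type:

```python
# Compare how long one frame lasts with how long the phosphor keeps glowing.
# Persistence figures below are rough order-of-magnitude assumptions only.
phosphor_persistence_ms = {
    "short persistence":  1.0,    # assumed ~1 ms, typical colour-CRT ballpark
    "medium persistence": 10.0,   # assumed ~10 ms
    "long persistence":   100.0,  # assumed ~100 ms, radar/monochrome style
}

for refresh_hz in (50, 60, 70, 85):
    frame_ms = 1000.0 / refresh_hz
    print(f"\n{refresh_hz} Hz -> one frame lasts about {frame_ms:.1f} ms")
    for name, persist_ms in phosphor_persistence_ms.items():
        # Fading long before the next refresh reads as flicker/pulsing;
        # lingering well past it reads as smearing on motion.
        effect = ("fades before the next frame (flicker)"
                  if persist_ms < frame_ms
                  else "carries into the next frame (smear)")
        print(f"  {name:20s} ~{persist_ms:6.1f} ms: {effect}")
```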

Last edited by MattRocks on 2026-01-29, 20:51. Edited 1 time in total.

Reply 92 of 161, by the3dfxdude

User metadata
Rank Oldbie
MattRocks wrote on 2026-01-29, 17:33:

For another thread, @zuldan pulled out this December 3rd 2003 broadcast of a LAN party. That 2003 broadcast shows LCDs and laptops mixed in among CRTs and ATX towers. The 2003 broadcast picked these up as background noise without commentary, suggesting by 2003 their presence was expected.

Are you trying to prove that people used LCDs? Yes people used LCDs for a long time. I guess I was an "early" mid-00s adopter because people at the time pointed out their flaws when seeing the screen I bought. Well guess what?-- I am still using the same Samsung screen right now typing this, and yes, it still has the same flaws versus a CRT.

Expected to be a presence since no one commented about it? C'mon, the technology is not new by any means, but why assume things from that? Everyone there, CRT and LCD users alike, was just there to play games. As far as gaming goes, LCDs just were not typical, or not in favor, and the reasons were valid - even that scene shows it. LCD technology has changed a lot in the last 25 years, and companies didn't want to make CRTs anymore, blurring monitor production into flat-screen TV manufacturing to get the job done and make some more dollars back in the process. So everyone was going there no matter how long it took.

Why did I get an LCD? Yes to have something lighter than my desktop CRT at 17" to carry to a LAN party and better than a laptop. But again, it wasn't as good as my CRT, but whatever. Eventually everyone threw out their CRT to save some space, and this is apparently why we are having this conversation. But LCDs were not necessarily accepted or as useful or better, it was just going to happen eventually. If you see my desk now, with how many screens can fit on it, like probably many of you, it's easy to understand why. Could I have been one of those people at a LAN party? Sure! But even then I remember I appreciated the @home CRT for other reasons.

Have games been specifically designed for one or the other? Doesn't matter. Everyone eventually adapts to what is working at the time, so games weren't really a factor in adoption. But as demand for LCDs went up, they did get better. I guess only lately they are considered good enough to what once was. In all this, there aint gonna be really anything to conclude here. All this just happened regardless of the opinion or the results of something at any one time.

Reply 93 of 161, by Jo22

User metadata
Rank l33t++
theelf wrote on 2026-01-29, 20:15:

I use CRT at 60hz always, never feel nothing in my eyes.... and i did heavy programming in 90s. In fact, hercules 50hz was totally fine to me, i use a green screen to to text until first 2000s, never feel any difference when i use LCD beside the resolution and horrible screen angle of lcd

Never understand the people hz stuff, my eyes are perfect and for me 50+hz are totally fine to work for long hours, i dont care tft or crt even now im getting old

Hi! Hercules monitors had eye-friendly phosphor-coated screens to compensate for the low refresh, I think.
Especially the green and amber screens (there also was a paper-white type).
The IBM 5151 was the extreme, with its long afterglow (persistence time). Like a radar screen, almost.
There's a cool video on YouTube showing it: https://www.youtube.com/watch?v=BEkDRa--YaY
These TTL monitors also had no sub-pixels; they were truly monochrome (3 levels: off, on, and bright, with the intensity bit/pin used in MDA text mode).
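As a rough illustration of how those three levels fall out of the attribute byte in MDA text mode - simplified and from memory, so treat the table as approximate:

```python
# The MDA only really honours a handful of attribute values; bit 3 is the
# intensity bit that drives the monitor's intensity pin (the third level),
# and bit 7 is blink with the default adapter setup. Simplified sketch.
KNOWN_MDA_ATTRIBUTES = {
    0x00: "non-display (black on black)",
    0x01: "underlined text",
    0x07: "normal text",
    0x0F: "bright (high-intensity) text",
    0x70: "reverse video (dark text on a lit background)",
}

def describe_mda_attribute(attr: int) -> str:
    blink = bool(attr & 0x80)                      # bit 7: blink
    base = KNOWN_MDA_ATTRIBUTES.get(attr & 0x7F, "treated like normal text")
    return base + (", blinking" if blink else "")

for attr in (0x07, 0x0F, 0x01, 0x70, 0x87):
    print(f"attribute 0x{attr:02X}: {describe_mda_attribute(attr)}")
```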

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 94 of 161, by MattRocks

User metadata
Rank Member
the3dfxdude wrote on 2026-01-29, 20:31:
MattRocks wrote on 2026-01-29, 17:33:

For another thread, @zuldan pulled out this December 3rd 2003 broadcast of a LAN party. That 2003 broadcast shows LCDs and laptops mixed in among CRTs and ATX towers. The 2003 broadcast picked these up as background noise without commentary, suggesting by 2003 their presence was expected.

Are you trying to prove that people used LCDs? Yes people used LCDs for a long time. I guess I was an "early" mid-00s adopter because people at the time pointed out their flaws when seeing the screen I bought. Well guess what?-- I am still using the same Samsung screen right now typing this, and yes, it still has the same flaws versus a CRT.

Expected to be a presence since no one commented about it? C'mon, the technology is not new by any means, but why assume things by that? Everyone there, CRT and LCD users alike were just there to play games. As far as gaming just was not typical, or not in favor and the reasons were valid, and even the scene then shows it. LCD technology has changed alot in the last 25 years, and that companies didn't want to make CRTs anymore blurring them with flat screen TV manufacturing to get the job done and make some more dollars back in the process. So everyone was going there no matter how long it took.

Why did I get an LCD? Yes to have something lighter than my desktop CRT at 17" to carry to a LAN party and better than a laptop. But again, it wasn't as good as my CRT, but whatever. Eventually everyone threw out their CRT to save some space, and this is apparently why we are having this conversation. But LCDs were not necessarily accepted or as useful or better, it was just going to happen eventually. If you see my desk now, with how many screens can fit on it, like probably many of you, it's easy to understand why. Could I have been one of those people at a LAN party? Sure! But even then I remember I appreciated the @home CRT for other reasons.

Have games been specifically designed for one or the other? Doesn't matter. Everyone eventually adapts to what is working at the time, so games weren't really a factor in adoption. But as demand for LCDs went up, they did get better. I guess only lately they are considered good enough to what once was. In all this, there aint gonna be really anything to conclude here. All this just happened regardless of the opinion or the results of something at any one time.

What I am challenging is the notion that we need a CRT to have a representative retro gaming PC. There are nuances, CRTs were more common, but it's not automatically ahistorical to choose an LCD.

There were generations of CRTs, generations of LCDs, and overlaps where certain LCDs were engineered to emulate certain CRTs - or certain CRTs were engineered to emulate certain LCDs. The big overlap came in the 1990s.

My view is that more thought should go into screen selection.

Last edited by MattRocks on 2026-01-29, 23:21. Edited 1 time in total.

Reply 95 of 161, by theelf

User metadata
Rank Oldbie
MattRocks wrote on 2026-01-29, 21:12:
the3dfxdude wrote on 2026-01-29, 20:31:
MattRocks wrote on 2026-01-29, 17:33:

For another thread, @zuldan pulled out this December 3rd 2003 broadcast of a LAN party. That 2003 broadcast shows LCDs and laptops mixed in among CRTs and ATX towers. The 2003 broadcast picked these up as background noise without commentary, suggesting by 2003 their presence was expected.

Are you trying to prove that people used LCDs? Yes people used LCDs for a long time. I guess I was an "early" mid-00s adopter because people at the time pointed out their flaws when seeing the screen I bought. Well guess what?-- I am still using the same Samsung screen right now typing this, and yes, it still has the same flaws versus a CRT.

Expected to be a presence since no one commented about it? C'mon, the technology is not new by any means, but why assume things by that? Everyone there, CRT and LCD users alike were just there to play games. As far as gaming just was not typical, or not in favor and the reasons were valid, and even the scene then shows it. LCD technology has changed alot in the last 25 years, and that companies didn't want to make CRTs anymore blurring them with flat screen TV manufacturing to get the job done and make some more dollars back in the process. So everyone was going there no matter how long it took.

Why did I get an LCD? Yes to have something lighter than my desktop CRT at 17" to carry to a LAN party and better than a laptop. But again, it wasn't as good as my CRT, but whatever. Eventually everyone threw out their CRT to save some space, and this is apparently why we are having this conversation. But LCDs were not necessarily accepted or as useful or better, it was just going to happen eventually. If you see my desk now, with how many screens can fit on it, like probably many of you, it's easy to understand why. Could I have been one of those people at a LAN party? Sure! But even then I remember I appreciated the @home CRT for other reasons.

Have games been specifically designed for one or the other? Doesn't matter. Everyone eventually adapts to what is working at the time, so games weren't really a factor in adoption. But as demand for LCDs went up, they did get better. I guess only lately they are considered good enough to what once was. In all this, there aint gonna be really anything to conclude here. All this just happened regardless of the opinion or the results of something at any one time.

What I am challenging is the notion that we need a CRT to have a representative retro gaming PC. There are nuances, CRTs were more common, but it's not automatically ahistorical to choose an LCD.

There were generations of CRTs, generations of LCDs, and overlaps where certain LCDs were engineered to emulate certain CRTs - or certain CRTs engineered to emulated certain LCDs.

My view is that more thought should go into screen selection.

No one use LCD for nothing until 2000+ in desktop. No one do programming thinking in LCD for desktop stuff. TVs was CRT and every content was programming thinking in CRT too

LCDs was for portability, business in computers and game for consoles etc

thats all

Reply 96 of 161, by theelf

User metadata
Rank Oldbie
Jo22 wrote on 2026-01-29, 20:35:
theelf wrote on 2026-01-29, 20:15:

I use CRT at 60hz always, never feel nothing in my eyes.... and i did heavy programming in 90s. In fact, hercules 50hz was totally fine to me, i use a green screen to to text until first 2000s, never feel any difference when i use LCD beside the resolution and horrible screen angle of lcd

Never understand the people hz stuff, my eyes are perfect and for me 50+hz are totally fine to work for long hours, i dont care tft or crt even now im getting old

Hi! Hercules monitors had eye-friendly phosphor coated screens to compensate for low refresh, I think.
Especially the green and amber screens (there also was paper-white type).
The IBM 5151 was the extreme, with its long afterglow (persistance time). Like a radar screen, almost.
There's a cool video on YouTube showing it: https://www.youtube.com/watch?v=BEkDRa--YaY
These TTL monitors also had no sub-pixels, they were truely monochrome (3 levels; off/on and bright with intensity bit/pin used in MDA text-mode).

Yes 50hz look a little better on old amber/green screens than VGA ones because of phosphor persistence time, of course scroll look better to read on VGA because of the same. For VGA 60hz for me look perfect, 50hz is ok but not so nice like the old hercules or CRT TVs

I can do programming in a amiga on a CRT TV using 288p at 50hz and never get tired

Reply 97 of 161, by MattRocks

User metadata
Rank Member
theelf wrote on 2026-01-29, 21:45:
MattRocks wrote on 2026-01-29, 21:12:
the3dfxdude wrote on 2026-01-29, 20:31:

Are you trying to prove that people used LCDs? Yes people used LCDs for a long time. I guess I was an "early" mid-00s adopter because people at the time pointed out their flaws when seeing the screen I bought. Well guess what?-- I am still using the same Samsung screen right now typing this, and yes, it still has the same flaws versus a CRT.

Expected to be a presence since no one commented about it? C'mon, the technology is not new by any means, but why assume things by that? Everyone there, CRT and LCD users alike were just there to play games. As far as gaming just was not typical, or not in favor and the reasons were valid, and even the scene then shows it. LCD technology has changed alot in the last 25 years, and that companies didn't want to make CRTs anymore blurring them with flat screen TV manufacturing to get the job done and make some more dollars back in the process. So everyone was going there no matter how long it took.

Why did I get an LCD? Yes to have something lighter than my desktop CRT at 17" to carry to a LAN party and better than a laptop. But again, it wasn't as good as my CRT, but whatever. Eventually everyone threw out their CRT to save some space, and this is apparently why we are having this conversation. But LCDs were not necessarily accepted or as useful or better, it was just going to happen eventually. If you see my desk now, with how many screens can fit on it, like probably many of you, it's easy to understand why. Could I have been one of those people at a LAN party? Sure! But even then I remember I appreciated the @home CRT for other reasons.

Have games been specifically designed for one or the other? Doesn't matter. Everyone eventually adapts to what is working at the time, so games weren't really a factor in adoption. But as demand for LCDs went up, they did get better. I guess only lately they are considered good enough to what once was. In all this, there aint gonna be really anything to conclude here. All this just happened regardless of the opinion or the results of something at any one time.

What I am challenging is the notion that we need a CRT to have a representative retro gaming PC. There are nuances, CRTs were more common, but it's not automatically ahistorical to choose an LCD.

There were generations of CRTs, generations of LCDs, and overlaps where certain LCDs were engineered to emulate certain CRTs - or certain CRTs engineered to emulated certain LCDs.

My view is that more thought should go into screen selection.

No one use LCD for nothing until 2000+ in desktop. No one do programming thinking in LCD for desktop stuff. TVs was CRT and every content was programming thinking in CRT too

LCDs was for portability, bussiness in computers and game for consoles etc

thats all

The record shows LCDs were being sold for PCs in the late 90s. I checked magazines that advertised LCDs as superior to CRTs in the late 90s.

Here's the spec sheet of a 1996 Hitachi LCD sold as Windows 95 plug & play compatible.
https://www.hitachi.com/New/cnews/E/1996/960418A.html

Here's an Eizo LCD manual with setup instructions for Windows 95/98:
https://www.eizoglobal.com/support/db/files/m … umeng/L66um.pdf

I don't believe Hitachi, Eizo, Toshiba and others were selling LCDs to nobody doing nothing. I swapped my CRT for a generic LCD around 1999 and that was used for displaying code (Microsoft VisualStudio, Macromedia Dreamweaver, etc.)

Last edited by MattRocks on 2026-01-29, 23:05. Edited 1 time in total.

Reply 98 of 161, by theelf

User metadata
Rank Oldbie
MattRocks wrote on 2026-01-29, 22:59:
theelf wrote on 2026-01-29, 21:45:
MattRocks wrote on 2026-01-29, 21:12:

What I am challenging is the notion that we need a CRT to have a representative retro gaming PC. There are nuances, CRTs were more common, but it's not automatically ahistorical to choose an LCD.

There were generations of CRTs, generations of LCDs, and overlaps where certain LCDs were engineered to emulate certain CRTs - or certain CRTs engineered to emulated certain LCDs.

My view is that more thought should go into screen selection.

No one use LCD for nothing until 2000+ in desktop. No one do programming thinking in LCD for desktop stuff. TVs was CRT and every content was programming thinking in CRT too

LCDs was for portability, bussiness in computers and game for consoles etc

thats all

The record shows LCDs were being sold for PCs in the late 90s. I checked magazines that advertised LCDs as superior to CRTs in the late 90s.

Here's the spec sheet of a 1996 Hitachi LCD sold as Windows 95 plug & play compatible.
https://www.hitachi.com/New/cnews/E/1996/960418A.html

Here's an Eizo LCD manual with setup instructions for Windows 95/98:
https://www.eizoglobal.com/support/db/files/m … umeng/L66um.pdf

I don't believe Hitachi, Eizo, Toshiba and others were selling LCDs to nobody doing nothing. I swapped my CRT for a generic LCD around 1999. And, I was a computer programmer, so code was definitely being written on LCDs.

No one use TFTs in desktop computers back on time

Reply 99 of 161, by MattRocks

User metadata
Rank Member
theelf wrote on 2026-01-29, 23:05:
MattRocks wrote on 2026-01-29, 22:59:
theelf wrote on 2026-01-29, 21:45:

No one use LCD for nothing until 2000+ in desktop. No one do programming thinking in LCD for desktop stuff. TVs was CRT and every content was programming thinking in CRT too

LCDs was for portability, bussiness in computers and game for consoles etc

thats all

The record shows LCDs were being sold for PCs in the late 90s. I checked magazines that advertised LCDs as superior to CRTs in the late 90s.

Here's the spec sheet of a 1996 Hitachi LCD sold as Windows 95 plug & play compatible.
https://www.hitachi.com/New/cnews/E/1996/960418A.html

Here's an Eizo LCD manual with setup instructions for Windows 95/98:
https://www.eizoglobal.com/support/db/files/m … umeng/L66um.pdf

I don't believe Hitachi, Eizo, Toshiba and others were selling LCDs to nobody doing nothing. I swapped my CRT for a generic LCD around 1999. And, I was a computer programmer, so code was definitely being written on LCDs.

No one use TFTs in desktop computers back on time

The record shows stand alone LCDs were being sold, which means stand alone LCDs were being bought. What were they being bought for?