Reply 20 of 20, by lti

momaka wrote on 2025-07-21, 19:38:

Another favorite Dell of mine is my M782 - a 17" from the early 2000s with a Samsung chassis and picture tube. The colors are top-notch and the picture is nice and crisp. Nice bright tube, too. It was a crazy random dumpster find that I carried home by hand for over a mile... but it was totally worth it.

I was about to post that late-model low-end CRTs (the standard throw-in with a consumer or business desktop) tended to have worse colors and a fuzzy picture. 🤣 Those Dell monitors didn't seem that bad except for the E772c. That one was a little fuzzy at 1024x768, but it never got sharper as you lowered the resolution. If anything, it got worse: 640x480 looked like cheap LCD upscaling, even though there was no scaling hardware. Also, the Chunghwa tube had a dark band on the right edge of the front glass and would burn in abnormally quickly.

I'd rather have a mid-1990s CRT, but it's easier to find one of those low-end business CRTs and make the adjustments it rarely seemed to get from the factory. Since I work with some younger people now, I'm realizing that some people didn't see (or don't remember) the better monitors until they were so old that the tubes were worn out.

Compaq/HP had some bad ones, like the V55. I still remember that model because it was extremely common in schools and businesses. Depending on which graphics card you had, the picture either had ghosting or was so badly out of focus that text was almost impossible to read. I seemed to be the only person who had trouble reading that blurry text, and that was back when I had good vision (20/20 all the way through high school, then suddenly needing glasses near the end of college). The 7550 was cheap and didn't seem to last long, but when (or if) they worked, the picture quality seemed about average.

momaka wrote on 2025-07-21, 19:38:

Also, I don't want to say that CRTs use too much energy, because compared to modern GPUs these days, they are almost like a fart in the wind. Nevertheless, if you've got one in a very small room that gets hot during the summer, expect it to get even hotter with a CRT. In essence, they don't really use more than 70-90W during normal gaming use (rarely jumping over 100W). But in a tiny room, you *will* notice it compared to a thin modern LED-backlit LCD that sips less than 1/5 of that power. Of course, if you're using your CRT on a setup with some power-hungry SLI pig-rig, then again, the CRT won't be the big stinker in terms of heat production.

That reminded me of when the middle school I went to decided to add another computer lab by putting up walls in a corner of the library. Unfortunately, they picked a corner with no HVAC vents and loaded this tiny room with Dell OptiPlex GX280s and those M782 monitors you mentioned. Someone put a fan in the room, but it just made things worse by blowing the hot air around.

I think my modern desktop has abnormally high idle power consumption, so I'm feeling it right now. It's in a room next to the garage, and the air conditioner isn't keeping up with the heat wave that started in 2022 and never went away.

momaka wrote on 2025-07-21, 19:38:

Oh, and one more benefit / cool factor of CRTs that I forgot to mention: the smell from their back when they get hot and the degaussing noise. I mean, that itself gives me an instant nostalgia trip. 🤣

There's also the static noise when the HV comes on. Now that my only CRT is a worn-out one sitting in the same room as my modern stuff (a hot and brightly-lit room, since "60W replacement" LED bulbs put out 800 lumens instead of the 600 of a real incandescent), I have to go watch those arcade monitor repair videos for the nostalgia hit.