VOGONS


CRT vs LCD?

Reply 21 of 68, by kixs

Rank: l33t
Attachment: newstuff.jpg (147.98 KiB)

I only have LCDs on my main desk (no space), but I also own several CRTs, from small 12" up to 22".

At the moment this is my CRT list:

- 12" mono VGA (one TVM and one Dell)
- 13" color VGA Siemens
- 15" Sony Trinitron 100SX
- 17" Philips (high end, it does 1600x1200 too)
- 17" Samsung 763MB (pictured)
- 22" Mitsubishi Diamond Pro 2070SB (pictured)

I believe the 15" Sony will be best for games up to 1997. Then some 17" and 22" after 1999.

Requests are also possible... /msg kixs

Reply 22 of 68, by kixs

Rank: l33t
Stojke wrote:

Can anyone suggest some high-end CRTs?

High end... what do you mean by that exactly? The 22" Mitsubishi 2070SB I have is the best of the best in my book 😁 and I had lots of high-end CRTs back in the day... But it's not really recommended for low-res games. In DOS you get very prominent scan lines.

Requests are also possible... /msg kixs

Reply 24 of 68, by kixs

Rank: l33t

What will you use it for?

For low-res games and DOS, look for a good 15-17" Trinitron (Sony, Mitsubishi, Iiyama). For Windows and hi-res games, a 19-22" Trinitron (Sony, Mitsubishi, Iiyama). I like the almost-flat screen of the Trinitron tube. There are also good non-Trinitron screens (Samsung, Nokia).

Because any CRT monitor will be at least 10 years old, you'll have to see it in person to verify that the quality of the screen is still good (not too dim or too bright, geometry, focus...). I've seen many high-end CRTs that had a very poor image after so many years.

Requests are also possible... /msg kixs

Reply 25 of 68, by Stojke

Rank: l33t

Because I have a few LCD screens that I use for DOS and Windows apps/games, I will use it as a classic everyday monitor now and then. I love the image quality of CRT screens, plus I can throw stuff at it when I'm pissed and nothing breaks 🤣

Pretty much, if it suits my eyes after a while of usage, it's good. The monitors around here are usually either in very good shape or clearly lacking in picture quality.

Note | LLSID | "Big boobs are important!"

Reply 26 of 68, by Logistics

Rank: Oldbie
SquallStrife wrote:

The analogue revival in audio isn't about audio fidelity though, it's about the tangible experience of owning and using records. I'd wager that most of the people buying vinyl today will take it home and play it on a cheapo Chinese USB turntable.

People chasing audio fidelity and "pure-ness" are into Blu-Ray Audio and SACD, very much digital.

Why are they happy with low-quality LCDs? Well, if you watch a Blu-Ray movie of filmed content (i.e. not animated), it actually doesn't look terrible on most non-bargain-basement screens under usual viewing conditions (naturally lit room, good viewing distance, etc).

Your opinion represents just one of the many views on what an audiophile is. One rule rings true and doesn't allow any of those views to be absolute fact: everyone hears differently!

Appreciation for what one hears is not based on math or specifications. One person will think that the best sound you can get is from a listening rig consisting of all pre-1983 equipment, possibly early '70s or older. Some audiophiles even prefer reel-to-reel tape as their listening medium of choice. It doesn't matter what the frequency response, dynamic range or THD measure at--it's about what one hears.

And as far as that "pure-ness" you mentioned, playing a recording made in the '60s or '70s in analog, meant for playback on analog, may not be considered very pure if it is transferred onto a CD or any modern digital recording medium and played back on your newest digital receiver. On the flip side, a brand-new recording made with digital equipment and transferred onto a digital medium such as CD or Blu-ray disc, possibly with multiple channels in mind, may not be well received by people who use old, analog equipment.

Unfortunately, we live in a world where the majority of people attempting to classify themselves as an audiophile are really trying to MAKE themselves appreciate something because someone else said it's the best, and their enthusiasm is quashed by the inordinate amounts of money they spend chasing the preferred sound of another individual. All one has to do is be enthusiastic about listening to what one loves, not listen to X or Y piece of equipment on Z recording because a modern magazine said, "It's top notch!"

As far as video and CRTs are concerned, my only problem with them is that we all seem to require more desktop space (on the monitor and off) these days, which in turn requires higher resolutions--I don't recall even having an analog out on my newest video card. I would need to build a second system in order to use my big, heavy SGI CRT monitor, or get a video card with analog output, which is becoming scarce. I feel the way CRTs "look" makes them superior for my wants, but I use LCDs out of necessity. But we all SEE differently as well, so what's good for this goose is not necessarily good for that gander.

Reply 27 of 68, by obobskivich

Rank: l33t
Logistics wrote:

As far as video and CRTs are concerned, my only problem with them is that we all seem to require more desktop space (on the monitor and off) these days, which in turn requires higher resolutions--I don't recall even having an analog out on my newest video card. I would need to build a second system in order to use my big, heavy SGI CRT monitor, or get a video card with analog output, which is becoming scarce.

This is a good point too - a number of modern graphics cards either entirely omit analog video out (so you'd have to rely on a converter), or have fairly poor analog output quality. It seems that VGA is finally dying, at least on higher-end machines. I know there are some CRTs that take HDMI/DVI inputs as well, so it's not like CRT must = VGA, but it commonly does. The same has happened with game consoles in the most recent generation - YPbPr/RGB output is absent, and they have transitioned to HDMI. Again, not the end of the world, but to some folks it may seem like it.

But we all SEE differently as well so what's good for this goose is not necessarily good for that gander.

Something else I've noticed over the years, both in audio and video, is that "older is better" or "we don't need no stinkin digital" etc kinds of discussions tend to involve waxing nostalgic about absolute TOTL summit-class flagship equipment as examples up against whatever run-of-the-mill average cabbage equipment of today (I'm not saying anyone here has done this, at least that I've seen thus far). This produces very unfair comparisons. IME high-end gear is high-end gear no matter if it's brand new or twenty years old, and a high end monitor is a high end monitor is a high end monitor - be it CRT, LCD, PDP, OLED, etc. I will say, from what I remember of average cabbage (read: cheap) CRTs and the machines that drove them, the transition to digital connections, digital displays, etc has been a significant improvement in terms of baseline average quality in the last few years (and it's gotten cheaper while doing it). But "the world is alright" or "the world is getting better" doesn't make nice headlines - "everything is going to hell in a handbasket" is a lot more fun to read and write. 🤣

I'm also not saying CRTs are "bad" or anything of the sort. As I said earlier - either can be good, and each has specific strengths and weaknesses that can make one or the other better for your specific application. I think it is unfortunate that due to [whatever factors] we as consumers can no longer make that choice at purchase-time though, and the only option for CRTs (and it would seem this is quickly becoming "non-LCDs") is to purchase them used. While I'm not beating down anyone's door to buy a CRT today, I would certainly welcome more options in the marketplace, and it may even lead to more improvements/innovations on the LCD side of the aisle too. I know a few years ago, when CRTs and LCDs shared the marketplace, LCD-makers tended to tout "CRT-like" features on their monitors, like improved black levels or faster response times, just as CRT-makers tended to tout "LCD-like" features on their monitors, like flat screens or better energy efficiency. Recently though, it seems like LCD displays have stagnated - they only really have to compete with each other, so there's no real challenge to meet; of course they can be as good as themselves. The same thing was becoming true of CRTs prior to the introduction of affordable LCDs. But "post-LCD" is where some of the most revered and special CRTs came from (e.g. the FW900). Competition is good. 😀

Reply 28 of 68, by smeezekitty

Rank: Oldbie

12" mono VGA (one TVM and one Dell)

That's unique. Sounds fun to play with. What does it look like?

This is a good point too - a number of modern graphics cards either entirely omit analog video out (so you'd have to rely on a converter), or have fairly poor analog output quality. It seems that VGA is finally dying, at least on higher-end machines. I know there are some CRTs that take HDMI/DVI inputs as well, so it's not like CRT must = VGA, but it commonly does. The same has happened with game consoles in the most recent generation - YPbPr/RGB output is absent, and they have transitioned to HDMI. Again, not the end of the world, but to some folks it may seem like it.

AMD's latest GPUs don't even have DVI 😠
Even though a lot of people still have monitors with VGA/DVI inputs, they are now expecting people to have DisplayPort.
Kind of ridiculous IMO.

Last edited by smeezekitty on 2015-06-22, 21:15. Edited 1 time in total.

Reply 29 of 68, by dr_st

Rank: l33t

Short of perhaps AMD's latest GPUs, every video card I have seen either had VGA, or at least one of the DVI outputs was DVI-I, which includes DVI-A, which is electrically VGA - meaning you can convert it to VGA with a passive adapter.

Then again, today there are very cheap (even if active) DP-to-VGA converters, so perhaps it's not a big issue after all.
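
To spell out the passive-vs-active distinction, here is a trivial illustrative sketch in Python (the connector facts are just those described above; "passive" means the analog RGBHV signal is already present on the connector's pins, "active" means a DAC has to regenerate it):

    # Which source connectors can feed a VGA monitor with a passive adapter?
    carries_analog = {
        "VGA": True,
        "DVI-I": True,   # digital plus analog pins (the analog subset is "DVI-A")
        "DVI-A": True,   # analog-only DVI, electrically just VGA
        "DVI-D": False,  # digital only
        "HDMI": False,
        "DisplayPort": False,
    }

    for conn, analog in carries_analog.items():
        kind = "passive adapter" if analog else "active converter"
        print(f"{conn:12} -> {kind} to VGA")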

Quality is a different thing. I can easily believe modern video cards don't care much for VGA quality. Bad cables don't help either.

https://cloakedthargoid.wordpress.com/ - Random content on hardware, software, games and toys

Reply 30 of 68, by obobskivich

Rank: l33t
smeezekitty wrote:

AMD's latest GPUs don't even have DVI 😠
Even though a lot of people still have monitors with VGA/DVI inputs, they are now expecting people to have DisplayPort.
Kind of ridiculous IMO.

Which "latest"? The ones that just came out like a week ago? 😕

My 290X has dual DVI-D dual-link, but no DVI-I/VGA output. You can do DP->VGA of course though. I've not actually tried it (none of my monitors actually require it), but from what I've seen on Apple computers that use Thunderbolt/DP->VGA adapters, the quality isn't bad by any means.

FWIR the big reason that manufacturers/OEMs love DisplayPort is that it's royalty free and (in theory/on paper) very flexible (e.g. MiniDP, MST, etc). In practice I have no issues with DP, except that cable quality seems to be outright horrible (VESA actually posted a bulletin about this a while back), and it seems to be relatively common practice to just swap cables over and over until you get one that works - I gave up and just went back to DL-DVI for the time being. 😊

dr_st wrote:

Quality is a different thing. I can easily believe modern video cards don't care much for VGA quality. Bad cables don't help either.

On my newer GeForce cards that have VGA out (GeForce 610 and 660), both offer a single VGA out (vs the typical dual VGA/DVI that cards had for years), and in both cases the image quality (at least plugged into my older CRTs) is noticeably worse than on older cards (e.g. GeForce 8), as is VGA performance (the 610 running SuperScape 3D is actually slower than some cards from the late 90s 😲). I'm not sure if this is consistent across all of the newer nVidia cards with VGA outputs, and I don't have an AMD card from the generation before the 290X (e.g. 280/7900 series, which do have VGA output) to compare (the next newest AMD cards I have are HD 4000 series, which look fine via analog).

Reply 31 of 68, by smeezekitty

Rank: Oldbie

Which "latest"? The ones that just came out like a week ago? 😕

Yes. The Fury-X

My 290X has dual DVI-D dual-link, but no DVI-I/VGA output. You can do DP->VGA of course though. I've not actually tried it (none of my monitors actually require it), but from what I've seen on Apple computers that use Thunderbolt/DP->VGA adapters, the quality isn't bad by any means.

My 7850 has 2xDVI (one is DVI-A), 1xHDMI and 1xDisplayPort. I think that is a fair mix of outputs.
I never actually tried the DVI-I or DisplayPort output. I used the VGA output on my old 6670 at one point and the quality was kinda "meh".

FWIR the big reason that manufacturers/OEMs love DisplayPort is that it's royalty free and (in theory/on paper) very flexible (e.g. MiniDP, MST, etc). In practice I have no issues with DP, except that cable quality seems to be outright horrible (VESA actually posted a bulletin about this a while back), and it seems to be relatively common practice to just swap cables over and over until you get one that works - I gave up and just went back to DL-DVI for the time being. 😊

Yeah, I have nothing against DisplayPort and I support manufacturers including it. I just think they should keep DVI around for people who have slightly older monitors.
There is plenty of empty space on the card for one or even two DVI ports.

Reply 32 of 68, by Stojke

Rank: l33t

We feel things physically, analogue. Devices with digital components are more powerful and flexible. What we need is a good bridge / converter from digital processing to the analogue presentation we receive with our senses.

In my experience CRTs displayed colors much more accurately, but some NEC / Sharp flat screens (LCD/LED/Plasma) I saw were very pleasant to look at.
Quality costs. That's why third-rate devices exist that focus on the single thing missing from the low-end models, and buyers usually go for them because of the higher price and better presentation. One of the interesting examples of this I saw was a set of Sony speakers that had only bass and nothing else. The price was insane, at least 3x the typical price, the speakers had fancy LEDs, and every store was blasting some ghetto gangsta crap that is all bass. When I asked the clerk about their specifications he was clearly uneasy, and his response to my question ("So it's only for show?") was: "Well... yeah".

Note | LLSID | "Big boobs are important!"

Reply 33 of 68, by jwt27

Rank: Oldbie

CRT > LCD.
shadow mask > aperture grille.
Eizo 0.21mm > absolutely everything.

Like this is even worth a discussion anymore. The real question should be, is there anything better than a 0.21mm Eizo?

(Not counting that 0.20mm Eizo - I'm starting to give up hope of ever finding one...)

Reply 34 of 68, by obobskivich

Rank: l33t
smeezekitty wrote:

Yes. The Fury-X

Interesting. I mean, it was only a matter of time I guess - it probably helps to reduce costs, since afaik DP is royalty-free. Although, come to think of it, this wouldn't be the first DP-only card that AMD has released. There was the 5870 Eyefinity6 edition, which was DP-only in order to fit six connectors onto one board, and afaik there have been a few Eyefinity6-style cards released after the 5870 as well. I'm honestly much more welcoming of the 6xMiniDP configuration than of some exotic connector like DMS-59, LFH-60, etc.

My 7850 has 2xDVI (one is DVI-A), 1xHDMI and 1xDisplayPort. I think that is a fair mix of outputs.
I never actually tried the DVI-I or DisplayPort output. I used the VGA output on my old 6670 at one point and the quality was kinda "meh".

My GTX 660 is like that - it's a fine arrangement imho. Personally I didn't see a problem with 2xDVI-I like GeForce 6/7 had either, but I guess they pulled out a RAMDAC at some point.

Reply 36 of 68, by 133MHz

Rank: Oldbie

It depends. For large screen retro console or arcade gaming I'd say presentation monitors like the Mitsubishi MegaView, NEC XM29, or broadcast monitors like the Sony PVM/BVM series. High-end boutique brand CRT TVs from Europe like B&O might be good too.

http://133FSB.wordpress.com

Reply 37 of 68, by ynari

Rank: Member

For gaming on a CRT TV, get a decent Sony Trinitron - if you read retro gaming forums, whilst it's true the Sony BVMs are ultimately the best, they're small, expensive, and present console output in a way that's not representative of (i.e. much better than) what you'd have seen at the time. A decent Sony telly will offer everything except the last 5%.

For CRT monitors, I'd suggest 'whatever is on eBay, close to you'. I have two IBM C220p monitors - high-end, supporting up to 2048x1536 at 75Hz over a DVI-A connector (though it won't actually resolve all the detail at the highest resolution). However, even though I bought them both at around the same time, one remains in a better state than the other. They're lovely, but take up a huge amount of desk space. Even with cards that still support VGA output, it's nigh on impossible to have more than two outputs when two of them are CRT monitors, without using two graphics cards. Not to mention the lack of HDCP.
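
As a rough sanity check on why that top mode pushes the limits, here is a back-of-the-envelope sketch in Python - the blanking fractions are assumptions (roughly GTF-like typical values), not the C220p's actual timings:

    # Approximate dot clock needed for a given mode. Blanking overheads
    # (~35% horizontal, ~4% vertical) are assumed typical values; real
    # timings vary per monitor.
    def approx_pixel_clock_mhz(h_active, v_active, refresh_hz,
                               h_blank=0.35, v_blank=0.04):
        h_total = h_active * (1 + h_blank)  # pixels per line incl. blanking
        v_total = v_active * (1 + v_blank)  # lines per frame incl. blanking
        return h_total * v_total * refresh_hz / 1e6

    print(approx_pixel_clock_mhz(2048, 1536, 75))  # ~331 MHz

That lands close to the 350-400 MHz ceiling of typical late-era RAMDACs and video amplifiers, which is consistent with a card being able to drive the mode while the finest detail goes soft.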

I am using both monitors on a daily basis, however. It's a bit different if you're only using them for retro gaming purposes.

Reply 38 of 68, by kanecvr

Rank: Oldbie
obobskivich wrote:

smeezekitty wrote:

AMD's latest GPUs don't even have DVI 😠
Even though a lot of people still have monitors with VGA/DVI inputs, they are now expecting people to have DisplayPort.
Kind of ridiculous IMO.

Which "latest"? The ones that just came out like a week ago? 😕

My 290X has dual DVI-D dual-link, but no DVI-I/VGA output. You can do DP->VGA of course though. I've not actually tried it (none of my monitors actually require it), but from what I've seen on Apple computers that use Thunderbolt/DP->VGA adapters, the quality isn't bad by any means.

FWIR the big reason that manufacturers/OEMs love DisplayPort is that it's royalty free and (in theory/on paper) very flexible (e.g. MiniDP, MST, etc). In practice I have no issues with DP, except that cable quality seems to be outright horrible (VESA actually posted a bulletin about this a while back), and it seems to be relatively common practice to just swap cables over and over until you get one that works - I gave up and just went back to DL-DVI for the time being. 😊

dr_st wrote:

Quality is a different thing. I can easily believe modern video cards don't care much for VGA quality. Bad cables don't help either.

On my newer GeForce cards that have VGA out (GeForce 610 and 660), both offer a single VGA out (vs the typical dual VGA/DVI that cards had for years), and in both cases the image quality (at least plugged into my older CRTs) is noticeably worse than on older cards (e.g. GeForce 8), as is VGA performance (the 610 running SuperScape 3D is actually slower than some cards from the late 90s 😲). I'm not sure if this is consistent across all of the newer nVidia cards with VGA outputs, and I don't have an AMD card from the generation before the 290X (e.g. 280/7900 series, which do have VGA output) to compare (the next newest AMD cards I have are HD 4000 series, which look fine via analog).

My old Sapphire Toxic R9 290X did DVI to VGA with either a DVI-VGA converter or a DVI-VGA cable. The GTX 980M in my ROG G751 does HDMI to DVI via an adapter cable to feed my AOC e2795Vh - not that the AOC doesn't have HDMI - but for some reason colors look washed out and 16-bit-like on the monitor's HDMI input, so I have to use the old HDMI-to-DVI trick, which looks PERFECT. I also tried a Thunderbolt -> DVI adapter, but that didn't work as well.

Back on topic - whenever I think CRT I remember long warm-up times, bad geometry that's impossible to get right, and issues like one part of the screen being fuzzy while the rest is clear. I don't know why anyone would want to subject themselves to something like that again. Also, for DOS gaming you would be forced to use small CRT monitors for good image quality (which in my opinion are a pain in the neck to use - literally). Believe it or not, games like Dune II and Commander Keen played on my 386 + Tseng ET4000AX look BETTER on my 19" 5:4 Iiyama ProLite B1906S LED-backlit LCD than on my newly restored 21" EIZO CRT. The image is crisp and clear, no fuzziness at all, and the colors are perfect. Of course, things are a little different when using my Samsung 940N - that one likes to pixel-bleed a little at low resolutions, but not as much as most cheap LCDs.
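
For what it's worth, the scaling arithmetic behind that is easy to check - a quick illustrative sketch in Python (the panel resolution and mode list are just the ones discussed here; real scalers also aspect-correct and apply their own filtering):

    # How do common DOS/low-res modes map onto a 1280x1024 (5:4) panel?
    # Non-integer factors force the scaler to interpolate - one source of
    # the "pixel bleed" mentioned above.
    panel_w, panel_h = 1280, 1024
    for w, h in [(320, 200), (320, 240), (640, 480)]:
        sx, sy = panel_w / w, panel_h / h
        kind = "integer" if sx.is_integer() and sy.is_integer() else "interpolated"
        print(f"{w}x{h} -> x{sx:g} horizontal, x{sy:g} vertical ({kind})")

On a 5:4 panel none of the classic 4:3 modes scale vertically by a whole number, so how clean low-res looks ends up depending almost entirely on the quality of the panel's scaler - which would explain the difference between the Iiyama and the Samsung.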

If you get the chance to lay your hands on newer 4:3 / 5:4 LED-backlit TN or IPS LCDs like the Iiyama I mentioned earlier, or Eizo's latest FlexScan S2133 (1600x1200, 6ms) / S1923 (1280x1024, 5ms) IPS panels, you'll see what I'm talking about.

I also believe that the loss of image quality on LCDs occurs in the conversion from digital (GPU) to analogue (VGA output) and back to digital (the LCD's controller). This is most noticeable at low resolutions. Using a cheap Korean LCD (a 19" LG) on a GF4 Ti 4200 over VGA, image quality is mediocre at best - even at the panel's native resolution. Switching to the Ti 4200's DVI port changes things significantly. This is not noticeable on my Iiyama though - pushing the Auto Adjust button has the LCD correctly align the picture, with great results.
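
A toy model of that digital-analog-digital round trip (purely illustrative Python, not any real controller's algorithm - all the numbers are made up) shows why sampling phase matters and what Auto Adjust is fixing:

    import numpy as np

    # One scanline's worth of pixel values, sent as an analog voltage over VGA.
    pixels = np.array([0, 255, 0, 255, 255, 0], dtype=float)
    ovs = 16                              # "analog" oversampling factor
    analog = np.repeat(pixels, ovs)
    # Limited cable/amplifier bandwidth softens the edges a little.
    kernel = np.ones(ovs // 2) / (ovs // 2)
    analog = np.convolve(analog, kernel, mode="same")

    def resample(phase):
        """The LCD controller samples one value per pixel at some phase offset."""
        idx = np.arange(len(pixels)) * ovs + phase
        return np.round(analog[idx])

    print(resample(ovs // 2))  # mid-pixel sampling: values recovered exactly
    print(resample(0))         # sampling on the transitions: smeared in-betweens

When the sampling points sit mid-pixel, the original values come back exactly; when the clock or phase drifts onto the transitions you get in-between "bleed" values. Over DVI the pixel values arrive as numbers and no resampling happens at all, which fits what I see with the Ti 4200.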

Quality CRT monitors I've used: IBM's P275, EIZO's FlexScan T960 and T930, AOC's 19" 9GLR and Samsung's 997MB. I've had the Eizo for about 9-10 years now, and it hasn't deteriorated much image-wise, but it hasn't seen intense use either. The Samsung and IBM have lost focus over the years though. The AOC I got new sometime between 2002 and 2004, I think, and used it up until 2006, when I switched to my first LCD. Then I traded that LCD for the 21" Eizo, which, despite great image quality, was so tiring that I switched back to an LCD. I wholeheartedly recommend the Eizo T960 and the AOC 9GLR.

Last edited by kanecvr on 2015-06-23, 09:16. Edited 4 times in total.

Reply 39 of 68, by jwt27

Rank: Oldbie
kanecvr wrote:

[...] look BETTER on my 19" Iiyama ProLite B1906S LED-backlit LCD than on my newly restored 21" EIZO CRT.

...F9xx series?

Of course low-res material never looks good on any hi-res screen though.