8k


Reply 40 of 58, by Jasin Natael

Rank: Oldbie

My TVs are 4K at 55" and 60".

On my gaming desktop I'm using a 37" ultrawide at 3440x1440 @ 100 Hz.

My laptop does have a 4K 15.6" screen, but the GTX 1650 can't really drive any games at that resolution; it is very nice for casual usage and videos, however.

I don't see the point of 8K gaming in almost any sector. It is too early.

Even for TVs it is a hard sell for me.

HDR, color accuracy, refresh rate, etc. are FAR more important IMHO.

Reply 41 of 58, by WDStudios

Rank: Member
rmay635703 wrote on 2021-06-24, 12:30:

What I find strange is that newer 4:3 monitors rarely exceed 1600x1280, but in olden times there were 2 common resolutions above that.

I think you mean 1200, not 1280, and there was only one "common" resolution above that: 2048 x 1536. And that's being pretty generous with what "common" means: video cards, Windows, and a FEW games (mostly ones based on the Quake III engine) could support it, but extremely few monitors did, and the ones that did cost something like $4,000.

I know other fullscreen resolutions (1792 x 1344? 1920 x 1440?) existed, but they weren't "common". No monitors were ever made in these sizes to my knowledge, and I can't name any games that offered them. Wikipedia denies that they ever existed:

[Image: Vector Video Standards resolution chart]

Since people like posting system specs:

LGA 2011
Core i7 Sandy Bridge @ 3.6 GHz
4 GB of RAM in quad-channel
Geforce GTX 780
1600 x 1200 monitor
Dual-booting WinXP Integral Edition and Win7 Pro 64-bit
-----
XP compatibility is the hill that I will die on.

Reply 42 of 58, by appiah4

Rank: l33t++

I had a Philips Brilliance 170P that did 1920x1440 60Hz and I loved it.

Amazing monitor.

I don't have the intention of collecting CRTs, but if I ever come across a 190P, I will buy it, storage space be damned.

Retronautics: A digital gallery of my retro computers, hardware and projects.

Reply 43 of 58, by rmay635703

Rank: Oldbie

https://arstechnica.com/civis/viewtopic.php?t=316738

1600x1280 was a thing in both CRTs and LCDs of a particular age.

Many fixed-frequency CRTs ran 1600x1280.

19XX x 1XXX was a more common max resolution than 20xx resolutions back in the day as well.

I say this because I owned cards that supported those resolutions, and my old 21" CRT's documentation said it supported resolutions up to 1900 wide.

Reply 44 of 58, by darry

Rank: l33t++
rmay635703 wrote on 2021-06-26, 03:29:

https://arstechnica.com/civis/viewtopic.php?t=316738

1600x1280 was a thing in both CRTs and LCDs of a particular age.

Many fixed-frequency CRTs ran 1600x1280.

19XX x 1XXX was a more common max resolution than 20xx resolutions back in the day as well.

I say this because I owned cards that supported those resolutions, and my old 21" CRT's documentation said it supported resolutions up to 1900 wide.

I have doubts about 1600x1280 ever having been a native resolution for LCDs.

- CRTs do not have native resolutions and were typically 4:3 (were there any 5:4 ones?). Any CRT capable of 1600x1280 would also have been able to display 1600x1200. If the CRT was 4:3, 1600x1200 would have made more sense to use, but CRT manufacturers were free to advertise any resolution their products could sync to and display. I have seen references to 1600x1280 in ads and in at least one Philips-hosted document.
- There do seem to be references to 1600x1280 timings on the Internet.
- At least one variant of the Sceptre X20 was 1680x1050 according to its manual: https://www.manualslib.com/manual/478801/Scep … ?page=33#manual
- One example from that thread says 1600x1280, but the PDF tech sheet says 1600x1200:
https://web.archive.org/web/20060221014638/ht … 231powervue.asp
https://web.archive.org/web/20060311233523/ht … VT231CMtech.pdf

I have, however, seen plenty of 1600x1200 native resolution LCDs (and own two of them: a Samsung 204B and a Dell 2007FPB).

I have never heard, based on any reliable source that I could find (manufacturer website or PDF manual/scan), of a native resolution 1600x1280 LCD ever having existed.

I am not saying that 1600x1280 LCDs do/did not exist, I am just saying that I can find no reliable evidence of that ever being the case. Forum posts and manufacturers who contradict themselves in their own docs/websites are not conclusive evidence to me. Maybe they existed as very niche products, or maybe manual writers were confused because 1600x1280 had previously been advertised on CRT monitors, and wrote erroneous information in docs for newfangled LCDs.

If you do have/find evidence, even suspect evidence, of 1600x1280 native LCD monitors and/or panels, please do share.

Reply 45 of 58, by Joseph_Joestar

Rank: l33t
darry wrote on 2021-06-26, 04:40:

- CRTs do not have native resolutions

This is true, but they did have an "optimal resolution".

The manual for my 17" Samsung SyncMaster 795MB states that its optimal resolution is 1024x768 @ 85 Hz. And indeed, its picture is the sharpest when that exact resolution and refresh rate is used. However, it works just fine in lower resolutions and with other refresh rates. A minor scanline effect is visible in 800x600 and 640x480, but I'm personally a fan of that look. Going to 1280x960 and above results in a slightly blurrier image, though it's still tolerable.

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Athlon64 3400+ / Asus K8V-MX / 5900XT / Audigy2
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi

Reply 46 of 58, by darry

Rank: l33t++
Joseph_Joestar wrote on 2021-06-26, 04:50:
darry wrote on 2021-06-26, 04:40:

- CRTs do not have native resolutions

This is true, but they did have an "optimal resolution".

The manual for my 17" Samsung SyncMaster 795MB states that its optimal resolution is 1024x768 @ 85 Hz. And indeed, its picture is the sharpest when that exact resolution and refresh rate is used. However, it works just fine in lower resolutions and with other refresh rates. A minor scanline effect is visible in 800x600 and 640x480, but I'm personally a fan of that look. Going to 1280x960 and above results in a slightly blurrier image, though it's still tolerable.

Indeed, this was due to dot/aperture pitch coarseness, which often rendered higher resolutions impractical before one reached a given monitor's max scanning rate/resolution. The "optimal resolution" was the sweet spot beyond which the picture became blurrier rather than sharper. Ironically, this "optimal resolution" was not always advertised as such by the manufacturer, who sometimes preferred to showcase the maximum resolution the monitor was able to sync to and display. AFAICR, this was especially true in the era of cheap 1024x768 SVGA monitors that could only actually handle that resolution in interlaced mode (1024x768 @ 87 Hz interlaced). Even 800x600 interlaced was a thing on some video cards (Re: Cd rom driver)! I wonder if there were ever any monitors that both handled and topped out at 800x600 interlaced.
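To put a number on it, here is a rough Python sketch of the interlace arithmetic (assuming, as was the convention back then, that the advertised 87 Hz figure is the field rate, not the frame rate):

# An interlaced mode draws odd and even scanlines in alternating fields,
# so a complete frame takes two fields and the effective frame rate is
# half the advertised field rate.

def interlaced_frame_rate(field_rate_hz: float) -> float:
    """Effective full-frame refresh rate of an interlaced mode."""
    return field_rate_hz / 2.0  # two fields per complete frame

for mode, field_rate in [("1024x768 interlaced", 87.0),
                         ("800x600 interlaced", 87.0)]:
    print(f"{mode}: {field_rate} Hz fields -> "
          f"{interlaced_frame_rate(field_rate):.1f} Hz frames")

That halved effective frame rate is also why those cheap interlaced modes flickered so noticeably.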

Reply 47 of 58, by rmay635703

Rank: Oldbie
darry wrote on 2021-06-26, 05:01:

I wonder if there were ever any monitors that both handled and topped out at 800x600 interlaced.

I owned a 1986 20" Mitsumi fixed-frequency color monitor that had been modified for VGA, with manual sync adjustment knobs on the front bezel.

It was originally used at a power plant and became a boss computer monitor in the early '90s.

I got it on disposal, and yes, originally it was spec'd for 800x600 at 43 Hz (aka 87 Hz interlaced).

At the time I could only drive it at 640x480, as I didn't have a card that would interlace 800x600.

Reply 48 of 58, by darry

Rank: l33t++
rmay635703 wrote on 2021-06-26, 15:28:
darry wrote on 2021-06-26, 05:01:

I wonder if there were ever any monitors that both handled and topped out at 800x600 interlaced.

I owned a 1986 20" Mitsumi fixed-frequency color monitor that had been modified for VGA, with manual sync adjustment knobs on the front bezel.

It was originally used at a power plant and became a boss computer monitor in the early '90s.

I got it on disposal, and yes, originally it was spec'd for 800x600 at 43 Hz (aka 87 Hz interlaced).

At the time I could only drive it at 640x480, as I didn't have a card that would interlace 800x600.

That is interesting. Do you have the model number?

Reply 50 of 58, by appiah4

Rank: l33t++
antrad wrote on 2021-06-28, 06:18:

I was thinking of buying a 1440p monitor, but since GPU prices are ridiculous now, I can't buy a GPU to run it.

You don't need one; if you have a GPU that does 1080p fine, AMD FSR has you covered for 1440p in the future.

Retronautics: A digital gallery of my retro computers, hardware and projects.

Reply 51 of 58, by WDStudios

Rank: Member
darry wrote on 2021-06-26, 04:40:

- CRTs do not have native resolutions and were typically 4:3 (were there any 5:4 ones?).

No. 5:4 was originally a SAR (storage aspect ratio) intended to be displayed with rectangular pixels on 4:3 DAR (display aspect ratio) monitors. The idea of displaying a 5:4 SAR image with square pixels on a 5:4 DAR monitor did not come about until after LCDs had displaced CRTs.
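The relation between the three is DAR = SAR x PAR. A small Python illustration of the 1280x1024-on-a-4:3-CRT case (a sketch; the function and worked numbers are just for illustration, not from any spec):

from fractions import Fraction

# DAR = SAR * PAR, so PAR = DAR / SAR. A 1280x1024 image (5:4 SAR) shown
# on a 4:3 DAR monitor therefore implies pixels slightly wider than tall.

def pixel_aspect_ratio(width: int, height: int, dar: Fraction) -> Fraction:
    """Pixel aspect ratio needed to show width x height at a given DAR."""
    sar = Fraction(width, height)  # storage aspect ratio, here 5:4
    return dar / sar               # 1:1 would mean square pixels

print(pixel_aspect_ratio(1280, 1024, Fraction(4, 3)))  # prints 16/15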

Since people like posting system specs:

LGA 2011
Core i7 Sandy Bridge @ 3.6 GHz
4 GB of RAM in quad-channel
Geforce GTX 780
1600 x 1200 monitor
Dual-booting WinXP Integral Edition and Win7 Pro 64-bit
-----
XP compatibility is the hill that I will die on.

Reply 52 of 58, by Jasin Natael

Rank: Oldbie
antrad wrote on 2021-06-28, 06:18:

I was thinking of buying a 1440p monitor, but since GPU prices are ridiculous now, I can't buy a GPU to run it.

I honestly can't drive my 3440 x 1440 to its best potential either. I've just got a GTX 1080 in my main rig. But for older games or easier-to-run ones, it is still a nice resolution to have.

And who knows, maybe one of these days we will all be able to upgrade our GPUs again... we can only hope.

Reply 53 of 58, by ZellSF

Rank: l33t
appiah4 wrote on 2021-06-28, 07:10:
antrad wrote on 2021-06-28, 06:18:

I was thinking of buying a 1440p monitor, but since GPU prices are ridiculous now, I can't buy a GPU to run it.

You don't need one; if you have a GPU that does 1080p fine, AMD FSR has you covered for 1440p in the future.

I would be a bit more careful about saying that. AMD FSR, besides needing to be implemented by games, isn't magic. It just makes 1080p look better than a regular bilinear upscale.

How much difference that makes is very individual. Someone might think FSR looks as good as native rendering at a higher resolution, while someone else might think it looks very aliased and blurry.

Personally, I would advise against buying a higher resolution monitor if you're not going to spend most of your time running at native resolution. 5K/8K monitors might be exceptions to this, but I haven't tested either yet.
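For reference, the "regular bilinear upscale" baseline is just this (a Pillow sketch, not FSR itself, which runs in-engine as an edge-adaptive upscale plus a sharpening pass; the file names are placeholders):

from PIL import Image  # requires the Pillow package

# Plain bilinear 1080p -> 1440p upscale, the baseline FSR is compared to.
# "frame_1080p.png" is a placeholder name for a 1920x1080 screenshot.
frame = Image.open("frame_1080p.png")
upscaled = frame.resize((2560, 1440), Image.Resampling.BILINEAR)
upscaled.save("frame_1440p_bilinear.png")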

Reply 54 of 58, by ZellSF

Rank: l33t

On the 1600x1280 discussion: I've only heard of this resolution once. It's one of the available resolutions in Armor Command. It's also 5:4 (well, it's probably 4:3 DAR). So maybe someone thought that was the logical next step and was overruled by other manufacturers wanting a 1:1 PAR resolution.

Not that I understand why 1280x1024 was a thing in the first place.

Reply 55 of 58, by rmay635703

Rank: Oldbie
ZellSF wrote on 2021-06-30, 22:59:

Not that I understand why 1280x1024 was a thing in the first place.

Historically, the only reason for the variety of resolutions had to do with memory use or timing limits.

640x200 fits perfectly in 16k.
The same goes for Hercules and the Japanese 640x400.
Even the 8-bit oddities matched memory layout.

Even Sun's high-end 1024x1024 display was 100% about fitting in memory.

640 wide only exists because 80 columns was a thing (80 columns of 8-pixel-wide characters = 640), itself for some reason matching old 7-pin printers.

Once IBM released the (150k) 640x480 mode, that broke any relationship between efficiency and the frame buffer, and it makes absolutely no technical sense, as it wastes almost half of the frame buffer. One could argue it was designed so that 320x200 could have pages, but that's an afterthought, since most people spent their time in 640x480.

IBM never mentioned the 4:3 aspect ratio until 5 years after VGA was out, so it was likely just to match newer screens' capabilities and retain the 640/80-column relationship.

The “SVGA” resolutions actually followed what was being done years earlier with fixed-frequency displays. 1024x??? was popular since it fits common memory sizes:
AKA 1024x768 fits into 384kb, 768kb, or even 2.25mb (yes, I have a 2.25mb video card).
800x560 in fixed-frequency land became 800x600, another non-compliant res, but actually after 1024x768.
1280x1024 fits in 640kb in fixed-frequency land, which then transferred to the PC.

1280x960 existed as well but was not popular, since it wastes more memory for no gain; many in the industry didn't care about aspect ratio, especially in dedicated applications where you needed a certain amount of real estate.

1600x1280 was a popular fixed-frequency monochrome screen resolution because it fit the buffer better than 1600x1200. I say this because I owned a Cornerstone mono screen from a bank that ran 1280 instead of 1200.

AKA back in the day you either needed color at potato resolution or you needed raw resolution in strict B&W.
1600x1280 at 1-bit “color” fits perfectly in 256kb of memory.
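A quick Python sanity check of the figures above (a sketch; the bit depths and binary-kb sizes are assumptions about the cards being described):

# Bytes needed for one frame at a given resolution and bit depth,
# compared against the memory size each mode is claimed to fit.

def framebuffer_bytes(width: int, height: int, bpp: int) -> int:
    """Bytes required for one frame at the given bits per pixel."""
    return width * height * bpp // 8

modes = [
    (640, 200, 1, 16 * 1024),     # 1-bit mono vs 16k
    (1024, 768, 4, 384 * 1024),   # 16 colors vs 384kb
    (1024, 768, 8, 768 * 1024),   # 256 colors vs 768kb
    (1280, 1024, 4, 640 * 1024),  # 16 colors vs 640kb
    (1600, 1280, 1, 256 * 1024),  # 1-bit mono vs 256kb
    (640, 480, 4, 256 * 1024),    # VGA's ~150k mode in a 256kb card
]

for w, h, bpp, vram in modes:
    need = framebuffer_bytes(w, h, bpp)
    print(f"{w}x{h} @ {bpp} bpp: {need:,} bytes "
          f"= {need / vram:.0%} of {vram // 1024}k")

The 640x480 row shows the VGA point: it fills only about 59% of a 256kb card, while the older modes land at or near 100%.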

Last edited by rmay635703 on 2021-07-01, 14:21. Edited 1 time in total.

Reply 56 of 58, by DraxDomax

Rank: Newbie

My partner bought me a Samsung QE75Q950TS (75" 8K QLED) for my birthday... I thought it was too much and asked her to return it and just buy me Sony MX4 headphones (which I also thought were a bit much).
We don't have a "TV"; we just watch movies and, very rarely, play games on our perfectly adequate 49" 4K LG...

I mean, I hate watching stuff anyway - it's just a form of escapism from daily life, and I feel like I've just wasted a lot of time instead of learning something or doing something creative... We've practically watched all the good movies anyway.

As for gaming, why anyone would want anything over 1440 is beyond me. My eyes are only getting worse, if anything 😀
I'd rather spend more on better response, better color, better overall specs than on 4K/8K... The extra resolution is a tiny and very hard-to-achieve improvement.

Reply 57 of 58, by robertmo

Rank: l33t++
DraxDomax wrote on 2021-07-01, 13:16:

As for gaming, why anyone would want anything over 1440 is beyond me. My eyes are only getting worse, if anything 😀
I'd rather spend more on better response, better color, better overall specs than on 4K/8K... The extra resolution is a tiny and very hard-to-achieve improvement.

I guess you haven't tried playing Cannon Fodder without map scrolling 😀

Reply 58 of 58, by DraxDomax

Rank: Newbie

This could come in handy in Red Alert, with the grenade glitch (when you click a target to attack with the grenadier, if you manage to click another target while he's pulling the pin, there is no range limit).