VOGONS


Reply 20 of 28, by Jo22

User metadata
Rank l33t++
mkarcher wrote on 2023-04-09, 08:58:
Jo22 wrote on 2023-04-09, 07:33:

To programmers, using small or low-res 640x480 monitors must have been borderline already by the late 80s.
I mean, most DOS tools or IDEs had an optional 43-line/50-line mode, for example (DOS Shell, VBDOS).

The 43-line mode is borrowed from EGA and uses an 8x8 character box in a 640x350 pixel screen. The 50-line mode has the same timing as the 25-line mode, i.e. 720x400, but with a 9x8 character box instead of a 9x16 character box. So these two modes don't require 480 lines at all, and these character modes still displayed at the nice 70Hz of VGA.
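(As an aside, here is a minimal sketch of how a DOS program would typically switch VGA into that 50-line mode, by loading the BIOS 8x8 font via INT 10h. This assumes a Borland/Turbo C style compiler that provides <dos.h> and int86(); it is an illustration, not taken from any of the tools mentioned above.)

/* Minimal sketch: switch VGA to the 50-line text mode under DOS.
   Assumes a Borland/Turbo C style compiler providing <dos.h> and int86(). */
#include <dos.h>

void set_50_line_text_mode(void)
{
    union REGS r;

    /* Set the standard 80x25 colour text mode (BIOS mode 03h). */
    r.x.ax = 0x0003;
    int86(0x10, &r, &r);

    /* Load the ROM 8x8 font (INT 10h, AX=1112h, BL=00h): same 400-line
       timing, but 8-pixel-high cells, so the screen holds 50 text rows
       (43 rows on EGA, whose text mode only has 350 lines). */
    r.x.ax = 0x1112;
    r.h.bl = 0x00;
    int86(0x10, &r, &r);
}

That is also why these modes stay at VGA's 70 Hz: the vertical timing never changes, only the character cell height.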

True. That goes without saying, I assume. Most of us know the basics of EGA/VGA well enough, I think.

The statement about hi-res text modes was in the context of monitor fidelity.
About the level of professionalism, if you will.
VGA as such differed from its predecessors in terms of paradigm.
Before VGA, it was about a low colour count but crisp/hi-res graphics and digital monitors.

When VGA/MCGA was introduced, the consumer/entertainment sector moved on to a big colour count and blurry analogue monitors. 320x200 had to look good; the rest was secondary. They were also cheap to produce, with the 31 kHz rate being the most challenging part, due to the need for good transformers.

And multi-sync or dual-frequency monitors differed from this, since they weren't consumer-grade items. Their shadow masks were typically better, they had fewer colour convergence issues, etc. In other words, their overall visual quality was more like that of previous EGA screens or broadcast monitors.

Edit: Edited.

Edit: That's what I had in mind when I mentioned consumer monitors:
https://www.youtube.com/watch?v=m79HxULt3O8&p … hIDY0MHg0ODA%3D
Low-res QVGA/MCGA games like Monkey Island looked nice on low-quality monitors.
But text was rather hard to read; anything above 80x25 was almost indecipherable.

Last edited by Jo22 on 2023-04-16, 20:08. Edited 1 time in total.

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 21 of 28, by Scali

User metadata
Rank l33t
Jo22 wrote on 2023-04-09, 07:33:

With a small/budget VGA monitor, these text modes were hardly readable.
A professional monitor was really needed here for longer programming sessions.

I would argue that PCs in general, and VGA specifically were always 'professional'.
There were just basic and more high-end configurations available.

In the 80s, displays of PAL/NTSC broadcast quality were the 'standard' display devices for most computers (which is also why CGA and EGA were compatible with them).

If you had a PC instead of a home computer, you were already likely to be a 'professional' in the 80s, as PCs were far more expensive than home computers... and even then, VGA was a high-end luxury option.
It wasn't until the early 90s that home computers disappeared, PCs were cheap enough for anyone buying a computer, and VGA was standard.

In those days, people generally only had PAL/NTSC SD screens in their homes (television sets and such), and (S)VGA was way higher fidelity than that.
It's very different from today, where TVs are much higher resolution and much higher quality: 4k became the standard on TVs before it was widely adopted by PCs.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 22 of 28, by BitWrangler

User metadata
Rank l33t++
Scali wrote on 2023-04-09, 10:07:

It's very different from today, where TVs are much higher resolution and much higher quality: 4k became the standard on TVs before it was widely adopted by PCs.

Yes, it's exactly like that, where the 90% of ordinary people who walk into Best Buy with $300 to buy a TV or computer and say "hey, I just want the 1080p 40-inch" or the base Atom-in-Pentium-clothing 4GB/256GB laptop get magically upgraded by the upgrade fairy to a 60-inch "standard" 4K TV or a 17-inch i7/16GB/2TB, so obviously there's no reason for lower-end parts to exist in the first place... /sarcasm

Anyway, one can't reason 256KB VGA cards out of existence any more than you can reason Toyota Corollas out of existence because large Lexus SUVs are made. This is the same line of revisionist BS as assuming everyone was given a Pentium in 1993 at release and 286es popped out of existence in the market the moment the 386SX was announced. In 20 years kids are gonna be telling you you must have driven a Tesla and had one of those 10ft TVs that roll up into a base, just because they were the best thing that existed, so there was no reason for any alternative.

As for people who needed to look at text all day, who must have been wanting high-res SVGA... no they didn't, not for a few years. It was still kind of flickery and muddy, and the dot pitch wasn't fine enough unless you spent ridiculous amounts. Most people (not all, for the dumbasses who think providing one or two counterexamples proves that false) who had to stare at text all day in the 80s/90s crossover period stuck with MDA/Hercules monochrome screens: crisp, high-quality ones with flicker-reducing phosphors. CAD-using engineers got to have $5000 monitors and $1000 cards to drive them; data-entry users didn't.

Unicorn herding operations are proceeding, but all the totes of hens teeth and barrels of rocking horse poop give them plenty of hiding spots.

Reply 23 of 28, by Scali

User metadata
Rank l33t
BitWrangler wrote on 2023-04-09, 15:04:
Scali wrote on 2023-04-09, 10:07:

It's very different from today, where TVs are much higher resolution and much higher quality: 4k became the standard on TVs before it was widely adopted by PCs.

Yes, it's exactly like that, where the 90% of ordinary people who walk into Best Buy with $300 to buy a TV or computer and say "hey, I just want the 1080p 40-inch" or the base Atom-in-Pentium-clothing 4GB/256GB laptop get magically upgraded by the upgrade fairy to a 60-inch "standard" 4K TV or a 17-inch i7/16GB/2TB, so obviously there's no reason for lower-end parts to exist in the first place... /sarcasm

Well that's a completely misplaced comment.
In my experience 4k has indeed been the 'standard' for a while. Yes, you can still get low-end 1080p screens if you really want; generally those are off-brand models or old stock.
Just checked at coolblue.nl, a large online shop in my country...
https://www.coolblue.nl/televisies/filter

It sells a total of 16 1080p models, 9 720p models, 477 4k models and 31 8k models.
I think based on those numbers, 4k is the standard, and 720p and 1080p are on their way out.

The attachment coolblue_4k.png is no longer available

Looking at laptops, 452 of them are 1080p, only 12 are 4k. And a lot of sizes in between.

The attachment coolblue_laptops.png is no longer available

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 24 of 28, by BitWrangler

User metadata
Rank l33t++

I am sorry if you felt picked on; I was using your quote more to address other points in the thread. However, I mean standard as the lowest basic specification available, while you mean standard as the highest standard in common use. If you sort by most popular, they seem to be selling a lot more non-4k than the number of models available suggests: by model count it looks like 5%, whereas by most popular it looks like 33% or so. I would also think that many people spending that much on TVs would be more likely to pick up "supermarket" TVs at Aldi etc., so that site may not be representative of the full distribution of TV sales. It also looks to me like I picked the price break right: above $300ish you get 4K, at $300 or below they are mostly just HD. All us smartasses who shoot the crap online about specs all day are not representative of the average buyer either.

Unicorn herding operations are proceeding, but all the totes of hens teeth and barrels of rocking horse poop give them plenty of hiding spots.

Reply 25 of 28, by jakethompson1

User metadata
Rank Oldbie
Scali wrote on 2023-04-09, 19:50:

Looking at laptops, 452 of them are 1080p, only 12 are 4k. And a lot of sizes in between.

Good riddance to 1366x768, and especially to how it displaced better resolutions for a while, e.g. 1280x1024.

Reply 26 of 28, by Scali

User metadata
Rank l33t
BitWrangler wrote on 2023-04-09, 20:14:

I am sorry if you felt picked on; I was using your quote more to address other points in the thread.

My point was this:
In the days of the CRT, most screens, most notably TVs, were fixed to SD resolutions (usually interlaced to boot).
VGA was an outlier, and delivered much higher resolutions than common displays.
When flatscreens started to displace CRTs, that was around the time that television became digital and moved from SD to HD.
That put TV on par with computers for a while, as both generally used 720 and 1080 standards for years.

But in recent years, mostly because of streaming services, 4k became available for video, and 4k TVs and monitors became commonplace.
However, decoding 4k video can be done very efficiently with a cheap chip. Rendering entire desktops and games efficiently in 4k requires a very powerful GPU.
This is why the PC world is still struggling to make 4k the standard, and has effectively been overtaken by TVs now.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 27 of 28, by mkarcher

User metadata
Rank l33t
Scali wrote on 2023-04-09, 20:40:

However, decoding 4k video can be done very efficiently with a cheap chip. Rendering entire desktops and games efficiently in 4k requires a very powerful GPU.
This is why the PC world is still struggling to make 4k the standard, and has effectively been overtaken by TVs now.

Rendering games at 4k native resolution definitely requires a powerful GPU, point taken. Though I highly doubt that rendering desktops for 4K requires a very powerful GPU. Running office applications and browsers in 4K can provide sharper text, allows slightly smaller text to still be readable, and while it requires more resources to render, the amount required is still very low. I'm observing a lot of offices switching to 4K monitors as the default for new computers in Germany. Oftentimes those new computers do not have dedicated graphics cards.

Reply 28 of 28, by Scali

User metadata
Rank l33t
mkarcher wrote on 2023-04-09, 20:50:

Rendering games at 4k native resolution definitely requires a powerful GPU, point taken. Though I highly doubt that rendering desktops for 4K requires a very powerful GPU. Running office applications and browsers in 4K can provide sharper text, allows slightly smaller text to still be readable, and while it requires more resources to render, the amount required is still very low. I'm observing a lot of offices switching to 4K monitors as the default for new computers in Germany. Oftentimes those new computers do not have dedicated graphics cards.

Yes, we're slowly getting there.
I've had a 4k monitor since the early days, 2015 or so. At the time my 1080p screen died, and a 4k model didn't cost that much of a premium over a 1080p one yet.
But when I bought it, I found that my video card at the time couldn't even drive a 4k desktop at 60 Hz. I could either run 4k at 30 Hz, or drop a few resolutions lower to get 60 Hz. I found that a desktop at 30 Hz is really annoying, with choppy scrolling and window movement, so I preferred a lower resolution at 60 Hz.
After I upgraded to a GeForce GTX 970, I could do 4k at 60 Hz, but playing 4k video at 60 Hz was still not possible, let alone playing any semi-recent games at reasonable detail settings.
These days an onboard GPU from a reasonable CPU, like a Core i5, can just barely manage 4k video playback at 60 Hz.
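(Side note: the following is my own back-of-the-envelope arithmetic, not anything from that card's spec sheet. The 30 Hz limit is typical of HDMI 1.4-era outputs, which top out at a 340 MHz pixel clock, while standard 4k60 timing needs 594 MHz.)

/* Rough illustration using standard CTA timing figures; that my old card
   was limited by an HDMI 1.4 class output is an assumption, not a spec. */
#include <stdio.h>

int main(void)
{
    const double h_total = 4400.0;    /* 3840 active + blanking, per line  */
    const double v_total = 2250.0;    /* 2160 active + blanking, per frame */
    const double hdmi14_max = 340e6;  /* HDMI 1.4 max TMDS/pixel clock, Hz */

    double clk60 = h_total * v_total * 60.0;  /* ~594 MHz */
    double clk30 = h_total * v_total * 30.0;  /* ~297 MHz */

    printf("4k60 pixel clock: %.0f MHz\n", clk60 / 1e6);
    printf("4k30 pixel clock: %.0f MHz\n", clk30 / 1e6);
    printf("HDMI 1.4 limit:   %.0f MHz -> 30 Hz fits, 60 Hz does not\n",
           hdmi14_max / 1e6);
    return 0;
}

DisplayPort 1.2 and HDMI 2.0 raised that ceiling, which is why cards of the GTX 970 generation could finally drive a 4k desktop at 60 Hz.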

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/