VOGONS


First post, by Jo22

Rank: l33t++

Hi everyone,

I was going to reply to an interesting topic, but it vanished while I was typing.
So I'm creating one myself here, since I think it's an interesting topic. Hope that's okay.

The question is: why do early VGA cards, IBM's original and the clones, have a lumpy 256KB that's useless for anything except what we call "Standard VGA"?
You know, that pseudo-NTSC TV resolution with merely a nibble of colour. 😉

Personally, I think that the 256KB configuration of early VGA wasn't ideal. 640x480 @60 Hz, 31.5 kHz, in 256 colours was the real deal, what VGA was meant to be.

The best evidence is the IBM PGA/PGC, the Professional Graphics Controller.
It had these specifications and provided the foundation for VGA's analogue specifications.

I assume it's all because of EGA: a fully populated EGA had 256KB of video memory, which is why VGA used it as the default standard.

However, VGA seems to have provisions for more memory.
There's one register bit indicating "512KB or more", if memory serves.

Most EGA and VGA clones outperformed their IBM originals in terms of resolution/memory.
Super EGA cards existed, for example.

Many of them, if not all, incorporated standard VGA's 640x480 in 16c.
Because both EGA and VGA use the same palette in 16-colour mode, no special initialization is necessary and 256KB of video RAM is sufficient.

That's why the original Super VGA mode, 800x600 @56Hz in 16c, was so widespread.
Except for the mode number, which varied among different models (until VBE came along), the initialization was the same.
Well, most of the time. Exceptions surely exist, but most VGA cards simply kept the normal memory layout of standard VGA/EGA.

Funnily, there's one VBE mode in 256c that can be used by low-end VGAs:
640x400 @70 Hz in 256c - VESA VBE 1.0 mode 100h

Some games even used that. However, it has one disadvantage: it requires banked memory access, AFAIK.
I.e., the video mode exceeds 64KB and must use some kind of bank switching (like EMS).
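
For illustration, here's roughly what that banking looks like from the programmer's side. A minimal sketch in Turbo C-style DOS C, assuming a VBE 1.0 BIOS and a 64KB window granularity (real code must first query the granularity and window layout via VBE function 4F01h):

/* Banked pixel write in VESA mode 100h (640x400 in 256c). */
#include <dos.h>

static void set_bank(unsigned bank)
{
    union REGS r;
    r.x.ax = 0x4F05;   /* VBE: CPU video memory window control */
    r.x.bx = 0x0000;   /* BH=0: set window, BL=0: window A     */
    r.x.dx = bank;     /* window position in granularity units */
    int86(0x10, &r, &r);
}

static void put_pixel(unsigned x, unsigned y, unsigned char c)
{
    unsigned long off = 640UL * y + x;   /* linear offset in the frame */
    set_bank((unsigned)(off >> 16));     /* select the right 64KB bank */
    *(unsigned char far *)MK_FP(0xA000, (unsigned)off) = c;
}

Every time a write crosses a 64KB boundary, the bank must be switched through the BIOS (or, faster, through chipset-specific registers) - exactly the overhead mode 13h avoids.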

That's the main reason why game programmers were so obsessed with MCGA's mode 13h, 320x200 pixels in 256c (320x400 actually, because line doubling is enabled).
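
For comparison, the same thing in mode 13h needs no banking at all; a minimal sketch under the same Turbo C-style assumptions:

/* Mode 13h: the whole 320x200 frame is 64,000 bytes, so it fits in one
   segment at A000:0000 and every pixel is a single addressable byte. */
#include <dos.h>

int main(void)
{
    union REGS r;
    unsigned char far *vram = (unsigned char far *)MK_FP(0xA000, 0);
    unsigned x, y;

    r.x.ax = 0x0013;   /* INT 10h: set video mode 13h */
    int86(0x10, &r, &r);

    for (y = 0; y < 200; ++y)
        for (x = 0; x < 320; ++x)
            vram[320U * y + x] = (unsigned char)(x ^ y);   /* one store, no banking */
    return 0;
}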

If it wasn't for 320x200's ability to fit in a single x86 memory segment, with each pixel's colour individually addressable, PC gaming could have been so much higher-end.
Or less popular, if we're pessimistic.

Growing up with the shareware scene and Windows 3.1x, I later learned how pixelated and low-res the majority of (commercial) DOS games were by contrast.

While shareware games made by one-man teams were much simpler in design, they used the 16c modes to their fullest.
That's why, ironically, EGA was such a nice thing at the time.

Users/programmers with EGA PCs valued low-colour art much more than most of us did.
Like Japanese users/developers, they focused on using extra resolution rather than colour.

Games like Zeliard look much prettier in 16c at 640x350 or 640x200 than they would in 256c at MCGA's 320x200 (IMHO).
Standard EGA gives them some sort of grace, of elegance.

These are just my two cents, of course.

Any ideas welcome.

Edit: Typos fixed.

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 1 of 28, by The Serpent Rider

Rank: l33t++

The answer is almost always economics. If you can get away with less for a specific mode, why waste precious memory chips? Office use most certainly didn't need more, and gamers' needs were hardly in the equation.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 2 of 28, by Grzyb

Rank: Oldbie

Those who needed 512 or 1024 KB were supposed to purchase an 8514/A.

The planetary life-spawn, its rotting mud, is the dawn of existence, a preliminary phase, and out of the bloody dough-brained ones shall emerge the copper-loving...

Reply 3 of 28, by keenmaster486

Rank: l33t

320x200 with 256 palettized colors is pretty much the perfect game graphics mode for 386-tier computers.

Windows didn't become popular until SVGA was already commonplace.

There's your answer.

World's foremost 486 enjoyer.

Reply 4 of 28, by Grzyb

Rank: Oldbie

Oh yeah, PC gaming was mostly stuck in 320x200x256 well into the SVGA era...
Doom, Doom II, Heretic, Hexen (1993..95) - all 320x200x256 only.
In the action genre, the first games with SVGA modes were Witchaven and Quake (1995..96), with SVGA still being optional.

The primary reason was, of course, speed.
In the early VGA era even 320x200x256 could be too slow, and there were plenty of "VGA" games actually running in 320x200x16, with the only difference from EGA being that the 16 colors were picked from the 262,144-color palette.

The planetary life-spawn, its rotting mud, is the dawn of existence, a preliminary phase, and out of the bloody dough-brained ones shall emerge the copper-loving...

Reply 5 of 28, by mkarcher

Rank: l33t
Jo22 wrote on 2023-04-04, 00:36:

The question is: why do early VGA cards, IBM's original and the clones, have a lumpy 256KB that's useless for anything except what we call "Standard VGA"?
You know, that pseudo-NTSC TV resolution with merely a nibble of colour. 😉

The answer is quite simple: it was considered good enough for most people, and memory was expensive at that time. Disk space was expensive too, and software was delivered on floppy disks. Distributing software with higher-resolution or higher-color-depth graphics would have required more floppy disks (making distribution more expensive), and installing that software would have used more space on the hard disk. When the VGA card was introduced, hard disk sizes of 20MB to 40MB were still common. Take a moment to think about that: a 20MB hard drive is completely filled by just 320 uncompressed MCGA/VGA frames (64,000 bytes each), and a 40MB drive by around 270 uncompressed 16-color VGA images (153,600 bytes each).

Also, games were expected to run with 1MB of system memory. As most games ran in real mode, only 640KB of memory was available to the game, which had to be shared with DOS. This means that (without reloading from disk) there can be at most around 400 kilobytes of graphics data. High resolution or high color depth just didn't fit a 286 machine at 6MHz with 1MB of RAM and a 40MB hard drive.

Bus bandwidth is another issue: the IBM VGA design uses only an 8-bit bus, as does the first generation of clone cards. Just filling the video memory takes too much time if there is too much data to move.
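
The storage arithmetic from the first paragraph, spelled out in plain C (just to make the frame sizes concrete):

/* Uncompressed frame sizes vs. period hard disks. */
#include <stdio.h>

int main(void)
{
    long mcga  = 320L * 200;        /* 8 bpp: 64,000 bytes per frame  */
    long vga16 = 640L * 480 / 2;    /* 4 bpp: 153,600 bytes per frame */
    printf("20MB drive: %ld MCGA frames\n",    20L * 1024 * 1024 / mcga);   /* 327 */
    printf("40MB drive: %ld 16c VGA frames\n", 40L * 1024 * 1024 / vga16);  /* 273 */
    return 0;
}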

That developers did not focus on higher resolutions is clearly visible in the PC game market: even at a time when most users had at least 256KB on their upgraded IBM EGA, Super EGA or VGA card, a lot of games still ran at 320x200 @ 16 colors instead of 640x350 @ 16 colors, due to the considerations in the previous paragraphs.

On the graphics card side: the card is limited to generating 640/720 pixels per scanline at 31.5kHz with only 4 bits of color. The memory scan-out circuit isn't fast enough to provide the bandwidth required for 640 pixels at 256 colors. The 256-color mode is created by combining two 4-bit chunks of color information into one 8-bit color number later in the processing pipeline. The VGA circuit is thus maxed out at 360 pixels per scanline in 256 colors.
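
A back-of-the-envelope check of that limit, using the usual VGA dot clocks (an illustration with textbook figures, not datasheet values):

/* Scan-out byte rate = dot clock x bits per pixel / 8. */
#include <stdio.h>

int main(void)
{
    double clk640 = 25.175e6;       /* dot clock of the 640-pixel modes           */
    double clk360 = 28.322e6 / 2;   /* 360-pixel 256c modes halve the 28MHz clock */
    printf("640 px, 16c : %4.1f MB/s\n", clk640 * 4 / 8 / 1e6);   /* ~12.6 */
    printf("360 px, 256c: %4.1f MB/s\n", clk360 * 8 / 8 / 1e6);   /* ~14.2 */
    printf("640 px, 256c: %4.1f MB/s\n", clk640 * 8 / 8 / 1e6);   /* ~25.2 */
    return 0;
}

A hypothetical 640-pixel 256-color scanline would need roughly twice the fetch rate the stock scan-out path sustains - the same ~25MB/s figure that comes up for the PGC below.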

Re-check gaming history: 640x480 @ 256 colors became widespread not only when the VESA BIOS Extension was standardized, but also when VESA Local Bus cards started to appear. There were some games using computationally generated graphics at high resolutions before that, but only games where screen redraw time was unimportant, like puzzle games and non-realtime role-playing games, or games with very little animation on screen.

Jo22 wrote on 2023-04-04, 00:36:

Personally, I think that the 256KB configuration of early VGA wasn't ideal. 640x480 @60 Hz, 31.5 kHz, in 256 colours was the real deal, what VGA was meant to be.

The best evidence is the IBM PGA/PGC, the Professional Graphics Controller.
It had these specifications and provided the foundation for VGA's analogue specifications.

The IBM PGC also showed why 640x480 at 256 colors (it's not 31.5kHz, IIRC it's 30.8kHz) was not a general-purpose graphics option at that time: that card is basically a single-board computer bundled with the graphics circuitry, needed to provide sensible performance at that resolution. People were happy if they could afford one computer for gaming, and wouldn't have had the money for a second computer just to power the graphics card. To achieve the required memory bandwidth for 640x480 at 256 colors (around 25MB/s just for scan-out; drawing requires additional bandwidth), the PGC surely used VRAM, which was also an expensive premium item, at least at higher capacities.

Jo22 wrote on 2023-04-04, 00:36:

However, VGA seems to have provisions for more memory.
There's one register bit indicating "512KB or more", if memory serves.

Many Super-VGA chips have bits like that, but VGA does not.

Jo22 wrote on 2023-04-04, 00:36:

Most EGA and VGA clones outperformed their IBM originals in terms of resolution/memory.
Super EGA cards existed, for example.
[...]
That's why the original Super VGA mode, 800x600 @56Hz in 16c, was so widespread.

Getting 800x600 indeed was easy, because you just had to make sure the circuit worked at the required dot clock. This applies to both EGA and VGA. On the other hand, the first generation of clones of both of those card types did not exceed 256KB of RAM. Some early Super VGA cards could do 1024x768 @ 87Hz interlaced, but only at 4 colors. IBM had already built 4-color modes into the EGA circuit so it could display 640x350 on an un-upgraded EGA card with only 64KB of RAM, so that logic could be re-used.
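
That "required dot clock" is easy to check against the standard 800x600 @56Hz timing (the usual VESA figures; an illustration only):

/* 800x600@56: 36MHz dot clock, 1024x625 total raster incl. blanking. */
#include <stdio.h>

int main(void)
{
    double dotclk = 36.0e6;
    int htotal = 1024, vtotal = 625;
    printf("hsync  : %.2f kHz\n", dotclk / htotal / 1e3);      /* ~35.16 */
    printf("refresh: %.2f Hz\n",  dotclk / htotal / vtotal);   /* ~56.25 */
    printf("16c scan-out: %.1f MB/s\n", dotclk * 4 / 8 / 1e6); /* ~18.0  */
    return 0;
}

At 4 bits per pixel that is only ~18MB/s of scan-out, well below what a hypothetical 640x480x256 would need, which is why 16-color 800x600 was within reach of otherwise ordinary cards.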

Jo22 wrote on 2023-04-04, 00:36:

Funnily, there's one VBE mode in 256c that can be used by low-end VGAs:
640x400 @70 Hz in 256c - VESA VBE 1.0 mode 100h

The requirement for that mode is still twice the memory bandwidth of standard VGA, which was not available on 80s VGA clones. This mode is targeted at early-90s Super VGA cards that had 256KB as the entry-level configuration but were expandable to 512KB, e.g. the TVGA8800 series. Also, this resolution is an easy upgrade path from 320x200, because it has the same pixel aspect ratio.
EDIT: In fact, the Trident 8800 is a prime example of an early SVGA card that hits its bandwidth limit before reaching 640 pixels at 31.5kHz in 256 colors. Maybe you can push the chip that far if you overclock it by feeding it a 50MHz "pixel" clock instead of using the stock oscillators.

Jo22 wrote on 2023-04-04, 00:36:

Some games even used that. However, it has one disadvantage: it requires banked memory access, AFAIK.
I.e., the video mode exceeds 64KB and must use some kind of bank switching (like EMS).

That's the main reason why game programmers were so obsessed with MCGA's mode 13h, 320x200 pixels in 256c (320x400 actually, because line doubling is enabled).

You already had access to 360x480 on standard VGA hardware using undocumented but well-known techniques. This also required a kind of banking, and unless the complete graphics engine is tailored to the kind of memory access used in that mode (a "Mode X"-like scheme), it usually delivers low performance. Simple SVGA banking was an advantage over Mode X, because it made the graphics code far easier.
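
For reference, a sketch of that kind of planar access (Turbo C-style; assumes an unchained 256-color mode is already set up):

/* In unchained ("Mode X"-style) 256c modes, four pixels share one byte
   address, one per bit plane; the plane is selected through the
   Sequencer Map Mask register. */
#include <dos.h>

static void put_pixel_unchained(unsigned x, unsigned y,
                                unsigned pitch,   /* bytes per row = width/4 */
                                unsigned char c)
{
    unsigned char far *vram = (unsigned char far *)MK_FP(0xA000, 0);

    outportb(0x3C4, 0x02);           /* sequencer index: Map Mask   */
    outportb(0x3C5, 1 << (x & 3));   /* enable only plane (x mod 4) */
    vram[pitch * y + x / 4] = c;     /* one byte covers four pixels */
}

Fast engines batch writes per plane to amortize the port I/O; doing it per pixel, as above, is what makes naive Mode-X code slow.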

Summary: there was no market requirement for VGA-compatible cards with more than 256KB of RAM when vendors started making VGA clones. Those cards would have been way too expensive for the target audience.

If you were focused on gaming, an Amiga provided a significantly better bang-for-the-buck ratio anyway, as the Amiga architecture was optimized for multimedia from the start, e.g. including a 16-bit bus to the graphics memory, a drawing accelerator (the Blitter), the Copper for console-like graphics effects with real-time configuration changes, and a quality digital sound playback solution. On the other hand, the first-generation Amiga only had 15.6kHz output, allowing 640x400 only as an interlaced mode. And running the first-generation Amiga at 640 pixels per scanline with 16 colors blocked the CPU from accessing chip memory during display, whereas VGA's 640x480 didn't impede the processor at all, as the card had local memory.

Last edited by mkarcher on 2023-04-04, 10:47. Edited 1 time in total.

Reply 6 of 28, by Grzyb

Rank: Oldbie
mkarcher wrote on 2023-04-04, 08:27:
Jo22 wrote on 2023-04-04, 00:36:

Funnily, there's one VBE mode in 256c that can be used by low-end VGAs:
640x400 @70 Hz in 256c - VESA VBE 1.0 mode 100h

The requirement for that mode is still twice the memory bandwidth of standard VGA, which was not available on 80s VGA clones. This mode is targeted at early-90s Super VGA cards that had 256KB as the entry-level configuration but were expandable to 512KB, e.g. the TVGA8800 series.

640x400 (and more) with 256 colors was still problematic on the TVGA8800...

The Trident 8800 chips have a problem with 256 color modes, as they always double
the pixels output in 256 color mode. Thus a 640x400 256 color mode (5Ch) actually
uses a 1280x400 frame, requiring at least a multi sync monitor.
This problem is fixed on the 8900.

...the required "multi sync monitor" means the original MultiSync, i.e. 15+ kHz HSYNC.
I tried an 8800 card with a regular SVGA monitor (30+ kHz) - 16-color SVGA modes worked fine, but 256-color ones were below its range.

The planetary life-spawn, its rotting mud, is the dawn of existence, a preliminary phase, and out of the bloody dough-brained ones shall emerge the copper-loving...

Reply 7 of 28, by Jo22

Rank: l33t++
mkarcher wrote on 2023-04-04, 08:27:
Jo22 wrote on 2023-04-04, 00:36:

Funnily, there's one VBE mode in 256c that can be used by low-end VGAs:
640x400 @70 Hz in 256c - VESA VBE 1.0 mode 100h

The requirement for that mode is still twice the memory bandwidth of standard VGA, which was not available on 80s VGA clones. This mode is targeted at early-90s Super VGA cards that had 256KB as the entry-level configuration but were expandable to 512KB, e.g. the TVGA8800 series.

Yes and no. That's why many SVGA cards had a couple of different crystal oscillators installed.
I think that early SVGA cards like the Trident 8900, PVGA1A and OAK 37c/67 were capable of doing 640x400 in 256c, and 800x600 in 16c.

In 1987, IBM had deprecated the PGC and was praising the IBM 8514/A, with 640x480 in 256c or 1024x768 in 256c (at 43Hz interlaced).

AFAIK, all/most 256KB SVGA models were expandable to 512KB. It was the cheap PCB designs that caused the limitation at the time.

By installing chips atop each other (piggyback) and routing the address lines appropriately, it was possible to upgrade 256KB cards. :)

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 8 of 28, by Jo22

Rank: l33t++
mkarcher wrote on 2023-04-04, 08:27:

Summary: there was no market requirement for VGA-compatible cards with more than 256KB of RAM when vendors started making VGA clones. Those cards would have been way too expensive for the target audience.

That's debatable. But there was a memory shortage in the late 80s, which peaked around '88.
It caused prices of formerly cheap RAM to skyrocket.
That's why, in our collective memory, RAM was very expensive in the 80s/early 90s.
But that's just half the story. Prior to ca. late 1987, memory capacities of 512KB or 1MB weren't anything to write home about.

The best example/evidence of RAM being affordable at the time were the Tandy PC series in the US or the Schneider PC1512/1640.
They were made in 1986 and before and were cheap, despite having 512KB or 640KB of RAM on board.

The situation was perhaps comparable to the HDD situation after 2011 (the floods in Thailand), in which prices stayed high for years to come.

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 9 of 28, by root42

Rank: l33t

That register bit for 512k interests me. Where did you read that?

YouTube and Bonus
80486DX@33 MHz, 16 MiB RAM, Tseng ET4000 1 MiB, SnarkBarker & GUSar Lite, PC MIDI Card+X2+SC55+MT32, OSSC

Reply 10 of 28, by Scali

Rank: l33t

I agree with the people above who say it was a matter of economics/'good enough'.
256K was actually a lot of memory back in 1987, especially for a video card alone (popular Atari ST and Amiga models shipped with 512K for the entire system, including video). The main problem is that higher resolutions quickly get more expensive in terms of bandwidth and storage requirements. So making a video card perform acceptably at 320x200 in 256 colours was already quite difficult, and VGA cards in the early days were notoriously expensive. For higher resolutions you'd need much faster VRAM (and VRAM tends to be more expensive than regular system RAM, as you need dual-ported RAM to avoid snow (hello, CGA), and VRAM was often faster than system RAM as well, because it constantly has to be scanned out in real time by the RAMDAC to generate the image... higher resolutions, colours and refresh rates mean more bytes per second), and ideally also a faster CPU and local bus.
Let's not forget, early VGA cards weren't exactly ET4000s.

So, while it was possible to go beyond VGA specs with 1987 tech, you'd get very large and expensive boards like the PGC (which still wasn't exactly a speed demon).

For EGA the original spec was actually 64K, and you could go up to 256K with a daughterboard (expensive, so an optional extra).
But clones started a price/spec war, so 256K quickly became standard, and many games of the day actually assume 256K, as they stored extra graphics in offscreen VRAM for faster scrolling and blitting.

Games were pretty much limited to 320x200 anyway, as the PC was too primitive to move more data around in realtime. Things that a NES, C64 or Amiga can do effortlessly with their custom graphics chipsets take a lot of brute-force pixel pushing over a slow ISA bus on a PC.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 11 of 28, by mkarcher

Rank: l33t
Scali wrote on 2023-04-04, 13:35:

So making a video card perform acceptably at 320x200 in 256 colours was already quite difficult, and VGA cards in the early days were notoriously expensive. For higher resolutions you'd need much faster VRAM (and VRAM tends to be more expensive than regular system RAM, as you need dual-ported RAM to avoid snow (hello, CGA), and VRAM was often faster than system RAM as well, because it constantly has to be scanned out in real time by the RAMDAC to generate the image... higher resolutions, colours and refresh rates mean more bytes per second), and ideally also a faster CPU and local bus.
Let's not forget, early VGA cards weren't exactly ET4000s.

Actual "VRAM" usually was an expensive high-performance item (but see footnote 1), but none of the standard IBM consumer cards (MDA, CGA, EGA, VGA) ever used that kind of VRAM, which is dual-ported and provides a "serial access memory" for scanout. All those IBM cards were able to coordinate access to the video memory between the processor and the video card, by assigning fixed timing slots to both parties. The CGA snow is the only exception: In 80-column text mode, the CGA requires all memory access slots (it was 250ns memory IIRC) for reading character and attribute bytes from the video memory. In the worst case, there was no processor slot for 40µs. Blocking the ISA bus for 40µs is unacceptable, as every 16µs a row of memory needs to be refreshed, and memory refresh is also signalled on the ISA bus. So IBM decided that in 80-column text modes, the processor is able to steal cycles from the video refresh circuit. The video refresh circuit basically got the data that was exchanged with the CPU instead of the data it required.

With EGA, IBM did not add VRAM, but they improved the graphics memory handling a lot instead, still using DRAM with fixed timing slots. There is a configuration bit on the EGA card to choose between an 80/20 and a 40/60 split between scan-out and CPU access. Furthermore, for 16-color graphics modes, the RAM interface of the graphics logic was widened from 16 to 32 bits. This allowed the 640-pixel EGA graphics mode to run at 2 megacycles per second for scan-out. AFAIK the 640-pixel modes used the 80/20 split, so add another 0.5 megacycles per second for CPU access. 2.5 megacycles per second require a 400ns cycle time, which is easily achievable with common DRAM of that time (but the CPU interface performance is quite poor). VGA just clocked higher but used the same design, so it is still DRAM-based.

You mention the ET4000, which is known for its high speed: again, most ET4000 graphics cards were DRAM-based (at least early revisions of the ET4000 had an alternate VRAM-based mode, but the Tseng implementation of a VRAM interface in the ET4000 didn't provide a significant advantage). The ET4000 gets its high speed from utilizing "page mode" access to video RAM. Writes from the CPU are buffered, and if they happen to hit nearby memory locations, they are performed in a burst that provides significantly higher throughput than the one-RAS-cycle-per-access scheme of the original EGA/VGA design. Also, for scan-out, the chip loads a bunch of words into a scan-out FIFO using page-mode access. The use of buffers allows higher flexibility in distributing access time between the graphics scan-out circuit and CPU writes. You can't easily do speculative reads on VGA-type cards, as reads from video memory cause side effects in the graphics chip. And indeed, read performance of the ET4000 is notably slower than write performance.

Of course, there were VRAM-based cards, most notably the S3 911 and its successors (S3 later made entry-level DRAM-based models, starting with the 805). The Mach8 is also VRAM-based on the accelerator side of things. The Mach32, which combined the accelerator and standard VGA in the same memory, existed in both VRAM- and DRAM-based configurations.

[1]: The cheap "improved CGA" in the PS/2 Model 30 (8086), called MCGA, did use VRAM: two 64K x 4 chips. Probably those chips were no longer required in the high-end market and were available at a good price. Using VRAM enabled the MCGA to run 320x200 @ 70 Hz (double-scanned) with an 8-bit RAM interface, whereas the DRAM-based VGA required 32 bits. Using a slightly more expensive RAM type allowed a significantly cheaper mainboard design.

Reply 12 of 28, by Scali

Rank: l33t

I use 'VRAM' as shorthand for 'video RAM', as in RAM that is on the video card, not for a specific memory technology (there have been numerous different technologies over the years).
Indeed, one of them was actually called VRAM (later superseded by WRAM, "Window RAM"). That was in the early days of Windows acceleration, I believe, and generally only found on the more high-end cards. Cheaper cards used cheaper, more conventional DRAM technologies.

The ET4000 was fast because it was a good combination of fast memory technology with a wide bus and a very efficient memory controller.

You mention EGA, but the PCjr also solved the snow issue, and it also had 16-colour modes, by using two memory banks in parallel to get the extra bandwidth required from essentially the same memory chips as found on an IBM PC mainboard and CGA card.

[Attachment: PCjr_video.png - 21.77 KiB, public domain]

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 13 of 28, by Grzyb

Rank: Oldbie
Jo22 wrote on 2023-04-04, 10:40:

AFAIK, all/most 256KB SVGA models were expandable to 512KB. It was the cheap PCB designs that caused the limitation at the time.

There were SVGA chips that only supported 256 KB, e.g. the Acumos AVGA1 (1991), later rebranded as the CL-GD5401.

On the other hand, according to the datasheet, the PVGA1A (1988, definitely early!) supports up to 1024 KB.
At the same time, the highest mode listed is 640 x 480 x 256, which needs only 300 KB, and I've never seen a PVGA1A card with more than 512 KB.
Note that there's no support for 800 x 600 x 256, even though the amount of RAM is sufficient - obviously yet another case of not enough bandwidth...

The planetary life-spawn, its rotting mud, is the dawn of existence, a preliminary phase, and out of the bloody dough-brained ones shall emerge the copper-loving...

Reply 14 of 28, by rasz_pl

Rank: l33t
Jo22 wrote on 2023-04-04, 00:36:

why do early VGA cards, IBM's original and the clones, have a lumpy 256KB that's useless for anything except what we call "Standard VGA"?

1) VGA is built on EGA, most likely reusing its sequencer. AFAIK on the original IBM VGA, even lovely 320x200@256c is internally not stored linearly 😮 but in four planes, in "Chain 4" mode, just hidden from the programmer/user (see Re: FastDoom. A new Doom port for DOS, optimized to be as fast as possible for 386/486 personal computers!).
"Chain 4 addressing as used in Mode 13h, consecutive pixels are stored in consecutive maps 0 through 3, at every fourth location in video memory."
You can disable this translation mechanism, exposing the unchained modes X/Y (hello, Michael Abrash: https://en.wikipedia.org/wiki/Mode_X). There is a trick on S3 cards (http://www.geocities.ws/liaor2/myutil/m13speed.html) where you can intercept the video BIOS call and replace mode 13h on the fly with a purely linear VESA mode, gaining an unreasonable amount of performance (http://www.geocities.ws/liaor2/myutil/mbench.html):
i430VX/Tr9680 320x200 MCGA 15031/5263 (32bitW/R)
i430VX/Tr9680 320x200 "M13SPEED" 50103/8701 (32bitW/R)
While displaying the same 320x200@256c, the performance gain comes from the VGA chip no longer having to emulate the original Chain 4 mapping internally.

2) The already-mentioned RAM prices at the time, and IBM having a high-end option, so they didn't even consider anything better for VGA.

3) 256KB was considered big in 1987, excessive maybe. Be glad we didn't just get MCGA with 64KB 😉

Grzyb wrote on 2023-04-04, 06:05:

In the action genre, the first games with SVGA modes were Witchaven and Quake (1995..96), with SVGA still being optional.

1994's Transport Tycoon has mandatory SVGA. I wonder if I can find something popular released even earlier. There is 1993's Super VGA Harrier (https://www.mobygames.com/game/18480/super-vga-harrier/), but I feel it's cheating, as it's just a high-resolution version of AV-8B Harrier Assault.
Edit: Doh, the game Transport Tycoon was styled after, 1993's SimCity 2000, is also SVGA-only.

Open Source AT&T Globalyst/NCR/FIC 486-GAC-2 proprietary Cache Module reproduction

Reply 15 of 28, by BitWrangler

Rank: l33t++

High-res, higher-scan-rate CRTs were pricey too: basic VGA monitor $250, 800x600 $500, multiscan 1024x768 $1000. If you didn't see yourself paying $1000 for a monitor, why buy the card for it? Also, in the late 80s it was quite common for monitor and card to be sold as a package with a small discount, like $300 for a base VGA package where the card alone might cost $80 otherwise.

Unicorn herding operations are proceeding, but all the totes of hens teeth and barrels of rocking horse poop give them plenty of hiding spots.

Reply 16 of 28, by Grzyb

Rank: Oldbie
rasz_pl wrote on 2023-04-09, 01:24:
Grzyb wrote on 2023-04-04, 06:05:

In the action genre, the first games with SVGA modes were Witchaven and Quake (1995..96), with SVGA still being optional.

1994's Transport Tycoon has mandatory SVGA. I wonder if I can find something popular released even earlier. There is 1993's Super VGA Harrier (https://www.mobygames.com/game/18480/super-vga-harrier/), but I feel it's cheating, as it's just a high-resolution version of AV-8B Harrier Assault.
Edit: Doh, the game Transport Tycoon was styled after, 1993's SimCity 2000, is also SVGA-only.

Sure, in the strategy genre SVGA was used earlier, with the first major game being Links 386 Pro (1992), running in 640 x 400 x 256.
But the action genre had to wait several more years.

The planetary life-spawn, its rotting mud, is the dawn of existence, a preliminary phase, and out of the bloody dough-brained ones shall emerge the copper-loving...

Reply 17 of 28, by rasz_pl

Rank: l33t

Oh nice, I totally missed Links. Looking at the files, it's using the same collection (VESALIB) of VESA TSRs as SimCity 2000, just fewer (21 vs 28) and earlier versions. Wiki says 400K units were sold in two years, and I've seen Links quite high in some period-correct popularity charts, so shipping an early VESA game wasn't detrimental at all.

Open Source AT&T Globalyst/NCR/FIC 486-GAC-2 proprietary Cache Module reproduction

Reply 18 of 28, by Jo22

Rank: l33t++
BitWrangler wrote on 2023-04-09, 05:09:

High-res, higher-scan-rate CRTs were pricey too: basic VGA monitor $250, 800x600 $500, multiscan 1024x768 $1000. If you didn't see yourself paying $1000 for a monitor, why buy the card for it? Also, in the late 80s it was quite common for monitor and card to be sold as a package with a small discount, like $300 for a base VGA package where the card alone might cost $80 otherwise.

Thanks for the information! 🙂
I agree that multi-sync monitors were valuable items.
They were more like multi-standard monitors, too, with both analogue/digital inputs, switchable 50/75 Ohm termination, lots of knobs, or even an early OSD.

But 800x600, at least, wasn't too sophisticated, I believe.
It was within reach of semi-professionals, I think.

To programmers, using small or low-res 640x480 monitors must have been borderline already by the late 80s.
I mean, most DOS tools and IDEs had an optional 43-line/50-line mode, for example (DOS Shell, VBDOS).

With a small/budget VGA monitor, these text modes were hardly readable.
A professional monitor was really needed here for longer programming sessions.

Some dual-frequency monitors (not quite multi-sync yet) could do 800x600 at 56Hz after adjusting the pots on the back of the monitor - I've seen a 20" model with BNC connectors that was like that.

On the downside, they'd require that adjustment each time you switched between 800x600 @56Hz and 640x480 @60Hz, I think.

There was also a driver patch for 800x600 SVGA drivers (Windows 3.0) that allowed an 800x600-ish resolution (800x564) on normal, but more tolerant, VGA monitors.

https://www.pcorner.com/list/WINDOWS/TWEAK2.ZIP/INFO/

Edit: By the late 80s, I vaguely remember, ambitious Atari ST users were already switching from their humble SM124 to 17" or 19" hi-res monochrome monitors.
There were also hacks and modifications to let GEM use higher resolutions.

Edited.

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 19 of 28, by mkarcher

Rank: l33t
Jo22 wrote on 2023-04-09, 07:33:

To programmers, using small or low-res 640x480 monitors must have been borderline already by the late 80s.
I mean, most DOS tools and IDEs had an optional 43-line/50-line mode, for example (DOS Shell, VBDOS).

The 43-line mode is borrowed from EGA and uses an 8x8 character box on a 640x350-pixel screen. The 50-line mode has the same timing as the 25-line mode, i.e. 720x400, but with a 9x8 character box instead of a 9x16 one. So neither of these modes requires 480 lines, and both character modes still display at VGA's nice 70Hz.
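
For reference, this is how such tools typically switched a VGA into 50-line text: load the ROM 8x8 font through the BIOS, which recalculates the row count (Turbo C-style sketch):

#include <dos.h>

int main(void)
{
    union REGS r;

    r.x.ax = 0x0003;   /* mode 3: 80x25 color text, 720x400 on VGA   */
    int86(0x10, &r, &r);

    r.x.ax = 0x1112;   /* INT 10h: load ROM 8x8 font, recompute rows */
    r.h.bl = 0;        /* font block 0                               */
    int86(0x10, &r, &r);   /* 400 lines / 8 = 50 rows (43 on EGA's 350) */
    return 0;
}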