VOGONS


16-bit ISA EGA card?


First post, by r00tb33r

Rank: Member

I've been wondering why most EGA cards weren't made for 16-bit ISA, since EGA debuted alongside the AT systems that introduced the 16-bit ISA bus. Considering the amount of memory an EGA card can have (128KB or more for 16 colors at 640x350), it's odd that they were given only an 8-bit bus.

There are lots of dual-port VGA/EGA cards that are 16-bit. Or is the EGA circuit always 8-bit?

I also wondered why they chose 350 lines, and those exact timings, as that wastes a good bit of the memory page. 400 lines uses the memory page better, and 3rd-party "super EGA" cards did just that. It also forced the weird 14-pixel ROM fonts.

Has anyone built a discrete hobby clone of an EGA card? Is using a 6845 CRTC possible with additional chips, or is EGA too different?

Info is appreciated, thanks!

Reply 2 of 39, by r00tb33r

Rank: Member
rmay635703 wrote on 2022-11-04, 00:14:

EGA is a bunch of custom stuff like VGA

Many programmers didn't move video data 16 bits at a time even with VGA

Curious, since there are 8-bit VGA cards, how does a 16-bit data transfer work, or rather how is it negotiated?

This is conceptually important for me to understand for my project. How do data transfers seamlessly cope with different possible bus widths for devices?

Still, I'm curious to know how IBM(?) settled on 350 lines and ~22KHz scan for EGA - what was the design constraint that forced them to waste memory and end up with an even weirder pixel aspect ratio, and ROM font strangeness?

In a photo of an IBM card I saw a 16.257MHz clock oscillator, and the most info I found is that EGA runs at exactly 60Hz (not 59.94 like VGA and NTSC), with a total frame of 744x364 pixel clocks.
Does anybody know how IBM chose these numbers?

Reply 3 of 39, by Tiido

Rank: l33t

There are two signals (!IOCS16 and !MCS16) that a card asserts in response to an I/O or memory cycle to turn it from 8-bit into 16-bit. When you do a 16-bit or even a 32-bit OUT, it gets automagically transformed into two or four 8-bit transactions, unless the card signals that it can take 16 bits (with one of the two previously mentioned signals), in which case you get one or two 16-bit transactions instead.
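To make that concrete from the software side, here is a little sketch (Borland-style C; outport()/outportb() and the VGA Graphics Controller port are just the usual illustrative choices, not anything negotiation-specific):

    #include <dos.h>            /* Borland/Turbo C outport()/outportb() */

    #define GC_INDEX 0x3CE      /* Graphics Controller index port */

    /* Program a GC register with a single 16-bit OUT: the low byte goes
       to 3CEh (index), the high byte to 3CFh (data).  On an 8-bit card
       the motherboard splits this into two byte cycles; a 16-bit card
       that asserts !IOCS16 takes it in one cycle.  The programmer can't
       tell the difference -- the negotiation happens at the bus level. */
    void write_gc(unsigned char index, unsigned char value)
    {
        outport(GC_INDEX, index | ((unsigned)value << 8));

        /* Explicit equivalent, always two 8-bit cycles:
           outportb(GC_INDEX, index);
           outportb(GC_INDEX + 1, value); */
    }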


Reply 4 of 39, by r00tb33r

Rank: Member
Tiido wrote on 2022-11-04, 04:56:

There are two signals (!IOCS16 and !MCS16) that a card asserts in response to an I/O or memory cycle to turn it from 8-bit into 16-bit. […]

Thanks, that's a useful starting point for further research.

Reply 5 of 39, by mkarcher

Rank: l33t
r00tb33r wrote on 2022-11-03, 23:54:

I've been wondering why most EGA cards weren't made for 16-bit ISA, since EGA debuted alongside the AT systems that introduced the 16-bit ISA bus. Considering the amount of memory an EGA card can have (128KB or more for 16 colors at 640x350), it's odd that they were given only an 8-bit bus.

The EGA was intended to be an upgrade path for PC/XT users as well as a display card for the AT. Where the EGA card really shines is the stepwise upgrades it made possible:

  • You have an MDA + IBM monochrome monitor? Just buy an EGA, and you keep the excellent 80x25 monochrome text mode at 720x350 and get an extra 640x350 graphics mode. You can buy the Enhanced Color Display later when you can afford it.
  • You have a CGA + IBM color display? Just buy an EGA, and you keep all CGA modes, as well as getting 320x200 and 640x200 at 16 colors. You can buy the Enhanced Color Display later when you can afford it.
  • Your IBM color display broke? Just buy an Enhanced Color Display as replacement. It will still work with your CGA card, and you can upgrade to EGA any time you can afford it.

As soon as you were the proud owner of both the EGA card and the Enhanced Color Display, you could enjoy 16-color text modes at 640x350 and high-resolution graphics at 640x350, but every step towards getting there made perfect sense and used the available hardware to the best of its ability. In a CGA/MDA dual-display setup, you can replace either one of those cards with an EGA card to get the benefits on that card, as mentioned above. You can't use two EGA cards to replace both the MDA and the CGA card. (Well, technically you likely can put two EGA cards into a single computer by jumpering one of the cards to its "alternate base address" 2B0..2DF, but you won't have any BIOS support for it, and you need to make sure that you don't get bus conflicts when you try to read the BIOS.)

r00tb33r wrote on 2022-11-03, 23:54:

There are lots of dual-port VGA/EGA cards that are 16-bit. Or is the EGA circuit always 8-bit?

Both the EGA and the VGA circuits are designed around an 8-bit bus interface. The original IBM "video graphics array" chip (their first graphics card in a single custom chip), which was introduced with the IBM PS/2 Model 50 and available on the "PS/2 Display Adapter" ISA card, also had an 8-bit bus interface. The programming model for most advanced capabilities of the EGA/VGA in 16-color modes requires cycles to be performed with 8-bit accesses. The modern 16-bit VGA cards either reject 16-bit cycles in configurations that don't provide straightforward video memory access, or they buffer the cycle on the card and execute it as two 8-bit operations (at a speed independent of the ISA clock, as fast as the card can do). CGA modes and the 256-color mode (without special programming tricks) are 16-bit friendly, so the first generation of 16-bit EGA/VGA cards likely supported 16-bit transfers only in those modes.
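To illustrate why those cycles want to be 8 bits wide, here is a minimal pixel-write sketch (Borland-style C; assumes mode 10h is already set, and register save/restore is omitted). The read that loads the four per-plane latches and the write that merges through them are both inherently byte-granular:

    #include <dos.h>

    #define GC_INDEX     0x3CE
    #define GC_SET_RESET 0x00
    #define GC_ENABLE_SR 0x01
    #define GC_BIT_MASK  0x08

    /* Set one pixel in 640x350x16 (mode 10h), write mode 0 + Set/Reset. */
    void put_pixel(unsigned int x, unsigned int y, unsigned char color)
    {
        unsigned char far *vram = (unsigned char far *)MK_FP(0xA000, 0);
        unsigned int off = y * 80u + (x >> 3);
        volatile unsigned char latch;

        outportb(GC_INDEX, GC_SET_RESET); outportb(GC_INDEX + 1, color & 0x0F);
        outportb(GC_INDEX, GC_ENABLE_SR); outportb(GC_INDEX + 1, 0x0F);
        outportb(GC_INDEX, GC_BIT_MASK);  outportb(GC_INDEX + 1, 0x80 >> (x & 7));

        latch = vram[off];   /* 8-bit READ fills the four plane latches */
        vram[off] = 0;       /* 8-bit WRITE: CPU data is ignored, the
                                Set/Reset color lands in the masked bit,
                                the latches restore the other seven */
    }

A 16-bit cycle hitting this path would have to be decomposed into exactly two such byte operations on the card - which is the buffering described above.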

Anyway, we got 16-bit graphics cards as soon as chips got complex enough to emulate an 8-bit graphics card without losing all the benefits of the 16-bit slots. And by the time we got there, VGA cards were considered standard. Especially as VGA is just an incremental upgrade from EGA (the actual designs are very similar), building a third-party VGA card that can "emulate" EGA was not that much more expensive than building a third-party (super) EGA without VGA features. Supporting VGA made the card appealing to a lot more customers, so building a 16-bit EGA-only board made no sense at that time.

r00tb33r wrote on 2022-11-03, 23:54:

I also wondered why they chose 350 lines, and those exact timings, as that wastes a good bit of the memory page. 400 lines uses the memory page better, and 3rd-party "super EGA" cards did just that. It also forced the weird 14-pixel ROM fonts.

The main point of EGA seems to have been to get "the best of both worlds" (MDA and CGA): MDA had the selling point of high-quality text at 720x350 pixels, using a 9x14 font, whereas CGA had the selling point of 16 colors. So the design target of EGA was 16 colors at 350 lines, and that's what they did. The main design results were:

  • The MDA display was flicker-free at 50Hz, but only because the monitor had quite a long persistence (the technical term is "medium-persistence phosphor"). This kind of phosphor is inappropriate for animated (color) graphics, so a higher refresh rate was needed for EGA. CGA already had a 60Hz refresh, which proved high enough to be accepted for use with short-persistence phosphor monitors.
  • MDA uses a 16.257MHz dot clock. To save cost, the EGA re-uses the same clock for its high-resolution modes, but unlike MDA, the goal is 60Hz instead of 50Hz. That's why EGA went for 640x350 instead of 720x350 in text mode (and downgraded the font from 9x14 to 8x14).
  • For CGA compatibility, we can't get rid of the 8x8 font. For MDA compatibility, we can't get rid of the 9x14 font. As the EGA doesn't use a dedicated character generator ROM, the fonts have to be stored in the BIOS ROM. In fact, the EGA stores an 8x8 font, an 8x14 font, and a list of characters to replace in the 8x14 font if it is running in MDA-compatible mode. A very notable replacement character is "m". IBM uses two-pixel-wide vertical lines in their default fonts, so the m would need 6 white pixels for the three stems, separated by one black pixel each, making 8 pixels in total. This means the m would touch both ends of the character box, and subsequent m characters would run into each other. To avoid that effect, the 8-pixel-wide m uses only a single-pixel-wide central stem and an empty space in the 8th pixel. The 9-pixel-wide m doesn't have this issue and can be displayed with even stem widths. Re-using the 14-pixel ROM font instead of adding a 16-pixel ROM font saved 4KB of ROM space - that's 25% of the 16KB BIOS ROM on an EGA card.

r00tb33r wrote on 2022-11-03, 23:54:

Has anyone built a discrete hobby clone of an EGA card? Is using a 6845 CRTC possible with additional chips, or is EGA too different?

The MDA and CGA cards show the limit of what you can achieve with an integrated CRTC and all other functions performed by discrete chips. Both of those cards took up the maximum available space on an ISA card. Granted, they were through-hole, and we could save space today using SMD chips, but that would be (estimated) just a factor of two in complexity. The logic of the EGA card is way more complex than that of the CGA or MDA. If you want your head to blow up, just try to understand what the graphics controller can do and how you would implement that with discrete logic. Note that IBM's EGA was also a full-sized board with just 64KB of RAM, and needed an upgrade board for the full 256KB. The RAM problem would be solved today, as we don't need to build the video RAM from 16K x 4 chips; with 64K x 4 chips, 256KB of RAM fits where IBM just managed to place 64KB. But EGA was only possible because the remaining logic was consolidated into ASICs:

  • The CRTC, while 6845-inspired, is different enough from the 6845 that you can't emulate it using a 6845.
  • The Timing Sequencer, generating the clock signals and memory access signals at the correct timing
  • The Attribute Controller, managing the 64-color palette in high res modes, the border color and the blinking
  • The Graphics Controller, interfacing the video memory at a width of 32 bits, mapping it to the 8-bit ISA bus, and generating sequences of 4-bit colors to be sent to the Attribute Controller. As IBM was limited to 40-pin DIP chips, the Graphics Controller was implemented 16 bits wide, so they placed two Graphics Controller chips onto the EGA card. And that's why we have these strange "graphics position 0" and "graphics position 1" registers on the EGA: these registers are used to initialize the graphics controllers and tell each one whether it is meant to handle the low or the high 16 bits. All other graphics controller registers always access both controllers at the same time, so they need to be set up correctly to work as a virtual 32-bit graphics controller (see the sketch below).

I don't see how you could manage to put all the stuff IBM integrated into these chips onto a single ISA board using just 74-series logic. Even if you could re-use the 6845, you wouldn't fit the graphics controllers onto the board.
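To make the lockstep setup concrete, a sketch (Borland-style C; the position register ports 3CCh/3CAh and their init values are the ones I remember from the IBM EGA technical reference, so verify before relying on them):

    #include <dos.h>

    /* EGA only: split the 32-bit video data path between the two 16-bit
       Graphics Controller chips, then program them as one. */
    void init_graphics_controllers(void)
    {
        outportb(0x3CC, 0x00);  /* Graphics 1 Position: low 16 bits  */
        outportb(0x3CA, 0x01);  /* Graphics 2 Position: high 16 bits */

        /* From here on, every write to the shared index/data pair at
           3CEh/3CFh reaches both chips simultaneously -- one virtual
           32-bit graphics controller. */
        outportb(0x3CE, 0x08);  /* e.g. select the Bit Mask register  */
        outportb(0x3CF, 0xFF);  /* ...and set it on both chips at once */
    }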

EDIT: fixed quote formatting

Last edited by mkarcher on 2022-11-04, 19:32. Edited 1 time in total.

Reply 6 of 39, by Grzyb

Rank: Oldbie
r00tb33r wrote on 2022-11-04, 02:41:

Still, I'm curious to know how IBM(?) settled on 350 lines

350 scanlines were already used earlier, in MDA.
I think the general idea was to create a replacement for both MDA and CGA - therefore 350 lines, color, and graphics.

and ~22KHz scan for EGA

The MDA monitor has a long-persistence phosphor, so a 50 Hz refresh rate is enough to be flicker-free.
For a color monitor, however, it was necessary to increase that to 60 Hz, so the HSYNC also had to be increased - to 22 kHz.
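(Back-of-the-envelope, assuming the standard 364-line vertical total: 364 lines x 60 Hz = 21840 Hz - hence the ~22 kHz.)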

what was the design constraint that forced them to waste memory

HSYNC frequency was expensive - there was no easy way to increase vertical resolution.
Around the same time, IBM also introduced the PGC, with 31 kHz HSYNC and 480 scanlines, but the price was prohibitive for the average user.

and end up with an even weirder pixel aspect ratio

It wasn't weirder:
CGA - 640/200 = 3.2
MDA - 720/350 = 2.057
EGA - 640/350 = 1.829
actually, it was a step closer to the square pixel perfection of 1.333

and ROM font strangeness

No strangeness, the 8x14 font was already in MDA.

In a photo of an IBM card I saw a 16.257MHz clock oscillator

Again, same as on MDA.
Note that EGA can also use the MDA monitor.

The 16.257MHz pixel clock is used for both EGA and MDA monitors:
EGA - 640 pixels wide, 60 Hz
MDA - 720 pixels wide, 50 Hz
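The arithmetic checks out, too (frame totals as commonly documented - quoted from memory, so treat as approximate):
EGA: 16,257,000 / 744 total clocks per line = ~21.85 kHz HSYNC; / 364 total lines = ~60.0 Hz
MDA: 16,257,000 / 882 total clocks per line = ~18.43 kHz HSYNC; / 370 total lines = ~49.8 Hz
Same crystal, two different monitors.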


Reply 7 of 39, by Jo22

Rank: l33t++
Grzyb wrote on 2022-11-04, 09:14:

The MDA monitor has a long-persistence phosphor, so a 50 Hz refresh rate is enough to be flicker-free.
For a color monitor, however, it was necessary to increase that to 60 Hz, so the HSYNC also had to be increased - to 22 kHz.

+1

I mostly think the same, but... now let's tell that to all the PAL countries with 50 Hz TVs and 50 Hz lighting. Back in the day, I mean. 😉
MDA and Hercules with their 50 Hz felt strangely familiar, like PAL or our classic 625-line monochrome video monitors, I think.
A few years later, 100 Hz / 120 Hz CRT TVs became standard worldwide, too.

And in computing, VGA with 70 Hz improved on that even further, thankfully. 🙏
The 60 Hz flicker of VGA's fine 640x480 resolution wasn't great, though.
At least text mode ran at 70 Hz. And 640x400 (VBE).

I remember being cured of mysterious headaches and eye strain when switching to LCD monitors because of this.
That's because I mainly hung out with DOS stuff, which cared very little about the 85 to 120 Hz that Windows 9x offered.

Edit: The AOL thing for DOS can do EGA in 640x350 monochrome, among other modes.

Attachments

  • pcem_ega_hgc_mono.png (18.03 KiB)

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 8 of 39, by r00tb33r

Rank: Member

Thanks for the history lesson, you guys, definitely something I didn't know.

The upgrade path certainly makes sense. I had heard the term "MDA" before, but it being a text-only output, I never gave it much thought and didn't remember its specs.

Do you know why IBM chose 350 lines for MDA?

mkarcher wrote on 2022-11-04, 08:44:

As the EGA doesn't use a dedicated character generator ROM, the fonts have to be stored in the BIOS ROM. […]

I meant that, from a ROM address-generation standpoint, 14 is ugly relative to 16. A power of 2 is much easier for simple counters/gate comparators. It's a matter of logic design elegance and simplicity.
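A quick sketch of the difference, with plain C standing in for the address logic (illustrative only):

    /* Glyph-ROM addressing with a 16-row cell: the row bits simply
       concatenate below the character code -- pure wiring, no logic. */
    unsigned rom_addr_h16(unsigned char code, unsigned char row)
    {
        return ((unsigned)code << 4) | (row & 0x0F);
    }

    /* With a 14-row cell you need a real multiplier or an adder tree
       (14*code = (code<<3) + (code<<2) + (code<<1)) plus the row offset. */
    unsigned rom_addr_h14(unsigned char code, unsigned char row)
    {
        return (unsigned)code * 14u + row;
    }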

From what I recall, a fun fact regarding the 9-pixel-wide glyphs is that the ROM is 8 bits wide and the 9th pixel is generated using special-case circuit trickery, mainly for the box-drawing characters, as it's normally blank for other symbols. For this reason I personally like 6-pixel-wide ROM fonts, as they cleanly accommodate letters like "M" and "W" while keeping the pixel clock down.

I mean, it's just such a shame that they didn't go with 400 lines to fill the memory page. 3rd-party cards certainly went for it.

Grzyb wrote on 2022-11-04, 09:14:

It wasn't weirder:
CGA - 640/200 = 3.2
MDA - 720/350 = 2.057
EGA - 640/350 = 1.829
actually, it was a step closer to the square pixel perfection of 1.333

Well, not quite. Using your math:
CGA - 320/200 = 1.6
Super EGA - 640/400 = 1.6

So compared to CGA it's a step in the wrong direction in terms of pixel aspect ratio. But yeah, I just didn't consider the existence of MDA.


If I bought an IBM EGA card to play with (for circuit-design purposes), would it work with the BIOS in a 486 machine? The board I usually use has an AMI graphical BIOS. I also use a 440BX board with an Award(?) BIOS.

Reply 9 of 39, by Grzyb

Rank: Oldbie
r00tb33r wrote on 2022-11-05, 04:47:
Well, not quite. Using your math:
CGA - 320/200 = 1.6
Super EGA - 640/400 = 1.6

So compared to CGA it's a step in the wrong direction in terms of pixel aspect ratio. But yeah, I just didn't consider the existence of MDA.

CGA maximum resolution is 640x200.
So, the full path towards the square pixel was:

1981 - CGA - 640/200 = 3.2
1981 - MDA - 720/350 = 2.057
1984 - EGA - 640/350 = 1.829
198x - various non-IBM adapters - 640/400 = 1.6
1987 - VGA - 640/480 = 1.333

Precisely speaking, the square pixel was already available in 1984 with the PGC, but that was expensive high-end stuff, so the mainstream had to wait for the VGA.


Reply 10 of 39, by mkarcher

Rank: l33t
r00tb33r wrote on 2022-11-05, 04:47:

Do you know why IBM chose 350 lines for MDA?

Grzyb wrote on 2022-11-04, 09:14:

HSYNC frequency was expensive, no way to easily increase vertical resolution.

That's essentially the key point. The more scanlines you have, the higher the HSYNC frequency needs to be, so more scanlines directly mean more expensive monitors. Take a look at the IBM MDA monitor schematics in the IBM PC Technical Reference: the whole thing fits on a single sheet! That monitor is built in a very simple and straightforward way, obviously to keep production costs low. 350 scanlines are enough for perceived "high-quality text", whereas 200 scanlines are barely enough for a recognizable representation of Latin uppercase and lowercase letters, including descenders. 14 lines per character seems like a good compromise between quality and horizontal scan frequency for text display.

r00tb33r wrote on 2022-11-05, 04:47:

I meant that, from a ROM address-generation standpoint, 14 is ugly relative to 16. A power of 2 is much easier for simple counters/gate comparators. It's a matter of logic design elegance and simplicity.

The MDA and CGA shared the same mask-programmed character generator ROM. The MDA font is stored in an 8x16 frame, and there are two 8x8 CGA fonts in that ROM: the normal font and a second one that doesn't use 2-pixel-wide vertical lines. Address generation and ROM space were not a problem while they used a dedicated character generator ROM: you couldn't buy a 3.5K ROM chip, so a "minimal MDA" would have needed a 4K ROM chip anyway, and as IBM decided to use the same ROM for MDA and CGA, they went for an 8K ROM chip. On the other hand, the fonts stored in the EGA ROM are not used by the display hardware for character generation. The EGA takes the font from video RAM, and fonts in video RAM are always stored in an 8x32 frame, so again a power-of-2 height. When a text video mode using the 8x14 font is initialized, the 8x14 characters from the ROM are copied into the top 14 scanlines of each 8x32 character slot in RAM.
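A sketch of that copy step (Borland-style C; the sequencer/graphics controller setup that makes plane 2 CPU-writable at A000:0000 is assumed and omitted here):

    #include <dos.h>

    /* Load a 256-character 8x14 font into the EGA/VGA character
       generator RAM: plane 2, one 32-byte slot per glyph, of which the
       top 14 bytes are used. */
    void load_font_8x14(const unsigned char far *font)  /* 256*14 bytes */
    {
        unsigned char far *plane2 = (unsigned char far *)MK_FP(0xA000, 0);
        unsigned int ch, row;

        for (ch = 0; ch < 256; ch++)
            for (row = 0; row < 14; row++)
                plane2[ch * 32u + row] = font[ch * 14u + row];
    }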

r00tb33r wrote on 2022-11-05, 04:47:

From what I recall, a fun fact regarding the 9-pixel-wide glyphs is that the ROM is 8 bits wide and the 9th pixel is generated using special-case circuit trickery, mainly for the box-drawing characters, as it's normally blank for other symbols. For this reason I personally like 6-pixel-wide ROM fonts, as they cleanly accommodate letters like "M" and "W" while keeping the pixel clock down.

Pixel clock isn't really an important constraint, unless you want to generate a composite color video signal, which is severely bandwidth-limited. The original EGA card had quite a lot of margin around the pixel clock. The EGA could be switched to a mode where it took the clock from its feature connector instead of an on-card clock; there are 132-column add-ons for the EGA card that do little more than inject a 24MHz clock on the feature connector and add BIOS support for using that clock and setting up video modes with EGA-compatible VSYNC/HSYNC timings.

You are completely right about the trickery for 9-pixel fonts: the MDA card uses discrete logic that generates a blank 9th pixel for all characters except those in the range 192-223 (C0-DF hex), for which the 8th pixel is repeated. The CGA is capable of 8-pixel fonts only and doesn't need this circuit. In the EGA/VGA, the attribute controller chip integrates this logic as an optional feature: EGA/VGA can operate in 8-pixel, 9-pixel-with-hack, or 9-pixel-9th-always-blank mode. As you can see, the EGA/VGA is again more complicated than the earlier cards, making a discrete re-implementation difficult.
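The whole 9th-pixel rule fits in a couple of lines (sketch):

    /* Line-graphics-enable behaviour as described above: column 9
       repeats column 8 only for codes C0h-DFh, otherwise it is blank. */
    unsigned char ninth_pixel(unsigned char code, unsigned char col8)
    {
        return (code >= 0xC0 && code <= 0xDF) ? col8 : 0;
    }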

r00tb33r wrote on 2022-11-05, 04:47:

I mean, it's just such a shame that they didn't go with 400 lines to fill the memory page. 3rd-party cards certainly went for it.

In 1984, third-party cards didn't have 400-line modes (if there were third-party EGA cards at all). The monitors just weren't there yet. The third-party Super EGA cards started to appear around 1986 or 1987, when monitors that could display this resolution (with 25 to 28 kHz HSYNC) were available at affordable prices. In my opinion, it's not historically fair to compare the specs of a 1984 board to what competitors did 3 years later, after the market had changed. That's like complaining about the mediocre clock efficiency of the Intel 8088 with the NEC V20 as the comparison. While it's true that the V20 is considerably more efficient than the 8088 (and thus suffers even more from memory starvation), the V20 was designed later and uses around twice as many transistors. At the time the 8088 was designed, a transistor budget the size of the V20's was infeasible at the target price point.

r00tb33r wrote on 2022-11-05, 04:47:

If I bought an IBM EGA card to play with (for circuit-design purposes), would it work with the BIOS in a 486 machine? The board I usually use has an AMI graphical BIOS. I also use a 440BX board with an Award(?) BIOS.

I'm unsure whether the graphical AMI BIOS ("AMI WinBIOS") is EGA compatible. All other kinds of BIOSes work perfectly with any kind of video card, even MDA or CGA.

Reply 11 of 39, by dr.zeissler

Rank: l33t
mkarcher wrote on 2022-11-04, 08:44:

The EGA was intended to be an upgrade path for PC/XT users as well as a display card for the AT. […]

Excellent information! Thx! I am thinking about swapping my ET4000 for something more period-correct in my A2000/A2286 BB. I was thinking about an EGA card... Were there EGA cards that offered more resolutions and colors than the standard EGA cards did? I am still searching for an overview of EGA cards with a comparison... perhaps I'll find something like that on archive.org.

Again, thx for the excellent information!


Reply 12 of 39, by mkarcher

Rank: l33t
dr.zeissler wrote on 2024-01-14, 08:51:

Excellent information! Thx! I am thinking about swapping my ET4000 for something more period-correct in my A2000/A2286 BB. I was thinking about an EGA card... Were there EGA cards that offered more resolutions and colors than the standard EGA cards did? […]

Indeed, there were EGA cards that supported higher resolutions than the standard EGA card; those cards were commonly called "Super EGA" cards. They did not extend the EGA architecture, though, and thus still provided the same number of colors (16 out of 64).

A lesser-known fact about the IBM EGA combined with the Enhanced Color Display is that it provides the 64-color palette only in high-resolution (350-line) modes, not in low-resolution (CGA-compatible) modes, which use 16 colors only. This limitation was implemented in the monitor, to make it a drop-in replacement for the 16-color CGA monitor on CGA cards that wouldn't provide the correct color signalling for 64 colors. There was also no space on the 9-pin connector for a dedicated "64-color enable" signal, so IBM re-used the "high resolution enable" flag derived from the VSYNC polarity to also generate a "64-color enable" signal. Third-party monitors can choose to implement or not implement the color mode switch (some are configurable). Getting 16-out-of-64 colors in CGA-compatible 200-line modes is thus not something that can be implemented in "enhanced EGA cards", only in a different monitor. So the "Super EGA" cards do not provide more colors. (Except, of course, if you compare the base configuration of the EGA with just 64KB RAM to the base configuration of any Super EGA card at 256KB RAM: the EGA in base configuration only provides 4-color high-resolution graphics, whereas Super EGA cards provide 16-color graphics - but this can be achieved on the original EGA as well if you upgrade the RAM to at least 128K.)
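Summarized as a table (monitor-side decode; the polarity assignment is quoted from memory, so verify against the IBM documentation):

VSYNC polarity | Monitor mode        | Palette decode
positive       | 200 lines (CGA-ish) | 4-bit IRGB, 16 colors
negative       | 350 lines           | 6-bit RrGgBb, 64 colors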

Super EGA cards do not change the monitor interface: it's still a 6-bit RrGgBb digital interface, and a palette wider than 64 colors is not possible without a proprietary monitor connection. There are some graphics chips that do provide 9-bit digital RGB output (3 bits per component) as an option, but they never became common in PCs, and I have yet to see any CRT monitor implementing an interface for that; most likely, the 9-bit mode was used in proprietary systems only. As soon as you replace the digital monitor interface of a (Super) EGA card with an analog interface, allowing an unlimited number of colors, you have 90% of a (Super) VGA card, so Super EGA cards with analog monitor output are extremely rare or do not exist at all.

So if you go (Super) EGA, make sure you have a solution to make the output visible! You need a digital RGB monitor that supports the video frequencies generated by the EGA card. For the high resolutions on Super EGA cards, there is no way around a "digital multisync" monitor (or a digital/analog switchable multisync monitor) if you want a CRT connected directly to the EGA card. The other solution is a conversion board (like the MCE2VGA) with a monitor supporting analog RGB below VGA frequencies (which people claim is supported by a couple of LCD monitors that also accept composite video input).

Information about Super EGA cards can be scavenged from computer magazines of around 1986-1989; I don't know whether anyone has assembled a list. Super EGA cards are usually based on highly integrated EGA chipsets by Genoa, Chips & Technologies, or Paradise. It might help to add these chip manufacturer names to search queries to get more focused results.

Reply 13 of 39, by dr.zeissler

Rank: l33t

This site has some really useful information: http://www.dosdays.co.uk/dos_hardware_index.php (but I think the backend of this page is rather weak - since I linked it on another forum, the site seems SLOWWWW 😀)

I think I own some of those cards already https://www.flickr.com/photos/94839221@N05/al … 57657710688635/


Reply 14 of 39, by Grzyb

Rank: Oldbie
dr.zeissler wrote on 2024-01-14, 08:51:

I am thinking about swapping my ET4000 for something more period-correct in my A2000/A2286 BB. I was thinking about an EGA card...

The Amiga 2000 period was 1987-1991.
The ET4000 was released in 1989 or 1990.

So it's pretty much period-correct - no need to back off to EGA.
But if you really want something earlier, then any early VGA would be perfect, e.g. ET3000, TVGA8800, PVGA1A, ATI 18800... and many such cards have two outputs: VGA and TTL.

The 18800 chip in particular can be found on different cards:
- with DAC and analog output: ATI VGA Wonder
- without DAC, TTL output only: ATI EGA Wonder 800+

Were there EGA cards that offered more resolutions and colors than the standard EGA cards did?

EGA means 6-bit TTL output, so the palette is always 64 colors.

But greater resolutions, yes, e.g.:
- ATI EGA Wonder - 800 x 560
- ATI EGA Wonder 800+ - 800 x 600
A multisync monitor with 15-35 kHz HSYNC is necessary.


Reply 15 of 39, by dr.zeissler

Rank: l33t

Thx, no problem - I have an MCE adapter that works with a NEC 1970NXp for MDA/CGA/EGA.
Because I already have various VGA machines, I am thinking about EGA hi-res.
The ET4000 is indeed nice in that machine, but I mostly use 16 colors in games because that is much faster on that low-end machine. https://youtu.be/P2z_6hgYvcs

btw. are there VGA cards with a 9-pin output for TTL monitors... do they display 256 colors...?


Reply 16 of 39, by Grzyb

Rank: Oldbie
dr.zeissler wrote on 2024-01-14, 09:57:

btw. are there VGA cards with a 9-pin output for TTL monitors... do they display 256 colors...?

Yes, plenty of early VGA clones have two outputs - see e.g. the already mentioned ATI VGA Wonder.
But of course not all modes are supported on the TTL output - there is no way to display 256 colors.


Reply 17 of 39, by Jo22

Rank: l33t++

There was also an analogue 9-pin connector on early VGA cards, I vaguely remember.
It carried merely the very basic VGA signals: R, G, B, H-Sync, V-Sync, GND (3x?), and a mono monitor detection pin.
But I can't find an actual pinout diagram, sadly.

Edit: It's theoretically possible to pulse the RGB lines on a TTL monitor.
With something like PWM (pulse width modulation), different brightness levels could be achieved. But that's just theory.

The closest I can think of are MDA monitors that were used as cheap CGA displays.
Some CGA/EGA clone cards had the ability to simulate the colours as shades of grey on an MDA (Hercules) monitor.

But I'm not sure how this was done.
By using the intensity pin, or by doing PWM on the main TTL video pins?

Edit: A real EGA TTL monitor has Rr/Gg/Bb pins.
2 bits for each primary colour, essentially. So it can physically display 64 colours simultaneously.

It's the EGA card that's the limiting factor, though.
A VGA card could display all 64 colours, by contrast.
If there was a 64c video mode, I mean.

Edit: There's one EGA game that cheats to display those 64 colours.
It switches between palettes very quickly.
64 simultaneous colors possible with EGA?

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 18 of 39, by Grzyb

Rank: Oldbie

Come to think of it, maybe there was a Super EGA card with 64 simultaneous colors?
I'm not aware of one, but we all know that absence of evidence is not evidence of absence.
There would be one big problem with such a card, though: it would be difficult to convince software vendors to support such an oddball.
