VOGONS


First post, by drsly

User metadata
Rank Newbie

Hi,

Working on VGA aspect of my PC emulator, but stuck on which register(s) determine 320 vs 640 pixels in horizontal.

https://wiki.osdev.org/VGA_Hardware
http://www.osdever.net/FreeVGA/vga/vga.htm

These registers I understand:

  • End Horizontal Display (3d4 index 01h) - set to the number of character clocks - 1. Controls how many pixels make up a scan line. Most modes use 79, giving 640 or 720 pixels depending on 9/8 dot mode.
  • Clocking Mode Register (3c4 index 01h) bit 0 - used to select between 720 and 640 pixel modes (or 360 and 320) and also to provide 9-bit wide character fonts in text mode.
  • Offset (3d4 index 13h) - this field specifies the address difference between consecutive scan lines or two lines of characters. This register can be modified to provide a virtual resolution, in which case Width is the width in pixels of the virtual screen.

What I don't get is what selects between, for instance, 320 and 640 pixels.

From the documentation, I considered these possibilities:

  • Clocking Mode Register (3c4 index 01h bit 3) - Dot Clock Rate. When this bit is set to 1, the master clock will be divided by 2 to generate the dot clock. All other timings are affected because they are derived from the dot clock. The dot clock divided by 2 is used for 320 and 360 horizontal PEL modes.
  • CRTC Mode Control Register (3d4 index 17h bit 3) - Divide Memory Address clock by 2. When this bit is set to 1, the address counter uses the character clock input divided by 2. This bit is used to create either a byte or word refresh address for the display buffer.

The problem is that neither of these bits is set under mode 13h (320 x 200). My guess is that there are other registers involved, but which?

Reply 1 of 5, by superfury

User metadata
Rank l33t++
drsly wrote on 2020-01-11, 15:08:
Hi, […]

It's a bit more complicated than that. Most of those clocks don't affect each other as simply as '640 or 720 pixels' or 'double width or not'.

There are different clocks (dot clocks, pixel clocks) affecting the output in different ways, all working in parallel (like a multicore CPU, with each thread handling e.g. the dot clock or the pixel clock respectively).

Basically, there's a main clock that's generating the display signal (e.g. overscan, active display and blanking signal outputs).
Then, in parallel to that clock, there are at least two other clocks (or ones based on the common clock) that determine the fetching and shifting of the pixels to the DAC.
And with advanced DACs, the DAC can have its own 'clock' (derived from the dot clock) for higher pixel depth latching as well (e.g. 15/16-bit/32-bit color modes from 8BPP pixel inputs).

UniPCemu has this implemented already:
https://bitbucket.org/superfury/unipcemu/src/ … /vga_renderer.c

Look at the VGA_Renderer function for its implementation (it starts out by checking the display state, precalculated as horizontal/vertical timing precalcs for speed; see vga_precalcs.c and vga_crtcontroller.c for most of those).

Essentially, it starts with 3 basic parallelized steps (although executed serially):
1. Get the display state (the current scanline's current rendering pixel, taken from the precalcs). This is based on a horizontal and vertical counter for the virtual rendering clock in UniPCemu (its CRTC precalcs). The result is a bitmask identifying different rendering states for the following stages (e.g. active display, blanking, retracing etc.).
2. Assign the current display and rendering signal (e.g. active display, overscan, blanking and retracing combined). This also handles stuff like retracing and horizontal/vertical total triggers.
3. Handle the current rendering scheme (active display or NOP).

The rendering scheme for overscan is simple, just render the color.

The active display generation is a bit more complicated (e.g. VGA_ActiveDisplay_Text):
- First, determine if we're to handle an active rendering clock or wait (see VGA_ActiveDisplay_timing(), which handles the ticking of the rendering scheme for the current mode. This basically takes the byte/word/doubleword modes for the rendering scheme into account, together with shifts etc.). Its states can be found in vga_crtcontroller.h.
- When rendering actively, first get the current pixel from the text/graphics mode pixel generator (from the latched values from VRAM and, in text mode only, the used font RAM).
- Then, check it against the attribute controller. If latched and not renderable yet (e.g. 8-bit (VGA and up)/16-bit pixel (SVGA only)), stop handling and don't render.
- Finally, when not latching, render the pixel(s), with the attribute controller latching duplicating the pixels some number of times, as well as the dot clock.

Basically, there's a dot clock (primary clock); the character clock is derived from the divided dot clock (secondary clock) in 8 or 9 dots; and from that, in turn, another clock is derived (the half character clock on e.g. the ET4000's 16-bit color modes) for the VRAM timing (e.g. character/pixel fetching and latching from VRAM) for the Memory Address Counter and Video Load Rate.

Even more fun to be had: the character clock in SVGA modes (e.g. ET4000AX) can be 9 dots/clock, while the display rendering still needs to use 8 dots/clock (because it can't handle 9 pixels from 8 pixels of video RAM data).

Author of the UniPCemu emulator.
UniPCemu Git repository
UniPCemu for Android, Windows, PSP, Vita and Switch on itch.io

Reply 2 of 5, by superfury

User metadata
Rank l33t++

Also, as a side note, mode 13h is in fact an oddball: it sets the 8BPP mode (8-bit color enable in the Attribute Controller registers), while the sequencer still produces 4-bit pixels (it IS rendering 640 4BPP pixels horizontally, just like any 640-pixel 4-bit EGA/VGA color mode, so the Sequencer and the CRTC are still rendering 640 pixels). Those pixels are fetched and given to the attribute controller at 640/720 4-bit width, but the attribute controller latches each 2 4-bit pixels into one 8-bit pixel and emits it every second clock (when it has the full 8-bit value latched, see http://www.osdever.net/FreeVGA/vga/vgaseq.htm ) to the DAC, which renders the two pixels in said color as one pixel.

Although the doubling of pixels should indeed also be done by setting the Dot Clock Divide by 2 bit for said mode, so having it not set means that there's a bug somewhere in the software driving the VGA controller, since the 320 pixels that are rendered should be stretched to fit the 640 pixel width.

So, the Sequencer simply renders 640 4-bit pixels from VRAM to the Attribute Controller.
Then the Attribute Controller combines each 2 of those 4-bit pixels into one 8-bit pixel at single width, or at double width when the Divide Dot Clock by 2 bit is active.
The DAC simply works in its normal VGA-compatible 8BPP mode, rendering those 8-bit pixels from its palette.

Author of the UniPCemu emulator.
UniPCemu Git repository
UniPCemu for Android, Windows, PSP, Vita and Switch on itch.io

Reply 4 of 5, by superfury

User metadata
Rank l33t++

Hmmm...
https://hackaday.io/project/6150-beckman-du-6 … -a-new-vga-mode

That makes me think about the SLR/SL4 bits in Sequencer Register 1...
What happens when it's set to 2 or 4 clocks?
Does it really shift the bytes from the low planes into the high planes (essentially a 32-bit shift (uint_32 x = x << 1), where the MSB = plane 0?) on the latched planes? What would be the purpose (other than support for 1-bit modes, other modes being unusable)? What about the other graphics modes?

Author of the UniPCemu emulator.
UniPCemu Git repository
UniPCemu for Android, Windows, PSP, Vita and Switch on itch.io

Reply 5 of 5, by drsly

User metadata
Rank Newbie
superfury wrote on 2020-01-11, 17:43:

Also, as a side note, the mode 13h is an oddball in fact: it sets the 8BPP mode (8-bit color enable in the Attribute Controller registers) […]

Took me a while, but I think I finally understand...

[256-Color Shift Mode]
Graphics Mode Register (Index 05h), bit 6
"When set to 1, this bit causes the shift registers to be loaded in a manner that supports the 256-color mode."

[8-bit Color Enable]
Attribute Mode Control Register (Port 3C0 Index 10h), bit 6
"When this bit is set to 1, the video data is sampled so that eight bits are available to select a color in the 256-color mode (0x13). This bit is set to 0 in all other modes."

And from http://www.osdever.net/FreeVGA/vga/vgaseq.htm:

When this shift mode (256-Color Shift Mode field is set to 1) is enabled, the VGA hardware shifts 4 bit pixel values out of the 32-bit memory location each dot clock. This 4-bit value is processed by the attribute controller, and the lower 4 bits of the resulting DAC index is combined with the lower 4 bits of the previous attribute lookup to produce an 8-bit index into the DAC palette. This is why, for example, a 320 pixel wide 256 color mode needs to be programmed with timing values for a 640 pixel wide normal mode.

Mode 13h has 8-bit Color Enable set to 1. As you mentioned, this means the sequencer still feeds 640/720 4-bit values into the attribute controller, but the AC combines them into 320/360 8-bit values. I did not realise before that every 8-bit value is rendered as 2 pixels.