First post, by superfury
I know that it loads 32 bits from all 4 planes and splits them up into pixels (8 bits per pixel). What I don't understand is that the Clocking Mode register (Sequencer Clocking Mode Register) is set to 0 (both Shift/Load Rate and Shift 4 bits), which makes the VGA read VRAM every 8 pixels. After 4 pixels the data read from VRAM is already exhausted (32 bits hold 4 pixels of 8 bits each). How is this handled in actual VGA hardware?
Can anyone explain this to me?
Edit: So it seems that it actually shifts out 4 bits 8 times (still 8 pixels processed from the Sequencer's point of view), but these are somehow combined into 4 pixels on the screen? How does this compare to normal CRTC timing? Is the CRTC timing halved (divided by 2) to make this work?
Edit: I managed to get this working on plain VGA hardware (and thus its SVGA ET3000/ET4000 extension). If I use the extended modes (anything other than 16-color 640x480), which use more than 256K of memory (the 800x600x16, 1024x768x16 and 640x480x256 modes), the display gets messed up, with parts of it merged into one big mess on the screen:
Does anyone know what the cause of this might be?
Edit: Looking at a dump of VRAM when the first screen of the graphical interface is shown (gray active display) reveals that only memory up to 0x3FFFF (256K of RAM) is filled with data.
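For reference, a sketch of how I'd expect ET4000-style banking to behave (based on my reading of the Tseng docs, not UniPCemu's actual code; the function name is mine):

```c
#include <stdint.h>

/* Sketch: the ET4000 Segment Select register at port 3CDh holds a 4-bit
   write bank (bits 0-3) and a 4-bit read bank (bits 4-7); the selected
   64K bank is what the A0000h window reaches. With four planes behind
   each byte address, one 64K bank spans 256K of VRAM, so if the register
   were being ignored, every write would land in the first 256K, which
   would match a dump that stops at 0x3FFFF. */
uint32_t map_write_address(uint8_t segment_select, uint16_t window_offset,
                           int banking_applied)
{
    uint32_t bank = banking_applied ? (segment_select & 0x0F) : 0;
    return (bank << 16) | window_offset; /* 64K bank granularity */
}
```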
Edit: After disabling the behaviour that the extended timings and registers only take effect when the extensions are enabled using the KEY (as described in the documentation), I get a wider screen (doubled horizontal timings):
Can anyone see what's going wrong here?
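My current guess as a C sketch (the extended register and its bit layout here are hypothetical, purely to illustrate the doubling effect):

```c
#include <stdint.h>

/* Illustration only: many SVGA chips widen the 8-bit VGA CRTC horizontal
   fields with an extra high bit in an extended overflow register. If such
   a bit is honoured even when the KEY should lock the extensions out, a
   plain-VGA mode can end up with bit 8 of its horizontal total set,
   roughly doubling the horizontal timing, i.e. a "wider screen". */
uint16_t effective_htotal(uint8_t crtc_htotal, uint8_t ext_overflow,
                          int extensions_unlocked)
{
    uint16_t total = crtc_htotal;
    if (extensions_unlocked)
        total |= (uint16_t)(ext_overflow & 0x01) << 8; /* assumed bit position */
    return total;
}
```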
Author of the UniPCemu emulator.
UniPCemu Git repository
UniPCemu for Android, Windows, PSP, Vita and Switch on itch.io