VOGONS

Reply 260 of 758, by NewRisingSun

reenigne wrote:

If I'm understanding it correctly, the problem is this: There are some games where the art was originally made for EGA/Tandy/PCJr 160x200 16 colour mode (which has the standard RGBI colours) but which also support composite CGA (which has a completely different set of colours).

That may be robertmo's goal, not mine.

reenigne wrote:

If, for any given composite dot pattern, you look at all of the "original art" colours that map to that dot pattern across all games, you'll find that the colour in the middle of all those colours is the same as the average colour that that dot pattern produces over all the composite monitors used by the game designers, which in turn is going to be something pretty close to the ideal colour for that dot pattern that the NTSC standards demand.

I'm not sure I understand that statement. The average RGB color is the same as the average composite color is the same as the NTSC-ideal? If all composite monitors deviate from the NTSC standard in a similar way (which they do), at least the last part of that statement will turn out to be untrue. For example, composite color 8 is almost always used as brown, on CGA composite and Apple II double hi-res alike, yet looks more like a yellowish green with standard NTSC. The only exception I have seen is the PC version of BurgerTime, which indeed uses it as a darker shade of green for lettuce.

Conclusion: to at least have a chance of DOSBox replicating any real monitor or TV instead of just the hypothetical NTSC reference receiver, make the NTSC decoding matrix adjustable in dosbox.conf in polar coordinate form, the way NEStopia does, and be done with it. I could imagine something like this:

[composite]
angler=90
angleg=236
angleb=0
gainr=1.140
gaing=0.701
gainb=2.029

which would produce the NTSC reference receiver in this example. I think that's tolerable for developers and users alike.
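
For illustration, something along these lines could turn those settings into per-channel decoding. This is just a sketch, not DOSBox code: it assumes the NEStopia-style convention that angle 0 lies along the U (B-Y) axis and angle 90 along the V (R-Y) axis, and the structure and names are made up. With the example values it reduces to the familiar NTSC reference matrix: R = Y + 1.140*V, G = Y - 0.395*U - 0.581*V, B = Y + 2.029*U.

#include <cmath>

// Sketch only: apply polar-form [composite] settings, assuming angle 0 points
// along the U (B-Y) axis and angle 90 along the V (R-Y) axis.
struct CompositeConfig {
    double angler = 90.0,  gainr = 1.140;
    double angleg = 236.0, gaing = 0.701;
    double angleb = 0.0,   gainb = 2.029;   // defaults = NTSC reference receiver
};

// y, u, v are the demodulated luma and chroma components for one sample.
static double decodeChannel(double y, double u, double v, double gain, double angleDeg)
{
    const double a = angleDeg * 3.14159265358979 / 180.0;
    return y + gain * (u * std::cos(a) + v * std::sin(a));
}

// Usage:
//   r = decodeChannel(y, u, v, cfg.gainr, cfg.angler);
//   g = decodeChannel(y, u, v, cfg.gaing, cfg.angleg);
//   b = decodeChannel(y, u, v, cfg.gainb, cfg.angleb);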

And having thought some more about comb filters, the best implementation for DOSBox would, in my opinion, be to comb filter only when there is a detectable reason for it, because for CGA images the bad results of indiscriminate comb filtering outweigh the good. I think that is the method used by most Apple II emulators as well. A reasonable detection routine might be (CVBS meaning a composite signal):

// True when the sample matches the line two rows away but differs from the line in between.
bCombFilter = ((cvbs[row][col]   == cvbs[row+2][col] && cvbs[row][col]   != cvbs[row+1][col]) ||
               (cvbs[row+1][col] == cvbs[row-1][col] && cvbs[row+1][col] != cvbs[row][col])   ||
               (cvbs[row][col]   == cvbs[row-2][col] && cvbs[row][col]   != cvbs[row-1][col]));

Just off the top of my head - I haven't tried it yet.

Reply 261 of 758, by Great Hierophant


The Amdek Color-1 and the two TI Color Monitors should also be included in the list of "reference" composite monitors. The Amdek was very popular with Apple II owners who wanted high-quality color.

An AppleColor IIe Composite Monitor had a Black and White/Color switch, but the monochrome color is white, not green. All traces of the color burst are eradicated when the switch is set to Black and White.

http://nerdlypleasures.blogspot.com/ - Nerdly Pleasures - My Retro Gaming, Computing & Tech Blog

Reply 262 of 758, by reenigne

NewRisingSun wrote:

If all composite monitors deviate from the NTSC standard in a similar way (which they do), at least the last part of that statement will turn out to be untrue. For example, composite color 8 is almost always used as brown, on CGA composite and Apple II double hi-res alike, yet looks more like a yellowish green with standard NTSC. The only exception I have seen is the PC version of BurgerTime, which indeed uses it as a darker shade of green for lettuce.

I think that composite colour 8 is used as brown because it's the closest colour to brown in the set of composite colours, not because it actually looked brown on any monitors (at least without messing with the tint control).

There's also the fact that colour 6 on an RGBI monitor (brown) is translated by the CGA card to a colour with the same hue as the color burst and as composite colour 8 (i.e. slightly greenish yellow) on the composite output. Developers may have taken this to mean that composite colour 8 is supposed to look like RGBI colour 6, when in fact the reason is just the limited number of chroma phases the CGA's flip-flops can generate (hues at multiples of 45 degrees from the color burst).

It seems like you're suggesting that DOSBox should make the CGA composite colours look as much like the RGBI colours as possible (given some typical mapping from RGBI colours to CGA composite colours) but I'm not convinced there's any benefit in that. If DOSBox users want to play these games with the RGBI colours, they can use machine=ega - the point of CGA composite emulation is to show how these games would have looked on CGA composite.

NewRisingSun wrote:

Conclusion: to at least have a chance of DosBox replicating any real monitor or TV instead of just the hypothetical NTSC reference receiver, make the NTSC decoding matrix adjustable in dosbox.conf in polar coordinate form the way NEStopia does, and be done with it.

I like that solution too.

Reply 263 of 758, by NewRisingSun

reenigne wrote:

It seems like you're suggesting that DOSBox should make the CGA composite colours look as much like the RGBI colours as possible

Again, no. My reference is the Apple II's low resolution and double high resolution colors, which are identical to CGA's 640x200 mode from the point of view of NTSC artifacting. Here, too, color 8 is always used as brown, even called "brown" in most references, and often used as a darker shade of orange.

But none of this matters with an adjustable decoding matrix, so agreeing on this settles the "pretty colors" question as well.

Which leaves the following questions open:

  1. correct saturation on old versus new CGA with respect to the color burst as an amplitude reference;
  2. Should the CGA type be adjustable between old (1804472, 1501486) and new (1501981) type?
    Or isn't it that with EGA and PCjr coming out in 1984, all games using artifacting in 320x200 mode are almost certain to have been made with an old type card?

Reply 264 of 758, by nikiniki

NewRisingSun wrote:
Again, no. My reference is the Apple II's low resolution and double high resolution colors, which are identical to CGA's 640x200 […]

How about pressing Alt+F11 and Alt+F12 to adjust between old and new CGA composite, and removing the nonsense of getting different colours with Alt+F11 and Alt+F12 - unless you can select different colours on CGA composite monitors as well?

Reply 266 of 758, by Great Hierophant

nikiniki wrote:
NewRisingSun wrote:
Again, no. My reference is the Apple II's low resolution and double high resolution colors, which are identical to CGA's 640x200 […]

How about pressing Alt+F11 and Alt+F12 to adjust between old and new CGA composite, and removing the nonsense of getting different colours with Alt+F11 and Alt+F12 - unless you can select different colours on CGA composite monitors as well?

You're talking a lot about Composite CGA combined with NTSC TVs.
But what about PAL TVs? Did Composite CGA support these as well?

Alt+F11 and Alt+F12 adjust the hue; what they are talking about is changing the saturation, which differs between the old and newer cards. Both have their uses.

What about using something like Alt+F9 and Alt+F10 to select the saturation level? Few could afford the EGA when it was introduced, and only about 150K PCjrs were sold, so both modes were supported for years.

CGA composite color, as it exists on the cards discussed in this thread, does not work on PAL monitors. Even the Apple II, when exported to the European market, only supported monochrome and needed a separate card to show color graphics.

http://nerdlypleasures.blogspot.com/ - Nerdly Pleasures - My Retro Gaming, Computing & Tech Blog

Reply 267 of 758, by reenigne

NewRisingSun wrote:

For example, composite color 8 is almost always used as brown, on CGA composite and Apple II double hi-res alike, yet looks more like a yellowish green with standard NTSC.

I just checked on my TV-connected CGA card, and composite colours 8 and 13 are definitely a more reddish hue than chroma colours 6 and 14 (which are the greenish yellow expected for a chroma signal in phase with the color burst signal).

I think this is because of logic delays - each 74LSxx series gate introduces a delay of perhaps 10ns, which is a small but noticeable hue shift. Because there are a different number of gates on path A (master clock to pixel clock) than on path B (master clock to color burst/chroma yellow flip-flop), the two signals end up with different phases. Perhaps the Apple II had (coincidentally) a similar phase difference.

So perhaps we can have all three of prettiness, accuracy and ease-of-implementation after all (and without kludging the YIQ decoding matrix) by properly emulating the phase difference between the pixel clock and the chroma/burst signals. I'll try to calibrate my NTSC decoder against my CGA card (something I've been meaning to do anyway) and see if I can come up with a good set of phases to use.

Reply 268 of 758, by VileR

NewRisingSun wrote:

2. Should the CGA type be adjustable between old (1804472, 1501486) and new (1501981) type?
Or isn't it that with EGA and PCjr coming out in 1984, all games using artifacting in 320x200 mode are almost certain to have been made with an old type card?

I think this should be adjustable, probably as another config file option - games with composite CGA support continued coming out well after that (e.g. Dragon Wars was a 1990 release that did artifacting in 320x200). There's no knowing what developers used, especially if there was an overlap in CGA model availability.


Reply 269 of 758, by reenigne

VileRancour wrote:

I think this should be adjustable, probably as another config file option - games with composite CGA support continued coming out well after that (e.g. Dragon Wars was a 1990 release that did artifacting in 320x200). There's no knowing what developers used, especially if there was an overlap in CGA model availability.

Should it be what the developers used or what the users used that counts?

Either way, I agree with making it a config option.

We don't know for sure what the overlap of card availability was or when the changeover happened, but I think there are roughly equal numbers of old CGA cards out there as new ones now, given that there are 791 results on google.com for CGA 1501981 and 740 for CGA 1501486.

I'm thinking about making some modifications to my 1501486, building a little daughterboard that adds 1501981 composite output so that I can calibrate emulation software against both. The differences are fairly simple - the daughterboard would consist of 3 ICs, 10 resistors, a transistor and a couple of decoupling capacitors. A modification to add a 1501486 output to a 1501981 would be extremely similar.

Reply 270 of 758, by VileR

reenigne wrote:

Should it be what the developers used or what the users used that counts?

Either way, I agree with making it a config option.

True - users may simply want to reproduce what they had "back then", and having the option available is another preventative measure against "hey, the colors are all wrong".

reenigne wrote:

I'm thinking about making some modifications to my 1501486, building a little daughterboard that adds 1501981 composite output so that I can calibrate emulation software against both. The differences are fairly simple - the daughterboard would consist of 3 ICs, 10 resistors, a transistor and a couple of decoupling capacitors. A modification to add a 1501486 output to a 1501981 would be extremely similar.

Interesting - I'd be curious to hear the results of that.

By the way, despite what I wrote earlier, I was able to refactor your code for DOSBox's current precalculation approach (forcing real Y values for the 16 artifact colors), and the results look a lot better than I expected... edge blending is indeed inaccurate, and it gets real bad in certain special cases, though acceptable in most others.
Will post the patch and some screenshots later.


Reply 272 of 758, by NewRisingSun

reenigne wrote:

path A (master clock to pixel clock) than there are on path B (master clock to color burst/chroma yellow flip-flop)

Which reminds me: the CGA Wikipedia article badly needs updating in this regard. At least the 1501981 (I haven't checked the earlier schematics) bases both the pixel clock and the color burst on the OSC signal (pin B30 in IO PORT becoming +14 MHz on sheets 3, 4 and 5, with S174 U4/U5 dividing this to produce the 7 MHz, 3.58 MHz and the other clock signals). The CLK signal (pin B20 in IO PORT) seems to only clock data input/output to the 6845. Because the pixel clock and the color burst are derived from the same source, the color adjust trimpot on the host system's main board could not possibly cause a phase difference. This was already found experimentally, but should be explained from the schematic as well. Has it been confirmed what the trimmer actually does?

reenigne wrote:

I think this is because of logic delays - each 74LSxx series gate introduces a delay of perhaps 10ns, which is a small but noticeable hue shift.

It would be interesting to trace the signal path on a Tandy 1000 to find out why the artifact color hues are so off.

reenigne wrote:

I'll try to calibrate my NTSC decoder against my CGA card (something I've been meaning to do anyway)

Rather than doing it experimentally, I would rather have it done theoretically, by tracing the signal path of the "14 MHz" and "3.58 MHz" signals and calculating the differential delay from that. Preferably by someone who is better at reading schematics than I am. 😀 I can follow a signal path quite well, and even calculate coefficients from a resistor network, but those combinations of logic gates and shift registers give me a headache. I still can't figure out what the composite output is when the 6845 outputs HSYNC and VSYNC, respectively. 😀

Reply 273 of 758, by reenigne

NewRisingSun wrote:

Which reminds me: the CGA wikipedia article painfully needs updating in this regard. At least the 1501981 (haven't checked the earlier schematics) bases both the pixel clock and the color burst on the OSC signal (pin B30 in IO PORT becoming +14 MHz on sheets 3, 4 and 5, with S174 U4/U5 dividing this to produce the 7 MHz, 3.58 MHz and the other clock signals).

The other versions of the CGA are the same in this regard - the only differences involve the horizontal sync pulse and the composite output.

NewRisingSun wrote:

The CLK signal (pin B20 in IO PORT) seems to only clock data input/output to the 6845. Because pixel clock and color burst are derived from the same source, the color adjust trimpot on the host system's main board could not possibly cause a phase difference. This was already experimentally found, but should be explained from the schematic as well. Has it been confirmed what the trimmer actually does?

It fine-tunes the oscillator frequency: see http://www.reenigne.org/blog/effect-of-the-co … able-capacitor/ and http://www.reenigne.org/blog/beats-of-two-colour-carriers/.

NewRisingSun wrote:

It would be interesting to trace the signal path on a Tandy 1000 to find out why the artifact color hues are so off.

The Tandy 1000 uses a ULA rather than discrete logic, so it's a complete redesign of the circuit. They probably just picked an arbitrary phase for the color burst and then made the other chroma phases correct relative to that without realizing that the phase relative to the leftmost pixel was also important.

NewRisingSun wrote:

Rather than doing it experimentally, I would rather have it done theoretically by tracing the signal path of the "14 MHz" and "3.58 MHz" signals and calculating the differential delay from that.

As a crude first approximation, I make it that the pixel clock is delayed by about 30ns from the color burst. The yellow burst goes through a 74LS74 and the data part of a 74LS151. The pixel clock goes through a 74LS04, a 74LS174 and the select part of a 74LS151, roughly 1 extra flip-flop which is about 30ns or 40 degrees of hue shift. That seems to roughly match what I see (it's a bit high if anything I think).

There are lots of approximations going into this number though:

  • Most of the propagation delays in the datasheets are only shown as maximums, not typical values.
  • The propagation delays depend on the capacitances of the output loads (which I don't know and have no way of measuring).
  • Many of the ICs have different propagation times for high-to-low transitions than for low-to-high transitions (which will affect both phases and duty cycles).
  • This only applies to the R, G and B bits from the pixel clock. The I bit doesn't go through the multiplexer, so its phase is about the same as that of the color burst.

So the error bars on that phase shift are maybe +/-10ns. I don't think we can get any more precise than that without sophisticated modelling software, which I don't have and which would embody just as many assumptions as measuring the hue shift on a real device would.

Through the same method, I make it that the magenta and green chroma colours are delayed by about 10ns from what they would otherwise be (the flip-flop generating these signals is clocked from the -14MHz signal rather than the +14MHz, which means it goes through an extra NOT gate).
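
For reference, the conversion behind those figures is just the delay expressed as a fraction of one 3.579545 MHz colour carrier cycle - a quick sketch of the arithmetic:

#include <cstdio>

int main()
{
    const double fsc = 3.579545e6;                  // NTSC colour carrier (Hz)
    auto hueShiftDeg = [&](double delay_ns) {
        return delay_ns * 1e-9 * fsc * 360.0;       // fraction of a cycle, in degrees
    };
    std::printf("30 ns -> %.1f degrees\n", hueShiftDeg(30.0)); // ~38.7, i.e. "about 40 degrees"
    std::printf("10 ns -> %.1f degrees\n", hueShiftDeg(10.0)); // ~12.9, one extra NOT gate
    return 0;
}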

NewRisingSun wrote:

I still can't figure out what the composite output is when the 6845 outputs HSYNC and VSYNC, respectively. 😀

Here's what happens on the 1501486: +HSYNC and +VSYNC from the 6845 are rounded up to the next HCLK by U21 to give +HSYNC DLY and +VSYNC DLY. When +HSYNC DLY goes high it enables the shift register U64 which causes the pins QA, QB, QC, QD, QE, QF, QG, QH and U43 pin 5 to be raised one at a time (one every lchar, or 1.12us). QB and QF are XORed to make the output HSYNC pulse at pin 8 of U42 giving an HSYNC pulse one (or maybe 2, not sure) lchars after +HSYNC OUT, with a width of 4 lchars. QG and U43/5 similarly give the burst pulse at pin 11 of U14 for a period of 2 lchars.

Meanwhile, the +VSYNC DLY pulse enables U63 which is clocked on +HSYNC OUT giving a +VSYNC OUT three scanlines high at pin 8 of LS08.

+HSYNC OUT and +VSYNC OUT are XORed together and inverted to give a -SYNC pulse which is combined into the composite output. -BLANK (+VSYNC DLY NOR +HSYNC DLY) gives a blanking period of 10 lchars around the hsync and burst pulses and 16 scanlines high (the length of the 6845's VSYNC pulse) around the vsync pulse, which is also combined into the composite output.

There's a diagram of all this at http://www.reenigne.org/misc/cga_timings.gif - red is the blanking period, dark grey is the sync, yellow is the color burst.

The 1501981 is similar, except there's no -BLANK signal and +HSYNC OUT is generated differently - a couple of the shift register outputs are combined and fed back into the shift register input which has the effect of causing -SYNC to be lowered during the color burst pulse as well. This might also make the VSYNC pulse 1 or 2 scanlines high instead of 3 (possibly a bug introduced by this fix).
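
To keep those figures in one place, here is a rough summary as constants an emulator might start from - purely a restatement of the 1501486 description above (with its stated uncertainties), not measured values:

struct CgaCompositeSyncTiming {
    // 1 lchar = one low-resolution character period, about 1.12 us.
    double lchar_us      = 1.12;
    int    hsync_delay   = 1;   // lchars from +HSYNC OUT to the composite hsync pulse (1, maybe 2)
    int    hsync_width   = 4;   // lchars
    int    burst_width   = 2;   // lchars
    int    hblank_width  = 10;  // lchars of blanking around the hsync and burst pulses
    int    vsync_height  = 3;   // scanlines (possibly 1 or 2 on the 1501981)
    int    vblank_height = 16;  // scanlines, the length of the 6845's VSYNC pulse
};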

Reply 274 of 758, by VileR


Here's the above-mentioned DOSBox patch. I also added the toggle-key handling from ripsaw8080's patch a few pages back (F12 to toggle composite output on/off/auto).

CGA type ("old" or "new") is adjustable, but not from the config file - haven't had the time to figure out how to do that properly. The color decoding/encoding is exactly the same as reenigne's code, but here I use it to derive the values for the 16 continuous artifact colors, then precalculate the rest of the palette based on that. So this works for all graphics modes and all possible color combinations.
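
For anyone following along, the general idea behind deriving those 16 continuous artifact colours can be sketched as below. This is not the patch code - the bit-to-hdot ordering, the burst phase and the decoding matrix are placeholder assumptions - but it shows the principle: a repeating 4-hdot pattern spans exactly one colour carrier cycle, so its average gives the luma and its fundamental gives the chroma.

#include <cmath>

struct Rgb { double r, g, b; };

// Sketch only: compute the solid colour produced by a repeating 4-bit hdot
// pattern. burstDeg (burst phase relative to hdot 0) and the decoding
// gains/angles are assumptions; a real implementation would calibrate them,
// and would also clamp and gamma-correct the result.
static Rgb artifactColour(int pattern /* 0..15 */, double burstDeg)
{
    const double pi = 3.14159265358979;
    double y = 0.0, u = 0.0, v = 0.0;
    for (int i = 0; i < 4; i++) {
        double s  = (pattern >> i) & 1;                 // hdot on or off
        double ph = 2.0 * pi * i / 4.0 - burstDeg * pi / 180.0;
        y += s / 4.0;                                   // luma: average level
        u += s * std::cos(ph) / 2.0;                    // chroma, U axis
        v += s * std::sin(ph) / 2.0;                    // chroma, V axis
    }
    auto chan = [&](double gain, double angleDeg) {     // same polar decode as the
        double a = angleDeg * pi / 180.0;               // [composite] example earlier
        return y + gain * (u * std::cos(a) + v * std::sin(a));
    };
    return { chan(1.140, 90.0), chan(0.701, 236.0), chan(2.029, 0.0) };
}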

However, the current line-drawing routine uses 80 colors, and this isn't enough to accurately represent every possible pixel color in all modes. This limitation forces us to make a trade-off - here I chose to ensure accurate luminance for the 16 principal artifact colors, but the other 64 (for averaging and edge blending) can't always be accurate.

mode 4 games appear to work fine (left = new CGA, right = old CGA):

oe6oT.png 80JSz.png

mode 6 games work as well (no difference between old and new CGA). Note that the colors are different from current DOSBox results, as the 15-degree hue-offset hack is not present here:

PjdrF.png NBsT0.png

background color changes work as expected, and mode 5 games appear in greyscale (new CGA).
In the Jungle Hunt shot, note that the linewise color changes don't work (top text should be white): that would add yet more colors to the palette which we don't have room for.

wpAsP.png HLJPq.png

and here's where it plainly isn't working fine. In MS Decathlon, the bottom half of the display selection screen should be readable using the old CGA code - here it's worse than the top half... the dark blue text in the right screenshot also suffers:
[EDIT - corrected left side image]

uTpvd.png SkTkd.png

I could probably try to fine-tune the precalculation routine some more, and fix up some of the bad fringing; but I doubt that this would actually solve the unacceptable cases, since we just don't have enough palette entries. So, aside from the other issues of color correction and configurable options, the questions are:

  1. Can we do better with 128 colors, while keeping the line-drawing stuff (in vga_draw.cpp) optimized?
  2. If we need more than that, what's the reason for not using the entire palette? Do we really need to reserve the 16 RGB CGA colors at 0x00-0x0f while in composite mode?

Attachments

  • composite_testbuilds.zip (1.54 MiB, 162 downloads) - win32 test builds - may or may not work for you (Visual Studio....) - license: Fair use/fair dealing exception
  • composite_update.diff (5.84 KiB, 141 downloads) - Patch against current SVN - license: Fair use/fair dealing exception


Reply 275 of 758, by reenigne

VileRancour wrote:

mode 6 games work as well (no difference between old and new CGA when color register is set to white):

This isn't anything to do with composite decoding, but I think the text at the top of the screen in Bruce Lee is wrong - it's twice as wide in the screenshot at http://www.mobygames.com/game/pc-booter/bruce … eShotId,141512/.

I bet I know what's going on - on a real machine the BIOS thinks it is in mode 4 but the game writes 0x1a to port 0x3d8 so the hardware is really in 640x200 mode. The game is using the BIOS to draw text, so it draws it as if it is in mode 4 (2-byte wide characters). Both 320x200x4 and 640x200x2 modes have 80 bytes per line, so this works fine (colours 1 and 2 translate to the two identical shades of grey).

DOSBox must not be keeping separate track of the BIOS video mode and the hardware video mode, so when Bruce Lee calls the BIOS to draw the text, DOSBox draws it as if it is in mode 6 (1-byte wide characters).

VileRancour wrote:

If we need more than that, what's the reason for not using the entire palette? do we really need to reserve the 16 RGB cga colors at 0x00-0x0f while in composite mode?

Without looking at the code, my guess is that it's so that those palette entries don't need to be reprogrammed when switching out of composite mode (which is not a problem with the VGA's DAC since that is reprogrammed when switching modes anyway). So I bet we can use 240 palette entries with no other modifications, and 256 with a bit of extra code to reload the RGBI palette. One easy way to find out would be to try overwriting them and see what happens.

Would you like me to have a go at writing my 160 palette entry algorithm?

Reply 276 of 758, by robertmo

VileRancour wrote:

and here's where it plainly isn't working fine. In MS Decathlon, the bottom half of the display selection screen should be readable using the old CGA code - here it's worse than the top half... the dark blue text in the right screenshot also suffers:

My cgaold burger looks the same.
My cganew jhunt looks the same.
My cgaold decathlon looks the same.
But my cganew decathlon looks different from yours, and more clear. (My game is the .com version, not the booter, but it looks the same as your second (right) decathlon picture on DOSBox old CGA.) (BTW, how does this screen actually look on a real computer? There's no screenshot on MobyGames, and I have two CGA composite clone cards and it isn't clear on them either, but they are just clones...)
Also, the left decathlon picture is way darker than your three other left pictures, so I guess they may also look different from the DOSBox you enclosed. Well, jhunt can be bright too, if the white stripe that affects the whole screen appears at the top, but that is a different story 😉

Attachments

  • decanew.PNG (19.92 KiB, 1662 views) - license: Fair use/fair dealing exception

Reply 277 of 758, by VileR

reenigne wrote:

I bet I know what's going on - on a real machine the BIOS thinks it is in mode 4 but the game writes 0x1a to port 0x3d8 so the hardware is really in 640x200 mode. The game is using the BIOS to draw text, so it draws it as if it is in mode 4 (2-byte wide characters). Both 320x200x4 and 640x200x2 modes have 80 bytes per line, so this works fine (colours 1 and 2 translate to the two identical shades of grey).

DOSBox must not be keeping separate track of the BIOS video mode and the hardware video mode, so when Bruce Lee calls the BIOS to draw the text, DOSBox draws it as if it is in mode 6 (1-byte wide characters).

There was an old thread where I noted this same weirdness with Bruce Lee. The text is also one row / 8 pixels higher up than in the MG image, but it's worth mentioning that I'm using a DOS conversion that was modified in various ways from the original booter - evidently not what Servo was running.

I'm not sure if you're right about the cause though. Something I've noticed before, when poking around with GWBASIC in DOSBox: when I do the same thing and enter mode 4 ("SCREEN 1"), then do an "OUT &H3D8,&H1A", I get 640x200, but subsequent text output still produces double width (40-column) characters. So DOSBox might not be at fault here, unless there's some other esoteric control value that affects this... might be interesting to see what this specific version does on a real CGA.

reenigne wrote:

Without looking at the code, my guess is that it's so that those palette entries don't need to be reprogrammed when switching out of composite mode (which is not a problem with the VGA's DAC since that is reprogrammed when switching modes anyway). So I bet we can use 240 palette entries with no other modifications, and 256 with a bit of extra code to reload the RGBI palette. One easy way to find out would be to try overwriting them and see what happens.

Would you like me to have a go at writing my 160 palette entry algorithm?

Sure, if you feel like giving it a try - if it could really produce accurate results in all modes, it'll surely be better than this. Since the line-drawing routine would have to be modified, keeping it fast/optimized should probably be a concern (something you've already considered with your earlier 64-color idea; hopefully 160 would still allow for that).


Reply 278 of 758, by VileR

robertmo wrote:

But my cganew decathlon looks different from yours, and more clear. (My game is the .com version, not the booter, but it looks the same as your second (right) decathlon picture on DOSBox old CGA.) (BTW, how does this screen actually look on a real computer? There's no screenshot on MobyGames, and I have two CGA composite clone cards and it isn't clear on them either, but they are just clones...)

Ah, not all left side pictures are necessarily newCGA - I simply wanted to show a couple of examples of what happens in each case. For Decathlon I used oldCGA for both screenshots, since the game was evidently designed for use with older cards.

it's good that you made me check it though, since I ran the CGA input through the cga2ntsc converter, and realized that the oldCGA image shouldn't look like it does with my patch... [EDIT - found the reason, see next post]

for reference this is what the screen should look like on a real composite display (more or less), using cga2ntsc with both card types. Clearly the bottom text should be more readable on the older cards.

Attachments

  • deca_new.png (10.62 KiB, 1639 views) - Decathlon display select screen - "new" CGA - license: Fair use/fair dealing exception
  • deca_old.png (10.72 KiB, 1639 views) - Decathlon display select screen - "old" CGA - license: Fair use/fair dealing exception


Reply 279 of 758, by VileR


OK, I found the reason why my first Decathlon image appeared wrong... problem wasn't with my build, it's just that:

robertmo wrote:

(My game is the .com version, not the booter, but it looks the same as your second (right) decathlon picture on DOSBox old CGA.)

In the booter version, for some reason, the display selection screen sets the CGA palette to high intensity - that's why everything looked lighter. The .EXE conversion uses the same palette but in low intensity. I've updated the left side screenshot in my above post - this time it's from the .EXE (and again with old cga).

As you can see it looks much closer to cga2ntsc's output, but the text is still not clear enough.
