VOGONS


ATI Graphics Solution


First post, by Deunan

Rank: Oldbie

There is some conflicting information on the net regarding the ATI Graphics Solution rev 3 - the earliest known ATI card to be sold, based on the CW16800-A chipset. Specifically, whether it supports only CGA (in addition to MDA/Hercules) or whether it also emulates the Plantronics ColorPlus.

So I've decided to test it. There isn't exactly a lot of software written for the Plantronics ColorPlus, but the ATI card comes with several drivers and test programs on a floppy. I didn't get the floppy, but I found an archive with the files - so I can't be sure it's the correct software for this particular model (it could be for one of the later Graphics Solution cards). It seems to work, though, so the assumption here is that these cards are all pretty much hardware-compatible.

The card seems to properly render the 320x200 and 640x200 16-color tests:

[Attachment: 320x200x16.jpg]
[Attachment: 640x200x16.jpg]

And it also supports an 80x25 character mode in 16 colors, although the font box is still the CGA 8x8:

[Attachment: BIOS 80x25.jpg]
[Attachment: LM6 80x25.jpg]
[Attachment: NC4 80x25.jpg]

Sorry for the photo quality - but then again the picture itself is a bit fuzzy, and the noise is to be expected. It took unshielded wires, a voltage converter, more wires, an FPGA board, a cheap VGA cable, an OSSC and finally an HDMI cable to a second-hand LCD monitor to actually get CGA video onto a modern display.

Anyway, the answer seems to be "All ATI GS cards are Plantronics CGA compatible".

Reply 1 of 127, by blurks

Rank: Oldbie

Interesting, thanks for testing and sharing. I have an ATI Small Wonder Graphics Solution rev 2, boxed with complete accessories. It has been sitting on my shelf waiting for proper testing forever, but I've yet to come up with an inexpensive and convenient solution to convert 9-pin CGA to 15-pin VGA.

Reply 2 of 127, by Deunan

Rank: Oldbie
blurks wrote:

I've yet to come up with an inexpensive and convenient solution to convert 9-pin CGA to 15-pin VGA.

Welcome to the club. I think I'm going to create my own FPGA scaler for this purpose, since the OSSC doesn't like the "weird" CGA resolution of 640x200. It keeps sampling in 262p mode, and that's the major source of the horizontal pixel jitter. CGA2VGA is actually a pretty fun project and not too complicated. I might even use an HDMI encoder rather than VGA output. Directly digital to digital - well, almost, since without a pixel clock there is always going to be some sampling noise. I could maybe 4x over-sample and use the mean value, or something like that. This approach would also work for most EGA modes.
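A minimal sketch of the 4x over-sample-and-average idea in Python (the numbers and function name are made up for illustration; a real design would do this per colour channel in FPGA logic):

```python
def downsample_4x(samples):
    """Average each group of 4 oversampled values into one output pixel."""
    assert len(samples) % 4 == 0
    return [sum(samples[i:i + 4]) // 4 for i in range(0, len(samples), 4)]

# Two pixels, four noisy samples each - averaging pulls each group
# back toward the true level and suppresses the sampling noise:
line = [14, 15, 16, 15, 0, 1, 0, 1]
print(downsample_4x(line))  # -> [15, 0]
```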

Nothing I came up with is groundbreaking; there are other FPGA projects out there, from simple ones like https://www.raphnet.net/electronique/cga2vga/index_en.php to more polished ones like https://sites.google.com/site/tandycocoloco/m … -cga-ega-to-vga (this one is sold as MCE2VGA).
There's also no shortage of China-made RGBI-to-VGA upscalers for arcade PCBs, but these produce images of varying quality and don't properly process the CGA brown color. Plus they don't support MDA/Hercules either, so if I were to buy something off-the-shelf for a PC I'd rather get the MCE2VGA.

Reply 3 of 127, by keropi

Rank: l33t++

This? Re: MDA/CGA/EGA/VGA to HDMI using Raspberry Pi Zero
Where are u located deunan?

🎵 🎧 PCMIDI MPU , OrpheusII , Action Rewind , Megacard and 🎶GoldLib soundcard website

Reply 4 of 127, by Caluser2000

Rank: l33t

Not too bad a price for a complete solution https://monotech.fwscart.com/EternalCRT_-_MDA … 4_19478791.aspx

There's a glitch in the matrix.
A founding member of the 286 appreciation society.
Apparently 32-bit is dead and nobody likes P4s.
Of course, as always, I'm open to correction...😉

Reply 5 of 127, by VileR

Rank: l33t

There are quite a few professional reviews (as well as ATI advertisements) which mention that the card supports the Plantronics ColorPlus modes. In theory, just running some software and getting 16 colors at 320/640x200 doesn't necessarily indicate that these modes are Plantronics-compatible, because there were other display cards that offered similar modes early on -- e.g. MicroGraphics Technology Master Graphics, Tecmar Graphics Master, and I think something from Paradise too (leaving aside the integrated ones, like later Tandy 1000s or certain Amstrads). Highly likely that they all differed in how the modes were accessed and programmed, at least a little.

In fact I wonder how the Plantronics card came to be remembered even a little better than those others, and why Plantronics compatibility was ever a selling point at all. As you say, it's a struggle to find any software that supported it in the first place. 😉

Deunan wrote:

And it also supports an 80x25 character mode in 16 colors, although the font box is still the CGA 8x8

Plain ol' CGA does that too.

[ WEB ] - [ BLOG ] - [ TUBE ] - [ CODE ]

Reply 6 of 127, by Scali

Rank: l33t
VileRancour wrote:

In fact I wonder how the Plantronics card came to be remembered even a little better than those others, and why Plantronics compatibility was ever a selling point at all. As you say, it's a struggle to find any software that supported it in the first place. 😉

I suppose it's a combination of factors.
If we start from the angle of Hercules... I think it's safe to say that Hercules mostly became a standard because *every* clone chipset out there was Hercules-compatible. I don't think there were all that many real Hercules cards around.
I suppose Hercules was 'low-hanging fruit' for these clone builders, because Hercules itself was just a slightly modified MDA card. The main difference was that Hercules came a few years later, and it was economically viable to put 64k on the card instead of the 4k that the original MDA had. So if you're going to clone anyway, why stop at MDA when you can do Hercules?

Continuing the low-hanging fruit story... It seems that it was also common for clone cards to support both MDA/Hercules and CGA. Probably because OEMs wanted a colour solution, but since colour monitors were still relatively expensive, a budget MDA/Hercules option was also required. A single card (or onboard chipset) that could do both would be an easy win for IHVs.

Now, since you already had the 64k on there for the Hercules mode, it was again low-hanging fruit to extend CGA to support 16-colour modes as well.
EGA wasn't an option: although 64k EGA configurations exist, you really need 256k in practice. Besides, EGA contains very advanced logic, which would not fit the budget market.
Plantronics was probably chosen because it was a quick-and-dirty hack similar to Hercules: just a bit of TTL logic does the job. Perhaps also because its dual-bitplane layout was quite flexible, allowing you to create overlays and such.
PCjr/Tandy was probably not an option because you couldn't make it compatible with existing software anyway. Firstly, as we've also seen with the Tandy sound card, most games simply do a BIOS check. Secondly, even if you would patch out the BIOS check, you still wouldn't be able to be fully compatible with PCjr/Tandy, because they have a shared memory architecture.
I don't know enough about the other 16-colour CGA-like solutions, but I doubt any of them would be simpler than Plantronics. So I'd say that's why Plantronics was chosen by clone builders (that, and perhaps some manufacturers guarded their IP better than Plantronics and Hercules did 😀).

My personal experience with Hercules is that I never saw a real card until I bought one on Ebay a few years ago. The dozens of Hercules machines I've seen up to then, were all clones, mostly ATi GS/SW or Paradise PVC.
I have yet to see a real Plantronics card at all. I think they're an order of magnitude more rare than real Hercules cards.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 7 of 127, by Deunan

Rank: Oldbie
VileRancour wrote:

In theory, just running some software and getting 16 colors at 320/640x200 doesn't necessarily indicate that these modes are Plantronics-compatible

That's why I wrote "seems to" rather than "definitely". Even if I test it with that new game I can't be 100% sure, since my understanding is the code was developed on a clone rather than an original Plantronics CGA. Chances are a clone is easier to work with - like, say, you could write code that would work on a "modern" MDA/Hercules clone with 64k RAM but not on the original MDA, which had less memory.

VileRancour wrote:

Plain ol' CGA does that too.

For some reason I was under the impression that CGA could only do 40x25 in color, and that 80x25 was shades of gray. My bad then.

BTW, that Raspberry Pi project is interesting but seems to be tied to specific hardware. And it's not a full-frame converter, so while it does offer better latency in games, it can't convert 50Hz MDA/Hercules modes to modern VGA timings.

Reply 8 of 127, by Scali

Rank: l33t
Deunan wrote:

For some reason I was under the impression that CGA could only do 40x25 in color, and that 80x25 was shades of gray. My bad then.

Well, 80x25 is not in colour on a composite display. Perhaps that is what you remembered.
The reason for this is that 80x25 mode was hacked in by IBM. It uses a special high-bandwidth mode, because 80x25 requires 2 bytes per character (ASCII value and attribute), which totals 160 bytes per scanline. All other modes are 80 bytes per scanline.
Because of this high-bandwidth mode, IBM had to double the clock on some parts of the circuit. In doing this, they messed up the position of the NTSC colorburst in the signal. As a result, colorburst is not enabled in 80x25 mode.
You can manually toggle the bit to enable it, but the result is that most displays will lose horizontal sync, because of the messed up signals in the hblank interval.
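The byte counts above can be checked with quick arithmetic - a throwaway Python sketch of the numbers, nothing more:

```python
# Quick arithmetic check of the per-scanline byte counts described above.
BYTES_PER_CHAR = 2  # ASCII value + attribute byte

bytes_per_scanline_80col = 80 * BYTES_PER_CHAR  # the hacked-in 80x25 mode
bytes_per_scanline_other = 40 * BYTES_PER_CHAR  # 40-column text and graphics modes

print(bytes_per_scanline_80col, bytes_per_scanline_other)  # -> 160 80
```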

The ATi GS/SW and Paradise PVC4 simply do not allow you to enable colorburst even when you try to manually toggle the bit in 80x25 mode (probably a safeguard... shame they didn't bother to fix the circuit so that 80x25 mode would just work with colorburst enabled).
On real IBM CGA, someone figured out that if you picked a certain background colour, most NTSC circuits would wrongly interpret this as part of the hsync signaling, and sync can be restored.
This opened up the trickery for 8088 MPH's 1024 colour mode. Which is why we added a calibration screen. It allows people to fine-tune some low-level parameters of the CGA circuit, to try and get the display in sync, and get the colours fine-tuned.

RGBI doesn't suffer from this problem, because it has separate sync signals, and therefore 80x25 mode will just work in 16 colours on an RGBI monitor.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 9 of 127, by VileR

Rank: l33t

There's still the 80x25 snow, however, which is another symptom of the double bandwidth being kind of a hack. But that's something most of the clones did fix. 😉

I do believe that if the IBM PC had been more successful in the home/games market early on, then the Plantronics card (or one of those other work-alikes) would've caught on much better. Hercules made it because of the "business graphics" market, i.e. Lotus 1-2-3 and similar stuff where higher resolution was far more important than extra colors. But that didn't happen with the "super-CGA"s, so it's weird to me that Plantronics still became a semi-standard of sorts for a while.

True, I highly suspect that the other early "super-CGA"s are functionally very similar to the ColorPlus, at least in how they handle the 16-color modes internally. But the programming interface must've been a little bit different, for instance in the extended non-CRTC registers, or the dance you have to do to enable the extra modes (since there was no BIOS standard for them).

Deunan wrote:

Even if I test it on that new game I can't be 100% sure since my understanding is the code was developed on a clone rather than original Plantronics CGA.

Heh, there's a new game out with Plantronics support? (Is that Planet X3?)

[ WEB ] - [ BLOG ] - [ TUBE ] - [ CODE ]

Reply 10 of 127, by Scali

Rank: l33t
VileRancour wrote:

I do believe that if the IBM PC had been more successful in the home/games market early on, then the Plantronics card (or one of those other work-alikes) would've caught on much better.

Yea, initially I wanted to say that the 16-colour modes wouldn't be too interesting for games, because they're basically twice as slow as the 4-colour mode, which didn't exactly break any records in itself.
But then I would have to "well actually" myself, because PCjr/Tandy are even slower because of the shared memory, and they do use 16-colour modes (although I believe it's often 160x200 rather than 320x200).
I suppose for many games, the speed isn't very relevant, such as graphics adventures or strategy games.

VileRancour wrote:

Hercules made it because of the "business graphics" market, i.e. Lotus 1-2-3 and similar stuff where higher resolution was far more important than extra colors. But that didn't happen with the "super-CGA"s, so it's weird to me that Plantronics still became a semi-standard of sorts for a while.

It did though. I had Plantronics drivers for WordPerfect, Lotus 1-2-3 and some others (my first PC was a Commodore PC10-III with onboard PVC4, so I had Plantronics, and it also came with extended text modes like 132x25... The joys of analog CRTs: horizontal resolution is technically 'infinite', so any standard CGA monitor can do that. My Amiga 600 did a similar thing, offering a 1280x512 video mode on standard PAL displays).

VileRancour wrote:

True, I highly suspect that the other early "super-CGA"s are functionally very similar to the ColorPlus, at least in how they handle the 16-color modes internally. But the programming interface must've been a little bit different, for instance in the extended non-CRTC registers, or the dance you have to do to enable the extra modes (since there was no BIOS standard for them).

Yes, well I meant specifically that they're all probably just a 'dumb framebuffer'. No hardware sprites, linedrawing, blitting or other fancy stuff. EGA is relatively fancy (and was very expensive in the early days, even clones were very expensive... same with VGA).

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 11 of 127, by Deunan

Rank: Oldbie
Scali wrote:

PCjr/Tandy are even slower because of the shared memory, and they do use 16-colour modes (although I believe it's often 160x200 rather than 320x200).

I've read somewhere that it's possible to hack the text mode of any CGA to get 160x200 in 16 colors? Or was that 160x100, I can't remember now.

Anyway, the ATI GS can also do 640x200 in 16 colors, so obviously the extra RAM allows for more modes than the Plantronics could do. I'm going to try to code working color output for my silly fractal benchmark - but I simply don't have the time right now. Once it's ready, though, perhaps I can find someone with an actual Plantronics CGA to test it.

Scali wrote:

I had Plantronics drivers for WordPerfect, Lotus 1-2-3 and some others

ATI also comes with drivers for Lotus Symphony and 1-2-3 (132-char mode), Ashton-Tate Framework and AutoCAD (640x200 in 16 colors). My guess would be the card got "popular" not because of games but because of early pro users who demanded better color graphics. Much like with early co-processors, that's a very specific group of people who can pay the price of this extra hardware.

I have to say I can see the appeal. Sure, by current standards 640x200 is a weird mode due to the pixel ratio, and 16 fixed colors (it should really be called 8+8, as you can't mix low- and high-intensity RGB signals on CGA) is nothing to write home about - but back in the early '80s? The 8x8 font box is poor but passable, and a 16-color text mode is enough for many programs. But the CGA "graphics" modes are... abysmal. How IBM concluded that would be acceptable, when many 8-bit machines could do way better, is beyond me.

Reply 12 of 127, by Scali

Rank: l33t
Deunan wrote:

I've read somewhere that it's possible to hack the text mode of any CGA to get 160x200 in 16 colors? Or was that 160x100, I can't remember now.

Yes, 160x100 is possible. It's not very efficient/convenient however, because of the attribute/char pairs in memory. You set all chars to a specific character (a blocked character that is half foreground, half background colour), and then set the character height to just 2 scanlines, to get 100 rows. Then you can use the foreground and background fields of the attribute to set the colours of these characters.
Theoretically 160x200 would be possible, but because you need 2 bytes per character, you need 160 bytes per scanline, so 100 scanlines will max out the 16k of memory on CGA.
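The tweak described above can be modelled in a few lines - a Python sketch of the char/attribute layout (a bytearray stands in for the B800:0000 text buffer, no real hardware access; the glyph choice and helper name are mine):

```python
# Python model of the 160x100 tweak: each 80x25 cell becomes a 2x... er,
# a 2-scanline-tall cell whose two halves are colored independently.
COLS, ROWS = 80, 100
HALF_BLOCK = 0xDE  # right-half-block glyph: left half shows background,
                   # right half shows foreground

vram = bytearray(COLS * ROWS * 2)  # char byte + attribute byte per cell
assert len(vram) <= 16 * 1024      # fits CGA's 16k, as noted above

def set_pixel(x, y, color):
    """Plot one of the 160x100 'pixels' by editing an attribute nibble.

    Assumes blink is disabled so all 16 background colours are usable.
    """
    cell = (y * COLS + x // 2) * 2
    vram[cell] = HALF_BLOCK
    attr = vram[cell + 1]
    if x % 2:  # right half of the cell -> foreground nibble
        attr = (attr & 0xF0) | color
    else:      # left half of the cell -> background nibble
        attr = (attr & 0x0F) | (color << 4)
    vram[cell + 1] = attr

set_pixel(0, 0, 9)   # bright blue in the left half of the first cell
set_pixel(1, 0, 12)  # bright red in the right half of the same cell
print(hex(vram[1]))  # -> 0x9c
```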

Deunan wrote:

ATI also comes with drivers for Lotus Symphony and 1-2-3 (132-char mode), Ashton-Tate Framework and AutoCAD (640x200 in 16 colors). My guess would be the card got "popular" not because of games but because of early pro users who demanded better color graphics. Much like with early co-processors, that's a very specific group of people who can pay the price of this extra hardware.

I think it's more the same case as with the Roland MPU-401 + MT-32: the fact that there's software support for it does not mean that the hardware was actually popular or widely used.
I would think that hardly anyone specifically bought a Plantronics ColorPlus card to upgrade their machine. The same probably with the ATi GS/SW and the Paradise PVC4.
I think they were mostly sold as standard graphics solutions in clones, so the cards you find now (apparently mostly ATi GS/SW, the Paradise PVC4 is a lot more rare), are probably OEM cards, taken from clones, not sold separately.

As such I would think a lot of people happened to have a Plantronics-compatible machine, just because their clone happened to support it. I doubt that most of them were even aware of it, and just treated their machine as a Hercules or CGA machine, and never bothered to experiment with custom drivers in 1-2-3, WordPerfect etc.

Deunan wrote:

I have to say I can see the appeal. Sure, by current standards 640x200 is a weird mode due to the pixel ratio, and 16 fixed colors (it should really be called 8+8, as you can't mix low- and high-intensity RGB signals on CGA) is nothing to write home about - but back in the early '80s? The 8x8 font box is poor but passable, and a 16-color text mode is enough for many programs. But the CGA "graphics" modes are... abysmal. How IBM concluded that would be acceptable, when many 8-bit machines could do way better, is beyond me.

It would seem that IBM was basically clueless about designing graphics hardware. CGA was hopeless, EGA was still quite bad (somewhat fancier hardware, but excruciatingly slow as a result), and even VGA wasn't all that great (the original IBM VGA chipsets were still very slow, it wasn't until much more efficient clones came around, such as the ET4000, that VGA really started to come into its own). Still no sprites, no hardware blitting, line drawing or other fancy stuff, and only very limited scrolling abilities.
I think another great missed opportunity was custom character sets. For example, the Commodore 64 allowed you to define multiple character sets in RAM, and you could easily switch between them. Most graphics on C64 aren't actually in graphics mode, but are built from cleverly constructed character sets. This is especially useful for tile-based games, since each character can be a 4x8 part of a tile. There's a lot less memory to update than with a raw framebuffer, where every bit directly represents a pixel/colour.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 13 of 127, by VileR

Rank: l33t
Deunan wrote:

I have to say I can see the appeal. Sure, by current standards 640x200 is a weird mode due to the pixel ratio, and 16 fixed colors (it should really be called 8+8, as you can't mix low- and high-intensity RGB signals on CGA) is nothing to write home about - but back in the early '80s? The 8x8 font box is poor but passable, and a 16-color text mode is enough for many programs.

Oh yes, I definitely see the appeal. Pre-EGA, it would've been great to have a commonly-supported graphics mode with 16 colors. Balking at weird aspect ratios is for the weak! 🤣
Not sure what you mean by the bolded part, though? In CGA intensity is controlled by one bit, and goes on its own pin, just like red, green and blue. I'm not aware of such a limitation that could apply to intensity alone.

Anyway, I'm not surprised that Plantronics and similar cards had drivers out for widely used programs; however, my impression is that early on, most business users would've had monochrome displays. People made much of the fact that the 9x14 text was much clearer, that monitors without a shadow mask were sharper, that the long-persistence phosphor prevented flicker, etc. -- it was considered more ergonomic for people spending long hours in front of their monitor for a living. So it was a natural consequence that Hercules made much better inroads into that market than any particular color-only card.

[ WEB ] - [ BLOG ] - [ TUBE ] - [ CODE ]

Reply 14 of 127, by Scali

Rank: l33t
VileRancour wrote:

Anyway, I'm not surprised that Plantronics and similar cards had drivers out for widely used programs; however, my impression is that early on, most business users would've had monochrome displays. People made much of the fact that the 9x14 text was much clearer, that monitors without a shadow mask were sharper, that the long-persistence phosphor prevented flicker, etc. -- it was considered more ergonomic for people spending long hours in front of their monitor for a living. So it was a natural consequence that Hercules made much better inroads into that market than any particular color-only card.

Yea, I suppose this is an 'artificial' divide because of the limits of commodity colour screens at the time.
Mono screens were great for text and DTP stuff (probably why Apple also opted for a mono screen on the Mac, and why Atari offered a high-res mono option for the Atari ST).
But there was also a class of users who wanted colour graphics. People in the CAD/CAM business for example.
Thing is, they probably would have had the budget for more high-end stuff (special graphics workstations), whereas Plantronics was a budget solution aimed at eking out more from your standard NTSC-based CGA CRT monitor.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 15 of 127, by Deunan

Rank: Oldbie
VileRancour wrote:

Not sure what you mean by the bolded part, though?

On CGA you get just one intensity bit for all 3 RGB signals. So you can't have, say, "high red" and "low blue, low green" at the same time. In the end you have only 8 colors, but in 2 versions: darker and brighter. Dark yellow being converted to brown inside the monitor is the exception, but that's a hardwired function. That's why I called it 8+8. The intensity bit basically raises the black level and all colors follow - and that's it.

On EGA you have 2 bits per color channel, and since there are 3 channels that's 6 bits total, so 64 combinations. It's almost the same system (in the sense that it uses the same connector and digital outputs) but so much better.
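The 8+8 scheme can be written down in a few lines - a sketch (the 8-bit RGB values are the conventional approximations, and the brown case is the monitor-side exception noted above):

```python
def rgbi_to_rgb(r, g, b, i):
    """Map CGA RGBI bits (0 or 1 each) to an approximate 8-bit RGB triple."""
    if (r, g, b, i) == (1, 1, 0, 0):  # dark yellow: the monitor halves green
        return (0xAA, 0x55, 0x00)      # -> brown, a hardwired exception
    base = 0x55 if i else 0x00         # one intensity bit raises ALL channels
    return tuple(base + 0xAA * c for c in (r, g, b))

# The same 8 base colors exist in a dark and a bright version - you can't
# mix a "high red" with a "low green", because I is shared:
print(rgbi_to_rgb(1, 0, 0, 0))  # -> (170, 0, 0)    dark red
print(rgbi_to_rgb(1, 0, 0, 1))  # -> (255, 85, 85)  light red
```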

Now, I was too young to be using any sort of CAD software in the early '80s, so I can't comment on the popularity of these "advanced" CGAs. But I did use some simple PCB routing software that was CGA-based at some point, and it was... crude. The 4 colors were just enough for a 2-layer PCB, and the 320x200 resolution allowed only for through-hole technology with at most one trace going between the DIP pins. Still, it kinda worked - and the 640x200 mode was no help for this purpose, as it only had black and white, so the doubled resolution was wasted. So I bet any CAD software user back then would jump at the opportunity of having 4 colors in 640x200, and better yet 16 with these clones.

In other words, ATI made a lot of different cards based on this design and didn't go broke - so they had to be selling them in decent numbers. And they weren't the only ones. Maybe the card sold only as a cheaper alternative to CGA, but I suspect the extra features were important to many people.

Reply 16 of 127, by Scali

Rank: l33t
Deunan wrote:

On EGA you have 2 bits per color channel, and since there are 3 channels that's 6 bits total, so 64 combinations. It's almost the same system (in the sense that it uses the same connector and digital outputs) but so much better.

Thing is that this requires a more complicated circuit. With CGA, the bits in memory correspond directly with the output lines (RGBI).
With EGA, there are 4 bitplanes in memory, which represent an index into a 16-entry palette, which contains the actual 6-bit colours for the output lines (RrGgBb).
This makes both the video circuit and the monitor more expensive.
IBM deliberately disabled the enhanced colours in 200-line modes (320x200 and 640x200), in order to remain compatible with the cheaper CGA monitors. The palette can only be used in 640x350 mode, on a real EGA-compatible monitor.
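A sketch of that two-stage lookup (the plane data is made up for illustration, and an identity palette is used rather than IBM's default register values):

```python
# Sketch of the EGA pipeline described above: 4 bitplanes form a 4-bit
# index, which selects one of 16 palette entries holding a 6-bit colour.

def ega_pixel(planes, x):
    """Combine bit x from each of the 4 bitplanes into a palette index."""
    index = 0
    for p, plane in enumerate(planes):
        bit = (plane >> (7 - x)) & 1  # bit 7 is the leftmost pixel
        index |= bit << p
    return index

planes = [0b10000000, 0b00000000, 0b10000000, 0b00000000]  # planes 0 and 2 set
palette = list(range(16))  # 16 entries, each would hold a 6-bit rgbRGB value
idx = ega_pixel(planes, 0)
print(idx, palette[idx])  # -> 5 5
```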

Deunan wrote:

In other words, ATI made a lot of different cards based on this design and didn't go broke - so they had to be selling them in decent numbers. And they weren't the only ones. Maybe the card sold only as a cheaper alternative to CGA, but I suspect the extra features were important to many people.

ATi was the king of OEM deals. They probably sold nearly all their cards to machine builders. Many big brands shipped with ATi cards from the factory.
This was also what ATi's reputation was like... Similar to Intel these days I suppose: it's cheap, it ships default with most computers, and it works, but it's not particularly great (the ATi GS/SW is actually *slower* than a real IBM CGA).
When the 3D revolution was started by 3DFX, ATi was basically the only survivor - not because they built great 3D accelerators; their early Rage solutions were horrible, both hardware-wise and driver-wise. But ATi apparently had the best OEM deals, so they sold their chips anyway, and had a second chance. They eventually got it right with the first Radeon (although driver problems still plagued them for many years, and in fact even today AMD's drivers aren't too great).
All the other 'classic' SVGA brands were wiped out almost immediately by 3DFX. Paradise/WD, Tseng Labs, S3, Video7, Cirrus Logic, Trident, Oak, Matrox. You name it, they all got wiped out when 3D acceleration became the norm.
ATi lived on for years, until eventually they got bought out by AMD for a huge amount of money.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 17 of 127, by keropi

Rank: l33t++

^ Inside XT~386 Hyundai systems I always find ATI cards, from Small Wonders to VGA ones

🎵 🎧 PCMIDI MPU , OrpheusII , Action Rewind , Megacard and 🎶GoldLib soundcard website

Reply 18 of 127, by Deunan

Rank: Oldbie
Scali wrote:

Thing is that this requires a more complicated circuit. With CGA, the bits in memory correspond directly with the output lines (RGBI).
With EGA, there are 4 bitplanes in memory

That's another beef I have with PC video cards: bitplane modes. If they had to fit the VRAM into a smaller window below 1MB, why not use a simple banked approach? The VGA hardware is so complicated for no good reason; one can come up with much more reasonable ways to overcome slow RAM-to-DAC transfers: wider words, interleaving, a FIFO on the ISA side rather than direct memory access with waitstates, etc.
It's like whoever came up with the idea for a video card just gave the HW guys free rein for any solution that works in the end, rather than thinking about what the software would need to do to actually use it. I can sort of excuse the ZX Spectrum and its wacky VRAM map - it was an 8-bit system with just one PLD for all the mobo logic and shared RAM. PC cards though? It could've been done better. It should have.

Scali wrote:

in fact even today AMD's drivers aren't too great

Compared to what? NVidia? I'll concede that NV has come up with a simply superb HW+SW combo that is way ahead of ATI/AMD in terms of overall efficiency. But that comes at a cost. AMD drivers "just work" and, if the game/engine was not designed with NV in mind, need no updates to deliver 90% of the performance. I haven't used NV cards in about a decade now, so maybe it got better, but I remember the crashes. There were a lot of them, and a new "game" driver every week that fixed one thing and broke another.
Intel drivers are pretty darn stable, but you also get maybe a couple of releases for a given CPU family and the current Windows OS, and that's it. Plus it's no huge wonder these work; the GPUs are pretty pathetic to begin with, so there's not a lot that can go wrong. Let's see Intel do a proper stand-alone card and a driver for that.

AMD sucks at OpenGL but then OGL should've died years ago at version 3.0.

Reply 19 of 127, by Scali

Rank: l33t
Deunan wrote:

That's another beef I have with PC video cards. Bitplane modes. If they had to fit the VRAM into a smaller window below 1MB, why not use simple banked approach.

For various 2D effects, bitplanes are actually far easier to work with. The Commodore Amiga also uses a bitplane-based approach, albeit more advanced than EGA.
But bitplanes allow you to cleverly arrange your palette so that setting a single bit can act as a transparency effect for example.
You can also selectively update only the bitplanes that you need, instead of always having to perform a read-modify-write on entire bytes to just update the bits you're interested in.
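One way to picture the palette trick mentioned above - a Python sketch with an arbitrary palette in which entries 8-15 all map to one colour, so plane 3 acts as a pure on/off overlay (all values made up):

```python
# With palette entries 8-15 all set to bright white, setting bit 3
# (i.e. writing to plane 3 alone) overlays white on any pixel, and the
# colours stored in planes 0-2 underneath are never disturbed.

background = [1, 2, 3, 4, 5, 6, 7, 0]  # indices built from planes 0-2
palette = list(range(8)) + [15] * 8    # entries 8-15 -> bright white

def with_overlay(pixels, mask):
    """Set bit 3 (plane 3) where mask is 1; the lower bits stay intact."""
    return [p | 8 if m else p for p, m in zip(pixels, mask)]

shown = with_overlay(background, [0, 1, 1, 0, 0, 0, 0, 0])
print([palette[p] for p in shown])  # -> [1, 15, 15, 4, 5, 6, 7, 0]
```

Clearing plane 3 again restores the original image, which is what makes it cheap for cursors, markers and similar overlays.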

Deunan wrote:

The VGA hardware is so complicated for no good reason, one can come up with much more reasonable ways to overcome slow RAM to DAC transfers: wider words, interleaving, FIFO on the ISA side rather than direct memory access with waitstates, etc.

VGA marks the turning point: once you go up to 8 bitplanes, you basically have a 1:1 byte-to-pixel mapping. So at this point, updating entire pixels becomes relatively simple, while at the same time 8 individual bitplanes become significant overhead.

Deunan wrote:

Compared to what? NVidia?

I think even Intel has more stable drivers than AMD at least.
I have a PC at work with a Radeon R7 360, and the video card crashes a few times per week. And it crashes badly. As in: I actually need to pull the DisplayPort cable loose, then power-cycle the PC, and then plug it back in once the PC is booting, or else the video simply won't come on anymore.
When I get back to the desktop, there's always a message saying that Radeon Wattman restored the default settings because it detected that the GPU became unresponsive (and I never changed any settings to begin with... my card is not overclocked or tweaked in any way).
I've updated the drivers many times, always running the latest, but so far, they never produced a driver that fixes the issue.

Prior to that I had a laptop that kept crashing on the Windows 10 1709 update. After trying many times and unplugging/uninstalling all non-essential stuff, I eventually got it to work after I ran Driver Cleaner to completely remove all AMD driver junk (just a regular uninstall wasn't enough, apparently, and these were the latest, up-to-date drivers, which were supposed to be compatible with the 1709 update).
Even so, that laptop always had problems... It had a combined AMD and Intel graphics setup, and it rarely worked when I plugged in a VGA or HDMI cable for a beamer. Never had that problem with combined NV and Intel laptops.

Another annoying thing was that the AMD control panel always crashed when you connected to your machine via Remote Desktop. Never happened on NV or Intel drivers either.

I also had a crazy snafu some years ago when I downgraded a machine from a HD5770 to an X1900XTX card.
Apparently the uninstaller left some OpenGL libraries on disk. And the shared code in the drivers for the X1900 apparently just blindly loaded these libraries when it found them on the system (as opposed to checking what hardware you run on, and what libraries you need for that specific hardware). These libraries were specific to newer cards, and they crashed with a null-pointer exception on my X1900XTX (so no sanity checks either; just blindly using a pointer that was never initialized because a previous call had failed on my hardware). This meant that OpenGL simply didn't work.
I eventually managed to get it to work after I had stepped through their driver code to see where it failed, and it dawned on me that this particular DLL shouldn't even be there in the first place. After deleting the file, OpenGL came back to life. Which made me wonder how you would ever get both these cards working in the same system. The DLL would have to be there for the HD5770, but would break OpenGL for the X1900XTX.
Just shows how retarded their driver department is.

Deunan wrote:

AMD sucks at OpenGL but then OGL should've died years ago at version 3.0.

AMD also sucks at DirectX 11. They simply didn't even bother to implement the multithreading model at all. Their driver just buffers all commands and executes everything on the main thread. Then they blamed Microsoft because the API was allegedly broken. Nevermind that both NV and Intel managed to get it working just fine.
DirectX 12 and Vulkan are easier because all the complexity is shoved into the application now. Just means there's less for the driver team to screw up. Not that the drivers are actually better.
If anything, the larger gap between DX11 and DX12 on AMD merely shows how much performance they left on the table, that NV managed to extract from their DX11 driver.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/