VOGONS


First post, by serialShinobi

User metadata
Rank Newbie

When does the acceleration take place? In the Windows 3.x shell? What causes a 2D accelerator chipset to respond to the Windows GUI? As far as I know these chipsets of the early 90s did not give DOS games a boost in VGA-mode performance. But Windows 3.x had no graphics chipset driver, as far as I can tell. Maybe a 2D acceleration API? I can't find a driver for the Diamond Stealth VRAM nor its chipset, the S3 P86C911. Yet I read an article, "S3 Graphics Gone But Not Forgotten" (TechSpot, 2021), that spoke about bit block transfers -- how is it that acceleration works only for the GUI or Windows? What about DOS games? So the early accelerators didn't also benefit bitmapped games in DOS? I guess there was some universal driver in Windows 3.x?

Reply 1 of 48, by vstrakh

User metadata
Rank Member

The real 2D acceleration predates Windows 3.x.
There is the IBM 8514/A video card, released in 1987, which used a standardised API to offload common drawing primitives - lines, polygon fills, block copying.
There were register-compatible clones, and clones not compatible in hardware but providing their own implementation of the same 8514 API.
Windows needs a video driver to translate device-independent drawing calls into a series of commands for the real hardware.
DOS applications can benefit from acceleration by making calls to the API.
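
To make the "offload" idea concrete, here is a rough DOS-era C sketch of the difference between the CPU filling a rectangle itself and handing the job to an 8514/A-style drawing engine. The ACCEL_* ports and the command code are made-up placeholders (not the real 8514/A registers), so this only shows the shape of the interaction:

/* Hedged sketch: offloading a solid rectangle fill to an accelerator's
 * drawing engine vs. doing it with the CPU.  Port addresses and command
 * code below are placeholders, NOT real 8514/A register values. */
#include <dos.h>                    /* outportw() in Borland/Turbo C */

#define ACCEL_CUR_X     0x1000      /* placeholder I/O ports */
#define ACCEL_CUR_Y     0x1002
#define ACCEL_WIDTH     0x1004
#define ACCEL_HEIGHT    0x1006
#define ACCEL_FG_COLOR  0x1008
#define ACCEL_CMD       0x100A
#define CMD_FILL_RECT   0x0001      /* placeholder command code */

void fill_rect_hw(int x, int y, int w, int h, int color)
{
    /* The CPU writes a handful of registers; the chip then walks video
     * memory on its own, in parallel with whatever the CPU does next. */
    outportw(ACCEL_CUR_X, x);
    outportw(ACCEL_CUR_Y, y);
    outportw(ACCEL_WIDTH, w - 1);
    outportw(ACCEL_HEIGHT, h - 1);
    outportw(ACCEL_FG_COLOR, color);
    outportw(ACCEL_CMD, CMD_FILL_RECT);     /* kick off the engine */
}

void fill_rect_sw(unsigned char far *vram, int pitch,
                  int x, int y, int w, int h, int color)
{
    /* Unaccelerated equivalent: the CPU touches every byte itself
     * (ignoring bank switching and segment wrap - it's only a sketch). */
    int row, col;
    for (row = 0; row < h; row++)
        for (col = 0; col < w; col++)
            vram[(y + row) * pitch + x + col] = (unsigned char)color;
}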

Some videos:
https://www.youtube.com/watch?v=xrOci-LaMeo
https://www.youtube.com/watch?v=fX8uAwY0FGQ

Reply 2 of 48, by mattw

User metadata
Rank Oldbie
vstrakh wrote on 2023-03-18, 17:42:

IBM 8514

Not only is the IBM 8514/A the first 2D accelerator, but all the concepts and ideas behind any 2D acceleration come from there. That basically makes all 2D accelerators more or less clones of the IBM 8514/A - ATI Mach8/32 are even 100% register compatible with the IBM 8514/A, while early S3 chips are almost fully register compatible. You can see here:

https://lists.gnu.org/archive/html/qemu-devel … 6/msg01679.html

A QEMU patch that adds S3 support on top of the 8514/A support. Unfortunately, as far as I can tell, that patch was not upstreamed to QEMU even 12 years later. There is half-baked emulation in 86Box:

https://github.com/86Box/86Box/issues/2968

but you can still use 86Box with Tseng ET4000 emulation to get a working 8514/A, as I covered here:

https://github.com/86Box/86Box/issues/2968#is … ment-1353431471

Reply 3 of 48, by rasz_pl

User metadata
Rank l33t

>What about for DOS games?

there was an attempt https://en.wikipedia.org/wiki/VESA_BIOS ... gust_1996) https://shawnhargreaves.com/freebe/ but 1996 was hilariously too late - 5 years too late. ATI and S3 released 8514 clones in 1991, Tseng in 1993. If VBE/AF had been a thing in 1991 we would definitely have played 2D-accelerated DOS games before Win95.

Last edited by rasz_pl on 2023-03-19, 00:49. Edited 1 time in total.

Open Source AT&T Globalyst/NCR/FIC 486-GAC-2 proprietary Cache Module reproduction

Reply 4 of 48, by Jo22

User metadata
Rank l33t++

IBM Professional Graphics Controller (PGC) in 1984?
AutoCAD supported it; 640x480 pels in 256c. The matching PGC aka PGA monitor (model 5175) formed the basis for VGA monitors (signaling, frequencies), AFAIK.

IBM 8514/A? Windows 2.x and OS/2 1.1 had drivers for it, giving accelerated 1024x768 at 256 colours in '87/'88.
Alternate resolution was 640x480, AFAIK.

Then TIGA (released '89), based on TMS34010 and TMS34020?
Windows 3.10 shipped with a middleware driver for it (talked to the real TIGA driver).

IBM XGA also was a thing, but was it properly accelerated?

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 5 of 48, by hyoenmadan

User metadata
Rank Member
serialShinobi wrote on 2023-03-18, 17:30:

how is it that acceleration works only for the GUI or Windows?

Your answer is in http://www.os2museum.com/wp/display-drivers-o … 16-bit-windows/ and https://www.os2museum.com/wp/undocumented-vflatd/. Read the whole articles and the comments.

TL;DR: Back then graphics vendors were supposed to implement literally the whole drawing engine in their drivers for Windows. The old GDI.exe was mostly entry points and forwarders to the graphics driver's drawing engine, plus glue code to interact with the USER "Window Manager". Since vendors were in control of the drawing engine, they could put their own chip-specific sauce in it, and automatically every Windows application would benefit from it. Plus they could hide that sauce by not providing libraries or documentation on the specifics of their graphics chip, instead just releasing a Windows driver for it. The downside was that they had to maintain their own drawing engine codebase, with all the complexities that brings.

Later, MS would standardize the mechanisms and offer a "generic" drawing engine in the form of WinG and the DIB engine, which would later be extended and finally become a standard component of the Windows GDI API. In the Win95/98/NT/2000/XP period, graphics manufacturers would make "miniport" graphics drivers, just extending and accelerating the DIB engine capabilities built into GDI, making the graphics driver code smaller and standardized around a single drawing engine.
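
A rough sketch of that pre-DIB-engine split in C terms (names and types here are simplified stand-ins, not the real Windows 3.x DDI entry points, which take PDEVICE structures and many more parameters):

/* GDI calls an entry point exported by the display driver; the driver's
 * own drawing engine decides whether the chip or the CPU does the work.
 * MYDEVICE, driver_BitBlt etc. are hypothetical names, not the real DDI. */
typedef struct {
    int has_blt_engine;          /* probed once, when the driver is enabled */
    unsigned char *vram;         /* frame buffer (or banked window) */
    int pitch;
} MYDEVICE;

#define ROP_SRCCOPY 0x00CC0020L

static int hw_screen_copy(MYDEVICE *dev, int dx, int dy,
                          int sx, int sy, int w, int h)
{
    /* vendor "secret sauce": a few writes to the chip's blitter registers
     * would go here; the engine then copies VRAM on its own */
    (void)dev; (void)dx; (void)dy; (void)sx; (void)sy; (void)w; (void)h;
    return 1;
}

static int sw_copy(MYDEVICE *dst, int dx, int dy,
                   MYDEVICE *src, int sx, int sy, int w, int h)
{
    int row, col;                /* the driver's software drawing engine */
    for (row = 0; row < h; row++)
        for (col = 0; col < w; col++)
            dst->vram[(dy + row) * dst->pitch + dx + col] =
                src->vram[(sy + row) * src->pitch + sx + col];
    return 1;
}

/* The entry point GDI would call for every BitBlt issued by any app */
int driver_BitBlt(MYDEVICE *dst, int dx, int dy,
                  MYDEVICE *src, int sx, int sy,
                  int w, int h, long rop)
{
    if (dst->has_blt_engine && src == dst && rop == ROP_SRCCOPY)
        return hw_screen_copy(dst, dx, dy, sx, sy, w, h);
    return sw_copy(dst, dx, dy, src, sx, sy, w, h);
}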

Reply 6 of 48, by Jo22

User metadata
Rank l33t++
rasz_pl wrote on 2023-03-18, 21:38:

>What about for DOS games?

there was an attempt https://en.wikipedia.org/wiki/VESA_BIOS ... gust_1996) https://shawnhargreaves.com/freebe/ but 1996 was hilariously too late - 5 years too late. ATI and S3 released 8514 clones in 1991, Tseng in 1993. If VBE/AF had been a thing in 1991 we would definitely have played 2D-accelerated DOS games before Win95.

The irony is that the PC platform already had a graphics accelerator since the very beginning - the math co-processor.

It can greatly accelerate the drawing in a coordinate system, can assist colour filling of forms, can draw all the graphics primitives, such as rectangles, circles (using Pi), etc.

If only DOS developers weren't so lazy and dumb.
Some compilers like QuickBasic 4.5 automatically used an available FPU.

Also, there had always been the possibility of shipping two main executables with an application/game (8086, 8086+8087).

So the old excuse of having to do "extra work" was just a poor excuse, IMHO.

Edit: There's one early x86 DOS PC with graphics acceleration, the Mindset PC. Very interesting. 😃

https://www.youtube.com/watch?v=3a_qJFD80_c

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 7 of 48, by Jo22

User metadata
Rank l33t++
hyoenmadan wrote on 2023-03-19, 02:32:
serialShinobi wrote on 2023-03-18, 17:30:

how is it that acceleration works only for the GUI or Windows?

Your answer is in http://www.os2museum.com/wp/display-drivers-o … 16-bit-windows/ and https://www.os2museum.com/wp/undocumented-vflatd/. Read the whole articles and the comments.

TL;DR: Back then graphics vendors were supposed to implement literally the whole drawing engine in their drivers for Windows. The old GDI.exe was mostly entry points and forwarders to the graphics driver's drawing engine, plus glue code to interact with the USER "Window Manager". Since vendors were in control of the drawing engine, they could put their own chip-specific sauce in it, and automatically every Windows application would benefit from it. Plus they could hide that sauce by not providing libraries or documentation on the specifics of their graphics chip, instead just releasing a Windows driver for it. The downside was that they had to maintain their own drawing engine codebase, with all the complexities that brings.

Later, MS would standardize the mechanisms and offer a "generic" drawing engine in the form of WinG and the DIB engine, which would later be extended and finally become a standard component of the Windows GDI API. In the Win95/98/NT/2000/XP period, graphics manufacturers would make "miniport" graphics drivers, just extending and accelerating the DIB engine capabilities built into GDI, making the graphics driver code smaller and standardized around a single drawing engine.

There's one thing to keep in mind, however. When 16-Bit Windows was in development, the future was uncertain.
Both OS/2 PM and MS Windows used a great amount of hardware abstraction, to be flexible and competitive.

There used to be PC BUS, AT BUS, Olivetti BUS, EISA, 16-Bit MCA, 32-Bit MCA, VESA Local BUS, proprietary local BUSes, PCMCIA/CardBUS etc.

Outside the PC world, there were NuBus, C-Bus etc.

When Windows 95 debuted, everything had withered down to an essentially PCI-only ecosystem.
AGP and PCIe are no different in that respect - still PCI-derived technology.

This circumstance made things easier for the miniport drivers on Windows 9x.
Windows 3.x, by comparison, had no PCI or AGP drivers (GART etc).
Any required support thus had to be included in the graphics drivers themselves.

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 8 of 48, by kolderman

User metadata
Rank l33t

They never really took off. Software support was limited as a result. That being said, IBM pushed it with OS/2, which also never really took off.

https://www.os2museum.com/wp/the-8514a-graphics-accelerator/

Reply 9 of 48, by bakemono

User metadata
Rank Oldbie

8514/A was likely predated by cards with HD63484 or uPD7220 in offering hardware-based drawing capabilities for IBM-compatible PCs.

Jo22 wrote on 2023-03-19, 08:29:

The irony is that the PC platform already had a graphics accelerator since the very beginning - the math co-processor.

It can greatly accelerate the drawing in a coordinate system, can assist colour filling of forms, can draw all the graphics primitives, such as rectangles, circles (using Pi), etc.

I don't think this is an accurate take. x87 doesn't have any functionality specific to graphics and can't do anything without the main CPU holding its hand. Nearly anything you can name in the realm of 2D graphics that the x87 could participate in could be done as fast, or faster, using the CPU alone with fixed-point math or lookup tables.
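
A small example of the lookup-table / fixed-point style being described here - rotating a point with nothing but integer multiplies and shifts. The table is built once with the C library's sin(); in a real DOS game it would simply be precomputed data:

/* sin/cos lookup table in 8.8 fixed point: 256 "binary degrees" per circle */
#include <stdio.h>
#include <math.h>                      /* used only once, to build the table */

#define ANGLES 256
static int sin_tab[ANGLES];            /* sin(angle) scaled by 256 */

static void build_table(void)
{
    int a;
    for (a = 0; a < ANGLES; a++)
        sin_tab[a] = (int)(sin(a * 2.0 * 3.14159265 / ANGLES) * 256.0);
}

/* rotate (x, y) around the origin using only integer mul/shift */
static void rotate(int x, int y, int angle, int *rx, int *ry)
{
    int s = sin_tab[angle & (ANGLES - 1)];
    int c = sin_tab[(angle + ANGLES / 4) & (ANGLES - 1)];  /* cos = sin + 90deg */
    *rx = (x * c - y * s) >> 8;
    *ry = (x * s + y * c) >> 8;
}

int main(void)
{
    int rx, ry;
    build_table();
    rotate(100, 0, 64, &rx, &ry);      /* 64 binary degrees = 90 degrees */
    printf("(%d, %d)\n", rx, ry);      /* roughly (0, 100) */
    return 0;
}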

again another retro game on itch: https://90soft90.itch.io/shmup-salad

Reply 10 of 48, by Jo22

User metadata
Rank l33t++
kolderman wrote on 2023-03-19, 09:00:

They never really took off. Software support was limited as a result. That being said, IBM pushed it with OS/2, which also never really took off.

https://www.os2museum.com/wp/the-8514a-graphics-accelerator/

It did, in the professional fields. Just not at home. Like with Linux ("year of the Linux desktop").. 😉

Seriously, though. OS/2 was no failure. Commercially, maybe. But not socially.
Developers quietly used it as a tool to write the software we all used.
Microsoft itself had its compilers and servers running on it until NT was usable as a replacement.
OS/2 was used in ATMs, in the DTP field (Aldus PageMaker 3 had a port), etc.

But that's the problem. What does "successful" really mean? Money? Fame?
If so, humanity would be rather poor as such.

PS: The OS/2 Museum is really lacking here, not to say substandard.
This guy here had done some notable experiments with 8514/A:
https://m.youtube.com/@robertkixmiller1459/se … ?query=8514%2FA

Also, what's not being mentioned: all the ~14" IBM PS/2 monitors (85xx) could do both 640x480 standard VGA and 1024x768 (46Hz interlaced) for the 8514/A!
800x600, the famous Super VGA resolution, was ironically not supported due to inverted sync polarity and timings.

Edit: The uPD7220 was interesting, agreed. It was a much better 'CGA chip' than the MC6845.
The Japanese systems used it, AFAIK.
Weren't the Macintosh devs asked at some point why they didn't use it? I vaguely remember reading something about this.

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 11 of 48, by Jo22

User metadata
Rank l33t++
bakemono wrote on 2023-03-19, 09:05:

8514/A was likely predated by cards with HD63484 or uPD7220 in offering hardware-based drawing capabilities for IBM-compatible PCs.

Jo22 wrote on 2023-03-19, 08:29:

The irony is that the PC platform already had a graphics accelerator since the very beginning - the math co-processor.

It can greatly accelerate the drawing in a coordinate system, can assist colour filling of forms, can draw all the graphics primitives, such as rectangles, circles (using Pi), etc.

I don't think this is an accurate take. x87 doesn't have any functionality specific to graphics and can't do anything without the main CPU holding its hand. Nearly anything you can name in the realm of 2D graphics that the x87 could participate in could be done as fast, or faster, using the CPU alone with fixed-point math or lookup tables.

Hm. If that was true, then an x87 or Weitek in the CAD/CAM fields never was a necessity.
Going by that logic, the Intel RapidCAD would have been an unnecessary piece of electronics, too.

Edit: To be fair, fractals might be an exception here, for sure.
Integer math can perform the calculations quicker there,
especially if the 32-bit registers of the 80386 are being used. FRACT386 was such an example.

Edit: Perhaps I should have been more precise here, too.
Thank you for mentioning this aspect of the matter.
What I meant to say was that flight simulators and other "games" could have benefitted from an x87 unit.
At least if the main CPU was slow. Starting with the 80386, the CISC design reached its peak.
Both the 386 and 486 had a better internal pipeline than, say, the 80186.
The 286 is a special case, maybe, because it was the final 16-bit design and had an MMU, too.
Memory operations were no longer done via the ALU, etc.

But since DOS games developers were quite backwards before the advent of SVGA and the 80486 chip, an x87 would have been in reach for older systems, at least.
If more software (games) had started to optionally support the x87 way back in the 80s (like they did for the Roland MT-32/LAPC-I), then the market for x87 chips would have been more interesting to FPU producers, maybe.

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 12 of 48, by spiroyster

User metadata
Rank Oldbie
Jo22 wrote on 2023-03-19, 08:29:

The irony is that the PC platform already had a graphics accelerator since the very beginning - the math co-processor.

It can greatly accelerate the drawing in a coordinate system, can assist colour filling of forms, can draw all the graphics primitives, such as rectangles, circles (using Pi), etc.

If only DOS developers weren't so lazy and dumb.
Some compilers like QuickBasic 4.5 automatically used an available FPU.

Given the nature of raster architecture (integer coordinates - pixels are at (12, 42), not (12.3, 42.5) etc) and 24/32-bit colour spaces (2/4/8 bits per channel etc, int values), it would be extra overhead to process in the FPU, and it would actually be a decelerator since all results would still need to be discretized to the int coordinate space anyway. For this reason many early graphics algorithms that evolved were integer-based, e.g. Bresenham's algorithms. The FPU was useful for colour mixing, although not necessary, but it mainly became useful with 3D due to vector/matrix calcs - and even then, in the early days many of these were still integer-based due to the performance difference and limited precision requirements.
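
For reference, Bresenham's line algorithm really is nothing but integer adds and compares in the inner loop - a minimal sketch for the first octant (0 <= dy <= dx):

/* Bresenham line, first octant only; printf stands in for a pixel write */
#include <stdio.h>

static void line_octant0(int x0, int y0, int x1, int y1)
{
    int dx = x1 - x0, dy = y1 - y0;
    int err = 2 * dy - dx;             /* integer decision variable */
    int x, y = y0;

    for (x = x0; x <= x1; x++) {
        printf("plot(%d, %d)\n", x, y);
        if (err > 0) { y++; err -= 2 * dx; }
        err += 2 * dy;
    }
}

int main(void) { line_octant0(0, 0, 9, 3); return 0; }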

I expect most games didn't use the FPU because most games didn't need it. And for the ones that perhaps could have used one, it was probably not worth it and would have had little bearing on the final end-user experience (unlike fixed-point/integer, FPU speed and precision depend on the value's mantissa/exponent ranges), not to mention the additional burden of the hardware requirement on earlier systems.

So I wouldn't say early DOS developers were dumb or lazy - actually the opposite. In many cases they knew what they were doing, knew how to utilise the hardware to get the best performance and results from it, and could develop and evolve algorithms fit for the requirement.

Jo22 wrote on 2023-03-19, 09:24:

Hm. If that was true, then an x87 or Weitek in the CAD/CAM fields never was a necessity.
Going by that logic, the Intel RapidCAD would have been an unnecessary piece of electronics, too.

CAD/CAM, while speed is good, prioritises precision, and so would require the FPU to help with the precision of the computations done on the 'data model', not for the graphical presentation aspects in the 'view model' (e.g. you would want to calculate the area of a polygon from precise float values, not approximate it from clamped int values, but draw/rasterise with clamped int values, as pushing pixels with floats isn't possible). So the FPU is pretty useful/important for these types of applications, but it doesn't necessarily have anything to do with graphics, and it is certainly not a requirement or a help for "accelerating graphics".
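
A tiny illustration of that split, with a made-up example: the area comes from full-precision doubles (shoelace formula) in the data model, and only the final vertex positions get snapped to integer pixels for drawing:

/* data model in doubles, view model in clamped integer pixels */
#include <stdio.h>

struct pt { double x, y; };                    /* full-precision geometry */

static double polygon_area(const struct pt *p, int n)
{
    double acc = 0.0;                          /* shoelace formula */
    int i;
    for (i = 0; i < n; i++) {
        int j = (i + 1) % n;
        acc += p[i].x * p[j].y - p[j].x * p[i].y;
    }
    return acc / 2.0;
}

int main(void)
{
    struct pt quad[4] = { {0.0, 0.0}, {10.25, 0.0}, {10.25, 4.5}, {0.0, 4.5} };
    double scale = 8.0;                        /* view model: units -> pixels */
    int i;

    printf("area = %.3f\n", polygon_area(quad, 4));      /* 46.125 */
    for (i = 0; i < 4; i++)                    /* only now clamp to pixels */
        printf("vertex %d -> pixel (%d, %d)\n", i,
               (int)(quad[i].x * scale + 0.5), (int)(quad[i].y * scale + 0.5));
    return 0;
}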

Jo22 wrote on 2023-03-19, 09:24:

But since DOS games developers were quite backwards before the advent of SVGA and the 80486 chip

Backwards? How so?

Reply 13 of 48, by kixs

User metadata
Rank l33t
serialShinobi wrote on 2023-03-18, 17:30:

When does the acceleration take place? In the Windows 3.x shell? What causes a 2D accelerator chipset to respond to the Windows GUI? As far as I know these chipsets of the early 90s did not give DOS games a boost in VGA-mode performance. But Windows 3.x had no graphics chipset driver, as far as I can tell. Maybe a 2D acceleration API? I can't find a driver for the Diamond Stealth VRAM nor its chipset, the S3 P86C911. Yet I read an article, "S3 Graphics Gone But Not Forgotten" (TechSpot, 2021), that spoke about bit block transfers -- how is it that acceleration works only for the GUI or Windows? What about DOS games? So the early accelerators didn't also benefit bitmapped games in DOS? I guess there was some universal driver in Windows 3.x?

Try these drivers:
https://oemdrivers.com/graphics-s3-p86c911

Acceleration works by the driver sending the proper commands to the GUI (accelerator) part of the chipset, telling it what to do. That's why it only works with the correct drivers.

Requests are also possible... /msg kixs

Reply 14 of 48, by Jo22

User metadata
Rank l33t++
spiroyster wrote on 2023-03-19, 22:08:
Jo22 wrote on 2023-03-19, 09:24:

But since DOS games developers were quite backwards before the advent of SVGA and the 80486 chip

Backwards? How so?

Never mind, I had a bad day. It wasn't meant that way. I'm sorry.

What I meant to say is that the x87 existed in higher-end PCs at work, at least.
So users had at least a chance to experience the difference between x86 and x86+x87.
You know, in those times when games were still being played at the workplace and had a "boss key".

Adding floating-point support wasn't tricky, at the time. Most popular compilers (Turbo Pascal 3+ etc) provided the choice to
a) use a floating point library b) use x87 emulation c) use x87 co-processor.
Or a combination of those.

So this wasn't a problem. Integrating all of those options in a single binary was, maybe, due to the increased conventional memory consumption.
Using overlays or two binaries, game.com and game87.com, would have solved this, but apparently this wasn't pretty enough.

Autodesk AutoSketch 2 and 3 (cheap 2D CAD programs) had two different versions
in the box, one floppy disk with 8086 code and one with 8086+8087 code.

The difference was quite impressive.

8086+8087 was like a dual-processor setup, architecture-wise.
So it made sense for the two to go together.
Without the 8087, the 8086 is just half a processor.

https://www.youtube.com/watch?v=SGCUErENKBA

https://www.youtube.com/watch?v=ItA-_D2QkTk

Other programs like QuickBasic 4.5 supported x87 instructions / the FPU when available.
A notable exception, maybe.

https://www.youtube.com/watch?v=mYJ1IPph4tw

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 15 of 48, by rasz_pl

User metadata
Rank l33t
Jo22 wrote on 2023-03-19, 09:24:
bakemono wrote on 2023-03-19, 09:05:

8514/A was likely predated by cards with HD63484 or uPD7220 in offering hardware-based drawing capabilities for IBM-compatible PCs.

Jo22 wrote on 2023-03-19, 08:29:

The irony is that the PC platform already had a graphics accelerator since the very beginning - the math co-processor.

It can greatly accelerate the drawing in a coordinate system, can assist colour filling of forms, can draw all the graphics primitives, such as rectangles, circles (using Pi), etc.

I don't think this is an accurate take. x87 doesn't have any functionality specific to graphics and can't do anything without the main CPU holding its hand. Nearly anything you can name in the realm of 2D graphics that the x87 could participate in could be done as fast, or faster, using the CPU alone with fixed-point math or lookup tables.

Hm. If that was true, then an x87 or Weitek in the CAD/CAM fields never was a necessity.

It is true. The FPU in CAD was used for rotation matrices, Bézier curves etc. - math that happens before outputting to the screen. The FPU doesn't help with the actual putting-pixels-on-the-screen part at all.

Jo22 wrote on 2023-03-19, 09:24:

What I meant to say was that flight simulators and other "games" could have benefitted from an x87 unit.

Even the only (?) DOS flight simulator with an optional FPU flight model (Falcon 3) is broken 😀 all thanks to math edge cases (something about making a steep turn in a certain plane configuration with a certain weight breaks the physics).

Pre-486 coprocessors were hilariously slow; afaik every FPU invocation was handled like a CPU exception.
A 486DX, while much faster, was still useless for games. For example, perfect perspective correction requires an FPU calculation per pixel (1/z). Even Quake on a Pentium had to settle for doing an fdiv only every 16 pixels, interleaving integer code between the FPU instructions. That's just drawing. For world geometry rotation/displacement and rasterization you need a TON of FPU power. Quake takes full advantage of the Pentium's 8-stage FPU pipeline https://www.eeeguide.com/internal-architectur … tium-processor/ , executing multiple instructions in parallel (at different stages of execution). https://github.com/id-Software/Quake/search?q=fxch
Good example: https://github.com/id-Software/Quake/blob/bf4 … _drawa.asm#L248 - every fxch lets you swap parameters and decouple the subsequent instruction from its dependency on the previous one https://www.phatcode.net/res/224/files/html/c … ml#:~:text=FXCH.
On the Intel Pentium, fxch executes for free, in zero cycles. AMD caught up in late 1998 with the CXT revision of the K6-2; before that, every FXCH was 2 cycles http://www.azillionmonkeys.com/qed/cpuwar.html, so while in theory you had a pipelined FPU, you still burned 2 cycles to unlock the next instruction. Pre-Pentium, every instruction executed sequentially, one after another. This is why the record-breaking Quake 486 builds (AMD X5 @ 180 MHz etc) land somewhere around 18 fps - the score of a slow Pentium 75.
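
A simplified C sketch of that every-16-pixels trick (just the arithmetic - Quake does it in hand-scheduled assembly so the FDIV overlaps the integer pixel loop): 1/z, u/z and v/z are linear across the scanline, so the expensive divide happens only at span boundaries and u,v are stepped affinely in between.

/* one perspective divide per 16-pixel span, affine interpolation inside */
#include <stdio.h>

#define SPAN 16

void draw_scanline(int width, float uoz, float voz, float ooz,
                   float duoz, float dvoz, float dooz)   /* per-pixel gradients */
{
    float z = 1.0f / ooz;                 /* divide for the left edge */
    float u = uoz * z, v = voz * z;
    int x = 0;

    while (x < width) {
        int count = (width - x < SPAN) ? (width - x) : SPAN;
        int i;

        /* step the linear quantities to the right edge of the span and
         * do the single expensive divide there */
        float uoz_r = uoz + duoz * count;
        float voz_r = voz + dvoz * count;
        float ooz_r = ooz + dooz * count;
        float z_r   = 1.0f / ooz_r;
        float u_r   = uoz_r * z_r;
        float v_r   = voz_r * z_r;

        /* cheap affine steps for the pixels inside the span */
        float du = (u_r - u) / count;
        float dv = (v_r - v) / count;
        for (i = 0; i < count; i++) {
            printf("pixel %3d -> texel (%7.3f, %7.3f)\n", x + i, u, v);
            u += du;
            v += dv;
        }

        x += count;
        uoz = uoz_r; voz = voz_r; ooz = ooz_r;
        u = u_r; v = v_r;
    }
}

int main(void)
{
    /* made-up gradients, just to exercise the loop */
    draw_scanline(32, 0.0f, 0.0f, 0.02f, 0.01f, 0.005f, 0.0005f);
    return 0;
}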

TL;DR: the FPU is useless for accelerating pixel movement. The FPU is useful for manipulating content, but the pre-Pentium FPU was too slow for a meaningful positive impact in games. CAD users were fine waiting seconds to minutes per screen update.

Open Source AT&T Globalyst/NCR/FIC 486-GAC-2 proprietary Cache Module reproduction

Reply 16 of 48, by hyoenmadan

User metadata
Rank Member
Jo22 wrote on 2023-03-19, 09:11:

But that's the problem. What does "successful" really mean? Money? Fame?

Ask Jobs and Apple. They never really invented anything, but people credit them as the inventors of personal computing, MP3 players, tablets, smartphones and many other things (in the future they will probably be seen as the ones who invented PCs with ARM CPUs, and nobody will question it). There were personal computers before them, even with color and sound. There were certainly MP3 players before them (Creative Nomad, RIO and all their Chinese clones) and tablets (the Windows Tablet PC was long before the iPad). There were smartphones before them, too (albeit still a bit primitive, I give them that).

But in the end none of that even matters. What matters is that you find a way to sell your stuff and make it popular, a luxury desired among the masses - something to give you "status". Even Microsoft lacks it... they have the money but not the "fame". Money, fame... Apple has all of it, and that's what gives them and their products the status of "successful", even if MS has twice the money Apple has.

If so, humanity would be rather poor as such.

Just check the general social environment around us these days. Then you may get your answer.

Reply 17 of 48, by megatron-uk

User metadata
Rank Oldbie

The only 'sane' drawing acceleration API for DOS was VBE/AF, and like others have mentioned it came far, far too late in the life of DOS to be useful. Most people had already moved on to Windows as their gaming/application platform.

It *can* do acceleration of lines, fills, block transfers, VRAM-to-VRAM blitting and more... But... other than the Allegro graphics library, almost nothing ever supported it.

Also, if you look at the drivers available, very few of the mainstream video cards that were available even supported more than the most basic features.

I posted a fixed version of the drivers recently, I will see if I can find it.
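
For what it's worth, this is roughly what using it through Allegro looked like from the programmer's side (untested sketch against the DOS builds of Allegro; whether rectfill/line/blit actually hit the hardware depends entirely on which functions the installed VBE/AF driver exposes):

/* ask Allegro for a VBE/AF mode; drawing calls use the accelerated driver
 * functions where available and fall back to software everywhere else */
#include <stdio.h>
#include <allegro.h>

int main(void)
{
    allegro_init();
    install_keyboard();

    if (set_gfx_mode(GFX_VBEAF, 640, 480, 0, 0) != 0) {
        printf("No usable VBE/AF driver: %s\n", allegro_error);
        return 1;
    }

    rectfill(screen, 100, 100, 300, 200, 4);         /* may use HW fill    */
    line(screen, 0, 0, 639, 479, 15);                /* may use HW lines   */
    blit(screen, screen, 0, 0, 320, 240, 200, 150);  /* may use HW blitter */

    readkey();
    return 0;
}
END_OF_MAIN()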

My collection database and technical wiki:
https://www.target-earth.net

Reply 18 of 48, by megatron-uk

User metadata
Rank Oldbie

Freebe/AF installer segfaults

Fixed version of the FreeBE/AF (free VBE/AF) driver installer.

Have a look through the drivers and see what features they expose. A lot are just "exposes linear framebuffer"... Very few expose the more advanced drawing features that people are talking about here. Even our old favourite, the S3 Trio, doesn't have much in the way of cool stuff, and we all know it was a fairly decent 2D card in Windows.

My collection database and technical wiki:
https://www.target-earth.net

Reply 19 of 48, by Jo22

User metadata
Rank l33t++

@hyoenmadan I understand what you mean. An old saying, however, goes something like this: the best tool is the one that doesn't attract attention.
Like OS/2 with its tragic biography, kind of. It was doing better behind the scenes than in public.

rasz_pl wrote on 2023-03-20, 02:53:
Jo22 wrote on 2023-03-19, 09:24:
bakemono wrote on 2023-03-19, 09:05:

8514/A was likely predated by cards with HD63484 or uPD7220 in offering hardware-based drawing capabilities for IBM-compatible PCs.

I don't think this is an accurate take. x87 doesn't have any functionality specific to graphics and can't do anything without the main CPU holding its hand.
Nearly anything you can name in the realm of 2D graphics that the x87 could participate in could be done as fast, or faster, using the CPU alone with fixed-point math or lookup tables.

Hm. If that was true, then an x87 or Weitek in the CAD/CAM fields never was a necessity.

It is true. The FPU in CAD was used for rotation matrices, Bézier curves etc. - math that happens before outputting to the screen. The FPU doesn't help with the actual putting-pixels-on-the-screen part at all.

Thank you for the response, but you're not entirely correct on this one, I think.
Edit: Well, technically, you are. We're talking past each other here, I suppose.
Early games for the IBM PC or DOS did use vector graphics at some point, like games did on the C64 or Apple II.

Examples:
Sierra On-Line Hi-Res Adventures (#2: Adventure in Serenia aka Wizard and the Princess)
https://www.mobygames.com/game/1761/hi-res-ad … d-the-princess/
https://www.youtube.com/watch?v=X8p7kIxwmVc

As well as:
Hi-Res Adventure #0: Mission Asteroid
Hi-Res Adventure #1: Mystery House
Hi-Res Adventure #3: Cranston Manor
Hi-Res Adventure #4: Ulysses and the Golden Fleece
Hi-Res Adventure #6: The Dark Crystal

Lane Mastodon vs. the Blubbermen
https://www.mobygames.com/game/1688/lane-mast … the-blubbermen/
https://www.youtube.com/watch?v=wptDEDeEsCI

As well as:
Gamma Force in Pit of a Thousand Screams
https://www.mobygames.com/game/1693/gamma-for … ousand-screams/

ZorkQuest: The Crystal of Doom
https://www.mobygames.com/game/1699/zorkquest … rystal-of-doom/

ZorkQuest: Assault on Egreth Castle
https://www.mobygames.com/game/1694/zorkquest … -egreth-castle/

Oo-Topos
https://www.mobygames.com/game/148433/oo-topos/
https://www.youtube.com/watch?v=E1H2_tztiCQ

Winnie the Pooh in the Hundred Acre Wood
https://www.mobygames.com/game/7274/winnie-th … dred-acre-wood/
https://en.wikipedia.org/wiki/Winnie_the_Pooh … ndred_Acre_Wood
https://www.youtube.com/watch?v=uQZzp_gLLMs

As well as:
Mickey's Space Adventure
Donald Duck's Playground
Troll's Tale (name of the games engine, too)

I believe this is just the tip of the iceberg, too. Unfortunately, I'm not a great video game player. There's so much I haven't found out yet. 😅

To my understanding, a 2D or 3D accelerator is all about graphics primitives, not high-bandwidth transfers into the frame buffer. That's what a blitter is for (in my opinion).
It took a while until even Windows 3.x got blitting right. WinG - and DCI in particular - do allow frame buffer access, e.g. for video overlay.
That's why there was such a fuss about DCI-capable drivers for a while, I think, back when Video CD/CD-i and QuickTime were a thing. Windows 95 replaced DCI (DCI32 never really made it, AFAIK).
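
From what I remember of the WinG SDK, the application side looked roughly like this (untested sketch, function names as I recall them): the game gets a pointer to DIB memory it can write into directly, and WinGBitBlt() hands the finished frame to the display driver.

/* WinG sketch (Win16/Win32s era, <wing.h>): draw into your own DIB, then
 * let WinG push it to the screen.  Error handling and the message loop
 * are omitted; treat the details as approximate, not gospel. */
#include <windows.h>
#include <wing.h>

static struct {
    BITMAPINFOHEADER hdr;
    RGBQUAD          pal[256];
} fmt;

static HDC     wingDC;
static HBITMAP backBuf;
static void   *pixels;        /* the buffer the game writes pixels into */

BOOL init_wing(int w, int h)
{
    wingDC = WinGCreateDC();
    if (!wingDC)
        return FALSE;

    /* ask WinG which 8-bit DIB orientation the display driver blits fastest */
    WinGRecommendDIBFormat((BITMAPINFO *)&fmt);
    fmt.hdr.biWidth  = w;
    fmt.hdr.biHeight = (fmt.hdr.biHeight < 0) ? -h : h;   /* keep orientation */

    backBuf = WinGCreateBitmap(wingDC, (BITMAPINFO *)&fmt, &pixels);
    if (!backBuf)
        return FALSE;
    SelectObject(wingDC, backBuf);
    return TRUE;
}

void present_frame(HDC hdcWindow, int w, int h)
{
    /* ...game code has filled 'pixels' by now... */
    WinGBitBlt(hdcWindow, 0, 0, w, h, wingDC, 0, 0);
}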

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//