VOGONS


Your opinion on the ATi Rage IIC and Rage Pro


Reply 80 of 116, by rasz_pl

User metadata
Rank l33t
Putas wrote on 2023-04-10, 04:50:
rasz_pl wrote on 2023-04-08, 23:20:

Only two choices will allow me to play GLQuake >30fps out of the box. Only two will give me working opengl driver.

Voodoo Graphics isn't that fast. And eventually you get a driver making Rage Pro faster in the game.

around 2000 with very fast CPU when second hand TNT/TNT2m64 was $30 and Voodoo2 went for $50

Putas wrote on 2023-04-10, 04:50:

And we should remember people use graphics cards for more than 3d games.

how would $120-200 Rage "pro" improve my non gaming life above $45 Diamond Stealth 3D 2000 in 1997?

Putas wrote on 2023-04-10, 04:50:
rasz_pl wrote on 2023-04-08, 23:20:

Permedia 2 and Verite V1000 delivered speed, but software support wasnt there to even consider.

That does not even make sense.

Verite practically went out of business in 1998 and won't run some games due to driver issues, for example Unreal. In 1997 you still occasionally played DOS games like Comanche 3/Duke Nukem 3D/Redneck Rampage/Elder Scrolls Battlespire etc., and the Verite is notorious for slow VGA.

Permedia 2 was terribly CPU bottlenecked, unusable on a period-correct CPU, with bad drivers and especially slow OpenGL. Juggling driver versions to try to fix missing textures/effects, or to get one game running faster while another stopped working, was the life of a Permedia 2 user. Typical experience: https://www.youtube.com/watch?v=QagArL3FKoo

Back then the only vendor that let you buy a card, install it and play games without worry was 3dfx, with stable and optimized Glide. All the others meant driver roulette and suffering through glitches (broken transparency, missing textures, catastrophic fps drops). Even the best of the rest in 1997, Nvidia, was pretty bad with compatibility and picture quality.

Open Source AT&T Globalyst/NCR/FIC 486-GAC-2 proprietary Cache Module reproduction

Reply 81 of 116, by 386SX

User metadata
Rank l33t

I suppose the Rage Pro, at least the AGP version, would have been a good choice at those prices, but the drivers made the difference too late considering how many were released. What surprised me about the Rage Pro '3D Turbo' (whatever version) was the SGRAM PCI layout with the older chip package: it was much slower than the AGP one even in a fast CPU config, to the point it would feel like any Rage IIC AGP, and the much-talked-about MPEG2 acceleration doesn't even help that version as much as the AGP one (PCI bandwidth can't be the reason for that). I imagine something was different in the older package version of the chip, or maybe it wasn't totally PCI oriented. With better drivers in its time the AGP version wouldn't have been a bad solution; it just needed lighter, better drivers soon, instead of waiting years after 2000 to get almost there.

Reply 82 of 116, by Putas

User metadata
Rank Oldbie
rasz_pl wrote on 2023-04-10, 05:53:
Putas wrote on 2023-04-10, 04:50:
rasz_pl wrote on 2023-04-08, 23:20:

Only two choices will allow me to play GLQuake >30fps out of the box. Only two will give me working opengl driver.

Voodoo Graphics isn't that fast. And eventually you get a driver making Rage Pro faster in the game.

around 2000 with very fast CPU when second hand TNT/TNT2m64 was $30 and Voodoo2 went for $50

Are you saying that because it was only in 2000 when it got that fast? Do you happen to know with which driver?

rasz_pl wrote on 2023-04-08, 23:20:
Putas wrote on 2023-04-10, 04:50:

And we should remember people use graphics cards for more than 3d games.

how would $120-200 Rage "pro" improve my non gaming life above $45 Diamond Stealth 3D 2000 in 1997?

Why would you ask something that specific?

rasz_pl wrote on 2023-04-10, 05:53:
Putas wrote on 2023-04-10, 04:50:
rasz_pl wrote on 2023-04-08, 23:20:

Permedia 2 and Verite V1000 delivered speed, but software support wasnt there to even consider.

That does not even make sense.

Permedia 2 had terrible cpu bottlenecked, unusable on period correct cpu, drivers and especially slow OpenGL.

Sounds like it did not deliver much speed. What "speed" you are talking about in the case of the V1000 is beyond me.

Reply 83 of 116, by rasz_pl

User metadata
Rank l33t
Putas wrote on 2023-04-10, 07:32:
rasz_pl wrote on 2023-04-10, 05:53:
Putas wrote on 2023-04-10, 04:50:

Voodoo Graphics isn't that fast. And eventually you get a driver making Rage Pro faster in the game.

around 2000 with very fast CPU when second hand TNT/TNT2m64 was $30 and Voodoo2 went for $50

Are you saying that because it was only in 2000 when it got that fast? Do you happen to know with which driver?

I'm not the one saying it got fast 😀 Looking at videos of people testing with fast CPUs and the best drivers, it only just got OK in its last driver iterations:
Re: "ATIQuake" miniGL for Q1 & Q2 on Rage Pro
Re: Proprietary 3D API's

Putas wrote on 2023-04-10, 07:32:
rasz_pl wrote on 2023-04-08, 23:20:
Putas wrote on 2023-04-10, 04:50:

And we should remember people use graphics cards for more than 3d games.

how would $120-200 Rage "pro" improve my non gaming life above $45 Diamond Stealth 3D 2000 in 1997?

Why would you ask something that specific?

Because that was my original argument: ATI was scamming people by positioning the Rage PRO as some sort of high-end 2D/3D accelerator while in fact delivering $45 worth of 2D and another $50 worth of 3D value, all packaged into $200 cards.

Putas wrote on 2023-04-10, 07:32:

Sounds like it did not deliver much speed. What "speed" do you talk about in case of V1000 is beyond me.

Playing Quake at 25 fps in 320x200 on a Pentium 100, then going to 640x480 vQuake at >20 fps. That's 1997 $775 Pentium II 266MHz software-rendering territory, while the V1000 was $175. If you happened to have an almost top-of-the-line Pentium 200 MMX, vQuake would run at 30 fps in 640x480, which is $1981 Pentium II 300MHz software-rendering speed. For comparison, with the Rage PRO you would need that 300MHz Pentium II plus a 2002 driver to get the same framerate in GLQuake.

Last edited by rasz_pl on 2023-04-10, 22:03. Edited 2 times in total.


Reply 84 of 116, by Garrett W

User metadata
Rank Oldbie

Rage II and Rage Pro had a few things going for them:

a) 2D was faster than Virge 325 and Virge DX. That being said, for most people, Virge 325's 2D performance was more than enough.

b) Image Quality on Rage cards was pretty good as, much like Matrox, ATi made the cards themselves instead of selling the chips to board partners at the time. S3 Virge cards can be very noisy, it really makes a huge difference. It's been a while since I last used a Diamond Stealth 2000 so I can't really comment on how good the image quality was on that. Probably more than adequate though.

c) 3D was faster on Rage II and much faster on Rage Pro compared to Virge 325 and other Virge variants. Rage II might actually not be faster than Virge DX, though. Drivers aside, Rage Pro can actually be faster than even Voodoo Graphics; OpenGL is a mess, though, and was only fixed very late, in the 2000s.

d) Rage II DVD and beyond offer full DVD decoding AFAIR. This was a massive selling point in ~1997-1998. No need for decoder cards that take up an extra slot (and cost a lot too!).

e) Depending on the model you chose, Rage cards offered interesting and useful Video-In and Video-Out. This was pretty huge at the time.

I might have to dig up some magazines of my own, in my country the Rage cards were not that expensive, although Virge will always reign supreme in value. You really couldn't beat Virge 2MB (or 4MB) paired with a Voodoo Graphics for gaming in mid 97!

Reply 85 of 116, by rasz_pl

User metadata
Rank l33t
Garrett W wrote on 2023-04-10, 09:27:

a) 2D was faster than Virge 325 and Virge DX. That being said, for most people, Virge 325's 2D performance was more than enough.

Now that I think about it, are there any DirectDraw game benchmarks around? Were they ever needed?
There is this http://www.roylongbottom.org.uk/directdraw%20results.htm showing a 2-3x difference between some random Rage in a laptop and a Virge.
Hmm on the other hand I seem to remember someone showing 486 with Mach64 playing Starcraft no problem.
486SX-25 ET6000 is almost there on Win2000 😮 https://www.youtube.com/watch?v=y8P79hxLS7o
while 486DX2 66MHz Cirrus logic 5434 struggles hard https://www.youtube.com/watch?v=0dWwehM6MfA
UMC 486 33MHz + Trio64V2 does much better https://www.youtube.com/watch?v=4o5AJPdv0rs
and Pentium 133 with completely unaccelerated Trident 8900 ISA runs great https://www.youtube.com/watch?v=gztyxkkGNiE
P166MMX + Trio64 perfect https://www.youtube.com/watch?v=xdSkYoBd-TA

The question is: was there a reason for stellar 2D acceleration if a DirectDraw game from 1998 ran perfectly fine on 4-year-old computers when it came out?

Garrett W wrote on 2023-04-10, 09:27:

b) Image Quality on Rage cards was pretty good as, much like Matrox, ATi made the cards themselves instead of selling the chips to board partners at the time. S3 Virge cards can be very noisy, it really makes a huge difference. It's been a while since I last used a Diamond Stealth 2000 so I can't really comment on how good the image quality was on that. Probably more than adequate though.

I specifically picked Diamond from the list because it was one of very few brands actually putting effort into video output quality even on their cheap cards 😀

Garrett W wrote on 2023-04-10, 09:27:

d) Rage II DVD and beyond offer full DVD decoding AFAIR. This was a massive selling point in ~1997-1998. No need for decoder cards that take up an extra slot (and cost a lot too!).

2+DVD and PRO only do Motion Compensation, which in the case of MPEG2 means copying around 16×16 blocks of pixels. iDCT was introduced in the Rage 128. Probably all of them have hardware color conversion and scaling support. Interestingly enough, according to Re: DVD accelerators compared, the 2+DVD wasn't so hot even when paired with a P2 233MHz. That's around a year of real utility out of that feature, considering that by August 1998 you could buy a $100 Celeron 300 and overclock it to 450MHz for perfect, purely software DVD playback.
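The Motion Compensation stage described above really is just block copying plus a residual add, which is why it offloads so little. A toy sketch in Python/NumPy (hypothetical names; one luma plane, full-pel vectors assumed to stay in-bounds, whereas real MPEG-2 also does half-pel interpolation and field/frame prediction modes):

```python
import numpy as np

BLOCK = 16  # MPEG-2 macroblock size

def motion_compensate(ref, mvs, residual):
    """Rebuild a frame by copying 16x16 blocks out of a reference
    frame, then adding the decoded residual.

    ref      -- previous frame, 2-D uint8 array (one plane)
    mvs      -- {(block_row, block_col): (dy, dx)} motion vectors;
                missing entries mean "no motion"
    residual -- decoded difference data, same shape as ref
    """
    h, w = ref.shape
    out = np.empty_like(ref, dtype=np.int16)
    for by in range(0, h, BLOCK):
        for bx in range(0, w, BLOCK):
            dy, dx = mvs.get((by // BLOCK, bx // BLOCK), (0, 0))
            sy, sx = by + dy, bx + dx  # source position in ref
            out[by:by + BLOCK, bx:bx + BLOCK] = ref[sy:sy + BLOCK, sx:sx + BLOCK]
    out += residual.astype(np.int16)
    return np.clip(out, 0, 255).astype(np.uint8)
```

Even this toy version shows why MC alone barely helped: the expensive parts of MPEG-2 decoding (iDCT, bitstream parsing) stay on the CPU.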


Reply 86 of 116, by 386SX

User metadata
Rank l33t

As said, the various DVD-oriented Rage II variants only did Motion Compensation, and that wasn't really useful until the Celeron 300 and above. iDCT was also there in some Rage Mobility versions and in the early Rage 128 design. But on the CPU side the software player engine made a big difference: some older engine versions could reach full frame rate, but not with great quality, while better engines got full quality with the help of SSE instructions and a better FPU. So I would consider the early Pentium III to be the right CPU for software decoding at a stable frame rate.

Reply 87 of 116, by Garrett W

User metadata
Rank Oldbie

Ah, I thought it could do iDCT. That certainly makes for a much more limited usage, although running video full-screen + scaling/resizing however you wished without the CPU breaking a sweat was a big deal even on Pentiums believe it or not. I believe it was Scali that mentioned the upgrade going from a Trio64V2+ to a Mystique on his Pentium 133 at the time. I will link to his post if I can find it later in the day.

As for the other questions:

While those 486 and Trident Starcraft videos are fairly impressive (UMC 486 was such a killer CPU, too bad Intel fought them like crazy), even the Pentium one with the Trident is slow. This is hardly ideal footage, what with the intense close-up angle and grainy quality, but you can tell every time the screen moves with just a few units that it struggles. And this is just the tutorial level, things will only get (much) worse on later levels.
Impressive for what it is, sure, but I don't see how it factors into the discussion.
Both Virge and Rage II will run Starcraft (and other Blizzard games of that vintage, Diablo included) just fine. But as Putas implied in a previous post, the discussion can and should involve other things besides video games. There are cases where a faster 2D card will make a difference.

Image Quality wise, again, I haven't seen a Stealth 2000 in action in years so I cannot really comment. Diamond weren't exactly a quality brand, though they definitely captured the zeitgeist of the mid to late 90s PCs. My experience has been that there are Virge implementations that are blurry even at 800x600 and 1024x768 is often very blurry. Not to mention colors can be messed up. There are quite a few threads around here regarding this issue.
My guess is that Stealth 2000 is probably fine at up to 800x600. Considering the monitors most people had at the time, that's all they'd ever need honestly, but we have to take into account power users or professionals or offices that required better image quality and at higher resolutions. ATi cards were pretty good in that regard, Matrox being the golden standard. That being said, it all became somewhat moot in just a couple of years with Banshee, Riva 128, S3 Savage 3D offering fantastic image quality.

As for the example with the overclocked Celeron 300A, sure? I don't think most people interested in watching DVDs on their PCs knew of this possibility or were willing to take the risk of overclocking. I honestly don't think this was a particularly common use-case.

Reply 88 of 116, by Joseph_Joestar

User metadata
Rank l33t
Garrett W wrote on 2023-04-10, 12:42:

Image Quality wise, again, I haven't seen a Stealth 2000 in action in years so I cannot really comment. Diamond weren't exactly a quality brand, though they definitely captured the zeitgeist of the mid to late 90s PCs.

In terms of image quality, Diamond was pretty decent, though not quite as good as Elsa, STB or Hercules.

That said, all of those were leagues above generic "no-name" Virge cards.

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Athlon64 3400+ / Asus K8V-MX / 5900XT / Audigy2
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi

Reply 89 of 116, by BitWrangler

User metadata
Rank l33t++

Drivers, drivers, drivers.... you know who escapes all the blame in this? Microsoft. They were the problem for everyone except 3DFX, who managed to get their own independent API well established in the market... (although nobody buys a Voodoo for DX compatibility either). Apparently what happened to Rage was that the hardware was tuned to implement every feature of DirectX in harmony with the API... then Microsoft changed it. This meant ATI and others in development, who had also relied on MS's right hand knowing what its left hand was doing, were left having to make an ugly software shim of a driver to kludge old-API hardware onto the new API; and since drivers were going to be "cake" if the hardware lined up feature for feature with the software, they probably didn't have the talent on hand to do this perfectly or quickly. I believe this also soured relations between Microsoft and many 3D companies, because by the time they brandished lawyers at each other, MS was probably saying "Fine! No more advance info, the product is the product when it releases!"

I think this then added some months to hardware still in development, which is why the "flood" looked too late and too little when other 3D chips launched. Nvidia on the other hand, had launched NV1 early with a different design philosophy. This put them out of phase with DX development anyway, so they were not thrown off by MS being assclowns and maybe had not soured relations by threatening lawyers, thus were able to swerve NV2 into synch with DX while everyone else was still thrashing around.

OpenGL on the other hand was a game of chicken...

An API designed by committee... yay 😒 ... it is/was huge, with many features, covering most 3D graphics use cases, so it was entirely too cumbersome for gaming in its entirety. The game of chicken, then, was picking the features that would be used in future games and focusing on making those fast, while not implementing anything that was going to cost you silicon and heat and maybe not be used anyway. So gaming 3D boards were in wait-and-see mode, afraid to make too bold a move and end up with an unused feature that slowed the board down relative to competitors. Meanwhile professional 3D chips could do it all, or at least most of it, and were stuck with ponderous performance due to having to get the whole herd of elephants to dance the tango. 3DFX was established enough, and didn't care enough thanks to having Glide to fall back on (which was very GL-ish anyway), to worry about making a wrong move, so they could just go "well, here's our miniGL subset, have fun"... so that ended up being the core subset of the standard everyone else stuck close to, but not necessarily how other cards would be best used.

Anyway, I think the success of Glide, the duplicitousness of Microsoft and the scary behemoth of OpenGL caused other 3D companies to devote more resources than ultimately looked wise to maintaining and developing their own internal APIs, as a hedge, resources which might have been redirected to improving Windows DX drivers. But at the time it probably looked like that, too, could be a fruitless path where success was not guaranteed. Looking back it's easy to think that everything happened "on rails", bringing us to the present day, "obviously DirectX was the way", but nothing ever happens like that. There were still hardcore DOS gamers in 1998 who thought DX was a steaming pile that would never get anywhere (it was kinda 3DFX's core market).

edit: Typo above: NV2 was kinda "more of the same" as NV1, aimed at different markets; NV3 is where they swung back into PC 3D with the Riva 128.

Last edited by BitWrangler on 2023-04-11, 01:35. Edited 1 time in total.

Unicorn herding operations are proceeding, but all the totes of hens teeth and barrels of rocking horse poop give them plenty of hiding spots.

Reply 90 of 116, by 386SX

User metadata
Rank l33t
Garrett W wrote on 2023-04-10, 12:42:

Ah, I thought it could do iDCT. That certainly makes for a much more limited usage, although running video full-screen + scaling/resizing however you wished without the CPU breaking a sweat was a big deal even on Pentiums believe it or not. I believe it was Scali that mentioned the upgrade going from a Trio64V2+ to a Mystique on his Pentium 133 at the time. I will link to his post if I can find it later in the day.

As for the other questions:

While those 486 and Trident Starcraft videos are fairly impressive (UMC 486 was such a killer CPU, too bad Intel fought them like crazy), even the Pentium one with the Trident is slow. This is hardly ideal footage, what with the intense close-up angle and grainy quality, but you can tell every time the screen moves with just a few units that it struggles. And this is just the tutorial level, things will only get (much) worse on later levels.
Impressive for what it is, sure, but I don't see how it factors into the discussion.
Both Virge and Rage II will run Starcraft (and other Blizzard games of that vintage, Diablo included) just fine. But as Putas implied in a previous post, the discussion can and should involve other things besides video games. There are cases where a faster 2D card will make a difference.

Image Quality wise, again, I haven't seen a Stealth 2000 in action in years so I cannot really comment. Diamond weren't exactly a quality brand, though they definitely captured the zeitgeist of the mid to late 90s PCs. My experience has been that there are Virge implementations that are blurry even at 800x600 and 1024x768 is often very blurry. Not to mention colors can be messed up. There are quite a few threads around here regarding this issue.
My guess is that Stealth 2000 is probably fine at up to 800x600. Considering the monitors most people had at the time, that's all they'd ever need honestly, but we have to take into account power users or professionals or offices that required better image quality and at higher resolutions. ATi cards were pretty good in that regard, Matrox being the golden standard. That being said, it all became somewhat moot in just a couple of years with Banshee, Riva 128, S3 Savage 3D offering fantastic image quality.

As for the example with the overclocked Celeron 300A, sure? I don't think most people interested in watching DVDs on their PCs had knowledge of this possibility or were willing to take the risk of overclocking. I honestly don't think this was a measurably high use-case.

It'd be interesting to know whether those S3 cards got worse with time/capacitors or simply were not great in their time either and just got much worse with LCD monitors, but indeed ATi cards (even old ISA ones) had higher PCB quality and better VGA output, while CRT monitors compensated for a lot of the usual VGA problems. My fastest GD5429 ISA card can't match the quality of the much older (MUCH better built) ATi 28800-5 ISA card, even though the former is much newer and a fast 2D accelerator for the Windows GUI. Same thing with the Rage Pro PCI/AGP cards compared to other cards, and even better with the Rage 128 Pro and above. Maybe not at the level of Matrox VGA output quality, but close.

On the CPU side, I suppose some people who later had a cheap config with that CPU might have tried to get both games and MPEG2 software players to be a bit faster. Early software players could even work on the last Pentium MMX CPUs, but could not reach full frame rate even at 99% CPU usage with heavy software tweaks and zero multitasking possible. Later software player engines improved the final image quality a lot, but required higher CPU/FPU speed to keep a multitasking environment running during real-time MPEG2 decoding.

Reply 91 of 116, by Putas

User metadata
Rank Oldbie
rasz_pl wrote on 2023-04-10, 08:54:
Putas wrote on 2023-04-10, 07:32:
rasz_pl wrote on 2023-04-10, 05:53:

around 2000 with very fast CPU when second hand TNT/TNT2m64 was $30 and Voodoo2 went for $50

Are you saying that because it was only in 2000 when it got that fast? Do you happen to know with which driver?

Im not the one saying it got fast 😀 looking at videos of people testing with fast CPUs and best drivers it only just got ok in its last driver iterations
Re: "ATIQuake" miniGL for Q1 & Q2 on Rage Pro
Re: Proprietary 3D API's

Looks like they were beating Voodoo Graphics in the game since the summer of 1998. Again, I don't know why you are talking about the year 2000.

rasz_pl wrote on 2023-04-10, 05:53:
Putas wrote on 2023-04-10, 07:32:
rasz_pl wrote on 2023-04-08, 23:20:

how would $120-200 Rage "pro" improve my non gaming life above $45 Diamond Stealth 3D 2000 in 1997?

Why would you ask something that specific?

Because that was my original argument - ATI was scamming people positioning Rage PRO as some sort of high end 2D/3D accelerator when in fact delivering $45 worth of 2D with another $50 worth of 3D value, all packaged into $200 cards.

While the price is hard to justify, a scam is an activity of a very different nature. And it was high-end performance for the time.

rasz_pl wrote on 2023-04-10, 05:53:
Putas wrote on 2023-04-10, 07:32:

Sounds like it did not deliver much speed. What "speed" do you talk about in case of V1000 is beyond me.

Playing Quake at 25 fps in 320x200 on Pentium 100, then going to 640x480 vQuake at >20fps. Thats a 1997 $775 Pentium 2 266MHz software rendering speed territory while V1000 was $175. If you happened to have almost top of the line Pentium 200 MMX vQuake would run at 30fps in 640x480, a $1981 Pentium 2 300MHz software rendering speed. For comparison with Rage PRO you will need that 300MHz Pentium 2 + 2002 driver to get same framerate in glQuake.

But VQuake uses monochromatic lighting. And it is not that fast for me, not even with a stronger CPU and a V1000L. On equal terms, V1000 framerates are around half of the Rage Pro's.

Reply 92 of 116, by rasz_pl

User metadata
Rank l33t
Garrett W wrote on 2023-04-10, 12:42:

While those 486 and Trident Starcraft videos are fairly impressive
even the Pentium one with the Trident is slow.

For 640x480 on a 1995 133MHz CPU with unaccelerated ISA SVGA, rendered purely in software, it's quite surprising 😮 I included it to show how little 2D acceleration brought to the table in 1997 with 200MHz CPUs.

Garrett W wrote on 2023-04-10, 12:42:

Both Virge and Rage II will run Starcraft (and other Blizzard games of that vintage, Diablo included) just fine. But as Putas implied in a previous post, the discussion can and should involve other things besides video games. There are cases where a faster 2D card will make a difference.

I'm very gaming oriented and have a hard time coming up with other heavy uses of 2D acceleration 😀 For example, pre-2000 I did some PCB design in DOS Autotrax/Windows Protel (before it became Altium) and don't remember ever noticing any difference between computers other than RAM/MHz. Maybe for something like AutoCAD, with its first grown-up 32-bit Windows version in 1997? But CPUs were already fast enough for anything an amateur would be working on, while professionals were probably still sitting on the last Unix workstations with their Catia/CADKEY.

Garrett W wrote on 2023-04-10, 12:42:

Image Quality

Medical imaging: viewing high-res X-ray scans requires high resolution, high color depth and high fidelity, but I don't know if that field was established pre-2000.

BitWrangler wrote on 2023-04-10, 15:01:

Drivers, drivers, drivers.... you know who escapes all the blame in this? Microsoft. They were the problem for everyone except 3DFX who managed to get their own independent API well established in the market... (Although, nobody buys a voodoo for DX compatibility either) Apparently what happened to Rage was that the hardware was tuned to implement every feature of DirectX in harmony with the API... then Microsoft changed it. This then meant ATI and others in development who had also relied on MS's right hand knowing what it's left hand was doing, were left with having to make an ugly software shim of a driver to kludge old API hardware to new API... and since drivers were going to be "cake" if the hardware lined up feature for feature with the software, probably didn't have the talent on hand to do this perfectly or quickly. This I believe also soured relations between Microsoft and many 3D companies, because by the time they brandished lawyers at each other, MS was probably saying "Fine! No more advance info, the product is the product when it releases!"

MS screwed around with stupid ideas like "retained mode" in DX2-3 and the hilariously bad https://en.wikipedia.org/wiki/Microsoft_Talisman.
I seem to remember that for DirectX 8.1 Microsoft pretty much codified the ATI Radeon 8500 R200 feature set verbatim into the API, or maybe it was DX9 and the R300. At that point MS gave up on dictating the future of the API.

BitWrangler wrote on 2023-04-10, 15:01:

OpenGL on the other hand was a game of chicken...
An API designed by committee... yay 😒

Pre DirectX 5-6, every vendor wished DirectX would work just like OpenGL; at least there everything was documented and somewhat logical.

BitWrangler wrote on 2023-04-10, 15:01:

So gaming 3D boards were in wait and see mode, afraid to make too bold a move and end up with an unused feature that slowed the board down relative to competitors.

There are fancy features, and then there are basics like blending modes or transparency, yet most vendors couldn't be bothered to implement even those 😀 Rage II, Matrox and Virge all missed the absolute minimum needed to properly render simple games 🙁
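For reference, the "absolute minimum" blending being talked about is just the classic alpha "over" equation; chips that lacked it drew transparent surfaces opaque or stippled. A one-line sketch (plain Python, hypothetical helper name):

```python
def alpha_blend(src, dst, alpha):
    """Blend source over destination per channel: out = src*a + dst*(1-a)."""
    return tuple(round(s * alpha + d * (1.0 - alpha)) for s, d in zip(src, dst))

# 50% red "glass" over a blue background gives purple:
# alpha_blend((255, 0, 0), (0, 0, 255), 0.5) -> (128, 0, 128)
```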

BitWrangler wrote on 2023-04-10, 15:01:

Meanwhile professional 3D chips could do it all, or at least most of it, and were stuck with ponderous performance due to having to get the whole herd of elephants to dance the tango.

3Dlabs looking down at the peasants from their ivory tower: Permedia/Glint/Oxygen with everything implemented by the book at 32-bit precision and barely any speed.

Putas wrote on 2023-04-10, 15:49:

Looks like they were beating Voodoo Graphics in the game since summer of 1998. Again, I don't know why are you talking about year 2000.

The $200 Rage Pro could barely match the Verite V1000 in the middle of 1998, IF you paired it with a top-of-the-line $700 CPU.
Voodoo1 is on another level: same framerate at 512x384 paired with a Pentium 120 https://www.soldcentralfl.com/quakecoop/compare1.htm
or you could overclock it a bit and get 57fps at 640x480 with a $300 K6-2? http://www.3dgw.com/Articles/v1tweak/index.htm
We end up with:
ATI 19fps with $700 CPU in May 1998
ATI 27fps with $700 CPU possible only in June 1998
Verite ~22fps with obsolete Pentium 100 in December 1996 the day vQuake beta is released
Verite v2200 ($129) 30fps with $550 Pentium 200 MMX in August 1997 https://www.oocities.org/timessquare/fortress … 191/vquake.html https://www.anandtech.com/show/26/3
3dfx >40fps with $400 Pentium 166MMX in March 1997 the day GLQuake is released
3dfx >50fps with $300 K6-2 in March 1998

Putas wrote on 2023-04-10, 07:32:

While the price is hard to justify, scam is an activity of very different nature. And it was high-end performance for the time.

'a fraudulent or deceptive act or operation'. "3D", "PRO", "Turbo". Definitely not high-end 3D performance.


Reply 93 of 116, by Putas

User metadata
Rank Oldbie
rasz_pl wrote on 2023-04-11, 00:27:

Putas wrote on 2023-04-10, 15:49:

Looks like they were beating Voodoo Graphics in the game since summer of 1998. Again, I don't know why are you talking about year 2000.

$200 Rage Pro could barely match Verite V1000 in middle of 1998 IF you paired it with top of the line $700 CPU
Voodoo1 is on another level. Same framerate in 512x384 paired with Pentium 120 https://www.soldcentralfl.com/quakecoop/compare1.htm
or you could overclock it a bit and get 640x480 57fps with $300 K6-2? http://www.3dgw.com/Articles/v1tweak/index.htm
We end up with:
ATI 19fps with $700 CPU in May 1998
ATI 27fps with $700 CPU possible only in June 1998
Verite ~22fps with obsolete Pentium 100 in December 1996 the day vQuake beta is released
Verite v2200 ($129) 30fps with $550 Pentium 200 MMX in August 1997 https://www.oocities.org/timessquare/fortress … 191/vquake.html https://www.anandtech.com/show/26/3
3dfx >40fps with $400 Pentium 166MMX in March 1997 the day GLQuake is released
3dfx >50fps with $300 K6-2 in March 1998

Putas wrote on 2023-04-10, 07:32:

While the price is hard to justify, scam is an activity of very different nature. And it was high-end performance for the time.

'a fraudulent or deceptive act or operation'. "3D", "PRO", "Turbo". Definitely not high end 3d performance.

You are again comparing apples and oranges. Kind of fraudulent behaviour.

Reply 94 of 116, by rasz_pl

User metadata
Rank l33t
Putas wrote on 2023-04-11, 04:15:

You are again comparing apples and oranges. Kind of fraudulent behaviour.

I am comparing Quake, released by id Software. Did Carmack scam us all?


Reply 97 of 116, by rasz_pl

User metadata
Rank l33t
Putas wrote on 2023-04-11, 09:56:

And of course any game should be rendered in a similar way to judge graphics hardware performance.

In that case it's 0 Quake fps for ATI for a whole year, as there was no OpenGL at all before the ATIQuake miniGL released in June 1998 😀. The 1997 DirectX driver is also almost 2x slower.
Rage Pro might look OK now with drivers from 2002, but back in 1997-98 it wasn't even considered an option when benchmarking gaming cards.


Reply 98 of 116, by Meatball

User metadata
Rank Oldbie

Here is an interesting battle-royale review from January 1998 covering most 3D cards of the time, including CPU scaling, image quality, D3D, Quake 1 & 2, etc.

https://www.tomshardware.com/reviews/3d-accel … ew-step,51.html

Reply 99 of 116, by Putas

User metadata
Rank Oldbie
rasz_pl wrote on 2023-04-11, 12:35:
Putas wrote on 2023-04-11, 09:56:

And of course any game should be rendered in a similar way to judge graphics hardware performance.

In that case it's 0 Quake fps for ATI for a whole year, as there was no OpenGL at all before the ATIQuake miniGL released in June 1998 😀. The 1997 DirectX driver is also almost 2x slower.
Rage Pro might look OK now with drivers from 2002, but back in 1997-98 it wasn't even considered an option when benchmarking gaming cards.

But those are statements about user experience, not about the performance of the hardware.
Of course Rendition and 3dfx had a head start with Quake; they helped develop their ports. Otherwise OpenGL would not have been used for games, and that's why it took the other companies off guard and why it is silly to judge them by that game.

AtiQuake came earlier and is probably the slowest way to play the game.

What makes you think the DirectX driver was 2x slower?