VOGONS


Appian Graphics Gemini PCI - S3 Savage MX 16MB


First post, by Ozzuneoj

Rank: l33t

Recently I was given a bunch of older PC hardware, and this card was one of the very few internal components in the lot. It's a bit of an odd-looking card, and I assumed right off the bat that it was some kind of Permedia or other pro/workstation type card. I think I was getting Appian confused with AccelGraphics, to be honest. With so little info out there about this card, I figured I'd do a short write-up.

The attachment 20210111_212836 (1600x1200).jpg is no longer available

Today I was testing some cards and decided to throw this one in. It took a bit of digging online but I was able to locate some drivers for the "Appian Gemini PCI". The driver packages posted on DriverGuide were a bit oddly named, so I downloaded both. The filename of one seems to suggest Windows 2000/NT support, whereas the other seems to be for 9x. I'm not 100% sure this is accurate, however, so if you feel like experimenting, try either. I renamed the files to identify them a bit better, and I've attached them to this post in case anyone wants to upload them to the Vogons Driver Library. The 9x driver worked fine in Windows 98SE... but with some caveats.

Normally I opt for drivers from chipset makers for cards from this time period, since they tended to support them longer than the card makers did. There isn't a lot of info online about this card, but I found this wiki page which stated that it, surprisingly, used a Savage MX with 16MB of SGRAM. The Savage MX/IX driver package available here worked fine... also with some caveats.

Appian Gemini Driver: I couldn't get games to work in hardware accelerated mode (tested the JK: Mysteries of the Sith demo from a 1998 PC Gamer CD), and even in software mode things were glitchy and seemed prone to crashing. It seemed to work okay on the desktop with the screen extended to a second monitor, though I did get a discolored "ghost" line on the primary display, as if it were a shadow of the other display. When dragging windows from one display to the other there is a slowdown as the window is drawn on both screens. Definitely not a perfect multi-monitor setup. I also had to edit the registry to reveal 1280x1024x32, or else the highest it would go was 1280x1024x16. Once edited, it worked as expected, but I did experience one crash while opening a window. For some reason, this driver adds no settings to the Windows 9x display properties. Overall, I don't recommend using these, but they do seem to basically work for desktop stuff.

S3 Savage MX (Final) Driver: Multi-monitor support seemed to be broken. If you try it, make sure to hit Esc right away if it doesn't work... I waited too long and then had to blindly click and drag the display properties back from a blank monitor to the secondary to get it back. Rebooting didn't even fix it... oops! Most likely Appian used their fancy HydraVision tech (which ATi later used after buying them out) to make multi-monitor work for this card, and the generic drivers don't like it. Thankfully, games seemed to work much better with these drivers. Performance-wise, don't expect a lot from later games (1999 or newer). Descent 3 worked in D3D but performance was pretty weak at 1024x768 or 800x600 and there was a lot of chunky looking dithering. OpenGL was a slideshow, so there is probably a driver bug there. MotS ran fine, though not buttery smooth, and thanks to the large frame buffer I could play it at 1280x1024x16. Hopefully titles that use S3 MeTaL fare a bit better. Overall I wouldn't say this is a top-tier gaming card by any means, but it does seem like S3 was at least able to get Savage MX cards working on some level by 2001, when they stopped updating the drivers. 😀 One thing of note: when S3 Tweak was running, games tended to just display a black screen at startup, so I think some setting in that application isn't agreeing with the games I tried.

Interestingly, since this is a laptop video chip, it barely even gets above room temperature with no air blowing over the heatsink. I left it running Descent 3 for 30 minutes and it was only slightly warm. I'm amazed they even bothered with a heatsink at all. What a difference from other cards available at the time!

I highly recommend using the S3 driver from the Vogons page above, unless you absolutely want multi-monitor support and zero configuration settings. I'm not sure how compatible it is in a variety of 3D games, and the multi-monitor support is pretty sketchy (the top VGA port is always the primary display, and it's the only one that works before the driver is loaded), but this was my first experience with a Savage MX and overall I'd say it was positive. Obviously, being from 1999, this thing is no match for 3dfx or Nvidia cards of the time (a TNT1 or Banshee is likely both much faster and more compatible), but for a workstation-focused card running on S3's infamous drivers it does a respectable, if unimpressive, job in the 1997-1999 games I tried, has excellent 2D image quality, and runs super cool, and the S3 Tweak program (included with the S3 package above) has lots of really interesting options. Also, for what it's worth, the clock speeds listed for the Savage MX online are all blank, but on this card the S3 Tweak overclocking page showed the clock speed as 100 MHz. I didn't try overclocking it... but with such a cool running card, I wonder if it could handle a bit more juice? 😁

The attachment 20210111_212847 (1600x1200).jpg is no longer available

Also, take a look at the back of the PCB. They actually put a Gemini constellation in the PCB. I'm a sucker for screen printing and stuff like that on boards, so I think that's neat. 😁

Now for some blitting from the back buffer.

Reply 1 of 22, by The Serpent Rider

Rank: l33t++

I've tested that card in DOS and Win9x software games. More or less just regular Savage 4, but quirky.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 2 of 22, by Ozzuneoj

Rank: l33t

Wow, you know I never looked real closely at the specs of the S3 Savage series in comparison to the competition. It never occurred to me that they were only single-pipeline, single-TMU chips until the Savage 2000 came out. On top of that, they all had a 64-bit memory bus.

I had always assumed it was a driver issue that made them so bad at games compared to competitors with similar core and memory clock speeds, but I see now that they were waaaay behind on specs from the beginning.

Compare:
https://en.wikipedia.org/wiki/S3_Savage
https://en.wikipedia.org/wiki/List_of_Nvidia_ … its#Pre-GeForce

Even the original Riva TNT had a 2:2 design and 128-bit memory, which gave it a higher pixel and texel fill rate AND higher memory bandwidth than the top-tier Savage 4 Xtreme.

It's amazing they even played games at all with those specs. Most of the Savage cards are actually closer to the Riva 128 in specs (or behind it in bandwidth), and we know that nvidia's drivers were better, even then.
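The pipeline comparison above can be put in rough numbers. A minimal back-of-the-envelope sketch, assuming the commonly cited reference clocks for these chips (the figures below are assumptions for illustration, not measurements from this thread):

```python
# Theoretical fill rate for simple fixed-function GPUs:
# pixels/s = core clock x pixel pipelines; texels/s adds TMUs per pipeline.
# Clock and pipeline counts below are commonly cited reference specs
# (assumed for illustration); real-world throughput is lower.
def fill_rates(core_mhz, pipelines, tmus_per_pipeline):
    """Return (Mpixels/s, Mtexels/s)."""
    pixel_rate = core_mhz * pipelines
    texel_rate = core_mhz * pipelines * tmus_per_pipeline
    return pixel_rate, texel_rate

riva_tnt = fill_rates(90, pipelines=2, tmus_per_pipeline=1)   # 2:2 design
savage4 = fill_rates(166, pipelines=1, tmus_per_pipeline=1)   # 1:1 design

print(riva_tnt)  # (180, 180)
print(savage4)   # (166, 166)
```

Even granting the Savage 4 a much higher core clock, the TNT's second pipeline keeps it ahead on raw fill rate, which matches the point above.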

It's too bad they flubbed the Savage 2000. With steady improvements in drivers and at least matching the hardware specs of the competition, the Savage series could have been somewhat decent for at least a little longer. Though now, it's hard to imagine any of the early 3D companies lasting past the transition to DirectX 10 level hardware (8800 GTX), let alone all the way to today.


Reply 3 of 22, by Putas

Rank: Oldbie

You don't need to match the specs of the competition to be decent if your price is right.

Reply 4 of 22, by The Serpent Rider

Rank: l33t++

Savage 4/MX performance is somewhere in TNT2 M64/Vanta/Vanta LT territory, depending on clocks and available memory. They were perfectly serviceable, but S3 made practically the same mistake as 3dfx when they bought Diamond Multimedia.


Reply 5 of 22, by ragefury32

Rank: Oldbie

Oh yeah, I have the exact same card for my Igel Winterminal J+/Winnet IV thin client - except mine was purchased new old stock, complete with original packaging and driver floppies.

The attachment C8B3D580-0738-433F-94E7-CFCE8918E4DB.jpeg is no longer available

If you want to see if different drivers might help, I have a few in WinImage floppy image (FLP) format. They should work as drop-ins for FlashFloppy Goteks, or you should be able to use WinImage to recreate them on actual floppies.

The attachment A1D1E566-9836-463C-AA01-3BB930A6D7A6.jpeg is no longer available

a) Windows NT4 Driver v1.04, Disk 1 and 2:
https://retro-junk.s3.ca-central-1.amazonaws. … i/WNTv104D1.flp
https://retro-junk.s3.ca-central-1.amazonaws. … i/WNTv104D2.flp

b) Hydravision 1.02 for Windows 98:
https://retro-junk.s3.ca-central-1.amazonaws. … ision98-102.flp

c) Appian Helpfile, version 1.00:
https://retro-junk.s3.ca-central-1.amazonaws. … ian-hlp-100.flp

d) Appian Windows 98 driver, version 1.03
https://retro-junk.s3.ca-central-1.amazonaws. … i/W98Drv103.flp

Reply 6 of 22, by ragefury32

Rank: Oldbie
Ozzuneoj wrote on 2021-01-12, 07:51:
Wow, you know I never looked real closely at the specs of the S3 Savage series in comparison to the competition. It never occurr […]

The best way to look at the Savage IX is to think of it as a cooler-running, mostly bug-fixed Savage3D that draws only about 2-3 W at full blast and functions as an okay DirectX 5/6 accelerator. Great for SP G-Police, okay for Shogo, but meh for Quake 3 Arena. It also benefits significantly from FastVid MTRR write combining in DOS and has decent compatibility. The Savage mobile series is not all 64-bit, either - the 8MB variants of the IX are 128-bit... not that it mattered much - the inability to multi-texture in a single pass does cripple its performance in later DX6 and early DX7 games.

Now if you want something somewhat comparable to the Savage4 in a laptop-oriented GPU, look for a beautiful freak like the S3 SuperSavage IX. It has a Savage4 3D core (dual textures possible in the same pass) and a Savage2000 2D core, and it is found in volume only in the IBM ThinkPad T23, the Toshiba Tecra 9x00s, and a fairly rare video card for embedded applications (Integral 9400-00321).
It would be good if someone benchmarked them against the Savage4 desktop cards. The T23s and the Tecra 9x00s unfortunately make for lousy pure DOS machines - AC97-based without a hardware fallback.

As for whether S3 would’ve survived? They did, to a certain extent, but they didn’t have a large R&D effort or a disciplined QA scheme, and fell short time and again. Via milked them dry making “good enough” integrated graphics for their SoCs and starved them of R&D funds. When S3 did push out something somewhat competitive (the GammaChrome versus the Radeon 9500, or the Chrome 400 versus the GeForce 8400 GS), it was only seen as “middle of the road” and didn’t get the enthusiast interest or sales needed to pull in R&D dollars. And the GPU business is very R&D intensive - you can’t have big performance gains without big dollars.

Reply 7 of 22, by elianda

Rank: l33t

So what are actually the PCI Vendor / Device IDs of your card?
Does it also use the 86C290 chipset?

Since there was the question about benchmarks here, I did a few with the Spectrah Dynamics Inc LD-S370 Savage MX
https://retronn.de/imports/hwgal/hw_s3_savage … x_spectrah.html
See the second 3DMark2001SE image there.

Retronn.de - Vintage Hardware Gallery, Drivers, Guides, Videos. Now with file search
Youtube Channel
FTP Server - Driver Archive and more
DVI2PCIe alignment and 2D image quality measurement tool

Reply 8 of 22, by Putas

Rank: Oldbie
ragefury32 wrote on 2021-01-13, 05:39:

As for whether S3 would’ve survived? They did to a certain extent, but they didn’t have a large R&D effort and a disciplined QA scheme, and fell short time and again. Via milked them dry making “good enough” integrated graphics for their SoCs and starved them of R&D funds.

Can these claims be sourced?

Last edited by Stiletto on 2021-01-18, 00:18. Edited 1 time in total.

Reply 9 of 22, by Ozzuneoj

Rank: l33t
elianda wrote on 2021-01-17, 15:04:

Since there was the question about benchmarks here, I did a few with the Spectrah Dynamics Inc LD-S370 Savage MX
https://retronn.de/imports/hwgal/hw_s3_savage … x_spectrah.html
See the second 3DMark2001SE image there.

Thanks for the input! Your results got me curious, so I did some really quick fillrate tests of my own just now.

Using 3DMark 2000, on a PIII 850, 440BX, Windows 98SE with DX7.0a:

Appian Gemini PCI (Savage MX 16MB):
Single Texture Fillrate: 77.2 MTexels/s
Multi Texture Fillrate: 77.2 MTexels/s

Velocity 128 PCI (Riva 128 4MB):
Single Texture Fillrate: 73.5 MTexels/s
Multi Texture Fillrate: 73.5 MTexels/s

Roughly the same fillrate as a Riva 128.

I'll need to change some settings to get game benchmarks. I forgot that my 4MB (or 8MB for that matter) Riva 128 cards can't run game benchmarks at 1024x768 to compare to the tests I did on the 16MB Gemini. I'll post those results in a bit.

Last edited by Ozzuneoj on 2021-01-17, 22:02. Edited 1 time in total.


Reply 10 of 22, by Ozzuneoj

Rank: l33t

Well, the Riva 128 I was testing had serious issues trying to run 3DMark 2000, and I couldn't find an 8MB Riva 128ZX... I could have sworn I had one. So, I just went for a Velocity 4400 (Riva TNT) 16MB AGP. The difference between a basic TNT and a Savage MX is kind of amazing. Sure, the Savage stays cool... but wow.

The attachment geminiPCI_savmx_3dm2k (Custom).jpg is no longer available
The attachment TNT_3dm2k (Custom).jpg is no longer available

The humble Riva TNT from 1998 is almost exactly twice as fast across the board. Plus, the Gemini had some pretty nasty flickering in the helicopter scenes. I'm sure part of it can be chalked up to a complete lack of optimization for this benchmark with a Savage MX from either party. Really, who would bench an S3 mobile GPU in a 3D gaming benchmark back then? Though the fill rate (and memory bandwidth) pretty much tell the performance story on their own. I also tried out my Savage 4 Xtreme 32MB. The "Xtreme" branding (the top-tier Savage 4, per the marketing), the 32MB of memory they bothered putting on it, and the build quality of the card all make it seem like it should have been a pretty solid performer when it was released (14 months after the TNT), but in the end it can only come close in some situations and falls flat in others:

The attachment savage4xtrm_3dm2k(Custom).jpg is no longer available

The core clock is more than 80% higher than the TNT's and the memory clock is 50% higher, but with half the ROPs and TMUs and half the bus width, it's just not enough. Thankfully, it wasn't as glitchy as the Savage MX was in these tests and everything basically looked fine.
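The "half the bus width" point is easy to see in rough peak-bandwidth arithmetic. A minimal sketch, assuming the commonly cited reference memory clocks for these cards (assumed figures, not measurements from this thread):

```python
# Peak memory bandwidth for SDR memory (one transfer per clock):
# bytes/s = clock (Hz) x bus width in bytes. Clocks below are commonly
# cited reference specs, assumed here for illustration.
def sdr_bandwidth_gb_s(mem_mhz, bus_bits):
    """Return peak bandwidth in GB/s."""
    return mem_mhz * 1e6 * (bus_bits / 8) / 1e9

riva_tnt = sdr_bandwidth_gb_s(110, 128)       # 128-bit bus
savage4_xtreme = sdr_bandwidth_gb_s(166, 64)  # 64-bit bus

print(round(riva_tnt, 2))        # 1.76
print(round(savage4_xtreme, 2))  # 1.33
```

Even with a ~50% memory clock advantage, halving the bus width leaves the Savage 4 with noticeably less peak bandwidth than the year-older TNT.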

What was the MSRP of a Savage 4 Xtreme versus the (1 year old) TNT back in 1999? I can't seem to find that information online anywhere.

I have a Savage 2000 here as well. Maybe I'll try that next, just for kicks.


Reply 11 of 22, by ragefury32

Rank: Oldbie
Putas wrote on 2021-01-17, 18:08:
ragefury32 wrote on 2021-01-13, 05:39:

As for whether S3 would’ve survived? They did to a certain extent, but they didn’t have a large R&D effort and a disciplined QA scheme, and fell short time and again. Via milked them dry making “good enough” integrated graphics for their SoCs and starved them of R&D funds.

Can these claims be sourced?

And what kind of "sources" are you looking for?
Let's see.

S3 Graphics was a fully owned subsidiary of Via after 2000 - S3 themselves merged with Diamond Multimedia, turned into SonicBlue and made MP3 players. SonicBlue lasted 2 more years. For at least 2 years S3 Graphics did nothing but integrate ProSavages into Via SoCs. That's noted on Wikipedia:

https://en.wikipedia.org/wiki/S3_Graphics

They were given a big chance by Via afterwards and put together the Columbia chip, AKA the first DeltaChrome, in '04. And even though it was not that bad, not too many cared. “It’s almost as good as an ATi/nVidia” doesn’t win mindshare.

https://techreport.com/review/5667/a-look-at- … 3s-deltachrome/

The working reviews on the first DeltaChromes were...indifferent. "Nice try, needs more R&D/driver work". Hmmm...kinda like the Savage series.

https://hexus.net/tech/reviews/graphics/691-s … ome-s8/?page=10

With the Chrome S27 (an improved DeltaChrome), the reviews weren't bad (somewhere between a GeForce 6200 and a 6600, so low-to-mid level); the negatives cited were poor driver quality... and more R&D needed. Starting to notice a pattern?

https://web.archive.org/web/20070806103227/ht … ome-s27_26.html

And towards the end, with the Chrome 430, truly no one really cared anymore... to the point where the only people making S3-based cards were S3 themselves.
By this time the Chrome (without the big R&D dollars for a new design, and at the edge of what they could do with the GPU architecture they had) was not keeping up anymore. In all cases the emphasis was on low power consumption, but there’s only so much you can do to stretch the Chrome architecture.

https://www.techpowerup.com/review/s3-chrome-440-gtx/30.html

That’s the end of the line in terms of GPU development at S3 Graphics. S3 lasted longer than most of the former industry heavyweights from the 90s. NeoMagic? Gone. Trident and SiS? Merged their graphics divisions to form XGI, made one or two Volari GPUs, and sank without a trace. Chips and Technologies? Worked with Real3D on the i740, and then was sold to Intel. Tseng? Merged with ATi. 3dfx? Bought out by nVidia. WD? Sold its graphics division to SGS-Thomson and went to storage. Cirrus? Went to the audio industry. And what happened to S3? Sold to Via, made integrated Savage-based GPUs and the Chrome, then Via sold them on to HTC in 2011 for the S3TC/DXTC patents for litigation against Apple. AFAIK S3 Graphics hasn't released a single new discrete video product since the 4x0/5x0 series, so while they exist on paper, they are effectively dead. They tried, no one cared, and sank without a trace.

So yes, they didn't have the R&D money, Via's emphasis was on integrated graphics and HTPC stuff, and that was what killed S3's chances at a major comeback.

Last edited by ragefury32 on 2021-01-18, 03:10. Edited 2 times in total.

Reply 12 of 22, by Ozzuneoj

Rank: l33t
elianda wrote on 2021-01-17, 15:04:

So what are actually the PCI Vendor / Device IDs of your card?
Does it also use the 86C290 chipset?

I just remembered to check this. Interestingly, Everest says this is an 86C270.

I can't seem to find much about that chip.

The attachment appian_gemini_everest1 (Custom).jpg is no longer available


Reply 13 of 22, by ragefury32

Rank: Oldbie
Ozzuneoj wrote on 2021-01-17, 22:05:
I just remembered to check this. Interestingly, Everest says this is an 86C270. […]

Plasma-online does list it as part of their S3 SavageMX listings:

http://www.plasma-online.de/index.html?conten … picture/s3.html

For a 2 W part that came out in 1998, it didn't do too badly, I guess.

Reply 14 of 22, by ragefury32

Rank: Oldbie
Ozzuneoj wrote on 2021-01-17, 21:55:
Well, the Riva 128 I was testing had serious issues trying to run 3DMark 2000, and I couldn't find an 8MB Riva 128ZX... I could […]

And here are the numbers from a laptop using the Savage IX (the Savage MX variant with VRAM embedded on-chip) - this is off a ThinkPad T21 with an 850MHz Coppermine P3, connected via a 2x AGP port:

The attachment DD684F30-0A28-4D34-98A8-B1BC2999BD51.jpeg is no longer available

The difference between double and triple buffering? Less than 0.1%.
There are some improvements in both texture fillrates and FPS, but it’s difficult to attribute them to a wider bus, different driver versions, or whether it’s AGP or PCI underneath. It could well be that the chip has some thermal constraints placed upon it to fit the hardware in question (although given the heat sink they threw on the Savage MX, it’s more than enough) - the T21 that I use does not have any thermal management (not even a metallic passive heat sink) for the Savage IX.

The Appian Gemini PCI has four Samsung 32-bit, 4MB, 166MHz SDR SGRAM chips. It’s running at 100MHz and I am not even sure if it’s wired for 64- or 128-bit datapaths (I have yet to find a definitive datasheet from S3 describing the pinouts and functional differences between the various Savage mobile family GPUs, but it looks like the traces go to the pinouts on the Savage MX) - if it doesn’t perform all that great? Eeeh, not surprised. It was not advertised as a 3D performer.

Reply 15 of 22, by Ozzuneoj

Rank: l33t
ragefury32 wrote on 2021-01-17, 23:11:
And here’s the numbers from a laptop using the SavageIX (which is the SavageMX version with VRAM embedded within) - this is off […]

I have a Savage IX AGP card here too, so I just tested that. It uses the same drivers I installed for the Savage MX (still needed a reboot... it is detected as a different card).

The attachment savageix_3dm2k (Custom).jpg is no longer available

Definitely a much lower score than your mobile IX. Not sure why. I would have expected it to be similar at least, but it's way off.

Also, the reason I used double buffering was that it crashed with triple buffering for some reason on the Gemini, so I switched to double on all the benchmarks for consistency. As you said, it barely makes any difference with cards like this. It actually makes more of an impact on memory usage than anything else. I just tried the same tests again on the IX with triple buffering and got exactly the same results.
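The memory-usage point is simple arithmetic: the swap chain takes width x height x bytes-per-pixel per buffer. A quick sketch at the 1024x768x16 setting used in these tests (ignoring the Z-buffer and any alignment padding):

```python
# VRAM consumed by the swap chain alone: width x height x bytes per pixel
# x number of buffers. Ignores the Z-buffer and any alignment padding.
def swapchain_kib(width, height, bpp, buffers):
    """Return swap-chain size in KiB."""
    return width * height * (bpp // 8) * buffers // 1024

double = swapchain_kib(1024, 768, 16, buffers=2)  # front + back
triple = swapchain_kib(1024, 768, 16, buffers=3)  # front + two backs

print(double, triple)  # 3072 4608
```

So at this resolution, triple buffering costs an extra 1.5 MiB that would otherwise be free for textures - not trivial on a 16MB card, but unlikely to change frame rates much on its own.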


Reply 16 of 22, by ragefury32

Rank: Oldbie
Ozzuneoj wrote on 2021-01-18, 03:53:
I have a Savage IX AGP card here too, so I just tested that. It uses the same drivers I installed for the Savage MX (still neede […]
Show full quote
ragefury32 wrote on 2021-01-17, 23:11:
And here’s the numbers from a laptop using the SavageIX (which is the SavageMX version with VRAM embedded within) - this is off […]
Show full quote
Ozzuneoj wrote on 2021-01-17, 21:55:
Well, the Riva 128 I was testing had serious issues trying to run 3DMark 2000, and I couldn't find an 8MB Riva 128ZX... I could […]
Show full quote

Well, the Riva 128 I was testing had serious issues trying to run 3DMark 2000, and I couldn't find an 8MB Riva 128ZX... I could have sworn I had one. So, I just went for a Velocity 4400 (Riva TNT) 16MB AGP. The difference between a basic TNT and a Savage MX is kind of amazing. Sure, the Savage stays cool... but wow.

geminiPCI_savmx_3dm2k (Custom).jpg
TNT_3dm2k (Custom).jpg

The humble Riva TNT from 1998 is almost exactly twice as fast across the board. Plus, the Gemini had some pretty nasty flickering in the helicopter scenes. I'm sure part of it can be chalked up to a complete lack of optimization for this benchmark with a Savage MX from either party. Really, who would bench an S3 mobile GPU in a 3D gaming benchmark back then? Though the fill rate (and memory bandwidth) pretty much tell the performance story on their own. I also tried out my Savage 4 Xtreme 32MB. The "Xtreme" being the top tier Savage 4 (marketing), 32MB of memory they bothered putting on it and the build quality of the card all make it seem like it should be a pretty solid performer for when it was released (14 months after the TNT), but in the end it can only only come close in some situations and falls flat in others:
savage4xtrm_3dm2k(Custom).jpg
The core clock is more than 80% higher than the TNT and the memory clock is 50% higher than the TNT but with half the ROPs and TMUs and half the BUS width, it's just not enough. Thankfully, it wasn't as glitchy as the Savage MX was in these tests and everything basically looked fine.

What was the MSRP of a Savage 4 Xtreme versus the (1 year old) TNT back in 1999? I can't seem to find that information online anywhere.

I have a Savage 2000 here as well. Maybe I'll try that next, just for kicks.

And here’s the numbers from a laptop using the
SavageIX (which is the SavageMX version with VRAM embedded within) - this is off a Thinkpad T21 with an 850MHz Coppermine P3, connected via a 2x AGP port:

DD684F30-0A28-4D34-98A8-B1BC2999BD51.jpeg

The difference between double and triple buffered? Less than a 0.1% difference.
There’s some improvements in both texture fillrates and FPS, but it’s difficult to attribute it to a wider bus, different driver versions or whether it’s AGP or PCI underneath. It could well be that the chip has some thermal constraints placed upon it to fit the hardware in question (although given the heat sink they threw on the SavageMX, it’s more than enough)- The T21 that I use does not have any thermal management (not even a metallic passive heat sink) for the SavageIX.

The Appian Gemini PCI has four Samsung 32-bit, 4MB, 166MHz SDR SGRAM chips. It's running at 100MHz, and I'm not even sure whether it's wired for 64- or 128-bit datapaths (I have yet to find a definitive datasheet from S3 describing the pinouts and functional differences between the various Savage Mobile family GPUs, but it looks like the traces go to the pinouts on the SavageMX) - so if it doesn't perform all that great? Eeeh, not surprised. It was never advertised as a 3D performer.
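For what it's worth, the 64-vs-128-bit question is easy to frame as peak bandwidth, assuming the 100MHz memory clock is accurate:

```python
# Peak SDR bandwidth at the 100MHz memory clock reported for the Gemini.
# Which bus width is actually wired up is the open question.
clock_mhz = 100
for bus_bits in (64, 128):
    gb_s = clock_mhz * bus_bits / 8 / 1000
    print(f"{bus_bits}-bit @ {clock_mhz} MHz SDR: {gb_s:.1f} GB/s")
```

Even the optimistic 128-bit case (1.6 GB/s) falls short of a TNT's roughly 1.76 GB/s, and the 64-bit case (0.8 GB/s) would explain the scores here pretty neatly.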

I have a Savage IX AGP card here too, so I just tested that. It uses the same drivers I installed for the Savage MX (still needed a reboot... it is detected as a different card).

savageix_3dm2k (Custom).jpg

Definitely a much lower score than your mobile IX, and I'm not sure why. I would have expected it to be at least similar, but it's way off.

Also, the reason I used double buffering was that triple buffering crashed for some reason on the Gemini, so I switched to double on all the benchmarks for consistency. As you said, it barely makes any difference with cards like this - it actually has more of an impact on memory usage than anything else. I just ran the same tests again on the IX with triple buffering and got exactly the same results.
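The memory-usage point is easy to put numbers on. A quick sketch of the framebuffer footprint (assuming a 16-bit Z-buffer, which is typical for these chips):

```python
def framebuffer_kb(width, height, bpp, buffers, z_bpp=16):
    """Color buffers plus one Z buffer, in KB. z_bpp=16 is an assumption."""
    color = width * height * (bpp // 8) * buffers
    z = width * height * (z_bpp // 8)
    return (color + z) / 1024

for buffers in (2, 3):
    kb = framebuffer_kb(1024, 768, 16, buffers)
    print(f"{buffers}x buffering at 1024x768x16: {kb:.0f} KB")
```

The third buffer adds another 1.5MB at 1024x768x16 - noticeable on an 8MB card, but it doesn't change how fast the chip can actually draw.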

The issue with the SavageMX/IX family is that there are several versions - some have a 64-bit bus and some a 128-bit bus, some are SDR (64- or 128-bit) and some are DDR (64-bit only) - it really depends on which one you have. The difference between the SavageMX and the IX is that the MX has no integrated VRAM while the IX does. As far as I am aware, only Toshiba used the SavageMX in volume production (Satellite Pro 42x0). The other mainstream laptop makers went with the SavageIX (Sony, IBM, HP, and Toshiba for their Libretto L3 and Portege 3xxx series), since those machines then don't need external VRAM wired up, which eats valuable board real estate. From what I can recall, the Savage3/4 family takes a fairly substantial performance hit (25% or more) going from 800x600 to 1024x768, but going from 16- to 32-bit doesn't cost as much, roughly 10% or less, so that benchmark might not show the Savage at its best.
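The resolution hit lines up with simple pixel counting - a rough fill-rate-bound model, nothing more:

```python
# 1024x768 pushes ~64% more pixels per frame than 800x600, so a
# purely fill-rate-bound renderer would drop to ~61% of its FPS.
ratio = (1024 * 768) / (800 * 600)
print(f"pixel ratio: {ratio:.2f}x, fill-bound FPS scale: {1 / ratio:.2f}")
```

That the observed hit is "only" 25% or so suggests these chips aren't purely fill-limited at 800x600; some of the frame time is going to the CPU and geometry instead.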

For the SavageMX it could be the 86C270 or 86C290, while on the SavageIX it's the 86C274, 86C294, or 86C298. Unfortunately, the PCI ID doesn't tell you much, since it's either 5333:8C10-11 for the SavageMX or 5333:8C12-13 for the SavageIX; to tell which is which, you really need to look at the card and/or benchmark it.
So depending on which version is on your card, the bus width and clock speed can also vary significantly.
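In other words, about all software can do is bucket the device ID into MX or IX. A hypothetical helper using the IDs listed above (the exact silicon, bus width, and clocks still need eyes on the card):

```python
def identify_savage_mobile(vendor_id, device_id):
    """Coarse lookup only - the PCI ID can't reveal bus width or clocks."""
    if vendor_id != 0x5333:                 # S3's PCI vendor ID
        return "not an S3 part"
    if device_id in (0x8C10, 0x8C11):
        return "SavageMX-class (exact chip and bus width unknown)"
    if device_id in (0x8C12, 0x8C13):
        return "SavageIX-class (exact chip and bus width unknown)"
    return "some other S3 device"

print(identify_savage_mobile(0x5333, 0x8C13))
```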

Oh yeah, in case you want a fresh point of reference, here are the numbers off an ATi Mobility M3 with 16MB of RAM (off an otherwise similar Dell Latitude C600... which is a 440-based machine with an 850MHz Coppermine mobile). AFAIK it has a 128-bit bus and handles two textures per pass rather than the one per pass on the Savage Mobiles. The numbers aren't too far off the TNT's, and you can see the performance penalties of fitting a GPU to work in laptops.

The attachment 7F1C12B3-2453-4007-88E7-EF3B4FE03483.jpeg is no longer available

Reply 17 of 22, by Ozzuneoj

User metadata
Rank l33t
ragefury32 wrote on 2021-01-18, 04:36:
The issue with the SavageMX/IX family was that there are several versions - some are 64 bit bus, and some are 128bit bus, some a […]

Part of the reason I kept this Savage IX was that it had the integrated VRAM. I found that pretty unique.
My card is identical to the Apollo ASVGB1-8M on this page (with the silver sticker on the chip):
http://www.vgamuseum.info/index.php/component … 78-s3-savage-ix

Not sure how reliable the info is, but that page says the clock speed for this card is 100MHz (as shown in the screenshot), and that it uses SDRAM on a 64-bit bus. That seems pretty accurate to me, given the weak performance of this particular card. Mine also has that label on the back that says "S3 298SDR-8M", so I assume it uses the 86C298 too.

Were there any other Savage IX cards made? When I google it, these blue Apollo cards are the only ones that come up, and they seem to have been made for both PCI and AGP. It seems like these were a one-off and that maybe all the other IX chips ended up in laptops or embedded devices. As for the Savage MX... I don't really know. It seems Appian used it for a couple of different cards (another is here, which uses an 86C290).

Anyway, not much point to all this. These will likely remain safely stored in my video card collection as oddities without much of a purpose, until some tidbit of info comes out that says they are somehow useful for some obscure thing - then I'll dig them out and play with them again. 🤣

Now for some blitting from the back buffer.

Reply 18 of 22, by Putas

User metadata
Rank Oldbie
ragefury32 wrote on 2021-01-17, 22:00:
Putas wrote on 2021-01-17, 18:08:
ragefury32 wrote on 2021-01-13, 05:39:

As for whether S3 would’ve survived? They did to a certain extent, but they didn’t have a large R&D effort and a disciplined QA scheme, and fell short time and again. Via milked them dry making “good enough” integrated graphics for their SoCs and starved them of R&D funds.

Can these claims be sourced?

And what kind of "sources" are you looking for?

Something more trustworthy than opinions from reviews.

ragefury32 wrote on 2021-01-17, 22:00:

They were given a big chance by Via afterwards and put together the Columbia chip, AKA the first DeltaChrome, in '04. And even though it was not that bad, not too many cared. "It's almost as good as an ATi/nVidia" doesn't win mindshare.

Sounds quite contradictory to milking them dry.

Reply 19 of 22, by ragefury32

User metadata
Rank Oldbie
Ozzuneoj wrote on 2021-01-18, 06:10:
Part of the reason I kept this Savage IX was that it had the integrated VRAM. I found that pretty unique. My card is identical t […]

Well, yeah. Back in the old days the Appian Gemini was marketed as a much cheaper alternative to the Matrox G450 (I had the AGP version of the G450 DualHead in my Deschutes 350MHz box back in the very early 2000s, but it was also available for PCI), at a time when dual VGA outputs driving multiple desktops weren't all that common. If you really think about it, the G450 had the same bus-width issue as some of the SavageMX/IX boards - the DDR version is 64-bit, but since it ran at an effective 333MHz instead of 166, the throughput is supposed to come out the same; because DDR has higher latency, though, it's actually slightly less.
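The G450 throughput point checks out on paper (assuming the usual 166MHz memory clock for both layouts):

```python
# Matrox G400-style 128-bit SDR vs G450-style 64-bit DDR, both on an
# assumed 166MHz memory clock: identical peak MB/s on paper.
sdr_mb_s = 166 * (128 // 8)        # 128-bit SDR
ddr_mb_s = (166 * 2) * (64 // 8)   # 64-bit DDR, 332 MT/s effective
print(sdr_mb_s, ddr_mb_s)          # same headline number either way
```

The real-world gap comes from DDR's access latency and burst behavior, not from the headline figure.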

As for that Apollo SavageIX... probably not, but then most of them were sold unbranded in ugly generic boxes. My guess is that S3 had a production run of 64-bit SavageIX chips that didn't sell all that well, so when some generic Taiwanese board maker put in an order, they got them cheap. The whole thing reminds me of the clone board makers who bought Rage Pro LT laptop GPUs and stuck them on cheap desktop cards back in the late 90s. They were sold cheap and worth about as much. The important distinction here is that not all SavageIX parts perform the same, since there are different variants with different VRAM speeds and bus widths. I suspect implementations across laptops from various manufacturers can vary by at least 5 to 7 FPS, depending on which chipset it was paired with, which type of VRAM was chosen, and how it was attached. The SavageMX/IX also didn't stay relevant in laptops for long - roughly through 1999. Most vendors that used the SavageIX went to the ATI Radeon M6 (the mobile version of the Radeon VE/7200) for follow-up machines: better performance at an even lower power budget, with VRAM integrated on-die. Only Toshiba stuck with S3, using the Savage4-based SuperSavageIX.

As for exclusive features - they supported S3TC and MeTaL - or rather, the SavageIX in my T21 does. But that's not much of a selling point if it only runs UT'99 at 20 fps at 800x600x16bpp.

Last edited by ragefury32 on 2021-01-18, 08:04. Edited 1 time in total.