VOGONS


3D Accelerator Video Captures


Reply 122 of 185, by Putas

Rank: Oldbie
kool kitty89 wrote:

Putas, you've run Unreal with your ViRGE MX, right? How did that compare with elianda's video?

Ah, the MX can just render filtered textures, that's about it. All lighting and effects are broken. But with Trio3D/2X (supporting additive and multiplicative alpha, more mature drivers) one can do much better:
http://www.vintage3d.org/images/T3D2X/UNREAL%2024w.png

Reply 123 of 185, by sliderider

Rank: l33t++
Putas wrote:
kool kitty89 wrote:

Putas, you've run Unreal with your ViRGE MX, right? How did that compare with elianda's video?

Ah, the MX can just render filtered textures, that's about it. All lighting and effects are broken. But with Trio3D/2X (supporting additive and multiplicative alpha, more mature drivers) one can do much better:
http://www.vintage3d.org/images/T3D2X/UNREAL%2024w.png

Is there a point to adding more features to a Virge, though? That's like those guys who put rims, stickers, and a rear spoiler on an economy car. It looks nice, but underneath it's still an economy car. The performance still isn't going to be there. What the Virge really needed was more raw muscle before any new features were added.

Reply 124 of 185, by Putas

Rank: Oldbie
sliderider wrote:

Is there a point to adding more features to a Virge, though? That's like those guys who put rims, stickers, and a rear spoiler on an economy car. It looks nice, but underneath it's still an economy car. The performance still isn't going to be there. What the Virge really needed was more raw muscle before any new features were added.

You need both speed and features. I would rather play at low resolution without bugs than the other way around. Who knows what S3 was thinking; in the beginning the Virge was so successful they may have decided to dominate by cheapness and underestimated the pace, hence the Trio3D, a missed cycle, and catching up after that. Or it was Virge /T...

Reply 125 of 185, by SquallStrife

Rank: l33t
swaaye wrote:

http://www.youtube.com/watch?v=HymP5Tz1kfI

Voodoo 1 Unreal flyby with timedemo stats. vsync disabled.
10-30fps

Could you try UTBench?

http://www.ut-files.com/index.php?dir=Utiliti … D%3D--/UTbench/

I just ran it on my K6-2 and got 18.32fps avg, 29.84fps max.

Keen to see if it's the CPU or the GPU holding this one back.

Edit: I think it might be the CPU.

With an nVidia FX 5200, 19.56fps avg, 37.5fps max. Higher burst, but overall the same performance.

Edit2: Ran this with UT GOTY edition v436, 640x480 16-bit, Low world textures, Low skin detail.
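
Edit3: For anyone who wants to run it themselves, a rough sketch of the usual UT99 timedemo procedure, assuming the UTBench archive drops a demo file into UT's System folder (check the extracted files for the actual demo name). In the UT console:

timedemo 1
demoplay UTBench

The first command turns on the avg/max frame rate readout, the second plays back the recorded demo.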


Reply 128 of 185, by swaaye

Rank: l33t++
SquallStrife wrote:

Could you try UTBench?

http://www.ut-files.com/index.php?dir=Utiliti … D%3D--/UTbench/

I just ran it on my K6-2 and got 18.32fps avg, 29.84fps max.

K6 CPUs are quite slow for UT. I did a test years ago with my Voodoo5 and had a P3-450 outperform a K6-III+ 616MHz.

BTW, UTGLR should work pretty well with a 5200. It won't work optimally on a K6 though because of no SSE support. A 5200 is somewhat like a GeForce 3, assuming it has a 128-bit memory bus.
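
(A minimal sketch in case anyone wants to try it: UTGLR normally installs as a replacement OpenGLDrv.dll in UT's System folder, and then UnrealTournament.ini gets pointed at it, assuming a stock v436 install:

[Engine.Engine]
GameRenderDevice=OpenGLDrv.OpenGLRenderDevice

The device name is the same as the stock OpenGL renderer's because the DLL replaces it.)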

Will try UTBench on the Voodoo. I'm using a PIII-S at 1050 MHz for these tests to minimize CPU bottleneck.

Reply 129 of 185, by sliderider

Rank: l33t++
Putas wrote:
SquallStrife wrote:

Ran this with UT GOTY edition v436, 640x480 16-bit, Low world textures, Low skin detail.

Do people really think the FX 5200 is so hopeless?

Yes. The non-Ultra FX 5200 is horrible. It is a DX9-compliant card in name only, because it does not have the speed to play DX9 games at any reasonable framerate. I feel sorry for anybody who bought one thinking it would be as fast as a Radeon 9500.

Reply 130 of 185, by swaaye

Rank: l33t++
sliderider wrote:
Putas wrote:
SquallStrife wrote:

Ran this with UT GOTY edition v436, 640x480 16-bit, Low world textures, Low skin detail.

Do people really think the FX 5200 is so hopeless?

Yes. The non-Ultra FX 5200 is horrible. It is a DX9-compliant card in name only, because it does not have the speed to play DX9 games at any reasonable framerate. I feel sorry for anybody who bought one thinking it would be as fast as a Radeon 9500.

I think you missed Putas' point. Look at those UT settings again.

Reply 131 of 185, by sliderider

Rank: l33t++
swaaye wrote:
sliderider wrote:
Putas wrote:

Do people really think the FX 5200 is so hopeless?

Yes. The non-Ultra FX 5200 is horrible. It is a DX9-compliant card in name only, because it does not have the speed to play DX9 games at any reasonable framerate. I feel sorry for anybody who bought one thinking it would be as fast as a Radeon 9500.

I think you missed Putas' point. Look at those UT settings again.

And you think the scores are good scores?

Reply 132 of 185, by leileilol

Rank: l33t++

He never said it was about scoring well; he's just determining a bottleneck from his results.

Not really hard to figure that out, so put away your 'internet forum cheatsheet: why fx5200 sucks'. No one wants to read it; we're all damn aware of the card.


Reply 133 of 185, by elianda

Rank: l33t

[screenshot: sis6326.png]
The chip runs so hot that it hurts if you touch it...

footage soon on YT.


Reply 134 of 185, by SquallStrife

Rank: l33t
swaaye wrote:

K6 CPUs are quite slow for UT.

OK, interesting!

swaaye wrote:

I did a test years ago with my Voodoo5 and had a P3-450 outperform a K6-III+ 616MHz.

BTW, UTGLR should work pretty well with a 5200. It won't work optimally on a K6 though because of no SSE support. A 5200 is somewhat like a GeForce 3, assuming it has a 128-bit memory bus.

The second score was with UT's OpenGL renderer; the Direct3D renderer not only looked horrible, but ran about 4-5fps slower on average.

I'll give UTGLR a go tonight.

swaaye wrote:

Will try UTBench on the Voodoo. I'm using a PIII-S at 1050 MHz for these tests to minimize CPU bottleneck.

The first score was with the Voodoo 1; keen to see how it compares with a significantly faster CPU. 😀

Another comparison: in GLQuake the FX 5200 gives 145fps for "timedemo demo1", where the Voodoo does about 25fps, both on the K6-2.
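
(If anyone wants to repeat that GLQuake run: Quake accepts console commands on the command line with a + prefix, so something along these lines works, with the resolution switches being my assumption to match the UT settings:

glquake.exe -width 640 -height 480 +timedemo demo1

It plays the built-in demo1 and prints the frames/seconds/fps summary when it finishes.)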


Reply 135 of 185, by kool kitty89

Rank: Member
Putas wrote:
kool kitty89 wrote:

Putas, you've run Unreal with your ViRGE MX, right? How did that compare with elianda's video?

Ah, the MX can just render filtered textures, that's about it. All lighting and effects are broken. But with Trio3D/2X (supporting additive and multiplicative alpha, more mature drivers) one can do much better:
http://www.vintage3d.org/images/T3D2X/UNREAL%2024w.png

Your ViRGE comparison page mentioned you only got Unreal running on the MX; did you find other drivers for the GX2/Trio3D since you wrote that?

That screenshot looks quite good; how fast can the Trio3D run with those features on? (Is it at least playable at 320x240?)

Putas wrote:

You need both speed and features. I would rather play at low resolution without bugs than the other way around. Who knows what S3 was thinking; in the beginning the Virge was so successful they may have decided to dominate by cheapness and underestimated the pace, hence the Trio3D, a missed cycle, and catching up after that. Or it was Virge /T...

Plus, remember that most console games of the time were still running at ~320x224 highcolor (and the PS1 was obviously far lower in visual quality). Dreamcast would obviously be in another league though. (but by that point, you could compare the Savage)

Also, Putas, I think I mentioned this before but don't remember getting an answer:
In your tests of Direct3D games on the ViRGE cards (including the 325), did you notice weird rendering bugs and artifacts like swaaye's videos of Tomb Raider 2 show? (especially the black rendering errors in the 325 recording)

sliderider wrote:

Is there a point to adding more features to a Virge, though? That's like those guys who put rims, stickers, and a rear spoiler on an economy car. It looks nice, but underneath it's still an economy car. The performance still isn't going to be there. What the Virge really needed was more raw muscle before any new features were added.

Adding those things to an economy car is largely (or entirely) aesthetic . . . adding such features to the ViRGE architecture would be more akin to adding headers, a turbocharger, modifying/replacing the transmission, suspension, hubs, wheels/tires, etc., to an economy car . . . or replacing/upgrading the engine entirely.

Or, more accurately (since we're not talking about aftermarket changes to the ViRGE, like overclocking), it would be like a new revision/model of said economy car, sharing many components and design elements, but with considerable changes as well. (like comparing a 1984 Pontiac Fiero with the 92 HP 2.5L 4-cylinder engine to a 1985 GT with the 140 HP 2.8L V6)

Remember, the ViRGE did get significant hardware changes in its architecture. The ViRGE DX/GX was significantly faster per-clock than the 325, and later revisions added more tweaks and changes as well, at least up to the GX2 and Trio3D. (drivers also appear to have improved considerably throughout its life, meaning fewer bugs and better performance for all models as time went on)

Reply 136 of 185, by sliderider

User metadata
Rank l33t++
Rank
l33t++
kool kitty89 wrote:
Your ViRGE comparison page mentioned you only got Unreal running on the MX; did you find other drivers for the GX2/Trio3D since […]

The Virge was still slow as hell and outdated long before the architecture was actually discontinued, so adding all that stuff without substantially upgrading the speed would be pointless. It is akin to nVidia calling the FX 5200 a DX9 card when it never had the power to run DX9 games.

Reply 137 of 185, by kool kitty89

Rank: Member
sliderider wrote:

The Virge was still slow as hell and outdated long before the architecture was actually discontinued, so adding all that stuff without substantially upgrading the speed would be pointless. It is akin to nVidia calling the FX 5200 a DX9 card when it never had the power to run DX9 games.

It wouldn't be pointless if it allowed some games to run decently (or borderline) that otherwise wouldn't work at all (or would look really wrong), even if only at low resolutions.
I.e., if Unreal could run at 320x240 with good visual quality and a playable framerate (averaging at least ~15 FPS), it would make sense, especially as a bottom-end OEM card. (rather like ATi did with the Rage II+/IIc alongside the Rage Pro, and, later, the Pro alongside the 128 and later cards)
Or also similar to the SiS 6326. (slow, but feature-rich, with relatively good Direct3D support)

The problem would, of course, be that some games would be unplayably slow at best, so there would be not much advantage over lacking the features entirely. This would especially be true for games with fairly high minimum resolutions, like 640x480; that seems to be a problem even with some S3D games, unfortunately. Lack of 32-bit color depth support in many of those games would also be a big disadvantage, as the ViRGE is arguably best at low resolutions with 32-bit color and near-full features enabled. Though some games would be worth running at lower detail modes for higher res+framerate: games that generally cater well to unfiltered textures and also look pretty good when software rendered or using the Mystique, except the ViRGE could still have alpha blending too.

If you were using the ViRGE with a slow enough CPU, it wouldn't be much of a bottleneck anyway. (say you were in the market for a budget graphics upgrade for a Pentium 133 or Cyrix PR166; the ViRGE or a similar low-end card would be a fairly decent match at low cost, assuming the Voodoo1 was still significantly more expensive, used or new; otherwise that would obviously be a good choice as an upgrade)

On this note: does anyone know specifically when the final features were added to the ViRGE series? (did the Trio3D add anything over the GX2, or was it basically the same as that chip feature-wise, perhaps just die-shrunk?)

Reply 138 of 185, by leileilol

Rank: l33t++

If anyone's still doing PCX2 captures, I'll mention this:

HKEY_LOCAL_MACHINE\SOFTWARE\Videologic ... explore a little and you should find keys that allow you to unlock the ability to use 24-bit rendering and the advanced options menus with game profiles 😀
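
A quick way to find them without blindly clicking around: export the whole branch to a text file and search it. regedit's /e switch works on 9x too; the actual key and value names vary by driver version, so I'm not listing any here, and videologic.reg below is just an arbitrary output name:

regedit /e videologic.reg HKEY_LOCAL_MACHINE\SOFTWARE\Videologic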


Reply 139 of 185, by swaaye

Rank: l33t++

Voodoo Graphics 4MB
UT99 UTBench - High detail exceeds the 2MB of texture memory (probably by a lot) and kills performance. Reducing to medium almost doubles the frame rate.
Turok 2 via Glide

GeForce FX 5200 Ultra 128MB
Doom3 - the showcase for the FX cards.
UTGLR UT99 and a UTBench run - zoom.
3DMark03 - reminiscent of the 8500's performance.
3DMark2001SE - pretty fast, but the 8500 is faster.

Radeon 8500 64MB
- the DVI output on the 7500/8500 has problems staying in spec, so I had to crop out 10px of the bottom. The DVI doesn't even work outside of Windows on these cards.
3DMark2000
3DMark2001SE
3DMark03