

3D Accelerator Video Captures


Reply 100 of 185, by elianda


@Swaaye: done
@leileilol: I combined the latest Matrox m3D driver with the latest Apocalypse 5D driver, and now Unreal works with both SGL and D3D.
I guess the sound stutter in Final Reality is due to using a PCI sound card (Yamaha Xwave) and bus congestion.

Retronn.de - Vintage Hardware Gallery, Drivers, Guides, Videos. Now with file search
Youtube Channel
FTP Server - Driver Archive and more
DVI2PCIe alignment and 2D image quality measurement tool

Reply 102 of 185, by elianda


Yes, this was Unreal using SGL: https://www.youtube.com/watch?v=O3GjPgb6a3s
I could not get it running with D3D on the PowerVR; it reports that some texture is too large.

Maybe some additions:
With all graphical features on (coronas, shiny surfaces, volumetric lighting, high detail textures...) and full digital music, it still manages the following on an Athlon 500 MHz:

10 fps at 1024x768
16 fps at 800x600
23 fps at 640x480

This is not too bad; if I remember correctly, a Voodoo 1 does about 15 fps at 640x480?!

I even see transparencies that appear opaque (flat black) in D3D, which looks like an alpha blending problem.

3DMark99Max at default 800x600x16 gives a score of 1483 3DMarks / 9020 CPU Marks with no serious graphical glitches except the missing alpha blending.


Reply 103 of 185, by leileilol


I've uploaded UT PowerVR SGL footage, this time from the original 400 release (and not 426 GOTY where the lighting appears to be screwed up)
http://www.youtube.com/watch?v=3KQcztTwLxk

long live PCem

Reply 104 of 185, by swaaye


Am I crazy, or do I remember you saying the SGL renderer was not really there, back in the PVR Fun Thread? Cool to hear that you two got it working!

elianda wrote:

This is not too bad; if I remember correctly, a Voodoo 1 does about 15 fps at 640x480?!

It varies but I want to say 20-30fps in corridors. I know I used to drop to 512x384 sometimes. I haven't really used Voodoo1 much in years. I guess I should get it out and record it.

Reply 106 of 185, by swaaye


There is some remnant of it in some of those games, but the files are empty placeholders, I think (you said).

And there's no possibility to do a transplant?

Reply 107 of 185, by kool kitty89

elianda wrote:

Some more hot action: http://www.youtube.com/watch?v=SNiYRcm3NbE

Alpha blending seems just dithering.

In what cases does alpha blending look like plain dithering? (I assume you mean a dithered "mesh" like the Mystique and some software-rendered games do.)

I've seen tons of dithering in ViRGE stuff, but nothing that looks like simple dithered mesh effects. Everything I see looks like interpolation from higher color depth down to 16-bit, using dithering to approximate the higher color depth; many, many cards support this, as do many 3D game consoles, though the quality/complexity of the dithering varies.
Thus, you'll often see dithering on every on-screen object, since anything that doesn't scale to almost exactly 15/16-bit RGB will be interpolated with a dither pattern (which ends up being most shaded/lit/blended surfaces).
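
A minimal sketch of that kind of ordered dithering, using a generic 4x4 Bayer matrix; the matrix and scaling here are illustrative only, not the pattern any particular card is documented to use:

# Minimal sketch of ordered (Bayer) dithering when reducing an 8-bit channel
# to the 5 bits available for red/blue in RGB565. Illustrative only.
BAYER_4X4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def dither_to_5bit(value8, x, y):
    """Quantize an 8-bit value to 5 bits, rounding up or down depending on screen position."""
    step = 255 / 31                                   # one 5-bit quantization step
    threshold = (BAYER_4X4[y % 4][x % 4] + 0.5) / 16  # position-dependent bias in [0, 1)
    return min(31, int(value8 / step + threshold))

# A flat tone that falls between two 5-bit levels comes out as an alternating
# pattern of the two nearest levels -- which is why nearly every shaded, lit,
# or blended surface shows some dither texture in 16-bit modes.
print([dither_to_5bit(103, x, 0) for x in range(8)])  # e.g. [12, 13, 12, 13, ...]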

However, one noticeable problem with ViRGE alpha blending is the odd rectangular mattes/borders seen around transparent textures (texels that should be invisible are instead blended at a very low opacity; a small numeric sketch of this follows the screenshots below). This seems to be absent in 32-bit color modes, as mentioned in my previous post.
vintage3d.org/images/DX/incoming 16.png
vintage3d.org/images/DX/incoming 32.png
(pics courtesy of Putas)
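
A quick numeric sketch of the matte effect described above; the numbers and the residual opacity are made up purely for illustration:

# If a fully transparent border texel is still blended with a tiny residual
# weight instead of being skipped, the empty rectangle around a cut-out
# texture darkens the background slightly -- the "matte" artifact.
def blend(dst, src, alpha):
    return round(dst * (1 - alpha) + src * alpha)

background   = 200   # bright sky behind the tree/leaf texture
border_texel = 0     # black texel whose alpha should be exactly 0 (invisible)

correct = background                                   # texel skipped entirely
buggy   = blend(background, border_texel, alpha=0.06)  # tiny residual opacity
print(correct, buggy)   # 200 vs 188 -> a faint dark rectangle over the sky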

Reply 108 of 185, by elianda


Well, I guess I have to take another look at 32-bit rendering with the ViRGE.
Currently I have a Rage IIC plugged in and moved the PowerVR into the Athlon 500 MHz machine.
I tried a more demanding game:
Dethkarz on PowerVR: https://www.youtube.com/watch?v=Ye__fZ6qk-I
Note that it crashes to the desktop in between. It feels faster on the low (niedrig) setting, but you will notice the slowdowns. This is especially noticeable as soon as any transparent graphics are present; see for example the scene where the car gets the green cloaking. Some rocket effects at the high setting also have a transparency problem where your view is suddenly blocked 😉.


Reply 109 of 185, by Putas

elianda wrote:

So here we have another surprise: S3 Trio3D running 3DMark2001SE

pure win

The blending is far from a stipple alpha technique. I was wondering whether the ViRGE could use some speed hacks, like a small color look-up table to speed up the work at the expense of rounding errors. But that wouldn't make sense, since the ViRGE can do "full speed" 24-bit without artifacts. I am pretty sure the artifacts of the 16-bit dithering are multiplied by the number of surfaces drawn over. Every 3D frame drawn in 16 bit has a dither pattern with interleaving darker and brighter pixels. An alpha texture on top of that adds its own pattern and a bit of brightness, and when brighter pixels meet they are multiplied and can become almost white. When a bad enough situation occurs, the pattern may appear stipple-like.

Reply 110 of 185, by swaaye


Trident 3DImage 9850 with 4MB.
http://www.youtube.com/playlist?list=PL995DE461767BFB71

It works better for D3D than the ViRGE. Image quality has some problems, but I think it's considerably better than Cirrus Logic's attempt at 3D.

Reply 111 of 185, by swaaye


Voodoo1 (Righteous 3D) running Unreal v219.
http://www.youtube.com/watch?v=LPocZ-FX8SU

-"stat fps" enabled : 1000/#ms = fps
-seems to hover around 30fps (vsync w/o triple buffering)
-v219, last version with original sounds
-sound distortion caused by Voodoo 1 bus congestion (alleviated via SST_FASTPCIRD=0; see the sketch after this list)
-A3D enabled (44.1kHz too)
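
A small sketch tying two of the notes above together; the game path below is only a placeholder, while SST_FASTPCIRD itself is the Glide environment variable mentioned in the list:

# "stat fps" reports frame time in milliseconds, so fps = 1000 / ms, and the
# SST_FASTPCIRD variable is read from the environment before the game starts.
import os
import subprocess

def ms_to_fps(frame_time_ms: float) -> float:
    """Convert the milliseconds shown by "stat fps" into frames per second."""
    return 1000.0 / frame_time_ms

print(round(ms_to_fps(33.3)))   # ~30 fps, roughly where the video hovers

# Disable fast PCI reads for the Voodoo 1 before launching, so sound data
# isn't starved on the bus (the executable path is hypothetical).
env = dict(os.environ, SST_FASTPCIRD="0")
subprocess.run([r"C:\Unreal\System\Unreal.exe"], env=env)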

Reply 112 of 185, by kool kitty89

Putas wrote:
elianda wrote:

So here we have another surprise: S3 Trio3D running 3DMark2001SE

pure win

The blending is far from a stipple alpha technique. I was wondering whether the ViRGE could use some speed hacks, like a small color look-up table to speed up the work at the expense of rounding errors. But that wouldn't make sense, since the ViRGE can do "full speed" 24-bit without artifacts. I am pretty sure the artifacts of the 16-bit dithering are multiplied by the number of surfaces drawn over. Every 3D frame drawn in 16 bit has a dither pattern with interleaving darker and brighter pixels. An alpha texture on top of that adds its own pattern and a bit of brightness, and when brighter pixels meet they are multiplied and can become almost white. When a bad enough situation occurs, the pattern may appear stipple-like.

Yes, and (again) the majority of later accelerators (and game consoles) also supported dithering in 16-bit color, both for shading and blending.
Though the quality and complexity of the dithering algorithms vary (some used error-diffused dithering, IIRC).
The PlayStation was among the first (if not the first) consumer-level products to support that feature, though it was software selectable, so plain 15-bit color without dither was also possible; that is also true of most, if not all, other GPUs with the feature (not sure if any ViRGE drivers support 16-bit without dither, though).

But again, with the ViRGE, the dithering itself isn't the only blending artifact (and the dithering does smooth things over compared to plain 16-bit); there are also the odd mattes/borders surrounding blended textures, which, again, appear to be an error in drawing 100% transparent (zero-opacity) texels.
The border errors aren't present in 32-bit rendering (and neither is dithering at all), so maybe it's a rounding error related to the 24-to-16-bit interpolation process.

The only accelerator I know of without any translucent blending support at all (even on a per-polygon/texture/line basis, with no alpha channel) is the Matrox Mystique (and related), and that's the only case (aside from software rendering, or old 2D consoles/arcade games) that uses dithered/stippled mesh effects for transparency. It really seems odd that the Mystique didn't at least support per-texture/per-polygon blending/averaging (similar to the PlayStation, Saturn, 3DO, Jaguar, etc.).

The Saturn also relied on dithered meshes in some cases, but that was a workaround for situations where the hardware blending wasn't practical (warped quads overdraw and cause blending errors, and sprite/polygon and background interaction/priority are also limited, so translucency isn't always possible in the desired manner when using both VDPs).

Reply 113 of 185, by elianda


So what's your flyby intro fps (timedemo 1) with all graphical features on, then?

Next up is some Rage II action...
I also recorded a bit from the Rage Pro Turbo, which is incredibly much faster than the Rage II. It also has the missing features plus 32-bit rendering, and it even handles 3DMark2000 and Dethkarz well.


Reply 114 of 185, by kool kitty89

elianda wrote:

So what's your flyby intro fps (timedemo 1) with all graphical features on, then?

Next up is some Rage II action...
I also recorded a bit from the Rage Pro Turbo, which is incredibly much faster than the Rage II. It also has the missing features plus 32-bit rendering, and it even handles 3DMark2000 and Dethkarz well.

The Rage Pro should definitely be faster per-clock than the Rage II for some of the heavier features (though I think plain filling/shading and unfiltered textures are close to the same).

However, there's also the difference in clock speed, but this varies by GPU revision and (maybe) by card.
Apparently, the original Rage II was clocked at 50 MHz, the II+ at 60 MHz, and the IIC at 83 MHz. The Pro seems to have been clocked at 75 or 83 MHz. (not sure if both of those speeds were used or if one is wrong)

I've also seen some mention of Rage Pro derivatives (like the XL) being listed as 100 MHz core clock (which would make them the fastest Rage-compatible GPUs -non 128), but I haven't seen definitive info on this.

Reply 115 of 185, by elianda


Powerstrip says:
My Rage IIC AGP runs at 62 MHz (no info whether core or memory) and does 147 3DMarks in 3DMark99.
The Rage Pro Turbo PCI: core 75 MHz, memory 100 MHz.

The Rage Pro feels at least 10 times faster.

As for the ViRGE and 32-bit mode: I tried Unreal in 16- and 32-bit mode with 16- and 32-bit textures.
I still see the dithered transparent textures and the squares, so basically no change at all.
As for the lighting, Unreal somehow renders no lightmaps.
If I switch on Detail Textures, everything gets a lot brighter (additive?) and I get dither hell on all closer textures; performance drops, of course.
Even if I turn the brightness down, some areas are near white. The multiple textures on a surface often flicker when moving.
Sometimes Unreal also crashes because it cannot get a required vertex buffer lock.


Reply 116 of 185, by kool kitty89

elianda wrote:

Powerstrip says:
My Rage IIC AGP runs at 62 MHz (no info whether core or memory) and does 147 3DMarks in 3DMark99.
The Rage Pro Turbo PCI: core 75 MHz, memory 100 MHz.

Putas mentioned the IIC was clocked up to 83 MHz, but it's certainly possible that some models were clocked lower too. (there was certainly a lot of underclocking with ViRGE based cards, if that's any indication)

Are both 8 MB cards? (and both SDRAM or SGRAM?)

The Rage Pro feels at least 10 times faster.

Interesting, and this is comparing both with full-quality features on?
From what I understand, the Rage Pro improved a lot of the more intensive features (especially filtered texture mapping), with single-cycle operation for most things the Rage/Rage II took many cycles for, so the difference should be much less dramatic with detail levels turned down, especially texture filtering, much like the ViRGE's speed boost when rendering unfiltered textures. I think the texture cache may also have been improved.

Even with the per-clock advantage and 75 vs 62 MHz speeds, a 10x framerate improvement seems a bit excessive. (I'd have expected more like 5x max)
In any case, it's definitely a much larger improvement than the ViRGE 325 to ViRGE DX/GX (or even 325 to GX2).
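
A back-of-envelope check of that expectation; only the 62 and 75 MHz clocks come from this thread, and the 4x per-clock factor at the end is purely an illustrative guess:

# Rough speedup arithmetic for the Rage Pro Turbo vs. Rage IIC comparison.
rage_iic_mhz = 62   # PowerStrip reading for elianda's Rage IIC
rage_pro_mhz = 75   # PowerStrip reading for the Rage Pro Turbo (core)

clock_gain = rage_pro_mhz / rage_iic_mhz
print(f"clock alone: {clock_gain:.2f}x")                         # ~1.21x

# A 10x overall speedup would imply roughly this much per-clock improvement:
print(f"per-clock gain implied by 10x: {10 / clock_gain:.1f}x")  # ~8.3x

# An assumed 4x per-clock gain (e.g. on filtered texturing) would land near
# the "5x max" expectation instead:
print(f"assumed 4x per-clock * clock: {4 * clock_gain:.1f}x")    # ~4.8x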

As for the ViRGE and 32-bit mode: I tried Unreal in 16- and 32-bit mode with 16- and 32-bit textures.
I still see the dithered transparent textures and the squares, so basically no change at all.

That's weird... it sounds like 32-bit color may not be working at all (maybe a driver issue with Unreal). From Putas's comments and screenshots, dithering isn't used at all in 32-bit mode, and the mattes are also gone. (I haven't seen any video card that dithers in 32-bit color; for that to even make sense, you'd need a GPU that rendered internally at more than 24/32-bit precision and then interpolated down to dithered 24-bit.)

As for the lighting, Unreal somehow renders no lightmaps.

Do other cards have problems with this in DirectX? (and there's no vertex lighting option either?)

If I switch on Detail Textures, everything gets a lot brighter (additive?) and I get dither hell on all closer textures; performance drops, of course.
Even if I turn the brightness down, some areas are near white. The multiple textures on a surface often flicker when moving.

That definitely sounds like additive blending... not sure why that would happen, though (unless it's supposed to be subtractive blending).
The heavy dithering makes sense in that case, since very bright/washed-out colors translate poorly from 24- to 16-bit color, resulting in very heavy dithering to approximate the subtle shades not possible in 16-bit.
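
A tiny illustration of the additive vs. alpha-blend difference; the texel values are arbitrary examples:

# Additive blending pushes bright texels to the clamp at 255, while a 50/50
# alpha blend stays in range.
def additive(dst, src):
    return min(255, dst + src)

def alpha_blend(dst, src, a=0.5):
    return round(dst * (1 - a) + src * a)

base   = 180   # already fairly bright wall texel
detail = 120   # detail texture layered on top

print(additive(base, detail))     # 255 -> clamped, near white even with brightness lowered
print(alpha_blend(base, detail))  # 150 -> stays mid-range
# Large washed-out areas are full of subtle bright shades that 16-bit can't
# represent exactly, hence the very heavy dithering described above.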

Putas, you've run Unreal with your ViRGE MX, right? How did that compare with elianda's video?

Reply 117 of 185, by elianda


Well, I tried a bit more. There is a newer patch for Unreal that also has a D3D8 and a D3D9 renderer, but those didn't work.
There seems to be no option for vertex lighting in D3D, just in SGL.
Detail Textures work so far, and the solution for the dithering problem is to set the color depth to 24 bits. Now the texture quality is really good.
(I guess it couldn't initialize a 32-bit mode before and dropped back to 16 bit.)
For the texture flickering that starts to occur at 640x480, UseVideoMemoryVB=True has to be set.
Still no transparent leaves on the trees, though.

With all settings on plus blob shadows, the card delivers about 4 fps at 512x384.


Reply 118 of 185, by SquallStrife

swaaye wrote:

Voodoo1 (Righteous 3D) running Unreal v219.
http://www.youtube.com/watch?v=LPocZ-FX8SU

-"stat fps" enabled : 1000/#ms = fps
-seems to hover around 30fps (vsync w/o triple buffering)
-v219, last version with original sounds
-sound distortion caused by Voodoo 1 bus congestion (alleviate via SST_FASTPCIRD=0)
-A3D enabled (44.1kHz too)

I think 30ish FPS is about as good as it gets on Voodoo 1.

I upgraded my "Not a 486 anymore" box to a K6-2+ 500, and the framerate in GLQuake is exactly the same as when it had a K6 200, around 25~30.

VogonsDrivers.com | Link | News Thread

Reply 119 of 185, by elianda


Ok.

After my recheck of the Trio3D, I found a SiS 6326 PCI with 4 MB; PowerStrip reports an 83 MHz memory clock.
The chip seems to have all the features; it even runs 3DMark2001, with just a few texture glitches.
Performance is at the very low end, though:
103 3DMarks in 3DMark2001SE at 640x480x16
and 69 3DMarks in 3DMark99 (for comparison, the ATI Rage IIC above made 147 in 3DMark99).
The fillrate test shows no multitexturing.
