VOGONS


First post, by mzry

User metadata
Rank Member

Heya,

I was really keen to try forcing Glide to 32bit (one of the main reasons I wanted a 5500), but from what I can see the option doesn't have any effect. I am using a V5500 with XP and Amigamerlin 3.1 R11 drivers (SFFT doesn't work for me; I just get a corrupted 2D desktop when I install it).

I am using Unreal as my test game, and here is how I can tell there is no difference: with 32bit enabled, colour banding around light sources should be completely smooth, and dark areas should be smooth without the classic 3dfx dithering. When I force 32bit it makes no difference to the image quality, banding or dithering, and running Timedemo 1 shows zero difference in performance.

I've tried forcing the option in both 3dfx tools and vctrl.

Have any other voodoo5 owners ever experienced this?

Thx

Last edited by mzry on 2016-06-15, 13:05. Edited 2 times in total.

Reply 1 of 18, by PhilsComputerLab

User metadata
Rank l33t++

I've done this recently, but with the 1.04.00 reference drivers.

I think I tried it in Unreal Tournament and the FPS certainly went down.


Reply 2 of 18, by mzry

User metadata
Rank Member

Okay, I finally found the solution, just for other people's info: any Glide DLL that ISN'T official 3dfx will not render at 32bit when forced. I was using some of the newer Glide3x DLLs from the Glide open source project, and 32bit rendering does not work with these (I wonder if they even know about this; maybe I should try to contact them).

Reply 4 of 18, by szo

User metadata
Rank Newbie

Unreal uses the glide2x API, and glide2x doesn't support 32-bit colour, only 16-bit. The glide3x API can do 32-bit if the hardware supports it (i.e. Voodoo4/5). Since glide2x and glide3x are not interchangeable, you can't get 32-bit rendering from Unreal+Glide.
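
For anyone curious, this is roughly what the difference looks like at the API level. A minimal sketch going from the 3dfx Glide 3.x (Napalm) SDK headers as I remember them, so treat the exact names as illustrative:

    /* Glide 3.x (Napalm) lets the application ask for a 32-bit framebuffer via an
     * extra pixel-format argument. Names below are from the 3dfx Glide3 SDK as I
     * recall them; only VSA-100 boards (Voodoo4/5) accept GR_PIXFMT_ARGB_8888. */
    #include <glide.h>

    GrContext_t open_32bit_context(FxU32 hWnd)
    {
        grGlideInit();
        grSstSelect(0);
        return grSstWinOpenExt(hWnd,
                               GR_RESOLUTION_800x600,
                               GR_REFRESH_60Hz,
                               GR_COLORFORMAT_ABGR,
                               GR_ORIGIN_UPPER_LEFT,
                               GR_PIXFMT_ARGB_8888, /* 32-bit colour buffer */
                               2,                   /* colour buffers */
                               1);                  /* aux (depth) buffer */
    }

    /* A glide2x title calls plain grSstWinOpen(), which has no pixel-format
     * argument at all: the framebuffer is always 16-bit (RGB565), so nothing in
     * the Glide2 API lets the game itself request 32-bit output. */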

Reply 5 of 18, by mzry

User metadata
Rank Member

Really sorry szo, but I'm 100% sure I am correct here. If I replace the Glide3x DLL with the 3dfx version instead of the Open Glide version, I clearly see 32bit colour in Unreal when it's enabled. It can clearly be seen on distant coronas, which are rendered perfectly transparent in 32bit but are 'dotty' in 16bit. There is also about a 20-30% performance dip when 32bit is enabled.

The quality difference is subjective, while the performance difference and the different behaviour of each Glide DLL are objective, so that gives me two objective and one subjective pieces of evidence to support my claims here.

Edit: I have added proof of 32bit colour in Unreal:

16bit Colour
32bit Colour

Pay attention to the smooth coronas in 32bit versus the dotty, badly filtered coronas in 16bit. You can also notice that the dithering on the mountain surfaces is much cleaner in 32bit mode.

Reply 6 of 18, by PhilsComputerLab

User metadata
Rank l33t++

I can also confirm the performance drop in Unreal Gold. I didn't know what to look for regarding the colour quality, but I will pay attention next time.


Reply 7 of 18, by skaarj

User metadata
Rank Newbie

Unreal's GlideDrv.dll depends on glide2. This can be confirmed with Dependency Walker. Since the game is not linked against glide3, changing the glide3 library should have no effect on the game or its video output.
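
If you don't have Dependency Walker handy, a quick-and-dirty alternative is to load GlideDrv.dll yourself and see which Glide runtime gets dragged into the process. Just a sketch, assuming you run it from Unreal's System directory so the DLLs are on the search path:

    /* Quick check of which Glide runtime GlideDrv.dll is statically linked against.
     * If loading it drags glide2x.dll into the process, GetModuleHandle will find it. */
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        HMODULE drv = LoadLibraryA("GlideDrv.dll");
        if (!drv) {
            printf("Could not load GlideDrv.dll (error %lu)\n", GetLastError());
            return 1;
        }
        printf("glide2x.dll pulled in: %s\n",
               GetModuleHandleA("glide2x.dll") ? "yes" : "no");
        printf("glide3x.dll pulled in: %s\n",
               GetModuleHandleA("glide3x.dll") ? "yes" : "no");
        FreeLibrary(drv);
        return 0;
    }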

Reply 8 of 18, by mzry

User metadata
Rank Member

Well, I've posted visual and statistical proof that glide3 affects Unreal. That's all there is to it. Perhaps glide2 works in conjunction with glide3 in this instance; I'm not sure, but my evidence is irrefutable.

Reply 11 of 18, by Jade Falcon

User metadata
Rank BANNED

Just FYI, Amigamerlin is an older, tweaked version of SFFT.
Also, what version of SFFT were you using?

Last edited by Jade Falcon on 2016-06-16, 16:52. Edited 1 time in total.

Reply 12 of 18, by mzry

User metadata
Rank Member

I have successfully tested 32bit on both AmigaMerlin V3.1-R11 and the official V1.04.00 reference drivers. AmigaMerlin will only work if its Glide3x is replaced with the original version from the reference drivers, though. I can never get SFFT to work; I've been unsuccessful on about four occasions across various retro builds, so now I avoid it like the plague. AmigaMerlin only uses the Direct3D core from SFFT, which is actually excellent.

I have a theory that when 'force 32bit in Glide' is enabled, there might be some communication between the Glide libraries. I am just guessing, but as the other gentlemen in this thread say, Unreal only uses Glide 2, so it really makes me wonder why the Glide 3 DLL is making or breaking this functionality. It is a real mystery.

MAYBE, when force 32bit is enabled, the drivers route any Glide2 calls from games through Glide 3? On the game's end it would look like it's still going through Glide 2, but in fact it's not. Crazy idea, I know, but it's the only explanation I can think of.

Reply 13 of 18, by mzry

User metadata
Rank Member

Okay, I have solved the mystery: my Unreal IS running on Glide 3 through a translator DLL. The Glide2x.dll I was using identifies itself as a 'Glide2 to Glide3 Translator DLL', Copyright 2003 Ryan Nunn, and was included with the Amigamerlin drivers. This explains why using different versions of Glide3x.dll was causing 32bit to work or not work, and it also explains how I am able to get 32bit colour from a Glide 2 based game.
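
For anyone wondering how a setup like that can work at all, here is a purely illustrative sketch of the technique. This is NOT Ryan Nunn's actual code; the wrapper name and the force-32bit flag are made up for the example:

    /* Illustrative only -- NOT Ryan Nunn's code. A stand-in glide2x.dll exports the
     * Glide2 entry points but implements them on top of the real glide3x.dll, which
     * is where a "force 32bit" setting could take effect even for a Glide2 game. */
    #include <glide.h>   /* Glide 3.x headers */

    /* Hypothetical forced-32bit switch the wrapper could read from the driver's
     * control panel settings. */
    static FxBool force_32bit = FXFALSE;

    /* Exported under the Glide2 name/signature so GlideDrv.dll can bind to it. */
    FxBool __stdcall wrapped_grSstWinOpen(FxU32 hWnd,
                                          GrScreenResolution_t res,
                                          GrScreenRefresh_t refresh,
                                          GrColorFormat_t cformat,
                                          GrOriginLocation_t origin,
                                          int nColBuffers,
                                          int nAuxBuffers)
    {
        /* The wrapper, not the game, picks the real pixel format. */
        GrPixelFormat_t pixfmt = force_32bit ? GR_PIXFMT_ARGB_8888
                                             : GR_PIXFMT_RGB_565;
        GrContext_t ctx = grSstWinOpenExt(hWnd, res, refresh, cformat, origin,
                                          pixfmt, nColBuffers, nAuxBuffers);
        return ctx != 0 ? FXTRUE : FXFALSE;
    }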

I'll upload the DLLs for anyone who wants to try them. I'll include the Glide2 translator DLL and the original 3dfx Glide3, which allows 32bit forcing to work.

Google Drive link

Reply 14 of 18, by mzry

User metadata
Rank Member

Update 2: the mystery is still not 100% solved. I just replaced both Glide2 and Glide3 with the original versions from the reference driver, and 32bit colour forcing still works. Like before, my only explanation is that the drivers are either forcing Glide2 into 32bit rendering or redirecting all the Glide2 calls to Glide3 calls.

I can still confirm 32bit colour 100% working through image quality and FPS tests. 16bit timedemo: 108fps, 32bit timedemo: 78fps.

Reply 15 of 18, by Jade Falcon

User metadata
Rank BANNED

Once more, what version of SFFT did you use? And on which OS?
I helped with dev work on a few older versions of SFFT and never ran into a problem where it simply did not work. Other than failing cards and ID-10T errors, I've never seen anyone unable to use the newer versions of SFFT.
I find it quite odd that you can use AmigaMerlin and not SFFT.

Reply 17 of 18, by Stiletto

User metadata
Rank l33t++
mzry wrote:

Okay, I have solved the mystery: my Unreal IS running on Glide 3 through a translator DLL. The Glide2x.dll I was using identifies itself as a 'Glide2 to Glide3 Translator DLL', Copyright 2003 Ryan Nunn, and was included with the Amigamerlin drivers. This explains why using different versions of Glide3x.dll was causing 32bit to work or not work, and it also explains how I am able to get 32bit colour from a Glide 2 based game.

I'll upload the DLLs for anyone who wants to try them. I'll include the Glide2 translator DLL and the original 3dfx Glide3, which allows 32bit forcing to work.

Google Drive link

Ah, that's the Glide 2 Translator: https://wenchy.net/old/glidexp/

I always thought that was neat and that more people should mess around with it. 😀

"I see a little silhouette-o of a man, Scaramouche, Scaramouche, will you
do the Fandango!" - Queen

Stiletto

Reply 18 of 18, by Jade Falcon

User metadata
Rank BANNED
mzry wrote:

I already told you I'm not using sfft.

In your OP you stated that you tried SFFT; sorry, that confused me. I still think there is something wrong with your card or system if you can't use SFFT with it, unless you were using an old beta/alpha version.