VOGONS


First post, by mzry

User metadata
Rank Member
Rank
Member

Hey guys,

I have read that in SLI mode the V2 has its special filter disabled, and in practice I can tell. For instance, in dark games like Deus Ex, the dithering and dark shades are all dotty/pixelish instead of smooth with that classic 'Voodoo' look.

I was wondering if there was ever a way around this. I have tried forcing the 24bpp rendering mode etc. without any difference. I am using Koolsmoky's best and last drivers on WinXP.

Thx

Reply 1 of 10, by leileilol

User metadata
Rank l33t++
Rank
l33t++

V2 never had box filtering. That starts on Banshee/V3

You're probably describing "dither subtraction": for blended textures, the 4x4 dither matrix is subtracted from the destination blend, which is then blended on top of a source that gets dithered again, causing a 'smooth big 2x2 waffle' effect. UnrealEngine1 games in Glide do not use dither subtraction and always go for a 2x2 dither pattern instead. The OpenGL games that support the old 3dfx MiniGL driver (GLQuake, Quake 2, Half-Life, etc.) do use that 4x4 dither subtraction, so check those.

Also, it's already been noted that many third-party V2 drivers muck around with the variables in annoying ways, going as far as removing the 4x1 filter completely, and maybe the dither subtraction too (among other things). At least avoid FastVoodoo if you want a more authentic look.

apsosig.png
long live PCem

Reply 2 of 10, by mzry

User metadata
Rank Member
Rank
Member

Yeah, thanks; that's exactly what I mean. Unreal games are nicely smoothed on V3 or V5 hardware though, is that because of their box filter? Thx
PS: Koolsmoky is the most legendary driver guy for the Voodoo. His drivers are fully updated with the open-source Glide, and he also fixed the Windows-side MiniGL port, removing dirty hacks from the original 2k V3 driver package etc. In my opinion anyone with a V2 on 2k or XP should use these.

Reply 3 of 10, by mzry

User metadata
Rank Member
Rank
Member
leileilol wrote:

V2 never had box filtering. That starts on Banshee/V3

You're probably describing "dither subtraction".

After further investigation I'm no longer sure whether it is dither subtraction or not, based on this info from my driver's readme:

"Alphablending dither subtraction is disabled by default. Set SSTH3_ALPHADITHERMODE = 3 to enable dither subtraction;
4x4 dither matrics is used when alphablending dither subtraction is enabled. Other wise it's always 2x2"
source: http://www.3dfxzone.it/dir/news/3dfx/koolsmok … kit_27_12_2009/
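For anyone following along, that variable is set the same way as the other SST environment variables, e.g. in autoexec.bat or the game's batch file (sketch):

```
SET SSTH3_ALPHADITHERMODE=3
```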

I tried this and it did not make the screen nice and smooth in either Unreal or MiniGL Quake. I still get the 'dotty' dithering in dark areas, not the 'smooth paste' effect I remember from Voodoos.

Check the screenshots in this thread:
http://www.3dfxzone.it/enboard/topic.asp?TOPI … 839&whichpage=2

He shows dither subtraction on and off. Neither screenshot has the 'pastey, smooth Voodoo style' either way. But he does mention that 24bpp mode is not supported on V2 SLI: "you should try SSTV2_VIDEO_24BPP=1,it improves the IQ quite a bit,but its not compatible with SLI."

So perhaps the smoothness I am missing is the 24bpp mode I can't get in SLI. What do you think?

Reply 4 of 10, by leileilol

User metadata
Rank l33t++
Rank
l33t++

On stock V2 drivers I've only ever noticed VIDEO_24BPP enable the 4x1 filter when set to 1 and disable it when set to 0 (which confuses me, as there's already a separate filter-disable variable). Peeking in the Glide source code shows it disables itself completely in SLI:

        if(altVideoTiming == (sst1VideoTimingStruct *) NULL) {
            // Determine when cannot output 24bpp video...
            // Cannot run at high frequencies across SLI connector...
            if(sst1CurrentBoard->sliDetected &&
               sstVideoRez->clkFreq24bpp > 90.0F)
                sst1CurrentBoard->fbiVideo16BPP = 1;
        }

apsosig.png
long live PCem

Reply 5 of 10, by mzry

User metadata
Rank Member
Rank
Member

It's surprising that you don't really see other V2 SLI owners talking about this. Having that distinctive 3dfx smooth filter style adds an extra level of nostalgia for me. Perhaps the effect is generated in the RAMDAC, and doubling it up across two cards would look horrid; just a guess.

Reply 6 of 10, by subhuman@xgtx

User metadata
Rank Oldbie
Rank
Oldbie
leileilol wrote:

On stock V2 drivers I've only ever noticed VIDEO_24BPP enable the 4x1 filter when set to 1 and disable it when set to 0 (which confuses me, as there's already a separate filter-disable variable). Peeking in the Glide source code shows it disables itself completely in SLI:

        if(altVideoTiming == (sst1VideoTimingStruct *) NULL) {
            // Determine when cannot output 24bpp video...
            // Cannot run at high frequencies across SLI connector...
            if(sst1CurrentBoard->sliDetected &&
               sstVideoRez->clkFreq24bpp > 90.0F)
                sst1CurrentBoard->fbiVideo16BPP = 1;
        }

I remember trying this variable a few years ago: my then dual-card setup would work fine at 800x600x85 and below, but either the DACs or the cards themselves would crap out at 1024x768 (my monitor's OSD would throw an out-of-range error at 42 Hz).

Thanks for the valuable info. That may explain why 1024 always seemed grainy to my eyes 😀

7fbns0.png

tbh9k2-6.png

Reply 7 of 10, by mzry

User metadata
Rank Member
Rank
Member
subhuman@xgtx wrote:

I remember trying this variable a few years ago and noticing my then dual card setup would work fine at 800x600x85 downwards but having either the DACs or the cards themselves crap out at 1024x768 (my monitor's osd would throw an out of range error at 42hz).

Do you still know how to do this? I wouldn't mind trying it at 800x600.

Reply 8 of 10, by subhuman@xgtx

User metadata
Rank Oldbie
Rank
Oldbie
mzry wrote:
subhuman@xgtx wrote:

I remember trying this variable a few years ago and noticing my then dual card setup would work fine at 800x600x85 downwards but having either the DACs or the cards themselves crap out at 1024x768 (my monitor's osd would throw an out of range error at 42hz).

Do you still know how to do this? I wouldn't mind trying it at 800x600.

Put SET SSTV2_VIDEO_24BPP=1 in your autoexec.bat or game batch file.

It's an environment variable.

7fbns0.png

tbh9k2-6.png

Reply 10 of 10, by subhuman@xgtx

User metadata
Rank Oldbie
Rank
Oldbie
mzry wrote:

Oh I thought you had edited the glide source code or something. That variable doesn't work for the v2 sli as the source code keeps it disabled as shown above.

It works, but my memory seems to be a little hazy. Open Regedit and go to HKEY_LOCAL_MACHINE\SOFTWARE\3Dfx Interactive\Voodoo2\

then select the subkey called GLIDE.

Create a DWORD named SSTV2_VIDEO_24BPP with a value of 1. After that, move to the D3D subkey and repeat the same process.

That's how most Windows games will recognise it.
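A quick way to add both keys at once, sketched as a .reg file you can import with Regedit (assuming the hive paths above):

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\3Dfx Interactive\Voodoo2\Glide]
"SSTV2_VIDEO_24BPP"=dword:00000001

[HKEY_LOCAL_MACHINE\SOFTWARE\3Dfx Interactive\Voodoo2\D3D]
"SSTV2_VIDEO_24BPP"=dword:00000001
```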

7fbns0.png

tbh9k2-6.png