leileilol wrote on 2025-11-18, 10:26:
DirectX 6 was about multitexturing than anything else. One game where V3 beats V2 at on this is Max Payne where the V2 can't render the decals and tags properly. There's also the virtue of being a primary video device which helps compatibility with some later (2000-2002) games that a Voodoo2 can't start at all.
- But the V2 also supports multitexturing (in its case, two texture maps done in a single pass). So then why is the V2 considered only fully compatible with DX5, while the V3 is fully DX6?
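To make sure we're talking about the same thing, here's a tiny, purely conceptual C sketch of what "two texture maps in a single pass" means arithmetically. This is not real Glide or Direct3D code and the texel values are made up: a dual-TMU V2 can do this kind of combine per pixel before the write, while a single-TMU card has to draw the polygon twice and blend the second pass against the framebuffer. (As I understand it, DX6 is also where Direct3D gained its texture-stage API for exactly this sort of single-pass combine.)

/* Conceptual sketch only -- not real Glide/Direct3D code.
 * Shows what combining a base texture with a lightmap in a
 * single pass boils down to, per pixel. */
#include <stdio.h>

typedef struct { unsigned char r, g, b; } Texel;

/* Modulate: base * lightmap, the typical Quake-style lightmap blend. */
static Texel modulate(Texel base, Texel light)
{
    Texel out;
    out.r = (unsigned char)((base.r * light.r) / 255);
    out.g = (unsigned char)((base.g * light.g) / 255);
    out.b = (unsigned char)((base.b * light.b) / 255);
    return out;
}

int main(void)
{
    Texel wall = { 200, 180, 160 };   /* hypothetical base texture sample */
    Texel lmap = { 128, 128, 128 };   /* hypothetical lightmap sample */
    Texel pixel = modulate(wall, lmap);
    printf("single-pass result: %u %u %u\n", pixel.r, pixel.g, pixel.b);
    return 0;
}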
leileilol wrote on 2025-11-18, 10:26:
"22-bit" is marketing. The filters are as i've described in 2018 in this very thread though i've neglected to mention the 2x2 box filter - which isn't a miracle either, and isn't supported on all resolutions. The same filters are on the Banshee.
I read your 2018 post, but I still don't quite understand.
You said:
"Maybe the filter's an advantage.
Voodoo2's 4x1 filter is actually a single pass which doesn't have the feedback nor lines of the Voodoo Graphics. V3's based on the Voodoo Graphics and carries the 4 pass 4x1 filter."
Can you please expand on that? Are you referring to texture filtering? (I have no clue how the V3 does texture filtering compared to the V2, or which one is better... and from your old comment I deduced that it is worse on the V3.)
Also, the 22-bit color "cheat", from what I've read over the years, is done after processing (probably in the DAC stage), and it basically reduces color banding by mixing the nearest colors so there are no visible "borderlines" between shades. Which, to me, seems like a genius idea. And because it's done on the output side, the effect couldn't be seen in screenshots, only with a capture device or directly on the monitor (Phil has some great screen captures here: https://www.philscomputerlab.com/voodoo-3-22-bit-output.html).
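To illustrate my (possibly wrong) understanding of the principle, here's a toy C sketch of a hypothetical 4-pixel horizontal box filter applied to a dithered 16-bit scanline on its way to the screen. This is not 3dfx's actual post-filter (which, going by your description, is a 4x1 / 2x2 affair with its own quirks); it just shows how averaging neighbouring dithered pixels produces an in-between shade without the framebuffer ever storing more than 16 bits per pixel.

/* Rough illustration only -- not 3dfx's actual post-filter. */
#include <stdint.h>
#include <stdio.h>

/* Expand the 5/6/5 channels of an RGB565 pixel to 8 bits each. */
static void rgb565_to_888(uint16_t p, int *r, int *g, int *b)
{
    *r = ((p >> 11) & 0x1F) * 255 / 31;
    *g = ((p >> 5)  & 0x3F) * 255 / 63;
    *b = ( p        & 0x1F) * 255 / 31;
}

/* Hypothetical 4x1 horizontal box filter over one scanline. */
static void filter_scanline(const uint16_t *in, int width, uint8_t *out_rgb)
{
    for (int x = 0; x < width; x++) {
        int sr = 0, sg = 0, sb = 0;
        for (int k = 0; k < 4; k++) {
            int xi = (x + k < width) ? x + k : width - 1;  /* clamp at the edge */
            int r, g, b;
            rgb565_to_888(in[xi], &r, &g, &b);
            sr += r; sg += g; sb += b;
        }
        out_rgb[x * 3 + 0] = (uint8_t)(sr / 4);
        out_rgb[x * 3 + 1] = (uint8_t)(sg / 4);
        out_rgb[x * 3 + 2] = (uint8_t)(sb / 4);
    }
}

int main(void)
{
    /* Two alternating dithered greens; the filtered output lands in between. */
    uint16_t line[8] = { 0x03E0, 0x0400, 0x03E0, 0x0400,
                         0x03E0, 0x0400, 0x03E0, 0x0400 };
    uint8_t out[8 * 3];
    filter_scanline(line, 8, out);
    printf("filtered pixel 0: %u %u %u\n", out[0], out[1], out[2]);
    return 0;
}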
leileilol wrote on 2025-11-18, 10:26:
And finally, a V3 will clobber a V2 SLI at Quake3!!!!!!!!!!!!!!! Forsaken bars cope mean nothing there!!!! Why even SLI!? "1024x768"??? a v3 can do 1920x1440 lmao
Yes, I know of the V2's shortcomings with newer games and high resolutions. But in my case (and likely in many others' too, as I've seen around here), I'm gonna be using the 3dfx card as a secondary. So it's either gonna be a V2 or a V3-PCI, alongside a GeForce2...
Thing is, I'm way more concerned with stuff like the cards' rendering quality and compatibility across the three Glide generations (glide, glide2x, glide3x), since any game that looks better in D3D will more likely end up running on the GF2 anyway.
So, if I'm not mistaken, glide3x might have some extra eye candy that the V2 can't do, so glide3x games would have to be run in glide2x mode, with uglier graphics. Is this true?
Also, if the V2 doesn't do the 22-bit color trick, then the V3 clearly has the advantage of retroactively eliminating color banding in every 3dfx game. So that also seems like a big plus.
On the other hand, the V3 doesn't do that great in 3dfx DOS titles.
It will also eat up another precious IRQ (which I might not be able to spare easily, as I'm also aiming to have 4 sound cards, with 3 of them already in).
And it will be more difficult to run alongside another video card in Win98 (compared to the V2, which can run at the same time as the GF2 and doesn't require an IRQ).
So, if possible, I would really love to know all the differences between the V2 and V3 in terms of compatibility, rendering/image quality, and visuals, plus any other possible pluses and minuses when it comes to actually playing Glide games, as this is what will ultimately dictate the direction this build takes.
P.S. I know the pass-through cable can introduce noise, and that the V2 also has a "dirtier" image output even when connected directly; that a pair of V2s in SLI is needed to reach 1024x768 and they'll suck up three times the wattage of a V3 2000 PCI; plus the added heat, no windowed games, drivers that kinda suck... etc., etc. I know 😀