VOGONS


First post, by kjliew

Rank: Oldbie

Hi Dege,
I know this is crazy, but I did find a game that calls _grTexCalcMemRequired() with an insane texture format encoding.

 glidept: _grTexCalcMemRequired 8, 2, 3, 1347241300 

3Dfx glide.h defines texture format encodings in the range 0x00 to 0x0F, but this game was passing a huge number, which caused dgVoodoo2 to fault internally. I found that both OpenGlide and the Glide2x source code that 3Dfx released for Linux are coded to tolerate such abuse. Basically, the calculation takes advantage of the encoding convention that any texture format >= 0x08 is a 16-bit texture and anything below is an 8-bit texture. So even with an insanely huge number passed as the texture format, it is simply assumed to be a 16-bit texture and the texture size is still computed accordingly.
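
Something like this is all it takes; a minimal sketch with a hypothetical texelBytes() helper, not the actual Glide2x/OpenGlide code:

    /* 3Dfx glide.h: formats 0x00..0x07 are 8-bit, 0x08..0x0F are 16-bit. */
    static int texelBytes(unsigned int format)
    {
        /* No range check on purpose: an out-of-range value such as
           1347241300 falls into the ">= 0x08" bucket and is treated as a
           16-bit texture instead of indexing past the end of a table. */
        return (format >= 0x08) ? 2 : 1;
    }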

Perhaps dgVoodoo2 should emulate this behavior as well, to be more tolerant of such programming errors. In 3Dfx Glide2x, _grTexCalcMemRequired() simply calls _grTexTextureMemRequired() to do the actual texture memory calculation. I guess you probably did the same, judging by the stack depth in GDB when it caught the fault inside dgVoodoo2. So the real fix would be in _grTexTextureMemRequired(): make the texture memory computation tolerant of any texture format encoding, regardless of which entry point is used.
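
To illustrate the call chain, roughly (the simplified signatures here are mine, not the real Glide2x prototypes, which take GrLOD_t, GrAspectRatio_t and so on):

    /* The tolerant logic lives in one place (reusing texelBytes() from
       the sketch above)... */
    static unsigned long texTextureMemRequired(int width, int height,
                                               unsigned int format)
    {
        return (unsigned long)width * height * texelBytes(format);
    }

    /* ...and every entry point that funnels through it is covered: */
    static unsigned long texCalcMemRequired(int width, int height,
                                            unsigned int format)
    {
        return texTextureMemRequired(width, height, format);
    }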

Reply 1 of 3, by Dege

Rank: l33t

Hi,

Yes, you're right. I'll include the fix in the next WIP.
But only for Glide 1/2, because the Glide3 source does some (bad) validation of texture formats and calculates with 0 bytes/texel for invalid ones.
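
Roughly this kind of handling (an illustrative sketch, not the actual Glide3 code; the real code also covers additional formats):

    /* Invalid formats resolve to 0 bytes/texel instead of faulting, so
       the computed memory requirement for them is simply 0. */
    static const unsigned char fmtBytes[0x10] = {
        1, 1, 1, 1, 1, 1, 1, 1,   /* 0x00..0x07: 8-bit formats  */
        2, 2, 2, 2, 2, 2, 2, 2    /* 0x08..0x0F: 16-bit formats */
    };

    static unsigned int texelBytesGlide3(unsigned int format)
    {
        return (format < 0x10) ? fmtBytes[format] : 0;
    }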

Reply 3 of 3, by Dege

Rank: l33t

The plain final x64 build indeed doesn't work; I found a bad define in the build chain ( 😕 how did that get there?). The spec-version x64, however, works.
Does it work for you too? I'm asking because I'm curious whether there is some additional problem beyond the one I found. If not, I'll re-upload WIP54 with the fixed version.