First post, by kjliew
Hi Dege,
I know this is crazy, but I did find a game that calls grTexCalcMemRequired() with an insane texture format encoding:
glidept: _grTexCalcMemRequired 8, 2, 3, 1347241300
3Dfx glide.h defines texture format encodings in the range 0x00 to 0x0F, but this game was passing a huge number. This caused dgVoodoo2 to fault internally. I found that both OpenGlide and the Glide2x source code that 3Dfx released for Linux are coded to tolerate such abuse. Basically, the function relies on the encoding convention that any texture format >= 0x08 is a 16-bit texture, and anything below that is an 8-bit texture. So even when an insanely huge number is passed as the texture format, it is simply treated as a 16-bit texture and the texture size is still computed accordingly.
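For illustration, here is a minimal sketch of that convention in C. This is not the actual OpenGlide or 3Dfx code (the real routines also walk the whole LOD chain and account for the aspect ratio); it only shows the tolerant format check:

#include <stdio.h>

#define GR_TEXFMT_16BIT 0x08  /* per glide.h: formats >= 0x08 are 16 bits per texel */

/* Tolerant bytes-per-texel: an out-of-range value falls into the 16-bit
   branch instead of indexing some table out of bounds and faulting. */
static unsigned bytes_per_texel(unsigned format)
{
    return (format >= GR_TEXFMT_16BIT) ? 2 : 1;
}

int main(void)
{
    printf("fmt 0x03       -> %u byte(s)/texel\n", bytes_per_texel(0x03));
    /* The bogus format from the log above is still sized as a 16-bit texture. */
    printf("fmt 1347241300 -> %u byte(s)/texel\n", bytes_per_texel(1347241300u));
    return 0;
}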
Perhaps dgVoodoo2 should emulate this behavior as well, to be more forgiving of such programming errors. In 3Dfx Glide2x, grTexCalcMemRequired() simply calls _grTexTextureMemRequired() to do the actual texture memory calculation, and judging from the stack depth GDB showed when it caught the fault inside dgVoodoo2, I guess you probably did the same. So the real fix would be in _grTexTextureMemRequired(), so that the texture memory computation tolerates any texture format encoding regardless of which entry point is used.
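As a rough sketch of what I mean, with names borrowed from the 3Dfx Glide2x source (dgVoodoo2's internals may of course look different): both public entry points funnel into one helper, so making that helper tolerant fixes them both at once.

/* Sketch only: the texel counting is elided, since only the format
   handling matters here.  Types loosely follow glide.h. */
typedef unsigned int   FxU32;
typedef int            GrLOD_t;
typedef int            GrAspectRatio_t;
typedef unsigned int   GrTextureFormat_t;

typedef struct {
    GrLOD_t           smallLod;
    GrLOD_t           largeLod;
    GrAspectRatio_t   aspectRatio;
    GrTextureFormat_t format;
} GrTexInfo;

#define GR_TEXFMT_16BIT 0x08

/* Shared helper: the only place that looks at info->format. */
static FxU32 _grTexTextureMemRequired(FxU32 evenOdd, const GrTexInfo *info)
{
    FxU32 texels = 0; /* ...sum over the mipmap chain, omitted for brevity... */
    (void)evenOdd;
    /* Tolerant check: any value >= 0x08, even 1347241300, is sized as 16-bit. */
    return (info->format >= GR_TEXFMT_16BIT) ? texels * 2 : texels;
}

/* grTexCalcMemRequired() just repacks its arguments and defers to the helper,
   so it inherits the tolerant behavior for free. */
FxU32 grTexCalcMemRequired(GrLOD_t lodmin, GrLOD_t lodmax,
                           GrAspectRatio_t aspect, GrTextureFormat_t fmt)
{
    GrTexInfo info;
    info.smallLod    = lodmin;
    info.largeLod    = lodmax;
    info.aspectRatio = aspect;
    info.format      = fmt;
    return _grTexTextureMemRequired(0, &info);
}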