In some ways, it would be good if there were some kind of "legacy DirectX" and "legacy OpenGL" wrapper that translated older, deprecated or changed DirectX and OpenGL calls to modern ones and fixed things up along the way. It could also let you force higher resolutions while the game still thinks it is running at a lower one, which would help with interface scaling.
Sadly I don't know of any such projects.
To see the issue easily, go to the hotel level (the Mr Wulf level) and go in the main doors, then head all the way to the right and look out of the windows onto the street. Instead of seeing the pavement properly, the geometry outside gets clipped away.
It also appeared to be forced to 16-bit rather than 32-bit colour, or at least the lighting around light sources and in certain areas was a lot more "rainbow" (banded) than in OpenGL / Direct3D mode.
Yes, that is the correct level! The windows I mean are to the far right: go past the reception and through the main doors into the corridor with the guest rooms. The windows are in there. You can also go to the end of the corridor, and into people's rooms themselves, to see the effect.
Here is the location: http://img163.imageshack.us/img163/4601/maplocation.png
Here is the clipping issue (in d3d and ogl): http://img585.imageshack.us/img585/2513/clippingc.jpg
Another solution would be an OpenGL wrapper whose only job is to wrap the resolution, so that you can "force" resolutions from outside the game, simply providing a way to "fake" interface scaling support.
Or does what I say sound mad?
Well, thanks, I will visit that window!
The fact that 1280*960 is exactly double means it should scale fine, and result in a much more readable interface. So something in the scaling is not working right.
As for the rainbow textures, I still say it's because of 16-bit textures; it's not an issue.
Yer. Strange how it is for OpenGL too.

I visited the window. Indeed, it has some kind of clipping issue with D3D. Must be a bug in the D3D renderer. However, with 3Dfx it's OK.
I think resolution scaling couldn't be done for either DirectX or OpenGL (in general). The problem is that these APIs support offscreen buffers, which can be copied into the rendering buffer either by texturing or by a raw copy. If a raw copy is used, the offscreen size has to be scaled too. But how would a wrapper know what purpose an offscreen buffer is allocated for? It can be indeterminable. The only exception is classic OpenGL 1.x, which does not support frame buffers (offscreens) but only 2-3 rendering buffers with a depth buffer, as Glide does.
Hmm, I kind of understood your pixel centre stuff (though not exactly),
This is probably very ignorant of me, but if you just scaled the offscreen buffer too, wouldn't that be okay?
Sure, you would also be scaling the textures, as you say, but that wouldn't necessarily be a problem?
Or would it then try to read those textures back using only the coordinates of the original size, meaning you only get part of the texture?