VOGONS


First post, by Zup

User metadata
Rank Oldbie

I've been trying to make X-Plane 7 work on my not-so-new main rig (Ryzen 3 with a 1050 Ti). The solution was using nvidia profile inspector to limit GL extensions, but that is limited to people with nvidia cards. So I've been thinking about an OpenGL "limiter" that can "fake" having an older video card for games that don't cooperate.

Is there any program that does the same? If not, here are some ideas about it (treat it like a wish list):

The limiter should fake the OpenGL version, the list of extensions and the memory reported to games. Every other function could be passed directly to the system OpenGL (we don't need to deactivate modern functions, only fake that they're not present). Instead of having a long list of extensions to deactivate, it could have some pre-defined profiles (like the cards listed in dgVoodoo 2). So if we choose an nVidia GeForce4 Ti 4800, it should report the same capabilities as that card (the list may be copied from a real card*). After selecting a card, some checks should be done:
- The OpenGL version of the real card is equal to or higher than the faked card's version.
- The real card's memory is equal to or larger than the faked card's memory.
- Every GL extension on the faked card is available on our card**.

If everything checks out, we can safely report that our card is the faked one; as I said, it's safe to assume that the application won't use anything we haven't reported, so every other GL function can be passed through directly.

The rationale behind this is that limiting the size of the buffer used to report GL extensions is useless if the necessary extensions are beyond that limit; handing out a "faked" list would make sure the extensions are properly "discovered". I guess it could be done using a fake opengl32.dll... like those used by miniGL drivers (so no need for Voodoo card profiles? In any case, if you're using a miniGL driver you're using the real thing, so your configuration is already limited).

*As the OpenGL implementation can change between driver versions, the card profiles should be generated with the same drivers used at launch, or a version from 3-6 months later.
** I wonder if card manufacturers "deprecate" older GL extensions (i.e. a modern card without support for 8-bit textures). If there are extensions missing on our newer card, that "limiter" would not be useful (some applications would refuse to run because a necessary extension is missing) and we would need a GL library that could emulate the missing extensions.

Would it be feasible?

I have traveled across the universe and through the years to find Her.
Sometimes going all the way is just a start...

I'm selling some stuff!

Reply 1 of 8, by BEEN_Nath_58

Rank l33t

DxWnd has the following features

For the other functions, you may ask the developer; faking the OpenGL version may be possible with the Custom OpenGL library option. If I had this game, I would test it myself, considering we have the same GPU and the same generation of CPU (R5 1600).

previously known as Discrete_BOB_058

Reply 2 of 8, by Zup

Rank Oldbie

It seems that what it does is similar to nvidia profile inspector. It does "cut" the length of the OpenGL extensions list, but there is no guarantee that the needed extensions will come first (and be found). The good thing is that it seems it will work with ANY video card (while nvidia profile inspector suggests it will only work on nvidia cards); on the other hand, it does not say where the limit is (nvidia profile inspector has different presets for some games).

Good utility, thanks.

I have traveled across the universe and through the years to find Her.
Sometimes going all the way is just a start...

I'm selling some stuff!

Reply 3 of 8, by BEEN_Nath_58

Rank l33t

The application has a log feature; I believe it can log the extensions that the game requires or that DxWnd loaded. It's not always the case that a wrong extension is loaded, but you can always look at the log to see what's wrong.

previously known as Discrete_BOB_058

Reply 5 of 8, by Azarien

Rank Oldbie
Zup wrote on 2021-11-10, 10:02:

** I wonder if cards manufacturers "deprecate" older GL extensions (i.e.: a modern card without support for 8 bit textures). If there are missing extensions on our newer card, that "limiter" would not be useful (some applications would refuse to run because they have a necessary extension missing)

Core GL functionality should work everywhere, but:
- some drivers are buggy or interpret the standard in an unusual way, and
- some old, obscure extensions may not be present in newer implementations.

Reply 6 of 8, by nl.68

Rank Newbie

Hi, after hours of searching and testing, I finally got X-Plane 7 working with an old (~2010) DirectX 11 AMD card.
I couldn't have done it without your posts!

Big thanks to BEEN_Nath_58 for mentioning DxWnd and to Zup for the original Nvidia solution.
Using DxWnd with x-plane.exe, I activated "OpenGL extensions trim" in the OpenGL tab; this should be the equivalent of the option Zup was searching for. Then, after setting the video size (I used 1024x768), the game launches and runs fine. You should enable the same checkbox for all the other executables to make them work too.

Tested on the FX Interactive disc version without any updates or no-CD patch. It doesn't require any DLL replacement.

(Quick question for those who know: how would one diagnose this kind of issue, e.g. figuring out that the problem wasn't missing DLLs but rather too many OpenGL extensions being exposed by the GPU? Is there a tool (like ProcMon, DxWnd logs, or something else) that can help trace which OpenGL features are actually being used or blocked? I'm curious about how one would approach fixing this kind of thing, not just applying a workaround.)

Thanks again to everyone who shared ideas.
✈️

Here's my ASCII plane, for fun:

   |*
+--<D+
   |*

Reply 7 of 8, by leileilol

Rank l33t++

For a quick and dirty way, Mesa can limit extensions by year via an environment variable, e.g. set MESA_EXTENSION_MAX_YEAR=2001. It would be better if there were an OpenGL shim that would just mess with the extension list and pass everything else through to GL.

long live PCem
FUCK "AI"

Reply 8 of 8, by yliopp Larvanto

Rank Newbie
nl.68 wrote:

(Quick question for those who know: how would one diagnose this kind of issue e.g., figuring out that the problem wasn’t missing DLLs, but rather too many OpenGL extensions being exposed by the GPU? Is there a tool (like ProcMon, DxWnd logs, or something else) that can help trace which OpenGL features are actually being used or blocked? I’m curious about how one would approach fixing this kind of thing, not just applying a workaround.)

If it is an old game, you can expect OpenGL-related issues with buffer overflows when enumerating the available GL extensions. Usually the game asks for the list of GL extensions and, when it gets back a huge string, writes past the space allocated for the return buffer, which usually leads to a crash. That is, unless the game code has extra logic to deal with such unexpected situations.

You can override and manage what the application sees also with GLIntercept:
https://github.com/dtrebilco/glintercept

Personally, I have only needed it for Quake 2, where I built a custom "fix" using only these files:
GLExtOverride.dll
OpenGL32.dll
gliConfig.ini

(Taken from the ..._ManualInstall.zip package)

With "gliConfig.ini" edited to contain only the following:

PluginData
{
  BaseDir = ".\";

  Plugins
  {
    ExtensionOverride = ("GLExtOverride.dll")
    {
      //VendorString = "Custom vendor string";
      //RendererString = "Custom renderer string";
      //VersionString = "1.1.0 - Custom version string";
      //ShaderVersionString = "1.0.0 - Custom shader version string";

      ExtensionsString = (GL_EXT_LIE);
    }
  }
}

This is of course a very simplistic "fix" that basically lies that no extensions are available except a fake "GL_EXT_LIE", but one could presumably instead list all the extensions that are needed there (delimited by a comma and a space), in addition to messing with the version strings by removing the "//" comment markers.
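For instance, exposing just two extensions (a hypothetical list, not taken from any particular game) would make that line look like:

```
ExtensionsString = (GL_ARB_multitexture, GL_EXT_texture_env_add);
```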

The full list of available values to change is in the example config.ini:
https://raw.githubusercontent.com/dtrebilco/g … ride/config.ini

It seems to be a fairly complete toolset for analyzing and logging OpenGL applications, so I would imagine that, with some learning, you can do lots of things with it, like analyzing why an OpenGL game fails to run and what extensions it might be using.