VOGONS

First post, by Zup

Rank: Oldbie

I've been trying to make X-Plane 7 run on my not-so-new main rig (Ryzen 3 with a GTX 1050 Ti). The solution was using NVIDIA Profile Inspector to limit GL extensions, but that approach is limited to people with NVIDIA cards. So I've been thinking about an OpenGL "limiter" that could "fake" an older video card for non-cooperative games.

Is there any program that can do the same? If not, here are some ideas about it (treat it like a wish list):

The limiter should fake the OpenGL version, the list of extensions and the memory reported to games. Every other function could be passed directly to the system OpenGL (we don't need to deactivate modern functions, only fake that they're not present). Instead of having a long list of extensions to deactivate, it could have some pre-defined profiles (like the cards listed in dgVoodoo 2). So if we choose an nVidia GeForce4 Ti 4800, it should report the same capabilities as that card (those lists may be copied from a real card*). After selecting a card, some checks should be done:
- The OpenGL version on the real card is equal to or higher than the faked card's version.
- The real card's memory is equal to or larger than the faked card's memory.
- Every GL extension on the faked card is available on our card**.

If everything is OK, we can safely report that our card is the faked one; as I said, it's safe to assume that the application won't use anything we haven't reported, so every other GL function can be passed through directly (see the sketch below).
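To make those checks concrete, here is a minimal sketch in C of how a limiter might validate a selected profile before enabling it. The CardProfile struct and every name in it are hypothetical, invented for illustration; no existing tool is being quoted.

#include <stdbool.h>
#include <string.h>

typedef struct {
    int gl_major, gl_minor;   /* faked OpenGL version            */
    size_t vram_mb;           /* faked video memory in megabytes */
    const char **extensions;  /* NULL-terminated extension list  */
} CardProfile;

/* GL extension strings are space-separated; look for a whole-word match
   so that e.g. "GL_ARB_shadow" does not match "GL_ARB_shadow_ambient". */
static bool has_extension(const char *real_exts, const char *name)
{
    size_t len = strlen(name);
    for (const char *p = real_exts; (p = strstr(p, name)) != NULL; p += len) {
        bool starts = (p == real_exts || p[-1] == ' ');
        bool ends   = (p[len] == ' ' || p[len] == '\0');
        if (starts && ends)
            return true;
    }
    return false;
}

/* True if the real card can safely impersonate the profile: version high
   enough, memory large enough, every faked extension actually present. */
static bool profile_is_safe(const CardProfile *fake,
                            int real_major, int real_minor,
                            size_t real_vram_mb, const char *real_exts)
{
    if (real_major < fake->gl_major ||
        (real_major == fake->gl_major && real_minor < fake->gl_minor))
        return false;
    if (real_vram_mb < fake->vram_mb)
        return false;
    for (const char **e = fake->extensions; *e; ++e)
        if (!has_extension(real_exts, *e))
            return false;
    return true;
}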

The rationale behind this is that limiting the size of the buffer used to report GL extensions is useless if the necessary extensions lie beyond that limit; providing a "faked" list would make sure that those extensions are properly "discovered". I guess it could be done using a fake opengl32.dll, like those used by miniGL drivers (so no need to get profiles for Voodoo cards? In any case, if you're using a miniGL driver you're using the real thing, so your configuration is already limited).
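If the fake opengl32.dll route were taken, the core of it might look something like this: a wrapper DLL that loads the real system library, forwards everything, and only lies in glGetString(). This is a rough sketch under stated assumptions, not a working wrapper; a real one would also have to forward every other opengl32 export (and hook wglGetProcAddress for extension entry points), and the FAKE_* strings are placeholders for values taken from a card profile.

#include <windows.h>
#include <GL/gl.h>

static const GLubyte FAKE_VERSION[]  = "1.4.0";
static const GLubyte FAKE_RENDERER[] = "GeForce4 Ti 4800/AGP/SSE2";
static const GLubyte FAKE_EXTS[]     =
    "GL_ARB_multitexture GL_EXT_texture_env_combine"; /* from the profile */

typedef const GLubyte * (WINAPI *PFNGLGETSTRING)(GLenum name);
static PFNGLGETSTRING real_glGetString;

static void load_real_gl(void)
{
    if (real_glGetString)
        return;
    /* Load the real DLL by full path so we don't recursively load this
       wrapper, which would sit next to the game as "opengl32.dll". */
    HMODULE gl = LoadLibraryA("C:\\Windows\\System32\\opengl32.dll");
    if (gl)
        real_glGetString = (PFNGLGETSTRING)GetProcAddress(gl, "glGetString");
}

__declspec(dllexport) const GLubyte * WINAPI glGetString(GLenum name)
{
    load_real_gl();
    switch (name) {
    case GL_VERSION:    return FAKE_VERSION;   /* faked version       */
    case GL_RENDERER:   return FAKE_RENDERER;  /* faked card name     */
    case GL_EXTENSIONS: return FAKE_EXTS;      /* faked, safe subset  */
    default:
        return real_glGetString ? real_glGetString(name) : NULL;
    }
}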

*As the OpenGL implementation can change between driver versions, the card profiles should be generated with the drivers that were current at the card's launch, or a version from 3-6 months later.
** I wonder if card manufacturers "deprecate" older GL extensions (e.g. a modern card without support for 8-bit textures). If extensions are missing on our newer card, that "limiter" would not be useful (some applications would refuse to run because a necessary extension is missing) and we would need a GL library that could emulate the missing extensions.

Would it be feasible?

I have traveled across the universe and through the years to find Her.
Sometimes going all the way is just a start...

I'm selling some stuff!

Reply 1 of 5, by BEEN_Nath_58

Rank: l33t

DxWnd has the following features (see the attachment):

For the other functions you may ask him (the DxWnd developer); faking the OpenGL version may be possible with the Custom OpenGL library option. If I had this game, I would test it out myself, considering we have the same GPU and the same generation of CPU (R5 1600).


previously known as Discrete_BOB_058

Reply 2 of 5, by Zup

Rank: Oldbie

It seems that what it does is similar to NVIDIA Profile Inspector. It does "cut" the length of the OpenGL extensions list, but there is no guarantee that the needed extensions will come first (and be found). The good thing is that it seems it will work with ANY video card (while NVIDIA Profile Inspector suggests it will only work on NVIDIA cards); on the other hand, it does not say where the limit is (NVIDIA Profile Inspector has different presets for some games).
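A toy example of why truncation is weaker than a curated list: if the extension the game needs does not happen to sit before the cutoff, it simply disappears. The strings and the 32-byte limit below are made up for illustration.

#include <stdio.h>
#include <string.h>

int main(void)
{
    const char real[] =
        "GL_ARB_shadow GL_ARB_multitexture GL_EXT_texture_env_combine";
    char truncated[32];

    /* Cap the string at a fixed length, like a buffer-size limiter does. */
    snprintf(truncated, sizeof truncated, "%s", real);

    printf("truncated list: \"%s\"\n", truncated);
    printf("GL_ARB_multitexture found: %s\n",
           strstr(truncated, "GL_ARB_multitexture") ? "yes" : "NO");
    return 0;
}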

Good utility, thanks.

I have traveled across the universe and through the years to find Her.
Sometimes going all the way is just a start...

I'm selling some stuff!

Reply 3 of 5, by BEEN_Nath_58

Rank: l33t

The application has a log feature; I believe it can log the extensions that the game requires or that DxWnd loaded. It's not always the case that a wrong extension is loaded, but you can always look at the log to see what's wrong.
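I don't know DxWnd's internals, but a generic way to capture that kind of log from a wrapper DLL is to record every extension entry point the game asks for through wglGetProcAddress. A sketch (the log file name and format here are invented, not DxWnd's):

#include <windows.h>
#include <stdio.h>

typedef PROC (WINAPI *PFNWGLGETPROCADDRESS)(LPCSTR name);
static PFNWGLGETPROCADDRESS real_wglGetProcAddress;

__declspec(dllexport) PROC WINAPI wglGetProcAddress(LPCSTR name)
{
    if (!real_wglGetProcAddress) {
        HMODULE gl = LoadLibraryA("C:\\Windows\\System32\\opengl32.dll");
        if (gl)
            real_wglGetProcAddress =
                (PFNWGLGETPROCADDRESS)GetProcAddress(gl, "wglGetProcAddress");
    }
    PROC p = real_wglGetProcAddress ? real_wglGetProcAddress(name) : NULL;

    FILE *log = fopen("gl_calls.log", "a");
    if (log) {
        /* NULL means the driver does not provide this entry point. */
        fprintf(log, "%s -> %s\n", name, p ? "found" : "MISSING");
        fclose(log);
    }
    return p;
}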

previously known as Discrete_BOB_058

Reply 5 of 5, by Azarien

Rank: Oldbie
Zup wrote on 2021-11-10, 10:02:

** I wonder if card manufacturers "deprecate" older GL extensions (e.g. a modern card without support for 8-bit textures). If extensions are missing on our newer card, that "limiter" would not be useful (some applications would refuse to run because a necessary extension is missing)

Core GL functionality should work everywhere, but
- some drivers are buggy or interpret the standard in unusual ways, and
- some old, obscure extensions may not be present in newer implementations.