VOGONS

Reply 20 of 31, by teleguy

Rank: Member

GLIntercept works pretty well with GPUCapsViewer; however, when I try to use it with Riddick, the game shows an error message right at the start saying my GPU has insufficient capabilities.

Attachment: gpucaps.jpg (216.86 KiB)

There is another tool in development that sounds promising, but using its spoofing features is not exactly user friendly:
https://github.com/p3/regal#spoofing-opengl-v … tension-strings

Reply 21 of 31, by Gamecollector

Rank: Oldbie
notindeed wrote:
notindeed wrote:

how to get this to work on ATI/AMD cards or if there's some way to enable an extension limit there or set the vendor ID for OpenGL.

There is an atiogl.xml file.
Catalyst uses it if "Catalyst AI" is enabled.

Asus P4P800 SE/Pentium4 3.2E/2 Gb DDR400B,
Radeon HD3850 Agp (Sapphire), Catalyst 14.4 (XpProSp3).
Voodoo2 12 MB SLI, Win2k drivers 1.02.00 (XpProSp3).

Reply 22 of 31, by notindeed

Rank: Newbie

teleguy, I think you may need to raise it to OpenGL 2.1 instead of 2.0? Not sure about the GLSL version.

Does anyone have a 6000 series card to find out what versions these should be limited to?

As far as I can tell, that atiogl.xml file just seems to adjust extension limits and the like, but not vendor IDs?

I'm not sure which hex values do what. I tried copying the Quake 3 ones over to Riddick, but it doesn't help, so I think it's a vendor issue.

I'll look into GLIntercept - I've just been putting it off as extension editing seems a bit complicated.

Does anyone know how to find out which version of the engine will be running? I know it won't be the AMD64 one as I'm on XP, but I'm not sure out of x86, SSE or SSE2.
I've got an AMD 64 3200+, so I would imagine it would be running the SSE2 version, but you never know... especially after hearing about that dodgy Intel compiler issue. I'm guessing I should really check most programs for this?
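
For what it's worth, checking what the CPU itself reports is easy enough with a CPUID query - a minimal sketch below using the MSVC __cpuid intrinsic (which exe the game's launcher actually picks is of course a separate question):

/* Quick CPU feature check via CPUID (MSVC __cpuid intrinsic).
   SSE is reported in EDX bit 25 of leaf 1, SSE2 in EDX bit 26. */
#include <stdio.h>
#include <intrin.h>

int main(void)
{
    int regs[4];                              /* EAX, EBX, ECX, EDX */
    __cpuid(regs, 1);                         /* leaf 1: feature flags */
    printf("SSE:  %s\n", (regs[3] & (1 << 25)) ? "yes" : "no");
    printf("SSE2: %s\n", (regs[3] & (1 << 26)) ? "yes" : "no");
    return 0;
}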

Also, on the laptop with the 540M, I was trying to change the extension limit to a different setting to see what glview and glcapview picked up as different, but unfortunately it seems to have got into an error state where it will no longer save settings. I tried re-extracting it but that didn't help, so I'm guessing it's the joys of some error in the NVIDIA drivers. What is NVIDIA Inspector actually editing? Registry entries, or an XML file like that atiogl.xml but for NVIDIA, or what?

Reply 23 of 31, by teleguy

Rank: Member
notindeed wrote:
notindeed wrote:

teleguy, I think you may need to raise it to OpenGL 2.1 instead of 2.0? Not sure about the GLSL version.

I tried that and lots of other values, but I always got the same message - except when I set the OpenGL version to something like 1.2, in which case I got a different message saying at least 1.3 is needed. The problem is finding out what the game is actually looking for when it says I have insufficient capabilities. Is it the shader model, a missing extension, or some other value?
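
My guess is the version string only trips that first check - something roughly like the sketch below (pure speculation on my part, not taken from the game) - while the generic "insufficient capabilities" message comes from some separate test on extensions or shader support:

/* Rough guess at how a game might compare the reported OpenGL version:
   parse "major.minor" from the front of glGetString(GL_VERSION). */
#include <stdio.h>
#include <windows.h>
#include <GL/gl.h>

int gl_version_at_least(int want_major, int want_minor)
{
    int major = 0, minor = 0;
    const char *ver = (const char *)glGetString(GL_VERSION);   /* needs a current GL context */
    if (!ver || sscanf(ver, "%d.%d", &major, &minor) != 2)
        return 0;
    return (major > want_major) ||
           (major == want_major && minor >= want_minor);
}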

As far as I can tell, that atiogl.xml file just seems to adjust extension limits and the like, but not vendor IDs?

The thing is, this file has been missing from the driver for several months now, so I'm not sure it's even being used anymore.

Also, on the laptop with the 540M, I was trying to change the extension limit to a different setting to see what glview and glcapview picked up as different, but unfortunately it seems to have got into an error state where it will no longer save settings.

That happened to me a few times as well; I think a reboot fixes it.

Reply 24 of 31, by teleguy

Rank: Member

I found this thread:
http://www.nvnews.net/vbulletin/showthread.php?t=46564

Using the same GLIntercept version (0.41) and settings, I'm able to launch the game, but the 2.0++ shaders are still unavailable.

Reply 25 of 31, by notindeed

Rank: Newbie

Perhaps you also need to change the ShaderVersionString parameter? I'm not sure what version would be appropriate, though.

Is there any way to export the list of extensions to a text file?

I found a list of extensions for the 6800 GT if that helps:
http://www.geeks3d.com/20110407/test-r270-51- … till-supported/

Edit -
I just tried what you said on my X1950 Pro and it didn't help. I'm guessing we also need to remove some extensions. I need to find out how to do an export, though, so I can quickly work out which ones need removing.

It would be good to confirm appropriate values for the other parameters too.

Mind you, there could always be some other way the game checks whether you are using NVIDIA hardware 😒
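
Edit 2 -
For the export, something like this tiny program ought to do it (just a sketch, untested - it assumes freeglut and opengl32 are available to link against, e.g. gcc dumpext.c -lfreeglut -lopengl32). It writes the extension string one name per line so the lists diff cleanly in WinMerge:

/* Dump the GL extension string to extensions.txt, one extension per line. */
#include <stdio.h>
#include <GL/glut.h>

int main(int argc, char **argv)
{
    const char *ext;
    const char *p;
    FILE *f;

    glutInit(&argc, argv);
    glutCreateWindow("ext dump");          /* a context must be current before glGetString */

    ext = (const char *)glGetString(GL_EXTENSIONS);
    f = fopen("extensions.txt", "w");
    if (ext && f) {
        for (p = ext; *p; ++p)
            fputc(*p == ' ' ? '\n' : *p, f);   /* space-separated -> one per line */
    }
    if (f) fclose(f);
    return 0;
}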

Reply 26 of 31, by teleguy

Rank: Member

According to a post by a developer, GL_NV_copy_depth_to_color is essential for enabling soft shadows, and GL_NV_copy_depth_to_color in turn depends on GL_NV_packed_depth_stencil.
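
Presumably the game just does a substring search for names like those in the extension string, so spoofing the string should be enough to pass that kind of check, even though the driver can't actually provide the functionality behind it. Something like this, I'd guess (my assumption, not taken from the game's code):

/* Typical era-appropriate extension check: a plain substring search on
   glGetString(GL_EXTENSIONS). Crude, since it also matches name prefixes. */
#include <string.h>
#include <windows.h>
#include <GL/gl.h>

int has_extension(const char *name)
{
    const char *exts = (const char *)glGetString(GL_EXTENSIONS);   /* needs a current GL context */
    return exts && strstr(exts, name) != NULL;
}

/* e.g. soft shadows might be gated on:
   has_extension("GL_NV_copy_depth_to_color") && has_extension("GL_NV_packed_depth_stencil") */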

I found a program called OpenGL Extensions Viewer that has a large database of the extensions supported by different video cards.

Also, by pressing Ctrl+Alt+~ you can open the console, which seems to contain useful info (especially the last screenshot):

Attachments:
SbzEngine.exe_2015-07-09-23-49-44-623.jpg (563.4 KiB)
SbzEngine.exe_2015-07-09-23-50-15-554.jpg (574.28 KiB)
SbzEngine.exe_2015-07-09-23-50-42-107.jpg (581.88 KiB)
SbzEngine.exe_2015-07-09-23-51-14-593.jpg (564.21 KiB)

Reply 27 of 31, by silikone

Rank: Member

I love that even the console shows off their light projection and normal mapping technology. The developers are very talented.

Do not refrain from refusing to stop hindering yourself from the opposite of watching nothing other than that which is by no means porn.

Reply 28 of 31, by notindeed

Rank: Newbie

I tried adding those two extensions but it didn't do anything.

Then I did a WinMerge and regexp pass, comparing the extensions my card has against the 6800 list from the link above.

That made the graphics massively corrupted, which implies it was doing something, but it was so corrupted that I couldn't actually read the menus or the console to find out whether SM2.0++ was allowed.

It even went into this mode when I commented out the NVIDIA vendor string and the renderer and version strings.

So I would say it needs more extensions than my card supports.

I'm not sure exactly which extensions it needs, but it would seem to be more than just the two you mentioned, as it only went into that mode once I added all the extra ones from the 6800.

They seem to be NVIDIA-specific extensions, which is weird, as I thought SM2 and suchlike were "standards", but there seem to be loads of vendor-specific ATI and NVIDIA extensions in these lists (especially NVIDIA).
So maybe shader model levels are not standards at all?

Either way, I think you would need a wrapper to get it working, one that wraps the extensions needed. First you'd need to find out which extensions they are, and then reimplement them somehow, if there is documentation on what they are supposed to do. At least that's what I imagine.
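
Roughly, I imagine the string-spoofing half of such a wrapper would look something like the sketch below (just an illustration - the vendor/extension values are made up, a real proxy opengl32.dll would also have to forward every other export to the genuine driver DLL, and actually reimplementing the missing extensions is the genuinely hard part):

/* Sketch of a proxy opengl32.dll that lies about the vendor/extension strings.
   Only glGetString is shown; a real wrapper must forward every other export
   (usually via a .def file) to the real opengl32.dll. */
#include <windows.h>

typedef unsigned int  GLenum;
typedef unsigned char GLubyte;
#define GL_VENDOR     0x1F00
#define GL_EXTENSIONS 0x1F03

typedef const GLubyte* (WINAPI *PFNGLGETSTRING)(GLenum name);
static PFNGLGETSTRING real_glGetString;

static const char *fake_vendor = "NVIDIA Corporation";            /* made-up spoof values */
static const char *fake_exts   =
    "GL_NV_copy_depth_to_color GL_NV_packed_depth_stencil ...";

__declspec(dllexport) const GLubyte* WINAPI glGetString(GLenum name)
{
    if (!real_glGetString) {
        HMODULE gl = LoadLibraryA("C:\\Windows\\System32\\opengl32.dll");
        real_glGetString = (PFNGLGETSTRING)GetProcAddress(gl, "glGetString");
    }
    if (name == GL_VENDOR)     return (const GLubyte *)fake_vendor;
    if (name == GL_EXTENSIONS) return (const GLubyte *)fake_exts;  /* the game only sees the lie */
    return real_glGetString(name);
}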

If you want to see what happens with your more recent ATI card, here are all the extensions I added:

AddExtensions = (GL_ARB_ES2_compatibility,
GL_ARB_color_buffer_float,
GL_ARB_copy_buffer,
GL_ARB_depth_clamp,
GL_ARB_explicit_attrib_location,
GL_ARB_imaging,
GL_ARB_occlusion_query2,
GL_ARB_provoking_vertex,
GL_ARB_robustness,
GL_ARB_sampler_objects,
GL_ARB_separate_shader_objects,
GL_ARB_shading_language_include,
GL_ARB_texture_rg,
GL_ARB_texture_swizzle,
GL_ARB_timer_query,
GL_ARB_vertex_array_bgra,
GL_ATI_texture_mirror_once,
GL_EXT_Cg_shader,
GL_EXT_depth_bounds_test,
GL_EXT_direct_state_access,
GL_EXT_pixel_buffer_object,
GL_EXT_provoking_vertex,
GL_EXT_separate_shader_objects,
GL_EXT_stencil_two_side,
GL_EXT_texture_compression_dxt1,
GL_EXT_texture_format_BGRA8888,
GL_EXT_texture_lod,
GL_EXT_timer_query,
GL_EXT_vertex_array_bgra,
GL_IBM_rasterpos_clip,
GL_IBM_texture_mirrored_repeat,
GL_NVX_conditional_render,
GL_NV_alpha_test,
GL_NV_blend_minmax,
GL_NV_complex_primitives,
GL_NV_copy_depth_to_color,
GL_NV_depth_clamp,
GL_NV_fbo_color_attachments,
GL_NV_fence,
GL_NV_float_buffer,
GL_NV_fog_distance,
GL_NV_fragdepth,
GL_NV_fragment_program,
GL_NV_fragment_program2,
GL_NV_fragment_program_option,
GL_NV_framebuffer_multisample_coverage,
GL_NV_half_float,
GL_NV_light_max_exponent,
GL_NV_multisample_filter_hint,
GL_NV_occlusion_query,
GL_NV_packed_depth_stencil,
GL_NV_pixel_data_range,
GL_NV_point_sprite,
GL_NV_primitive_restart,
GL_NV_register_combiners,
GL_NV_register_combiners2,
GL_NV_texture_barrier,
GL_NV_texture_compression_vtc,
GL_NV_texture_env_combine4,
GL_NV_texture_expand_normal,
GL_NV_texture_lod_clamp,
GL_NV_texture_rectangle,
GL_NV_texture_shader,
GL_NV_texture_shader2,
GL_NV_texture_shader3,
GL_NV_vertex_array_range,
GL_NV_vertex_array_range2,
GL_NV_vertex_program,
GL_NV_vertex_program1_1,
GL_NV_vertex_program2,
GL_NV_vertex_program2_option,
GL_NV_vertex_program3,
GL_OES_depth24,
GL_OES_depth32,
GL_OES_depth_texture,
GL_OES_element_index_uint,
GL_OES_fbo_render_mipmap,
GL_OES_get_program_binary,
GL_OES_mapbuffer,
GL_OES_packed_depth_stencil,
GL_OES_rgb8_rgba8,
GL_OES_standard_derivatives,
GL_OES_texture_3D,
GL_OES_texture_float,
GL_OES_texture_float_linear,
GL_OES_texture_half_float,
GL_OES_texture_half_float_linear,
GL_OES_texture_npot,
GL_OES_vertex_array_object,
GL_OES_vertex_half_float,
GL_S3_s3tc,
GL_SGIX_depth_texture,
GL_SGIX_shadow,
GL_SUN_slice_accum,
WGL_ARB_create_context_profile,
WGL_ARB_create_context_robustness,
WGL_NV_float_buffer,
WGL_NV_render_depth_texture,
WGL_NV_render_texture_rectangle);

Let me know if you can read anything with that on your card haha.

But yeah, I'm not even sure whether that actually "tricks" the game into allowing SM2.0++, as it's so corrupted I can't tell - due to the NVIDIA-specific extensions I've in effect lied to the game about having.

Reply 29 of 31, by teleguy

Rank: Member

I tried both the full set of extensions you added and just the ones shown as missing in the console, but that also caused graphics corruption.

With GL_NV_fragment_program2 removed, the game loaded normally, but 2.0++ was still inaccessible.

Reply 31 of 31, by Davros

Rank: l33t

I just switched to an NVIDIA card and the game would crash, so I renamed the Win32_x86 exe file to mohaa.exe and now it works with shader model 2.0. I haven't tested the other exes yet, but I'm sure they will work.

Guardian of the Sacred Five Terabytes of Gaming Goodness