VOGONS

First post, by thecrankyhermit

This isn't that old a game, but it's old enough to be giving me grief.

I have two computers, both dual core, both running Windows 7 Professional, 64-bit. The game works fine on PC-A, but crashes on PC-B.

PC-A:
Core 2 Duo 2.53 GHz
2GB RAM
Geforce 9600 GT

PC-B:
Core i3-4130 3.4 GHz
8GB RAM
Geforce 750 Ti

This is the original retail release of the game, not the remake bundled with Assault on Dark Athena. The demo also crashes on PC-B.

When I run "Riddick.exe," it goes into fullscreen mode, and very briefly shows a screen with some standard legal disclosures on it. Then it crashes to a Yes/No dialog:
"Game has crashed. Do you want to create a crashlog?"

If I say Yes, it generates a dump in subdirectory "System\Win64_AMD64." The contents are:

Unhandled exception

Exception type: Unknown

Exception address: 0x00000000fd88940d (C:\Windows\system32\KERNELBASE.dll!RaiseException)

StackTrace:
0x00000000fd88940d C:\Windows\system32\KERNELBASE.dll!RaiseException
(null):0
StackFrame: 0x000000000011d560

...followed by about 200 KB of meaningless hex dump.

If I run SbzEngine.exe in that folder, the exact same thing happens.

The "System" subdirectory has a three other subdirectories in it. Each one has its own version of SbzEngine.exe. I've tried them all, and they crash too. The only difference is the error message for these says:
"Microsoft Visual C++ Runtime Library"
"Runtime Error!"
"Program: <dir>\Sbzengine.exe"
"abormal program termination"
..and there is no option to generate a crashlog.

The game works fine on PC-A, and it is using the Win64_AMD64 version of sbzengine.exe.

I have tried the following, on all executables:
Run as admin
Several compatibility modes
Forcing single-core affinity via the command line (example below)
Rebooting with minimal startup items and services
Scouring Google for anything relevant
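
For reference, this is the sort of command I mean; the install path here is just a placeholder, and the /affinity mask is hexadecimal, so 1 pins the process to core 0:

start "" /affinity 1 "C:\Games\Riddick\Riddick.exe"

(start /affinity is built into cmd.exe on Vista and later; the empty "" is the required window-title argument.)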

The demo also works fine on my laptop, which also runs Windows 7 Professional x64, and has:
Core i5-2430M 2.4 GHz
8GB RAM
Intel HD Graphics

Reply 1 of 31, by obobskivich

What drivers do you have installed for the 9600 vs the GTX 750? Wouldn't be surprised if that's the culprit - nVidia is kind of notorious for breaking backwards compatibility with driver updates.

EDIT: A bit of directed searching and it appears the issue may be related to newer nV drivers not properly identifying older OpenGL support; apparently there's a patch for the game that addresses this. Give this a look: https://forums.geforce.com/default/topic/4013 … r-bay-help-33-/

Reply 3 of 31, by thecrankyhermit

obobskivich wrote:

What drivers do you have installed for the 9600 vs the GTX 750?

The same. I use GeForce Experience on both machines and always install the latest stable driver releases when available; currently that's 347.88.

The patch unfortunately made no difference. It still crashes in the same place, the same way, on all executables. In any event, I don't think that's the issue - I'm not getting any errors about OpenGL 1.3 not being supported.

Davros wrote:

try the different exe's in riddick/system
I use Win32_x86_SSE2

I tried all of them. They all crash in the same place; the only difference is that the 32-bit exes don't leave a crashlog.

Reply 7 of 31, by SpeedySPCFan

This topic is sort of old but still on the first page. I found a fix for this on my Nvidia machine.

1. Patch the game to 1.1
2. Download Nvidia Inspector if you don't already have it (you should, it's really handy)
3. Open it and click the little button next to "Driver Version"
4. You should see "_GLOBAL_DRIVER_PROFILE (Base Profile)" on the top. Go ahead and type "The Chronicles" in place of that and it'll suggest some things for you.
5. Click on "The Chronicles of Riddick"
6. Scroll down to "Common" and click the text next to "Extension Limit"
7. From here, select "0x000011A8 (Medal of Honor: Allied Assault)"
8. Press "Apply Changes"
9. Set every executable in the folder where you installed The Chronicles of Riddick to Windows XP SP3 compatibility mode and make sure "Disable Visual Themes" is checked. EVERY executable must have this setting (a scripted way to do this is sketched below).
10. You're done! Make sure you're running Riddick as Admin, or it'll freeze on start.

You already did this I believe, but you may also need to set the game to run on only one core.
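
If clicking through Properties > Compatibility for every exe is tedious, those checkboxes are just registry values, so a batch file can do step 9 in one go. A rough sketch, assuming the game is installed in C:\Games\Riddick (adjust the path to yours):

@echo off
rem Apply "Windows XP SP3" compatibility mode plus "Disable visual themes"
rem to every .exe under the install folder. These flags live under HKCU,
rem so no admin rights are needed to set them.
for /r "C:\Games\Riddick" %%F in (*.exe) do (
    reg add "HKCU\Software\Microsoft\Windows NT\CurrentVersion\AppCompatFlags\Layers" /v "%%F" /t REG_SZ /d "WINXPSP3 DISABLETHEMES" /f
)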

Reply 8 of 31, by thecrankyhermit

Big, big thank you to SpeedySPCFan! Forcing the extension limit through Nvidia Inspector made the game not crash any more. It fixed the demo too.

For the benefit of anyone Googling this problem and finding their way here, this was the only thing necessary to make the game work. Well, I patched it to 1.1 too, but I always patch to the latest version anyway. I did not need to use unofficial patches, or compatibility mode, or Run As Admin, or single core affinity, and the game automatically selects the 64-bit mode.

I actually had to do this many, many years ago for SiN and Anachronox. Those games used to crash on my Geforce 9600 GT with the extension limit off - presumably because, like a lot of old games, they copy the driver's ever-growing GL_EXTENSIONS string into a fixed-size buffer, which is exactly what the extension limit works around. Nowadays they work fine on current Nvidia drivers, so I assumed Nvidia had fixed this. It never occurred to me that a game so much more advanced than the Quake II engine might have the same problem.

One flaw I am seeing is that the shader effects setting seems to be limited to the second-highest setting. I may play around with the extension limit setting to see if I can find one that enables the highest.

FYI, the version on GOG is not the same game. Assault on Dark Athena does include an "Escape from Butcher Bay" campaign, but it's kind of tacked on, unpolished, and incomplete. The retail release on PC is the definitive version, as far as I'm concerned, and I think it's too bad GOG doesn't offer it.

Reply 9 of 31, by thecrankyhermit

Played around with the extension limit.

On any of these:
Off
On
0x00000000 (UFO: Afterlight)
0x00000001 (X29)

...it just crashes.

On this:
0x000011A8 (Medal of Honor: Allied Assault)

...it works, but Shaders only go up to 2.0.

On these:
0x00001B58 (Hired Team, Quake3, Quake2, Dark Salvation, Heretic II, Quake, ...)
0x00004844 (Call of Duty: United Offensive, Return to Castle Wolfenstein, ...)

...it works; auto-detection selects Shaders 2.0, but 2.0++ can be manually selected.

Reply 10 of 31, by notindeed

I read on GOG that

the highest shader settings were tied to one generation, and only that generation, of nVidia cards

Is this actually true or just that they couldn't get it to work?

I assume they are talking about the Shader Model 2.0++ setting.
Does this really not work on ATI cards, even those that support SM3?

I've also heard that the game might tie this setting to Nvidia cards by vendor ID, even though it's a standard implementation?
Something like the required feature not being supported by ATI's shader models at the time. But considering I have a newer X1950 Pro, which is SM3, I'd hope it would support it. Or not?

Mind you, why would people say that only 6000-series Nvidia cards can run the SM 2.0++ setting? If it uses standard features then surely it should work on later cards too, or did they put some silly lock-out code in the game?

I mean, at first I thought this was like the Splinter Cell shadow buffer thing, but from the information I'm reading it doesn't seem like it, so I'm not sure I believe that it truly "only works on GeForce 6000".

The problem is I can't really find any proper information on it!

Does anyone know? 😀

Also, cranky hermit, could you go into more detail about what is worse with the remake version? Thanks 😀

Reply 11 of 31, by mirh

notindeed wrote:

I read on GOG that

the highest shader settings were tied to one generation, and only that generation, of nVidia cards

Wait wait wait.
Is this something comparable to what Splinter Cell Pandora Tomorrow also has?

Reply 12 of 31, by notindeed

I dunno - I assumed it would be, but I couldn't find anything online about it being shadow buffers. Since it talks about shader models, I'd assume it's something different, but then I'm no expert.

In the game it says it's for GeForce 6-series cards and that it enables soft shadows. Whether it adds anything else I don't know - the description is really short.

It won't allow me to select Shader Model 2.0++ mode for my X1950 Pro even though the card supports SM3.

Does anyone know if there is a way to force it to be allowed on ATI cards, with some ID-tweaking utility or something?
And does it work on GeForce cards later than the 6 series? Maybe, again, you have to force something?

I found a really old quote on some other forums that says:

The game doesn't support anti-aliasing but you can force it using RivaTuner (as the NVIDIA profiles don't work with it for some reason). If you have a GeForce 6x00/ATI Radeon X800 card the game defaults to using 2.0++ shaders (presumably 2.0b for the ATI cards and 3.0 for GF6 ones?) but it ran like crap on my system until I reduced it to 2.0 and now the game runs silky smooth at 60+fps at a resolution of 1280x1024 with everything maxed out and anisotropy on Full. I have Quincunx anti-aliasing forced via RivaTuner.

So unless that's wrong, SM 2.0++ did work on ATI too. Maybe there's a way to make the card pretend it only supports SM 2.0b so that the game will actually let you run that option.
Assuming that's the case, it's really annoying when they lock things out like this, as it breaks future-proofing.

Thanks

Reply 13 of 31, by notindeed

Okay, I just tried it on a laptop with a newer Nvidia GPU (a 540M, I think?).

If you change the extension limit to 0x00004844 it works and allows SM 2.0++ (as mentioned above).

All I can tell it does is soften the edges of the shadows ever so slightly - quite subtle. I'll post pictures if anyone wants.
I imagine it would work fine on any Nvidia card with the extension limit tweak.

In terms of ATI support, I tried modifying the game's settings .cfg to see if I could force it that way, but it drops itself back down to SM 2.0 anyway.
I'm not sure whether it's refusing due to an extension limit problem or a vendor ID (Nvidia dirty tricks) issue.

Either way, I can't get it to be allowed on ATI. I'm pretty sure it would work correctly if you could "get around the security checks" for enabling the option. I'm guessing it uses standard SM code; it's just that the ATI cards at release didn't support a high enough shader model.

So yer...
If anyone knows how to force it, or to trick it into reporting a lower extension limit or an Nvidia vendor ID on an ATI card, let me know!

Thanks

Reply 14 of 31, by silikone

There's a certain driver update that broke Riddick. It worked out of the box when I had a GeForce 8800, including the 2.0++ shader mode. It would be interesting to pinpoint the exact version that killed compatibility.
Here are some very fine screenshots.

http://i.imgur.com/9jM1EPn.jpg
http://i.imgur.com/quvyDLF.jpg
http://i.imgur.com/9PGZSoG.jpg

Reply 15 of 31, by notindeed

Just in case people want to see the difference between SM 2.0 and SM 2.0++, here is a comparison of the simple soft shadows that ++ adds (running on an Nvidia 540M).

2.0:
http://i.imgur.com/ZqqWt6y.jpg

2.0++:
http://i.imgur.com/WjmYA61.jpg

Bear in mind it only affects dynamic shadows - static shadow maps are always soft.

But yer, it would be nice if someone knew how to get this to work on ATI/AMD cards, or if there's some way to set an extension limit there or set the vendor ID for OpenGL. ATI Tray Tools only seems to set the vendor ID for DX9.

Reply 16 of 31, by silikone

notindeed wrote:

Bear in mind it only affects dynamic shadows - static shadow maps are always soft.

Does this game even have shadows that aren't dynamic? I'm sure every single light source is real-time. Unless you are talking about the fake shadow projections.

Reply 18 of 31, by mirh

silikone wrote:

There's a certain driver update that broke Riddick. It worked out of the box when I had a GeForce 8800, including the 2.0++ shader mode. It would be interesting to pinpoint the exact version that killed compatibility.

As I said, check the Pandora Tomorrow thread.
Even there, newer drivers show different behavior.

notindeed wrote:

But yer, it would be nice if someone knew how to get this to work on ATI/AMD cards, or if there's some way to set an extension limit there or set the vendor ID for OpenGL. ATI Tray Tools only seems to set the vendor ID for DX9.

3DAnalyzer perhaps?

Reply 19 of 31, by teleguy

notindeed wrote:

But yer, it would be nice if someone knew how to get this to work on ATI/AMD cards, or if there's some way to set an extension limit there or set the vendor ID for OpenGL. ATI Tray Tools only seems to set the vendor ID for DX9.

GLIntercept seems to be able to do that but I'm still trying to figure out how to use it.
https://github.com/dtrebilco/glintercept/releases
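
For reference, the relevant piece should be GLIntercept's extension override plugin: you drop its opengl32.dll plus a gliConfig.ini next to the game's exe, and the plugin changes what glGetString() reports to the game. A rough sketch of the config from memory - the plugin and key names may differ, so double-check against the sample gliConfig.ini that ships with it:

PluginData
{
  Plugins
  {
    ExtOverride = ("GLExtOverride/GLExtOverride.dll")
    {
      // Pretend to be an Nvidia card (hypothetical values, for illustration only)
      VendorString = "NVIDIA Corporation";
      RendererString = "GeForce 6800/AGP/SSE2";
    }
  }
}

Completely untested against Riddick, though.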

mirh wrote:

3DAnalyzer perhaps?

It appears to be DirectX only.