boxpressed wrote: One unrelated question. With Unreal Gold, the game alternates between the original Unreal and Return to Na Pali every time I start it. I'd like it to always run the original Unreal if possible, unless I manually start Na Pali. Anyone have a workaround?
In case you (or anyone else) are still wondering about this, I found a workaround recently. Apparently the decision to run "Unreal" or "Na Pali" is held in the Unreal.ini file. I wasn't able to find the actual setting, but if you make the file Read-Only it will stay on "Unreal". Probably not a good idea for real play, but for benchmarking it seems to be a good solution. A side bonus is that you don't have to worry about accidental changes to the settings that would affect the test results.
To be totally verbose, what I did was the following (there's a command-prompt sketch of the whole procedure after the steps):
Copy the Unreal.ini file to something like "1024x768x16.ini". Make sure all the settings are the way you want them, so that the .ini is perfectly configured and never needs to change. Duplicate that file for each screen mode you want to test.
Make a shortcut for each configuration. In the shortcut's Target line, add the "-INI=1024x768x16.ini" command line parameter to specify the .ini file the game will use.
Run that shortcut until "Na Pali" comes up. Exit.
Make the .ini file Read-Only. The next time you run the game with that .ini, it will be back on the standard Unreal demo, and since the file is read-only, it will stay that way.
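For anyone who wants it at a glance, here's a rough command-prompt sketch of the whole procedure. The install path is just an example; adjust it to wherever your Unreal System folder actually lives.

    cd C:\Games\UnrealGold\System
    copy Unreal.ini 1024x768x16.ini
    rem edit 1024x768x16.ini to taste, point a shortcut at
    rem "Unreal.exe -INI=1024x768x16.ini", then run it once
    rem until "Na Pali" comes up and exit
    attrib +r 1024x768x16.ini
    rem to change settings later, clear the flag first:
    rem attrib -r 1024x768x16.ini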
===============
[edit]
I wasn't aware of this UseVsync setting, but I see now that my .ini files already have a line "UseVsync=False" in the D3D section.
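For reference, the relevant bit of my .ini looks roughly like this (I'm going from the D3D section of my GoG install, and I've left out all the other lines):

    [D3DDrv.D3DRenderDevice]
    UseVsync=False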
I'm running the GoG version of Unreal Gold. The system is a K6-3+ at 450MHz with a GeForce2 MX, currently on nVidia driver 5.32.
I've been using Coolbits to disable Vsync, and I've been seeing tearing during the "Unreal" demo. When I was testing some later drivers, I used RivaTuner instead and saw the same tearing.
I'm experimenting with this now. I tried changing one of my .ini files to UseVsync=True, but it seems to have made no difference, either visually or in average framerate. After 11 laps each at "True" and "False", my averages were virtually the same (they differed by 0.02%). So I guess disabling Vsync via CoolBits has already been working, at least for me.
I will say, though, that the inconsistency of Unreal is driving me a bit nuts. When I did the test I just described, my averages were about 0.5fps faster than the last time I tested this exact same configuration. That has happened a few times now: a small but definite shift in framerate (maintained over any number of laps) between separate runs of the game.
Meanwhile, Unreal Tournament has just matched its previous results again.
Unreal is the game I actually want to play at some point, so it seems more relevant to me, but Unreal Tournament has been more consistent to test. Also, with each driver I've tested, the performance of Unreal and UT has tracked together anyway, so I may just stop testing Unreal. It's redundant and moody.