VOGONS


First post, by boxpressed

Not sure if this is common knowledge or not, but I finally figured out how to crack the 100 FPS barrier in Unreal Gold.

I always suspected that Vsync was enabled in the game, even though I was using Powerstrip to disable it for D3D with my Geforce cards. Even very powerful CPU/video card combos topped out at about 85 FPS at 1024x768.

I noticed that both Unreal Tournament and Deus Ex have a line in their respective .ini files (under the D3D section) that reads UseVSync=False.

For both of these games, this line needs to be set to UseVsync=True if you want to DISABLE Vsync. Backward.

But unreal.ini (at least as installed by Unreal Gold) has no such line in the D3D section.

So I added it.
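
For anyone who wants to replicate this, the block I added looks roughly like the following. The section name is the one the Direct3D device uses in UT's ini, and I'm assuming Unreal Gold's matches, so double-check against your own Unreal.ini:

; Direct3D section of Unreal.ini (section name taken from UT;
; assuming Unreal Gold uses the same one)
[D3DDrv.D3DRenderDevice]
UseVsync=True

Remember, True here is what DISABLES vsync.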

I'm using an Athlon 2400+ with a Quadro DCC (basically a GF 3). 1024x768.

Without UseVsync=True: 71 FPS
With UseVsync=True: 128 FPS

The difference isn't such an issue with this particular CPU/video card combo (71 FPS is plenty fast), but it might make a difference for other systems.

I'm still wondering if I can trust these numbers because they almost double the framerate.

[attachment: P1110678.jpg]

Reply 1 of 19, by Davros

You're trying to get over 100 FPS on a screen with a refresh rate of 60 Hz. Won't that lead to tearing?

Guardian of the Sacred Five Terabytes of Gaming Goodness

Reply 2 of 19, by boxpressed

I'm not sure -- the intro demo looked just fine. Wouldn't the tearing show up there too if it were happening in the game?

Reply 3 of 19, by boxpressed

I was curious to see how this "hack" (LOL) scaled with a less powerful CPU/video card combo, so I tested with a K6-3+ 550 / GF2 MX. Again, testing at 1024x768 (16 bpp).

Without UseVsync=True: 33.2 FPS
With UseVsync=True: 35.3 FPS

I ran the tests twice. Not a huge difference, but a gain of about 6-7%.

Reply 4 of 19, by ZellSF

Davros wrote:

You're trying to get over 100 FPS on a screen with a refresh rate of 60 Hz. Won't that lead to tearing?

Disabling vsync will lead to tearing regardless of whether his framerate exceeds the monitor's refresh rate.

He's definitely getting tearing, but if he's like 99% of (lucky) people then he doesn't notice it.

Reply 5 of 19, by boxpressed

Since no test is complete without an example from a BX board, I tested with a SE440BX-2 / P3-1000 / GF2 MX200 combo on the 30.82 drivers.

Without UseVsync=True: 35.8 FPS
With UseVsync=True: 50.2 FPS

A 40% bump. This is the kind of combo where disabling Vsync may actually make a difference in real-world play.

Reply 6 of 19, by Davros

ZellSF wrote:

Disabling vsync will lead to tearing regardless of whether his framerate exceeds the monitor's refresh rate.

Yes, but enabling vsync should prevent tearing.

Guardian of the Sacred Five Terabytes of Gaming Goodness

Reply 7 of 19, by leileilol

With Unreal it's ass-backwards: setting UseVsync=True is what produces tearing, which is what this whole thread is about.

long live PCem

Reply 8 of 19, by PhilsComputerLab

Very interesting.

Is it possible that this doesn't apply to the Glide renderer? In all my benchmarks with Voodoo cards, I never ran into this 100 FPS limit:

[benchmark chart: 1783121_orig.png]
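
For reference, the renderer is selected in Unreal.ini, so Glide runs never touch the D3D code path at all. From memory (double-check the key name in your own file), the relevant lines look like this:

; Renderer selection in Unreal.ini (key name from memory)
[Engine.Engine]
GameRenderDevice=GlideDrv.GlideRenderDevice
; for Direct3D it reads: GameRenderDevice=D3DDrv.D3DRenderDevice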

Reply 9 of 19, by boxpressed

Yes, there's nothing significant about 100 FPS; I could have easily said 85 FPS because that was my personal ceiling with D3D.

I think that this applies only to D3D rendering. I could get only 51 FPS with an Athlon 2400+ and a GF 6800GT at 1600x1200, so I knew that something was off.

Reply 10 of 19, by PhilsComputerLab

Good to know! I admit I never tried the other renderers, so I never noticed.

Reply 11 of 19, by boxpressed

One more update.

I think that the Nvidia cards may be the only ones that have problems disabling Vsync in D3D. I just tried the same test with a Radeon 9800 Pro (Catalyst 3.4), and I got the same result whether UseVsync was set to "True" or "False."

I seem to remember Nvidia drivers of this era being kind of flaky in this regard.

One unrelated question: with Unreal Gold, the game alternates between the original Unreal and Return to Na Pali every time I start it. I'd like it to always run the original Unreal unless I manually start Na Pali. Anyone have a workaround?

Reply 12 of 19, by idspispopd

Didn't Unreal have some kind of physics bug with high frame rates?

Reply 13 of 19, by PhilsComputerLab

Nvidia cards, at least on the early drivers, need another tool, or CoolBits, to control V-sync.
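
For the record, CoolBits itself is just a registry value. The tweak circulated in period guides looked like the sketch below; the exact DWORD needed varied by driver version, so treat it as an assumption rather than gospel:

Windows Registry Editor Version 5.00

; CoolBits unlock for the old NVIDIA control panel; value 3 is the
; one most period guides quote, but it varied by driver release
[HKEY_LOCAL_MACHINE\SOFTWARE\NVIDIA Corporation\Global\NVTweak]
"CoolBits"=dword:00000003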

For benchmarking, I had issues with Unreal Gold on certain cards, so I recommend going with UT instead. It seems to have a newer engine that is more compatible with D3D.

Reply 14 of 19, by boxpressed

philscomputerlab wrote:

Nvidia cards, at least on the early drivers, need another tool, or CoolBits, to control V-sync.

For benchmarking, I had issues with Unreal Gold on certain cards, so I recommend going with UT instead. It seems to have a newer engine that is more compatible with D3D.

What kind of issues did you have? I'd read that Unreal Gold's engine is closer to UT's than to stock Unreal patched to 2.26. I have no idea whether that's true, but I wonder whether the "UseVsync" variable would work on an older version of Unreal, say 2.24 or before.

Reply 15 of 19, by PhilsComputerLab

I got severe stuttering/lagging on certain Nvidia cards. That could be a driver issue, but UT was running fine; it was only Unreal Gold that had this problem.

Reply 16 of 19, by boxpressed

I'd be curious if the UseVsync=True line fixed the stuttering/lagging with your Nvidia cards. Just pulled 105 FPS at 10x7 on a Duron 1600 / Quadro DCC combo, and the game looked smooth to my eyes.

Reply 17 of 19, by PhilsComputerLab

boxpressed wrote:

I'd be curious if the UseVsync=True line fixed the stuttering/lagging with your Nvidia cards. Just pulled 105 FPS at 10x7 on a Duron 1600 / Quadro DCC combo, and the game looked smooth to my eyes.

It wasn't on all NV cards. Might also have been a driver issue. In the end I just moved on and removed Unreal from the benchmarks I ran on that Pentium 4 machine.

Reply 18 of 19, by shamino

boxpressed wrote:

One unrelated question: with Unreal Gold, the game alternates between the original Unreal and Return to Na Pali every time I start it. I'd like it to always run the original Unreal unless I manually start Na Pali. Anyone have a workaround?

In case you (or anyone) is still wondering about this, I found a workaround recently. Apparently the decision to run "Unreal" or "Na Pali" is held in the Unreal.ini file. I wasn't able to find where that setting is, but if you make the file read-only, it will stay on "Unreal". Probably not a good idea for real play, but for benchmarking it seems to be a good solution. A side bonus is that you don't have to worry about accidental changes to the settings that would affect the test results.

To be totally verbose, what I did was the following:
1. Copy Unreal.ini to something like "1024x768x16.ini". Make sure all the settings are the way you want them, so that .ini is perfectly configured and doesn't need to change. Duplicate that file for each screen mode you want to test.
2. Make a shortcut for each configuration. In the shortcut, use the "-INI=1024x768x16.ini" command-line parameter to specify the .ini file it will use.
3. Run that shortcut until "Na Pali" comes up, then exit.
4. Make the .ini file read-only. Next time you run the game with that .ini, it will be back to the standard Unreal demo, and since the file is read-only, this will persist.
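
If you want to script it, a rough batch sketch of steps 1 and 4 (the file names are just my examples; run it from the Unreal System folder) would be:

rem Duplicate the known-good config under a descriptive name
copy Unreal.ini 1024x768x16.ini
rem ...edit the copy, run the game once so it lands on Na Pali,
rem exit, then lock the file so the intro stays on "Unreal":
attrib +r 1024x768x16.ini
rem The shortcut target then looks like:
rem Unreal.exe -INI=1024x768x16.ini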

===============

[edit]
I wasn't aware of this UseVsync setting, but I see now that my .ini files already have a line "UseVsync=False" in the D3D section.
I'm running the GOG version of Unreal Gold. The system is a K6-3+ 450 MHz with a GeForce2 MX, currently on nVidia driver 5.32.
I've been using CoolBits to disable Vsync, and I've been seeing tearing during the "Unreal" demo. When I was testing some later drivers, I used RivaTuner and saw the same tearing.

I'm experimenting with this now. I tried changing one of my .ini files to UseVsync=True, but it seems to have made no difference, either visually or in average framerate. After 11 laps each at "True" and "False", my averages were virtually the same (they differed by 0.02%). So I guess disabling Vsync via CoolBits has already been working, at least for me.

I will say, though, that the inconsistency of Unreal is driving me a bit nuts. When I did the test I just described, my averages were about 0.5 FPS faster than the last time I tested this exact same configuration. That has happened a few times now: a small but definite change in the framerate (maintained over any number of laps) between separate runs of the game.
Meanwhile, Unreal Tournament has just matched its previous results again.

Unreal is the game I actually want to play at some point, so it seems more relevant to me, but Unreal Tournament has been more consistent to test. Also, with each driver I've tested, the performance of Unreal and UT has tracked together anyway, so I may just stop testing Unreal. It's redundant and moody.

Reply 19 of 19, by DracoNihil

I hate to be "that guy", but don't run Unreal at over 100 FPS; hell, don't go over 85 FPS.

If you can, try to stay within 60 FPS at all times.

A lot of core game logic is stupidly tied to the graphics redraw rate rather than to a properly timed main loop segregated from the graphics system. If you go beyond 120 FPS, the game will start to speed up and slow down randomly.
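
As a toy illustration only (this is not Unreal's actual code, and the 5 ms floor is an invented stand-in for timer granularity), this is the kind of thing that goes wrong when movement is scaled by a per-frame delta that loses accuracy at high FPS:

#include <stdio.h>

/* Toy model: per-frame movement scaled by a measured delta time.
   Once frame times drop below the timer's resolution, the clamped
   delta overstates elapsed time and the simulation runs fast. */
int main(void) {
    const double speed = 100.0;                 /* units per second */
    const double fps_values[] = {60.0, 120.0, 240.0};
    for (int i = 0; i < 3; i++) {
        double dt = 1.0 / fps_values[i];        /* measured frame time */
        if (dt < 0.005)                         /* hypothetical 5 ms floor */
            dt = 0.005;
        double per_second = speed * dt * fps_values[i];
        printf("%3.0f FPS -> %.0f units moved per real second\n",
               fps_values[i], per_second);
    }
    return 0;
}

At 60 and 120 FPS it prints 100 units; at 240 FPS the clamp kicks in and it prints 120, i.e. the game runs 20% fast.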

“I am the dragon without a name…”
― Κυνικός Δράκων