Reply 60 of 92, by mihai
I remember disabling VBO when I originally played - it was known to cause issues even back then.
Joseph_Joestar wrote on 2024-04-06, 04:33:Quincunx AA looks too blurry to my eyes, and it tends to also affect the game's UI, which is a big turn off for me. Granted, I only tried it a few times on a GeForce 4 Ti 4200 card, and quickly gave up after seeing the initial results.
I really liked it with Voyager Elite Force at 1600x1200 or 1920x1440. I think I was using an FX 5900, but it can be enabled on much newer hardware with NV Inspector. It is awful at the lower resolutions it was originally marketed for on cards like the GeForce 3.
ATI added something similar with the tent filters on the HD cards. And NV now has that DSR smoothness filter option too.
It certainly is personal preference though yeah.
On the compatibility issues in general, KOTOR was mainly an Xbox game and Bioware was Nvidia-centric with their older games too. It was built for that GeForce 3/4. So it's a bit like Splinter Cell in that way. If you play it on a GeForce 3 - 7 it should run without any compatibility problems but I don't remember what it's like on newer hardware. I have a feeling it would work ok up to at least GTX 5xx cards though.
ATI, on the other hand, was very problematic. ATI's OpenGL support sucked and had regressions with every other driver release, so that was a rough ride. Catalyst 4.2 worked pretty well and so did 7.11. But I think the desert heat blur effect on Dantooine could cause crashes.
As for the multi-core aspect, using imagecfg to set single-core affinity started with games like this back in the WinXP and Athlon 64 X2 days. Hyper-Threading was brand new in 2003, so it probably wasn't thoroughly tested by the developers, and I would set core affinity there too. On a Hyper-Threaded single-core chip, you probably want to keep games on one logical core anyway.
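To illustrate the affinity idea (my own hypothetical sketch, not from the original post): the affinity value that tools like imagecfg or Task Manager work with is just a bitfield, where bit N allows the process to run on logical core N.

```python
# Sketch of how a process affinity mask is built: bit N set means
# logical core N is allowed. Pinning a game to one logical core means
# a mask with exactly one bit set.

def affinity_mask(cores):
    """Return the bitmask selecting the given logical core indices."""
    mask = 0
    for core in cores:
        mask |= 1 << core
    return mask

# Pin to logical core 0 only (roughly what imagecfg -a 0x1 game.exe does):
print(hex(affinity_mask([0])))     # 0x1
# Allow both logical siblings of the first Hyper-Threaded physical core:
print(hex(affinity_mask([0, 1])))  # 0x3
```

On Windows this mask would be passed to something like SetProcessAffinityMask; the snippet only shows how the bits line up with logical cores.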
After watching a Digital Foundry video on SGSSAA and seeing it mentioned favorably on this forum, I decided to experiment with it a bit. So far, I really like the results. Also, I think being able to use this further justifies having an overpowered WinXP rig. For example, here's a screenshot from Gothic 2 (stock game, no mods) running at 1600x1200 with 8xSGSSAA on this system:
And the relevant settings from the Nvidia Profile Inspector:
Besides cleaning up the jagged edges, SGSSAA also gets rid of the shimmering that occurs when you're moving toward objects with lots of fine details, such as tree leaves or tall grass. It can be quite transformative in some older games. From what I can tell, this is one of the best forms of anti-aliasing, though possibly not for every single use case, as it can be pretty demanding. Anyway, just figured I'd share my thoughts on this.
Joseph_Joestar wrote on 2024-02-21, 14:21:Hooked up my wired Xbox360 controller to this rig.
After using the Xbox 360 controller on this system all this time, I have to say it's a really awesome and well designed piece of hardware. And before someone labels me as an Xbox fanboy, I was exclusively a PlayStation guy until a year or so ago, when I bought a second hand Xbox 360. That console came with two controllers: a wireless and a wired one.
The wired controller is what I use with this rig, though there are official adapters to use the wireless one too, I just don't own one of those at the moment. I tried a bunch of games made between 2007-2016 which have built-in controller support. And guess what, all of them automatically detected the Xbox 360 controller and set it up without any issues. The button prompts are correctly displayed and rumble functions properly. No problems whatsoever, it just works.
Of course, I use the wireless controller with my Xbox 360 console, and I love the fact that it runs on two AA batteries. Why? Because you can get rechargeable batteries for a reasonable price nowadays, and they will last you a long time. This is in stark contrast to my PlayStation 3 and 4 controllers, which use proprietary batteries. When those lose charge, you have to buy a (proprietary) replacement. And who knows for how long those will be made. Honestly, kudos to the Xbox 360 hardware designers for thinking about this stuff way back then.
Yes, the Xbox 360 controller is terrific on Windows XP. I am the original owner of both a white wired Xbox 360 controller and the official "play and charge" kit, which bundled the wireless controller with a Microsoft wireless USB dongle. In terms of XP drivers, the wireless controller works identically to the wired version. I agree that the AA batteries are terrific - I use Eneloop (or IKEA LADDA) rechargeable AAs, and I have decade-old Eneloops that still work decently... One concern however is that my wired Xbox 360 controller (which I bought brand new in 2007) exhibits a fair amount of stick drift...
These controllers are awesome for games like Need for Speed Porsche Unleashed, where the analogue triggers can be used for throttle/brake.
bZbZbZ wrote on 2025-02-14, 03:25:I agree that the AA batteries is terrific - I use Eneloops (or IKEA LADDA) rechargeable AA batteries
Yeah, I also use Eneloop rechargeable batteries and they've been great. When fully charged, I get about 24 hours of run time from my Xbox 360 controller, which translates to at least six days worth of gaming for me.
One concern however is that my wired xbox 360 controller (which I bought brand new in 2007) exhibits a fair amount of stick drift...
I heard that this can happen, but thankfully, I had no such issues with mine. In case it matters, my controllers are of the slightly newer "S" variety, as my console is the Xbox 360 S model.
However, I did have that unpleasant experience with one of my PlayStation 3 controllers. I bought it brand new in 2014, and it developed stick drift after just two years of use.
These controllers are awesome for games like Need for Speed Porsche Unleashed, where the analogue triggers can be used for throttle/brake.
Interesting. Do you need a DirectInput wrapper for that, or does it work out of the box? At the moment, I'm only using my Xbox 360 controller for newer XInput games, since I have a Logitech RumblePad 2 for older DirectInput titles.
Meanwhile, Sony is selling some sort of pro version of their controller for a whopping 200€+ without Hall effect sticks. Add in the inability to use AA batteries and it's almost as if they don't want their controllers to last.
I've personally liked 8BitDo controllers for my light use: cheap, well made, and many of them have Hall effect sticks. Just minor gripes with their size/shape - the 8BitDo Pro 2 has a bit too steep an angle on its triggers, and the 8BitDo Ultimate is surprisingly small.
Sombrero wrote on 2025-02-14, 10:41:hall effect sticks
I wasn't familiar with that term, so the first thing that came to mind was "Wait, gamepad sticks can do EAX now?". 😁
Reading up on that, it seems like pretty cool tech. Will definitely look into it some more.
Joseph_Joestar wrote on 2025-02-14, 13:46:I wasn't familiar with that term, so the first thing that came to mind was "Wait, gamepad sticks can do EAX now?". 😁
Some like their buttons clicky, maybe some like their buttons REALLY clicky 😀
Joseph_Joestar wrote on 2025-02-14, 04:45:bZbZbZ wrote on 2025-02-14, 03:25:These controllers are awesome for games like Need for Speed Porsche Unleashed, where the analogue triggers can be used for throttle/brake.
Interesting. Do you need a DirectInput wrapper for that, or does it work out of the box? At the moment, I'm only using my Xbox 360 controller for newer XInput games, since I have a Logitech RumblePad 2 for older DirectInput titles.
Yup... works out of the box. The game itself needed some work to run stable in Windows XP (I used an XP compatibility patch and nGlide, since getting Direct3D to work with my Radeon HD 5xxx was a dead end), but the Xbox 360 controller with the official Microsoft driver was picked up by the game no problem!
Oh I just dropped by to say that I love your helpful and informative posts here on the forum. I actually got inspired from you and Phil to build something quite similar as my XP gaming rig.
Ironically, I'm still using the same config (only with 8 GB of RAM) for modern-ish gaming on Win10 (and for retro too, of course).
Hans Tork wrote on 2025-02-16, 07:48:Oh I just dropped by to say that I love your helpful and informative posts here on the forum. I actually got inspired from you and Phil to build something quite similar as my XP gaming rig.
Cheers! I also got inspired by Phil and other people here for some of my builds, so it's always nice to pass on the knowledge. 😀
KainXVIII wrote on 2025-02-16, 08:16:Ironically, i'm still using the same config (only with 8gb of ram) for modern'ish gaming on Win10 (and for retro too, of course).
Heh, this was basically my Win7 work (from home) PC back in 2013, hence the 16GB RAM. Didn't even know it had WinXP drivers at that time. 😁 I have upgraded it over the years, most notably by adding the X-Fi and GTX 970 to make it more gaming friendly.
Small update: now that I've started using SGSSAA for further enhancing WinXP era games, I realized that I need to monitor GPU utilization a bit more thoroughly. This is mostly so that I can determine the optimal SGSSAA setting for each game, as their demands can vary quite a bit. For this, I use MSI Afterburner 4.6.4, which also comes with RivaTuner Statistics Server 7.3.3. With those, I can easily see how high my GPU and CPU utilization is for each game, which is especially important if I'm using SGSSAA. For example, here's a screenshot from Prince of Persia: Sands of Time running at 1600x1200 with 8xSGSSAA:
And here's the same screenshot with 4xSGSSAA:
You can clearly see how the frame rate starts to buckle when using 8xSGSSAA on my GTX 970 in this game. Granted, this is an edge case that I've deliberately created by standing very close to the sand portal, which uses a lot of transparency effects. During normal gameplay, the frame rate stays locked at 60 FPS most of the time, but I still didn't like these occasional spikes. For that reason, I decided to drop to 4xSGSSAA for Sands of Time. It's kinda funny that a game from late 2003 manages to tax a GTX 970 so much, but that's the cost of using extremely high quality anti-aliasing, I guess. That said, even 4xSGSSAA is pretty clean, with some very minor shimmering during movement, so it's still a great result.
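For a rough sense of why 8x hurts so much more than 4x, here's a back-of-envelope sketch (my own illustrative numbers, not profiler data): SGSSAA shades every sample, so per-frame shading work scales with resolution times sample count.

```python
# Back-of-envelope sketch: SGSSAA shades every sample per pixel,
# so shading work scales linearly with the sample count.

def shaded_samples(width, height, aa_samples):
    """Total shaded samples per frame at the given resolution and AA level."""
    return width * height * aa_samples

base = shaded_samples(1600, 1200, 1)  # no AA
x4 = shaded_samples(1600, 1200, 4)
x8 = shaded_samples(1600, 1200, 8)

print(x8 / base)  # 8.0 -> eight times the pixel shading work of no AA
print(x8 / x4)    # 2.0 -> dropping from 8x to 4x halves it
```

This ignores memory bandwidth and transparency overdraw (which is exactly what the sand portal stresses), so the real cost near it is even worse than the linear estimate suggests.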
Joseph_Joestar wrote on 2024-04-03, 19:42:So far, my only use case for that are the first two Splinter Cell games (mostly the original one), due to the reliance on Shadow Buffers which only work properly on GeForce 3/4/FX cards. Thankfully, dgVoodoo2 can handle that, and Creative ALchemy can restore EAX under Win7, making Splinter Cell 1 and 2 quite playable on this rig. While I usually prefer using real hardware for retro gaming, the first Splinter Cell runs very poorly on period-correct GPUs, especially when targeting higher resolutions with fully maxed out settings. But using dgVoodoo2, I can play Splinter Cell 1 and 2 at 1600x1200 with AA and AF cranked up, while having 60+ FPS.
I've been testing this some more, and I think I've found the best dgVoodoo2 config for playing the original Splinter Cell on my system:
Relevant settings:
Running the game at 1600x1200 as its native resolution and using these settings gives me something comparable to 4xSGSSAA in terms of visuals:
This looks super clean and doesn't tax my GTX 970 too much. The jaggies and movement related shimmering are almost entirely gone. Also, the first Splinter Cell is officially supported by ALchemy under Win7, so EAX 3.0 is working fine too.
EDIT - after some further testing, it seems that doubling the resolution causes some issues with thermal vision. Similarly, forcing MSAA causes the water rendering to glitch out on the Oil Refinery level. So while it's possible to achieve a very clean look using those features, there are some drawbacks.
Small update on Splinter Cell. If you make the game use shadow projectors by setting ForceShadowMode=0 in SplinterCell.ini it will work fine even without dgVoodoo2. However, you will lose the high quality shadows and light sources. But on the plus side, in shadow projector mode, you can force 8xSGSSAA by using the anti aliasing compatibility flag 0x00000040.
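For anyone hunting for the exact edit, it looks something like this (the section name is my assumption based on Unreal-engine ini conventions, since the post only mentions the key itself; check where ForceShadowMode actually lives in your copy of SplinterCell.ini):

```ini
; SplinterCell.ini - force shadow projectors instead of shadow buffers
; (section name assumed from Unreal-engine conventions; verify in your file)
[D3DDrv.D3DRenderDevice]
ForceShadowMode=0
```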
With this approach, there are also no graphical glitches on the water during the Oil Refinery level. Thermal vision works properly as well. Lastly, the game will run fine on WinXP, so you can get native EAX instead of ALchemy. Speaking of that, I recently learned that there is now a fan made fix which restores EAX 3.0 functionality to the GOG and Ubisoft Connect versions of the game.
Since the original Splinter Cell worked with SGSSAA, I decided to test Pandora Tomorrow as well. And sure enough, it also works when using the same compatibility flag i.e. 0x00000040.
Those palm tree leaves never looked smoother, and they no longer produce any shimmering or flickering during movement. To address the missing light/shadow rendering on modern GPUs, I used this experimental fix. It's not 100% accurate, but it's still good enough to play through the entire game without any major issues. My usual test cases are the flashlights mounted on guard helmets and the huge floodlights in the mined courtyard near the TV station. Both of those render correctly with this fix. And best of all, it works under WinXP, so I can play the game there and get native EAX support as well.
I did run into an unrelated issue with the in-game movie playback. Updating the binkw32.dll file (as suggested by the PC Gaming Wiki) did help, but I'm not sure if the problem is completely gone.
I noticed something odd while playing certain games like Mirror's Edge and Battlefield 2 under WinXP. While this rig is more than powerful enough to run those games at 1080p @ 60 FPS, my frame time graph (via MSI Afterburner) had some unusual spikes during normal gameplay. Here's an example from the first level of Mirror's Edge:
I researched this a bit, and eventually realized that there's a setting which can smooth it out. It's called "Maximum pre-rendered frames" in older Nvidia drivers, such as 355.98 which I'm currently using. You want to set this to "1" then enable V-Sync in the Nvidia drivers, and lastly disable V-Sync in the game's video options.
Here's the same area from Mirror's Edge after the aforementioned changes were made:
No frame time spikes anymore, and the game definitely feels more responsive to me. Now, depending on the type of game that you're playing, this might not be super noticeable. However, in a fast paced platforming title like Mirror's Edge, it makes a real difference. From what I gather, this is a well known thing among hardcore gamers, but I wasn't aware of it until recently.
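If you want to quantify those spikes instead of eyeballing the Afterburner graph, here's a hypothetical sketch (the threshold and the sample frame times are made up for illustration) that counts frames exceeding a tolerance above the 60 FPS frame-time target:

```python
# Sketch: counting frame-time spikes in a list of frame times (ms),
# e.g. values read off an RTSS/Afterburner frame time graph or log.

def spike_count(frame_times_ms, target_ms=16.7, tolerance=1.5):
    """Count frames that took longer than tolerance x the target frame time."""
    return sum(1 for t in frame_times_ms if t > target_ms * tolerance)

smooth = [16.6, 16.7, 16.8, 16.7]        # the flat line after the fix
spiky = [16.7, 16.7, 40.2, 16.7, 33.5]   # the kind of graph I saw before

print(spike_count(smooth))  # 0
print(spike_count(spiky))   # 2
```

A spike count of zero over a level run is basically what the "after" screenshot shows.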
Joseph_Joestar wrote on 2025-03-25, 18:02:From what I gather, this is a well known thing among hardcore gamers, but I wasn't aware of it until recently.
I found out about that back when The Elder Scrolls IV: Oblivion was released. My old and creaky Radeon 9800 Pro could only run it reasonably at low settings, but somehow I learned about the ATI equivalent of that setting and it made the game run way better. I remember being able to raise it to medium settings and it still ran better than on low settings without it. Odd now that I think about it, since that setting usually just reduces mouse lag.