VOGONS

Deus Ex Invisible War and GPUs

First post, by swaaye

User metadata
Rank: l33t++

Does anyone else find it mind-blowing that Deus Ex Invisible War is more demanding than Far Cry and Doom 3? I expected the FX 5900 Ultra to do fairly well with this game because it is all PS 1.1 and NV35 has double-speed z/stencil fillrate (good for stencil shadows), but the game is just a mess from a performance standpoint. It appears to be extremely fillrate dependent, beyond reason, as if there is unbelievable overdraw for some reason.

Lookie here
http://www.xbitlabs.com/articles/graphics/dis … -27gpu2_21.html

Thief Deadly Shadows runs much better. I suppose they managed to dramatically optimize the renderer.
http://www.xbitlabs.com/articles/graphics/dis … 2_20.html#sect0
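The overdraw hypothesis can be put in back-of-the-envelope numbers. A quick sketch (the overdraw factors here are illustrative guesses, not measurements of the game):

```python
def required_fillrate(width, height, overdraw, fps):
    """Pixels the GPU must shade per second, given an average
    overdraw factor (how many times each screen pixel is drawn)."""
    return width * height * overdraw * fps

# 1600x1200 with a 60 fps target:
modest = required_fillrate(1600, 1200, 3, 60)   # ~0.35 Gpixels/s
heavy  = required_fillrate(1600, 1200, 10, 60)  # ~1.15 Gpixels/s
```

With heavy overdraw the pixel workload more than triples, which is roughly the kind of gap the benchmarks show between this game and better-behaved engines at the same resolution.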

Attachments

  • deus_1600_pure.gif (15.17 KiB, 2048 views)
    File license: Fair use/fair dealing exception

Reply 2 of 39, by Firtasik

User metadata
Rank: Oldbie

Yeah, I did complain about Invisible War in the "Crazy system requirements for its time" thread. This game was released too early. Oh, and level loading times are just awful even on modern CPUs. Still, I like this game; it's a guilty pleasure for me. 😊


Reply 4 of 39, by leileilol

User metadata
Rank: l33t++

The shoehorned Carmack's Reverse stencil shadows were a big deal.

Bloom, on the other hand, is mostly just copying the screen to a texture buffer and blurring it creatively (e.g. copying it into a smaller buffer and additively drawing that back over the screen as a quad several times).
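That downsample-and-blur approach can be sketched roughly like this (pure-Python illustration of the idea; a real engine does this on the GPU with render targets, and the thresholds and factors here are made up):

```python
def bloom(frame, threshold=0.7, downscale=2, strength=0.6):
    """Cheap mid-2000s-style bloom on a 2D list of brightness values in [0,1]:
    bright-pass the frame, blur a low-res copy, add it back over the original."""
    h, w = len(frame), len(frame[0])
    # Bright-pass: keep only what exceeds the threshold.
    bright = [[max(frame[y][x] - threshold, 0.0) for x in range(w)] for y in range(h)]
    # Downsample by point-sampling every Nth pixel (the "smaller buffer").
    small = [row[::downscale] for row in bright[::downscale]]
    sh, sw = len(small), len(small[0])
    def at(y, x):  # clamp sampling at the buffer edges
        return small[min(max(y, 0), sh - 1)][min(max(x, 0), sw - 1)]
    # One 3x3 box-blur pass on the small buffer.
    blurred = [[sum(at(y + dy, x + dx) for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0
                for x in range(sw)] for y in range(sh)]
    # Upsample (nearest) and additively blend back onto the frame.
    return [[min(frame[y][x] + strength * blurred[y // downscale][x // downscale], 1.0)
             for x in range(w)] for y in range(h)]
```

The blur is nearly free because it runs at a fraction of the screen resolution, which is exactly why the effect was so popular on that era of hardware.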

long live PCem

Reply 5 of 39, by swaaye

User metadata
Rank: l33t++

The bloom does have some impact but not that much. I saw some people say the bloom is partly intended to reduce aliasing, similar to Quincunx. Considering people were forced to run low resolutions, the blur was less than desirable at the time. But at say 1600x1200 it is a nice effect.

I suggest either no vsync or using D3DOverrider to force vsync + triple buffering. The game only double buffers, so you lose quite a bit of speed with its own vsync: you'll be stuck at integer fractions of the refresh rate (60 → 30 → 20 fps). This happens even with default.ini set to triple buffer by default.

Reply 6 of 39, by AidanExamineer

User metadata
Rank: Member

I always figured it just wasn't optimized at all.

I needed to get some video capture for a review/thing and installed the game on my current workhorse machine (i5-3470, 2GB GTX 660 SC, 16GB RAM, Seagate hybrid drive).

Chugged like a bitch. Crashed like a biiiiiitch. Looked like craaaaaaaaaaaaaaaaaaaaap. Ended up getting all of my capture on the Xbox version, which has a deplorable framerate but is at least consistent, and stable. The lighting effects are still better than most games bother with, so that's nice.

Reply 7 of 39, by swaaye

User metadata
Rank: l33t++

I played most of the game on my i5 2500 + Radeon 6950 machine and it ran really well. I even had AMD 4X supersampling forced I think and was running it at 1080p. I also hex edited widescreen support into it.

I'm not sure if I've tried it on any recent NV hardware....

I have not beaten the game though. I finally lost interest in whatever was going on when I got to Antarctica. The game areas are just so small and cheesy that I found it difficult to get immersed in them. It has some nice atmosphere at times though.

Reply 8 of 39, by F2bnp

User metadata
Rank: l33t

This is still pretty baffling. How did they mess up this badly? I remember a friend of mine playing through the game on a Pentium 4 2.0 GHz and a Ti 4200, and he obviously had a ton of problems and crashes. He was also a big fan of the first game and was majorly disappointed with this one.

Reply 9 of 39, by swaaye

User metadata
Rank: l33t++
F2bnp wrote:

This is still pretty baffling. How did they mess up this badly? I remember a friend of mine playing through the game on a Pentium 4 2.0 GHz and a Ti 4200, and he obviously had a ton of problems and crashes. He was also a big fan of the first game and was majorly disappointed with this one.

I found an article about the development of Thief Deadly Shadows. As you may know, T:DS and DXIW are similar technology and were developed somewhat in parallel. Overall it sounds like their development methodology was the main problem. For the technology specifically, they went too far with fancy effects and didn't have the skill to get it working well. Thief did end up running much better though, probably because they had more time to improve it.

http://www.rockpapershotgun.com/2007/09/21/ma … deadly-shadows/

Reply 10 of 39, by F2bnp

User metadata
Rank: l33t

Which reminds me I should play the rest of the Thief games (with the exception of the last one probably). I tried Dishonored once or twice but it didn't do the trick for me. However, since I played and beat (and loved) Thief 1, I may enjoy it far more now 😀.

Thief 3 looked spectacular back then! I kinda miss 2004-2005, when doing impressive shadow effects was all the rage. Doom 3, Thief 3, F.E.A.R., Chronicles of Riddick...
Speaking of which, it's a shame the xbitlabs article (which btw rocks, thank you very much swaaye 😀 ) doesn't cover Chronicles of Riddick. Does anybody have a comprehensive article on how this game performed? Should have been mostly a breeze for a 6800GT/Ultra, I reckon!

Reply 11 of 39, by AidanExamineer

User metadata
Rank: Member

I imagine it was the result of developing for the Xbox and porting to PC. The PC version acts almost identically to the Xbox version and doesn't seem to understand it's running on a PC. Also, as Phil likes to remind people, Splinter Cell was developed around a very specific GPU; it could be the same kind of problem with IW.

It's possible their test units were more powerful than retail Xbox units, leading to overestimating the acceptable framerate targets. I'm talking in the teens, not 20s.

Reply 12 of 39, by swaaye

User metadata
Rank: l33t++

It just sounds like they wanted to use the stencil shadows, copious bump mapping, and fancy lighting really badly, and that necessitates small areas with low polygon counts. Doom 3 is the same deal. But even with simplistic areas they still couldn't get it running very well. It's amazing they even considered making a Deus Ex sequel this way; the original game was partly known for its large open areas.

Besides the technology, the gameplay is a mess of fruity game design philosophy too. It's fairly similar to other Ion Storm tales. It's interesting that the original game came out as well as it did.

Reply 13 of 39, by swaaye

User metadata
Rank: l33t++
F2bnp wrote:

Which reminds me I should play the rest of the Thief games (with the exception of the last one probably). I tried Dishonored once or twice but it didn't do the trick for me. However, since I played and beat (and loved) Thief 1, I may enjoy it far more now 😀.

Thief 3 looked spectacular back then! I kinda miss 2004-2005, when doing impressive shadow effects was all the rage. Doom 3, Thief 3, F.E.A.R., Chronicles of Riddick...
Speaking of which, it's a shame the xbitlabs article (which btw rocks, thank you very much swaaye 😀 ) doesn't cover Chronicles of Riddick. Does anybody have a comprehensive article on how this game performed? Should have been mostly a breeze for a 6800GT/Ultra, I reckon!

Thief 3 is enjoyable. It still has a lot of loading areas in each mission like DXIW and those are annoying, but the gameplay is quite good and the atmosphere is great.

I played and beat the new Thief. It's not exactly bad, but it's certainly not a great game. I didn't enjoy their attempt at making it open-world-ish / hub based. It's confusing to navigate, and there really isn't much point unless you really get into hunting for whatever random loot you can find. You get to hear the same NPC conversations 100 times. The missions play rather similarly to Thief 3, though (some of it is clearly inspired by Thief 3). The AI that people often complain about works like it does in most stealth games. Aside from the weak open-world aspect, the story isn't exactly thrilling or well developed either. I hated the intro sequence that set up the story, chasing the idiotic rebel girlfriend/protege/whatever (not a good sign). And at the end, the final cinematic doesn't even match up with the end gameplay. Not exactly impressive overall.

I think Dishonored is definitely the better game. I can understand why Dishonored isn't for everyone though. I liked it but am also mixed on some of its aspects because I am not really extremely into its steampunk thing.

Reply 14 of 39, by F2bnp

User metadata
Rank: l33t

-Yeah, the whole AI thing was a bit weird to read. It sounded like it worked exactly like every stealth game ever, just like you said.

-The story is a mess because of development hell. I've done some reading, and it seems like they had three different stories at three different points in development and decided to combine them into one. That doesn't appear to have worked.

-About Dishonored: I was really put off by the steampunk setting; it had something to do with whales or whatever? I like steampunk settings, but this one certainly didn't click for me!

Reply 15 of 39, by AidanExamineer

User metadata
Rank: Member

I'm generally not a fan of when developers make a sequel to somebody else's game. Human Revolution was a pretty good game, but man it just has nothing to do with Deus Ex.

It looked like the new Thief was the same story.

Reply 18 of 39, by swaaye

User metadata
Rank: l33t++

I threw together an Opteron X2 2.4 GHz w/ PCIe setup to try some other cards at 1600x1200: an X850XT PE, an X1950XTX, and a factory-overclocked 3850 512MB. Surprisingly, they all ran the game similarly! Very smooth in general, with some minor stutter in large open areas (could be CPU related). The 3850 is something like 3x faster than an X850XT PE, so the CPU is likely the bottleneck.

I don't have any NV GF 7 - 9 cards at the moment unfortunately.

I think the game has problems on dual cores (play speed seems to deteriorate / get jumpy over time), so I got imagecfg and set it to run on one core. At one point the game speed was lurching along with turbo-pause-turbo-pause...
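For reference, imagecfg works by permanently patching the affinity mask in the EXE header; the same single-core restriction can be applied at runtime through the Win32 API. A rough sketch (SetProcessAffinityMask and GetCurrentProcess are real Win32 calls; the wrapper itself is illustrative and no-ops on non-Windows systems):

```python
import ctypes
import sys

def single_core_mask(core=0):
    # The affinity mask is a bitmask: bit N set = may run on logical CPU N.
    return 1 << core

def pin_current_process(core=0):
    """Pin the calling process to one logical CPU (Windows only).
    Roughly the runtime equivalent of what imagecfg bakes into the EXE."""
    if sys.platform != "win32":
        return False  # no-op elsewhere; Linux would use os.sched_setaffinity
    kernel32 = ctypes.windll.kernel32
    handle = kernel32.GetCurrentProcess()
    return bool(kernel32.SetProcessAffinityMask(handle, single_core_mask(core)))
```

The advantage of patching the EXE instead is that the game launches pinned from the first instruction, with no helper process needed.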

Reply 19 of 39, by swaaye

User metadata
Rank: l33t++

Ok, so I sold a GTX 560 Ti on eBay a while back and it got returned because the buyer didn't understand that she needed a low-profile card...

So I put the 560 in and tried it. It runs DXIW at a solid 60 fps at 1600x1200, as it should, on the same Opteron X2 2.4 GHz machine. I wasn't sure the card would work on nForce4's first-gen PCIe, but it does. I've read that Radeon 5000+ cards are trouble there.

Overall it seems like you want at least an X800XT or 7800 GTX for DXIW at 1600x1200. The 6800 Ultra is noticeably slower than the X800XT for some reason; possibly just the difference in raw fillrate.