VOGONS


Reply 20 of 189, by Mau1wurf1977

Rank: l33t++

Phil's Splinter Cell Retro PC Walkthrough - Part 1: Training - Geforce4 Shadow Buffers: https://www.youtube.com/watch?v=QOBMHEQR6iI

Phil's Splinter Cell Retro PC Walkthrough - Part 2: Police Station - Geforce4 Shadow Buffers: https://www.youtube.com/watch?v=AgxT3scuMZw

Phil's Splinter Cell Retro PC Walkthrough - Part 3: Defence Ministry - Geforce4 Shadow Buffers: https://www.youtube.com/watch?v=cJ5DjMZQw-U

My website with reviews, demos, drivers, tutorials and more...
My YouTube channel

Reply 22 of 189, by Mau1wurf1977

Rank: l33t++
Davros wrote:

Pandora Tomorrow also has the non-power-of-2 error, same as Crimson Skies.
PS: can't get it to run with 3D Analyze.

I only briefly tried that game. It comes with a thorough system check, and it only complained about the video driver because I use the one for the first Splinter Cell. Once I have done all the videos for Splinter Cell, I will tackle Pandora Tomorrow.

But by the looks of it the game is very beautiful and atmospheric.

My website with reviews, demos, drivers, tutorials and more...
My YouTube channel

Reply 23 of 189, by rick6

Rank: Member
Mau1wurf1977 wrote:

Question: How does the top Geforce3 stack up against the Geforce4 in this game? Was the Ti500 the fastest one? Would love to test it.

I remember running it on my computer at the time, an Athlon XP 2000+ with a GeForce3 Ti 200 128 MB overclocked above Ti 500 clocks, and to be fair its performance wasn't stellar. A GeForce4 Ti might be better suited for it.

My 2001 gaming beast in all its "Pentium 4 Willamette" glory!

Reply 25 of 189, by Mau1wurf1977

Rank: l33t++

Better graphics for me. Half the time you sneak around in a crouched position 😀 so frame rate really isn't that important.

My website with reviews, demos, drivers, tutorials and more...
My YouTube channel

Reply 26 of 189, by obobskivich

Rank: l33t
Mau1wurf1977 wrote:

How would an original XBox with component cables look on a modern TV? Would it display correctly in 4:3 aspect ratio?

I don't have an original Xbox, but I have a PlayStation 2 with component cables (it also does 480p), and what you get depends on the game and the TV. Some games on PS2 can be set to 16:9 (and are indeed rendered for widescreen), and if I remember right the Xbox also supported that. On the TV side, it depends on the TV's input processing capabilities: on both of my HDTVs I can set them to "Normal", which means 4:3 from 480i/480p, or "Wide", which gives the correct 16:9 aspect ratio (the same settings used for anamorphic output from a DVD player). I would assume some TVs offer more or fewer options in that regard.
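
To put rough numbers on the "Normal" vs. "Wide" behaviour: the console sends the same 720x480 frame either way, and the TV setting only changes the pixel aspect ratio it assumes when stretching that frame. This is just a back-of-the-envelope sketch (a simplification that treats all 720 columns as active picture), not a description of any particular TV:

#include <cstdio>

int main()
{
    // Rough illustration only: a 480p frame is 720x480 pixels regardless of
    // the TV setting; "Normal" vs. "Wide" just changes the display aspect
    // ratio those pixels are stretched to, i.e. the assumed pixel aspect ratio.
    const double storage_ar = 720.0 / 480.0;                // shape of the pixel grid
    const double display_ar[] = { 4.0 / 3.0, 16.0 / 9.0 };  // "Normal" vs "Wide"

    for (double dar : display_ar)
        std::printf("display AR %.3f -> pixel aspect ratio %.3f\n", dar, dar / storage_ar);
    // Prints roughly 0.889 for 4:3 and 1.185 for 16:9: same pixels, different stretch.
    return 0;
}

Anamorphic DVDs work the same way, which is why the same TV settings apply.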

Mau1wurf1977 wrote:

Question: How does the top Geforce3 stack up against the Geforce4 in this game? Was the Ti500 the fastest one? Would love to test it.

No idea about this game specifically, but the GeForce4 Ti is generally much faster (better PS/VS support too). Among the GeForce3 cards, the Ti 500 is the fastest (the Ti series came out later, alongside the GeForce4 Ti, as the "new midrange" for nVidia in late 2001/early 2002). A Ti 4400 or better should be faster hands down.

Mau1wurf1977 wrote:

Could it be that my FX5950 Ultra is a bit faulty? Are these two cards based on the same chip?

Not directly, no. The FX 5500 is NV34B, which is derived from the FX 5200 (NV34), part of the "original FX" lineup that included the FX 5600 and 5800. The 5950 is a revision of NV35 (FX 5900), which had changes/improvements made to its PS/VS engines to increase performance in SM2.0 titles.

This review contains some information about the NV30->NV35 changes:
http://ixbtlabs.com/articles2/gffx/5900u.html

They're similar, but the 5500 is not just a "cut down" 5950 as far as I understand. No idea if your specific card is faulty - I'd expect other issues if it were though, like random crashes/locks, artifacts, etc.

Mau1wurf1977 wrote:

So yea, if someone could test this with another FX 5950 Ultra that would be awesome 😀

Is there a demo version of this game that I could test out for you? 😀

Reply 27 of 189, by d1stortion

Rank: Oldbie
obobskivich wrote:

but GeForce 4 Ti is generally much faster (better PS/VS support too)

Actually, there is a lot of conflicting/false information on this DX8/DX8.0a/DX8.1 stuff. Here (this article needs serious rewriting in general) the GF3 is listed as a DX8 card, while the GF4 Ti is stated to support DX8.1. I believe DX8.0a/PS1.3 is correct for the GF4 Ti; this is also stated on its own page. Coincidentally, this is also the last version supported by Win95.

Reply 28 of 189, by obobskivich

Rank: l33t
d1stortion wrote:
obobskivich wrote:

but GeForce 4 Ti is generally much faster (better PS/VS support too)

Actually, there is a lot of conflicting/false information on this DX8/DX8.0a/DX8.1 stuff. Here (this article needs serious rewriting in general) the GF3 is listed as a DX8 card, while the GF4 Ti is stated to support DX8.1. I believe DX8.0a/PS1.3 is correct for the GF4 Ti; this is also stated on its own page. Coincidentally, this is also the last version supported by Win95.

Even from that Wikipedia link, the GeForce4 Ti has much better VS throughput. That's what I was getting at - the GF4 only added to/improved on the GF3 in terms of feature support by updating to PS1.3. Both are properly DX8.1 cards as far as I know; "support" was probably the wrong word.

Here's the ixbt review/article on the GF4 launch:
http://ixbtlabs.com/articles/gf4/index1.html

It explores differences/improvements over GeForce 3, and provides test data. If you want to skip ahead to the benchmark comparisons they start here:
http://ixbtlabs.com/articles/gf4/index3.html ("Part II" talks about multi-monitor support, the physical cards for Ti 4400/4600, and some other stuff)

I would expect the GF3 to maintain this trend with Splinter Cell, and perform worse than the GeForce 4 Ti or higher-spec GeForce FX cards.

Reply 29 of 189, by d1stortion

Rank: Oldbie

IIRC the 8.0 vs. 8.1 thing was relevant in HL2 which had a ton of different rendering paths. On the GF4 Ti the game would only report 8.0 or 8.0a, while the Radeon 8500 could do 8.1. A different case would be Voodoo2 which had a DirectX 7 driver released for it; nobody would call it a D3D7 card, certainly. So the differentiation is mostly in terms of feature level IMO.
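
For anyone curious what "reporting 8.0 vs. 8.1" boils down to in practice: a D3D9 game with multiple rendering paths can read the pixel shader version out of the device caps and bucket the hardware from that. This is only a minimal sketch of the idea, not Valve's actual detection code; the helper and path names are made up for illustration:

#include <d3d9.h>
#include <cstdio>

// Hypothetical helper: map the reported pixel shader version to a render path.
const char* PickShaderPath(IDirect3D9* d3d)
{
    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return "dx7 (fixed function)";

    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        return "dx9 (PS2.0: R300 / NV3x class)";
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 4))
        return "dx8.1 (PS1.4: Radeon 8500 class)";
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1))
        return "dx8 (PS1.1-1.3: GeForce3 / GeForce4 Ti class)";
    return "dx7 (fixed function)";
}

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d)
        return 1;
    std::printf("Selected path: %s\n", PickShaderPath(d3d));
    d3d->Release();
    return 0;
}

Caps are only half the story, though; games can also override the chosen path per vendor/device ID, which fits the FX-defaulting-to-8.1 behaviour being discussed here.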

Reply 30 of 189, by obobskivich

Rank: l33t
d1stortion wrote:

IIRC the 8.0 vs. 8.1 thing was relevant in HL2 which had a ton of different rendering paths. On the GF4 Ti the game would only report 8.0 or 8.0a, while the Radeon 8500 could do 8.1. A different case would be Voodoo2 which had a DirectX 7 driver released for it; nobody would call it a D3D7 card, certainly. So the differentiation is mostly in terms of feature level IMO.

I think in that case it's more an example of Valve being in bed with ATi than anything else. The GF3/4 are generally regarded as 8.1 components when reviewed (e.g. by ixbt, Hardware Secrets, etc.). But in the magical world of Valve marketing, they become DirectX 8, just like GeForce FX becomes DirectX 8.1, and GeForce 6 stops supporting SM3.0. Anything to make those Radeon cards look better for the cameras. 😉

Reply 31 of 189, by d1stortion

Rank: Oldbie

Well, the problem is that FX cards were effectively 8.1 cards, because PS2.0a was dirt slow on them. It's not even limited to HL2.

Haven't heard about the GF6 not being detected as DX9 though. I had a 6500 back in the day and I'm pretty sure it was detected as DX9.0c in this game. It should run Lost Coast as well, which I believe requires SM3.0 for full settings.

Reply 32 of 189, by obobskivich

Rank: l33t

To an extent I was being facetious, but there is truth to the Valve<->ATi connection. The GeForce FX cards (at least the higher-spec models) are capable of doing DX9 in the real world, and I do remember running Half-Life 2 (pre-HDR) quite fine on my FX 5900 at lower resolutions (I don't recall my 9700 enabling much better resolutions/IQ enhancements while maintaining playable framerates; it just started out at SM2.0). Sure, the GeForce 6/7 did a better job, but for cards a year (or more, in the case of the 9700) older than the game it wasn't too bad. I don't remember any sort of "downgrading" happening in Far Cry, Doom 3, etc., and they performed fairly equivalently there. Yes, SM2.0 performance on the GeForce FX isn't stellar, but the difference is exaggerated in HL2 vs other games or 3D03. It also isn't surprising that HL2 would equally show "downgrades" for other nVidia cards (what I'd be really curious to see is how it handles third-party SM2.0+ cards).

Reply 33 of 189, by d1stortion

Rank: Oldbie

Hmm, what do you understand by "lower resolutions"? Mainstream resolution at that time would have still been 1280x1024. I ran the game at either 800x600 or 1024x768 with no AA (would be death on a 64-bit bus, lol) at medium or high settings on my 6500 and it was rather fluid IIRC. The 6500 is just a glorified 6200, while the 5900 is the top-of-the-line card from the previous gen with vastly superior specs on paper (it is literally twice as fast), so it is somewhat pathetic that it would run the game similarly even when forced into the lower-quality DX8.1 mode.

As far as Doom 3 goes, with it being an OpenGL game it isn't surprising that it runs well on the FX. It's well known that Nvidia's OpenGL driver was superior, and so id cooperated with Nvidia for this one. The "Ultra Shadow" stencil shadowing feature that NV3x is good at was used in Doom 3.

Reply 34 of 189, by obobskivich

Rank: l33t
d1stortion wrote:

Hmm, what do you understand by "lower resolutions"? Mainstream resolution at that time would have still been 1280x1024.

Talking about HL2 specifically, I meant 800x600 as "lower resolution."

I'd also add that 1024x768 was much more mainstream in 2003-2004, at least from what I remember; it'd be nice if we could view historic versions of the Steam Survey to verify this kind of stuff though (I do recall at one point 1024x768 was something like 70-80% of machines surveyed).

d1stortion wrote:

I ran the game at either 800x600 or 1024x768 with no AA (would be death on a 64-bit bus, 🤣) at medium or high settings on my 6500 and it was rather fluid IIRC. The 6500 is just a glorified 6200, while the 5900 is the top-of-the-line card from the previous gen with vastly superior specs on paper (it is literally twice as fast), so it is somewhat pathetic that it would run the game similarly even when forced into the lower-quality DX8.1 mode.

FX 5900 had no problems at 800x600 in DX9 IME; in 8.1 it would run at higher resolutions (1024x768 or above).

If you look at the other #s from the test you linked, in 3D03, you'll see the Radeon and GeForce cards being squarely competitive:
http://www.tomshardware.com/reviews/nvidia-ge … tra,630-29.html (Game 4 - Mother Nature, uses SM2.0)
http://www.tomshardware.com/reviews/nvidia-ge … tra,630-30.html

And yeah, Doom 3 was optimized for nVidia cards from the start, which is/was equally a bummer for ATi owners (in general I don't like it when developers play favorites with one OEM or another because it usually means end-users are shafted). I don't think the optimizations in Doom 3 were quite as aggressive though, for example:
http://www.tomshardware.com/reviews/nvidia-ge … tra,630-12.html

Here's an older article talking about Valve, ATi, and nVidia:
http://www.anandtech.com/show/1144

Reply 35 of 189, by d1stortion

Rank: Oldbie

The vast majority of LCDs sold during that time were 1280x1024 though. Of course widescreens existed, but they were much more of a high-end/enthusiast thing. While running 1024x768 on a 1280x1024 screen is quite bad due to scaling and the wrong aspect ratio, it was the only choice for people with slow GPUs like myself; performance literally collapsed when going to 1280x1024 in every game with that lowly 6500. If you look up tests, a 9800 could easily do 1600x1200 in UT2k3, and that was in mid-2003; since that level of performance was considered mainstream in 2004 with the 6600 GT (which I remember being THE recommendation for an affordable gaming card, btw), I'd class 1280x1024 as mainstream for 04-05.

For HL2 DX9 performance on FX 5950 Ultra: http://www.youtube.com/watch?v=KwUxxLzu-uk

Reply 36 of 189, by F2bnp

Rank: l33t

1024x768 as mainstream for 2003 and 2004 sounds about right. 1280x1024 was a really nice resolution and 1600x1200 was the unreachable dream for most 😀.
I remember the first time I ran Half-Life 2 on my FX 5600 XT, where the XT suffix, in complete contrast to ATi's cards, marked the lower-end variant. Anyway, I remember being pretty confused that the game used DX8.1. I didn't care too much though, hehe.

Obobskivich, the Doom 3 tests you posted are a bit off: they were published a year before the actual game came out. I think benchmarks of the final game illustrate your point better:

http://www.xbitlabs.com/articles/graphics/dis … ests.html#sect0

Reply 37 of 189, by obobskivich

Rank: l33t
d1stortion wrote:

The vast majority of LCDs sold during that time were 1280x1024 though. Of course widescreens existed, but they were much more of a high-end/enthusiast thing. While running 1024x768 on a 1280x1024 screen is quite bad due to scaling and the wrong aspect ratio, it was the only choice for people with slow GPUs like myself; performance literally collapsed when going to 1280x1024 in every game with that lowly 6500. If you look up tests, a 9800 could easily do 1600x1200 in UT2k3, and that was in mid-2003; since that level of performance was considered mainstream in 2004 with the 6600 GT (which I remember being THE recommendation for an affordable gaming card, btw), I'd class 1280x1024 as mainstream for 04-05.

For HL2 DX9 performance on FX 5950 Ultra: http://www.youtube.com/watch?v=KwUxxLzu-uk

I don't think LCDs were that common in 2003-2004, but yeah, 1280x1024 is pretty common for an LCD, and non-native resolutions (especially at a different aspect ratio, which is hard to avoid on a 5:4 monitor) tend not to look very good. 😵

F2bnp wrote:

Obobskivich, the Doom III tests you posted are a bit off. These were posted a year before the actual game came out. I think the final game benchmarks illustrate your point better:

http://www.xbitlabs.com/articles/graphics/dis … ests.html#sect0

I was wondering why it said "Special Preview" 🤣 Thanks for catching that one. Still biased results, but not *as* bad. 😐

Reply 38 of 189, by Jarvik7

Rank: Newbie

Someone describing Splinter Cell as retro suddenly made me feel very old.
Splinter Cell is still one of those "new games I need to get around to playing someday".

Reply 39 of 189, by Mau1wurf1977

Rank: l33t++

Played another level of Splinter Cell 😀

Phil's Splinter Cell Retro PC Walkthrough - Part 4: Oil Refinery - Geforce4 Shadow Buffers: https://www.youtube.com/watch?v=FPCLOy4J7y4&feature=youtu.be

I remember playing Splinter Cell around 2003-2004 on a Pentium 4 with a Radeon 9800. I wasn't aware of the shadow issues, and the game certainly runs faster with shadow projectors. Some reviewers knew this and tested all cards in that mode; some didn't, and made ATI look much faster 😀

I had a CRT at that time. I believe I purchased my first LCD monitor in 2006 or 2007:

[Attached photos: my first LCD monitor]

It was the cheapest brand/model you could get. It did have a stuck pixel, but I loved it and never looked back at CRTs. Soon after, the 22" 1680x1050 widescreen resolution revolution began.

I remember people telling me I would be fine with a 6600 GT, but I found performance in games such as BF2 and F.E.A.R. lacking, so I had no choice but to return the card and upgrade to a 7800 GT. To this date it's my most expensive video card purchase ever, at around A$600 😵

My website with reviews, demos, drivers, tutorials and more...
My YouTube channel