VOGONS


Was UT built for 3DFX?

Topic actions

First post, by BSA Starfire

User metadata
Rank Oldbie

I have been playing Unreal Tournament on my Coppermine P3 933 MHz with a Voodoo 5 5500 and, on and off, a Voodoo 3 3000, both AGP, with great performance and very good FPS (60+). Now I've decided to install it on a Northwood P4 2.6 GHz (800 FSB) system with the same RAM size (512 MB) and same OS (Win XP SP2), but with an ATI Radeon 64 MB DDR card, and the performance seems really crummy in comparison. So my question is: was UT really optimised for the Voodoo cards and ignored everything else, or is something else going on?
http://www.anandtech.com/show/585/26 Anand seemed to have the same deal going on, but does the difference between a P3 933 and a P4 2.6 really not make up for this? I'm kind of surprised, as the P4 box is only managing a regular frame rate of just under 40 FPS @ 800x600, 32-bit. TBH this seems really weak; Unreal Gold is also averaging 38 FPS on the demo at the same settings.
Well, thanks for taking the time to read.
Best,
Chris

286 20MHz,1MB RAM,Trident 8900B 1MB, Conner CFA-170A.SB 1350B
386SX 33MHz,ULSI 387,4MB Ram,OAK OTI077 1MB. Seagate ST1144A, MS WSS audio
Amstrad PC 9486i, DX/2 66, 16 MB RAM, Cirrus SVGA,Win 95,SB 16
Cyrix MII 333,128MB,SiS 6326 H0 rev,ESS 1869,Win ME

Reply 1 of 7, by leileilol

User metadata
Rank l33t++

It's less 'built for 3dfx' and more 'Sweeney's first D3D driver ever': slow from code immaturity, particularly because it was made for pre-GeForce hardware and didn't take much advantage of the new T&L stuff (and that's hard to do anyway when all the models are vertex-transformed and, the way UE1 works in general, the core itself provides polygons to the renderer module).

Also, a primary difference between the Voodoo and the Radeon is paletted texture support: Radeons don't do it, and UT primarily has only paletted textures. The Radeon has to convert them on the fly before uploading, which can cost CPU time for stuff like those fire/fractal textures.
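To make the cost concrete, here is a minimal sketch (not Unreal Engine code; all names are illustrative) of what a driver or renderer must do when the hardware lacks paletted-texture support: every 8-bit palette index gets expanded to 4 RGBA bytes on the CPU before upload, so the copy work and memory quadruple, and the whole thing repeats whenever the palette animates (as with fire textures).

```python
# Illustrative sketch: expanding an 8-bit paletted texture to 32-bit RGBA
# on the CPU, the extra work needed when the GPU can't sample paletted
# textures directly. Function and variable names are hypothetical.

def expand_paletted(indices, palette):
    """indices: 8-bit palette indices, one per texel.
    palette: 256 (r, g, b, a) tuples.
    Returns a flat list of RGBA bytes ready for upload."""
    out = []
    for i in indices:
        out.extend(palette[i])  # 1 byte in -> 4 bytes out
    return out

# A 2x2 texture using two palette entries.
palette = [(0, 0, 0, 255)] * 256
palette[1] = (255, 0, 0, 255)   # entry 1 = opaque red
texels = [0, 1, 1, 0]
rgba = expand_paletted(texels, palette)
# 4 texels become 16 bytes; a palette-animated texture pays this
# conversion every frame the palette changes.
```

A card with native paletted-texture support (like the Voodoos) skips this step entirely and uploads the 8-bit indices once.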

The most canonical renderer for UT is the software renderer; it's a game built for that.

Also, consider the Radeon's driver immaturity at the time of those benchmarks. The Radeon 7200 had a bad launch.

long live PCem

Reply 2 of 7, by Putas

User metadata
Rank Oldbie
BSA Starfire wrote:

http://www.anandtech.com/show/585/26 anand seemed to have the same deal going on, but really does the difference between a P3 933 and P4 2.6 not make up for this? I'm kinda surprised as the P4 box is only making a regular frame rate of just under 40fps @ 800x600 32 bit. TBH this seems really weak, unreal gold is also averaging 38 FPS on the demo at same settings.

I don't see the same deal going on there. Your setup is strongly underperforming for some reason, unless your Radeon is a VE. Also, maybe UT defaulted to Glide with your 3dfx cards, giving them an advantage.

Reply 3 of 7, by rick6

User metadata
Rank Member

You could download a newer Direct3D renderer and enjoy either Unreal or Unreal Tournament with much better image quality and performance. I played Unreal 1 today with both S3TC textures and detail textures at 16-bit colour, LodBias=0, vsync off, on a GeForce 2 MX400 at 1280x1024 with excellent performance: usually 50 to 80 FPS, dropping a bit under 45 in heavy fights.
Despite being at 16-bit colour depth and LodBias=0, the image quality was really good. I could post a few screenshots later if you want.
I can't really remember if the new Direct3D renderer came with the new patch or if I installed it manually.

My 2001 gaming beast in all its "Pentium 4 Willamette" glory!

Reply 4 of 7, by BSA Starfire

User metadata
Rank Oldbie

Thanks for the replies. I think I had a bad driver install on the Radeon; after a re-install all is well now, averaging 58 or so FPS @ 1024x768, 32-bit colour in UT. I'm using the version from the Unreal Anthology published in 2006. It seems the card is a VE version, as the GPU and memory clocks are only running at 148 MHz; this became apparent on re-install, so I guess this is pretty much what I can expect. Perfectly playable and responsive anyhow 😀 Yes, the 3DFX cards were using Glide.
Best,
Chris


Reply 5 of 7, by Putas

User metadata
Rank Oldbie

It is more likely an LE than a VE. Does TnL work?

Reply 6 of 7, by BSA Starfire

User metadata
Rank Oldbie

Yes, you're right, it is an LE. It's the first time I have really played with these earlier ATI cards, and it's all a bit confusing!


Reply 7 of 7, by swaaye

User metadata
Rank l33t++

Radeon LE has a quirk. The LE suffix means lower clock speeds, and HyperZ is supposed to be disabled by default. It was this way for a number of driver releases. However, at some point ATI stopped differentiating LEs from standard Radeons and enabled the HyperZ features. The problem was that my LE apparently had defective HyperZ hardware, and enabling it caused geometry to have missing/flashing polygons. So if you see visual anomalies like that, you can try a few registry tweaks to disable HyperZ again.