VOGONS


Palomino faster than a Thoroughbred?


Reply 20 of 43, by retro games 100

User metadata
Rank l33t

For some extra testing, I removed the AGP Voodoo 5500, and reinstalled the AGP FX 5200. Now, how can I overclock this thing, just to see what it can do?

There's a win98 software application installed called "Expert Tool", which allows you to increase the FX 5200's core clock (both 3D and 2D), and also its memory clock. Curiously, there's also a set of jumpers on the mobo [Epox EP-8KTA3PRO, PCB Rev 1.0] that allows you to mess about with the mobo's AGP voltage. Currently, this jumper is set to its default setting of +0.0V. I can remove this jumper, and place it on either the +0.1V, +0.2V, +0.3V, or +0.4V jumper. Presumably this will "overclock" the AGP port. Is it worth trying?

I attempted to overclock the FX 5200 card last night by using the Expert Tool application. I timidly increased the core clock to 277 MHz (from its default setting of 250 MHz), and I also increased the memory clock to 344 MHz (from its default setting of 280 MHz). Then I ran glQuake in "1280/16 mode", and got 75.0 fps using the "timedemo demo2" console command. (Without this overclocking, the score is about 7% less, at 70 fps.) However, this overclocking using Expert Tool seemed to make very little difference to my win98 pcpbench results. (I got 249 instead of my typical score of 247.)
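
Just to sanity-check those numbers, here is a rough back-of-the-envelope sketch in Python (the figures are only the ones above; the guess that pcpbench is CPU-bound rather than GPU-bound is just my reading of it):

# Back-of-the-envelope comparison of the overclock vs. the benchmark gains.
# All figures are the ones quoted above; the "CPU-bound" note is only a guess.

def pct_gain(before, after):
    """Percentage increase from 'before' to 'after'."""
    return (after - before) / before * 100

print(f"core clock : +{pct_gain(250, 277):.1f}%")    # 250 -> 277 MHz
print(f"mem clock  : +{pct_gain(280, 344):.1f}%")    # 280 -> 344 MHz
print(f"glQuake    : +{pct_gain(70.0, 75.0):.1f}%")  # ~7%, roughly tracks the core clock
print(f"pcpbench   : +{pct_gain(247, 249):.1f}%")    # <1%, so that test looks CPU-bound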

So, do I continue to increase the overclocking values inside the Expert Tool utility, or should I alter the mobo's AGP voltage jumper, or do both perhaps?

If it all goes wrong, I may end up by accidentally "killing it with fire", which was leileilol's suggestion. 😉

Reply 21 of 43, by retro games 100

User metadata
Rank l33t

I'm probably chatting away to myself here, but I did some more tests with my "crappy" FX 5200.

The game MDK2 has a useful benchmarking test facility. I ran it, using these settings -

Res 1280x1024 (best my monitor can do)
Color 32 bit
Texture - max setting
Filtering trilinear
Mipmap, Fullscreen and Hardware T&L - all selected

(In other words, everything maxed out)

MDK2 benchmark gave me 40.90 fps, which ain't bad!

But then I used the "Expert Tool" utility to increase both the core and memory clocks of the FX 5200 card. I increased the core clock from 250 to 295 MHz, and the memory clock from 280 to 390 MHz.

MDK2 benchmark then gave me 70.46 fps, which IMHO (as an OC'ing noob) is an amazing increase.
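
For what it's worth, a quick sketch of the relative gains (figures as above; the fps jump is bigger than either clock bump, so maybe the stock run was being held back by something else - that part is pure speculation on my side):

# MDK2 benchmark, 1280x1024x32, everything maxed: stock vs. Expert Tool overclock.
# Figures are the ones quoted above.

runs = {
    "core clock (MHz)": (250, 295),
    "mem clock (MHz)":  (280, 390),
    "MDK2 avg fps":     (40.90, 70.46),
}

for name, (stock, oc) in runs.items():
    gain = (oc - stock) / stock
    print(f"{name:17s} {stock:>7} -> {oc:>7}  (+{gain:.0%})")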

I have yet to mess about with the AGP voltage jumpers on my mobo - any thoughts on this please?

Reply 22 of 43, by prophase_j

User metadata
Rank Member

Don't mess with the AGP voltage if the AGP bus is running at its nominal speed (66 MHz), and even then it may not be necessary. What you're doing here is just changing the card's internal speeds, not the speed of the interface.

"Retro Rocket"
Athlon XP-M 2200+ // Epox 8KTA3
Radeon 9800xt // Voodoo2 SLI
Diamond MX300 // SB AWE64 Gold

Reply 23 of 43, by leileilol

User metadata
Rank l33t++
retro games 100 wrote:

MDK2 benchmark gave me 40.90 fps, which ain't bad!

Oh, that's very bad. An "inferior" GeForce2 GTS from 3 generations behind the FX5200 at stock can average 110fps in that one on an average 1-2GHz system, and 40fps if you give it a Pentium II 400 / K6-3 450.

long live PCem

Reply 24 of 43, by retro games 100

User metadata
Rank l33t

Interesting. Even with those maxed out graphics settings I was using? I just tried another test: Quake 3 Arena. My settings were -

GL extensions - on
Video mode - 1280x1024 (best my monitor can do)
Color depth - 32
Full screen - on
Lighting - light map
Geometric detail - high
Texture detail - extreme right
Texture quality - 32 bit
Texture filter - trilinear

(All settings were maxed out.) My OC'd FX 5200 managed 98.5 FPS on the default installed demo map called Q3DM6.

Reply 25 of 43, by elfuego

User metadata
Rank Oldbie
leileilol wrote:
retro games 100 wrote:

MDK2 benchmark gave me 40.90 fps, which ain't bad!

Oh, that's very bad. An "inferior" GeForce2 GTS from 3 generations behind the FX5200 at stock can average 110fps in that one on an average 1-2GHz system, and 40fps if you give it a Pentium II 400 / K6-3 450.

I know a GF4 MX 440 - 460 can do it, but I didn't know a GF2 GTS could pull off that much at stock.

I tested an FX5200 and an MX440SE (Abit Siluro V1.0, 128-bit) in 3DMark 2001SE some years ago. The overclocked MX440SE scored roughly 9k, and the FX5200 about 5.5k (64-bit) and ~8k (128-bit). As far as I remember, that was done on an Asus A7V133-C with a Palomino 1600+ OC'd to 1.8GHz.

So - yes, it is possible that a DX7 game performs better on a GF2/4MX than on an FX5200. If a game uses shaders, it's another story.
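
Just to put those 3DMark numbers side by side (a rough sketch; the scores are only as good as my memory, and the ratios are the only thing this adds):

# 3DMark 2001SE scores from the A7V133-C / Palomino @ 1.8GHz runs mentioned above.

scores = {
    "MX440SE (overclocked)": 9000,
    "FX5200 64-bit":         5500,
    "FX5200 128-bit":        8000,
}

baseline = scores["FX5200 64-bit"]
for card, score in scores.items():
    print(f"{card:22s} ~{score}  ({score / baseline:.2f}x the 64-bit card)")

The 128-bit board lands at roughly 1.45x the 64-bit one, which I'd put down mostly to the memory bus width.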

Reply 26 of 43, by swaaye

User metadata
Rank l33t++

FX5200 might be useful for DX8-level shaders but that's it. The "cards to have" from the FX line are probably the FX 5700 and 5900. They are fixed up a bit. But actually, I'd skip those too. 😜

I just got myself a very cheap original GF3. I have to say that I am really quite impressed with it. It was supposed to be a GF3 Ti500 but it's just the original, which is actually fine with me. I've never owned a GF3 before; I was a Radeon 8500 guy. I have it in my BX system with a Celeron 1200 running XP.

I installed Unreal and UT and got the new OpenGL renderers for them. I can run the games with 2X MSAA and 4X AF at 1024x768 completely smoothly and it looks absolutely beautiful. It's better in the quality dept by a long, long way compared to an 8500.

Reply 27 of 43, by retro games 100

User metadata
Rank l33t

Following on from Swaaye's post above, I am thinking of getting 2 "meaty" retro graphics cards, and wondered if an nVidia 5900 card would be a good idea, and also an ATI 9800 card.

Regarding the nVidia card, I'm not sure whether to go for the 5950 Ultra version. And regarding the ATI card, I'm not sure whether to get the Pro or the XT version.

Any thoughts please, retro oracles? 😀

Reply 28 of 43, by prophase_j

User metadata
Rank Member

I had looked into those cards before, and I came to the conclusion that the FX is a bit more powerful for DX7, but pretty much everything else is better with the ATI series. Even so, I had planned on getting a 5950 Ultra, but sprung for a 9800 XT when the right price came along.

"Retro Rocket"
Athlon XP-M 2200+ // Epox 8KTA3
Radeon 9800xt // Voodoo2 SLI
Diamond MX300 // SB AWE64 Gold

Reply 29 of 43, by Kiwi

User metadata
Rank Newbie

I have owned a couple of each of those, and still have a PCX 5900 in a basic utility box I use for prepping drives, copying CDs / DVDs, making streamed CDs, stuff like that. With the various software it has aboard, I imagine that SecureROM would have a hissy, but I never tried one game on it.

I did try Oblivion on the several FXes I owned three years ago, and none of them were any use for that game. By comparison, the Radeon 9800 XT was very good, and not even the GeForce 6800 GT beat the image quality the older Radeon offered, although it could run the game with fewer slowdowns, especially when looking at Oblivion gates.

If you like looking at crisp, high quality images, go with the Radeon.

P.S. Back when the Ti-200 was still relatively new, and I was in a rush, I paid more for one of those than I have ever paid for any video card; above $150, from a brick-and-mortar store. It had the shortest-lived cooling fan I ever had to replace, and came in a really HUGE box that I can see from here, with something newer inside it today. I used it to play the expansions to NWN-1.

Last edited by Kiwi on 2009-08-17, 01:24. Edited 1 time in total.

.

Kiwi

* *

Reply 30 of 43, by prophase_j

User metadata
Rank Member

That sounds about right. Oblivion is a HEAVILY shaded game, and with that being the weak point of the FX series, it falls flat on its face. I'd just be careful about getting used ATI cards like the 9800 and even the X800; on 3 different occasions my stuff arrived in the mail with capacitors broken off, and all but one card was rendered useless.

"Retro Rocket"
Athlon XP-M 2200+ // Epox 8KTA3
Radeon 9800xt // Voodoo2 SLI
Diamond MX300 // SB AWE64 Gold

Reply 31 of 43, by swaaye

User metadata
Rank l33t++

Well if you're going to be playing Oblivion, you don't want to mess around with anything retro. 😀 Just grab something recent like a Radeon 4850 and it'll eat Oblivion alive for $90 or whatever those go for now.

Yeah, Oblivion really does prove how bad the GFFX series was for shader code. I've seen it firsthand too. The Radeon 9700 can run the game fine. A thing to note about the GeForce 6800, however, is that it has Shader Model 3 and thus can run the game's HDR lighting. The Radeon X850 and older cannot do the game's HDR because they are only Shader Model 2b. You have to take this into account when judging an X8x0 vs a 6800, because HDR is much more demanding than bloom. Of course, you can always disable HDR and run bloom on the 6800 too.

Oh and also Radeon 9800/9700 lack some features that X800/X850 have. They are Shader Model 2 while the latter cards are 2b, meaning they are a little less capable. Oblivion turns off a few more effects on 9700/9800, more than just lacking HDR.

I think GeForce FX actually has a good bit more shader flexibility than 9700/9800, but it's so slow at it who cares. 😀
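
If it helps, here is a little sketch of how those tiers map out for Oblivion (it only restates the above; the "2.0a" figure for GeForce FX is my assumption, not something I've checked):

# Rough map of the shader-model tiers discussed above and what Oblivion does
# with each. Only restates the posts; "2.0a" for GeForce FX is an assumption.

SHADER_MODEL = {
    "Radeon 9700/9800": "2.0",   # fewest effects, no HDR
    "GeForce FX":       "2.0a",  # flexible but very slow at shader code
    "Radeon X800/X850": "2.0b",  # a few more effects, still no HDR
    "GeForce 6800":     "3.0",   # can run Oblivion's HDR lighting
}

def supports_oblivion_hdr(card):
    # Oblivion's HDR path needs Shader Model 3.0.
    return SHADER_MODEL[card].startswith("3")

for card, sm in SHADER_MODEL.items():
    print(f"{card:17s} SM {sm:<4s} -> {'HDR' if supports_oblivion_hdr(card) else 'bloom only'}")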

Reply 32 of 43, by Kiwi

User metadata
Rank Newbie

From time to time, a game comes along that is so much fun to play that you play it again and again. Bard's Tale was like that, but on the C64 the stock disk drive was so bad that I waited for the PC port before I replayed it.

I don't remember whether I played Curse of the Azure Bonds first on the C64 or on the PC, but I played that one over and over. I owned the original Pool of Radiance out of order, after the second Gold Box game, and only on the PC. I replayed it a few times also.

TES: Arena was another replayable classic. But the one that owned my evenings literally for months was Might and Magic Six! That was such great fun! Morrowind had seemed at first as if it would be the same, but I never even finished it once. I really wanted that CRPG version of the old Gary Gygax game outline for AD&D to be good -- Temple of Elemental Evil. The combat was good, but that was the extent of it.

I think I would enjoy Bethesda's games much more, if the forums that they sponsor didn't have such a high percentage of Nazi type moderators. They suck all of the fun right out of everything. I finished the main line story once, and played with a few Mods, and with the side quests too, but when sharing my experiences became such a tip-toeing on eggshells trial, I lost interest.

I was in a financial bind in the spring of 2006, and had to wait on a really good eBay auction before trying either the 6800 GT or the X800 XT later on. With the two GeForces (there was also a 6600 GT, poor thing) in between, the X800 didn't seem particularly better looking than the 9800 had, just a lot faster than anything else I had tried at the time.

.

Kiwi

* *

Reply 33 of 43, by retro games 100

User metadata
Rank l33t

All good interesting stuff, thanks guys.

I was playing Unreal last night (the first one), on my Windows XP box using an nVidia 8800 GT graphics card. It looked great. I suppose you can experiment with running old games on Windows XP. If they work, then it's useful to take advantage of a modern fast machine. In fact, looking at Dos Freak's games compatibility spreadsheet, a lot of old games seem to work on XP, and this makes me wonder if a "beast of a machine" is really needed for Windows 98 games. (But I think I'll build one anyway. 🤣 )

Reply 34 of 43, by swaaye

User metadata
Rank l33t++

All of the replacement renderers and ports that are showing up for old games are effectively reducing my need for Win9x retro rigs. Beyond just the fun of playing with old hardware, anyway.

Unreal is a really great example. Glide mode used to be by far the best way to play that and UT99. But now with the recent OpenGL and D3D9 renderers, Glide is definitely not the #1 choice.

Reply 35 of 43, by elfuego

User metadata
Rank Oldbie

FX 5 series vs Radeon 9k family - this is not even worth comparing. The Radeon 9k family was the first ATI offering that beat NV to its knees. Better quality, better speed, and cheaper. Don't even think about the FX 5 series. If you do, you can go for a GF FX 5800 Ultra just for the kicks. People who know what I'm referring to will have a nice laugh. 😀 For all the others, there is Google. 😀

GeForce 6 series vs X600/X8x0 series - here, nVidia is the winner. The X800 is nothing more than a Radeon 9800 on steroids, while the GeForce 6 supports HDR. Mind you, HDR isn't really useful on anything under a 6800 / 6800 Ultra. 😀

Reply 36 of 43, by leileilol

User metadata
Rank l33t++

If you're aiming for retro support, I would recommend the Radeon 9x00, as the nVidia drivers break *a lot* of things (a majority of the Windows game issue threads started here involve them). There are only two problems with using the Radeon:

- SDL and 8-bit color precision issues: dark blue, and even purple, may be cut
- ati2evxx is a hotkey polling process that will screw up your CPU priority and give you laggy keyboard and mouse input

elfuego wrote:

I know a GF4 MX 440 - 460 can do it, but I didn't know a GF2 GTS could pull off that much at stock.

A GF4MX is essentially a rebranded GF2 GTS 😉 though there are obvious clock and driver support differences that make it a tad faster. Even THAT card, once the ultimate joke of nVidia hardware (prior to 2003), was much better than the FX5200 and slaughters it in performance.

long live PCem

Reply 37 of 43, by swaaye

User metadata
Rank l33t++

I should mention that by Radeon 9x00 we mean Radeon 9500, 9600, 9700 and 9800. Radeon 9600 and some 9800s won't work in AGP 2x, however.

Avoid Radeon 8500, 9000, 9100, 9200, 9250. They are all based on Radeon 8500. Not the best choice out there from any standpoint.

I don't really see any problem using GeForce 3 or 4 (Ti or MX). You can always use old drivers if newer drivers break games. These cards were basically the best D3D and OpenGL cards of their times and game devs knew it. They are quite fast for old games and the quality is very good. I don't like GF2 or GF256 because they are often blurry.

Reply 38 of 43, by retro games 100

User metadata
Rank l33t

In 2002, I upgraded my 486 to a P4. Honestly. I really pushed the boat out, because I bought a Radeon 9700 Pro. It cost a lot of money. Unfortunately, I broke it 4 years later. I was messing about with the video cable, pulling it out then pushing it back in again (long story), and I must have put some static into the card, because it went mad and displayed weird bits on the screen all the time. Time for a replacement. I got a 9250 cos it was fanless. (4 years of 9700 Pro fan whining really got to me!) And so the 9250 became my main graphics card until last year, when I pushed the boat out again, and got a C2D rig with an nVidia 8800 GT.

Thanks a lot for reading my life's story. 😉

Reply 39 of 43, by swaaye

User metadata
Rank l33t++

I've seen a few Radeon 9700s "die" from RAM failures. I've fixed them by reprogramming the BIOS with lower RAM speeds. I'm not sure if that's what happened to you but the distortion is similar.