VOGONS


Halo CE on a Dell Dimension 4600

Topic actions

First post, by mr_bigmouth_502

User metadata
Rank Oldbie
Rank
Oldbie

About a week ago, just for shits and giggles I picked up an old Dell Dimension 4600 from the local recycling center and decided to "restore" it so that it would be my "early XP era guinea pig". 🤣 It came with little more than the case, the board, the cooling system, a 2.8GHz Prescott HT, and 512MB of DDR RAM, but I outfitted it with an additional 512MB of RAM, a 60GB hard drive (stolen from my old iPod 5th gen 🤣), a standard 400W PSU, and a 128MB Geforce FX 5200 (which I jury-rigged a fan to since it didn't have one).

I installed Windows 2000 SP4 on it (since there's no point in wasting another XP key), and it runs fine for the most part, but for whatever reason I'm getting exceptionally low scores in 3DMark2001SE (only 4200!), and when I installed Halo CE onto it, it ran like CRAP. 🤣 The strange thing is, I once had a 2.4GHz Prescott build with 256MB of RAM and a 128MB Radeon 9000, and it not only achieved higher scores in 3DMark (around the 5400 mark I believe), but it also ran the Halo demo pretty damn smoothly (there was some lag in places, but it ran great for the most part). What I don't get is, why does this system run Halo CE so poorly when it has better specs? 🤣

Reply 1 of 26, by DosFreak

User metadata
Rank l33t++
Rank
l33t++

IIRC you can force Halo to run in various DX modes (DX7, 8, or 9) using a command-line switch.

Maybe that accounts for the difference.
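If memory serves, the rendering-path switches go on the shortcut target or a batch file, roughly like this (switch names from memory, so double-check them against Halo's own documentation before relying on them):

```shell
REM Launch Halo CE with a forced rendering path -- pick one switch:
halo.exe -useff    REM fixed-function path (DX7-class hardware)
halo.exe -use11    REM pixel shader 1.1 path (DX8-class cards like the Radeon 9000)
halo.exe -use14    REM pixel shader 1.4 path
halo.exe -use20    REM pixel shader 2.0 path (DX9-class; painfully slow on an FX5200)
```

Forcing `-use11` or `-useff` on the FX5200 would be a quick way to test whether the DX9 shader path is what's killing performance.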

How To Ask Questions The Smart Way
Make your games work offline

Reply 2 of 26, by Old Thrashbarg

User metadata
Rank Oldbie
Rank
Oldbie

If it's the original Dell FX5200, there's a good chance it's one of the crappy, low-clocked 64-bit versions. Though the regular FX5200 wasn't particularly great, the 64-bit versions were absolute garbage... 4200 in 3dMark01 would be about right for one of those.

Reply 3 of 26, by swaaye

User metadata
Rank l33t++
Rank
l33t++

Yeah that might come down to whether Halo is trying to run DirectX 8 or 9 on the 5200. The latter would run badly. Radeon 9000 would run DirectX 8.

If it's a Dell 64-bit 5200 (had one myself recently), it is pretty much useless for anything but DirectX 7 or older games, or as a 2D card (which is really what they are intended for). You can tell by whether the card has empty spots for the RAM chips that would make up the 128-bit bus.

Last edited by swaaye on 2012-05-22, 18:42. Edited 1 time in total.

Reply 4 of 26, by mr_bigmouth_502

User metadata
Rank Oldbie
Rank
Oldbie
Old Thrashbarg wrote:

If it's the original Dell FX5200, there's a good chance it's one of the crappy, low-clocked 64-bit versions. Though the regular FX5200 wasn't particularly great, the 64-bit versions were absolute garbage... 4200 in 3dMark01 would be about right for one of those.

I think you're right. I'm well aware that the FX series has a bad rap, but I'm just surprised that a DX8-era card would beat it in terms of performance! 🤣 As well, the card didn't come with the system, but I'm pretty sure it was taken out of a similar Dell at some point in time (those systems are as common as DIRT around here 😜).

EDIT: So, what would you guys suggest for a replacement card? I was thinking a Radeon 9700 myself, but I wonder what else could work. Keep in mind, I want this system to be at least somewhat period-accurate.

Reply 5 of 26, by Old Thrashbarg

User metadata
Rank Oldbie
Rank
Oldbie

The 9700/9700Pro is probably your best bet. They're not really in much demand anymore, so they can be had pretty cheap, and it'd certainly be a huge step up in performance. If you don't mind the lack of DX9, the Ti4600 is a nice card too.

If you can find a good deal on one, the 9800/Pro/XT would be a good match for that system (and was, in fact, an original factory option), but the trouble is that the Mac guys still buy up those cards for their old G4 machines, so they tend to sell for quite a bit more than the 9700s.

Reply 6 of 26, by mr_bigmouth_502

User metadata
Rank Oldbie
Rank
Oldbie

I was just looking through the pile of stuff I brought back from the recycling center (I've visited multiple times in the last week), and I happened across a Radeon All In Wonder 9800 Pro! 😳 It's in great shape, but it's missing its heatsink, and I need to find a DVI to VGA converter (the only DVI monitor I have is currently hooked up to my main system). 🤣

Anyhow, how big are the heatsinks on these things? I was thinking of just taking the heatsink/fan off of my Geforce FX5200, but it's not very big, and I don't want to do anything that would screw up the card. 🤣

Reply 7 of 26, by Old Thrashbarg

User metadata
Rank Oldbie
Rank
Oldbie

I wouldn't use a 5200 heatsink on a 9800. The stock 9800 heatsink wasn't all that big, but it was made of thicker metal and had more fins than most of the cheesy OEM heatsinks used on low-end cards like the FX5200.

Do you have any spare Pentium/PIII/Athlon coolers laying around? Something like that would be more than adequate, if you can rig up some way to attach it to the card (and don't mind losing use of a PCI slot or two).

Reply 8 of 26, by mr_bigmouth_502

User metadata
Rank Oldbie
Rank
Oldbie
Old Thrashbarg wrote:

Do you have any spare Pentium/PIII/Athlon coolers laying around? Something like that would be more than adequate, if you can rig up some way to attach it to the card (and don't mind losing use of a PCI slot or two).

I was actually thinking of that idea myself. 😁 I have a cooler from a socket 7 system, and I also have one from this weird Duron system where the cpu was soldered right to the motherboard.

Last edited by mr_bigmouth_502 on 2012-05-23, 02:03. Edited 1 time in total.

Reply 10 of 26, by mr_bigmouth_502

User metadata
Rank Oldbie
Rank
Oldbie

I already fried the stock PSU. 🤣 That's why I replaced it with a standard 400w. 😁

Thanks! 😁 I'm gonna need all the luck I can get! 🤣

Reply 11 of 26, by Old Thrashbarg

User metadata
Rank Oldbie
Rank
Oldbie

Actually, Dell PSUs are usually very good units... reliable, and often underrated. Quality-wise, they compare favorably to a lot of the mid-range PSU brands, probably because they're made by the same OEMs as a lot of the mid-range units.

The infamous 250W Hipro PSU in the 4600 was somewhat of an exception, but even then it was a good design; Hipro just cheaped out on the later revisions of it, switching over to crappy capacitors. I've fixed dozens of those things, and once you replace the caps they're pretty much indestructible. I've pulled close to 350W off one before, using it as a test/bench PSU. 😳

Reply 12 of 26, by mr_bigmouth_502

User metadata
Rank Oldbie
Rank
Oldbie

It turns out my Radeon 9800 Pro AIW was thrown out for a reason; the damn thing doesn't work! 🤣 Windows is able to recognize it, and I was able to install the drivers for it, but the graphics output is EXTREMELY corrupted, the graphics acceleration doesn't work (none of the games I tested acknowledged that I had a 3D accelerator), and worst of all it puts out a RIDICULOUS amount of heat (even with a heatsink originally intended for a Duron). I tried to milk it for what it was worth, but I guess the fact is that I have a broken card, and there's nothing I can do but get another one. 😜

Reply 13 of 26, by Old Thrashbarg

User metadata
Rank Oldbie
Rank
Oldbie

If you're feeling adventurous, you could try baking the card. Sometimes malfunctions are the result of broken solder joints on the GPU or memory chips, and putting the card in the oven will melt the solder enough to reconnect the joints.

It's kind of a longshot, since 9800s aren't really known for such problems, but seeing as how the card is broken anyway, you really don't have anything to lose by trying it, other than about 10 minutes of your time.

Reply 14 of 26, by mr_bigmouth_502

User metadata
Rank Oldbie
Rank
Oldbie

OK, so the guide says that I should take all of the heatsinks off, but how the heck do I remove the non-GPU heatsinks without damaging the chips underneath? 🤣 As well, I've noticed that the TV tuner component sucks up a LOT of heat (it's basically a big metal box). Will it interfere with the process if I leave this on, or is there some way to remove it?

Reply 15 of 26, by swaaye

User metadata
Rank l33t++
Rank
l33t++

I've run into a few R9700s that were unable to maintain their stock memory clocks anymore. The result is image corruption, sometimes even at the desktop. Underclocking the RAM made them work again. You can dump and edit the BIOS with an editor called RaBiT and flash with ATIFlash or Flashrom (whichever works). You might need to use a PCI video card (set the system BIOS to boot PCI video first) so you can see what you're doing.

Also, I fairly recently bought a 9800 Pro 256MB. It was very unstable when I got it. The cause was ATI's heatsink paste having separated / dried out. The GPU was overheating.

BTW, I'm sure the capacitors on these cards are electrolytic, probably with a temp spec of ~100C. If you bake the card, you will likely burst the caps. I've done it myself. 😀

Last edited by swaaye on 2012-05-23, 17:17. Edited 5 times in total.

Reply 16 of 26, by Old Thrashbarg

User metadata
Rank Oldbie
Rank
Oldbie

If you're talking about the small ones on the mosfets and such, those are usually held on by thermal epoxy and would be rather tricky to remove safely. Your best bet would probably be to leave 'em alone and hope for the best. I'd also leave the metal tuner shield alone. The point is just to get the GPU and memory chips hot enough to reflow the solder... ideally you want to keep heat away from the rest of the card as much as possible. The little heatsinks and metal box would theoretically help with that goal, by providing somewhat of a shield for the components they cover.

You may want to see if swaaye's suggestion works first, though, before going to drastic measures...

Reply 17 of 26, by mr_bigmouth_502

User metadata
Rank Oldbie
Rank
Oldbie
swaaye wrote:

I've run into a few R9700s that were unable to maintain their stock memory clocks anymore. The result is image corruption, sometimes even at the desktop. Underclocking the RAM made them work again.

Also, I fairly recently bought a 9800 Pro 256MB. It was very unstable when I got it. The cause was ATI's heatsink paste having separated / dried out. The GPU was overheating.

BTW, I'm sure the capacitors on these cards are electrolytic, probably with a temp spec of ~100C. If you bake the card, you will likely burst the caps. I've done it myself. 😀

Dumb question, but how would I go about underclocking the video ram? I'm kind of a n00b when it comes to this sort of stuff. 🤣

Reply 18 of 26, by swaaye

User metadata
Rank l33t++
Rank
l33t++

You can test with ATITool 0.24 or ATI Tray Tools. These are utilities that allow you to change clocks on the fly. But this may not bring the card back to workable form once it's become unstable...

BIOS editing is the other, more permanent option, using the RaBiT BIOS editor. Dump the card's BIOS, edit it, and flash with ATIFlash or Flashrom (whichever works). You need DOS access to flash, unless ATI Winflash will work (I'm unsure about it working with a 9800).

Also check to make sure the heatsink paste on the GPU isn't dried out or separated.

You may want to run off of a PCI video card while trying to do this, so you can see what you're doing. If you set the system BIOS to boot PCI video first, the AGP card will be a secondary output.
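The dump/edit/flash sequence above would look roughly like this from a DOS boot disk (flags are from memory, and the adapter index and filenames are just examples; run `atiflash` with no arguments first to check its built-in help):

```shell
REM Save the card's current BIOS from adapter 0 to a backup file:
atiflash -s 0 backup.rom

REM ...reboot to Windows, open backup.rom in RaBiT, lower the memory
REM clock, and save the result as modded.rom...

REM Back in DOS, program the edited BIOS onto adapter 0:
atiflash -p 0 modded.rom
```

Keeping the untouched backup.rom on the boot disk means you can always flash the original back if the underclock doesn't help.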

Reply 19 of 26, by Old Thrashbarg

User metadata
Rank Oldbie
Rank
Oldbie

swaaye wrote:

BTW, I'm sure the capacitors on these cards are electrolytic, probably with a temp spec of ~100C. If you bake the card, you will likely burst the caps. I've done it myself.

The 100C (actually 105C, usually) is the maximum sustained operating temperature. They can withstand much higher temperatures for short periods... the ~200C in the oven is kind of on the borderline, but it's usually OK as long as the caps are good quality ones. Most of the 9800s I've seen are at least partially outfitted with solid polymer caps, which are a bit hardier than regular electrolytics.

Like I said, though, baking is kind of a last-ditch effort, which isn't without its risks... it should really only be tried if the card would otherwise be a loss.