First post, by Gahhhrrrlic
Is it normal for a P2 400 to do a better job of playing, say, Unreal Tournament than a Celeron 700? My C700 can't even make the cut on a game from 1999. Seriously, is there something wrong with my computer?
It has a Radeon 9500, which kicks ass. The proof is that no matter what resolution I set the game to, even 1280x1024, it plays just as crappy as it did at 640x480, even with all the fancy effects and AA turned on. The weird thing is, the minimum system requirement is a 200 MHz CPU. So under what circumstances is the game playable at 200 MHz, when I can't run it at 700? What settings could possibly add 500 MHz worth of CPU load? Amazingly, I got similar "barely playable" performance from my P133 when I put a Voodoo card in it, and that's below the system spec and almost 600 MHz away from where I am now, with a better graphics card.
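For what it's worth, the logic I'm using here boils down to a simple test: if the framerate barely moves when you crank the resolution, the GPU isn't the limit. Roughly sketched below (the FPS numbers and the 10% tolerance are made up for illustration, not my actual benchmarks):

```python
# Rough sketch of the "is it CPU-bound?" reasoning described above.
# The FPS figures are hypothetical, not real measurements.

def classify_bottleneck(fps_by_resolution, tolerance=0.10):
    """If framerate changes less than `tolerance` (relative) as
    resolution rises, the GPU has headroom -- suspect the CPU."""
    rates = list(fps_by_resolution.values())
    spread = (max(rates) - min(rates)) / max(rates)
    return "CPU-bound" if spread <= tolerance else "GPU-bound"

# Hypothetical numbers matching the symptom in this post:
results = {"640x480": 24, "1024x768": 23, "1280x1024": 22}
print(classify_bottleneck(results))  # -> CPU-bound
```

A GPU-limited machine would instead show the framerate roughly halving as the pixel count doubles, which is exactly what I'm *not* seeing.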
I'm running this under XP with 256 MB of RAM, so I'm sure somebody's going to say "aha, there's your problem right there." Except I also run it under Win98 and it's the same story. And it doesn't matter how much of the level is pre-cached: large scenes grind to a halt while empty corridors are as smooth as you'd expect. I think the CPU is chugging, but I can't figure out why, unless Celerons are just that bad.
Half-Life actually plays better on my P2 400, which seems to support the hypothesis that the system is CPU-bottlenecked.
Any education for me on this topic from someone who knows better? Gracias.
P.S. The hard drive seems god-awful slow (about 1 MB/sec), but with buffering/caching in games, I don't know if this matters.