VOGONS


Reply 20 of 31, by momaka

RubDub2k wrote on 2025-02-24, 01:42:

Yeah, no spare power connectors for anything... there's one Molex for the DVD drive, one FDD mini-Molex (or whatever the name for the floppy power connector is), the motherboard power cable, the CPU power cable, and one SATA power cable (yeah, no IDE in this thing... High-Tech 2004 move. See the photo I attached of the power cable).

Well, I guess that sorta makes sense. After all, this was meant to be an office machine, so Dell probably figured that if they didn't leave any upgrade options inside the case, customers would be less likely to try upgrades and possibly screw stuff up. It also cut (tiny) costs on the machine, and I suppose it all adds up in the end.

RubDub2k wrote on 2025-02-24, 01:42:

So I suppose maybe it isn't thermal throttling?

Yeah, looks like it.
If it was thermal throttling, you'd get high/normal framerates... then after 10-20 minutes of playing (or however long it took the CPU to exceed the throttling temperature), you'd get quite the stutter... and it wouldn't go away - or if it did, it would only be for a few seconds to half a minute before dropping back into throttling.

RubDub2k wrote on 2025-02-24, 01:42:

There are a couple of areas in-game that still drop down into the 40s and upper 30s, even with VSync on and this fan setup. This lasts as long as I am in that area; usually it's when I look in a certain direction and am outdoors (I don't know how much Half-Life 2 people here have played, but it's at the beginning of the game, when you are starting to navigate through the canals and fight the Combine). As long as I stay looking at some part of the map, the FPS will just be low. I turn 90 degrees without moving, and then I'm back at 60 (without VSync, easily 90-120). Barrel explosions cause frame drop issues too, but even after killing the Combine and having no AI or explosions, I can still get those low-40s FPS depending on location. Some areas are fine, but some really struggle.

Yeah, I've played through HL2 quite a few times (I think 4 or 5 total, and started it another 2-3 times with friends), so I know the areas (and most loading screens too) pretty well.
Indeed, what you are describing is more or less normal for that era of hardware... though I should probably note here that I've never actually played HL2 on any of my higher-end Pentium 4 systems for whatever reason, so I'm not 100% certain how bad the frame dips should get. Fun fact: I've only played this game on AMD CPUs, the oldest being a Duron Applebred ("Cripplebred") with a Radeon 9200 SE, which was a miserable experience... but my first for the game, so I still think highly of that system. Anyways, the newer AMDs I've played it on were paired with kind of crappy GPUs at the time, so I never got that high of an FPS to begin with.

That said, I think you might want to turn off v-sync, as that's probably adding a lot to the performance loss. Instead, try limiting the framerate through software like RTSS (which comes with MSI Afterburner), or simply use the "fps_max" command in the console, then find the FPS that doesn't make your monitor tear (typically the same as the monitor's refresh rate, or +/- 1 FPS.)
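If you want the cap to persist between sessions, you can also drop it into an autoexec config instead of typing it at every launch. A minimal sketch, assuming the usual Source cfg location - double-check the path for your particular install:

    // <HL2 install folder>\hl2\cfg\autoexec.cfg - executed at every game start
    fps_max 60      // cap at (or within 1 FPS of) the monitor's refresh rate
    cl_showfps 1    // optional: on-screen FPS counter, to verify the cap is holding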

RubDub2k wrote on 2025-02-24, 01:42:

I'm honestly at a bit of a loss... The recommended system requirements (not minimum; those are lower than these), from gamesystemrequirements.com and the original box, are:

CPU: 2.4 GHz Processor
RAM: 512MB RAM
GPU: DirectX 9 level Graphics Card
OS: Windows XP/2000/Me/98

And my system's at:

CPU: P4 650 (3.4 GHz, Hyper Threading)
RAM: 3GB DDR2
GPU: 1GB DDR3 Radeon HD 5570 (Sapphire, Low Profile edition, from 2010)
HDD: 120 GB SATA III SSD
OS: Windows 2000 (Unofficial SP5.1, Unofficial Kernel Extension by BlackWingCat)

I feel I've pretty confidently ruled out the GPU and operating system at this point and concluded it must be bound by some other piece of hardware... I was very young when this game came out, so I didn't really play it until well beyond the Pentium 4 era. Are these frame drops just part of the Pentium 4 experience, then? Was it considered normal back then for your game to fluctuate between 30-60 FPS on high-end hardware?

Yes, more or less, this was normal. In particular, the P4 architecture wasn't very efficient in games, due to its very long (deep) pipeline. The Athlon 64 at 2 GHz pretty much crushed it in most games. Where the P4 shines is video decoding & encoding - back when GPUs weren't that great at it. A late-era P4 can play 1080p H.264 pretty much without any GPU acceleration. Meanwhile, most AMD offerings from the same era tend to get a little choppy, if not very choppy, without the help of a GPU.

BTW, what resolution are you running your HL2 game at? Despite the GPU handling most of the graphics, a high resolution can also have an impact on the CPU (and memory usage)... so lowering the resolution slightly should alleviate some of the hard frame dips.

RubDub2k wrote on 2025-02-24, 01:42:

I suppose this is part of the learning process... this is the first Pentium 4 system I have worked on, so I don't know what's normal and what's not. Thanks for the advice, everyone.

Well, unfortunately, I don't have most of my P4 systems with me here (the ones with AGP or PCI-E, anyways), as some are in storage ATM. But now that you've had this experience, I'm curious whether I'll see the same thing on them. Perhaps when I get a chance, I'll try it out and report back here. I have some Radeon HD 4670 GPUs, which I think should come pretty close to the performance of your HD 5570.

RubDub2k wrote on 2025-02-24, 01:54:

While I had the case open there for the CPU test, I looked, and the only non-informational lettering I saw on them was "FL", so perhaps I escaped the capacitor plague unscathed... and to be honest, they look pretty good too: no leakage or any obvious bulging, even on the ones underneath the CPU fan/heatsink.

Oh nice, those are Matsushita (Panasonic) FL series. The old ones like yours have been pretty reliable and weren't affected the same way many other ultra-low ESR motherboard capacitors were.

RubDub2k wrote on 2025-02-24, 01:54:

And I agree with that; it does make me happy that some people out there are actually taking the time to resell these old parts. It is unfortunate that finding those older computer parts is so difficult... I haven't dabbled in 286, 386, or 486 PCs much, but I am strongly against e-waste. There is so much that can still be done with these systems, but that's the downside of fast-moving tech, I suppose 😒

Yeah, I try to re-purpose and re-use as many old systems as I can.
I actually got into this hobby simply because I never really got rid of my old PCs, and I kept hoarding more old systems when I found them for free or really cheap. Never thought the retro PC hobby would ever come back like this... except for CRTs - now those, I knew they'd become valuable some day, as they are hard and expensive to make and, to my eyes, have superior motion handling. So when everyone was throwing them away in the late 2000s, I managed to grab quite a few. Hopefully they'll last me long enough. 😁

cyclone3d wrote on 2025-02-24, 02:25:

Sooo, that system probably had the motherboard replaced during the capacitor plague. There is absolutely no other way for an SX/GX260, 270, or 280 to not have bulging capacitors, unless somebody replaced the motherboard or the capacitors.

Could be.
Or maybe he got lucky and his board just came with those Panny FL caps stock.
FWIW, the main options at the time for ultra-low ESR motherboard capacitors were:
Nichicon HM, HN, and HZ
Rubycon MBZ, MCZ, and MFZ (MFZ being mostly a custom series for the Xbox 360, very similar to MCZ)
Panasonic FL
United Chemicon KZG, KZJ

From these, only Panny FL and Rubycon MBZ were not problematic series. MCZ was also pretty good for the most part (it would occasionally go bad in very high-heat areas.) HM, HN, and HZ from 2001 to 2004 were the worst - basically guaranteed to fail. In 2005, Nichicon started fixing the issue, and some '05-datecoded caps are OK. From '06 onwards, all should be fine.
Meanwhile, KZG and KZJ never really got fixed, AFAIK. Well, modern(ish) KZG seems a lot better than the old stuff, but I still don't trust them.

Reply 21 of 31, by MikeSG


Both the CPU and GPU could have pumped out their thermal paste if it's never been replaced, meaning there is no good thermal transfer to the heatsinks. That's especially likely if pointing a large fan at it brings some improvement.

Japanese caps in this era were reliable.

Reply 22 of 31, by momaka

MikeSG wrote on 2025-02-27, 13:20:

Both the CPU and GPU could have pumped out their thermal paste if it's never been replaced.

More like dried.
IIRC, Dell used a relatively thick silver-colored thermal paste (not sure what kind it was, though), but I've never seen it pumped out on any Dell system I serviced. Depending on how old the system was and/or how clogged it was with dust, the paste varied anywhere from "still OK" to "dry and hard as a rock".

"pumping out" is a pretty rare phenomenon in my experience. I have one very cheap n0-name thermal paste (the generic stuff on Ebay, AliExp., and Amazon) that's quite runny and it does pump out when used on bare silicon dies. But on stuff with heatspeaders, it usually still lasts quite a while with minimum of pumping out.

MikeSG wrote on 2025-02-27, 13:20:

Japanese caps in this era were reliable.

No, some of them absolutely were not - at least in the ultra-low ESR grade that I mentioned above in my last post.
Also, I forgot to mention another one there: Sanyo WF series (green caps with gold text and a "K" vent similar to Rubycon's, but slightly offset.) These should be considered replace-on-sight now... though I doubt you would still see any good ones at this point. Most should have bulged or leaked by now. And if they haven't... well, that means absolutely nothing. I have a few that still aren't bulged, but they measure 2-3x their nominal capacitance, which indicates high internal leakage current... and indeed, when I checked, they failed that test.
Now WG and WX series from Sanyo should still be OK. I recall seeing some WG failures, but they were rare.

Reply 23 of 31, by RubDub2k

momaka wrote on 2025-02-26, 22:33:

Yeah, looks like it.
If it was thermal throttling, you'd get high/normal framerates... then after 10-20 minutes of playing (or however long it took the CPU to exceed the throttling temperature), you'd get quite the stutter... and it wouldn't go away - or if it did, it would only be for a few seconds to half a minute before dropping back into throttling.
Indeed, what you are describing is more or less normal for that era of hardware... though I should probably note here that I've never actually played HL2 on any of my higher-end Pentium 4 systems for whatever reason, so I'm not 100% certain how bad the frame dips should get. Fun fact: I've only played this game on AMD CPUs, the oldest being a Duron Applebred ("Cripplebred") with a Radeon 9200 SE, which was a miserable experience... but my first for the game, so I still think highly of that system. Anyways, the newer AMDs I've played it on were paired with kind of crappy GPUs at the time, so I never got that high of an FPS to begin with.

Yeah, I think it must have been normal too... I've heard that the AMD CPUs of that era were superior, but I just really like the look of this computer... Sucks that Dell didn't use AMD at the time haha.

momaka wrote on 2025-02-26, 22:33:

That said, I think you might want to turn off v-sync, as that's probably adding a lot to the performance loss. Instead, try limiting the framerate through software like RTSS (which comes with MSI Afterburner), or simply use the "fps_max" command in the console, then find the FPS that doesn't make your monitor tear (typically the same as the monitor's refresh rate, or +/- 1 FPS.)

I still noticed practically the same dips without v-sync... why would v-sync add to the performance loss? Does it require extra computing power compared to just using an FPS limiter? Also, do you need an MSI GPU/motherboard to install Afterburner? I've heard a lot about that program, but have never owned an MSI product...

momaka wrote on 2025-02-26, 22:33:

BTW, what resolution are you running your HL2 game at? Despite the GPU handling most of the graphics, a high resolution can also have an impact on the CPU (and memory usage)... so lowering the resolution slightly should alleviate some of the hard frame dips.

I'm running it at 1280x1024 on max settings (4x AA, as 6x AA doesn't seem to work, and high textures, shadows, etc.). That's why I bought the HD 5570 😀 As part of my tests to isolate the CPU, I tried a variety of resolutions and settings, and it wasn't until I lowered everything all the way down to minimum settings at 640x480 that I gained maybe 5 FPS in the areas that dip into the upper 30s-low 40s... Not worth the improvement for that much of a visual loss. Oddly enough, upping the water reflection quality seemed to slow me down a bit though (I'm using the recommended "world", but I can also select "all" for the reflections). I'm surprised the HD 5570 would struggle with that, given the 128-bit bus width and relatively high memory bandwidth compared to top-of-the-line cards of the time. Unless the reflections are CPU-bound to some degree?

momaka wrote on 2025-02-26, 22:33:

Well, unfortunately, I don't have most of my P4 systems with me here (the ones with AGP or PCI-E, anyways), as some are in storage ATM. But now that you've had this experience, I'm curious whether I'll see the same thing on them. Perhaps when I get a chance, I'll try it out and report back here. I have some Radeon HD 4670 GPUs, which I think should come pretty close to the performance of your HD 5570.

Yeah, I'd be curious to know... would make me feel a little more confident in my troubleshooting abilities haha. Looks like your GPU has a little better raw bandwidth and higher core speeds than mine from a quick Google search, but mine's got a few more shading units. They should be roughly equal though, so yeah, definitely throw your experiences on the thread if you do get around to it.

momaka wrote on 2025-02-26, 22:33:

Oh nice, those are Matsushita (Panasonic) FL series. The old ones like yours have been pretty reliable and weren't affected the same way many other ultra-low ESR motherboard capacitors were.

Awesome, well it sounds like I'm good at least for a little while. I'll keep that in mind if I ever stumble across more PC's from that era.

momaka wrote on 2025-02-26, 22:33:

Yeah, I try to re-purpose and re-use as many old systems as I can.
I actually got into this hobby simply because I never really got rid of my old PCs, and I kept hoarding more old systems when I found them for free or really cheap. Never thought the retro PC hobby would ever come back like this... except for CRTs - now those, I knew they'd become valuable some day, as they are hard and expensive to make and, to my eyes, have superior motion handling. So when everyone was throwing them away in the late 2000s, I managed to grab quite a few. Hopefully they'll last me long enough. 😁

Yeah, I got into this hobby just taking apart the old family computers growing up, and since then I've kinda always wanted to own a PC from every "modern" operating system (any OS since PC hardware has been able to run 3D games, so Windows 95 and beyond, to me). I've never paid more than $20 for a computer, and it's always at least had the HDD, RAM, PSU, CPU, and case. That's kind of what makes it fun though, when you're able to find those old computers for so cheap... Currently I've got a pretty good Windows 95 machine, this pretty solid Windows 2000 machine we're talking about here, a beast of an XP machine, an overkill Vista machine, a pretty overkill Windows 7 machine, and a pretty powerful SFF Windows 8.1 machine... Really only missing a solid 98 machine, so I've been keeping my eyes out for a while for one of those (probably won't find one for cheap, but who knows). Recently got a free '06 iMac as well that needs some TLC, and a Mac Pro 1,1... got an old Mac Performa from the '90s too (I need to do a little more research on it). Only have 2 CRT monitors sadly, but both are the same model of Dell Trinitron from '97 or '98, I think... working great so far, and I got those for $5 a piece (couldn't say no) 😀

MikeSG wrote on 2025-02-27, 13:20:

Both the CPU and GPU could have pumped out their thermal paste if it's never been replaced, meaning there is no good thermal transfer to the heatsinks. That's especially likely if pointing a large fan at it brings some improvement.

I replaced the thermal paste recently (some Arctic paste I had lying around), so it's definitely not a paste issue. And the fan was only a minor improvement... even when I did the overkill setup with the fan right next to the heatsink, it was maybe a 5-10 FPS boost tops, probably closer to 5. I also did a deep clean of the stock cooler and fan, so I doubt it's them either. From everything I've experienced and what I've read, this game is just heavy on the CPU, and the Pentium 4s just weren't quite up to it. All in all, the game is playable, but the immersion is lost a bit when you drop sub-40 once per level/area. Not the end of the world, but I thought it couldn't hurt to see if there was anything else that could be done to get the ultimate HL2 experience 😀

Reply 24 of 31, by MikeSG


High shadow settings have historically been hard on the CPU.

I remember running HL2 smoothly on a GeForce4 MX 440 (a DirectX 7 card). I don't know what CPU. The system requirements now seem to be DX8.1 and a 1.7 GHz CPU that supports SSE.

Your CPU is definitely up to it. There may be a gain from using AMD's latest drivers.

Reply 25 of 31, by momaka

RubDub2k wrote on 2025-02-28, 00:37:

I still noticed practically the same dips without v-sync... why would v-sync add to the performance loss? Does it require extra computing power compared to just using an FPS limiter?

As a short answer, yes - though it's less about extra computation and more about forced waiting: with plain double-buffered V-sync, a frame that isn't ready by the next screen refresh has to wait for the one after it, so on a 60 Hz monitor, a game that would otherwise average, say, 55 FPS gets quantized down to 30. A software limiter like RTSS just caps the rate without that waiting penalty.

RubDub2k wrote on 2025-02-28, 00:37:

Also, do you need an MSI GPU/motherboard to install Afterburner? I've heard a lot about that program, but have never owned an MSI product...

No, you can run it on any brand of motherboard and GPU.
I have it practically on every Windows XP and 7 system that I game on.
It can also be really helpful to see how a particular game runs and what resources might be limiting it (i.e. GPU at 100%, but CPU not, or vice versa.)

RubDub2k wrote on 2025-02-28, 00:37:

I'm running it at 1280x1024 on max settings (4x AA, as 6x AA doesn't seem to work, and high textures, shadows, etc.).

Ah, OK, that was pretty much considered "very high" back in the day (and 1600x1200 was rarely mentioned or tested.) 40+ FPS was considered very, very good at such presets. We just gamed at lower resolutions and lower framerates back then - at least when it came to new "AAA" games.

Coming back to my first experience of HL2 on my PC back in the day... 800x600 @ around 22-25 FPS average. 🤣 But the low FPS didn't bother me that much. In fact, I was happy that the game ran at all. Funny enough, the Radeon 9200 SE and that Cripplebred CPU were soooo slow with this game that going from minimum details to max details (everything on max allowable, except for AA, which was turned off, obviously) hardly changed my framerate - like a 3-4 FPS difference between the two. So I ran max details at 800x600.

Now, for HL2 Deathmatch online, I did drop to 640x480 and lower details, as I wanted more FPS (especially in a server with more people.) Sadly, even with that drop, I barely gained any FPS... and oftentimes, with lots of people in one place on the map or lots of objects breaking/exploding, it wouldn't be uncommon to see my FPS dip down to 13-15 FPS. Oi!! Now that's *proper* low. 🤣

RubDub2k wrote on 2025-02-28, 00:37:

As part of my tests to isolate the CPU, I tried a variety of resolutions and settings, and it wasn't until I lowered everything all the way down to minimum settings at 640x480 that I gained maybe 5 FPS in the areas that dip into the upper 30s-low 40s... Not worth the improvement for that much of a visual loss.

Agreed, indeed not worth it to go that low.
Unfortunately, these tests do confirm my suspicion that your frame dips are from CPU resource limitations.
But I'll see when I can get some of my P4 systems out and test this.

RubDub2k wrote on 2025-02-28, 00:37:

Oddly enough, upping the water reflection quality seemed to slow me down a bit though (I'm using the recommended "world", but I can also select "all" for the reflections). I'm surprised the HD 5570 would struggle with that, given the 128-bit bus width and relatively high memory bandwidth compared to top-of-the-line cards of the time. Unless the reflections are CPU-bound to some degree?

Not sure. But I consider the frame loss minimal, even with lower-end hardware. So I always jack reflections up to "world" in Source engine games.
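For quick A/B testing, those reflection settings can also be flipped from the console. A sketch from memory - these are the classic HL2-era cvars, worth verifying on your build:

    r_waterforceexpensive 1          // the "world" reflections setting
    r_waterforcereflectentities 0    // 1 = "all" (also reflect NPCs/props - the really expensive option)

Toggling them while standing near water makes it easy to see what each step costs on a given CPU/GPU combo.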

RubDub2k wrote on 2025-02-28, 00:37:

Yeah, I'd be curious to know... would make me feel a little more confident in my troubleshooting abilities haha. Looks like your GPU has a little better raw bandwidth and higher core speeds than mine from a quick Google search, but mine's got a few more shading units. They should be roughly equal though, so yeah, definitely throw your experiences on the thread if you do get around to it.

Absolutely, will do. 😉

RubDub2k wrote on 2025-02-28, 00:37:

Only have 2 CRT monitors sadly, but both are the same model of Dell Trinitron from '97 or '98, I think... working great so far, and I got those for $5 a piece (couldn't say no) 😀

Vertically flat but (very slightly) horizontally curved screens on both? If so, those are the best Trinitrons anyways (IMO.) Later ones (from the early 2000s) started integrating too many things onto proprietary Sony ICs, and overall they weren't as reliable and didn't have as good a picture. The 21" Sonys were even more annoying with their constant picture over-brightness issues (fixable with WinDAS... but go have fun with that! - certainly not for everyone.)

RubDub2k wrote on 2025-02-28, 00:37:

All in all, the game is playable, but the immersion is lost a bit when you drop sub-40 once per level/area. Not the end of the world, but I thought it couldn't hurt to see if there was anything else that could be done to get the ultimate HL2 experience 😀

Ha, I probably wouldn't even notice it. 😁
I've been a "low-spec" gamer all my life, so anything over 40-45 FPS I consider absolutely fine. That said, screen tearing does annoy me quite a bit - particularly on LCDs (I have a hard time noticing it on CRTs.) So when the PC can run a game well, I do turn on Vsync.
FWIW, the smoothest experience I've had was in CS Source on my C2Q PC with a 21" CRT @ 1280x960 V-synced at 85 Hz. Still gives even modern high-end gaming monitors a run for their money in terms of smoothness.

Reply 26 of 31, by RubDub2k

momaka wrote on 2025-03-01, 22:05:

No, you can run it on any brand of motherboard and GPU.
I have it practically on every Windows XP and 7 system that I game on.
It can also be really helpful to see how a particular game runs and what resources might be limiting it (i.e. GPU at 100%, but CPU not, or vice versa.)

Alright, I'll try giving that an install sometime soon here then. Sounds pretty useful

momaka wrote on 2025-03-01, 22:05:

Ah, OK, that was pretty much considered "very high" back in the day (and 1600x1200 was rarely mentioned or tested.) 40+ FPS was considered very, very good at such presets. We just gamed at lower resolutions and lower framerates back then - at least when it came to new "AAA" games.

Coming back to my first experience of HL2 on my PC back in the day... 800x600 @ around 22-25 FPS average. 🤣 But the low FPS didn't bother me that much. In fact, I was happy that the game ran at all. Funny enough, the Radeon 9200 SE and that Cripplebred CPU were soooo slow with this game that going from minimum details to max details (everything on max allowable, except for AA, which was turned off, obviously) hardly changed my framerate - like a 3-4 FPS difference between the two. So I ran max details at 800x600.

Now, for HL2 Deathmatch online, I did drop to 640x480 and lower details, as I wanted more FPS (especially in a server with more people.) Sadly, even with that drop, I barely gained any FPS... and oftentimes, with lots of people in one place on the map or lots of objects breaking/exploding, it wouldn't be uncommon to see my FPS dip down to 13-15 FPS. Oi!! Now that's *proper* low. 🤣

I really didn't get into "real" PC gaming until about 2018 or so, and at that point 60 FPS was pretty much considered "good"... I built that system for $500, as I've never had extra money to throw around (at least enough for a decent GPU), so I just played on integrated graphics until pretty much this year (yeah, it took me 8 years to get a graphics card... and it's an RTX 2060 from 2018 haha). Before that desktop, I was using pretty crappy laptops with 2-4 GB of RAM and low-power CPUs, so I was getting pretty poor performance back then, but I got those laptops from family members as birthday/Christmas gifts, so I wasn't complaining. Point is, I could never really play AAA games well, getting around 30 FPS on low to maybe medium settings, which my friends would tell me was horrible (I didn't care all that much). It's a bit of a paradigm shift to hear that 40-ish FPS on max settings back then was considered "great", so that's why I ask. Thanks for the perspective.

momaka wrote on 2025-03-01, 22:05:

Vertically flat but (very slightly) horizontally curved screens on both? If so, those are the best Trinitrons anyways (IMO.) Later ones (from the early 2000s) started integrating too many things onto proprietary Sony ICs, and overall they weren't as reliable and didn't have as good a picture. The 21" Sonys were even more annoying with their constant picture over-brightness issues (fixable with WinDAS... but go have fun with that! - certainly not for everyone.)

I attached some photos for reference, but yeah, pretty boxy and mostly vertically flat. The manufacture date says May 1998, and after I opened them up and gave them a dusting/clean, they run with pretty much no issues. I think the screen here is about 14" diagonally. If you've got any maintenance tips or other advice on these monitors, feel free to let me know.

momaka wrote on 2025-03-01, 22:05:

Ha, I probably wouldn't even notice it. 😁
I've been a "low-spec" gamer all my life, so anything over 40-45 FPS I consider absolutely fine. That said, screen tearing does annoy me quite a bit - particularly on LCDs (I have a hard time noticing it on CRTs.) So when the PC can run a game well, I do turn on Vsync.
FWIW, the smoothest experience I've had was in CS Source on my C2Q PC with a 21" CRT @ 1280x960 V-synced at 85 Hz. Still gives even modern high-end gaming monitors a run for their money in terms of smoothness.

Yeah, as I mentioned earlier, I've also never had high-end hardware in my life... I just spent about $400 this year upgrading the old PC I built in 2018: a new motherboard, CPU, a bit more RAM, and finally a GPU. Even then, the upgrade was from a 6th-gen i5 to a 10th-gen i5... woohoo! I've pretty much always been 3-5 years behind in hardware, but I really don't care. It works well enough and doesn't cost $2000+... RAM has gotten pretty cheap since then though, so I think I paid $30 earlier this year for another 16 GB of DDR4, bringing my total to 24 GB... more than enough for a while. Fingers crossed these upgrades last me another 5 years at least haha

Reply 27 of 31, by momaka

RubDub2k wrote on 2025-03-02, 20:33:

yeah, it took me 8 years to get a graphics card... and it's an RTX 2060 from 2018 haha

The situation with graphics cards today really is sad. There's absolutely nothing "in the middle" for a reasonable amount of money... well, OK, technically there is - Intel. But gone are the days when you could buy a sub-$100 GPU and still have *some* ability to play any of the modern titles.

And I'm not even going to talk about the power consumption of modern GPUs... even more sad.

RubDub2k wrote on 2025-03-02, 20:33:

It's a bit of a paradigm shift to hear that 40-ish FPS on max settings back then was considered "great", so that's why I ask. Thanks for the perspective.

You're welcome.
Well, everything was simpler back then (I guess I sound like an old man saying that now 🤣 ) - even the hardware reviews were. Most just ran a timed benchmark of something and posted the average FPS. The stuff where reviewers dug in and analyzed the 10% low and 1% low framerates, etc... that came much, much later.
As for framerates... consoles were in the same boat or worse. I was never a console gamer myself, but I remember that most games were capped at 30 FPS @ 720p (rarely 1080p) on the Xbox 360 and PS3. And on the PS3, very late-gen games like The Last of Us had places where the framerate really dipped into the low-to-mid 20s.

RubDub2k wrote on 2025-03-02, 20:33:

I attached some photos for reference, but yeah, pretty boxy and mostly vertically flat. The manufacture date says May 1998, and after I opened them up and gave them a dusting/clean, they run with pretty much no issues. I think the screen here is about 14" diagonally. If you've got any maintenance tips or other advice on these monitors, feel free to let me know.

I have a Trinitron from the exact same series as that one (UltraScan 1000HS, based on the D-1 chassis), but mine is the 17" model (D1025TM... CRTdatabase link here.) I found it in a dumpster in 2014 - cut cable, and the screen's AR coating terribly scratched up. But it worked after frankensteining a cable.
Anyways, these are exactly the last of the good series of Trinitrons that I mentioned. So definitely a nice save on your part. And being smaller 15" screens (14" viewable), yours should look perfectly fine and smooth at 1024x768. Even 640x480/VGA won't look too blocky.

As for maintenance - glad to hear you opened yours to clean the dust. Other than that, there isn't really much else to be done with them. Just use them regularly and have fun! 😀

The only thing I do want to warn about is the beige plastic on some of these Trinitrons (and, well, other CRTs... though not all): on certain models/series, the plastic can become extremely brittle and easy to break. If you start seeing any cracks around the corners of the bezel or underneath where the base is, you might have to take it apart again and re-glue it. The last Trinitron monitor I picked up, in 2022, was a 19" unit that unfortunately dropped from a 10-foot-high patio onto a lawn. Why? Because the stand fell through into the case, making the monitor tilt and fall. I got it from someone on CL at the time, who had actually gotten it from an old lady. The monitor wasn't working when he got it, and the case was already cracked. He taped it up and put it out on the rail of his porch for CL pickup. The tape didn't hold, the stand fell in and tilted, and then bang - shattered monitor. BUT! The tube and mainboard are still good!!! As if by miracle. When I get time some day, I need to mend the neck board back together and try it. There's a chance it might actually work.

Anyways, the point of the story is: be careful if you start seeing cracks anywhere on or under the case, and especially with how you handle the monitor. On another 19" I have, I was packing it for shipping for my move, and as I lightly grabbed it from the side to put it in the box, my hand partially fell through one side of the monitor - the plastic had become that brittle (even though it wasn't really yellowed.)

Reply 28 of 31, by RubDub2k

momaka wrote on 2025-03-02, 22:00:

The situation with graphics cards today really is sad. There's absolutely nothing "in the middle" for a reasonable amount of money... well, OK, technically there is - Intel. But gone are the days when you could buy a sub-$100 GPU and still have *some* ability to play any of the modern titles.

And I'm not even going to talk about the power consumption of modern GPUs... even more sad.

Yeah, I can't believe we're seeing the day of the $2000 GPU... kind of insane.

momaka wrote on 2025-03-02, 22:00:

I have a Trinitron from the exact same series as that one (UltraScan 1000HS, based on the D-1 chassis), but mine is the 17" model (D1025TM... CRTdatabase link here.) I found it in a dumpster in 2014 - cut cable, and the screen's AR coating terribly scratched up. But it worked after frankensteining a cable.
Anyways, these are exactly the last of the good series of Trinitrons that I mentioned. So definitely a nice save on your part. And being smaller 15" screens (14" viewable), yours should look perfectly fine and smooth at 1024x768. Even 640x480/VGA won't look too blocky.

As for maintenance - glad to hear you opened yours to clean the dust. Other than that, there isn't really much else to be done with them. Just use them regularly and have fun! 😀

The only thing I do want to warn about is the beige plastic on some of these Trinitrons (and, well, other CRTs... though not all): on certain models/series, the plastic can become extremely brittle and easy to break. If you start seeing any cracks around the corners of the bezel or underneath where the base is, you might have to take it apart again and re-glue it. The last Trinitron monitor I picked up, in 2022, was a 19" unit that unfortunately dropped from a 10-foot-high patio onto a lawn. Why? Because the stand fell through into the case, making the monitor tilt and fall. I got it from someone on CL at the time, who had actually gotten it from an old lady. The monitor wasn't working when he got it, and the case was already cracked. He taped it up and put it out on the rail of his porch for CL pickup. The tape didn't hold, the stand fell in and tilted, and then bang - shattered monitor. BUT! The tube and mainboard are still good!!! As if by miracle. When I get time some day, I need to mend the neck board back together and try it. There's a chance it might actually work.

Anyways, the point of the story is: be careful if you start seeing cracks anywhere on or under the case, and especially with how you handle the monitor. On another 19" I have, I was packing it for shipping for my move, and as I lightly grabbed it from the side to put it in the box, my hand partially fell through one side of the monitor - the plastic had become that brittle (even though it wasn't really yellowed.)

Sweet, good to know! Yeah, I am a little worried about the plastic; I used a metal flathead screwdriver to pry the plastic clips apart in some areas when I was doing the maintenance, and it definitely had me sweating at times... I'll be using plastic spudgers from now on for any future maintenance haha. And that's a crazy story about the Trinitron you picked up... When I was talking to the dude selling the monitors, he had probably around 30 or so (he was an older gentleman who owned an in-house PC business back in the '90s-early 2000s). I'd done some research on older video game consoles and TVs, and when I saw the Trinitron branding on the TV, I knew people coveted some of the Trinitron TVs for retro game consoles, so I thought it couldn't hurt to pick up two of those. So far, no issues 😀

I'm actually going to go back to that guy again and pick up a few more monitors and computer parts; are there any recommended brands for '90s to early-2000s VGA CRTs? I think the guy has a good number of 15" and a few 17" ones left. Any brands to avoid in that case, as well?

Reply 29 of 31, by ArRoW_4_U


Hmm, kinda tricky with the GX280; I only have experience with the SFF version, and my unit actually died long ago due to broken solder joints underneath the CPU socket. (Guess Dell was actually the first to introduce the "Red Ring of Death" haha)

One thing you can check is whether the PSU fan actually changes speed under load or when the unit gets hot. On the SFF version with the different PSU, the fan is wired directly to 5V, which never makes it spin any faster, so the whole system just cooks itself to death. If your unit doesn't change fan speed, then you could try re-wiring the fan to 7V, which will move more hot air out of the system (see the sketch below).
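For reference, this is the common "7V trick": powering the fan from the difference between the two rails. A sketch, assuming a standard Molex-fed fan - check the fan's current draw first, and note that its return current flows back into the +5V rail (small case fans generally don't mind, but it's worth knowing):

    PSU +12V (yellow) -> fan +
    PSU +5V  (red)    -> fan -     (the fan sees 12V - 5V = 7V)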

CPU-wise, it doesn't make sense to go higher than 3.00 GHz in these machines: not only does the PSU not provide sufficient power, it also runs even hotter. I have a GX270 SFF and recently tried a 3.00 GHz P4 Prescott, and it was so bad I swapped the Northwood one back in. The MOSFETs will burn, which will cook the caps alive no matter how much "airflow" they get. Intel pushed really hard with "Preshot", and the VRMs inside these machines just can't handle the load. (Also a reason why faster CPUs will be underclocked by the BIOS.)

Do yourself a favor and try to get a 65W TDP Cedar Mill P4; they run at least somewhat cooler than the "Preshot" your system came with.

Reply 30 of 31, by RubDub2k


FINAL UPDATES/CONCLUSIONS:

1) Good news: the Pentium 4 651 SL9KE does in fact work in this machine! Although it isn't officially sanctioned by Dell, it worked without any issues as far as I can tell (about 1 hour of running/benchmarking). This was using BIOS revision A08, so I am not sure if it would work with previous BIOS versions. Anyways, the point is that yes, this machine can in fact run a 65W Cedar Mill P4, which is about a 24% decrease in heat output compared to the 86W Pentium 4 650 Prescott that was in there before. Nice.

2) As for the effect on performance... I mean, it definitely didn't do worse, and all the benchmarks I did suggested it did better, but only by about 1-5 FPS depending on the game tested (Return to Castle Wolfenstein, Half-Life 2, Minecraft). More notable was that my FPS minimums (with the exception of Minecraft) weren't as low as before. Where I'd been getting minimums of 45 FPS, I now found my low at 55 FPS (for Return to Castle Wolfenstein, at least). The general trend was that while my average didn't go up significantly, the critical lows improved a decent amount.
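(Side note for anyone wanting to repeat the HL2 numbers: Source's built-in demo playback is the usual way to get repeatable runs - record a demanding section once, then replay the same demo on each CPU. The demo name below is just a placeholder:

    record cputest      // start recording during gameplay
    stop                // stop recording
    timedemo cputest    // replay as fast as possible and report the average FPS

Return to Castle Wolfenstein, being an id Tech 3 game, has a similar timedemo facility of its own.)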

So, final conclusions for this thread? Cooling doesn't seem to be the limiting factor I thought it would be with this machine. After all this, the desktop variant of the OptiPlex GX280 seems to be able to handle an 86W Prescott pretty well thermally, assuming you've cleaned the fans/heatsink and applied new thermal paste. The 65W Pentium 4 651 offered some minor improvement, but honestly, if you can only find a Pentium 4 650 for this machine, thermals aren't going to be your limiting factor; the P4 NetBurst architecture will be. Hope this helps anyone in the future in deciding how best to set up one of these old machines.

Perhaps one day I'll try to get ahold of one of the mysteriously absent Pentium 4 SL9KDs... those run about 200 MHz faster. Thanks for the help, everyone,

- Rubdub

Reply 31 of 31, by StevOnehundred

RubDub2k wrote on 2025-03-09, 06:06:

....Perhaps one day I'll try to get ahold of one of the mysteriously absent Pentium 4 SL9KDs... those run about 200 MHz faster. Thanks for the help, everyone,

The SL9KD seems to be a step too far for the GX280. I've got a couple of them, and neither runs on any of my GX280 boards, whether SFF or MT versions - they happily do their thing on my Gigabyte G41 test board, however, so at least I know the CPUs aren't dead.

I presume the final BIOS (A08) is missing some code or whatever, but if anyone else has had any luck getting the Cedar Mill 661 SL9KD to run on a GX280, please let us know. And tell us the board revision 😀