VOGONS


First post, by 386SX

Rank: l33t

Hi,
after using an old Barton config for some time now, I've definitely come to reconsider these machines as still "usable" today. Obviously everything must be pushed to its limits, both in hardware choices and with the lightest possible software, but with a little patient reconfiguration I think they're just fine. (I'd even say the "single core concept" probably still has some benefits somewhere, even today.)
Now I was asking myself about gaming with today's games. Obviously we can't even think about detail or resolution settings above the minimum, but if we set the bar at the lowest resolution, say 640x480, with the best hardware of the era, are these 2003/2004/2005 32-bit machines capable of pushing out at least a few frames per second?
I can't try it myself, not having a Windows OS or newer games like Doom 2016, but did anyone try?
Thanks

Reply 1 of 19, by Koltoroc

Rank: Member

The results will be very disappointing unfortunately. Most modern games will not even run on those machines.

Assuming you mean period-correct hardware including the GPU, everything that needs DirectX 11 will be unable to run, which excludes most games from around 2012 onwards, since that was around the time DX9 started to be phased out. From about 2014 you also have the problem that games are increasingly 64-bit only. In fact it is now so common that some games don't even bother to advertise the fact anymore. That pretty much kills your 32-bit systems right there.

However, even 64-bit systems of that era are out of luck once you go to DX12 games, since Windows 10 64-bit needs some instructions (CMPXCHG16B) that are not present in many early 64-bit CPUs, and I can't think of any DX12 game that has a 32-bit version.
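To illustrate (not something the games actually publish, just a minimal sketch of my own, assuming MSVC-style intrinsics on x86): CPUID leaf 1 reports SSE2, SSE4.x and CMPXCHG16B as individual feature bits, and a launcher check boils down to something like this:

```cpp
// Hypothetical pre-launch CPU feature check (MSVC-style intrinsics).
// CPUID leaf 1: EDX bit 26 = SSE2, ECX bit 19 = SSE4.1, ECX bit 20 = SSE4.2,
// ECX bit 13 = CMPXCHG16B.
#include <intrin.h>
#include <cstdio>

int main() {
    int r[4] = {0, 0, 0, 0};   // r[0]=EAX, r[1]=EBX, r[2]=ECX, r[3]=EDX
    __cpuid(r, 1);
    std::printf("SSE2:       %s\n", (r[3] & (1 << 26)) ? "yes" : "no");
    std::printf("SSE4.1:     %s\n", (r[2] & (1 << 19)) ? "yes" : "no");
    std::printf("SSE4.2:     %s\n", (r[2] & (1 << 20)) ? "yes" : "no");
    std::printf("CMPXCHG16B: %s\n", (r[2] & (1 << 13)) ? "yes" : "no");
    return 0;
}
```

A Barton, for example, already fails at the SSE2 line, long before DirectX level or core count even comes into it.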

As for performance, I think "seconds per frame" is probably closer to what you'd get than fps, especially since many modern games are really, really poorly optimized.

BTW, Doom 2016 will not run on period-correct hardware of that timeframe: 64-bit executable, 4+ GB RAM usage at times, OpenGL 4.3/4.5 or Vulkan (DirectX 11 hardware level). Interestingly enough, if this game could run I would expect it to be one of the better-performing ones, since it seems to be really well optimized for a change.

TL;DR: probably not going to work at all, performance likely less than unplayable.

Reply 2 of 19, by agent_x007

Rank: Oldbie

Problems with old hardware vs. new games:
1) Dual core only (meaning you can't install/launch without two proper cores inside the CPU, and Hyper-Threading does not count as a "core").
The first dual cores became available in 2005, but you can use server-grade stuff with dual sockets to bypass this limitation. I DO NOT know how newer games react to this idea though.
2) SSE2 or higher only (you need at least an Athlon 64 or Pentium 4 to run Steam these days...).
3) 64-bit: if the game can run on Windows 7 x64 and your CPU has 64-bit support, you are "OK" here.
4) DirectX 11...
Just buy a PCIe DirectX 11 GPU and be done with it (a Radeon HD 5770 or GTX 460 is cheap these days). Also, DirectX 10 class hardware isn't known for the best longevity (GF 8800/9800 series, Radeon HD 4800 series, and some other cards); a minimal feature-level probe is sketched below.
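Rough sketch of what that DirectX 11 check amounts to (my own illustration, not from any particular game): ask D3D11CreateDevice for feature level 11_0 and give up if the GPU can't provide it. DX10-class cards like the 8800/9800 or HD 4800 fail this probe.

```cpp
// Hedged sketch of a DirectX 11 feature-level probe (Windows, MSVC).
// We only ask whether feature level 11_0 exists; no device or context is kept.
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main() {
    const D3D_FEATURE_LEVEL wanted[] = { D3D_FEATURE_LEVEL_11_0 };
    D3D_FEATURE_LEVEL got = D3D_FEATURE_LEVEL_9_1;
    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        wanted, 1, D3D11_SDK_VERSION,
        nullptr, &got, nullptr);   // pass null for device/context, we only probe
    if (FAILED(hr)) {
        std::puts("No Direct3D 11 (feature level 11_0) hardware found");
        return 1;
    }
    std::puts("Feature level 11_0 available");
    return 0;
}
```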

As for DOOM 2016, the oldest CPU I ran it on so far was a Pentium Extreme Edition 840 (from 2005): LINK
It works fine... when the CPU is OC'ed 😜
PS. An FX-60 would be OK as well for Doom 2016, but the Vulkan API hates 2 or fewer threads, so it's a no-go for AMD.
^For non-believers in the 2-thread Vulkan problem, here's a video: LINK

Last edited by agent_x007 on 2016-12-18, 16:48. Edited 1 time in total.


Reply 3 of 19, by Munx

Rank: Oldbie

I would have to assume that 2002-2004 hardware, like the Radeon R300 with its DX9 support, would still be "usable" today (as in, stuff could start up without throwing an error message right away) with more casual and 'e-sports' games, as they are often DX9: League of Legends, DotA 2 and other Source games like TF2, etc.

However, that is pretty much it, as games today rely more and more on new APIs, instruction sets, etc. Even CPUs that still have some good horsepower left, like the Phenom II X6, are having problems as more games require SSE4.1.

My builds!
The FireStarter 2.0 - The wooden K5
The Underdog - The budget K6
The Voodoo powerhouse - The power-hungry K7
The troll PC - The Socket 423 Pentium 4

Reply 5 of 19, by Standard Def Steve

Rank: Oldbie
emosun wrote:

I once had a 1999 dual Pentium 3 system playing Crysis. All on the low settings, but still pretty amazing nonetheless.

Same here! My single PIII-1575 managed to cough up 21.3 fps at 800x600/low.

P6 chip. Triple the speed of the Pentium.
Tualatin: PIII-S @ 1628MHz | QDI Advance 12T | 2GB DDR-310 | 6800GT | X-Fi | 500GB HDD | 3DMark01: 14,059
Dothan: PM @ 2.9GHz | MSI Speedster FA4 | 2GB DDR2-580 | GTX 750Ti | X-Fi | 500GB SSD | 3DMark01: 43,190

Reply 6 of 19, by Anonymous Freak

Rank: Member

If you want "newest game on oldest hardware", I'd have to say this Kickstarter would fit the bill: https://www.kickstarter.com/projects/stirring … nd-commodore-64

Although it's going to be a Commodore 64 game. ("PC" game in the form of shipping with a C64 emulator, so you'll actually be playing the C64 game when you play it on your modern PC.)

Reply 8 of 19, by Munx

Rank: Oldbie
emosun wrote:

I once had a 1999 dual Pentium 3 system playing Crysis. All on the low settings, but still pretty amazing nonetheless.

That's not a "new" game though. 2007 was almost a decade ago, and Tualatins were just 5-6 years old then.

That would be like running the newest games today on a Sandy Bridge i7 (which I'm sure tons of people still do).
Hell, I'm still on the crappy Bulldozer and I can still run today's games at an acceptable framerate.

My builds!
The FireStarter 2.0 - The wooden K5
The Underdog - The budget K6
The Voodoo powerhouse - The power-hungry K7
The troll PC - The Socket 423 Pentium 4

Reply 9 of 19, by sf78

Rank: Oldbie
Munx wrote:

Hell, I'm still on the crappy Bulldozer and I can still run today's games at an acceptable framerate.

Right on! My main rig is a Phenom II with a GTX 670, and I just started playing the latest Deus Ex; it runs just fine on medium-high settings.

Reply 10 of 19, by 386SX

Rank: l33t
agent_x007 wrote:
Problems with old hardware vs. new games : 1) Dual Core only (I mean you can't install/launch without proper two cores inside CP […]

Thanks for your answers!

As I imagined, it seems that most of the time the check is just whether the CPU supports specific instructions or has more than one core. I imagine many games are developed to be generally portable across the PS, Xbox and PC, resulting in very unoptimized software. I don't understand why all these new CPU instructions are so important anyway; did SSE4.x really improve computing speed in a game that much?

Reply 11 of 19, by Koltoroc

Rank: Member
386SX wrote:

As I imagined, it seems that most of the time the check is just whether the CPU supports specific instructions [...] did SSE4.x really improve computing speed in a game that much? […]

Games like Doom show optimization is easily possible, now more so than ever before, since the PS4 and Xbox One are just glorified PCs anyway (x64 CPU, DX11-class GPU, OpenGL or DirectX 11/12). The problem is primarily that publishers refuse to invest the necessary time and money to properly optimize games.

For about 5 years now all newly sold, reasonably powerful CPUs from Intel or AMD have had SSE4.x, and considering how gamers tend to upgrade, it is not unreasonable to assume most gamers by now have a CPU supporting it. So why not use it? It sucks for CPUs like the Phenom II, but they are already 8 years old and predate DirectX 11 hardware by a year, which makes those architectures ancient by gaming standards. I'm not surprised that those CPUs are now slowly being deprecated; I'm more surprised it took this long.
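A toy example of why developers bother (my own sketch, not from any of those games): with SSE4.1 a 4-float dot product collapses into a single _mm_dp_ps instruction instead of a scalar multiply-add loop, and similar shortcuts exist all over physics and animation code.

```cpp
// Toy comparison: scalar 4-float dot product vs. the SSE4.1 _mm_dp_ps version.
#include <smmintrin.h>   // SSE4.1 intrinsics
#include <cstdio>

static float dot4_scalar(const float a[4], const float b[4]) {
    float s = 0.0f;
    for (int i = 0; i < 4; ++i) s += a[i] * b[i];
    return s;
}

static float dot4_sse41(const float a[4], const float b[4]) {
    __m128 va = _mm_loadu_ps(a);
    __m128 vb = _mm_loadu_ps(b);
    // 0xF1: multiply all four lanes, sum them, write the result to lane 0
    return _mm_cvtss_f32(_mm_dp_ps(va, vb, 0xF1));
}

int main() {
    const float a[4] = {1, 2, 3, 4}, b[4] = {5, 6, 7, 8};
    std::printf("scalar: %f  sse4.1: %f\n", dot4_scalar(a, b), dot4_sse41(a, b));
    return 0;
}
```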

Reply 12 of 19, by 386SX

Rank: l33t
Koltoroc wrote:
For about 5 years now all newly sold, reasonably powerful CPUs from Intel or AMD have had SSE4.x, and considering how gamers tend to upgrade […]

But sometimes I think that when an old CPU becomes "slow" it is related more to the lack of some "new" instructions than to the whole old architecture. And so I was wondering whether these instructions really "are needed" for improved physics/graphics/AI/etc., or are just a way to trash old hardware and make people buy newer stuff. And on top of that, if you add the whole console-market priority, I would not be surprised to see a "Wolf3D 2017" that needs an octa-core CPU.

Last edited by 386SX on 2016-12-19, 17:52. Edited 1 time in total.

Reply 14 of 19, by 386SX

Rank: l33t
leileilol wrote:

So does anyone have a Sempron64 DX11 system handy to take it through modern torture? 😀

I have a Sempron 2650 (2x 1.45 GHz, 1 MB cache, Radeon R3) but no Windows to try it with... By the way, I expected "more" from such a modern (OK, cheap 😁) CPU.

Reply 15 of 19, by Koltoroc

Rank: Member

I can see where you are coming from in regards to planned obsolescence, but I can't really agree with that. There are at least around 5 years between widespread adoption of those features in hardware and actual dependence on them. If software that depended on those features had been creeping up a year or less after their introduction I could easily see it, but in this case there is already enough market saturation of compatible hardware that the theory seems rather unfounded. It is only a handful of games anyway that depend on it. A cursory search for SSE4.x dependence turned up Quantum Break (Xbox One/PC cross-buy, no idea about the Steam version), Gears of War 4 (Xbox One cross-buy), Mafia III, MGS V (a patch seems to have removed the dependence) and No Man's Sky (seems to be patched out; don't know, don't care either). That looks more to me like developers actually catching up with the market. Considering they have patched it out in 2 of the cases, chances are the instructions are not strictly needed, but if using them is simpler, faster or more convenient than the alternatives, that might already be argument enough when the hardware is common enough to support it. Don't forget that there are pretty much no CPUs manufactured anymore without those instructions; even modern Atom CPUs include SSE4.x. And honestly, while it is nice in theory, you can't expect old, outdated and no longer manufactured hardware to be supported indefinitely.
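To be clear about what "patched out" usually means in practice (my own sketch, not taken from MGS V or No Man's Sky): the engine just picks between the SSE4.1 routine and a plain scalar fallback once at startup, so the instruction set is a convenience rather than a hard necessity.

```cpp
// Hedged sketch of runtime dispatch: detect SSE4.1 once, then route calls
// either to the SSE4.1 path or to a scalar fallback.
#include <intrin.h>
#include <smmintrin.h>
#include <cstdio>

static float dot4_scalar(const float a[4], const float b[4]) {
    float s = 0.0f;
    for (int i = 0; i < 4; ++i) s += a[i] * b[i];
    return s;
}

static float dot4_sse41(const float a[4], const float b[4]) {
    return _mm_cvtss_f32(_mm_dp_ps(_mm_loadu_ps(a), _mm_loadu_ps(b), 0xF1));
}

static bool has_sse41() {
    int r[4];
    __cpuid(r, 1);
    return (r[2] & (1 << 19)) != 0;   // CPUID.01H:ECX bit 19 = SSE4.1
}

int main() {
    // Pick the implementation once at startup; older CPUs get the slow path
    // instead of a crash or an "unsupported CPU" error.
    float (*dot4)(const float[4], const float[4]) =
        has_sse41() ? dot4_sse41 : dot4_scalar;
    const float a[4] = {1, 2, 3, 4}, b[4] = {5, 6, 7, 8};
    std::printf("dot = %f\n", dot4(a, b));
    return 0;
}
```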

Don't get me wrong, planned obsolescence is most certainly a thing, and a very nasty one at that, but at least in the PC market I just can't see it as a major factor. The usable lifetime of most hardware, outside of outright broken components, is just way too long to give it credence as a motivating factor in most cases.

As for the console CPU core count, I think those fears are rather unfounded. The part about the console CPUs that most people, and especially fanboys, like to forget is that those "Jaguar" cores are AMD's Atom equivalent: low power, low performance, but more cores. Pretty much all modern quad-core or higher CPUs (i5/i7/FX series) are way more powerful than the 8-core console CPUs, to the point that even if the games use as many threads as the consoles have CPU cores, there is no way the desktop CPUs will perform slower.

Reply 16 of 19, by 386SX

Rank: l33t
Koltoroc wrote:

I can see where you are coming from in regards to planned obsolescence, but I can't really agree with that […]

Interesting point. In my upgrade history I stopped back in the A64 generation, after spending a lot on a "medium" config only to see that newer games already couldn't run smoothly. So I went to a notebook, then a netbook, and began downshifting and downgrading to vintage hardware. Using a 32-bit machine right now for every task proves your point: they remain functional well beyond their supposed hardware lifetime.
But considering the complexity and capabilities of the newer CPU generations (not the cases where the increase in speed just means bigger heatsinks/fans/stock specifications), I would expect much, MUCH more. As always happened in the past, the PC easily and quickly surpasses console hardware, but are the results really that different? Besides resolution, I mean. Back in the 90s, when you could play the original Doom, you knew you would never see the same thing on a console, and that was a good reason to upgrade or build a new PC.
You could say the same about office/web tasks, where you can hardly believe you need a quad-core SSEx CPU just to use a modern gigabyte-heavy word processor, or that the web itself has become absurdly complex to load.

Reply 17 of 19, by Koltoroc

Rank: Member

CPU evolution has stagnated for the last 5 or 6 years. The reason for that is simple: Intel had no real competition. There hasn't been any significant development outside of some marginal IPC, clock speed and efficiency improvements. AMD simply had nothing even remotely competitive for the last few years, and the only reason their FX CPUs sold was either price (very cheap CPUs and, more importantly, cheap but decent motherboards) or an upgrade path for older AM3 systems. That is why I have an FX-8350 now, because I could replace things one component at a time. If I had had the money I would be running Intel now. I hope this will change with the upcoming Zen, which, if the rumours are accurate, will be at minimum equal to, if not better than, Intel's high-end CPUs for about half the price. We shall see.

God, I hope AMD doesn't fuck this up.

Unlike most older console generations, the PS4 and Xbox One were already heavily outdated the moment they were released. The PS4 Pro and Xbox Scorpio are what they should have been at launch. The original PS4 and Xbox One didn't even reach mid-range PC performance. Heck, the standard models *still* can't do 1080p30 without framerate issues, and the PS4 Pro is forced to do 4K(ish) at 30 fps without the performance to actually do that properly. And that is the main issue: playability suffers thanks to low framerates. I mean, I can deal with 30 fps if I have to, but only if the framerate is actually stable, and that is not the case with consoles. The consoles right now are in a really shit situation. None of them are good enough to actually do what they are advertised for, and interesting exclusives are the only saving grace, for the PS4 at least. And that is just the pure performance issues. The PC also offers way better image quality thanks to things like texture packs the consoles couldn't even load (around 6 GB of texture memory use in GTA V, for example; actual use, btw).

Web tasks are a different issue. Modern browsers are so loaded with crap that they eat up resources like candy. I use Chrome and I get memory usage of up to 15 GB (I have 24 GB total). The heavy scripting done on websites doesn't exactly help either. At least video decoding can be done on the GPU. It is insane how the requirements for "simple" web browsing have exploded. I don't even know who to blame for it.

At least office tasks are simple; those can be done on any machine from the last 15-20 years with sufficient ease.

Reply 18 of 19, by 386SX

Rank: l33t
Koltoroc wrote:
CPU evolution has stagnated for the last 5 or 6 years. The reason for that is simple: Intel had no real competition. There hasn't […]

When I heard about the PS4 Pro I really couldn't believe it, because I was used to the idea that a new console generation would, and should, last a long time. If we think about the 360, its lifetime was incredible, and I can still say I was surprised to see games with quite nice graphics even at the end. And on paper the latest games should still run on the first console released. In fact, besides the VR requirements, I couldn't explain it.
But can this also be explained by the continuous lack of optimization in newer software? It seems like optimizing is the last thing considered, probably for time/cost reasons, or maybe because the tools used are becoming more and more high-level?

Reply 19 of 19, by candle_86

Rank: l33t

The oldest hardware still useful for gaming would be:

Any Core 2 faster than a Q6600/E6750
Any AMD Athlon II X3 or Phenom II X3, or faster
A GeForce GTX 460 / AMD Radeon HD 5770