VOGONS



First post, by Kreshna Aryaguna Nurzaman

User metadata
Rank l33t

Let's take these two Ivy Bridge processors as examples: the Intel i7-3770T (2.5 GHz) and the Intel Pentium G2030 (3.0 GHz). The i7 has the lower clock speed but certainly more processing power, while the Pentium is the opposite.

Well, it seems there are not many early-2000s games that are speed-sensitive, although I can identify at least two: Crimson Skies and F-22 Lightning 3. In the former, the roll rate of certain aircraft becomes uncontrollably fast when there are no textures on the screen (as in night missions), while in the latter, it's very hard to aim the cannon.

For such speed-sensitive games, which is actually worse: lower clock speed with greater processing power, or higher clock speed with less processing power? And which is easier to "tame" (more friendly to CPU slowdown utils): higher clock speed or higher processing power?
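The failure mode described above can be sketched in a few lines: if a game applies a fixed increment per rendered frame rather than per unit of time, its effective speed scales directly with frame rate. A minimal illustration in Python, with hypothetical numbers (this is not either game's actual logic):

```python
# Sketch: why per-frame game logic is speed-sensitive.
# A game that rolls the aircraft a fixed amount each rendered frame
# rolls faster on hardware that renders more frames per second.
ROLL_PER_FRAME = 0.5  # degrees added each frame (hypothetical tuning)

def roll_after_one_second(fps):
    """Total roll accumulated in one wall-clock second at a given frame rate."""
    return ROLL_PER_FRAME * fps

slow_pc = roll_after_one_second(30)   # 15.0 degrees/s, as the designers intended
fast_pc = roll_after_one_second(300)  # 150.0 degrees/s, uncontrollably fast
print(slow_pc, fast_pc)
```

With fewer textures to draw (night missions), the frame rate climbs, and with it the roll rate, which matches the Crimson Skies symptom.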

Never thought this thread would be that long, but now, for something different.....
Kreshna Aryaguna Nurzaman.

Reply 1 of 39, by carlostex

User metadata
Rank l33t

The two examples you present aren't applicable, since those are both Ivy Bridge cores. The i7 is only gonna show more processing power if the software is aware of multiple cores or other features. For the games you mentioned, the Pentium is gonna be faster, since it has a higher clock speed and the same basic microarchitecture as the i7.

Then other questions arise. It is quite possible that an older CPU architecture might run very old software faster than a more recent one. Take a 3 GHz Core versus a 3 GHz Nehalem: the more conventional architecture could be faster, because on newer architectures certain features that no longer matter much become deprecated.

So it really doesn't matter much; newer CPUs are quite a problem with older speed-sensitive software.
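The point about clock speed versus architecture can be framed as rough arithmetic: single-thread throughput is approximately IPC (instructions per clock) times clock speed, so two chips with the same core differ only by clock, while an older core can lose at the same clock. A sketch with assumed, purely illustrative IPC figures (turbo ignored):

```python
# Sketch: single-thread throughput ~ IPC x clock.
# IPC numbers below are illustrative assumptions, not measured values.
def throughput_gips(ipc, clock_ghz):
    """Approximate billions of instructions retired per second."""
    return ipc * clock_ghz

i7_3770t = throughput_gips(2.0, 2.5)  # same Ivy Bridge core, lower clock
g2030    = throughput_gips(2.0, 3.0)  # same core, higher clock: faster single-thread
old_arch = throughput_gips(0.8, 3.0)  # older architecture: same clock, far lower IPC
print(i7_3770t, g2030, old_arch)
```

Same core means the clock decides; different cores mean IPC can dominate, which is why a higher-clocked older chip can still lose.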

Reply 2 of 39, by Kreshna Aryaguna Nurzaman

User metadata
Rank l33t
carlostex wrote:

The two examples you present aren't applicable, since those are both Ivy Bridge cores. The i7 is only gonna show more processing power if the software is aware of multiple cores or other features. For the games you mentioned, the Pentium is gonna be faster, since it has a higher clock speed and the same basic microarchitecture as the i7.

I thought the i7 had better single-core performance as well, compared to the i5, i3, or Pentiums. Am I wrong?

carlostex wrote:

Then other questions arise. It is quite possible that an older CPU architecture might run very old software faster than a more recent one. Take a 3 GHz Core versus a 3 GHz Nehalem: the more conventional architecture could be faster, because on newer architectures certain features that no longer matter much become deprecated.

Hmmm... I hope so. The mini-ITX system I'm building is going to use Ivy Bridge. I hope it will run slower for the early Windows XP games I'm gonna run on the system.

carlostex wrote:

So it really doesn't matter much; newer CPUs are quite a problem with older speed-sensitive software.

Is higher clock speed easier to tame with CPU slowdown utils?

Never thought this thread would be that long, but now, for something different.....
Kreshna Aryaguna Nurzaman.

Reply 3 of 39, by carlostex

User metadata
Rank l33t

The i7 has the same single-core performance as an i3 or i5 if they are based on the same core and/or the same microarchitecture.

Intel is not designing 16 different chips; they probably have just two different dies to print on wafers, possibly dual cores and quad cores, not counting their high-end chips. Because no wafer has a 100% yield, some quad cores might be harvested and sold as dual cores: they just disable the non-functioning cores along with the corresponding cache.

The mini-ITX system you'll build will be fine running XP; just make sure you choose a motherboard that still provides all the proper drivers for XP. If you are using the integrated graphics, make sure you get compatibility there too. With all that out of the way, just take note of the speed-sensitive games you want to run, and check whether the motherboard has good BIOS options or Windows software to declock the chip. I remember I could downclock my Core 2 Duo E8400 (which normally runs at 3.6 GHz) as low as 1200 MHz, combining the lowest multiplier with a low FSB setting. I bet I could go even further. You'll be able to lower the VCore voltage as well, saving a bit of power.
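The downclocking described above is just bus-times-multiplier arithmetic. A sketch with assumed settings (not taken from any specific BIOS):

```python
# Sketch: effective core clock = FSB (bus) x multiplier.
# The settings are assumed examples for a Core 2-era board.
def core_clock_mhz(fsb_mhz, multiplier):
    return fsb_mhz * multiplier

downclocked = core_clock_mhz(200, 6)  # 1200 MHz: lowest multiplier plus a low FSB
raised      = core_clock_mhz(400, 9)  # 3600 MHz: a high-FSB setting for comparison
print(downclocked, raised)
```

Either knob scales the result, which is why combining a low multiplier with a low bus speed reaches far below stock.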

As for slowdown utils, I don't know; I've never tried them.

Reply 4 of 39, by Kreshna Aryaguna Nurzaman

User metadata
Rank l33t

carlo, based on the CPU Boss comparison, the Intel i7-3770T's PassMark single-threaded score is still better than the Intel Pentium G2030's (9.2 vs 8.5) despite the former's lower clock speed (2.5 GHz vs 3.0 GHz). I wonder which one would be more problematic, i.e. too fast, for older, speed-sensitive games. It should be noted that the i7 has larger L1, L2, and L3 caches as well.
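Dividing those quoted scores by clock speed makes the per-clock gap explicit. This is only arithmetic on the figures above, and part of the i7's edge may simply be Turbo Boost raising its actual clock during the benchmark:

```python
# Score per GHz from the figures quoted above (CPU Boss / PassMark single-thread).
i7_score, i7_ghz = 9.2, 2.5
pentium_score, pentium_ghz = 8.5, 3.0

i7_per_ghz = i7_score / i7_ghz                 # 3.68: more score per clock
pentium_per_ghz = pentium_score / pentium_ghz  # ~2.83
print(round(i7_per_ghz, 2), round(pentium_per_ghz, 2))
```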

Never thought this thread would be that long, but now, for something different.....
Kreshna Aryaguna Nurzaman.

Reply 5 of 39, by carlostex

User metadata
Rank l33t

Caches are only bigger on the i7 because it is a quad core: each core has its own dedicated L1 and L2, and all the cores share one L3 cache.

The i7 has Turbo, so on a single core the chip is probably gonna clock higher than 3 GHz.

Oh and BTW, benchmarks can be dubious.

Reply 6 of 39, by Mau1wurf1977

User metadata
Rank l33t++

Yea it has Turbo. Easy to forget that 😀

My website with reviews, demos, drivers, tutorials and more...
My YouTube channel

Reply 7 of 39, by meisterister

User metadata
Rank Newbie

Sorry for the bump, but it doesn't really look like anyone answered your question. 🤣

In older games, the difference really should just be the raw speed at which the computer can run single-threaded tasks (provided that the game is single-threaded, which most games of the time were). The general logic of older games was to crank out as many frames as possible and hope that the hardware's limitations would prevent the game from running too fast. Of course, that worked fine when you would only expect your game to run on a 386 or a Pentium III, but CPUs now are way faster.

This isn't to say that clock speed doesn't matter. All the clock speed indicates is how many cycles a given CPU completes per second: the 3 GHz Pentium you gave as an example can be expected (not counting power-saving features or Turbo) to cycle 3 billion times in a second. That's it. The same could be said of a 3 GHz Pentium 4. The difference is the CPU's architecture: the P4 could do exactly jack-all in each clock cycle, while the Ivy Bridge CPU can do significantly more.

The example you gave was between a Pentium and an i7 of the same generation. Provided that they both turbo to the same speed, I would expect the i7 to be faster per core, though not per thread, because it has a good bit more cache. Now, if you loaded up the other threads on the i7, the per-core performance should drop a bit as the threads compete for resources. I would thus expect the Pentium to be worse (given that the i7 can't turbo), as it has no extra threads competing for resources and runs at a higher clock rate.
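The frame-cranking behaviour described above has a standard remedy that later games adopted: scale each update by the measured frame time instead of assuming a fixed per-frame step. A minimal sketch of both styles (hypothetical game logic, not from any of the titles mentioned):

```python
# Sketch: per-frame stepping vs delta-time stepping.
ROLL_RATE_DPS = 15.0  # intended degrees per second

def per_frame_roll(frames, step_per_frame=0.5):
    """Old style: a fixed step each frame, tuned for ~30 fps hardware."""
    return step_per_frame * frames

def delta_time_roll(frames, dt):
    """Later style: scale by measured frame time; frame-rate independent."""
    return ROLL_RATE_DPS * dt * frames

# One simulated second at two frame rates (powers of two keep the floats exact):
print(per_frame_roll(32), per_frame_roll(256))                 # 16.0 vs 128.0: 8x too fast
print(delta_time_roll(32, 1 / 32), delta_time_roll(256, 1 / 256))  # 15.0 both times
```

Games built the old way are exactly the ones that need slowdown utilities on modern CPUs; delta-time games simply render more frames of the same motion.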

Dual Katmai Pentium III (450 and 600MHz), 512ish MB RAM, 40 GB HDD, ATI Rage 128 | K6-2 400MHz / Pentium MMX 166, 80MB RAM, ~2GB Quantum Bigfoot, Awful integrated S3 graphics.

Reply 8 of 39, by Kreshna Aryaguna Nurzaman

User metadata
Rank l33t
meisterister wrote:

Sorry for the bump, but it doesn't really look like anyone answered your question. 🤣

Indeed. 🤣

meisterister wrote:

In older games, the difference really should just be the raw speed at which the computer can run single-threaded tasks (provided that the game is single-threaded, which most games of the time were). The general logic of older games was to crank out as many frames as possible and hope that the hardware's limitations would prevent the game from running too fast. Of course, that worked fine when you would only expect your game to run on a 386 or a Pentium III, but CPUs now are way faster.

This isn't to say that clock speed doesn't matter. All the clock speed indicates is how many cycles a given CPU completes per second: the 3 GHz Pentium you gave as an example can be expected (not counting power-saving features or Turbo) to cycle 3 billion times in a second. That's it. The same could be said of a 3 GHz Pentium 4. The difference is the CPU's architecture: the P4 could do exactly jack-all in each clock cycle, while the Ivy Bridge CPU can do significantly more.

The example you gave was between a Pentium and an i7 of the same generation. Provided that they both turbo to the same speed, I would expect the i7 to be faster per core, though not per thread, because it has a good bit more cache. Now, if you loaded up the other threads on the i7, the per-core performance should drop a bit as the threads compete for resources. I would thus expect the Pentium to be worse (given that the i7 can't turbo), as it has no extra threads competing for resources and runs at a higher clock rate.

Hmmm... is it possible to underclock the Pentium G2030 with an H61 mobo?

Never thought this thread would be that long, but now, for something different.....
Kreshna Aryaguna Nurzaman.

Reply 9 of 39, by obobskivich

User metadata
Rank l33t
Kreshna Aryaguna Nurzaman wrote:

Hmmm... is it possible to underclock the Pentium G2030 with a H61 mobo?

Don't see why it wouldn't be; it would probably be easier with one of the new Anniversary chips (unlocked multiplier), but you should still be able to drop the FSB (okay, "reference clock") and bring the clock speed down. EIST will also automatically underclock in response to low load; now, whether or not that will be sufficient for speed-sensitive games (or whether any sudden jumps up to full speed would cause problems) is another question altogether. 😊

Reply 10 of 39, by Kreshna Aryaguna Nurzaman

User metadata
Rank l33t
obobskivich wrote:
Kreshna Aryaguna Nurzaman wrote:

Hmmm... is it possible to underclock the Pentium G2030 with a H61 mobo?

Don't see why it wouldn't be; it would probably be easier with one of the new Anniversary chips (unlocked multiplier), but you should still be able to drop the FSB (okay, "reference clock") and bring the clock speed down. EIST will also automatically underclock in response to low load; now, whether or not that will be sufficient for speed-sensitive games (or whether any sudden jumps up to full speed would cause problems) is another question altogether. 😊

Thanks. Well, I remember reading somewhere that H77 allows you to overclock while H61 doesn't. I could be wrong, though.

Never thought this thread would be that long, but now, for something different.....
Kreshna Aryaguna Nurzaman.

Reply 11 of 39, by oerk

User metadata
Rank Oldbie
Kreshna Aryaguna Nurzaman wrote:

Thanks. Well I remember reading somewhere that H77 allows you to overclock while H61 doesn't. I could be wrong though.

It seems that there are some newer H81/B85 mainboards that allow overclocking if you flash a beta BIOS:
http://www.tomshardware.com/reviews/pentium-g … cking,3888.html

Reply 12 of 39, by Kreshna Aryaguna Nurzaman

User metadata
Rank l33t
oerk wrote:
Kreshna Aryaguna Nurzaman wrote:

Thanks. Well I remember reading somewhere that H77 allows you to overclock while H61 doesn't. I could be wrong though.

It seems that there are some newer H81/B85 mainboards that allow overclocking if you flash to a beta BIOS:
http://www.tomshardware.com/reviews/pentium-g … cking,3888.html

I have yet to find H81/B85 mobos that support Windows XP, but maybe I could try.

Anyway, I'd like to restate the goal of the system. It is to be a mini-ITX Windows XP legacy system, which will be used for three main purposes:
(1) to play early Windows XP games like MiG Alley, Emperor: Battle for Dune, Crimson Skies, and Freedom Force. I already have a GTX 280 for this job.
(2) to serve as an HTPC/audiophile PC to feed my AV receiver.
(3) for daily work like browsing, office, Photoshop, virtual machines, and the like. Naturally, this particular goal is multitasking-heavy, especially since I mostly do my listening while working.

My initial plan was to use an Intel-based PC, that is, a Gigabyte H61-USB3 with an Intel Pentium G2030. Hence, this thread.

However, just a few hours ago I was suddenly tempted to use AMD instead. AMD seems to be focusing on multicore performance, while its single-core performance is less than desirable. However, I think that suits my needs perfectly.

See, AMD seems to have pretty good multicore performance, which suits my daily work: heavily multitasked, with many active applications ranging from Winamp, Firefox, Word, Excel, Visio, and Photoshop to virtual machines, while copy-pasting between those applications.

On the other hand, AMD's slow single-core performance would make it perfect for playing old games. I wouldn't have to worry about games like Crimson Skies being unmanageably fast when there aren't many textures on the screen, for example.

My plan is to use an MSI FM2-A75IA-E53 with an AMD A8-5600K. Yes, the A8-5600K is clocked pretty fast, but it can be underclocked, can't it?

There are advantages and disadvantages to the new alternative, but I think they are all minor.

Disadvantages:
(1) the MSI FM2-A75IA-E53 doesn't have an SPDIF header, so I cannot connect the Realtek sound chip's digital output to a discrete video card like the GTX 280. Alas, I haven't found any mini-ITX AMD mobo that has an SPDIF header. But this is a minor issue, since I can always use an external sound card like the Creative X-Fi USB.
(2) the MSI FM2-A75IA-E53 uses a Realtek ALC887, which only supports EAX and not A3D, while the Gigabyte H61N-USB3 has a Realtek ALC889, which supports both EAX and A3D. But then again, this is a minor issue. The Creative X-Fi USB supports A3D, doesn't it?

Advantages:
(1) low single-core performance, as I mentioned above.
(2) unlike Intel's, AMD's integrated graphics is not shitty. In fact, the AMD A8-5600K has a Radeon HD 7560D, which is better than my laptop's GeForce 310M.

I don't know how backward-compatible the Radeon HD 7560D is with old games, but at least I can alternate between the Radeon HD 7560D integrated graphics and the GeForce GTX 280 discrete card. Jane's World War II Fighters, for example, doesn't run on nVidia's Tesla-generation cards (it doesn't run on the 310M), but it runs on AMD 760G integrated graphics.

So I guess the AMD platform has its own benefits.

However, there is one thing I need to know:

Do modern AMD CPUs have backward compatibility problems with old games?

The latest AMD CPU I used was an Athlon XP (T-bred), and it never had problems with old 3dfx games, at least the ones I played. However, what about newer AMDs? Do they have backward-compatibility problems with early Windows XP games?

Never thought this thread would be that long, but now, for something different.....
Kreshna Aryaguna Nurzaman.

Reply 13 of 39, by ratfink

User metadata
Rank Oldbie

I never found issues with older games on the Athlon 64 X2, Phenom I, or Phenom II X4 - at least, no problems related to the CPU that I can recall.

One mini-ITX FM2+ board which does seem to have SPDIF out is the Gigabyte F2A88XN-Wifi.

Reply 14 of 39, by Kreshna Aryaguna Nurzaman

User metadata
Rank l33t
ratfink wrote:

I never found issues with older games on the Athlon 64 X2, Phenom I, or Phenom II X4 - at least, no problems related to the CPU that I can recall.

One mini-ITX FM2+ board which does seem to have SPDIF out is the Gigabyte F2A88XN-Wifi.

I see, thanks!

I'm actually tempted to be able to switch between Radeon integrated graphics (the Radeon HD 7560D that comes with the AMD A8-5600K's Trinity APU) and GeForce discrete graphics (the GeForce GTX 280 that I already have). I'm especially tempted to test Trinity GPU with older games like MiG Alley and Emperor: Battle for Dune, though the primary video card will still be the GTX 280.

Of course, switching between the Radeon integrated graphics and the GeForce video card is done from the BIOS, but I still need to install both Catalyst and ForceWare on the same Windows XP system. Is that possible? I mean, how much pain is it to install both Catalyst and ForceWare on the same system? It's something I've never done before.

Furthermore, can the latest Catalyst that still supports Windows XP also work with Trinity?

Never thought this thread would be that long, but now, for something different.....
Kreshna Aryaguna Nurzaman.

Reply 15 of 39, by awgamer

User metadata
Rank Oldbie

>My initial plan is to use intel-based PC; that is, Gigabyte H61-USB3 with Intel Pentium G2030. Hence, this thread.

Intel came out with the Pentium G3258, which is really good, going for around $66-70.

https://www.youtube.com/watch?v=u380jWhRXTI

Motherboard from the youtube video: $75 http://www.newegg.com/Product/Product.aspx?gc … 0140823011626:s

http://www.tomshardware.com/reviews/pentium-g … nce,3849-2.html
http://cpu.userbenchmark.com/Compare/Intel-Pe … 750K/2434vs1548

Reply 16 of 39, by Kreshna Aryaguna Nurzaman

User metadata
Rank l33t
awgamer wrote:

>My initial plan is to use intel-based PC; that is, Gigabyte H61-USB3 with Intel Pentium G2030. Hence, this thread.

Intel came out with the Pentium G3258, which is really good, going for around $66-70.

https://www.youtube.com/watch?v=u380jWhRXTI

Motherboard from the youtube video: $75 http://www.newegg.com/Product/Product.aspx?gc … 0140823011626:s

http://www.tomshardware.com/reviews/pentium-g … nce,3849-2.html
http://cpu.userbenchmark.com/Compare/Intel-Pe … 750K/2434vs1548

Thanks, although actually, faster single core speed is what I'm trying to avoid in this scenario.

Never thought this thread would be that long, but now, for something different.....
Kreshna Aryaguna Nurzaman.

Reply 17 of 39, by Kreshna Aryaguna Nurzaman

User metadata
Rank l33t

Hmmm....interesting stuff.

amdlova wrote:

To install two graphics drivers, you need to install the nVidia driver before the AMD driver. nVidia blocks the installation when it sees an AMD card plugged into the system, if I remember correctly. The 185.xx driver can run both cards and PhysX works 😉

You need Driver Sweeper to remove all the drivers before trying a new installation.

Never thought this thread would be that long, but now, for something different.....
Kreshna Aryaguna Nurzaman.

Reply 18 of 39, by obobskivich

User metadata
Rank l33t
Kreshna Aryaguna Nurzaman wrote:

Hmmm....interesting stuff.

amdlova wrote:

To install two graphics drivers, you need to install the nVidia driver before the AMD driver. nVidia blocks the installation when it sees an AMD card plugged into the system, if I remember correctly. The 185.xx driver can run both cards and PhysX works 😉

You need Driver Sweeper to remove all the drivers before trying a new installation.

That's not entirely accurate, but that doesn't mean I haven't seen very specific "step by step and no other way will work and your machine will explode in a fireball if you deviate" kinds of directions like this in the past. I've installed AMD first, nVidia first, and updated both "at once", with no problems in XP or 7. 7 will auto-install nVidia and ATi drivers via Windows Update upon detecting the hardware, unless you have that feature turned off (in which case it won't find and install drivers for any hardware). The ATi drivers on Windows Update, at least as of a year or so ago, don't include the entire CCC+HydraVision package, while nVidia will download the whole enchilada; if you go with ATi you may need to visit their website and grab CCC/HydraVision if you need those features (and yes, you can run HydraVision and nView side by side in XP and 7).

On XP you just install as usual: find whatever downloadable and load it. On neither XP nor 7 does the nVidia driver "block" any installation; I've confirmed this up through something like 293.xx with my 7950GX2 under XP and 307.83 with my 7900GS under 7, and both "directions" will also let SLI or CrossFire enable as normal. I have no way to test triple-GPU configurations, and the one time I tried adding a fifth GPU to my quad-SLI system, XP had a heart attack (refused to load a lot of drivers, stuff would crash, etc.; after I pulled the fifth GPU out it acted like nothing at all had happened), so I have not revisited that.

For extra giggles: Quadro FX and GeForce will install and work together (PNY and nVidia both claim this shouldn't work), including when the GeForce doesn't support CUDA and the Quadro FX does, and the drivers will also let you enable PhysX H/W processing on the Quadro in the process. Unfortunately I don't have anything that actually USES PhysX H/W ( 🤣), so I have no way to test performance or what-have-you. PhysX H/W will be blocked if you have nVidia + ATi, but that's the only limitation I'm aware of (and there are unofficial hacks to sidestep it). I don't know about nVidia + non-ATi combinations and PhysX; for example, whether a Matrox/S3/3DLabs card installed alongside your GeForce/Quadro would lock out PhysX (if I remember, and can come up with a way to test this, I will try to do so).

More info on PhysX:
http://physxinfo.com/wiki/Configuration_types

Reply 19 of 39, by Kreshna Aryaguna Nurzaman

User metadata
Rank l33t
obobskivich wrote:
Kreshna Aryaguna Nurzaman wrote:

Hmmm....interesting stuff.

amdlova wrote:

To install two graphics drivers, you need to install the nVidia driver before the AMD driver. nVidia blocks the installation when it sees an AMD card plugged into the system, if I remember correctly. The 185.xx driver can run both cards and PhysX works 😉

You need Driver Sweeper to remove all the drivers before trying a new installation.

That's not entirely accurate, but that doesn't mean I haven't seen very specific "step by step and no other way will work and your machine will explode in a fireball if you deviate" kinds of directions like this in the past. I've installed AMD first, nVidia first, and updated both "at once", with no problems in XP or 7. 7 will auto-install nVidia and ATi drivers via Windows Update upon detecting the hardware, unless you have that feature turned off (in which case it won't find and install drivers for any hardware). The ATi drivers on Windows Update, at least as of a year or so ago, don't include the entire CCC+HydraVision package, while nVidia will download the whole enchilada; if you go with ATi you may need to visit their website and grab CCC/HydraVision if you need those features (and yes, you can run HydraVision and nView side by side in XP and 7).

I see, interesting. Thanks! Anyway, you once had a system with both a Radeon and a GeForce in it, didn't you?

I don't plan to use a secondary monitor; I merely want to be able to switch between the Radeon IGP and the GeForce dedicated GPU from the BIOS.

Anyway, suppose your active video card (set through the BIOS) is the GeForce, and you already have ForceWare installed. Would you need to reboot and switch to the Radeon IGP first (through the BIOS), then install Catalyst? Or can you just install Catalyst while the nVidia is still the active card?

obobskivich wrote:

On XP you just install as usual: find whatever downloadable and load it. On neither XP nor 7 does the nVidia driver "block" any installation; I've confirmed this up through something like 293.xx with my 7950GX2 under XP and 307.83 with my 7900GS under 7, and both "directions" will also let SLI or CrossFire enable as normal. I have no way to test triple-GPU configurations, and the one time I tried adding a fifth GPU to my quad-SLI system, XP had a heart attack (refused to load a lot of drivers, stuff would crash, etc.; after I pulled the fifth GPU out it acted like nothing at all had happened), so I have not revisited that.

:shock:

obobskivich wrote:

For extra giggles: Quadro FX and GeForce will install and work together (PNY and nVidia both claim this shouldn't work), including when the GeForce doesn't support CUDA and the Quadro FX does, and the drivers will also let you enable PhysX H/W processing on the Quadro in the process. Unfortunately I don't have anything that actually USES PhysX H/W ( 🤣), so I have no way to test performance or what-have-you. PhysX H/W will be blocked if you have nVidia + ATi, but that's the only limitation I'm aware of (and there are unofficial hacks to sidestep it). I don't know about nVidia + non-ATi combinations and PhysX; for example, whether a Matrox/S3/3DLabs card installed alongside your GeForce/Quadro would lock out PhysX (if I remember, and can come up with a way to test this, I will try to do so).

More info on PhysX:
http://physxinfo.com/wiki/Configuration_types

Ah, I wouldn't use PhysX anyway, since a mini-ITX mobo only has a single PCIe slot.

Never thought this thread would be that long, but now, for something different.....
Kreshna Aryaguna Nurzaman.