VOGONS


First post, by tincup

User metadata
Rank Oldbie

Ideally you want a CPU/GPU pairing that doesn't waste the potential of either part. This is especially important when budget is a constraint, since it makes no sense to overspend on one part only to have it bottlenecked by the other. Balance is also a virtue from a more theoretical or aesthetic standpoint, and in retro builds that is likely the more dominant impulse.

But other than actual benchmarking, are there any down-and-dirty, back-of-the-envelope rules of thumb? Given that games and applications vary considerably in their demands on either part, it could only be a broad-stroke estimate, but I'm curious.

Looking back at my systems where I had a gut feeling that they were fairly well balanced [in gaming], I note a rough ratio of CPU megahertz [× cores if more than one] to graphics card memory in MB on the order of 8:1 to about 16:1:

P200 Voodoo 2/12mb = 16:1
P1000 Voodoo 5/64mb = 16:1
P2400 ATI 9700pro/128mb = 18:1
AMD2400 2x256 SLI = 4:1 [this GPU-heavy rig ran 2005-10 with only minor upgrades]
AMD2400[2core] 2x256 SLI = 9:1 [above rig with faster cpu; a more typical ratio]
AMD4000[4core] 2x1gb CF = 8:1 [555BE unlocked cores/OC @ 4.05]

Oddly the P4-2400/9700 Pro combo felt stout at the time and was a good W98 performer to the bitter end [2005], but it's the "lightest" in ratio terms, and it bucks the general trend of narrowing ratios.

An earlier build, a P200 with a Voodoo 1/4mb = 50:1 - very "light". The ratio conveys nothing of the awe of my introduction to 3D, but perhaps hints at why there was such a thirst for bigger and bigger video cards.
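The back-of-the-envelope ratio above is just arithmetic; a tiny Python sketch (purely illustrative, using the figures from this post) makes it explicit:

```python
# Rough "balance ratio" from the post above: CPU MHz (x cores) : VRAM MB.
# This just formalizes the arithmetic; it is not a real benchmark.

def balance_ratio(cpu_mhz, vram_mb, cores=1):
    """Return the rough CPU-to-VRAM ratio used as a gut-check."""
    return round(cpu_mhz * cores / vram_mb)

# (name, CPU MHz, total VRAM MB, cores) - builds listed in the post
builds = [
    ("P200 + Voodoo 2 12MB",       200,   12, 1),
    ("P1000 + Voodoo 5 64MB",     1000,   64, 1),
    ("P4-2400 + 9700 Pro 128MB",  2400,  128, 1),
    ("AMD 2400 + 2x256MB SLI",    2400,  512, 1),
    ("AMD 2400 x2 + 2x256MB SLI", 2400,  512, 2),
    ("P200 + Voodoo 1 4MB",        200,    4, 1),
]

for name, mhz, vram, cores in builds:
    print(f"{name}: {balance_ratio(mhz, vram, cores)}:1")
```

Rounding will land within one of the eyeballed figures in the post, which is about as much precision as a rule of thumb deserves.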

The first AMD rig was "all-in" in GPU terms; the idea had been to buy as much graphics power as I could afford, sacrificing CPU to meet budget. The tactic bought me five years with little in the way of significant upgrades. For all I know, though, the CPU may have bottlenecked the SLI, but it didn't feel that way, since a late-game CPU upgrade didn't have a drastic effect on game performance.

What made me think of this was a Voodoo 2/12mb SLI box I'm fiddling with. Of the various slot 1 CPUs I have on hand, which might be best suited, without resorting to bench testing? Throwing out the highest/lowest results of my quick survey, the numbers suggest a CPU in the P200-P400 range would hit the sweet spot. Hmmm...

This exercise is based purely on gut feeling, and I'd be interested to know what people who do benchmarks and take number crunching seriously think about the idea of a "rule of thumb".

Reply 1 of 33, by sunaiac

User metadata
Rank Oldbie

I think you're totally over engineering the thing 😀 (and really, don't look at GPU VRAM, because a GT430 with 2GB is still a very bad card)
I also think it's nearly impossible to determine a true bottleneck (except obvious stuff like Voodoo 1 on core i 7 ...) because each game engine is different, and optimized for some specific configuration. So the ground rule is to see what's best for what you will actually play.

But if I tried some rule, I'd say take a GPU and a CPU in the same price range and contemporary to each other.
You have three main categories at each generation, low, middle and high end, with relative power rising kind of linearly.
For example, right now you'll have the HD7770 (150€), the HD7870 (300€) and the HD7970 (450€). On the CPU side, you'll have the Core i3 (150€, 2 cores 3GHz), the Core i7 LGA1155 (300€, 4 cores 3GHz) and the Core i7 LGA2011 (450€, 6 cores 3GHz). The good rule is to match the prices.

Because you can bet developers do/did that.

PS: my example is not perfect, obviously; games are not really able to take advantage of more than 4 cores. That's the second rule: there are always generations where CPUs can't keep up, and others where GPUs can't. Right now, GPUs are the bottleneck, because 6 cores can't be used properly. When the 9800 Pro through X1950XTX era came, CPUs were the bottleneck, with crappy P4s.
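The "match the prices" rule is mechanical enough to sketch in a few lines of Python. The prices are the rough 2012 figures quoted in this post; the pairing logic itself is my own toy illustration:

```python
# Toy sketch of the price-matching rule: pair each GPU with the CPU
# whose launch price is closest. Prices (in euros) are the rough 2012
# figures from the post above - illustrative only.

gpus = {"HD 7770": 150, "HD 7870": 300, "HD 7970": 450}
cpus = {"Core i3": 150, "Core i7 LGA1155": 300, "Core i7 LGA2011": 450}

def closest_cpu(gpu_price):
    """Return the CPU whose price is nearest to the given GPU price."""
    return min(cpus, key=lambda name: abs(cpus[name] - gpu_price))

for gpu, price in gpus.items():
    print(f"{gpu} ({price}€) -> {closest_cpu(price)}")
```

With tiers this cleanly spaced the matching is trivial, which is rather the point of the rule: the market segments do the balancing for you.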

R9 3900X/X470 Taichi/32GB 3600CL15/5700XT AE/Marantz PM7005
i7 980X/R9 290X/X-Fi titanium | FX-57/X1950XTX/Audigy 2ZS
Athlon 1000T Slot A/GeForce 3/AWE64G | K5 PR 200/ET6000/AWE32
Ppro 200 1M/Voodoo 3 2000/AWE 32 | iDX4 100/S3 864 VLB/SB16

Reply 2 of 33, by swaaye

User metadata
Rank l33t++

The guideline for retro gaming is more about matching GPU to the games you want to play. Voodoo cards / Glide (old games tend to look/run best with Glide). NVIDIA / OpenGL. That sort of thing.

The concern when picking a CPU is mainly to not pick one that is so slow that it is a serious bottleneck for the game engine.

Reply 3 of 33, by tincup

User metadata
Rank Oldbie

Admittedly I presented a crude metric, and I am fully aware that different games tax systems in different ways, but the question of a "rule of thumb" intrigues me. Using a contemporary price-range comparison is an interesting suggestion - and MSRPs would be fairly easy to establish. But I was/am looking for a simple formula based on broad-brush "box top" info. Not something that is 100% accurate - as noted, that's not possible given all the hardware/software parameters in play - but a metric that might safely exclude a part either too slow or too fast to be considered an ideal mate.

The question of specific gaming needs is not the deciding factor here, just whether a part is a *reasonable* match. The technical attributes of the GPU or CPU would be a second filter with which to narrow the field given one's gaming requirements, and in this scenario games would be installed to fit the system, not vice versa, as is generally the case.

Reply 4 of 33, by Mau1wurf1977

User metadata
Rank l33t++

Interesting thread 😀 Many here will also match up devices in a "period correct" way.

For example when the Core 2 Duo came out, suddenly all the graphics cards got a huge speed boost as only now it was apparent how much the Athlon 64 was holding back the cards.

Despite this you will find most people NOT picking a Core 2 Duo but going with a Pentium 4 or Athlon 64 system instead.

For me there are certain cards that I consider emblematic of certain periods. For example the Ti4200 for all the DX8 games. Then there was the Radeon 9800 Pro, which caused huge issues for Nvidia. The 6800GT, a beast of a card for its time. The 8800GT, bringing Ti4200-like value back to the masses. The Radeon 4850, a real single-slot powerhouse. And there are many more like them...

From all my computing experience, when it comes to gaming, always spend most of your money on the video card. This advice still holds true. It's much better to get a GeForce 560 over a 550 than it is to get an i7 over an i5.

My website with reviews, demos, drivers, tutorials and more...
My YouTube channel

Reply 5 of 33, by tincup

User metadata
Rank Oldbie

When I decided to build a new machine to replace my old Athlon 64 4000+ / 7800GT SLI workhorse in 2010, my criteria were very specific. I needed to increase the FPS in rFactor [max settings @ 1920x1200, full field, race start at Las Vegas Street] from 12 to 60, a five-fold increase. So I set out to determine what combination of CPU/GPU would give me at least a five-fold performance increase over the current rig. Cost was an issue, so I had to grind it out.

I consulted Tom's Hardware's CPU and GPU hierarchy charts, gradually penciling in benchmark results to get an idea of how much actual boost moving up the increments of each chart yielded. This took quite some time. Then it was necessary to estimate the relative contribution of the two parts, and again, trying to isolate the effects purely from published benchmarks took some doing. But in the end I think I had a pretty good abstracted picture of what % increase to expect from parts chosen according to the hierarchy charts.
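The chart-walking step can be sketched like this. Note the relative scores below are invented placeholders to show the shape of the method, not real Tom's Hardware numbers:

```python
# Sketch of the method described above: given a target FPS uplift,
# walk a hierarchy chart of relative-performance scores until the
# cumulative gain over the current part meets the target.
# Scores here are made-up placeholders, NOT real benchmark data.

target_speedup = 60 / 12  # need 12 fps -> 60 fps, i.e. 5x

# (name, relative performance score), ordered slowest to fastest;
# the 7800GT is the baseline card being replaced
gpu_chart = [
    ("7800GT", 1.0),
    ("8800GT", 2.1),
    ("HD 4850", 2.6),
    ("HD 5770", 3.0),
    ("HD 5870", 5.2),
]

baseline = gpu_chart[0][1]

def first_meeting_target(chart, base, target):
    """Return the first chart entry whose score/base >= target, or None."""
    for name, score in chart:
        if score / base >= target:
            return name
    return None

print(first_meeting_target(gpu_chart, baseline, target_speedup))
```

In practice, as the post describes, the GPU and CPU contributions have to be estimated separately and combined, so a single-chart walk like this is only half the job.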

The "solution" was a combo consisting of an AMD Phenom II 555BE and a 1GB HD 5770, and this hit the rFactor target practically dead-on. Later, overclocking to 4GHz and adding a second GPU also fell in line. God, I only hope I saved the paperwork!

Now the issue is a pleasant stroll down Retro Lane. Parts are in the box, money no issue. What's a good slot 1 CPU to match with a 4mb Rendition card + two Voodoo 2s in SLI? I could test all the ones I have [probably using Hind or ICR2 Rendition as the benchmarks], or I could try to concoct a 20-second synthetic approach, which, as I discovered in the first pass, indicates something in the P233-400 range. Right now it's sitting on an AN430TX/P200mmx setup, but by swapping out mobos, slot 1s are an option. Up until now my prime candidates were a venerable C300A or a nice P3-533 Coppermine, both of which offer some good alternative settings... this all just got me thinking...

Last edited by tincup on 2012-07-14, 14:35. Edited 1 time in total.

Reply 6 of 33, by [GPUT]Carsten

User metadata
Rank Newbie

For modern-day apps I go the GPU route, because with almost any game you can invest surplus GPU power in higher image quality - which is a thing I do fancy.

Back in the day, all-maxed-out gaming was not as common as it is nowadays, I think. I often had to dial down an option or two, mainly resolution-wise (having had at least 1600x1200-capable (used) displays since 1998). So personally I would always tend to have graphics hardware one generation more advanced than the CPU part - with the CPU part determining what kind of games I want to play on that particular rig, and the extra graphics horsepower used to max out as many settings as possible.

For example:
P233 MMX w/Voodoo Graphics and its generation
K6-3+ 400 w/Voodoo 2 SLI, TNT2-class graphics
P3-800+ w/Geforce or Radeon 1st gen
AthlonXP 3000+ for any (reasonable) thing faster that is still AGP, except maybe an HD 4670 AGP, which could profit from a fast Athlon64 or Phenom II or even Core 2 on an AGP platform.

Reply 7 of 33, by Putas

User metadata
Rank Oldbie

I tend to spend some 30-50 percent more on the graphics card than on the CPU, but I also probably overshoot the graphics a bit, at least compared with average custom builds. I like to upgrade one part at a time, picking good deals, without much concern for the balance of the particular setup.

Reply 8 of 33, by tincup

User metadata
Rank Oldbie

Yes, back in the day maxed-out gaming systems seemed less common. Games vs hardware were in constant flux and tension. It seemed that hardware was coming out every few months that raised the bar, and games took advantage of every extra ounce - the improvements in gaming over very short periods of time were tangible, noticeable. Now it's totally possible to sit on a system for a few years and not miss out so much - the occasional carefully considered upgrade and you are okay. You may miss out on a few bells and whistles, but overall you are still gaming at high res/max. You didn't have that luxury in the 90s.

My quick MHz/VRAM comparison bears this out, as the ratio steadily trends towards parity and the time between the data points stretches out.

I too tend to go the GPU first route and plug in faster and faster CPUs as the prices drop.

Reply 9 of 33, by Jorpho

User metadata
Rank l33t++
tincup wrote:

Ideally you want a cpu/gpu pairing that doesn't waste the potential of either part. This is especially important when budget is a constraint since it makes no sense to overspend on one part only to have it bottlenecked by the other.

But is budget really a constraint these days? Unless you're going for exotic parts (like an AGP GeForce 6800 or something), it seems to me price is not much of a consideration - and if you're going for exotic parts, you're more likely to run into obscure compatibility problems no one ever bothered to solve.

Reply 10 of 33, by kool kitty89

User metadata
Rank Member
[GPUT]Carsten wrote:

For modern day apps I go the GPU-route, because with almost any game you can invest surplus in GPU power in higher image quality - which is a thing I do fancy.

Back in the day, all-maxxed out gaming was not so common as it is nowadays, I think. I often had to dial down an option or two, mainly resolution wise (having had at least 16x12-capable (used) displays since 1998). So personally I would always tend to have graphics hardware one generation more advanced than the CPU-part - with the CPU-part determining what kind of games I want to play on that particular rig and the extra graphics horse power to max out as many settings as possible.

For a great number of old games, it was physically impossible to max things out (i.e. max - or even decent - framerates at max detail settings) when the games were new. The hardware simply wasn't available for that to be possible until those games were several years old already.

Games like Duke Nukem 3D, Quake, Tomb Raider II, etc. had ridiculously high resolution options for the time. They may have run fine at max detail at lower resolutions (with a good GPU, for accelerated modes), but the high-res stuff would be another story, or truecolor for that matter, since few fast GPUs supported 32-bit rendering early on. (ViRGE and RAGE did 24/32-bit 3D, but weren't nearly fast enough to work well beyond low resolutions, and Riva and Voodoo only did 16-bit color 3D.)

And for software rendering - with high-res Build games, Quake, or even Tomb Raider 1 for that matter - you wouldn't be getting good framerates at max detail either (even though TR was limited to 640x480 max).

Reply 11 of 33, by Putas

User metadata
Rank Oldbie
kool kitty89 wrote:

even though TR was limited to 640x480 max

No it wasn't.
Resolutions are a special case - it can't be said we can max them out even today. If games read the registry and offered any resolution the machine supports, then a decade from now we could play today's games at resolutions our current graphics cards can't dream of.

Reply 13 of 33, by PhaytalError

User metadata
Rank Member

I've been pondering this same question about my "Classic" Gaming System [its specs are in my signature]...

It's a Pentium III 700MHz Coppermine Slot-1, 512mb RAM, and a Voodoo 3 3500TV 16mb AGP card... what would be the bottleneck in that, or is it pretty well balanced? The reason I'm asking is I'm noticing framerate drops in some games, such as Deus Ex at 1024x768 resolution, which I suspect is a CPU bottleneck; however, the system requirements of the game scream out "no", as in it's definitely not a CPU bottleneck.

Would a Voodoo 5 5500 AGP be more suitable, or would that be "overkill"? I'm looking at possibly upgrading the CPU to a Pentium III 1.4GHz Tualatin in the near future [possibly six months to a year from now].

That PC is not just for Windows gaming but also DOS gaming, so I prefer 3dfx cards because they have amazing DOS compatibility in both VGA/SVGA and VESA.

DOS Gaming System: MS-DOS, AMD K6-III+ 400/ATZ@600Mhz, ASUS P5A v1.04 Motherboard, 32 MB RAM, 17" CRT monitor, Diamond Stealth 64 3000 4mb PCI, SB16 [CT1770], Roland MT-32 & Roland SC-55, 40GB Hard Drive, 3.5" Floppy Drive.

Reply 14 of 33, by Mau1wurf1977

User metadata
Rank l33t++

I don't mind having an overkill system, to be honest. A constant 60 frames in every situation is what I like. Ideally more, but now with LCD displays it's not as easy as it was back when we had CRTs.

Games like Doom 3 played at Full HD could still drop frames on a modern 8800GT and Core 2 Duo. Far Cry with HDR enabled at Full HD also is quite demanding.

The good thing is that a) these games run under XP, Vista and W7, and b) they are available online for little money.

It's the W98 games (especially the early ones), like System Shock 2, that are very hard to get going on a modern machine. For this reason I see a decent W98 machine as quite important, and in a few years I expect the whole vintage scene to move on to this era (basically when the next generation comes along).

So these P3 machines will get quite rare pretty soon!

We really have to thank that XP has been around for so long and that DX8 and 9 were also around for a very long time. XP gave us almost 10 years of gaming compatibility. There are always exceptions, but most games that ran on XP in the early days will run fine on a modern machine.

Aspect Ratio isn't an issue either as all the video cards have good scaling options. Many games can be tweaked to run in widescreen, though usually the HUD is stretched which some might not like.

Get an X-Fi card, some analogue 5.1 speakers or good headphones with CMSS-3D and you have many years worth of the finest PC games. After that it all went towards console ports.

Reply 15 of 33, by tincup

User metadata
Rank Oldbie

@mole totally right about scaling and the longevity of XP - I still run XP, for example. But in retro terms we have a choice - what systems to build, what cards/boards/CPUs to focus on - it's not so much about running a particular game to the max. Deus Ex? Half-Life 1? I think we all know what box would do the trick. But what about builds that simply try to reflect a balance of parts, and only install apps and games that run smoothly on that system without significantly overstressing either the CPU or GPU? That's my point here... not to "nail" a particular game, which we all know we can do...

Reply 16 of 33, by Mau1wurf1977

User metadata
Rank l33t++

A cheap trick to avoid "overstressing" the GPU is to enable Vsync. If you have a very powerful GPU and play old games at 300+ frames, enabling Vsync will actually reduce the load.

If I was building an XP machine I would go with popular choices to avoid issues with drivers.

The 7 series from GeForce was quite strong during the XP days. The 8 series started to go towards DX10, so it might be too new.

A 7900GT or 7900GS would likely be my choice. The top model, the 7900GTX, will likely be hard to find.

The 8800GT would be the other choice, though its drivers likely won't go back far enough to avoid some weird driver issue with a game.

My website with reviews, demos, drivers, tutorials and more...
My YouTube channel

Reply 17 of 33, by sliderider

User metadata
Rank l33t++
tincup wrote:

@mole totally right about scaling and the longevity of XP - I still run XP, for example. But in retro terms we have a choice - what systems to build, what cards/boards/CPUs to focus on - it's not so much about running a particular game to the max. Deus Ex? Half-Life 1? I think we all know what box would do the trick. But what about builds that simply try to reflect a balance of parts, and only install apps and games that run smoothly on that system without significantly overstressing either the CPU or GPU? That's my point here... not to "nail" a particular game, which we all know we can do...

We can thank Microsoft and the horribleness that is Vista for the longevity of XP. The slow adoption of Vista, due to its flaws and bloat, forced MS to extend XP support because of the number of people who were ordering their new machines with XP until 7 came out.

Reply 19 of 33, by tincup

User metadata
Rank Oldbie

After looking at my own "numbers" I'm up-gunning the cpu on my Legacy-1 box from a P200mmx to 233. W95osr2+/intel AN430TX/Stealth S220-4mb [Rendition]/2x Voodoo2-12mb SLI/Vibra16S [XR385 daughterboard on order]/64mbPC100/Thrustmaster ACM gamecard.