RIP 32 bit PC gaming?


Reply 20 of 63, by Neurotic

Rank: Newbie
Darkman wrote:

Not surprising at all; XP is 13 years old, and applications/games moved on to 64-bit years ago.

I'm actually surprised XP lived that long; looking back, Microsoft tended to support its operating systems for around 6-8 years at best.

Interestingly, though, I do know some people who still use XP and feel no need to upgrade, so I guess for some people it still has life as a current OS.

The vast majority of games are still 32-bit. My Program Files folder contains just utilities; all the games are in the x86 folder.

Open your eyes, people. Most people use laptops with integrated graphics; 32-bit computing is not dead just because of some overpowered, expensive monster card that nobody but power users and obsessive gamers is gonna buy.

Reply 21 of 63, by obobskivich

Rank: l33t
archsan wrote:

Anyway, GM chips would be better than the previous gen simply because of their efficiency. Why limit yourself to GF6/7/8/9/200/400 etc., which consume more power for lower performance? The new GM chips could probably run older DX9 games at closer to idle load & temp (rather than the 100~200W of the older-gen cards). Or probably 50~75W tops, esp. if you downclock it. Why XP again? Well, to each their own (games/compatibility issues). For me, it's out-of-the-box DS/DS3D/EAX. And add 4K to the mix while you're at it. Why not?

The insanity-class (hundreds of watts) power consumption for nVidia really didn't come about until the GeForce 8+ era (and it didn't get colossally bad until Fermi). GeForce 6 and 7 are very power efficient and cool running (even the 7950GX2); unless you need some insane multi-head 4K 3D setup (which may not even work with an old game), the newer cards aren't of much advantage from an efficiency perspective for DX8/9. For example, the 7900GS has a 49W max TDP and is fully SM3.0 capable - it will do fine in a lot of games as long as you don't need 4K at 120 fps in Crysis or something. 🤣

Reply 22 of 63, by archsan

Rank: Oldbie

^I knew you were going to say that. 😉

I have a 7800GS and frankly I'd say it's in the middle of nowhere now (along with the 6/8/9 series etc.), considering there are alternatives. Yes, it will do FEAR, Quake 4, Prey, BioShock (eh... not exactly over 60fps there) at 1600x1200 just 'fine' (barely), some with AA, some without. You definitely won't run a typical GF6/7/8/9 card *without* a fan, though. Now look at the GTX750/Ti, which could do what the best of the 7/8 series could do with only... I have no numbers here, but maybe ~15W? And probably still at higher res/settings. Just a guess, but this should give you an idea. Fanless would be no problem. 7950GX2? No thanks. 😀 Also, say you dual boot with Win7/8 - what would you do with the older cards when you need more performance for newer games (yeah, Crysis included)? Overclock them? Turn down the res/settings? Ehh... no thanks. This is not 2006 anymore. 😀

"Any sufficiently advanced technology is indistinguishable from magic."—Arthur C. Clarke
"No way. Installing the drivers on these things always gives me a headache."—Guybrush Threepwood (on cutting-edge voodoo technology)

Reply 23 of 63, by obobskivich

Rank: l33t

My point wasn't that we should dump the GTX 980 into the ocean and live with 7900s and nothing newer, but that older cards up to around the GF7 era tend to be the most efficient option for running games from their eras, unless you're trying to crank up settings to what would've been unrealistic levels at the time (e.g. 4K at 120 FPS).

Also, as a side note, 15W is far too optimistic for the 750 - TPU shows 52W average gaming consumption. Just because the game is old doesn't mean the card magically stops consuming power; my GTX 660 shows similar power-draw figures running old titles like Portal as it does in newer titles like Skyrim (nothing but benchmarks will push the card to 100% at sustained max clocks - I'm assuming the 750 is similar based on the TPU #s); it just runs the old games at triple-digit FPS. It's doing more with the ~80W that it's drawing than my 7950GX2 could do with the same power, but it's not consuming less power. If that makes sense.
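To put the "doing more with the same power" point in numbers, here's a rough perf-per-watt sketch in Python; the frame rates are made-up placeholders (not benchmarks), and only the ~80W draw figure comes from the paragraph above:

# Rough sketch of the perf-per-watt idea: efficiency = work done / power drawn.
# The fps values are hypothetical placeholders, NOT measurements; the ~80 W
# figure is the load draw mentioned above for both cards.
cards = {
    "GTX 660 (hypothetical fps)": {"fps": 300, "watts": 80},
    "7950GX2 (hypothetical fps)": {"fps": 100, "watts": 80},
}

for name, d in cards.items():
    print(f"{name}: {d['fps'] / d['watts']:.2f} fps per watt")

# Same draw, more frames delivered => better efficiency, even though
# absolute consumption hasn't gone down.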

Reply 24 of 63, by archsan

Rank: Oldbie

As in the example I've already given, even at 1600x1200 I couldn't always 'max out' XP-era games (yes, I included BioShock 1 because of EAX) with, say, 4xAA and 16xAF and get a consistent 60+ fps (not even talking min FPS in certain scenes) with a 7800GS (comparable to a 6800, AFAIK). Is that not a realistic enough expectation? Not sure if the 7900GS is that much better. Now OTOH I'm not sure what your realistic levels look like, since you've only twice mentioned your "unrealistic" levels instead (4K, 120fps ... and Crysis).

And even *if* that (4K 60+ minfps blah blah) weren't realistic back at the time, so what? 😀

About "efficiency" -- if you actually read the link above, what's most interesting is the bit about setting the "target TDP/power limit", meaning, some underclocking *could be* involved (the article has the opposite objective of course). That's what I indicated in my previous posts. Now I was guessing that you could probably make it perform at 7800/7900GS level at about 15W, OK maybe a little bit more (unfortunately it seems to be limited to about 30W minimum according to the article). Also looks like the 750Ti's default is even less than your quoted number of 52W:

The interesting thing here however is that the default TDP limit for GTX 750 Ti is actually set to 38.5W inside the BIOS and the minimum of 78% you can go down to is equal to just 30W TDP
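Quick sanity check on that 30W figure - a minimal sketch using only the 38.5W default and the 78% floor quoted above (nothing measured on real hardware):

# Minimal sketch: derive the minimum power target from the quoted BIOS figures.
# Assumes the 38.5 W default TDP limit and the 78% floor reported in the article.
default_limit_w = 38.5      # GTX 750 Ti default power limit per the article
min_fraction = 0.78         # lowest power-target setting the article says is allowed
min_limit_w = default_limit_w * min_fraction

print(f"Minimum power target: {min_limit_w:.1f} W")   # prints ~30.0 W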

Now, even with that number, will you argue that 30W isn't a big enough efficiency gain vs. ~80W?

"Any sufficiently advanced technology is indistinguishable from magic."—Arthur C. Clarke
"No way. Installing the drivers on these things always gives me a headache."—Guybrush Threepwood (on cutting-edge voodoo technology)

Reply 25 of 63, by obobskivich

Rank: l33t
archsan wrote:

As in the example I've already given, even at 1600x1200 I couldn't always 'max out' XP-era games (yes, I included BioShock 1 because of EAX) with, say, 4xAA and 16xAF and get a consistent 60+ fps (not even talking min FPS in certain scenes) with a 7800GS (comparable to a 6800, AFAIK). Is that not a realistic enough expectation? Not sure if the 7900GS is that much better. Now OTOH I'm not sure what your realistic levels look like, since you've only twice mentioned your "unrealistic" levels instead (4K, 120fps ... and Crysis).

I never said "max out at highest possible settings" - I said run. And I also said games "from their era" - nobody is talking about maxing out top-end 2007-2008 games (some of which are DX10+) on hardware from 2005 except you. I've also said that if the goal is very high settings/resolutions, newer is better. Power consumption for GPUs started to balloon when DX10 hardware came out, and has stayed relatively high since - none of these modern cards (even the 750) are going to draw less power than a Voodoo3, GF4, GF7, etc. in the games that those cards can run. Newer is not *always* better.

I would also argue that "60+ FPS" is not a requirement of good gameplay, but interpretation of FPS #s is another discussion entirely. 😵

About "efficiency" -- if you actually read the link above, what's most interesting is the bit about setting the "target TDP/power limit", meaning, some underclocking *could be* involved (the article has the opposite objective of course). That's what I indicated in my previous posts. Now I was guessing that you could probably make it perform at 7800/7900GS level at about 15W, OK maybe a little bit more (unfortunately it seems to be limited to about 30W minimum according to the article). Also looks like the 750Ti's default is even less than your quoted number of 52W:

My "quoted number" of 52W is TPU's average measurement of the 750 in gaming (http://www.techpowerup.com/reviews/NVIDIA/GeF … _750_Ti/23.html). Your 10-15W claim is not accurate or expectable outside of idle state, and the thing you linked has nothing to do with gaming or gaming loads. The 750 (all Boost enabled GeForce cards actually) will dynamically adjust their clocks while running, and while I wouldn't expect 100% loading for an older title like BioShock, they aren't going to run it from idle either (and if they were, the performance wouldn't be very good - and remember, you want everything maxed out at very high resolutions, so you're going to be seeing higher loading as the card processes lots of AA/AF). They downclock pretty dramatically (as well as disabling on-die features) to get those 5-10W consumption figures sitting on the desktop. To be "roughly equivalent" to the 7800/7900 series you'd have to run them at between 50-60% of their rated clocks (their shader performance will still probably be higher, and they will still support DX10/11) with all features enabled, so power draw isn't going to be 5-10W there either.

Reply 26 of 63, by Yasashii

Rank: Member

One day Crysis 12 is going to come out and it will require a gigawatt power supply unit, 20 CPUs, 3 buckets full of DDR46 RAM, four of the latest GFX cards and a petabyte of drive space. If you're the kind of guy who wants to play Crysis 12, then yeah, 32 bits won't cut it for you.

As for me, I'm perfectly happy playing games which have the hardware requirement of having a computer that works, on my 32 bit OS. Thank you.

Reply 27 of 63, by obobskivich

Rank: l33t
Yasashii wrote:

One day Crysis 12 is going to come out and it will require a gigawatt power supply unit, 20 CPUs, 3 buckets full of DDR46 RAM, four of the latest GFX cards and a petabyte of drive space. If you're the kind of guy who wants to play Crysis 12, then yeah, 32 bits won't cut it for you.

I think you're being optimistic about the system requirements of Crysis 12. I heard 24 CPUs minimum. 🤣

Reply 28 of 63, by Neurotic

Rank: Newbie
Yasashii wrote:

One day Crysis 12 is going to come out and it will require a gigawatt power supply unit, 20 CPUs, 3 buckets full of DDR46 RAM, four of the latest GFX cards and a petabyte of drive space. If you're the kind of guy who wants to play Crysis 12, then yeah, 32 bits won't cut it for you.

As for me, I'm perfectly happy playing games which have the hardware requirement of having a computer that works, on my 32 bit OS. Thank you.

Exactly my sentiment. I've seen people who claim 6 GB of RAM is "not enough", or people who ask if 4 GB of RAM is enough to watch streaming video from the internet. Many people ask for a "gaming rig" when they really mean "a loud monster that plays Crysis 3 at 120 fps in 4K and renders a raytracing scene at the same time". Even a first-generation Core Duo with a half-decent video card and 2 GB of RAM will run a modern game far better than Quake ran in SVGA on the most OMFG POWERFUL GAMING RIG at the time of its release.

Besides, I really hate the term "gaming rig". They are freaking PCs, not any "rigs" or whatever.

Reply 29 of 63, by obobskivich

Rank: l33t
Neurotic wrote:

Exactly my sentiment. I've seen people who claim 6 GB of RAM is "not enough", or people who ask if 4 GB of RAM is enough to watch streaming video from the internet. Many people ask for a "gaming rig" when they really mean "a loud monster that plays Crysis 3 at 120 fps in 4K and renders a raytracing scene at the same time". Even a first-generation Core Duo with a half-decent video card and 2 GB of RAM will run a modern game far better than Quake ran in SVGA on the most OMFG POWERFUL GAMING RIG at the time of its release.

Besides, I really hate the term "gaming rig". They are freaking PCs, not any "rigs" or whatever.

6GB is "not enough" for what? Running a carrier-grade database server? 😵 🤣

Reply 30 of 63, by archsan

Rank: Oldbie

@obobskivich
Sure, that's why I often thought to myself, what the hell are we really arguing about? We have/had different expectations. 😀 Maybe I shouldn't play BS1 at 1600x1200 with the 7800GS, silly me...

none of these modern cards (even the 750) are going to draw less power than a Voodoo3, GF4, GF7, etc. in the games that those cards can run. Newer is not *always* better.

(Voodoos and GF4s aside)... Even the underclocked 750? 😀 Hmm, the numbers show there's potential for just that. Already better at default than the GF7950GX2 for sure, even the GF7900GTX. Even close enough to the GF7x00GS (and there's still the lower-clocked non-Ti model, or, cough cough, maybe an underclocked 750).
Point: maybe the turning point in power efficiency comes with this new Maxwell generation (and hopefully AMD's latest/next gen too)?

Fine, that 15W (don't lower that to 10W and make it sound worse, please 😉, because I didn't mention that) for GF7-like-performance thing: it was not a "claim", I've been outright saying that it was a "guess" (too optimistic? maybe), and yes, it was untested conjecture on my part, alright? 😀 Yet I wouldn't be surprised if it were (close to being) true, especially since there's an 8-year gap between them. Now I'm really gonna get one of the fanless models (and maybe... wait, how much does that Keithley multimeter thing cost... um, never mind). Thanks for firing me up anyway. 😉

OK, so thanks for that link, and here are the numbers: it idles at 4W, plays Blu-rays at 6W, [missing numbers: running 50% underclocked -- hmm, would 30W be so outrageous here, I wonder... ] averages 52W, peaks at 57W, maxes at 66W. How bad is that compared to the GTX660 (and the 7950GX2, 7900GTX), honestly? By default it's already close enough to the 7900GS (wish we had the numbers there as well), and again, who knows if some serious underclocking is involved...

"Any sufficiently advanced technology is indistinguishable from magic."—Arthur C. Clarke
"No way. Installing the drivers on these things always gives me a headache."—Guybrush Threepwood (on cutting-edge voodoo technology)

Reply 31 of 63, by obobskivich

Rank: l33t
archsan wrote:

@obobskivich
Sure, that's why I often thought to myself, what the hell are we really arguing about? We have/had different expectations. 😀 Maybe I shouldn't play BS1 at 1600x1200 with the 7800GS, silly me...

I don't honestly remember what my 7900GS did for that game, but I know it ran well (no lag). Wasn't 1600x1200 though. 😊

(Voodoos and GF4s aside)... Even the underclocked 750? 😀 Hmm, the numbers show there's potential for just that. Already better at default than the GF7950GX2 for sure, even the GF7900GTX. Even close enough to the GF7x00GS (and there's still the lower-clocked non-Ti model, or, cough cough, maybe an underclocked 750).
Point: maybe the turning point in power efficiency comes with this new Maxwell generation (and hopefully AMD's latest/next gen too)?

I think yes, Maxwell is a "turning point" for power draw on modern cards. And yes, the 750s are doing better than the 7950GX2 and GTX in terms of max draw. I don't have a GTX to play around with, but the GX2 isn't so awful at idle - the worst power draw I've seen out of mine is when running SLI-AA.

Fine, that 15W (don't lower that to 10W and make it sound worse, please 😉, because I didn't mention that) for GF7-like-performance thing: it was not a "claim", I've been outright saying that it was a "guess" (too optimistic? maybe), and yes, it was untested conjecture on my part, alright? 😀 Yet I wouldn't be surprised if it were (close to being) true, especially since there's an 8-year gap between them. Now I'm really gonna get one of the fanless models (and maybe... wait, how much does that Keithley multimeter thing cost... um, never mind). Thanks for firing me up anyway. 😉

I didn't mean to say 10W to make it sound worse, but to reference the actual measured #s for the idle state. 😊 Sorry for that not being clearer.

There *is* a long time gap, but remember that in that time gap we also saw power consumption explode for cards, and GF7 was hailed (when it was new) as being a "turning point" for power consumption after GeForce FX and 6.

OK, so thanks for that link, and here are the numbers: it idles at 4W, plays Blu-rays at 6W, [missing numbers: running 50% underclocked -- hmm, would 30W be so outrageous here, I wonder... ] averages 52W, peaks at 57W, maxes at 66W. How bad is that compared to the GTX660 (and the 7950GX2, 7900GTX), honestly? By default it's already close enough to the 7900GS (wish we had the numbers there as well), and again, who knows if some serious underclocking is involved...

GTX 660 is faster/more powerful, and should've been included on those TPU charts as well (if it isn't, dig around for another review - I know they've measured it). At idle the 660 is about the same (~10W figures); under load it can break 100W, but it performs better.

* Just looked and yeah it appears on the idle chart: 7W. On Blu-ray 10W, and on average 112W. [there is a difference between 660 and 660 Ti btw]

The 7950GX2 is ~120W, so about on par with the 660; I don't have any good #s I can lay hands on for idle. I don't honestly remember if my GX2s underclock at idle either. GTX I think is 70-80W (I don't have a GTX). The GX2 has an advantage in that it can do SLI-AA on-card, so you can have 16x MSAA if you're okay with the performance hit (it will perform about on par with a single 7950, which is still very usable in many games). AFAIK there's nothing out there that the GX2/GTX can run that the GS cannot though - they're too close in performance (and identical in features).

From the #s that I found for the GS, 30-40W isn't an unreasonable expectation in real-world gaming either. 😀

Reply 32 of 63, by archsan

Rank: Oldbie

I'm taking the liberty of carrying this discussion on, since the OP was talking about the GTX970/980 release. 😁

Should've shown this list sooner to get the point across:

rated max TDP numbers (watts)

GTX 750 Ti (GM107) . 60 vs. 57 peak / 66 max @TechPowerUp
GTX 750 (GM107) .... 55

GTX 980 (GM204) .. 165
GTX 970 (GM204) .. 145

GTX 780 Ti ........... 250 vs. 269 peak / 260 max @TPU

GTX 680 .............. 195 vs. 186 peak / 228 max @TPU
GTX 660 (non Ti) .. 140 vs. 118 peak / 138 max @TPU

GTX 580 .............. 244 vs. 229 peak / 326 max @TPU
GTX 480 .............. 250
GTX 470 .............. 215

[years...]

Geforce 7950 GX2 .. 143
Geforce 7900 GTX .. 120
Geforce 7900 GT .... 82
Geforce 7900 GS .... 49
Geforce 7800 GS .... 52

These are max numbers, not average/normal gaming. There may be some inconsistencies in Nvidia's ratings here (EDIT: added some TPU numbers -- ratings vs. tested), so take it with a grain of Hawaiian Bamboo Jade, um, sea salt. Anyway, I'm just glad they're starting to come back to their senses, so maybe within a decade ev'ryone can play Crysis 1 on integrated graphics/APUs The Way It's Meant To Be Played. 😁

Last edited by archsan on 2014-09-23, 21:44. Edited 1 time in total.

"Any sufficiently advanced technology is indistinguishable from magic."—Arthur C. Clarke
"No way. Installing the drivers on these things always gives me a headache."—Guybrush Threepwood (on cutting-edge voodoo technology)

Reply 33 of 63, by swaaye

Rank: l33t++

I'm not entirely sure what we're arguing about here. 😀

I will say that I don't want a GF7 for any modern games, because those cards have relatively poor texture filtering and anti-aliasing quality, which was remedied by the GF8. The ATI X1xxx series had superior quality at the time. An X1950XTX can't run BioShock at 1600x1200 at 60 fps though.

The only reason I worry about power consumption is that 200+ watt cards are difficult to cool quietly. You have to be smart about which cooler the card you buy comes with.

Reply 34 of 63, by JayCeeBee64

Rank: Retired
Yasashii wrote:

One day Crysis 12 is going to come out and it will require a gigawatt power supply unit, 20 CPUs, 3 buckets full of DDR46 RAM, four of the latest GFX cards and a petabyte of drive space. If you're the kind of guy who wants to play Crysis 12, then yeah, 32 bits won't cut it for you.

And when that happens I'll just do what I've done since 2005: shrug, look at nothing in particular and say "Thanks but no thanks"

As for me, I'm perfectly happy playing games which have the hardware requirement of having a computer that works, on my 32 bit OS. Thank you.

Agreed 100%. And it would take a real miracle for me to be interested in any current game being made today; until then, I'll continue to be a retro gamer 😈

Ooohh, the pain......

Reply 35 of 63, by GeorgeMan

Rank: Oldbie

The question really comes down to one thing: does the game run on XP? If yes, just get an Ivy Bridge with ANY recent graphics card and play all of your oldies completely MAXED out on your main PC.

I do this on my Sandy Bridge i5, with an HD7870 XT Tahiti LE w/Boost. Everything's fine and supported. 😀

This is of course a no-go for any card from 2005-2008. There are better and more suitable options out there.

1. Athlon XP 3200+ | ASUS A7V600 | Radeon 9500 @ Pro | SB Audigy 2 ZS | 80GB IDE, 500GB SSD IDE2Sata, 2x1TB HDDs | Win 98SE, XP, Vista
2. Pentium MMX 266| Qdi Titanium IIIB | Hercules graphics & Amber monitor | 1 + 10GB HDDs | DOS 6.22, Win 3.1, 95C

Reply 36 of 63, by archsan

Rank: Oldbie

^Good to know. I'd been reading bits of news that AMD was dropping XP support last year, but apparently that's not the case?

"Any sufficiently advanced technology is indistinguishable from magic."—Arthur C. Clarke
"No way. Installing the drivers on these things always gives me a headache."—Guybrush Threepwood (on cutting-edge voodoo technology)

Reply 37 of 63, by obobskivich

Rank: l33t
archsan wrote:

There may be some inconsistencies in Nvidia's ratings here (are they deliberately lowering the ratings for the GM chips? hmm), so take it with a grain of Hawaiian Bamboo Jade, um, sea salt. Anyway, I'm just glad they're starting to come back to their senses, so maybe within a decade ev'ryone can play Crysis 1 on integrated graphics/APUs The Way It's Meant To Be Played. 😁

I don't know if they're over- or under-rating today, but I remember they eventually came out and said they were over-rating the GeForce 6 and 7 numbers because of how bad power supplies tended to be at the time, and they wanted to ensure users had at least a half-decent power supply. 😵

Something else worth pointing out - DX9+ games will generally work in Vista/7 (I can't think of one that doesn't, offhand), so if you want a "mixed" gaming machine it can work pretty well, supporting games from ~2002 to the present, without any sort of multi-booting or anything like that. EAX would be the only caveat I can think of - there is ALchemy, but I've never had good luck with it (or EAX in general, for that matter).

Reply 38 of 63, by GeorgeMan

Rank: Oldbie
archsan wrote:

^Good to know. I'd been reading bits of news that AMD was dropping XP support last year, but apparently that's not the case?

Well, for R9 285, the newest AMD card in existence, there is no XP driver listed. 😉

1. Athlon XP 3200+ | ASUS A7V600 | Radeon 9500 @ Pro | SB Audigy 2 ZS | 80GB IDE, 500GB SSD IDE2Sata, 2x1TB HDDs | Win 98SE, XP, Vista
2. Pentium MMX 266| Qdi Titanium IIIB | Hercules graphics & Amber monitor | 1 + 10GB HDDs | DOS 6.22, Win 3.1, 95C

Reply 39 of 63, by swaaye

Rank: l33t++

AMD doesn't appear to be releasing up-to-date drivers for any cards on XP. The latest is the old 14.4 WHQL, which could very well be 2013 code. But hey, it's XP, so they can't exactly deliver the latest D3D11, Mantle, etc. tweaks.