VOGONS


Reply 40 of 93, by F2bnp

Rank l33t

No, it was a Gainward card actually, sorry about that.
I think you're being unfair; of course it struggled with Crysis. Everything struggled with Crysis, and even nowadays there are cards out there that can't provide a solid 60 fps in that game. I replayed it last year on a GTX 460 1GB and there were quite a few hiccups.
I used to game at 1280x1024 and jumping from the 7800GTX 256MB to the 4850 512MB was a huge leap. I believe I played at 1024x768 and Medium to High details on the 7800 GTX. Putting Shader Settings to High practically killed the framerate.
With the 4850, I played at 1280x1024 and I think I had everything turned to the maximum, perhaps with some settings on High. I believe I got solid 30fps most of the time. Sometimes 35-40 and some other times 25-30.
It certainly was much much better and the game looked simply amazing so I couldn't bitch about that hehehe.

All of that was on a Core 2 Duo E6550 (stock clocked at 2.33GHz, although I liked to clock it at 3.00GHz for some games, like GTA V), 2GB RAM and Windows XP (no DirectX 10 stuff 🙁 ).

I'm stumped about the newer drivers thing. I never really had much of a problem. It is a known fact that both companies' driver releases sometimes break stuff that used to work just fine. In general though, this is a small issue and it keeps happening to this day.

Reply 41 of 93, by obobskivich

Rank l33t
sliderider wrote:

8800 Ultras are down to the $75 range for used ones on eBay. It might be a good time to build a quad SLI system with the cards being so cheap. A quad SLI machine centered around these cards could probably still give good framerates in any recent game that still has a DX9 or DX10 renderer.

Yeah, I've been eyeing them at that kind of pricing - actually found a few lots of them too. I don't think they support quad SLI though - double or triple, IIRC, is where the 8800U tops out. I think quad was a later development - not that it would really matter; 3- and 4-way both provide the benefits of reduced/eliminated microstutter and cheap AA, while not usually providing tons of additional FPS. Even if 3-way was the top-end limit, that'd be pretty slick to have. Ignoring that it'd consume around 600W just for the graphics cards, would need a special motherboard, etc.

More seriously, I've just thought about getting a single card - performance-wise (at least on paper) they're right up there with the 4850/4870, and can probably hold their own in most recent DX9 games (which is about as new as anything I actually care to play). The biggest rub is that the GTX 280 costs about the same. Decisions, decisions...

Maximus: My 4890 (which is still in use, btw) had a heatsink similar to that MSI; it was quiet at idle, but super-duper loud when gaming (it rattled). I got a replacement cooler for it (from Arctic) and it "cured" that problem quite nicely. Other than that it's been a fantastic card.

Reply 42 of 93, by Skyscraper

Rank l33t

I do not think any Nvidia cards supported quad SLI with four separate cards until recently.

"Quad SLI" with dual GPU cards was first supported with the Geforce 7900 GX2.
The 8800 GTX / Ultra 768 mb did support tripple SLI while the rest of the G80 and G92 cards did not.

New PC: i9 12900K @5GHz all cores @1.2v. MSI PRO Z690-A. 32GB DDR4 3600 CL14. 3070Ti.
Old PC: Dual Xeon X5690@4.6GHz, EVGA SR-2, 48GB DDR3R@2000MHz, Intel X25-M. GTX 980ti.
Older PC: K6-3+ 400@600MHz, PC-Chips M577, 256MB SDRAM, AWE64, Voodoo Banshee.

Reply 43 of 93, by maximus

Rank Member
F2bnp wrote:

I think you're being unfair, of course it struggled with Crysis. Everything struggled with Crysis and even nowadays there are cards out there that can't provide solid 60fps on the game.

I guess you're right. Overall, the 4850's performance was great for its time. I think my framerates in Crysis were pretty similar to yours, and that was with Vista and a C2D E8400 overclocked to 3.6 GHz.

As for compatibility, I remember it being decent when I first got the card, and then getting worse and worse with each new driver release. Last time I tried to set the card up with XP, just about nothing pre-DX9 would run. (That was a year or two ago - things may have improved since then, but I doubt it.)

PCGames9505

Reply 44 of 93, by tincup

Rank Oldbie

Well, a bit more back on topic - my records show I paid $6 last fall for my P4 3.6/2MB/800/Socket 775/SL7Z5 Prescott. About middle of the pack price-wise and not exactly raiding the piggy bank 😀

Reply 45 of 93, by obobskivich

Rank l33t
maximus wrote:

As for compatibility, I remember it being decent when I first got the card, and then getting worse and worse with each new driver release. Last time I tried to set the card up with XP, just about nothing pre-DX9 would run. (That was a year or two ago - things may have improved since then, but I doubt it.)

I have had no problems with my 4890 or 4870X2 running a number of DirectX 7/8 games; the bigger compatibility issues in my case are related to Windows 7 (when the machine ran XP, none of those titles had issues). The X2 (any CrossFire setup, actually) has some minor issues with Skyrim (the menus flicker), and triple-GPU caused some stability issues with Mass Effect 2 (pulling either card made it run fine). FWIW.

Reply 46 of 93, by Mau1wurf1977

Rank l33t++

Personally, I think the Ultra or "best" cards will always command a premium. I like the "value" cards or cards "for the masses" like the GF4 Ti 4200, 6600GT, 7600GT, 9600GT. They are much cheaper and easier to find, and not far behind the top models. Who knows, maybe they also last longer, because the top cards might be running close to the bleeding edge of what was possible.

The insane power draw of these cards is also a concern.

I also had a HD4850 for Crysis 😀 I think it was a HIS.

PS: We are indeed off topic. Maybe start a new thread?

My website with reviews, demos, drivers, tutorials and more...
My YouTube channel

Reply 47 of 93, by badmojo

Rank l33t

I don't mind the off topic, this is all interesting stuff!

Regarding the "ultra" cards not lasting as long as their "for the masses" counterparts, it sounds plausible. My one and only experience with owning a top-of-the-line card was when I bought a 4870X2 for an embarrassing amount of money on its release. The reviewers agreed that it was the fastest consumer card on the planet at the time, but of course you needed every other component in your system to be top of the line to get the most out of it, and that thing ran so hot as to be unusable during the warmer months - it was stable, but it heated the room to beyond comfortable. That can't be good for it or the rest of the PC long term.

Life? Don't talk to me about life.

Reply 48 of 93, by Tetrium

Rank l33t++
Mau1wurf1977 wrote:

Personally the Ultra or "best" cards will always command a premium. I like the "value" cards or cards "for the masses" like the GF4 Ti 4200, 6600GT, 7600GT, 9600GT. They are much cheaper and easier to find and not far behind the top models. Who knows if they also last longer because the top cards might be running close to the bleeding edge of what was possible.

The insane power draw of these cards is also a concern.

I also had a HD4850 for Crysis 😀 I think it was a HIS.

PS: We are indeed off topic. Maybe start a new thread?

I find this very interesting though.
Just today I was looking up specs for how much heat older graphics cards produce, and it's very handy to know their max power draw.

It's easy to find figures for the more recent (PCI-E) cards, but finding power consumption for AGP cards turned into a search that lasted over an hour; in the end I found this. It's a Google search that turns up an older ASCII-table sort of thing with power consumption for cards up to around 2007-ish, and it also lists a newer version in two images.

I'm still kinda the guy who will prefer mid-range graphics cards for a variety of reasons. But after having used a GF 6800 AGP in one of my rigs, I found that card to be acceptable and to perform very well 😀

The modern high-end graphics cards, now those beasts need a lot of power! Today's mid-range cards consume more power than yesterday's high-end cards.
The good thing about this is that the heat issues that were a problem back then are less of a problem now. At least that's how I see it.

Whats missing in your collections?
My retro rigs (old topic)
Interesting Vogons threads (links to Vogonswiki)
Report spammers here!

Reply 49 of 93, by tincup

Rank Oldbie
badmojo wrote:

... it was stable, but it heated the room to beyond comfortable...

The "Polar Vortex" we've had in the north American realm this winter has meant good computing! Crack the window and it's better than liquid cooling 🤣..

Reply 50 of 93, by Unknown_K

Rank Oldbie
maximus wrote:
F2bnp wrote:

I was really impressed with some MSI (I believe) 4850 that had a very smart, single slot cooler, barely made any noise and was very very cool. I built a PC for a friend that used that card and we were both very impressed.

You mean this one?

[image: msi-radeon-hd-4850-img01-big.jpg]

I have one of those. That was my GPU for Crysis as well 😀

To be honest, though, I always had mixed feelings about the 4850. It struggled a bit with Crysis at max details, even at 1280x1024. And then newer drivers made it essentially useless for older games, even in XP.

That looks like it takes 2 slots to me.

I have a HIS 4850 that is a single slot.
[image: 200809091743099426.jpg]

Works OK and it's quiet, but runs pretty damn hot. I have another on the way to try CrossFire.

Collector of old computers, hardware, and software

Reply 51 of 93, by Mau1wurf1977

Rank l33t++
Tetrium wrote:

I find this very interesting though.
Just today I was looking up specs for how much heat older graphics cards produce, and it's very handy to know their max power draw.

It's easy to find figures for the more recent (PCI-E) cards, but finding power consumption for AGP cards turned into a search that lasted over an hour; in the end I found this. It's a Google search that turns up an older ASCII-table sort of thing with power consumption for cards up to around 2007-ish, and it also lists a newer version in two images.

I'm still kinda the guy who will prefer mid-range graphics cards for a variety of reasons. But after having used a GF 6800 AGP in one of my rigs, I found that card to be acceptable and to perform very well 😀

The modern high-end graphics cards, now those beasts need a lot of power! Today's mid-range cards consume more power than yesterday's high-end cards.
The good thing about this is that the heat issues that were a problem back then are less of a problem now. At least that's how I see it.

The easiest way is to just check the power plugs. The first cards had none, then floppy power, then Molex, then a PCIe 6-pin, then two, then 8-pin, 8+6 and 8+8...

I like to go after the "die shrink / refresh" cards, like the 7900GT. These are more mature products that consume less power and have a simpler design.
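As a rough rule of thumb, the plugs alone tell you roughly what power class a board was designed for. A minimal sketch, assuming only the PCIe spec ceilings (about 75W from the slot, 75W per 6-pin, 150W per 8-pin) and nothing measured from any specific card; the function name is just for illustration:

```python
# Rough upper bound on board power from the connectors alone, using the
# PCIe spec limits: ~75 W from the slot, 75 W per 6-pin, 150 W per 8-pin.
# Real cards usually draw noticeably less than this ceiling.
def max_board_power(six_pin=0, eight_pin=0, slot_watts=75):
    return slot_watts + six_pin * 75 + eight_pin * 150

print(max_board_power())                        # no plugs  -> ~75 W class (e.g. a 7600GT)
print(max_board_power(six_pin=1))               # one 6-pin -> ~150 W class (e.g. an HD 4850)
print(max_board_power(six_pin=2))               # two 6-pin -> ~225 W class
print(max_board_power(six_pin=1, eight_pin=1))  # 8+6       -> ~300 W class
```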

My website with reviews, demos, drivers, tutorials and more...
My YouTube channel

Reply 52 of 93, by Skyscraper

Rank l33t

I prefer the top cards.

Best cooling, all parts of the GPU enabled, often a wider memory bus plus more memory, and so on.
If you want to save power or decrease heat output, make a custom BIOS with lower voltage and, if necessary, lower clocks.

I ran an HD 4870 CrossFire setup that was almost silent. Those cards were great; you had 100% control over the GPU voltages with Rivatuner 😀.
No need for a custom BIOS! The Volterra VT1165 digital VRM controller was a great step forward 😀.
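To put rough numbers on what undervolting and underclocking buy you: dynamic power scales roughly with frequency times voltage squared, so even a modest voltage drop pays off disproportionately. A back-of-the-envelope sketch only; the 160W / 750MHz / 1.26V stock figures below are assumed example numbers, not measurements of any particular card:

```python
# Dynamic power scales roughly as P ~ f * V^2, so estimate the new draw
# after underclocking/undervolting relative to stock settings.
def scaled_power(p_stock_w, f_stock, v_stock, f_new, v_new):
    return p_stock_w * (f_new / f_stock) * (v_new / v_stock) ** 2

# Example: an assumed 160 W card at 750 MHz / 1.26 V dropped to 625 MHz / 1.10 V
print(round(scaled_power(160, 750, 1.26, 625, 1.10)))  # ~102 W, roughly a third less
```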

New PC: i9 12900K @5GHz all cores @1.2v. MSI PRO Z690-A. 32GB DDR4 3600 CL14. 3070Ti.
Old PC: Dual Xeon X5690@4.6GHz, EVGA SR-2, 48GB DDR3R@2000MHz, Intel X25-M. GTX 980ti.
Older PC: K6-3+ 400@600MHz, PC-Chips M577, 256MB SDRAM, AWE64, Voodoo Banshee.

Reply 53 of 93, by Tetrium

Rank l33t++
Skyscraper wrote:

I prefer the top cards.

Best cooling, all parts of the GPU enabled, often a wider memory bus plus more memory, and so on.
If you want to save power or decrease heat output, make a custom BIOS with lower voltage and, if necessary, lower clocks.

I ran an HD 4870 CrossFire setup that was almost silent. Those cards were great; you had 100% control over the GPU voltages with Rivatuner 😀.
No need for a custom BIOS! The Volterra VT1165 digital VRM controller was a great step forward 😀.

Actually, I'm considering exactly that: Underclocking a high-end card. If I really need more power, then I'll just use one of my faster rigs.
I'm always on the lookout for software that can also underclock (usually they only offer an overclock option, I think it was Rivatuner??).

Whats missing in your collections?
My retro rigs (old topic)
Interesting Vogons threads (links to Vogonswiki)
Report spammers here!

Reply 54 of 93, by Skyscraper

Rank l33t
Tetrium wrote:

Actually, I'm considering exactly that: Underclocking a high-end card. If I really need more power, then I'll just use one of my faster rigs.
I'm always on the lookout for software that can also underclock (usually they only offer an overclock option, I think it was Rivatuner??).

Rivatuner can underclock most cards.
If you are not happy with the speed range available, I think you can just edit the config file.
Setting voltages with Rivatuner was something you did at the DOS prompt; you were basically inputting commands directly to the VRM controller manually.
I am sure there are still lots of guides with more detailed instructions out there somewhere 😀.

MSI Afterburner can control the voltage without the DOS prompt commands, but I do not think it supports all the old cards and Windows versions that Rivatuner supported.

New PC: i9 12900K @5GHz all cores @1.2v. MSI PRO Z690-A. 32GB DDR4 3600 CL14. 3070Ti.
Old PC: Dual Xeon X5690@4.6GHz, EVGA SR-2, 48GB DDR3R@2000MHz, Intel X25-M. GTX 980ti.
Older PC: K6-3+ 400@600MHz, PC-Chips M577, 256MB SDRAM, AWE64, Voodoo Banshee.

Reply 55 of 93, by Tetrium

Rank l33t++
Skyscraper wrote:
Tetrium wrote:

Actually, I'm considering exactly that: Underclocking a high-end card. If I really need more power, then I'll just use one of my faster rigs.
I'm always on the lookout for software that can also underclock (usually they only offer an overclock option, I think it was Rivatuner??).

Rivatuner can underclock most cards.
If you are not happy with the speed range available, I think you can just edit the config file.
Setting voltages with Rivatuner was something you did at the DOS prompt; you were basically inputting commands directly to the VRM controller manually.
I am sure there are still lots of guides with more detailed instructions out there somewhere 😀.

MSI Afterburner can control the voltage without the DOS prompt commands, but I do not think it supports all the old cards and Windows versions that Rivatuner supported.

Cheers! I'm already looking into this and found heaps of information 😁

Whats missing in your collections?
My retro rigs (old topic)
Interesting Vogons threads (links to Vogonswiki)
Report spammers here!

Reply 56 of 93, by obobskivich

Rank l33t
Mau1wurf1977 wrote:

The easiest way is to just check the power plugs. The first cards had none, then floppy power, then Molex, then a PCIe 6-pin, then two, then 8-pin, 8+6 and 8+8...

I like to go after the "die shrink / refresh" cards, like the 7900GT. These are more mature products that consume less power and have a simpler design.

Indeed, to both points. Keep in mind that some "premium" boards over-spec their power components and plugs despite not drawing any extra power. I can think of many examples where the refresh cards are consistently better - the best example is probably the Radeon HD 3870 versus the HD 2900 XT.

On power consumption - while it has gotten pretty high in recent years, top-end cards have generally plateaued around 200-250W, while performance still increases. For example, the Quadro K6000 (which is an absolute monster) has a TDP spec of around 225W, not much higher than the 8800 Ultra, despite the K6000 having some four times the 8800U's processing power (and more like 12x the memory). As far as "mid range" cards go, you have to be careful with those these days, because it's becoming more common for last generation's "high end" to turn into this generation's "mid range" - for example the nVidia 9800GTX becoming the GTS 250, or the GTX 680 becoming the GTX 770. As a result, power consumption doesn't drop very significantly from the very top-end board to the "mid range" part.

To use those two nVidia examples:
The 9800GTX has a TDP of around 140W, and a fill-rate of around 10 GP/s.
The GTS 250 is relatively similar (150W TDP and 11 GP/s).

By contrast, the top card from the 200 series, the GTX 285 (which is itself a refresh part), doubles that fill rate while only increasing TDP to 204W. The power:performance ratio is a lot better for the 285.

The GTX 680 has a 190W TDP and around 30 GP/s fill rate.
The GTX 770 is relatively similar (33 GP/s, but TDP is up to 235W), while the GTX 780 Ti offers over 40 GP/s at 250W.

In both cases, the top card offers considerable performance increases (at least on paper), with minimal additional power consumption requirements. Of course the top cards also cost more, but in terms of "power efficiency" they do come out ahead given their processing power.
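Putting those numbers side by side makes the point concrete. A quick fill-rate-per-watt tally, using only the approximate figures quoted above (with the GTX 285 taken as roughly double the 9800GTX's ~10 GP/s), just using Python as a calculator:

```python
# Fill rate per watt, using the approximate TDP and fill-rate figures quoted above.
cards = {
    "9800 GTX":   (10, 140),   # (GP/s, TDP in watts)
    "GTS 250":    (11, 150),
    "GTX 285":    (20, 204),
    "GTX 680":    (30, 190),
    "GTX 770":    (33, 235),
    "GTX 780 Ti": (40, 250),
}
for name, (gps, tdp) in cards.items():
    print(f"{name:10} {gps / tdp:.3f} GP/s per watt")
```

The mid-range repackages (GTS 250, GTX 770) barely move the ratio over the parts they replace, while the top refresh parts (GTX 285, GTX 780 Ti) come out ahead.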

With older cards it tends to be more straightforward: mid-range cards usually aren't repackaged "high end" chips from the previous era - the GeForce FX 5600, for example, isn't a re-packaged Ti 4600 - so they can usually reap power/efficiency benefits from newer architectures, better manufacturing tech, etc. I think GeForce 7 was probably the last generation where manufacturers genuinely worried about power efficiency; from ATi it would probably have been the Radeon X series (like the X800). Once "they" got people used to the idea of dual-slot leaf blowers as "normal coolers" and PCIe came along, power consumption seems to have gone right out the window as a concern, and 200W became pretty normal (along with 60-80°C idle temps and load temps that can sometimes be high enough to boil water).

On the note of the 4870X2 specifically: mine lasted around 5 years, burning away at 70°C or higher. Last fall it started crashing/rebooting the system when tasked with a demanding DX9/DX10 game (like Skyrim). I'm not sure what the problem actually was; I removed it and replaced it with the 4890 and everything works (didn't even have to swap drivers 😁), so I just put it up on the shelf and left it alone. 😊

Reply 57 of 93, by Mau1wurf1977

Rank l33t++

I've been keeping a lookout for some decent P4 gear on eBay and I must say it isn't THAT cheap (yet?). Definitely not rock bottom, especially the "good stuff".

I see a lot of 865 chipset boards, but hardly anything with the 875. Lots of OEM stuff, but not many top (Deluxe or similar) boards.

There are also a lot of interesting value boards with chipsets from SiS and VIA, but I prefer Intel. I had an 865 chipset board from Asus, a Northwood 2.6 GHz and a Radeon 9800 around 2004-2005, and that era was quite interesting.

My website with reviews, demos, drivers, tutorials and more...
My YouTube channel

Reply 58 of 93, by vetz

Rank l33t
F2bnp wrote:

As for the 8800GTX, I always drooled for one of these. The guy that sold me the Musashi gave me a dead one a few months ago and I have it on display. It's really cool to just look at it and remember all those awesome memories.

I also have fond memories of the 8800GTX. It was a game changer in performance when it came out (similar to the ATI 9700 PRO). I bought it along with everything else in my brand new quad-core LGA775 system back in 2007. The 8800GTX died in 2010, but the LGA775 system soldiers on, with some heavy upgrades, as my main machine.

3D Accelerated Games List (Proprietary APIs - No 3DFX/Direct3D)
3D Acceleration Comparison Episodes

Reply 59 of 93, by Mau1wurf1977

Rank l33t++

I hope it's OK to post here about P4-related stuff.

I asked my neighbour, friend and fellow retro computer guy if he had any S478 boards.

I borrowed the following and will have a play with them:

http://i.imgur.com/e5Zm4n7.jpg

http://i.imgur.com/7Bg595U.jpg

http://i.imgur.com/v1qwEgx.jpg

My website with reviews, demos, drivers, tutorials and more...
My YouTube channel