VOGONS


Time for a graphics card change...


Reply 40 of 130, by PowerPie5000

User metadata
Rank Oldbie
swaaye wrote:

Yeah, I said you can run anything from GeForce FX and older on AGP 3.3V, like a 440BX. I've played Doom 3 on a 440BX with an FX 5900 Ultra.

GeForce 3 is definitely better than GeForce 2 for image quality. GeForce FX is better than GeForce 3/4.

Power consumption isn't really something to worry about. If you want really low power consumption for some reason, look for cards with little fanless heatsinks or bare chips (some MX cards).

I have an MX card here (Inno3D GF2 MX400) and the overall image quality is terrible compared to my Matrox G400 Max or even my Voodoo 3 3000... It's definitely a bit faster, though. I think anything above a GeForce 3 is overkill for my setup.

I've heard (or should I say read) stories about old AGP 1.0 slots not being able to provide enough power to some AGP 4x/8x cards. And then there's the issue of manufacturers incorrectly keying their cards to fit both 3.3V & 1.5V slots when they only support 1.5V 😳. It's a minefield of horror stories out there when it comes to running later AGP cards in early AGP 1.0 slots 😖

Last edited by PowerPie5000 on 2013-01-18, 20:18. Edited 3 times in total.

Reply 41 of 130, by subhuman@xgtx

User metadata
Rank Oldbie

A Canopus Spectra 8400/8800 is the best of both worlds: smoking fast, VBE 3.0 support, and excellent 2D quality thanks to their SSH VGA daughterboards 😀

Reply 42 of 130, by PowerPie5000

User metadata
Rank Oldbie
subhuman@xgtx wrote:

A Canopus Spectra 8400/8800 is the best of both worlds: smoking fast, VBE 3.0 support, and excellent 2D quality thanks to their SSH VGA daughterboards 😀

And it appears they have an external power connector in case the AGP slot can't cope... but I bet they're hard to find 🙁.

Reply 43 of 130, by subhuman@xgtx

User metadata
Rank Oldbie
PowerPie5000 wrote:
subhuman@xgtx wrote:

A Canopus Spectra 8400/8800 is the best of both worlds: smoking fast, VBE 3.0 support, and excellent 2D quality thanks to their SSH VGA daughterboards 😀

And it appears they have an external power connector in case the AGP slot can't cope... but I bet they're hard to find 🙁.

On eBay they might be hard to find, but on Taobao they are easy to come by. I have seen TNT2s, GF256s, GF2 MXs, GTS and Ultra cards with the very same SSH daughterboard 😀

Reply 44 of 130, by swaaye

User metadata
Rank l33t++
PowerPie5000 wrote:

I have an MX card here (Inno3D GF2 MX400) and the overall image quality is terrible compared to my Matrox G400 Max or even my Voodoo 3 3000....

That is because it's one of the GeForce 2 cards that was poorly constructed, which was commonplace for the line. However, I do have a Dell OEM 64MB GeForce2 MX that is sharp. Probably an NV reference design.

There is actually a mod you can do to the analog circuitry to sharpen the output of some GF2 cards. Do some searches if you feel like it.

PowerPie5000 wrote:

I think anything above a GeForce 3 is overkill for my setup.

That depends on resolution. I play at 1600x1200 because I use a 24" LCD. I'll keep the 5900 Ultra. 😀

PowerPie5000 wrote:

I've heard (or should I say read) stories about old AGP 1.0 slots not being able to provide enough power to some AGP 4x/8x cards. And then there's the issue of manufacturers incorrectly keying their cards to fit both 3.3V & 1.5V slots when they only support 1.5V 😳. It's a minefield of horror stories out there when it comes to running later AGP cards in early AGP 1.0 slots 😖

There are some GeForce 6 cards which are keyed wrong. There are also some that legitimately work on AGP 3.3v. FX and older always support AGP 3.3v.

The power issue is with some motherboards that were poorly/cheaply designed. Super 7 boards often have this problem, but I imagine there are some Intel-based boards too. I haven't run into it with Intel though.

Bad capacitors could also cause problems today.
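The keying rules under discussion can be sketched as a small compatibility check. This is a simplification of the AGP 1.0/2.0/3.0 connector rules, and the names (`SLOT_KEYS`, `card_fits`) are illustrative; the mis-keyed GeForce 6 boards mentioned above are exactly the cards that violate this check in practice.

```python
# Rough sketch of AGP slot/card key compatibility (simplified).
# AGP 1.0 slots are 3.3 V keyed; later revisions added 1.5 V keyed
# slots and "universal" slots/cards notched to accept either key.

SLOT_KEYS = {
    "3.3V": {"3.3V"},               # e.g. 440BX-era AGP 1.0 slots
    "1.5V": {"1.5V"},               # later 1.5 V-only slots
    "universal": {"3.3V", "1.5V"},  # accepts cards with either key
}

def card_fits(card_key: str, slot: str) -> bool:
    """True if a card with the given key physically fits the slot."""
    if card_key == "universal":  # card is notched for both keys
        return True
    return card_key in SLOT_KEYS[slot]

# The dangerous case is a mis-keyed card: it *fits* a 3.3 V slot
# but its electronics only support 1.5 V signalling.
```

Physical fit is only half the story, which is why the keying mistakes above caused damage: the check says nothing about what voltage the card's chips actually tolerate.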

Reply 45 of 130, by subhuman@xgtx

User metadata
Rank Oldbie
swaaye wrote:

There is actually a mod you can do to the analog circuitry to sharpen the output of some GF2 cards. Do some searches if you feel like it.

Here it is 😀

http://digilander.iol.it/grandecigno/RF_Filter.htm

Reply 46 of 130, by swaaye

User metadata
Rank l33t++

There's also this one
http://web.tiscalinet.it/creeping_death/guide/aamodifica.htm

I'd rather just get a card with DVI. But of course if you want to use a CRT this is not really an option.

Reply 47 of 130, by PowerPie5000

User metadata
Rank Oldbie
swaaye wrote:

I'd rather just get a card with DVI. But of course if you want to use a CRT this is not really an option.

I found DVI to be a pain when running DOS games, though it could be chipset or vendor specific: resolutions were unsupported, aspect ratios were wrong, or the refresh rates got messed up in DOS mode... I think I'll stick to VGA for this old machine 😀

swaaye wrote:
PowerPie5000 wrote:

I've heard (or should I say read) stories about old AGP 1.0 slots not being able to provide enough power to some AGP 4x/8x cards. And then there's the issue of manufacturers incorrectly keying their cards to fit both 3.3V & 1.5V slots when they only support 1.5V 😳. It's a minefield of horror stories out there when it comes to running later AGP cards in early AGP 1.0 slots 😖

There are some GeForce 6 cards which are keyed wrong. There are also some that legitimately work on AGP 3.3v. FX and older always support AGP 3.3v.

The power issue is with some motherboards that were poorly/cheaply designed. Super 7 boards often have this problem, but I imagine there are some Intel-based boards too. I haven't run into it with Intel though.

Bad capacitors could also cause problems today.

Do you think my Intel SE440BX-2 will be up to the task of powering a GF2 Ultra or a GF3 Ti200? I think I'd prefer to take the GF3 route, as it's much better than the GF2 when it comes to FSAA (it would be nice to use that feature with older games 😀). The GF3 probably has better image quality too, without needing any mods.

Intel boards may look boring/generic, but I don't think they cheap out on components (I hope)... Any other suggestions before I take the plunge on a GF2 Ultra or a GF3 Ti200? Oh, and which of the two cards would be more power hungry?

Thanks 😀

Reply 48 of 130, by badmojo

User metadata
Rank l33t

I swapped my GF2 Ultra for a Winfast GF2 GTS I had lying around and the 2D is much better at 1024x768. Still not amazing, but acceptable.

Life? Don't talk to me about life.

Reply 49 of 130, by PowerPie5000

User metadata
Rank Oldbie
badmojo wrote:

I swapped my GF2 Ultra for a Winfast GF2 GTS I had lying around and the 2D is much better at 1024x768. Still not amazing, but acceptable.

Is there much difference in performance between the two? And what board & CPU are you using?

Reply 51 of 130, by sliderider

User metadata
Rank l33t++
swaaye wrote:

I was an 8500 guy too. It sounds awesome on paper, and I got one for only $90, but that was one bugged card. It has its share of hardware problems: even though it should be faster than a GF4 in Doom 3 because of PS 1.4, it is slower due to some internal quirks. Truform is mostly ridiculous. It entirely lacks MSAA, and its SSAA is surprisingly disappointing even though it's said to use a rotated grid of some sort. Its anisotropic filtering is fast, but only because it's very limited in quality. Backward D3D compatibility is poor. It is OK for most D3D7 and D3D8 games, and Quake engines work fine.

You guys really don't want ATI for your old games. But maybe you just want to mess with everything anyway.

RV250 is apparently more efficient, because otherwise it would not be able to keep up with R200. It lacks the Truform hardware and emulates it somehow. It has only one vertex shader, whereas R200 supposedly has two vertex processors.

The Radeon 9K Pro has only one vertex unit, but it is more efficient than the vertex units in the 8500. It still can't compete with the 8500's two vertex units working together, but thanks to the optimizations that were made it is faster than you would expect from a single unit. There are some instances where the 9K Pro can be equal to or faster than the 8500 because of improvements in efficiency, but overall the 8500 is probably still the faster card.

Reply 52 of 130, by sliderider

User metadata
Rank l33t++
subhuman@xgtx wrote:
swaaye wrote:

There is actually a mod you can do to the analog circuitry to sharpen the output of some GF2 cards. Do some searches if you feel like it.

Here it is 😀

http://digilander.iol.it/grandecigno/RF_Filter.htm

I love old mods like that. I like them better when they're old, because when they're new you might be modding a $300+ video card and taking a huge risk if you mess it up. Once a mod is old, you can often pick up the needed parts for $10-$20, so if you screw it up it isn't the end of the world; you just buy another one and try again.

Reply 53 of 130, by PowerPie5000

User metadata
Rank Oldbie
VictorB wrote:

Maybe get a Matrox G400 (non-Max) or a Matrox G450. Passively cooled, and for performance we have new PCs 😀 (I am using a Matrox G400 Max 😁)

My modern PC is a beast and can run all modern games at max settings, but there are still plenty of Win95/98 games that absolutely refuse to run, or even install, on the 64-bit version of Windows 7... I still like to have an old Win95/98 box that can comfortably handle all the games from that era (up until Win XP).

I have a Matrox G400 Max (with a dodgy fan that I can't find a replacement for), but I would prefer something with hardware T&L and maybe even FSAA that I can force on in older 3D games 😀.

Reply 54 of 130, by swaaye

User metadata
Rank l33t++
sliderider wrote:

There are some instances where the 9K Pro can be equal to or faster than the 8500 because of improvements in efficiency but overall the 8500 is probably still the faster card.

I did some digging and the major differences between R200 and RV2x0 appear to be:

-Hierarchical Z removed (part of HyperZ). RV3x0 is like this too.
-High Order Surfaces hardware removed (Truform)
-One vertex shader (vs 2). R200/RV2x0 units have D3D8 & D3D7 T&L support.
-One texture pipe (4x1 vs 4x2). So half R200's texturing rate.
-Texture cache increased from 2K to 4K. Efficiency gains.

They talk about how RV250 can run on R200's driver so it's likely that RV2x0 is largely similar to R200.
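The "half R200's texturing rate" item in the list above is just clock x pipelines x TMUs per pipe. As a quick sketch, and assuming both chips run at roughly the same core clock (retail 8500 and 9000 Pro boards were commonly clocked around 275 MHz, though exact clocks varied by board and vendor):

```python
def peak_texel_rate_mtex(core_mhz: int, pipes: int, tmus_per_pipe: int) -> int:
    """Peak texel rate in millions of texels/second:
    core clock (MHz) x pixel pipelines x texture units per pipeline."""
    return core_mhz * pipes * tmus_per_pipe

# R200 (Radeon 8500): 4 pipes x 2 TMUs; RV250 (9000 Pro): 4 pipes x 1 TMU.
r200  = peak_texel_rate_mtex(275, pipes=4, tmus_per_pipe=2)  # 2200 Mtex/s
rv250 = peak_texel_rate_mtex(275, pipes=4, tmus_per_pipe=1)  # 1100 Mtex/s
```

At equal clocks the RV250's peak texel rate is exactly half the R200's, which matches the 4x1 vs 4x2 point above.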

Reply 55 of 130, by sliderider

User metadata
l33t++

Here's what I found

" Here's how the RV250 differs from its predecessor:

A slimmed-down 3D pipeline — The RV250 retains the R200's four pixel pipelines, but each pipe now lays down one texture per pass instead of two. Just like the R200, the RV250 can "loop back" within the pipeline and lay down additional textures before writing the result to the framebuffer, only the RV250 can loop back six times where the R200 could only loop back three times. Theoretically, both chips can lay down six textures per pass; they just go about it in different ways. The R200 could deliver more textures per clock, but the RV250's ability to loop back six times may help it perform better when processing complex pixel shader effects in future games.
One optimized vertex shader — To achieve a smaller, simpler chip, the R200's dual vertex units just had to go. The RV250 features just one vertex unit, an optimized version of its predecessor that ATI has dubbed version 1.1."

From this review

http://techreport.com/review/3917/ati-radeon- … o-graphics-card

So even though the 9000 Pro loses a texture unit, the remaining unit can loop back twice as many times, so it can still lay down the same number of textures per pass as the 8500. The only place where it really suffers is the missing vertex shader.
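Reading the loopback counts from the review quoted above as the total number of times a pixel traverses the pipeline, the per-pass arithmetic works out the same for both chips:

```python
def textures_per_pass(textures_per_loop: int, loopbacks: int) -> int:
    """Textures a chip can apply in one rendering pass:
    textures laid down per trip through the pipeline x number of trips."""
    return textures_per_loop * loopbacks

# Figures from the Tech Report RV250 review quoted above.
r200  = textures_per_pass(textures_per_loop=2, loopbacks=3)  # Radeon 8500
rv250 = textures_per_pass(textures_per_loop=1, loopbacks=6)  # Radeon 9000 Pro
# Both reach 6 textures per pass; the R200 just gets there with
# twice the textures per clock, so its raw texel rate is higher.
```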

Reply 56 of 130, by badmojo

User metadata
l33t
PowerPie5000 wrote:
badmojo wrote:

I swapped my GF2 Ultra for a Winfast GF2 GTS I had lying around and the 2D is much better at 1024x768. Still not amazing, but acceptable.

Is there much difference in performance between the two? And what board & CPU are you using?

Details of the system here:

A Pentium III Windows 98 build to complete the gang.

I only had a quick try of Gothic, but no, I didn't notice any difference in speed. My understanding is that the Ultra comes into its own at higher resolutions - I play at 1024x768.

Reply 57 of 130, by PowerPie5000

User metadata
Rank Oldbie

I'm going back to a Voodoo 5 5500 AGP now... someone over at Amibay is kindly donating one 😀. I'll see how it goes, and if I don't like it then I'll pass it on to someone else (I used to have one that I repaired, but I can't remember why I got rid of it 😖).

Reply 59 of 130, by SiliconClassics

User metadata
Rank Member
PowerPie5000 wrote:

I have a Matrox G400 Max (with a dodgy fan that I can't find a replacement for)...

Have you considered reconditioning that fan? Here are a couple of brief articles that make it sound rather simple to refurbish a noisy cooling fan with a bit of oil and perhaps some graphite powder:

Oil bearings for fun & profit

Repair your noisy CPU and system fans

Considering how much retro hardware relies on fans that are becoming increasingly hard to find, might it make sense to add fan repair to a list of skills that includes capacitor replacement and retrobrighting?