VOGONS


First post, by EdmondDantes

User metadata
Rank Member
Rank
Member

So I was on the Vogons Wiki and reading about NVidia cards. About the Geforce FX line, it says:

"Avoid models with 64-bit bus and naming suffixes like LE, XT or VE because they have been crippled in some way."

Okay, avoiding those suffixes is fine, but... how do you find out if a card has a "64-bit bus?" I tried googling "Geforce cards with 64-bit buses," but the only time I saw anything-bit at all was with regard to graphics cores and memory interfaces, and I don't know if that line is referring to either of those.

Main reason I'm asking is because I've actually seen decent prices on Geforce FX cards on the ebays but I have no practical way of knowing if they're ones I should avoid.

http://www.ebay.com/itm/5187-5256-Nvidia-GeFo … 119.m1438.l2649

That one, for example.

If anyone could clear things up for me or explain things to my stupid head, I'd much appreciate it.

Reply 1 of 19, by LHN91

User metadata
Rank Member
Rank
Member

You've basically mentioned the relevant parts - a 64-bit bus refers to the width of the bus (the memory interface) between the GPU and the GPU's memory.

I'm not sure how to tell if that specific card is one with a 64-bit bus or not, unfortunately. 5500s can come in 64-bit or 128-bit variants, and that looks to be an OEM card.

Reply 2 of 19, by EdmondDantes

User metadata
Rank Member
Rank
Member

I thought of one way... I noticed just now that the card has Asus markings on it, so I googled "Asus Nvidia Geforce FX" and wound up finding a CNet page with a pretty detailed rundown and a (stock) picture which looks a lot like the one that guy is selling.

https://www.cnet.com/products/asus-v-9520-x-t … b-series/specs/

Unfortunately it apparently is a crippled 64-bit one...

I'm thinking I might just go for a GeForce 3 for my rig. GeForce FXs are either expensive or one of these dubious models, and Nvidia stupidly used the "GeForce 4" name for MX cards that are actually closer to GeForce 2s. With the 3s, at least, they are all genuinely what they say they are, and I don't need anything higher than DirectX 8.0 compatibility anyway.

Reply 3 of 19, by lazibayer

User metadata
Rank Oldbie
Rank
Oldbie
EdmondDantes wrote:

So I was on the Vogons Wiki and reading about NVidia cards. About the Geforce FX line, it says:

"Avoid models with 64-bit bus and naming suffixes like LE, XT or VE because they have been crippled in some way."

Okay, avoiding the extension is fine, but.... how do you find out if a card has a "64-bit bus?" I tried googling "Geforce cards with 64-bit buses" but the only time I heard anything-bit at all was with regards to graphics cores and memory interfaces, and I don't know if that line is referring to either of those.

Main reason I'm asking is because I've actually seen decent prices on Geforce FX cards on the ebays but I have no practical way of knowing if they're ones I should avoid.

http://www.ebay.com/itm/5187-5256-Nvidia-GeFo … 119.m1438.l2649

That one, for example.

IF anyone could clear things up for me or explain things to my stupid head, I'd much appreciate it.

If this card indeed has 128MB of VRAM, then it has a 64-bit bus.
It has HY5DU561622 chips, which are 16M x 16-bit parts, so each chip is 32MB. 128MB requires 4 chips, meaning there should be no chips on the back side. 16 bits x 4 chips = 64 bits.
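That chip arithmetic can be sketched in a few lines (the 16M x 16-bit organization and 32MB-per-chip figures are taken from the description above; treat them as datasheet assumptions, not verified specs):

```python
# Working out bus width from VRAM size and per-chip organization.
chip_width_bits = 16          # HY5DU561622: 16M x 16-bit organization (assumed)
chip_capacity_mb = 32         # 16M addresses x 16 bits = 256 Mbit = 32 MB per chip
card_vram_mb = 128            # VRAM size claimed by the listing

num_chips = card_vram_mb // chip_capacity_mb   # 128 / 32 = 4 chips
bus_width = num_chips * chip_width_bits        # 4 x 16 bits = 64-bit bus

print(f"{num_chips} chips -> {bus_width}-bit memory bus")
```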

Reply 4 of 19, by cyclone3d

User metadata
Rank l33t++
Rank
l33t++
LHN91 wrote:

You've basically mentioned the relevant parts - a 64-bit bus refers to the width of the bus (the memory interface) between the GPU and the GPU's memory.

I'm not sure how to tell if that specific card is one with a 64-bit bus or not, unfortunately. 5500s can come in 64-bit or 128-bit variants, and that looks to be an OEM card.

If that is an actual pic of one of the cards being sold, it is a 256MB model. Look at the part number sticker. It indicates 256MB.

Yamaha YMF modified setupds and drivers
Yamaha XG resource repository - updated November 27, 2018
Yamaha YMF7x4 Guide
AW744L II - YMF744 - AOpen Cobra Sound Card - Install SB-Link Header

Reply 5 of 19, by shamino

User metadata
Rank Oldbie
Rank
Oldbie
cyclone3d wrote:

If that is an actual pic of one of the cards being sold, it is a 256MB model. Look at the part number sticker. It indicates 256MB.

That can be a frustrating problem with eBay video cards. Some sellers are kind of sloppy about throwing slightly different items together under a multi-item listing, and assume the differences aren't important. In this case there's a discrepancy between the title and the photo with respect to the RAM size, which in turn means a discrepancy in what the bus width might be. That alone might be reason enough to forget this seller and not get into any confusion with them.

They might have a grab bag of slightly different cards and just be mailing them out at random.
The card in the photo is only 1 of 16 cards listed, so it's probably not the one you'll get. Maybe all the cards are exactly the same model as the photo, but given the mismatch in RAM size between the title and photo, I wouldn't have much confidence in that.

For this item you'd need to verify with the seller exactly what they're selling you. Since it's a multi-item listing, that's difficult. It also gets into technical stuff that the person on the other end might not interpret or answer accurately.

It's a lot easier when you can trust a seller's photos to match the item, and they show the product from enough angles to figure everything out yourself.
... It would also be easier if nVidia enforced tighter standards for the naming of GeForce products.

Reply 6 of 19, by EdmondDantes

User metadata
Rank Member
Rank
Member

So basically, if it's 128MB VRAM then it's 64-bit?

I'm looking at GeForce 3s and 4s now. I don't need a DirectX 9-compliant card, just one that does really well at DirectX 7 and 8, so all this "is this the right thing?" nonsense isn't worth it. For the curious, I basically just want to be able to play Myst III and The Longest Journey on my Win98 rig with graphics acceleration, and both games have told me my Voodoo 3 won't cut it. Admittedly I'm usually not a graphics whore, but... well, Myst is worth any sacrifice, even stupid ones.

Reply 7 of 19, by Putas

User metadata
Rank Oldbie
Rank
Oldbie
EdmondDantes wrote:

So basically, if it's 128MB VRAM then it's 64-bit.

No. The bulletproof way: read the memory chip's part number, find its datasheet, look up the width of its data bus, and multiply by the number of chips on the card with their own pathways. Soon you will be able to tell the bus width just by looking at a card.
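That multiplication is trivial, but writing it down makes the rule concrete (the chip widths in the comments are illustrative examples, not a database):

```python
# Bus width = per-chip data width (from the datasheet) x number of chips
# on the card that have their own pathways to the GPU.
def bus_width(datasheet_width_bits: int, chips_with_own_pathways: int) -> int:
    return datasheet_width_bits * chips_with_own_pathways

print(bus_width(16, 4))   # four 16-bit chips  -> 64-bit bus
print(bus_width(16, 8))   # eight 16-bit chips -> 128-bit bus
print(bus_width(32, 4))   # four 32-bit chips  -> 128-bit bus
```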

Reply 8 of 19, by martin939

User metadata
Rank Member
Rank
Member

These are oddball OEM cards manufactured for MEDION, a low-cost electronics brand distributed through stores like Lidl or Aldi throughout Europe, mostly Germany.

Why bother with them when there are so many 'real' FXes on the market?

Reply 9 of 19, by EdmondDantes

User metadata
Rank Member
Rank
Member
martin939 wrote:

These are oddball OEM cards manufactured for MEDION, a low-cost electronics brand distributed through stores like Lidl or Aldi throughout Europe, mostly Germany.

Why bother with them when there are so many 'real' FXes on the market?

Well, yeah, that's the purpose of this thread: as someone who only just started looking at GeForces, I don't know anything about Medion or what's a "real" FX and what's garbage. I wasn't born all-knowing in the ways of the GeForce.

On that note, I read somewhere that there's a reason to stick to older Nvidia drivers in Win98, because apparently the last official 98-era ones had problems or something? Has anyone heard anything like that? I'm having trouble finding where I heard it, but I think it was on a different forum that suggested sticking to 69.SomethingSomething. EDIT: I found where I heard it: https://arstechnica.com/civis/viewtopic.php?f=6&t=1145398 Apparently it applies to any Nvidia card before the GeForce 6.

Reply 10 of 19, by GeorgeMan

User metadata
Rank Oldbie
Rank
Oldbie

There is another problem with the wiki:

Here it only mentions that ALL GF4 MX 4x0 cards are 128-bit. https://en.wikipedia.org/wiki/List_of_Nvidia_ … GeForce4_series
But I have a GF4 MX440/8x that is 64-bit, and thus falls a bit behind a GF2 GTS. 😉

Retro1: Athlon XP 3200+ @Arctic cooler | ASUS A7V600 | Radeon 9800XXL 128MB | SB Audigy 2 ZS | 160GB IDE HDD | Win98SE & XP
Retro2: under construction with a PIII 933 or a Tualatin Celeron 1200 and a GF2 GTS 32MB

Reply 11 of 19, by red_avatar

User metadata
Rank Oldbie
Rank
Oldbie

Oh lord, this is giving me nasty flashbacks - I recall having several cards in a short span of time back then exactly because they were all weak, crippled and barely could keep up with games improving. In the end, I gave up on Nvidia at that time and went for a Radeon 9800 Pro which was very powerful but had overheating issues. Frankly, that entire era was filled with shoddy hardware (Pentium 4) or massive overheating issues (Athlon, etc.).

Retro game fanatic.
IBM PS1 386SX25 - 4MB
IBM Aptiva 486SX33 - 8MB - 2GB CF - SB16
IBM PC350 P233MMX - 64MB - 32GB SSD - AWE64 - Voodoo2
PIII600 - 320MB - 480GB SSD - SB Live! - GF4 Ti 4200
i5-2500k - 3GB - SB Audigy 2 - HD 4870

Reply 12 of 19, by firage

User metadata
Rank Oldbie
Rank
Oldbie
GeorgeMan wrote:

There is another problem with the wiki:

Here it only mentions that ALL GF4 MX 4x0 cards are 128-bit. https://en.wikipedia.org/wiki/List_of_Nvidia_ … GeForce4_series
But I have a GF4 MX440/8x that is 64-bit, and thus falls a bit behind a GF2 GTS. 😉

Yep, lots of cut-down cards in that line. You certainly do want to limit your investments to models that are known to be 128-bit, can be confirmed as such from photos, or are MX460s and Quadros, which are sure deals.

My big-red-switch 486

Reply 13 of 19, by elod

User metadata
Rank Member
Rank
Member

I tend to stick to the FX5700. 5800s are not a good investment (assuming, of course, you plan to use them). A Quadro 3000 would be worthwhile (a bit less so a 5900, because anything clocked higher will fail earlier).

Reply 14 of 19, by agent_x007

User metadata
Rank Oldbie
Rank
Oldbie

If an SDR card has 8 memory chips in total, it must be 128-bit (they can be on one side, or on both).
If it's a DDR-based card, 8 memory chips usually means it's 256-bit, and if there are only two DDR VRAM chips, it must be 64-bit.

Other than that, when pics aren't good enough, you need to check the specs of the VRAM chips themselves (mentioned before) or look up the card's exact model number for confirmation (in reviews, GPU databases, etc.).
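This chip-counting heuristic can be encoded directly. It assumes 16-bit SDR chips and 32-bit DDR chips; as other replies in the thread point out, other widths exist (32-bit SGRAM, 16-bit DDR), so treat it as a first guess only:

```python
# Rough first-guess bus width from memory type and chip count.
# Assumed per-chip widths: SDR = 16-bit, DDR = 32-bit (exceptions exist).
def guess_bus_width(memory_type: str, chip_count: int) -> int:
    per_chip_bits = {"SDR": 16, "DDR": 32}[memory_type]
    return per_chip_bits * chip_count

print(guess_bus_width("SDR", 8))  # eight SDR chips -> 128-bit
print(guess_bus_width("DDR", 8))  # eight DDR chips -> 256-bit
print(guess_bus_width("DDR", 2))  # two DDR chips   -> 64-bit
```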


Reply 15 of 19, by Putas

User metadata
Rank Oldbie
Rank
Oldbie
agent_x007 wrote:

If an SDR card has 8 memory chips in total, it must be 128-bit (they can be on one side, or on both).
If it's a DDR-based card, 8 memory chips usually means it's 256-bit, and if there are only two DDR VRAM chips, it must be 64-bit.

SGRAM chips were 32-bit, be it SDR or DDR. Clamshell configurations were available as well. There were also 16-bit DDR SDRAM chips.

Reply 16 of 19, by lazibayer

User metadata
Rank Oldbie
Rank
Oldbie
agent_x007 wrote:

If an SDR card has 8 memory chips in total, it must be 128-bit (they can be on one side, or on both).
If it's a DDR-based card, 8 memory chips usually means it's 256-bit, and if there are only two DDR VRAM chips, it must be 64-bit.

Other than that, when pics aren't good enough, you need to check the specs of the VRAM chips themselves (mentioned before) or look up the card's exact model number for confirmation (in reviews, GPU databases, etc.).

For DDR chips in a TSOP package, all I have seen are 16-bit chips.
For SDR chips in a TSOP package, I have seen both 32-bit and 16-bit flavors.

Reply 17 of 19, by EdmondDantes

User metadata
Rank Member
Rank
Member

Okay, one last thing. I asked in a previous thread and nobody ever answered it:

I plan to put this GeForce (I found a Ti4200 AGP from MSI, which is apparently one of the better manufacturers) in a comp that already has a Voodoo 3 in a PCI slot.

Does each card have to be connected to a different monitor? Like say my monitor is plugged into the GeForce but I want to play a GLIDE game, would I then have to plug it into the Voodoo 3? Or does it not even matter? Or is there some cable I can/have to buy?

Reply 18 of 19, by Scali

User metadata
Rank l33t
Rank
l33t
EdmondDantes wrote:

Does each card have to be connected to a different monitor? Like say my monitor is plugged into the GeForce but I want to play a GLIDE game, would I then have to plug it into the Voodoo 3? Or does it not even matter? Or is there some cable I can/have to buy?

Yes. The output of the card is driven directly by the video memory on the card.
On newer versions of Windows, there is a software solution that copies the contents of one video card's memory to another so that outputs can be shared. This comes at a performance penalty, however (it probably isn't going to work if you want to game on the Voodoo3, and would probably spoil the experience), and I don't think you can do this with Windows versions earlier than XP.

What you could use is a KVM switch or such, I suppose. Something that allows you to connect both outputs to a single monitor, and switch between them.
Or, some monitors have both DVI and VGA inputs. Perhaps you could connect one card to DVI and the other to VGA, and then use the monitor's built-in switching.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/