VOGONS


Best video card ever


Reply 20 of 28, by gerwin

Rank: l33t

I liked the GeForce MX 440 in its day, and I am still fond of them for use in simple workstations.
- Works without a hassle with everything I tried in DOS and Windows
- Totally reliable, even when cooled passively.
- DDR memory, preferably 128-bit
- Last Nvidia card to allow CRT screen refresh rates above 60Hz in DOS
- Available up to AGP 8x
- Low power consumption, low profile, low noise.

Performance is humble, but good enough for classic games at 800x600 and 1024x768.
I actually took an ATI Radeon 9200 and a 9600SE back to the store once, and left the store with two MX440's...

Maybe the GeForce 3 is better, but I never tried one.

--> ISA Soundcard Overview // Doom MBF 2.04 // SetMul

Reply 21 of 28, by Tetrium

Rank: l33t++
gerwin wrote:

I liked the GeForce MX 440 in its day, and I am still fond of them for use in simple workstations.
- Works without a hassle with everything I tried in DOS and Windows
- Totally reliable, even when cooled passively.
- DDR memory, preferably 128-bit
- Last Nvidia card to allow CRT screen refresh rates above 60Hz in DOS
- Available up to AGP 8x
- Low power consumption, low profile, low noise.

Performance is humble, but good enough for classic games at 800x600 and 1024x768.
I actually took an ATI Radeon 9200 and a 9600SE back to the store once, and left the store with two MX440's...

Maybe the GeForce 3 is better, but I never tried one.

I have to agree on the GF MX cards. They are much faster than, say, the TNT and the like (especially in memory performance), are passively cooled (they use very little power and don't get really hot), and are very easy to obtain.
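
On the memory performance point, a quick back-of-the-envelope calculation shows how far ahead they are. A sketch in C, with the commonly quoted reference clocks taken as assumptions (real-world throughput is of course lower than these peaks):

/* Peak memory bandwidth = bus width in bytes x effective transfer rate.
   Assumed reference clocks: MX440 = 200 MHz DDR (two transfers per clock),
   TNT = 110 MHz SDR; both use a 128-bit bus. */
#include <stdio.h>

int main(void)
{
    double mx440 = (128.0 / 8) * 200e6 * 2; /* 6.4 GB/s peak  */
    double tnt   = (128.0 / 8) * 110e6;     /* 1.76 GB/s peak */
    printf("MX440: %.1f GB/s, TNT: %.2f GB/s\n", mx440 / 1e9, tnt / 1e9);
    return 0;
}

Roughly a 3.5x gap in peak bandwidth, which is most of the story in the older games these cards get used for.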
I tend to use them relatively little (and this may sound weird 😜): they are so hassle-free that I barely use them! There's nothing special about them 🤣! They are easy and good (considering how old they are now)... easy and good! 😜

The only thing I don't like about them is that they don't support Glide, but oh well.

I've recently received a GF3. They seem much more uncommon than the GF MX series. I intend to use it, but I think its fan was either rattling a lot or just plain stuck.
It does look very pretty though, with its blue heatsinks.
Here's a pic:
http://www.thg.ru/graphic/20011218/images/ti200_e.jpg

I think the GF3 is interesting as it was the first Nvidia graphics card in about four generations (TNT > TNT2 > GF > GF2) that wasn't a giant performance leap; it was more an upgrade in features, IIRC.

Reply 22 of 28, by sliderider

Rank: l33t++

I always thought nVidia was being a little dishonest with the GeForce 4 MX line. All they did was up the clocks and add a couple of features to the GeForce 2; it had nothing to do with the GeForce 4, and the GeForce 3 would outperform it in most benchmarks and games. Of course, if you were still stuck with a motherboard that had no AGP slot, you probably had one: it was one of the most prolific PCI cards of its day, and it was supported for a long time simply because nVidia sold so many that nobody was going to drop support.

I've still got one here. I bought it because at the time it looked like PCI video cards would be phased out in favor of AGP, so I wanted the latest one I could get before they stopped making them. Who knew they'd still be making PCI video cards in 2010? That will probably change soon, though, as there aren't many motherboards out there that don't have at least one PCI-E slot these days, not even OEM ones. I can't see the PCI bus realistically evolving much further than the 9500GT and Radeon 4350.

Reply 23 of 28, by F2bnp

Rank: l33t

I never understood the debate over the GeForce 4 MX. The GeForce 2 MX was much slower than the GF2 as well. Hell, it was slower than the GeForce 256 sometimes. But it was a good product at its price nonetheless. The same applied to the GF4 MX.

Reply 24 of 28, by sliderider

Rank: l33t++
F2bnp wrote:

I never understood the debate over the GeForce 4 MX. The GeForce 2 MX was much slower than the GF2 as well. Hell, it was slower than the GeForce 256 sometimes. But it was a good product at its price nonetheless. The same applied to the GF4 MX.

The point is that having GeForce 4 in the name implies that it is related to the GeForce 4 chip, which it isn't. Their advertising also implied that it was a GeForce 4 with only nfiniteFX missing, which it clearly wasn't. The GeForce 2 MX, on the other hand, really was related to the rest of the GeForce 2 line, just with some pipes disabled and memory bandwidth reduced; it wasn't a first-generation GeForce or a Riva with a new name. nVidia is still doing it today: the GTS 250 was an update of the old 9800 GTX+, which was in turn a rework of the 8800 GTS.

Reply 25 of 28, by Tetrium

Rank: l33t++
sliderider wrote:
F2bnp wrote:

I never understood the debate over the GeForce 4 MX. The GeForce 2 MX was much slower than the GF2 as well. Hell, it was slower than the GeForce 256 sometimes. But it was a good product at its price nonetheless. The same applied to the GF4 MX.

The point is that having GeForce 4 in the name implies that it is related to the GeForce 4 chip, which it isn't. Their advertising also implied that it was a GeForce 4 with only nfiniteFX missing, which it clearly wasn't. The GeForce 2 MX, on the other hand, really was related to the rest of the GeForce 2 line, just with some pipes disabled and memory bandwidth reduced; it wasn't a first-generation GeForce or a Riva with a new name. nVidia is still doing it today: the GTS 250 was an update of the old 9800 GTX+, which was in turn a rework of the 8800 GTS.

This is true, but it has happened lots of times in computer history. Nvidia has done the same more recently, and there are tons of examples of companies misusing a name just to make a product sound better.
They do it on purpose, and it's up to us to separate the crap from the real deal.

That's also why they (mostly graphics card manufacturers) keep changing the names of their products.
Remember when we had LE, XT and SE versions? Once people started to see through the naming scheme, they just started naming their products differently 😒
And remember the first Pentium 3? It's really just a Pentium 2+; the Coppermine is the real P3!

Now, even though the 4MX was one of those misnamed video cards (it should've been called something like GF2 MX+), it was still a decent card in its own right.

Reply 26 of 28, by swaaye

Rank: l33t++

Coppermine is not much different from Katmai. The move to on-die cache is about it; the CPU core itself is basically identical.

There were some Pentium IIs with on-die cache: the Dixon mobile version had 256KB, like Coppermine. And obviously the Mendocino Celeron could have been called a PII (it performs similarly to Deschutes), and it had on-die cache.

Companies play with names to fool morons and lazy people. I'm quite ok with them doing that. 😁 It's not hard to find the details behind the model numbers.


Reply 27 of 28, by Tetrium

Rank: l33t++

Yup, but Katmai is basically just a Deschutes with SSE and that unique serial number thingy, so it's hardly any different from a P2. When Intel moved the cache on-die, it substantially increased performance.

Reply 28 of 28, by swaaye

Rank: l33t++

Maybe Coppermine should be Pentium II Performance Enhanced 2.0? The P4 was the first really significant change to come along; Willamette could be P3. It's just the name game, like you said. Not really worth arguing about what marketers do.

To me, SSE was what defined the PIII. SSE can make for very nice speed boosts if it gets used. Unfortunately, the PIII's SSE implementation kinda stinks (it executes each 128-bit SSE operation as two 64-bit halves), so SSE is semi-gimped. But without SSE, all PIIIs are pretty much the same as a PII, because there were PIIs and Celerons with on-die cache.
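
To give an idea of what SSE buys you when it does get used, here's a minimal sketch in C; the function and array setup are made up for illustration, and it assumes a compiler with SSE support (e.g. gcc -msse):

/* Add two float arrays four elements at a time using SSE intrinsics.
   Assumes n is a multiple of 4 and the arrays are 16-byte aligned. */
#include <xmmintrin.h>  /* SSE intrinsics, introduced with the PIII */

void add_sse(float *dst, const float *a, const float *b, int n)
{
    int i;
    for (i = 0; i < n; i += 4) {
        __m128 va = _mm_load_ps(&a[i]);            /* load 4 floats  */
        __m128 vb = _mm_load_ps(&b[i]);
        _mm_store_ps(&dst[i], _mm_add_ps(va, vb)); /* 4 adds at once */
    }
}

On a PIII each of those 128-bit operations gets split into two 64-bit halves internally, which is the semi-gimped part, but it still beats doing the adds one at a time through x87.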

I forgot about that serial number thing. 😀 I still think the privacy scare was a bit of a knee-jerk reaction, but who knows, huh?