VOGONS


GeForce2 Ti vs. Matrox G400


First post, by vvbee

Rank Oldbie

Creative (MSI) GeForce2 Ti 64 MB vs. Matrox G400 DualHead 32 MB. The GeForce2 has had most of its large capacitors replaced, but otherwise both cards are stock. Detonator 45.23 for the GeForce2, latest Matrox drivers for the G400. Test system: ASUS A7V333, AMD Athlon XP 2200+ underclocked to 1.5 GHz, about 512 MB of RAM, Windows 98 SE.

All games on maximum detail, 32-bit color (if the game supports it), 640 x 480 resolution, and default driver settings.

https://www.youtube.com/watch?v=QPaMFmW8VfU

Bottom line, the G400 tends to win in image quality, the GeForce2 tends to win in speed, and there are exceptions.

Last edited by vvbee on 2017-08-24, 07:59. Edited 7 times in total.

Reply 2 of 42, by Scali

Rank l33t

I've always considered the G400 the 'beginning of the end' for Matrox.
Their G200 was an excellent card, both in image quality and performance at the time.
But then the GeForce came around, and performance just went up to another level, as well as features, such as hardware T&L, cube mapping and such.
The G400 still had good image quality, but was not very competitive in performance or features. Matrox never managed to close that gap again.

I had one machine with a G450, and when I upgraded it to a GF2GTS, it was like night and day. Where the G450 was mostly limited to running in 640x480, I could now run most games in 1024x768 at great framerates.
The one thing I hated about the GF2GTS (it was an Asus card) was the poor image quality, especially blurry at high resolutions/refresh rates. So I decided to try the mod of removing some of the components of the low-pass output filter on the RAMDAC. That made all the difference. I now got the same crisp image quality as the Matrox.
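For a rough sense of why that output filter matters, here is a back-of-the-envelope sketch. The first-order filter model, the 135 MHz cutoff, and the helper names are assumptions picked purely for illustration; actual boards used small multi-element LC filters whose values varied from card to card.

# Rough sketch: why an output low-pass filter softens high-res VGA modes.
# The first-order model and the cutoff value below are assumptions for
# illustration only; real cards used small multi-element LC filters.
import math

def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.35):
    """Approximate pixel clock: visible pixels * refresh rate * blanking overhead."""
    return width * height * refresh_hz * blanking_overhead / 1e6

def loss_db(f_signal_mhz, f_cutoff_mhz):
    """Insertion loss of a first-order low-pass at f_signal, in dB."""
    ratio = f_signal_mhz / f_cutoff_mhz
    return 20 * math.log10(math.sqrt(1 + ratio * ratio))

F_CUTOFF = 135.0  # assumed filter cutoff in MHz, chosen only for illustration

for w, h, hz in [(640, 480, 60), (1024, 768, 85), (1280, 1024, 85)]:
    f_px = pixel_clock_mhz(w, h, hz)
    # A crisp pixel edge needs harmonics well above the pixel clock itself,
    # so even modest loss at the fundamental means visibly softened transitions.
    print(f"{w}x{h}@{hz} Hz: pixel clock ~{f_px:.0f} MHz, "
          f"~{loss_db(f_px, F_CUTOFF):.1f} dB down at the fundamental")

The takeaway is only that the higher the resolution and refresh rate, the closer the pixel clock sits to the filter's cutoff, which is consistent with the blur being worst at high resolutions.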

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 3 of 42, by 386SX

Rank l33t
Scali wrote:

I've always considered the G400 the 'beginning of the end' for Matrox.
Their G200 was an excellent card, both in image quality and performance at the time.
But then the GeForce came around, and performance just went up to another level, as well as features, such as hardware T&L, cube mapping and such.
The G400 still had good image quality, but was not very competitive in performance or features. Matrox never managed to close that gap again.

I had one machine with a G450, and when I upgraded it to a GF2GTS, it was like night and day. Where the G450 was mostly limited to running in 640x480, I could now run most games in 1024x768 at great framerates.
The one thing I hated about the GF2GTS (it was an Asus card) was the poor image quality, especially blurry at high resolutions/refresh rates. So I decided to try the mod of removing some of the components of the low-pass output filter on the RAMDAC. That made all the difference. I now got the same crisp image quality as the Matrox.

Looking back at that time, maybe the G200 and G400 chips were already oriented more toward the office/business world than gaming. The G400 still performed well against the right contemporaries (Voodoo3, TNT2 Ultra, Savage4), but imho the GeForce2 (and even the GeForce 256) shouldn't even be compared to it.
Regarding image quality, the G450 has, IMHO, probably one of the best analog VGA output signals, I'd say better than the G400's. Their impressive image quality was also confirmed, to my eyes, by how the G200/G400/G450 were some of, if not the, best cards for use with an overlay passthrough cable like the one the Creative Dxr3 card needed.

Last edited by 386SX on 2017-08-20, 13:24. Edited 1 time in total.

Reply 4 of 42, by F2bnp

Rank l33t
Scali wrote:

I've always considered the G400 the 'beginning of the end' for Matrox.
Their G200 was an excellent card, both in image quality and performance at the time.
But then the GeForce came around, and performance just went up to another level, as well as features, such as hardware T&L, cube mapping and such.
The G400 still had good image quality, but was not very competitive in performance or features. Matrox never managed to close that gap again.

I had one machine with a G450, and when I upgraded it to a GF2GTS, it was like night and day. Where the G450 was mostly limited to running in 640x480, I could now run most games in 1024x768 at great framerates.
The one thing I hated about the GF2GTS (it was an Asus card) was the poor image quality, especially blurry at high resolutions/refresh rates. So I decided to try the mod of removing some of the components of the low-pass output filter on the RAMDAC. That made all the difference. I now got the same crisp image quality as the Matrox.

You are misremembering things. The G200 had excellent image quality, but performance-wise it still had a ways to go. Sure enough, it was a massive jump from Matrox's previous attempts like the Mystique, but it was still lagging behind the competition. It came out some time after the Voodoo2 and the Riva TNT and was quite a bit slower. It was also slower than the Rage 128, which came out a bit later.
Also, anything OpenGL based was a rather bad experience for years. Just ask swaaye, I remember him recalling tales of his days with the G200.

The G400 on the other hand was quite a bit more competitive, especially the G400MAX variant. It came out at around the same time as the TNT2 and Voodoo3 and offered similar performance in D3D. The G400MAX was in fact faster than both in many games and enjoyed the distinction of the fastest D3D card for a while until the GF256 came out. Even performance in OpenGL games was a little bit better because of Matrox's efforts. Still, a full OpenGL ICD didn't see the light of day until 2000 or so.

The G450 and G550 are more oddball cards and do not improve on performance over the G400MAX. I don't think it is reasonable to compare the G400 with the GF256, since that would also be applicable to other competing cards like the TNT2, Voodoo3 and Rage 128 Pro. Fact of the matter is that Nvidia caught everyone by surprise with the GF256 and released it quite early. I think the TNT2 was in the market for less than half a year when the GeForce came out, but that is debatable since the dates can be a mess.

I remember reading an article about Matrox employee morale and recollections from 1999-2000 and it described a very odd climate. They just couldn't compete and they stopped soon after. The G450 and G550 cards are evidence of this, but even the Parhelia was not really meant to compete with the GF4 and the like.

Reply 5 of 42, by 386SX

Rank l33t
F2bnp wrote:

You are misremembering things. The G200 had excellent image quality, but performance-wise it still had a ways to go. Sure enough, it was a massive jump from Matrox's previous attempts like the Mystique, but it was still lagging behind the competition. It came out some time after the Voodoo2 and the Riva TNT and was quite a bit slower. It was also slower than the Rage 128, which came out a bit later.
Also, anything OpenGL based was a rather bad experience for years. Just ask swaaye, I remember him recalling tales of his days with the G200.

The G400 on the other hand was quite a bit more competitive, especially the G400MAX variant. It came out at around the same time as the TNT2 and Voodoo3 and offered similar performance in D3D. The G400MAX was in fact faster than both in many games and enjoyed the distinction of the fastest D3D card for a while until the GF256 came out. Even performance in OpenGL games was a little bit better because of Matrox's efforts. Still, a full OpenGL ICD didn't see the light of day until 2000 or so.

The G450 and G550 are more oddball cards and do not improve on performance over the G400MAX. I don't think it is reasonable to compare the G400 with the GF256, since that would also be applicable to other competing cards like the TNT2, Voodoo3 and Rage 128 Pro. Fact of the matter is that Nvidia caught everyone by surprise with the GF256 and released it quite early. I think the TNT2 was in the market for less than half a year when the GeForce came out, but that is debatable since the dates can be a mess.

I remember reading an article about Matrox employee morale and recollections from 1999-2000 and it described a very odd climate. They just couldn't compete and they stopped soon after. The G450 and G550 cards are evidence of this, but even the Parhelia was not really meant to compete with the GF4 and the like.

Maybe these companies didn't expect something like the GeForce chip to be released that soon, but from that point on every company should have immediately followed the newer DirectX specifications just like NV did, maybe with old memories of the NV1 chip vs. DirectX.
Why stay with a DX6 design when others had already released refreshed DirectX 7 chips with more features (the NSR on the GeForce2, for example)? Also, if I remember correctly, the market had already been oriented toward standard APIs for a long time. No newer APIs or proprietary features could have replaced DX and OpenGL at that point, imho.
The original Parhelia imho suffered from the same problem. I remember the community's expectations around its release, and I still think it was a great card (FAA anti-aliasing), but the competition at that time was quite strong.
The NV10 chip imho won on release timing; it was no longer the era when SLI setups of older chips, or old tricks like "16-bit color is enough", could stay competitive. From that point on, both complexity and full speed with full features were needed.

Last edited by 386SX on 2017-08-20, 14:12. Edited 1 time in total.

Reply 6 of 42, by Scali

Rank l33t
386SX wrote:

Maybe these companies didn't expect something like the GeForce chip could be released, but from that point both 3dfx and the others should have immediately followed the newer DirectX specifications just like NV did, maybe with old memories of the NV1 chip vs. DirectX.
Why stay with a DX6 design when others had already released refreshed DirectX 7 chips with more features (the NSR on the GeForce2, for example)?

Probably a question of cause and effect.
NV probably pioneered hardware T&L, and proposed to have it supported by the DX7 standard, while they may already have had prototypes working in the lab.
Other manufacturers focused on other things, and had to start from scratch on T&L, so they would be behind the curve here.
You see the same with virtually every version of DX... one company seems to get it right out of the gate, the other is struggling to keep up.
For example:
DX7: GF256
DX8: GF3
DX9: Radeon 9700
DX10: GeForce 8800 (going beyond that even, also pioneering GPGPU for OpenCL and DirectCompute).

DX11 and DX12 aren't as clear-cut. With DX11, the Radeon 5xxx was first, but NV ran into a lot of trouble with their GF4xx. Once they sorted it out in the 5xx series, they had excellent DX11 cards as well.
DX12 is more of an API update than a feature update, so cards with DX12 support were already on the market when it launched. Ironically enough, the Intel GPUs are the most feature-complete DX12 GPUs on the market.
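As a side note on what that T&L stage actually computes, here is a minimal CPU-side sketch of a per-vertex transform plus a single directional diffuse light, i.e. the kind of loop a game otherwise runs on the host CPU every frame. The helper names, matrices, and values are illustrative only, not any vendor's actual pipeline.

# Minimal sketch of the per-vertex work a fixed-function T&L stage performs:
# transform object-space positions into clip space and evaluate one
# directional diffuse light. Without hardware T&L this loop runs on the CPU.

def mat_vec(m, v):
    """Multiply a 4x4 row-major matrix by a 4-component vector."""
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

def transform_and_light(vertices, normals, mvp, light_dir):
    out = []
    for pos, n in zip(vertices, normals):
        clip = mat_vec(mvp, pos + [1.0])  # transform to clip space
        diffuse = max(0.0, -sum(a * b for a, b in zip(n, light_dir)))  # N . -L
        out.append((clip, diffuse))
    return out

# Tiny example: identity "MVP" matrix and a light pointing down the -Z axis.
identity = [[1.0 if r == c else 0.0 for c in range(4)] for r in range(4)]
verts = [[0.0, 0.0, -1.0], [1.0, 0.0, -2.0]]
norms = [[0.0, 0.0, 1.0], [0.0, 1.0, 0.0]]
for clip, diff in transform_and_light(verts, norms, identity, [0.0, 0.0, -1.0]):
    print(clip, round(diff, 3))

Offloading exactly this kind of per-vertex math is what the GF256's hardware T&L bought you, and why vertex-heavy scenes were where it pulled away from cards that left it to the CPU.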

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 7 of 42, by 386SX

Rank l33t
Scali wrote:

Probably a question of cause and effect.
NV probably pioneered hardware T&L, and proposed to have it supported by the DX7 standard, while they may already have had prototypes working in the lab.
Other manufacturers focused on other things, and had to start from scratch on T&L, so they would be behind the curve here.
You see the same with virtually every version of DX... one company seems to get it right out of the gate, the other is struggling to keep up.
For example:
DX7: GF256
DX8: GF3
DX9: Radeon 9700
DX10: GeForce 8800 (going beyond that even, also pioneering GPGPU for OpenCL and DirectCompute).

DX11 and DX12 aren't as clear-cut. With DX11, the Radeon 5xxx was first, but NV ran into a lot of trouble with their GF4xx. Once they sorted it out in the 5xx series, they had excellent DX11 cards as well.
DX12 is more of an API update than a feature update, so cards with DX12 support were already on the market when it launched. Ironically enough, the Intel GPUs are the most feature-complete DX12 GPUs on the market.

Interesting point, I understand. The NV30 example is also relevant; I remember the big expectations and the first reviews...

Reply 8 of 42, by dexvx

Rank Oldbie
F2bnp wrote:

The G450 and G550 are more oddball cards and do not improve on performance over the G400MAX. I don't think it is reasonable to compare the G400 with the GF256, since that would also be applicable to other competing cards like the TNT2, Voodoo3 and Rage 128 Pro. Fact of the matter is that Nvidia caught everyone by surprise with the GF256 and released it quite early. I think the TNT2 was in the market for less than half a year when the GeForce came out, but that is debatable since the dates can be a mess.

I remember reading an article about Matrox employee morale and recollections from 1999-2000 and it described a very odd climate. They just couldn't compete and they stopped soon after. The G450 and G550 cards are evidence of this, but even the Parhelia was not really meant to compete with the GF4 and the like.

The G400/MAX is compared to the GF256 because they were released less than half a year apart. Matrox stopped competing in 3D gaming after the G400. The G450 was lower performance than the G400, but used 64-bit DDR memory to save board costs. I suppose Matrox thought they could just live on as a business solution. The G550 was more of the same, except with a bad T&L engine.

From what I remember, at the time Matrox was a private company. They didn't want to go the public route and raise capital to compete against Nvidia's aggressive development cycles. When Nvidia caught up to Matrox in quality (a la the Quadro NVS), the writing was on the wall. Matrox tried one last go with the Parhelia, but having been out of the business for 4 years did a number on them.

Reply 9 of 42, by vvbee

Rank Oldbie

People who remember the G400 from back then often didn't have the benefits we do now: more mature drivers and faster CPUs.

Not sure how fast the GF256's T&L engine is, but the G400's T&L engine here is the Athlon XP, which may well be the faster one.

The videos are VGA captures, by the way, so that's the quality you get.

Reply 10 of 42, by silikone

Rank Member
vvbee wrote:

Sigh.

Time to reinstall Homeworld again.

Do not refrain from refusing to stop hindering yourself from the opposite of watching nothing other than that which is by no means porn.

Reply 12 of 42, by vvbee

Rank Oldbie
vvbee wrote:

in Homeworld and Soulbringer, the GeForce2 draws certain shades of red as grey, for instance.

Had a look at the software renderer in Homeworld, and it's actually the GeForce2 that renders the colors correctly. Both the G400 and the Voodoo3 get it wrong, relative to the software engine. The GeForce2 is still wrong in Soulbringer, though; the cause is unknown.
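If anyone wants to make that kind of comparison less dependent on eyeballing video captures, averaging the per-channel difference between screenshots of the same frame is one crude way to do it. This is only a sketch; the file names are hypothetical and Pillow is assumed to be installed.

# Crude per-channel comparison between two screenshots of the same frame,
# e.g. a card's capture vs. the software renderer. Requires Pillow.
from PIL import Image

def mean_channel_diff(path_a, path_b):
    a = Image.open(path_a).convert("RGB")
    b = Image.open(path_b).convert("RGB").resize(a.size)
    totals = [0, 0, 0]
    pixels = a.size[0] * a.size[1]
    for (ra, ga, ba), (rb, gb, bb) in zip(a.getdata(), b.getdata()):
        totals[0] += abs(ra - rb)
        totals[1] += abs(ga - gb)
        totals[2] += abs(ba - bb)
    return [t / pixels for t in totals]  # mean |delta| per channel, 0..255

# Hypothetical usage, comparing each card against the software renderer:
# print(mean_channel_diff("homeworld_software.png", "homeworld_geforce2.png"))
# print(mean_channel_diff("homeworld_software.png", "homeworld_g400.png"))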

Reply 17 of 42, by appiah4

Rank l33t++

I have the GOTY version of Homeworld that came with the OST CD. I can rip it for you guys if you want.

Homeworld is simply amazing overall.

Retronautics: A digital gallery of my retro computers, hardware and projects.

Reply 18 of 42, by vvbee

Rank Oldbie

Same as before, all settings max, 32-bit color (if supported), 640 x 480.

Colin McRae Rally, 1998: the G400 ran normally, but the GeForce2 had massive flickering, so I canceled the benchmark
Midtown Madness, 1999: https://www.youtube.com/watch?v=PiRevl62Byc
Thief 2, 2000: https://www.youtube.com/watch?v=Dd_tWguFJsU
Black & White, 2001: https://www.youtube.com/watch?v=-Eqnk8hAt7c
Microsoft Train Simulator, 2001: https://www.youtube.com/watch?v=doWetOPXFtA

Nvidia moved on with their drivers, so forget playing Colin McRae Rally or Midtown Madness with the Detonator 45.23.

It's worth noting that when the GeForce2 isn't glitching out in Midtown Madness, it's more or less tied in speed with the G400, until later in the video when the draw distance gets smaller. It's a glitched benchmark so you can't be sure, but that's what you have. In Thief 2, the GeForce2 again isn't notably faster, and sometimes it's slower. Same in (the CPU-bound) Microsoft Train Simulator, it's only when the triangle count goes up that the GeForce2 does better. In Black & White, the GeForce2 is clearly faster.

In terms of image quality, it's typically said that the Matrox cards have good 2D, but clearly they hold up well against the GeForce2 in 3D quality, too. Apart from glitched rain on the G400 and somewhat blurrier textures in the background, Black & White looks identical on these cards, despite the G400 being almost two years old at that point and the GeForce2 Ti not yet released. Microsoft Train Simulator doesn't quite look the same; it's almost as if there's a texture-sharpening filter on the GeForce2. The textures don't look higher-res, just sharpened. On the G400, there's an odd horizontal glitch in that game.

As before, the tests are in 640 x 480, which is not only the best resolution for everything but one that saps some of the GeForce2's fillrate advantage. On the other hand, the GeForce2 has twice the RAM compared to the G400, which probably helps in the newer games.
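To put a rough number on the fillrate point, here is a quick arithmetic sketch. The theoretical fillrate figures and the 3x overdraw factor are assumptions for illustration only (roughly 4 pipes at ~250 MHz for the GeForce2 Ti and 2 pipes at ~125 MHz for the G400), not measured values; check actual board specs before leaning on them.

# Back-of-the-envelope look at how much of each card's theoretical fillrate
# a given resolution actually demands. Fillrate and overdraw values are
# assumptions for illustration only.

def mpixels_per_second(width, height, fps, overdraw=3.0):
    """Pixels the card must shade per second, with an assumed overdraw factor."""
    return width * height * fps * overdraw / 1e6

ASSUMED_FILLRATE = {"GeForce2 Ti": 1000.0, "G400": 250.0}  # Mpixels/s, theoretical

for w, h in [(640, 480), (1024, 768), (1600, 1200)]:
    need = mpixels_per_second(w, h, 60)
    loads = ", ".join(f"{name}: {100 * need / rate:.0f}% of peak"
                      for name, rate in ASSUMED_FILLRATE.items())
    print(f"{w}x{h} @ 60 fps, 3x overdraw -> ~{need:.0f} Mpix/s needed ({loads})")

Under these assumptions, 640 x 480 leaves both cards with plenty of fill headroom, which is exactly why the GeForce2's raw fillrate advantage shows up less here than it would at higher resolutions.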

Someone in the thread said there's no comparison between the G400 and the GeForce2, but clearly that depends.