Reply 20 of 48, by swaaye
32 bit color has a significant speed impact with these cards so I see nothing wrong with using 16 bit. The old games don't benefit all that much from more color depth anyway.
Did anyone ever see the difference between sharper and smoother? 😁
wrote: I've tested these cards in a benchmark from a racing game and in 3DMark99 Max. I did not test the G400 MAX but just the normal G400.
On a K6-2 450 the V4 is a bit faster, but the TNT2 Ultra and G400 are really close (these three were faster than far newer cards; I tested 20+ cards).
On a P2-450 the G400 was the fastest but, again, they were close.
From a 666 MHz P3 and up, the TNT2 Ultra was the card that kept on scaling, so it won from there, but only by a slight margin.
The TNT2 Ultra should be faster by a wide margin. Both my G400 and G450 dual-head are a bit slower than my Savage 4 (and a lot slower in OpenGL).
G400 on a K6-III 400, Gigabyte GA-5AX, 128 MB RAM. Tested using 3DMark99.
TNT2 Ultra gave me somewhere between 2800 and 2850.
Don't eat stuff off a 15 year old never cleaned cpu cooler.
Those cakes make you sick....
My blog: http://to9xct.blogspot.dk
My YouTube: https://www.youtube.com/user/brostenen
001100 010010 011110 100001 101101 110011
wrote: G400 on a K6-III 400, Gigabyte GA-5AX, 128 MB RAM. Tested using 3DMark99.
TNT2 Ultra gave me somewhere between 2800 and 2850.
I haven't tested on a K6, but on an 850 MHz PIII the TNT2 pulls ahead quite a bit.
Why is that, exactly? Is it perhaps that Nvidia optimized for Intel?
1999 was the year of the Athlon and the Coppermine; maybe Nvidia simply couldn't be bothered to optimize for the anemic K6 FPU as much as Matrox did? One would need to test these cards on a period-correct early Athlon to be sure.
I wouldn't put much stock in 3DMark99. Test games. Matrox seems to have optimized for 3DMark99.
wrote:
wrote: I've tested these cards in a benchmark from a racing game and in 3DMark99 Max. I did not test the G400 MAX but just the normal G400.
On a K6-2 450 the V4 is a bit faster, but the TNT2 Ultra and G400 are really close (these three were faster than far newer cards; I tested 20+ cards).
On a P2-450 the G400 was the fastest but, again, they were close.
From a 666 MHz P3 and up, the TNT2 Ultra was the card that kept on scaling, so it won from there, but only by a slight margin.
The TNT2 Ultra should be faster by a wide margin. Both my G400 and G450 dual-head are a bit slower than my Savage 4 (and a lot slower in OpenGL).
Well, it was in 3DMark99. With a 1 GHz Tualatin it scored 30% higher. In an actual game the difference was 10% over the G400. It was about 30% faster than the Voodoo4 in 3DMark too, but the V4 was 1% quicker in the game (Direct3D).
Resolution was 1024x768; it could be that the TNT2 Ultra is quicker at higher resolutions or in 32-bit color. Point is, there isn't much of a difference on slower systems, and you just pick the card you want for your goals.
asus tx97-e, 233mmx, voodoo1, s3 virge ,sb16
asus p5a, k6-3+ @ 550mhz, voodoo2 12mb sli, gf2 gts, awe32
asus p3b-f, p3-700, voodoo3 3500TV agp, awe64
asus tusl2-c, p3-S 1,4ghz, voodoo5 5500, live!
asus a7n8x DL, barton cpu, 6800ultra, Voodoo3 pci, audigy1
wrote:
wrote:
wrote: I've tested these cards in a benchmark from a racing game and in 3DMark99 Max. I did not test the G400 MAX but just the normal G400.
On a K6-2 450 the V4 is a bit faster, but the TNT2 Ultra and G400 are really close (these three were faster than far newer cards; I tested 20+ cards).
On a P2-450 the G400 was the fastest but, again, they were close.
From a 666 MHz P3 and up, the TNT2 Ultra was the card that kept on scaling, so it won from there, but only by a slight margin.
The TNT2 Ultra should be faster by a wide margin. Both my G400 and G450 dual-head are a bit slower than my Savage 4 (and a lot slower in OpenGL).
Well, it was in 3DMark99. With a 1 GHz Tualatin it scored 30% higher. In an actual game the difference was 10% over the G400. It was about 30% faster than the Voodoo4 in 3DMark too, but the V4 was 1% quicker in the game (Direct3D).
Resolution was 1024x768; it could be that the TNT2 Ultra is quicker at higher resolutions or in 32-bit color. Point is, there isn't much of a difference on slower systems, and you just pick the card you want for your goals.
My tests were done at 800x600. You know, the default resolution.
That resolution seems to be on a par with what most games can run fluidly on old hardware.
wrote: Why is that, exactly? Is it perhaps that Nvidia optimized for Intel?
nVidia has separate codepaths for Intel and AMD in their drivers.
The OpenGL driver actually reports which version it runs. I had an Athlon 1400 for years, and my GeForce2 OpenGL driver always reported 3DNow!
When I moved the card to a PIII-era Celeron machine, it said SSE instead.
See also here: http://www.tomshardware.com/reviews/3dnow,109-5.html
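Since the driver advertises its chosen codepath in the renderer string, a trivial parser can pull it out. This is a hypothetical sketch in Python: the example strings mimic the "GeForce2 GTS/AGP/SSE"-style names described above and are assumptions, not captured driver output.

```python
# Sketch: infer which CPU-specific codepath an old nVidia OpenGL driver
# selected, given its GL_RENDERER string. The string format (card name,
# bus, then an optional CPU-extension token separated by '/') is an
# assumption based on the behaviour described in the post above.

def cpu_codepath(renderer: str) -> str:
    """Return the CPU-extension token embedded in the renderer string, if any."""
    known = {"SSE", "SSE2", "3DNOW!", "MMX"}
    for token in renderer.upper().split("/"):
        if token in known:
            return token
    return "generic"

print(cpu_codepath("GeForce2 GTS/AGP/3DNOW!"))  # 3DNOW!
print(cpu_codepath("GeForce2 GTS/AGP/SSE"))     # SSE
```

On the Athlon 1400 this would report 3DNow!, and on the PIII-era Celeron SSE, matching what the driver showed when the card was moved between machines.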
I only asked because my Voodoo3 gets too hot. I already fitted decent active cooling, but the PCB still gets pretty hot. So I wanted to exchange it for either a Voodoo4/5, a G400 MAX, or a TNT2 Ultra, which are the options for that era. A GeForce 2 Ultra would not be bad, but that's too much. I just don't like the heat generated by the V3; otherwise I am happy with it.
Here's what I saw when I tested cards on a P3 1400 a few years ago. G400 is looking a bit suspicious in its performance.
And Quake2
wrote: Here's what I saw when I tested cards on a P3 1400 a few years ago. G400 is looking a bit suspicious in its performance.
Suspicious in what way?
3dmark99 uses DX6, so GeForce can't flex its T&L-muscle yet.
In OpenGL, T&L does get used. Aside from that, Matrox never had very good OpenGL drivers, while nVidia has been the benchmark.
wrote: Suspicious in what way?
3dmark99 uses DX6, so GeForce can't flex its T&L-muscle yet.
In OpenGL, T&L does get used. Aside from that, Matrox never had very good OpenGL drivers, while nVidia has been the benchmark.
GeForce 256 SDR has a memory bandwidth deficit but almost double the fillrate of G400 Max. I don't see how a G400 can ever win dramatically against a GeForce 256.
Perhaps I will explore this again. I don't have a GF256 SDR anymore though - just the DDR version.
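The fillrate claim is easy to sanity-check with back-of-envelope arithmetic. A rough sketch using commonly cited specs (treat the exact clocks and pipe counts as assumptions): GeForce 256 SDR at 120 MHz with 4 pixel pipes and 166 MHz 128-bit SDR, versus G400 MAX at 150 MHz with 2 pixel pipes and 200 MHz 128-bit SGRAM.

```python
# Back-of-envelope comparison of GeForce 256 SDR vs G400 MAX.
# All spec numbers below are commonly cited figures, not verified.

def fillrate_mpix(core_mhz: int, pipes: int) -> int:
    """Theoretical pixel fillrate in Mpixels/s."""
    return core_mhz * pipes

def bandwidth_gbs(mem_mhz: int, bus_bits: int) -> float:
    """Theoretical memory bandwidth in GB/s."""
    return mem_mhz * 1e6 * bus_bits / 8 / 1e9

gf256_fill, gf256_bw = fillrate_mpix(120, 4), bandwidth_gbs(166, 128)
g400_fill, g400_bw = fillrate_mpix(150, 2), bandwidth_gbs(200, 128)

print(gf256_fill, g400_fill)        # 480 vs 300 Mpix/s (~1.6x for GF256)
print(gf256_bw, g400_bw)            # ~2.66 vs 3.2 GB/s (G400 MAX ahead)
```

So the GeForce has roughly a 1.6x fillrate advantage while the G400 MAX actually holds a modest memory-bandwidth edge over the SDR card, which fits the "bandwidth deficit but much higher fillrate" description.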
wrote: GeForce 256 SDR has a memory bandwidth deficit but almost double the fillrate of G400 Max.
Yea, I'd like to see the feature tests for these cards in 3DMark99.
I figured it out: V-Sync. The GeForce 256 must have had V-Sync enabled. With V-Sync disabled, the GF256 DDR pulls ahead by about 700 points on the Athlon XP I have set up at the moment. I also tested Dethkarz and the results are similar. All at 800x600x16.
I will post more data later on.
Thanks for bringing up the topic of V-Sync.
The V4, ATI cards, and Nvidia cards supported by Forceware 81.98 are the easiest to use. They have V-Sync options in the driver, or an official tool (overclock for the V4).
Nvidia cards supported only by the older Forceware 71.84 (the TNT2, for example) have no V-Sync option for DirectX, only for OpenGL. Neither does Matrox. Matrox has an official tool, but it doesn't work.
So I found a workaround using PowerStrip 3.20. That gets it going on older Nvidia cards and also on Matrox cards. I didn't have much luck with the CoolBits registry tool, and it doesn't work with Matrox cards anyway, making PowerStrip a bit more useful.
Got to praise ATI here: their V-Sync option works perfectly all the way back to the Radeon 7000 (I don't have anything slower or older).
Enabled V-Sync is easy to spot: a score of around 60, 75, or 30 FPS is a good indicator. I always run something with a high frame rate, like Forsaken at 640x480, to make sure it's actually working.
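That "score pinned near the refresh rate" rule of thumb can be written down as a small heuristic. A sketch, with an arbitrary 2% tolerance and a hypothetical list of common refresh rates:

```python
# Heuristic for spotting an accidentally V-Synced benchmark run: a mean
# FPS sitting suspiciously close to a common refresh rate, or to half of
# one (e.g. 30 on a 60 Hz screen), suggests the result was capped.
# The refresh-rate list and 2% tolerance are assumptions.

COMMON_REFRESH_HZ = [60, 75, 85, 100]

def looks_vsynced(avg_fps: float, tolerance: float = 0.02) -> bool:
    """Return True if avg_fps is within tolerance of a (half) refresh rate."""
    for hz in COMMON_REFRESH_HZ:
        for divisor in (1, 2):          # full rate, or half rate when dropping frames
            target = hz / divisor
            if abs(avg_fps - target) <= target * tolerance:
                return True
    return False

print(looks_vsynced(59.7))   # True  - pinned at ~60 Hz
print(looks_vsynced(143.2))  # False - running free
```

Running something known to hit very high frame rates, as suggested above, makes the capped case obvious: a free-running score lands nowhere near these targets.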
Yeah, I am using PowerStrip 2.78 and RivaTuner to control V-Sync. Indeed, the Matrox tool did not work for the G400.
I usually watch for frame rates locked to the refresh rate, but I guess I didn't notice it when I ran the older tests.