VOGONS


First post, by athlon-power

Rank Member

So, I thought I'd start a thread to collect results from other users of hardware released in a similar time-frame to my very late-'90s build. The only rule is that you have to use hardware released on or before the date 3DMark 2000 came out (December 6, 1999). Here's the link where you can download the software, and they even provide the information needed to use the registered versions of this stuff. Guess they figured they didn't really need to protect nearly 19-year-old software.

https://benchmarks.ul.com/legacy-benchmarks

Here are the specs of my system:

Intel SE440BX-2 Motherboard
Intel Pentium III Katmai 500MHz, 100MHz FSB
192MB PC100 RAM
ASUS AGP V3800M 32MB (basically an nVidia TNT2 card)
AOpen AW744L II PCI Sound Card
48X Samsung IDE CD Drive
WD400 Caviar 40GB IDE HDD
Windows 98 SE

Here are some misc. results I got from 3DMark 2000:

800x600 @16bpp, 16-bit textures, 16-Bit Z Buffer, Triple Frame Buffer (All Tests):
2347 3D marks

800x600 @32bpp, 32-bit textures, 24-Bit Z Buffer, Triple Frame Buffer (All Tests):
1569 3D marks

I won't be posting 1024x768 benchmarks unless somebody wants me to, as I really don't feel this is a 1024x768-class machine. It can play things like Half-Life at that resolution, but I prefer it less. This is sort of an experiment to see where my machine stands, but judging from these scores, I'm probably not at the high end of the spectrum, even with a PIII Katmai 500MHz (which, keep in mind, was released February 26th, 1999). By October of 1999, they already had a 733MHz PIII model. The speed at which things moved during this era still boggles me. I'm just wondering how my machine compares with the ones you guys have built/found, etc.

It's curious that the 16-bit benchmark fared so much better than the 32/24-bit one did. I guess this card handles 16-bit color and textures much better than their 32-bit counterparts.
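
For what it's worth, a quick back-of-the-envelope check (a Python sketch using the two scores above) puts the 32-bit penalty at roughly a third:

    # Comparing the two 3DMark 2000 runs above
    score_16bpp = 2347
    score_32bpp = 1569
    drop = 1 - score_32bpp / score_16bpp
    print(f"The 32-bit run is {drop:.0%} slower")  # -> 33%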

Where am I?

Reply 2 of 21, by athlon-power

Rank Member
Koltoroc wrote:

I don't think it is a proper TNT2, but rather a TNT2 "Vanta" or M64. The regular TNT2 should have a significantly smaller performance drop when using 32-bit color depth.

I don't know which one it is, to be honest. The model number is AGP V3800M (DVI) SDRAM. I don't know what the significance of the "(DVI)" designation is, nor of the V before the 3800. I think the "M" might mean it's an M64 card, but I have no idea. I'm going to pull out my old PIII 750 laptop and see how it performs. From what I remember, the ATI Mobility 3 chipset isn't actually too terrible, though the thing gets quite toasty when it's playing something like Half-Life. I actually replaced the thermal paste on it with some new Arctic Silver a while back. Can you believe Dell decided to use a thermal pad to cool that thing?

It's a Dell Latitude C600 from around 2000, so I'll probably edit the original post and give people leeway into early 2000, but no further. I don't want my poor PIII Katmai 500MHz trying to compete with a monster 1.13GHz PIII or a similarly destructive 1.2GHz Athlon. The Katmai would just be vaporized if it tried to fight one of those things.

Where am I?

Reply 4 of 21, by SW-SSG

Rank Oldbie

"DVI" implies your card has a DVI video output. That was still relatively uncommon on most video cards (especially budget-class ones, that which yours appears to be) of that time period, hence why it's on display in the model number.

athlon-power wrote:

... Can you believe Dell decided to use a thermal pad to cool that thing?

This is actually quite normal; usually there is a tiny ~2mm gap between the chip and the heatsink module's surface in these devices that prevents actual physical contact, necessitating a thermal pad to properly fill the space. If you notice your laptop overheating or behaving strangely after swapping the pad for paste, this lack of contact is probably the reason.

Reply 5 of 21, by athlon-power

Rank Member
SW-SSG wrote:

"DVI" implies your card has a DVI video output.

No, that's the weird part: there's only a VGA output on this thing. I thought that at first too, but the lack of a DVI connector makes me think it means something else. I don't think DVI really existed in 1999, though I may be wrong.

Also, the laptop has been transferring heat fine. I figured the heatsink might not be making contact, so I put a small dab of thermal paste on to check, and it spread out completely, in a way that suggested full contact. It still hasn't had issues, so I don't think it should be a problem.

Where am I?

Reply 6 of 21, by Koltoroc

Rank Member

I agree that "DVI" likely means something else in this context. While DVI (the interface) was introduced in 1999, I find it highly unlikely for a budget card to be equipped with one at that point in time.

Reply 7 of 21, by athlon-power

Rank Member

So now the benchmarks on the Dell Latitude C600 are done. It comes in as a somewhat close second, but the desktop outperforms it in most ways. Still, that's a surprisingly large amount of graphical power for a laptop aimed at the business/corporate market. Here are its specs:

Mobile Intel Pentium III Coppermine @750MHz
256MB PC100 RAM
ATI Mobility 3 GPU with 8MB Dedicated VRAM
ESS Maestro 3i Sound Card
3Com 10/100 + v.90 Modem Mini-PCI Card
Hitachi DK23BA-10 10GB 2.5" ATA-100 HDD
Windows 2000 SP1

800x600 @16bpp, 16-bit textures, 16-Bit Z Buffer, Triple Frame Buffer (All Tests):
1766 3D marks

800x600 @32bpp, 32-bit textures, 24-Bit Z Buffer, Triple Frame Buffer (All Tests):
1089 3D marks

Where am I?

Reply 8 of 21, by athlon-power

Rank Member
Koltoroc wrote:

I agree that "DVI" likely means something else in this context. While DVI (the interface) was introduced in 1999, I find it highly unlikely for a budget card to be equipped with one at that point in time.

I'm surprised that DVI is that old. It's funny how long some innovations in the PC industry take to be applied in a widespread fashion; people were still using VGA in 2005, and so on. Still, I wonder what the significance of the designation is. The other cards on eBay had different lettering in parentheses, but the one with "(DVI)" was the only one I found with that designation, so I gambled in the hopes that it might somehow be more powerful than the others. I still don't know if that was a bad decision, but I also got the thing for ~$12 US, so it seemed like a good deal at the time, and it still does.

Where am I?

Reply 9 of 21, by Koltoroc

Rank Member

The reason VGA lasted that long is simple: the transition to TFT screens took that long. Up to the mid-2000s, CRTs were objectively the better choice, thanks to the abysmal color reproduction of affordable TFTs at the time. Sure, there were really good ones, but they were in a similar price range to 4K displays 2-3 years ago.

Reply 10 of 21, by PARKE

Rank Oldbie
athlon-power wrote:

I don't know which one it is, to be honest. The model number is AGP V3800M (DVI) SDRAM. I don't know what the significance of the "(DVI)" designation is, nor of the V before the 3800. I think the "M" might mean it's an M64 card, but I have no idea.

According to the documentation, the 'M' stands for "Magic". This does indeed denote the M64, and it runs at 250MHz instead of the 300MHz of the regular and Ultra versions.

Attachment: v3800-104.pdf (1.3 MiB, 44 downloads, fair use/fair dealing exception)

Reply 11 of 21, by Koltoroc

Rank Member

The clock speed is the least concern. The memory interface has been reduced from 128-bit to 64-bit on the M64 and Vanta chips. That lack of bandwidth is what kills the performance at 32-bit color depth.
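
A rough sketch of the math behind that (Python; the ~150MHz and ~143MHz memory clocks are the commonly quoted retail values, an assumption here, and actual boards varied):

    # Peak memory bandwidth = bus width in bytes x memory clock (SDR)
    bw_tnt2 = (128 // 8) * 150e6  # 128-bit bus -> ~2.4 GB/s
    bw_m64 = (64 // 8) * 143e6    # 64-bit bus  -> ~1.1 GB/s
    print(f"TNT2: {bw_tnt2 / 1e9:.1f} GB/s, M64: {bw_m64 / 1e9:.1f} GB/s")

At 32bpp the frame buffer and texture traffic roughly double compared to 16bpp, so the halved bus is exactly where the M64 runs out of headroom.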

Reply 14 of 21, by RaverX

Rank Member

Fun fact: there are TNT2 M64 cards that do have DVI. Just one example (one of the most common such cards):
http://vgamuseum.ru/gpu/nvidia/nvidia-riva-tn … 64-number-nine/

Does your card look like this?
https://i.ebayimg.com/images/g/RMMAAOSwtUtXBT8g/s-l1600.jpg
There's a place for the DVI connector and a place for the Silicon Image chip (for the DVI).
But I have never seen a V3800M with DVI. I have a few variants; although I'm not a big fan of the M64, if I find them at a very good price I still can't resist buying them.

Reply 15 of 21, by KCompRoom2000

Rank Oldbie
Errius wrote:

The oldest card I have with DVI is the Matrox Millennium G450, which came out in 2000.

I have an ATI Rage 128 Pro 16MB (Apple OEM) video card with DVI-D and it's from 1999.

Admittedly, the DVI interface was in its infancy at the time, so hardly any video cards or monitors had it back then. From what I've seen, it didn't become a common sight on video cards until 2002/03.

Reply 16 of 21, by wouterwashere

Rank Newbie

Intel Pentium III Katmai 600MHz, 100MHz FSB
128MB PC100 RAM
TNT2 M64 32MB
Windows 98 SE

800x600 @16bpp, 16-bit textures, 16-Bit Z Buffer, Triple Frame Buffer (All Tests):
2331 3D marks

800x600 @32bpp, 32-bit textures, 24-Bit Z Buffer, Triple Frame Buffer (All Tests):
1624 3D marks

The High Detail Helicopter Test is a horrible-looking slideshow, even in 16-bit.

Edit: Same system, but this time with an Intel Pentium II Deschutes 333MHz, 66MHz FSB (note that it doesn't have SSE, so 3DMark falls back to plain software T&L instead of the Pentium III-optimized path).

800x600 @16bpp, 16-bit textures, 16-Bit Z Buffer, Triple Frame Buffer (All Tests):
1083 3D marks

800x600 @32bpp, 32-bit textures, 24-Bit Z Buffer, Triple Frame Buffer (All Tests):
1047 3D marks (the 64MB texture test succeeds at 0.0 fps 😢)

Conclusion: In 16-bit, the Pentium III delivers 215% of the performance at 180% of the clock speed on the same system; put differently, it is 19.5% faster clock for clock.
In 32-bit, the Pentium III delivers 155% of the performance at 180% of the clock speed.

Another way to interpret this is that the Pentium II is a bottleneck for the TNT2 M64.
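
The arithmetic behind those percentages, as a quick Python sketch using the scores above:

    # Pentium III 600 vs Pentium II 333 on the same TNT2 M64
    clk = 600 / 333        # ~1.80x the clock speed
    perf16 = 2331 / 1083   # ~2.15x the score in 16-bit
    perf32 = 1624 / 1047   # ~1.55x the score in 32-bit
    print(f"16-bit, clock for clock: {perf16 / clk - 1:+.1%}")  # +19.5%
    print(f"32-bit, clock for clock: {perf32 / clk - 1:+.1%}")  # -13.9% (GPU-limited)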


Reply 18 of 21, by wouterwashere

Rank Newbie

This time I tested with:

AMD Athlon Thunderbird 600MHz (socket A)
256MB PC133 RAM
TNT2 M64 32MB
Windows 98 SE

800x600 @16bpp, 16-bit textures, 16-Bit Z Buffer, Triple Frame Buffer (All Tests):
2303 3D marks

800x600 @32bpp, 32-bit textures, 24-Bit Z Buffer, Triple Frame Buffer (All Tests):
1384 3D marks

Despite the benefit of more and faster RAM, it actually performs a bit worse than the Pentium III. This might have to do with the Athlon's missing SSE or with the driver version.

Now we bring in some raw power: let's overclock the Athlon to 1466MHz and see whether we are at the upper boundary of the graphics card, or whether 600MHz is still a bottleneck for the TNT2 M64.

800x600 @16bpp, 16-bit textures, 16-Bit Z Buffer, Triple Frame Buffer (All Tests):
2439 3D marks

800x600 @32bpp, 32-bit textures, 24-Bit Z Buffer, Triple Frame Buffer (All Tests):
1383 3D marks

With 244% of the clock speed yielding only a 6% performance gain in 16-bit and no gain at all in 32-bit, it is safe to say that we have hit the upper boundary of this card. 😀

Conclusion: 3DMark 2000 is heavily SSE-optimized if an Athlon at 1466MHz can't even come close to a Pentium III 600MHz (only 85% of the performance in 32-bit).

Conclusion 2: The sweet spot for a TNT2 M64 is somewhere around a Pentium III 450-500MHz. I am not very surprised, as the Pentium III and the TNT2 were released around the same time. Maybe someone has a 450 or 500 available for testing. Even more interesting would be to test the difference between a Pentium II 450MHz and a Pentium III 450MHz, as they are virtually identical apart from SSE.
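
And the scaling numbers behind both conclusions, as one last Python sketch from the scores in this thread:

    # Athlon 600 -> 1466 on the same TNT2 M64
    print(f"Clock: {1466 / 600:.2f}x")    # 2.44x
    print(f"16-bit: {2439 / 2303:.2f}x")  # 1.06x -> nearly GPU-bound
    print(f"32-bit: {1383 / 1384:.2f}x")  # 1.00x -> fully GPU-bound
    # Athlon 1466 vs Pentium III 600 at 32-bit color
    print(f"Athlon vs PIII: {1383 / 1624:.0%}")  # 85%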