VOGONS


Fastest AGP video card for Super socket 7 platform?


Reply 80 of 91, by MarmotaArmy

User metadata
Rank Newbie
ruthan wrote on 2019-08-28, 15:40:

Hmm,
I can check the BIOS for the L2 cache, but I doubt the problem is there. Are there some Win98 CPU benchmark results somewhere to compare against, to check whether CPU performance is OK?

Update: I did some CPU testing in my pure DOS benchmarks and pathb - it gives me 27 s, which I would say is as expected. So the BIOS seems to be fine?
(attachment: 2019-08-28 17.42.31.jpg)

I don't have Win98 numbers in my memory, so I don't really know what to expect; it's been 20 years since I had such a machine, and it wasn't fast even then. I don't know whether the CPU or motherboard could be defective in such a way, though theoretically anything is possible. I know that RAM caching has some influence too, but I didn't see any related BIOS settings.
It could be some problem with the 256 MB stick, but memtest was OK for hours.

Have you tried with 128 MB or less RAM?
Your benchmarks should improve a lot. I was testing a K6-2 550 (384 MB RAM) with an Nvidia Quadro NVS 280 PCI in Unreal Tournament and was getting the same 13-14 fps. Then I tested with 128 MB and got 21 fps in the CityIntro timedemo.

Reply 81 of 91, by Minutemanqvs

User metadata
Rank Member

I just came across this thread and have one experience to share. On my GA-5AX (ALi Aladdin V) + K6-III 500 I've had a GeForce FX 5700LE AGP for a very long time; it gives me good results in 3DMark 2000 and the system is stable. I'm running Windows XP. DirectX games run just OK, but I didn't have another decent card to compare.

Recently I installed Quake 3 and...it's absolutely useless. I can't even navigate the menu. It seems to be the same on other OpenGL games.

I tried:
- All kinds of BIOS settings (the usual suspects...): it changed nothing
- ForceWare 50, 60 and 70 series drivers: apart from some BSODs when starting some DirectX games with ForceWare 60 and 70, I gained nothing
- Re-checking that I had the latest ALi/ULi chipset/AGP drivers

Then I took an old Matrox Millennium G400 I got recently for €5, installed the latest Matrox drivers, and... all DirectX games run smoother, and Quake 3 with the default settings runs perfectly fine at 27 fps (demo four). Of course the 3DMark 2000 score is lower, but it doesn't matter.
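
For anyone who wants to reproduce that number, this is the usual procedure from the Quake 3 console (assuming a point-release install that includes the "four" demo):

timedemo 1
demo four

The average fps is printed in the console when the demo finishes.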

So that's a perfect example of how later cards can be more problematic than period-correct ones. I wonder if there are compatible Matrox Parhelias... I'd like to test one if so 😀

Edit: I just had a look at Parhelia prices on eBay...🤣...

Searching for a NexGen Nx586 with FPU; PM me if you have one. I have some Athlon MP systems and cookies.

Reply 82 of 91, by swaaye

User metadata
Rank l33t++

A while back, some people here determined that the GeForce FX cards have higher driver overhead than earlier GeForce cards, and possibly the 6000 series as well. That doesn't explain why it's unusable, though...

The G400 was rather CPU-heavy as well. I would probably use TurboGL, as it is usually a bit faster than the full OpenGL ICD.

Reply 83 of 91, by ph4nt0m

User metadata
Rank Member
swaaye wrote on 2023-05-07, 22:22:

A while back, some people here determined that the GeForce FX cards have higher driver overhead than earlier GeForce cards, and possibly the 6000 series as well. […]

Matrox G200/G400/G450/G550 have hardware triangle setup, the "Warp" engine, and it's even programmable, but Matrox never released detailed specs. Probably not a match for the NVIDIA cards: a RIVA TNT2 Ultra could do 9 million triangles per second, a GeForce 256 about 15 million, and a GeForce2 MX about 20 million. ATI and S3 cards also have hardware triangle setup, but the Rage 128 is slow, pushing only 4 million triangles per second. The Savage4 is quite good at 8 million per second, but hard to use to its full extent due to the constraints of its 64-bit memory bus. The first Radeons were really outstanding at 40 million @ 270 MHz. That's overkill for Super Socket 7 boards anyway.

Voodoo 1 has software triangle setup, the Rush has slow hardware setup, the Voodoo2 has faster hardware, and the Banshee is very much like the Rage 128, doing 4 million triangles per second. The Voodoo3 isn't much different from the Banshee in this regard, though its higher clock speeds help; 8 million @ 183 MHz isn't bad. The Voodoo4/5 are somewhat faster, pushing 11 million.

My Active Sales on CPU-World

Reply 84 of 91, by Putas

User metadata
Rank Oldbie
ph4nt0m wrote on 2023-05-10, 11:14:

Voodoo 1 has software triangle setup...

What makes you think so?
Also, the speed of hardware triangle setup does not explain CPU overhead; either you use it or you don't.

Reply 85 of 91, by ph4nt0m

User metadata
Rank Member
SST-1 Graphics Engine for 3D Game Acceleration wrote:

Triangle subpixel correction is performed in the on-chip triangle setup unit of SST-1. When subpixel correction is enabled (fbzColorPath(26)=1), the incoming starting color, depth, and texture coordinate parameters are all corrected for non-integer aligned starting triangle <x,y> coordinates. The subpixel correction in the triangle setup unit is performed as the starting color, depth, and texture coordinate parameters are read from the PCI FIFO. As a result, the exact data sent from the host CPU is changed to account for subpixel alignments. If a triangle is rendered with subpixel correction enabled, all subsequent triangles must resend starting color, depth, and texture coordinate parameters, otherwise the last triangle’s subpixel corrected starting parameters will be subpixel corrected (again!), and inaccuracies will result.

Triangle setup is performed in software; the Voodoo 1 only performs optional subpixel correction on-chip, which increases the cost from 7 to 16 clocks per triangle. Peak triangle performance is 0.7 million triangles per second @ 50 MHz.

My Active Sales on CPU-World

Reply 86 of 91, by Jasin Natael

User metadata
Rank Oldbie
Minutemanqvs wrote on 2023-04-28, 19:25:
I just came across this thread and have one experience to share. On my GA-5AX (ALI Aladdin V) + K6-III 500 I have a GeForce FX 5 […]

Might I ask why XP?
While a K6-2/III will meet the requirements, it's far from an ideal platform.
Win9x would be a much better fit.
Also, the FX card itself is fine, but the later drivers are what's hurting you. I would try the earliest drivers I could, though you're going to be limited with such a late card.
A GeForce 2 or 2 MX, or even a 3/4, should allow you to run earlier drivers, which alleviates a lot of overhead and saves precious CPU cycles.
Just a suggestion.

Sound card drivers can make a difference as well, if games end up offloading 3D sound onto the CPU instead of doing it in hardware on the sound card itself.
I've no practical experience with Matrox cards, so I'm no help there.

Reply 87 of 91, by Minutemanqvs

User metadata
Rank Member
Jasin Natael wrote on 2023-05-10, 15:51:
Might I ask why XP? While a K6-2/III will meet the requirements, it's far from an ideal platform. Win9x would be a much bette […]

I used to run Windows 2000, but I play quite a lot of GOG games and they tend to have better compatibility on XP, so that's the reason for the choice. I tried drivers down to the oldest one that supports the FX 5700LE, but they are all problematic in one way or another. As stupid as it sounds, I don't have any GeForce 2/4 MX cards to test.

...but this leads me to my tests from yesterday. I took out an ATI Radeon 9200, also a very cheap card, and tested several Catalyst driver versions:
- 6.11: BSOD in DirectX in ati3duag.dll > known problem, the driver executes SSE instructions the K6 doesn't support (see the sketch below). This is the driver you get from AMD's website.
- 6.2: BSOD in DirectX in ati3duag.dll > same known SSE problem
- 5.13 "Classic": OK. 36.4 fps in Q3, 2233 points in 3DMark 2000
- 5.5 "Classic": OK. 35 fps in Q3, 2291 points in 3DMark 2000

So this cheap Radeon 9200 (probably the 9250 too) is actually the most compatible and best-performing card I've ever had in my K6-III system. It's rock stable with the Catalyst "Classic" (without CCC) 5.13 drivers; I tested it in around 10 games, Direct3D and OpenGL, and it works flawlessly.
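
A side note on why those two Catalyst builds die on this CPU: the K6-III has MMX and 3DNow! but no SSE, so any unguarded SSE instruction in the driver raises an invalid-opcode fault. As a rough illustration (my own sketch, nothing to do with ATI's actual code), this is the kind of CPUID check an SSE code path needs before it can be taken safely, written with GCC's cpuid.h:

#include <stdio.h>
#include <cpuid.h>

int main(void)
{
    unsigned int eax, ebx, ecx, edx;

    /* CPUID leaf 1: feature flags come back in EDX. */
    if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx)) {
        puts("CPUID leaf 1 not supported");
        return 1;
    }

    /* EDX bit 23 = MMX, bit 25 = SSE. A K6-III sets the MMX bit
       (and reports 3DNow! via the extended leaf) but leaves SSE clear. */
    printf("MMX: %s\n", (edx & (1u << 23)) ? "yes" : "no");
    printf("SSE: %s\n", (edx & (1u << 25)) ? "yes" : "no");

    /* A code path that skips this check and executes an SSE instruction
       anyway gets an invalid-opcode fault (#UD) on this CPU. */
    if (!(edx & (1u << 25)))
        puts("No SSE - an unguarded SSE code path would fault here.");

    return 0;
}

On a K6-III this prints "SSE: no", which matches the ati3duag.dll crashes above.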

Searching for a NexGen Nx586 with FPU; PM me if you have one. I have some Athlon MP systems and cookies.

Reply 88 of 91, by Putas

User metadata
Rank Oldbie
ph4nt0m wrote on 2023-05-10, 15:41:

Triangle setup is performed in software; the Voodoo 1 only performs optional subpixel correction on-chip, which increases the cost from 7 to 16 clocks per triangle. […]

Thanks, somehow I forgot how little the Voodoo was actually doing on this front.

Reply 89 of 91, by ph4nt0m

User metadata
Rank Member
Minutemanqvs wrote on 2023-05-11, 06:31:

So this cheap Radeon 9200 (probably the 9250 too) is actually the most compatible and best-performing card I've ever had in my K6-III system. […]

The Radeon 8500 / 9100 is also a good choice, as it comes with 4x2 TMUs versus the 4x1 of the 9000 / 9200 / 9250, but it needs active cooling. If pixel shaders aren't required, though, a classic Radeon DDR / 7200 will do just fine.

My Active Sales on CPU-World

Reply 90 of 91, by Minutemanqvs

User metadata
Rank Member
ph4nt0m wrote on 2023-05-11, 09:42:

The Radeon 8500 / 9100 is also a good choice, as it comes with 4x2 TMUs versus the 4x1 of the 9000 / 9200 / 9250, but it needs active cooling. […]

Honestly, I don't think it matters in a K6 system; any of these cards with stable drivers will do the job just fine. The 9200/9250 are dirt cheap and 3 years newer than the CPU.

Searching for a NexGen Nx586 with FPU; PM me if you have one. I have some Athlon MP systems and cookies.

Reply 91 of 91, by Jasin Natael

User metadata
Rank Oldbie

I've had good luck with Radeon cards in most of my systems; I even keep a 9250 SE (128-bit) PCI version in my spare machine, paired with a Voodoo3 AGP.
Unfortunately, I haven't been able to get AGP to work with my 7200/7500/FireGL 8800 (using 8500 drivers) on either of my SS7 machines.
The cards work fine, the drivers install, and Direct3D works, but there's no AGP mode.
These are MVP3 boards; I've tried all of the VIA 4-in-1 driver versions, installed in different orders, and multiple different display drivers, but no dice.
Oh well, such is life.