VOGONS


Not Another Ultimate Windows 98 Build


Reply 40 of 106, by VDNKh

User metadata
Rank Newbie
Joseph_Joestar wrote on 2022-04-05, 07:52:
VDNKh wrote on 2022-04-05, 01:37:

I've also been trying to make heads or tails of the memory pointers in the registers, but I'm not a computer scientist. Relevant ones are in red. The top two are listed as the "VIA Standard Host Bridge" in Windows and control AGP and GART. In red is the Base GART Low Address. The bottom two are PCI/PCI bridges that show up as "VIA CPU to AGP2.0/AGP3.0 Controller" in Windows (confused yet?). They don't expose any AGP configuration, but the registers in red are also partially responsible for mapping the GART. I think... I hope that someone who really knows what they're looking at can help out. I've searched Google far and wide for this problem, and while many report it, no solutions are provided.

One thing does come to mind though. Assuming that you get proper performance under WinXP, it might be worth checking how these devices are configured under that OS in comparison to Win98.

It's flawless in XP SP3. AGP memory is correctly listed and I get full performance. I made a post about it here. I figured out that the skewed performance I saw in 98, like the very high 3D transfer rate on an older driver and the high Readback: Small bandwidth, was a result of caching. Interestingly, in some driver version between 45.23 and 61.76, NVidia implemented their own GART driver, and I get a grand total of (IIRC) 32 MB of AGP aperture. It does help performance a lot (~17,100 vs. ~12,500 in 3DMark01), but it's still a far cry from what a 5950U is fully capable of. It's another piece I can investigate to see if I can get VIA's implementation to work properly in 98.

I did do a comparison to XP, but it uses the UAGP 3.5 driver, which didn't seem like a good comparison. With what I know now, though, I'm going to take a more careful look at it.

Reply 41 of 106, by leonardo

User metadata
Rank Member
VDNKh wrote on 2021-11-22, 03:51:

GPU: Gainward FX5950 Ultra Gold Sample
This is the second fastest FX series GPU you can buy, the fastest being the liquid cooled variant of this card. Caught it on eBay when looking for GPUs on a whim, I don't want to say how much it was. I went with the FX series over the 6000 series for better game compatibility. I edited the 45.23 Nvidia drivers to work on this card to maximize compatibility as well. This was a requirement for the NFS games and Thief.
When it first arrived I benchmarked it to see if it was defective or not. While the electronics were fine, the original thermal paste was way past its prime. I was getting temperatures of above 110 C. Very carefully, I removed the epoxied thermal blocks and cleaned off any residue on the card and blocks. I had to soak the blocks in acetone to get all the epoxy off. After that fresh paste was applied and it was benchmarked again. The fan's bearings were also worse for wear. They had to be popped back into their correct position by squeezing them into the shroud. I guess years of spinning upside down will dislodge it slowly. Cleaned out the used grease with WD40 and left it at that. Now I get temps in the 90 C range.

...I have a stupid question: How come all the Dream W98-builds always feature an FX-series GeForce? I don't mean that in a derogatory way, I just remember the FX-series being around the time when ATi clearly had the upper hand. Radeon 9700 Pro and 9800 Pro were the cards to have back when the FX was the nVidia flagship, at least based on my remembrance of the reviews at the time - or did they still have their drivers so screwed up that even to this day it's not worth it?

I think the Radeon 9600 Pro was the quintessential enemy of the GeForce 4 Ti-lineup, making the 9700/9800-series a direct competitor to the effexes.

[Install Win95 like you were born in 1985!] on systems like this or this.

Reply 42 of 106, by BitWrangler

User metadata
Rank l33t++

It's a case of being preferred for backward compatibility more than anything I think. I guess 9600s are a bit of a bargain considering 9800s die suddenly and 4x00ti cards are getting more expensive. I would say they're best in a "good for 2003" XP system though or something like that.

Edit: oh yeah, I got an AMD64 notebook with an X300 in about '05 and played a lot of "bargain bin" titles on it, which I guess may have had some fixes, and probably nothing was much older than '99, but I never had noticeable backward compatibility problems with it, and that's supposed to be the same tech as the 9500 and up.

Unicorn herding operations are proceeding, but all the totes of hens teeth and barrels of rocking horse poop give them plenty of hiding spots.

Reply 43 of 106, by VDNKh

User metadata
Rank Newbie
leonardo wrote on 2022-04-05, 22:06:
VDNKh wrote on 2021-11-22, 03:51:

GPU: Gainward FX5950 Ultra Gold Sample
This is the second fastest FX series GPU you can buy, the fastest being the liquid cooled variant of this card. Caught it on eBay when looking for GPUs on a whim, I don't want to say how much it was. I went with the FX series over the 6000 series for better game compatibility. I edited the 45.23 Nvidia drivers to work on this card to maximize compatibility as well. This was a requirement for the NFS games and Thief.
When it first arrived I benchmarked it to see if it was defective or not. While the electronics were fine, the original thermal paste was way past its prime. I was getting temperatures of above 110 C. Very carefully, I removed the epoxied thermal blocks and cleaned off any residue on the card and blocks. I had to soak the blocks in acetone to get all the epoxy off. After that fresh paste was applied and it was benchmarked again. The fan's bearings were also worse for wear. They had to be popped back into their correct position by squeezing them into the shroud. I guess years of spinning upside down will dislodge it slowly. Cleaned out the used grease with WD40 and left it at that. Now I get temps in the 90 C range.

...I have a stupid question: How come all the Dream W98-builds always feature an FX-series GeForce? I don't mean that in a derogatory way, I just remember the FX-series being around the time when ATi clearly had the upper hand. Radeon 9700 Pro and 9800 Pro were the cards to have back when the FX was the nVidia flagship, at least based on my remembrance of the reviews at the time - or did they still have their drivers so screwed up that even to this day it's not worth it?

I think the Radeon 9600 Pro was the quintessential enemy of the GeForce 4 Ti-lineup, making the 9700/9800-series a direct competitor to the effexes.

It's a matter of driver support and DX7/8 performance. The FX series was the last family of cards to support the 45.23 driver, which is very stable, has pretty good W98 compatibility, and works well with most mid-to-late 90s games. Popular games like Thief and NFS 3 & 4 require that driver to work on 98 (in my experience). ATI drivers for 98 in the same time period, from what I've read, just suck compared to NVidia's. Also, the FX's DX9 performance was worse than the Radeons' at the time, but it pulled ahead in DX8 and DX7, which, if you're playing games made to run on 98, is what you're generally going to be using.

Edit: There are builds that feature an X850 XT for the people that say: "screw compatibility, I want to max out 3DMark03". But 98 support for that card is literally a beta driver release, and those cards have a propensity to randomly die. I have a friend that has gone through SIX X850 XTs that were either DOA or died after some heavy use.

Reply 44 of 106, by Joseph_Joestar

User metadata
Rank l33t
leonardo wrote on 2022-04-05, 22:06:

...I have a stupid question: How come all the Dream W98-builds always feature an FX-series GeForce? I don't mean that in a derogatory way, I just remember the FX-series being around the time when ATi clearly had the upper hand. Radeon 9700 Pro and 9800 Pro were the cards to have back when the FX was the nVidia flagship, at least based on my remembrance of the reviews at the time - or did they still have their drivers so screwed up that even to this day it's not worth it?

In addition to what has been said about the 45.23 drivers, the GeForce FX is the last Nvidia card which supports both table fog and paletted textures. Those legacy features are used by certain Win9x games, and ATi Radeon cards do not support them. This can cause some visible differences in rendering:

file.php?id=117957&mode=view

Of the two, table fog seems to be more prominent and has a larger impact on the visuals. You can enable it on ATi cards through some registry tweaks, but the results are imperfect under Win9x, and that workaround doesn't function with every game.

I think the Radeon 9600 Pro was the quintessential enemy of the GeForce 4 Ti-lineup, making the 9700/9800-series a direct competitor to the effexes.

If we only look at speed, ATi is certainly better. But in terms of compatibility, even some early WinXP games like Splinter Cell favor Nvidia, especially the GeForce4 Ti and GeForce FX cards. This is an edge case though, and I wouldn't outright dismiss ATi cards just because of that.

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Athlon64 3400+ / Asus K8V-MX / 5900XT / Audigy2
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi

Reply 46 of 106, by VDNKh

User metadata
Rank Newbie

Small update.

Spoiler: IMG_20220506_185707806_HDR.jpg (1.1 MiB, CC-BY-4.0)

I got a great CRT! A Dell P992. It can push 1600x1200 at 85 Hz and 1024x768 at 120 Hz. Been loving it. Got it off Craigslist locally for half of what it's probably worth. The picture is still great and there's not even a scratch on the outer shell. The previous owner was the original buyer and said that in his 20 or so years of owning it, he never opened it or cleaned it in any way. Is there any kind of preventative maintenance I should do?

Not pictured: I also got a Periboard-107, so now I have a numpad. I can't believe they still make PS/2 keyboards.

The fix for my AGP memory issue still eludes me. I have made more sense of the registers.

AGP Target Registers

file.php?id=134406&mode=view

From my previous post: the registers in red at the bottom (PCI bridge) are not associated with the GART. The registers in row 10 on top (host bridge) are the GART virtual addresses that Windows sees. 08 00 00 F0 is F0000008 read as a dword. The 8 at the end denotes that it's a 128 MB aperture. This is seen in Windows as the memory allocation F0000000-F9FFFFFF.
The 00 F0 FB 00 (dword: 00FBF000) is the GART physical remapped address. Depending on what driver I use, this can change. I was using an older driver when I made this comparison; that's why it's very different. The most recent AGP driver produces a remapped address similar to Joseph_Joestar's. Still doesn't work, though.
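For anyone following along, the byte-order gymnastics above can be sketched in a few lines of Python. The size-field interpretation is my assumption, based only on the observation that the low nibble 0x8 corresponds to a 128 MB aperture:

```python
# Decode an AGP aperture base register from the raw little-endian bytes
# shown in wpcredit (e.g. "08 00 00 F0" reads as dword 0xF0000008).
import struct

def decode_aperture(raw: bytes):
    dword = struct.unpack("<I", raw)[0]   # bytes in register order -> dword
    base = dword & 0xFFF00000             # aperture base address
    size_field = dword & 0xF              # low bits: size code (assumed)
    return dword, base, size_field

dword, base, size_field = decode_aperture(bytes([0x08, 0x00, 0x00, 0xF0]))
print(f"dword=0x{dword:08X} base=0x{base:08X} size_field=0x{size_field:X}")
# -> dword=0xF0000008 base=0xF0000000 size_field=0x8
```

That matches what Windows reports: a base of F0000000 and (per the post's reading) a 128 MB aperture.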

I've taken a second look at XP's AGP configuration. The memory allocations are similar but some differences stand out:

  • The AGP controller takes the GART memory allocation in Windows (F0000000) instead of the Host Bridge. I'll chalk this up to XP's PCI driver or memory manager doing things a little differently, plus Joseph_Joestar's comparison proves either way works, but....
  • pci.sys takes an active role in the AGP driver. Just from looking at Device Manager I can see that viaagp1.sys and pci.sys interface with AGP, so there's more going on here at the OS level that I can't see. This is probably due to Microsoft's UAGP 3.5 standard, which this motherboard was built for. I should test with a version of XP that pre-dates UAGP 3.5.
  • The Microsoft PCI driver's memory allocation includes the GART address in its range. I don't have a screenshot or copypasta handy, but it was some crazy range like 000F0000-F9FFFFFF.
  • The mystery Motherboard Resource that consumes E0000000-EFFFFFFF is still there in XP. I think it might be the V-Link, but I'm not sure. All XP says is that it's on the PCI bus somewhere. It doesn't seem to be a problem there, but I'm still suspicious.

At this point I need to calmly sit down, brew a pot of coffee, and learn how to use a kernel debugger for a 23-year-old operating system. Any recommendations on the best kind of setup for this problem? Can I run 98 or NT in a virtual machine on my laptop and connect it with a USB to DB-9 adapter? Or do I really need a second machine that supports 98 or NT? SoftICE, as I understand it, does not require a second computer, but it seems kind of limited when it comes to debugging drivers at boot.

Edit: Tried the original XP, with no service packs and before UAGP 3.5 was implemented, and it still performs normally. Resource allocations are the same. I downloaded the 98/ME DDK and will play around with it at some point.

Reply 47 of 106, by AlexZ

User metadata
Rank Member
VDNKh wrote on 2022-05-19, 05:58:

From my previous post: the registers in red at the bottom (PCI bridge) are not associated with the GART. The registers in row 10 on top (host bridge) are the GART virtual addresses that Windows sees. 08 00 00 F0 is F0000008 read as a dword. The 8 at the end denotes that it's a 128 MB aperture. This is seen in Windows as the memory allocation F0000000-F9FFFFFF.
The 00 F0 FB 00 (dword: 00FBF000) is the GART physical remapped address. Depending on what driver I use, this can change. I was using an older driver when I made this comparison; that's why it's very different. The most recent AGP driver produces a remapped address similar to Joseph_Joestar's. Still doesn't work, though.

I reproduced the missing AGP video memory problem on my MSI K8T Neo V (even with the oldest BIOS, 7032v11); however, I'm not convinced this issue affects games rather than just synthetic benchmarks. From what I know, AGP memory is meant to be used only if textures don't fit into video memory. You have three kinds of memory from the graphics card's point of view: local video memory (period-correct is usually 32MB), AGP memory (system memory directly accessible to the graphics card via AGP up to a certain limit, configured via the AGP aperture size), and system memory (slowest access).

Whether your textures end up in video memory or in AGP memory is meant to be transparent to games. The problem for games is that they have certain video memory requirements, but every user has a graphics card with a different memory size (16-64MB). One way to account for insufficient video memory is to switch to low-detail textures; another is to use AGP memory (basically having just one texture size and letting it spill over into AGP memory). AGP 4x had about the same peak bandwidth as PC133 memory, and AGP 8x about the same as DDR266. A problem with the AGP memory solution is that the aperture size was configurable in BIOS setup, so every user could use a different size. In theory the highest value lower than the system memory size is best; being set automatically to the system memory size would have been ideal. The size limit prevented games from loading hundreds of megabytes of textures onto 32MB cards. Another problem was that graphics cards with insufficient memory were more likely to be running at a slow AGP speed (e.g. 4x instead of 8x, simply by being older), making the AGP memory solution inefficient. A much better solution for the player is to use low-detail textures.
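For a rough sanity check on those pairings, the peak theoretical figures can be compared directly. These are textbook peak numbers for a 64-bit memory bus, not measured throughput:

```python
# Peak theoretical bandwidths (MB/s) of AGP transfer modes vs
# contemporary system memory, to see which memory type each AGP
# speed roughly matches. Textbook peaks; real throughput is lower.
AGP = {"1x": 266, "2x": 533, "4x": 1066, "8x": 2133}
RAM = {"PC133": 1066, "DDR266": 2133, "DDR400": 3200}

for mode, bw in AGP.items():
    # pick the memory type with the closest peak bandwidth
    closest = min(RAM, key=lambda r: abs(RAM[r] - bw))
    print(f"AGP {mode}: {bw} MB/s (closest RAM: {closest})")
```

By these numbers, AGP 4x lines up with PC133 and AGP 8x with DDR266, so AGP memory was never faster than the system RAM it borrowed.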

A synthetic benchmark is meant to take AGP memory usage into account, as some games could end up using it, especially if the graphics card has insufficient video memory. In order to do that, you would have to detect the video memory size and fill it up with garbage textures that do not really end up being used (are not visible) - to simulate your game's textures being located in both video memory and AGP memory. But that may not necessarily be happening in games if we use FX cards with 128MB of video memory rather than a period-correct 32MB. Back then you would have about 128MB of system memory and about half of it available as AGP memory. Unless you play a game that requires a lot of video memory, you should be fine with 128MB in Windows 98 even without a working AGP aperture. We may be chasing a red herring here.

Pentium III 900E, ECS P6BXT-A+, 384MB RAM, NVIDIA GeForce FX 5600 128MB, Voodoo 2 12MB, 80GB HDD, Yamaha SM718 ISA, 19" AOC 9GlrA
Athlon 64 3400+, MSI K8T Neo V, 1GB RAM, NVIDIA GeForce 7600GT 512MB, 250GB HDD, Sound Blaster Audigy 2 ZS

Reply 48 of 106, by Joseph_Joestar

User metadata
Rank l33t
AlexZ wrote on 2022-08-21, 15:31:

I reproduced the missing AGP video memory problem on my MSI K8T Neo V (even with the oldest BIOS 7032v11), however I'm not convinced whether this issue is also affecting games rather than just synthetic benchmarks.

It does. Back when I first encountered this, I measured almost a 50% difference in Quake 2 performance with the bug (newest BIOS) and without the bug (oldest BIOS) on my K8V-MX.

You can check this very easily if you dual boot WinXP and Win98. If you have the bug, game performance will be around 50% faster in WinXP compared to Win98, using the exact same driver version. Without the bug, games are only 5% faster under WinXP, which is normal due to the better OS architecture. For reference, this was with a GeForce4 Ti4200 using 45.23 drivers.
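That dual-boot rule of thumb can be written down as a tiny (hypothetical) classifier; the 25% cutoff splitting "~50% gap" from "~5% gap" is purely illustrative:

```python
# Formalizing the rule of thumb: with the same card and driver version,
# a ~50% XP-over-98 gap suggests the broken AGP aperture, while ~5%
# is normal OS overhead. The threshold here is an illustrative choice.
def agp_bug_suspected(fps_xp: float, fps_98: float, threshold: float = 0.25) -> bool:
    ratio = (fps_xp - fps_98) / fps_98   # relative XP advantage
    return ratio > threshold

print(agp_bug_suspected(150.0, 100.0))  # ~50% gap -> True
print(agp_bug_suspected(105.0, 100.0))  # ~5% gap  -> False
```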

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Athlon64 3400+ / Asus K8V-MX / 5900XT / Audigy2
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi

Reply 49 of 106, by AlexZ

User metadata
Rank Member
Joseph_Joestar wrote on 2022-08-21, 16:21:

It does. Back when I first encountered this, I measured almost a 50% difference in Quake 2 performance with the bug (newest BIOS) and without the bug (oldest BIOS) on my K8V-MX.

You can check this very easily if you dual boot WinXP and Win98. If you have the bug, game performance will be around 50% faster in WinXP compared to Win98, using the exact same driver version. Without the bug, games are only 5% faster under WinXP, which is normal due to the better OS architecture. For reference, this was with a GeForce4 Ti4200 using 45.23 drivers.

I understand this can be happening in a few games. But we cannot draw general conclusions based on a single game - Quake 2. I wonder if it affects the rendering of selected static game scenes where all textures fit into video memory - it shouldn't. Benchmarks (even Quake 2) could be affected by using AGP memory artificially (e.g. doubling the texture count on purpose so the textures don't fit into video memory). Alternatively, missing AGP memory perhaps also affects texture loading into video memory. It's hard to say without having someone here knowledgeable in the NVidia driver source code.

I don't use my Athlon 64 with Windows 98, I only installed it temporarily to shed more light on this issue. It doesn't have any games on it. It was always meant to be a cheap Windows XP rig.

Pentium III 900E, ECS P6BXT-A+, 384MB RAM, NVIDIA GeForce FX 5600 128MB, Voodoo 2 12MB, 80GB HDD, Yamaha SM718 ISA, 19" AOC 9GlrA
Athlon 64 3400+, MSI K8T Neo V, 1GB RAM, NVIDIA GeForce 7600GT 512MB, 250GB HDD, Sound Blaster Audigy 2 ZS

Reply 50 of 106, by Joseph_Joestar

User metadata
Rank l33t
AlexZ wrote on 2022-08-21, 17:00:

I understand this can be happening in a few games. But we cannot draw general conclusions based on a single game - Quake 2.

It wasn't just Quake 2, that's just the most prominent example. I had huge performance differences (all around 50%) in Unreal Tournament 99 and Drakan too, and of course in all of the 3DMark editions.

With Quake 2, the performance was immediately suspect to me because I had been using a GeForce4 Ti4200 in my AthlonXP 1700+ system beforehand. When I moved that Ti4200 to the Athlon 64 3000+ system (with the bug) Quake 2 performance was actually below what I had on the AthlonXP. In contrast, when using the oldest BIOS for the Athlon 64 system (no bug), it wipes the floor with the AthlonXP using the exact same Ti4200 card.

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Athlon64 3400+ / Asus K8V-MX / 5900XT / Audigy2
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi

Reply 51 of 106, by VDNKh

User metadata
Rank Newbie
AlexZ wrote on 2022-08-21, 15:31:

A synthetic benchmark is meant to take AGP memory usage into account, as some games could end up using it, especially if the graphics card has insufficient video memory. In order to do that, you would have to detect the video memory size and fill it up with garbage textures that do not really end up being used (are not visible) - to simulate your game's textures being located in both video memory and AGP memory. But that may not necessarily be happening in games if we use FX cards with 128MB of video memory rather than a period-correct 32MB. Back then you would have about 128MB of system memory and about half of it available as AGP memory. Unless you play a game that requires a lot of video memory, you should be fine with 128MB in Windows 98 even without a working AGP aperture. We may be chasing a red herring here.

In most games that came out for 98, the performance is fine. However, without AGP memory mapping, the GPU loses the ability to use Direct Memory Execute (DiME). There are a few games (the NFS games in particular) that really seem to lean on this feature. Despite being from 1999, NFS IV cannot run at a consistent 30 FPS without proper AGP support. NFS: PU is better at maintaining 60 FPS but can dip below 30 from time to time. Thief, Deus Ex, and Max Payne don't seem to care about missing DiME, or the performance hit isn't noticeable at 60 FPS.

From an old AnandTech article:

One of the major advantages of AGP over PCI is its ability to support Direct Memory Execute of textures, otherwise known as DIME. Whenever you draw pixels in a 3D arena, you have two options, they are either (1) texture-mapped or (2) drawn using shading. When you are drawing pixels using the latter option, you don't have to worry about calling or fetching bitmaps to physically draw the pixel. However, when using texture mapped pixels you quickly become aware of the bottlenecks of the standard PCI graphics subsystem. Because of DIME, AGP accelerators (not all, see The Real Benefits of AGP) can use and manipulate your system RAM directly whenever the need for intense texture-mapping functions are in demand.

Reply 52 of 106, by AlexZ

User metadata
Rank Member
VDNKh wrote on 2022-08-21, 17:12:

In most games that came out for 98, the performance is fine. However, without AGP memory mapping, the GPU loses the ability to use Direct Memory Execute (DiME). There are a few games (the NFS games in particular) that really seem to lean on this feature. Despite being from 1999, NFS IV cannot run at a consistent 30 FPS without proper AGP support. NFS: PU is better at maintaining 60 FPS but can dip below 30 from time to time. Thief, Deus Ex, and Max Payne don't seem to care about missing DiME, or the performance hit isn't noticeable at 60 FPS.

This still seems like an insufficient explanation for the performance drop, especially in NFS. We have AGP 8x, which is at least 16 times faster than PCI, a 6x faster CPU, and 4x faster RAM on an Athlon 64 than on a period-correct rig from 1998. Back in the day, the Riva TNT / TNT2 were also released in PCI versions, thus not benefiting from any AGP features. I don't remember the PCI versions getting flak for much poorer performance due to missing AGP memory or even DiME. NFS and Quake 2 were played on those cards back then.

Pentium III 900E, ECS P6BXT-A+, 384MB RAM, NVIDIA GeForce FX 5600 128MB, Voodoo 2 12MB, 80GB HDD, Yamaha SM718 ISA, 19" AOC 9GlrA
Athlon 64 3400+, MSI K8T Neo V, 1GB RAM, NVIDIA GeForce 7600GT 512MB, 250GB HDD, Sound Blaster Audigy 2 ZS

Reply 53 of 106, by VDNKh

User metadata
Rank Newbie
AlexZ wrote on 2022-08-21, 17:43:
VDNKh wrote on 2022-08-21, 17:12:

In most games that came out for 98, the performance is fine. However, without AGP memory mapping, the GPU loses the ability to use Direct Memory Execute (DiME). There are a few games (the NFS games in particular) that really seem to lean on this feature. Despite being from 1999, NFS IV cannot run at a consistent 30 FPS without proper AGP support. NFS: PU is better at maintaining 60 FPS but can dip below 30 from time to time. Thief, Deus Ex, and Max Payne don't seem to care about missing DiME, or the performance hit isn't noticeable at 60 FPS.

This still seems like an insufficient explanation for the performance drop, especially in NFS. We have AGP 8x, which is at least 16 times faster than PCI, a 6x faster CPU, and 4x faster RAM on an Athlon 64 than on a period-correct rig from 1998. Back in the day, the Riva TNT / TNT2 were also released in PCI versions, thus not benefiting from any AGP features. I don't remember the PCI versions getting flak for much poorer performance due to missing AGP memory or even DiME. NFS and Quake 2 were played on those cards back then.

It doesn't matter how high the AGP bus bandwidth is or how many instructions per second the CPU can push if the bottleneck is essentially memory latency. When benchmarking earlier, I noted how tightening the timings and boosting the FSB made a relatively big impact on performance. In 3DMark's reporting, XP shows the correct AGP aperture while 98 reports 0 bytes. In ArchMark's reporting, XP is in AGP mode and 98 is in PCI mode. Finally, in 98, whether I install the AGP driver or not has no impact on performance at all. To be really thorough, I could try disabling the AGP aperture in XP to see if the performance matches, but I think it's unnecessary.

I tried looking at contemporary benchmarks of the Riva TNT / TNT2 and it seems that they would have been bottlenecked by 3D performance before the memory interface would have mattered.

Reply 54 of 106, by Bruno128

User metadata
Rank Member

I am using a modest FX5200 AGP in that motherboard with the 3.19a BIOS and am also experiencing this issue. One way to look at it is that there is hardly any game which explicitly requires Win98 while also pushing the AGP card to the max, and one could just boot into XP for more demanding stuff... but the persistence of the issue and the benchmark results demonstrate that there is, without a doubt, a significant AGP performance drop regardless, with either the FX5200 or an FX5900 Ultra.
I know that sounds weird, but has anyone managed to narrow the issue down to the newer BIOSes specifically? Are we looking in the right place? Or is that a coincidence, and it's somehow related to the RAM amount / Core2Duo CPU used on beta-BIOS systems? (the main reason to go beta, I presume)

Now playing: Red Faction on 2003 Acrylic build


SBEMU compatibility reports

Reply 55 of 106, by Joseph_Joestar

User metadata
Rank l33t
Bruno128 wrote on 2022-08-21, 18:49:

I know that sounds weird, but has anyone managed to narrow the issue down to the newer BIOSes specifically?

I can't speak for other motherboards, but on my K8V-MX, the oldest BIOS gives you full GPU performance under Win9x. Any BIOS version newer than that has the bug which effectively halves GPU performance under Win9x.

I think @blooedm researched this a bit more and came to the conclusion that something in the microcode got changed at some point in 2005 (exact date unknown). It also seems to vary by motherboard vendor, as certain manufacturers integrated the problematic microcode sooner than others.

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Athlon64 3400+ / Asus K8V-MX / 5900XT / Audigy2
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi

Reply 56 of 106, by AlexZ

User metadata
Rank Member
Joseph_Joestar wrote on 2022-08-21, 19:12:

I can't speak for other motherboards, but on my K8V-MX, the oldest BIOS gives you full GPU performance under Win9x. Any BIOS version newer than that has the bug which effectively halves GPU performance under Win9x.

I think @blooedm researched this a bit more and came to the conclusion that something in the microcode got changed at some point in 2005 (exact date unknown). It also seems to vary by motherboard vendor, as certain manufacturers integrated the problematic microcode sooner than others.

That microcode suggestion is just speculation, like the other possible causes. I suspect changed chipset settings that confuse the VIA AGP driver and cause it to work in a crippled mode.

It would be great if you or someone else who can reproduce the issue provided a dump of the wpcredit chipset settings with the BIOS configuration exactly the same (e.g. after loading optimal settings) - for both cases. The old BIOS has to be configuring something differently. Another option would be debugging the VIA AGP driver, but that's a lot more challenging.
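Comparing two such dumps could be scripted. A minimal sketch, assuming each dump is transcribed as a hex string of PCI configuration space bytes (the values below are made up for illustration):

```python
# Diff two chipset register dumps (e.g. old BIOS vs new BIOS) taken
# with the same settings loaded, reporting every differing offset.
def parse_dump(hex_text: str) -> bytes:
    """Parse a space-separated hex transcript (as from wpcredit) to bytes."""
    return bytes(int(tok, 16) for tok in hex_text.split())

def diff_dumps(old: bytes, new: bytes):
    """Return (offset, old_value, new_value) for every byte that changed."""
    return [(off, o, n) for off, (o, n) in enumerate(zip(old, new)) if o != n]

# Illustrative data only - not real VIA register contents.
old = parse_dump("06 11 05 32 07 00 30 22")
new = parse_dump("06 11 05 32 07 00 B0 22")
for off, o, n in diff_dumps(old, new):
    print(f"offset 0x{off:02X}: 0x{o:02X} -> 0x{n:02X}")
```

With a handful of differing offsets in hand, the VIA datasheet could then be consulted to see which chipset feature each one controls.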

Pentium III 900E, ECS P6BXT-A+, 384MB RAM, NVIDIA GeForce FX 5600 128MB, Voodoo 2 12MB, 80GB HDD, Yamaha SM718 ISA, 19" AOC 9GlrA
Athlon 64 3400+, MSI K8T Neo V, 1GB RAM, NVIDIA GeForce 7600GT 512MB, 250GB HDD, Sound Blaster Audigy 2 ZS

Reply 57 of 106, by Bruno128

User metadata
Rank Member
Joseph_Joestar wrote on 2022-08-21, 19:12:

the oldest BIOS gives you full GPU performance under Win9x.

Ok. Though I still wonder if it is specifically a combination of BIOS + nVidia ForceWare. Did you only confirm the issue with the GeForce FX cards in your system? Or can you say the same about the GF4 series, Radeons, or more exotic AGP cards such as those from Matrox or S3?

Now playing: Red Faction on 2003 Acrylic build


SBEMU compatibility reports

Reply 58 of 106, by Joseph_Joestar

User metadata
Rank l33t
AlexZ wrote on 2022-08-21, 19:30:

I suspect changed chipset settings that confuse the VIA AGP driver and cause it to work in a crippled mode.

Unlikely. While I was still using the newest BIOS, I tested many different settings and nothing helped.

One of the first things that I tried was to load the BIOS defaults but that had no effect. On the other hand, flashing the oldest BIOS and loading the defaults immediately resulted in full performance.

It would be great if you or someone else who can reproduce the issue provided a dump of the wpcredit chipset settings with the BIOS configuration exactly the same (e.g. after loading optimal settings) - for both cases.

I did make quite a few wpcredit dumps and sent them to @VDNKh as you can see from his earlier posts in this thread. All of those were taken with the oldest BIOS though. If someone else can compare newest vs. oldest, that would be great. But I'm not (re)flashing my motherboard again.

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Athlon64 3400+ / Asus K8V-MX / 5900XT / Audigy2
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi

Reply 59 of 106, by Joseph_Joestar

User metadata
Rank l33t
Bruno128 wrote on 2022-08-21, 19:34:

Ok. Though I still wonder if it is specifically a combination of BIOS + nVidia ForceWare. Did you only confirm the issue with the GF FX's in your system? Or perhaps you can say the same about the GF4-serie, Radeons or more exotic AGP cards such as from Matrox or S3?

At the time, I was using a GeForce4 Ti4200 and 45.23 drivers. I also briefly used an ATi Radeon 9550 with Catalyst 6.2 and it exhibited the same performance issues.

I don't think I tested any other cards or driver versions with regards to this issue.

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Athlon64 3400+ / Asus K8V-MX / 5900XT / Audigy2
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi