Reply 60 of 75, by vvbee
Yea PowerStrip, version 3.x probably.
Were Matrox in the habit of populating G cards with memory chips whose specs the card wouldn't be able to handle? Doesn't sound right.
vvbee wrote on 2024-08-08, 14:25:
Yea PowerStrip, version 3.x probably.
Were Matrox in the habit of populating G cards with memory chips whose specs the card wouldn't be able to handle? Doesn't sound right.
I asked because I use the latest version of PowerStrip, but as soon as I try to modify the clock of my G550 it goes nuts.
Try 3.2 or 3.9 or something like that.
In WPCREDIT, the core, memory and warp clock dividers for the card look to be set in bytes 54-56. I don't know where the PLL is set, but clocking the card to 160/200 in PowerStrip presumably brings the PLL up to 400 MHz, assuming the memory divider is 2. If bytes 54 and 56 are then set to 1, the core and warp come out at 133 MHz, so the clocks end up at 133/200, i.e. default on the core and the memory brought up to spec.
This configuration gives about 10% extra in Quake 3, benefiting 32-bit more than 16-bit:
So this way the core and memory aren't overclocked, but the speed goes up. What I don't know is whether raising the PLL affects other components. I know some G550s come with 5.5 ns memory chips instead of 5 ns, so maybe a 400 MHz PLL is pushing some limits anyway.
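If the dividers work the way they look in WPCREDIT, the arithmetic is just one shared PLL divided down per domain. A minimal sketch of that math in Python, assuming a 400 MHz PLL and that byte value 1 maps to a divide-by-3 (the value-to-divider encoding is my guess from the 133 MHz reading, not anything documented):

```python
# Clock arithmetic as I understand it from WPCREDIT: one shared PLL, three
# integer dividers (core, memory, warp). The offsets (bytes 54-56) and the
# byte-value-to-divider encoding are assumptions, not Matrox documentation.

PLL_MHZ = 400.0  # assumed PLL after setting 160/200 in PowerStrip

def domain_clock(pll_mhz: float, divider: int) -> float:
    """Clock of one domain fed by the shared PLL."""
    return pll_mhz / divider

# Data points from the experiment above:
print(domain_clock(PLL_MHZ, 2))  # memory divider 2 -> 200.0 MHz
print(domain_clock(PLL_MHZ, 3))  # core/warp with byte value 1 -> ~133.3 MHz,
                                 # suggesting register value = divider - 2
```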
vvbee wrote on 2024-08-08, 15:33:
Try 3.2 or 3.9 or something like that. […]
I tried v3.90, same issues. I wonder if the problem is my card. Actually, it works fine until I use PowerStrip to change even the driver options. And yes, I have the 5.5 ns version, damn...
vvbee wrote on 2024-08-08, 15:33:
So this way the core and memory aren't overclocked, but the speed goes up.
What do you mean, isn't the memory default 166 MHz?
Performance of the G400 and G550 in 3DMark2000's game tests (medium detail, 800 x 600 16-bit) with various driver versions:
So the Direct3D drivers had already matured by 5.52 (Feb 2000) for the G400, while the G550 reached more or less peak performance by 6.10 (Oct 2000), which predates the G550's release by a year.
The 6.01 drivers (Jun 2000) are the first G400 drivers I know of that work with the G550, but they feel fairly broken on it: Windows acceleration is slow and game performance is low relative to the later drivers, at least on an Athlon 64. But on a K6-2 300, gaming performance is potentially better than with the later drivers:
The G400 drivers probably benefited from similarities with the G200, but Matrox underestimated the OpenGL ICD challenge.
There's an interesting interview with Matrox in a Maximum PC issue that covers a lot of their G200/G400 efforts, though it reflects the point of view of August/September 1999, and the driver comments are especially optimistic about their OpenGL progress at that point.
https://archive.org/details/maximum-pc-the-ne … ge/n49/mode/2up
I bought a Mystique G200 in late 1998 and then got a Millennium G400 when it was released, and ran it until I got a Radeon LE in 2001. At one point I bought a GeForce DDR, but it seemed to stutter more than the G400 so I returned it. Of course the G400 had a better image too.
The guy sounds realistic about when to call it quits with a product. But he was also right about Matrox not needing the gaming market.
The G400 drivers 6.01 (Jun 2000) and 6.10 (Oct 2000) compared to the G550-supporting 6.83 (May 2002) in 3DMark2000 on the G550:
Since the G550's multitexturing speedup is there with 6.01, and assuming drivers have to explicitly support multiple TMUs for them to work that way, Matrox would've been planning to roll out a multi-TMU G400 derivative around that time.
Not sure about the lack of triangle-pushing speed on the G550 with the 6.01 drivers, as those drivers otherwise perform fine in these numbers. Based on some OpenGL results with other driver versions, it took until 6.10 to fix this. I'm still slightly of the opinion that hardware T&L has something to do with it. I can't find any conclusive indication in benchmark numbers that the present-day G550 does T&L, but those early drivers report its device ID as the G800, and it's possible T&L was experimentally enabled in that hardware at the time.
It also seems possible that the performance improvement starting with the 6.10 drivers wasn't due to optimizations but to Matrox giving up on the G800 effort and reverting its driver support to the G400 path to get another office card out.
It is a mystery. The G550 could be the G800 chip, pushed to market in an unfinished state.
The Parhelia seems similar, with its partial DirectX 9 support, AGP 4x limitation, problems on the secondary output and low clock speed. It stayed DirectX 8.1, but most of the bugs were fixed and the clock increased with a respin.
I've been able to overclock my G550 using the pins, but unluckily the highest clock I was able to reach was 140/176.
I don't know if my G550 is just unlucky or if it's because of the 5.5 ns memory. I suspect the limit is the core clock, because even 2 MHz more and the card shows no signal right after being turned on.
I'm calling the G550 at 160/200 the G550 MAX: https://www.youtube.com/watch?v=KsGXWic94hw. Maybe the cards with 5 ns chips are built better, who knows, but I don't see why they would be, unless it's an early overengineered revision or something like that. I think this one started freezing in-game at 175/220 or something like that, but I don't want to be pushing old hardware anyway.
If I were you I'd test a different version of PowerStrip and whether underclocking has the same effect.
vvbee wrote on 2024-08-16, 01:03:
I'm calling the G550 at 160/200 the G550 MAX: https://www.youtube.com/watch?v=KsGXWic94hw. […]
If I were you I'd test a different version of PowerStrip and whether underclocking has the same effect.
It would be interesting to test it against a G400 MAX; the G550 uses DDR SDRAM while the G400 MAX uses SGRAM.
I would be interested to test a G550 with 5 ns memory, but they aren't cheap from what I've found on eBay.
Edit: I tried to underclock with PowerStrip, same issue: vertical bands, then screen corruption, and after a few seconds the system hangs completely.
G550 is using DDR SGRAM.
The Serpent Rider wrote on 2024-08-16, 02:00:
G550 is using DDR SGRAM.
You're right, my bad... so it's SGRAM vs DDR SGRAM.
The G550 PCIe (Windows XP) vs some AGP cards (Windows 98), in percentages relative to the G400 (Windows 98) in 3DMark 2000:
This PCIe version is clocked at 100/150, about a 20% reduction from the AGP version. Performance is about half, with polygon throughput looking the most affected. Comparable to a Vanta-16.
The clock registers look to be the same as on the AGP version, but the core is already using the maximum divider. Artifacting starts appearing around 133/180, unlike the AGP version, which did 166/200 without issue. I'm guessing it's the core rather than the memory.
Using driver 5.96. Less game-compatible than the Windows 9x driver. Don't know how well the other G series cards run in XP.
Testing different XP drivers for the G550 PCIe. Driver 5.96 supports the PCIe version natively; earlier drivers can be made to work by adding the device ID to the .inf file.
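For anyone trying that, a rough sketch of the idea in Python: clone the model line that matches the AGP card's hardware ID and point the copy at the PCIe ID. The filename and both IDs below are placeholders for illustration (102B is Matrox's PCI vendor ID, but verify the actual device IDs in Device Manager before editing anything):

```python
# Hedged sketch of the .inf tweak described above: duplicate the AGP G550's
# model lines under the PCIe card's hardware ID so an older driver will bind
# to it. Filename and device IDs are assumptions, not verified values.

from pathlib import Path

inf_path = Path("g550.inf")                 # hypothetical driver INF filename
agp_id  = r"PCI\VEN_102B&DEV_2527"          # G550 AGP hardware ID (assumed)
pcie_id = r"PCI\VEN_102B&DEV_2538"          # G550 PCIe hardware ID (assumed)

out_lines = []
for line in inf_path.read_text().splitlines():
    out_lines.append(line)
    if agp_id in line:
        # Clone the model entry with the PCIe hardware ID swapped in.
        out_lines.append(line.replace(agp_id, pcie_id))

Path("g550_patched.inf").write_text("\n".join(out_lines) + "\n")
```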
Games tested: Need for Speed 3, Homeworld, Formula 1, Darkstone, Thief 2 and F1 2000. Driver 5.96 has missing textures in Need for Speed 3 and Homeworld, and crashes in Thief 2. Driver 5.93 crashes in Thief 2. Driver 5.91 works in all of them and has the fewest small glitches.
Something changed between 5.91 and 5.93: in 3DMark 2000 the polygon throughput doubled and the game tests got about 5% faster, but compatibility in actual games got worse. With version 6.00 Matrox seems to have removed 3D acceleration entirely.