VOGONS


Reply 20 of 27, by mkarcher

Rank: l33t
Siran wrote on 2023-07-22, 10:13:

I guess overclocking the memory won't make much of a difference except shortening the lifespan of the card.

It will also increase power usage. In fact, the increased power usage is what causes the higher temperature, which in turn is the main contributor to shortening the lifespan of the card.

Siran wrote on 2023-07-22, 10:13:

Btw here is a picture of the card - don't know the vendor sadly - next to the one it replaced (for picture quality alone it was way worth the change):

The FCC ID is allocated to Joytech Computer Co Ltd in Taiwan. In the late 80s and early 90s, Taiwan held the position in the world market that China has today: they mass-produced budget products with no lower bound on quality and price. The TVGA9000B card you show was the absolute low end at that time, with just 512KB of video RAM and no upgrade option to 1024KB. The picture quality is determined by the HMC RAMDAC chip and the filtering and stabilization of its reference voltage. The missing capacitor C11 on that card might be the cause of some of the quality issues. The Cirrus chip has an integrated RAMDAC of adequate quality. As the Cirrus chip was more expensive than the Trident chip, it was usually only used on cards of better overall quality.

Reply 21 of 27, by Siran

Rank: Newbie

Thanks again for the very detailed explanation! That TVGA 9000B was part of my very first PC as a teenager. Didn't know any better back then, and since the monitor was some 14-inch low-quality noname as well, I guess I never really realised how bad the PQ was. The VGA card was part of a budget 386DX-40 build with 4MB RAM, a 120MB Conner (that sadly died after all those years) on a 4386-VC-HD mainboard. The CL was definitely a big step up in terms of quality and speed (Doom in low detail is about 5 fps faster now just from the VGA card). I remember having a CL VLB card in the 486DX2-66 I had after this PC. I only remember that it had small stripy artifacts in certain SVGA games, I believe in the 3D battle scenes from Battle Isle III, so I wasn't that impressed by its quality back then, but now, as an ISA card, I quite like it. They almost looked like memory-overclock artifacts (not that I overclocked it), now that I think about it.

I've installed a case fan to give the old components some more breathing room and extend their lifespan, as they're already well past their originally intended one.

Reply 22 of 27, by mkarcher

Rank: l33t
Siran wrote on 2023-07-22, 16:13:

Thanks again for the very detailed explanation! That TVGA 9000B was part of my very first PC as a teenager. Didn't know any better back then, and since the monitor was some 14-inch low-quality noname as well, I guess I never really realised how bad the PQ was.

If the image quality issues are "jail bars", that is, vertical stripes of higher and lower brightness, LCDs are often more sensitive to subtle quality issues than CRT monitors. LCDs digitize the analog video signal without doing any kind of oversampling; they just take one sample per pixel. This makes them prone to aliasing effects, converting high-frequency interference into low-frequency brightness variations. On the other hand, I have already seen jailbars from some Trident cards even on a cheap 14" CRT.
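
To illustrate that aliasing effect with a toy example (all numbers here are made up; this is not a measurement of any real card): if a ripple slightly above the pixel clock rides on the video signal and the LCD takes exactly one sample per pixel, the samples trace out a slow brightness variation across the line, i.e. jail bars. A small C sketch:

/* Toy demonstration of the aliasing described above: a ripple slightly
 * above the pixel clock, sampled once per pixel, shows up as a slow
 * brightness variation ("jail bars"). All numbers are made up. */
#include <math.h>
#include <stdio.h>

#define PI 3.14159265358979

int main(void)
{
    const double pixel_clock = 25.175e6; /* standard 640x480 dot clock, Hz */
    const double ripple_freq = 25.5e6;   /* assumed interference, Hz */
    const double ripple_amp  = 0.05;     /* 5% of full scale */
    int x;

    /* The 25.5 MHz ripple aliases down to |25.5 - 25.175| MHz = 325 kHz,
     * i.e. roughly one bright/dark cycle every 77 pixels. */
    for (x = 0; x <= 150; x += 10) {
        double t = x / pixel_clock;      /* one sample per pixel */
        double level = 0.5 + ripple_amp * sin(2.0 * PI * ripple_freq * t);
        printf("pixel %3d: level %.3f\n", x, level);
    }
    return 0;
}

A CRT, by contrast, just displays the continuous waveform, so the same ripple blurs into the pixels instead of beating against a sampling clock.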

Siran wrote on 2023-07-22, 16:13:

The VGA card was part of a budget 386DX-40 build with 4MB RAM, a 120MB Conner (that sadly died after all those years) on a 4386-VC-HD mainboard.

I hope I don't spoil your childhood by mentioning that these budget computers were not that good in terms of value for money. These PCs often got awards from the press as "best 386-class machine", and no vendor stops selling a machine that just got an award, even if its technology is obsolete. The 386DX-40 was the fastest 386-class processor available, and pairing it with 64KB of cache (cheap, as mid-range 486 computers already used the next generation of chips, providing 256KB) and 80ns RAM (cheap, as 486-class computers required 70ns RAM) was good enough to deliver decent 386 performance. Nevertheless, a 486SX-25 could outperform a 386DX-40 at many tasks, and a 486-class machine with VL graphics definitely outperformed any 386-class machine with ISA graphics.

Siran wrote on 2023-07-22, 16:13:

I remember having a CL VLB card in the 486DX2-66 I had after this PC. I only remember that it had small stripy artifacts in certain SVGA games, I believe in the 3D battle scenes from Battle Isle III, so I wasn't that impressed by its quality back then, but now, as an ISA card, I quite like it. They almost looked like memory-overclock artifacts (not that I overclocked it), now that I think about it.

CL VLB graphics cards were very common and delivered acceptable performance for their low price. Most of those cards ran the memory with plenty of margin (70ns at 50MHz, 60ns at 57MHz, as described in this thread), but possibly some low-end cards used 80ns memory and an even lower memory clock, except for high-resolution SVGA modes. As the memory clock on those Cirrus chips is software programmable, it's not impossible that the BIOS used a safe MCLK for low-demand modes, but increased it for modes with high demand for memory bandwidth (like 640x480x64K).
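
To make "software programmable" concrete, here is a minimal sketch of reading and setting the MCLK on a CL-GD542x. It assumes a DOS compiler with Borland-style outportb/inportb and the register layout from the CL-GD5428 documentation (extensions unlocked by writing 0x12 to SR6, MCLK = SR1F[5:0] x 14.31818 MHz / 8); the value 0x1C is only an example, roughly 50 MHz:

/* Minimal sketch, not the actual utility from this thread: report and set
 * the CL-GD542x MCLK via sequencer register SR1F. Assumes Borland-style
 * DOS port I/O (outportb/inportb from dos.h). */
#include <dos.h>
#include <stdio.h>

#define SEQ_INDEX 0x3C4
#define SEQ_DATA  0x3C5

static unsigned char rd_seq(unsigned char idx)
{
    outportb(SEQ_INDEX, idx);
    return inportb(SEQ_DATA);
}

static void wr_seq(unsigned char idx, unsigned char val)
{
    outportb(SEQ_INDEX, idx);
    outportb(SEQ_DATA, val);
}

int main(void)
{
    unsigned char mclk;

    wr_seq(0x06, 0x12);                 /* unlock Cirrus extension registers */

    mclk = rd_seq(0x1F) & 0x3F;         /* SR1F[5:0] = MCLK select */
    printf("Current MCLK: about %.1f MHz\n", mclk * 14.31818 / 8.0);

    /* Example only: 0x1C -> about 50.1 MHz. The video BIOS may well restore
     * its own value on the next mode set, as discussed later in the thread. */
    wr_seq(0x1F, (rd_seq(0x1F) & 0xC0) | 0x1C);
    return 0;
}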

Reply 23 of 27, by Siran

Rank: Newbie
mkarcher wrote on 2023-07-22, 20:08:

If the image quality issues are "jail bars", that is, vertical stripes of higher and lower brightness, LCDs are often more sensitive to subtle quality issues than CRT monitors. LCDs digitize the analog video signal without doing any kind of oversampling; they just take one sample per pixel. This makes them prone to aliasing effects, converting high-frequency interference into low-frequency brightness variations. On the other hand, I have already seen jailbars from some Trident cards even on a cheap 14" CRT.

Exactly, the "jailbars" - I couldn't find a good word for it. I don't remember ever seeing them on the old CRT. But then again, that was about 30 years ago, so I may just not remember, or I simply didn't care back then.

mkarcher wrote on 2023-07-22, 20:08:

I hope I don't spoil your childhood by mentioning that these budget computers were not that good in terms of value for money. These PCs often got awards from the press as "best 386-class machine", and no vendor stops selling a machine that just got an award, even if its technology is obsolete. The 386DX-40 was the fastest 386-class processor available, and pairing it with 64KB of cache (cheap, as mid-range 486 computers already used the next generation of chips, providing 256KB) and 80ns RAM (cheap, as 486-class computers required 70ns RAM) was good enough to deliver decent 386 performance.

No worries, you can't spoil my childhood. We deliberately bought the PC after discussing it with someone who knew about PCs back then, since it was enough for all games at the time, especially my favorites Wing Commander 1&2. VGA cards really weren't that impactful on our decision, as most games at the time were almost exclusively dependent on CPU speed, and sometimes a game didn't play well with a CPU that was too fast. I believe I even had to slow the PC down via its "Turbo" button in order for WC1 to be playable. I have very fond memories of that system, and it mostly survived until this day, so it wasn't cheap all around. Btw, the memory it came with was already 70ns, and it had 128K of 20ns cache onboard that I upgraded a few days ago to the maximum 256K alongside an additional Alter RAM, boosting memory performance quite a bit. Plus the fact that the motherboard is upgradeable up to a 486DX2-66 makes it kind of a stand-out. Upgrading it to a 486DX2-66 is the next step I've planned, just to take my old PC to the max. I'm just waiting for a 33MHz quartz oscillator to replace the currently installed 80MHz one (the 386DX runs from a doubled clock input, while the 486DX2 takes a 33MHz clock and doubles it internally).

mkarcher wrote on 2023-07-22, 20:08:

As the memory clock on those Cirrus chips is software programmable, it's not impossible that the BIOS used a safe MCLK for low-demand modes, but increased it for modes with high demand for memory bandwidth (like 640x480x64K).

That might be just what happened with my card; it must have been around the time when UniVBE was required for so many games, as SVGA became more and more popular. I mostly upgraded to that 486 because of Privateer (I believe Origin even marketed the game with "plays best on an Intel DX2" or something) and later because of WC3, since that didn't play well in SVGA on a 486, even with VLB.

Reply 24 of 27, by Marco

Rank: Member

Hi,

I tested it with my GD5428 and could gain 2-5% in Wintach with a 10% overclock from 50 to 55 MHz.
The initial benchmark XLS shown here raises some question marks. Apart from text mode, there is no increase beyond 60 MHz on the GD5429. But isn't 60 MHz the stock clock, and therefore the results are a bit … disappointing?
Anyway, of course I love the tool.

1) VLSI SCAMP 311 | 386SX25@30 | 16MB | CL-GD5428 | CT2830| SCC-1 | MT32 | Fast-SCSI AHA 1542CF + BlueSCSI v2/15k U320
2) SIS486 | 486DX/2 66(@80) | 32MB | TGUI9440 | LAPC-I

Reply 25 of 27, by wbc

Rank: Member

That's looking neat - does the utility work on all Cirrus Logic BIOSes? I recall my GD5420 ISA also being able to tweak the MCLK (although since it uses 100ns VRAM, there is no headroom for a proper overclock), but the video BIOS resets the memory clock at each mode set - so I made a TSR which does the same thing, but sets the MCLK again after each mode set.
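
For anyone curious, here is a rough sketch of the same idea - not wbcbz7's actual code. To keep it short it does not hook INT 10h; instead it watches the BIOS current-mode byte at 0040:0049 from the INT 1Ch timer tick and re-applies the MCLK whenever the mode changes. Borland/Turbo C small model is assumed, and 0x1C (~50 MHz) is only an example value:

/* Rough sketch only - not wbcbz7's TSR. Re-applies a chosen Cirrus MCLK
 * after every mode change by watching the BIOS mode byte from the timer
 * tick. Borland/Turbo C, small memory model. */
#include <dos.h>

#define SEQ_INDEX  0x3C4
#define SEQ_DATA   0x3C5
#define MCLK_VALUE 0x1C             /* example: 0x1C * 14.31818/8 ~= 50 MHz */

static void interrupt (*old_int1c)(void);
static unsigned char last_mode = 0xFF;

static void set_mclk(unsigned char mclk)
{
    unsigned char idx, old;

    idx = inportb(SEQ_INDEX);       /* preserve the caller's index */
    outportb(SEQ_INDEX, 0x06);      /* unlock Cirrus extension registers */
    outportb(SEQ_DATA,  0x12);
    outportb(SEQ_INDEX, 0x1F);      /* SR1F[5:0] = MCLK select */
    old = inportb(SEQ_DATA);
    outportb(SEQ_DATA, (old & 0xC0) | (mclk & 0x3F));
    outportb(SEQ_INDEX, idx);       /* restore the index register */
}

static void interrupt new_int1c(void)
{
    /* BIOS data area: current video mode byte at 0040:0049 */
    unsigned char mode = *(unsigned char far *)MK_FP(0x0040, 0x0049);

    if (mode != last_mode) {        /* a mode set happened since last tick */
        last_mode = mode;
        set_mclk(MCLK_VALUE);       /* the BIOS reset MCLK; put ours back */
    }
    old_int1c();                    /* chain to the previous handler */
}

int main(void)
{
    set_mclk(MCLK_VALUE);
    old_int1c = getvect(0x1C);
    setvect(0x1C, new_int1c);
    /* stay resident, keeping everything up to the top of the stack */
    keep(0, (_SS + (_SP + 15) / 16) - _psp);
    return 0;                       /* never reached */
}

Hooking INT 10h directly, as wbcbz7 describes, reacts immediately instead of up to one timer tick later; the polling variant is just easier to show in a few lines.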

--wbcbz7

Reply 26 of 27, by The Serpent Rider

Rank: l33t++
mkarcher wrote on 2023-07-22, 15:28:

It will also increase power usage. In fact, the increased power usage is what causes the higher temperature, which in turn is the main contributor to shortening the lifespan of the card.

Since this specific overclocking does not involve a voltage increase and the speed bump is not really all that significant (percentage-wise), neither power usage nor longevity will be affected in any noticeable capacity, and it is absolutely safe to use.
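
As a rough back-of-the-envelope check on both points: dynamic power scales roughly linearly with clock at a fixed voltage (P ≈ C·V²·f), so a 50 to 55 MHz MCLK bump is on the order of 10% more dynamic power in the video RAM and memory interface. That is real, but tiny in absolute terms for a handful of DRAM chips, which is why it matters for temperature and lifespan more in principle than in practice.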

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 27 of 27, by Marco

Rank: Member

Hi, just another question: I changed my CL card to a newer CL card, and now mclk reports RAS timing = extended (slower). On the previous card it was normal (faster).
Is there any way to change that, or is it fixed in hardware?
Thank you

1) VLSI SCAMP 311 | 386SX25@30 | 16MB | CL-GD5428 | CT2830| SCC-1 | MT32 | Fast-SCSI AHA 1542CF + BlueSCSI v2/15k U320
2) SIS486 | 486DX/2 66(@80) | 32MB | TGUI9440 | LAPC-I