VOGONS


Reply 20 of 38, by MattRocks

User metadata
Rank Newbie
cyclone3d wrote on 2025-11-24, 04:30:

That crackling of the startup sound was a very, very common thing for years on all sorts of systems and sound chipsets.

It was because Windows started playing the startup sound while the PCI bus was being thrashed by the drive controller and probably other stuff.

I appreciate that various theories can be put forward, with many variables - but my witness statement narrows the variables significantly, because I clearly remember that my crackling started the day I swapped my ESS ISA sound card for the PCI EMU10K1 card, and everything else in the system stayed constant.

I was using a VIA-based motherboard, AMD K6-2, 512k cache, AGP 3Dfx Banshee, 56k Rockwell ISA modem, Realtek PCI NIC, Maxtor IDE HDDs, and Windows 98 - but none of that changed.

We can speculate that the exact cause of my crackling was PCI DMA activity on a misbehaving VIA chipset, or some other pathway that the ESS ISA card didn't exercise. But we cannot blame IDE bus activity, because that would not have changed between the ISA and PCI sound cards.

I also remember the day my VIA-based motherboard failed. I don't remember the day when the SBLive stopped crackling - I am not saying it did, and not saying it didn't, as I genuinely don't remember.

Last edited by MattRocks on 2025-11-24, 23:06. Edited 4 times in total.

Reply 21 of 38, by cyclone3d

User metadata
Rank l33t++

The IDE on that board would have been PCI.

ISA was a separate bus.

If/when I get the time, I can test a ton of stuff. I don't think I have that exact motherboard, but I do have a bunch of different SS7 boards, including ones based on the VIA MVP3 chipset.

Also pretty sure I have every model of the SB-Live!

Yamaha modified setupds and drivers
Yamaha XG repository
YMF7x4 Guide
Aopen AW744L II SB-LINK

Reply 23 of 38, by chinny22

User metadata
Rank l33t++
MattRocks wrote on 2025-11-21, 13:32:

Years later, I still run into people who insist the Sound Blaster Live! was revolutionary.

“Great card!” they say. “One of the best of its era!”

I'd have to agree with those people.
At the time, hardware acceleration, especially at the consumer level, was pretty revolutionary.
Aureal and their A3D API were really the only other contender.

Which other consumer soundcards in the late 90's could really compete?

BUT
I fully agree they are a confusing, buggy mess that I'd prefer to avoid - in this day and age I'd just get an Audigy.

Reply 24 of 38, by MattRocks

User metadata
Rank Newbie
chinny22 wrote on Yesterday, 05:11:
I'd have to agree with those people. At the time, hardware acceleration, especially at the consumer level, was pretty revolutionary. Aurea […]
Show full quote
MattRocks wrote on 2025-11-21, 13:32:

Years later, I still run into people who insist the Sound Blaster Live! was revolutionary.

“Great card!” they say. “One of the best of its era!”

I'd have to agree with those people.
At the time, hardware acceleration, especially at the consumer level, was pretty revolutionary.
Aureal and their A3D API were really the only other contender.

Which other consumer soundcards in the late 90's could really compete?

BUT
I fully agree they are a confusing, buggy mess that I'd prefer to avoid - in this day and age I'd just get an Audigy.

Since the early 1990s, S3 had been selling DVD decoders with native 48kHz audio. In 1996, S3 launched a sound card DSP that converted all inputs to 48kHz internally. It had onboard SRAM to apply fixed-function environmental effects, SRS widening, and a DAC built into the DSP (everything on one chip). The only part that S3 patented was the multiplexer that converted 48kHz to whatever the DAC was outputting. S3 humiliated the EMU8008 (the chip on Creative's first PCI sound card, the AWE64D). In 1998, the EMU10K1 was the response: it did almost exactly the same things, but without S3's patented part and without a built-in DAC.

The EMU10K1 added programmable environmental effects (like the leap from GeForce 2 to GeForce 3), so it was the next generation after S3. My point is that there were other hardware accelerators before the EMU10K1, and Creative Labs did not start the trend. We agree the SBLive was a buggy mess. I think it was a buggy mess because it was hurried by outside pressures; without it, Creative would have been obsolete.

Reply 25 of 38, by dionb

User metadata
Rank l33t++
MattRocks wrote on 2025-11-23, 00:38:
Joseph_Joestar wrote on 2025-11-21, 20:01:

Similarly to your experience, I have heard plenty of both on a Vortex 2, especially in games like Thief: The Dark Project. Some of it can be resolved by using older drivers though.

That is interesting, because in graphics we are accustomed to newer drivers providing bug fixes, but in audio it seems more complicated.

Slightly offtopic, but that's not true either. In particular, nVidia used the same Detonator driver family from the early TNT era deep into the GeForce family. You could still run those old cards on much newer drivers, but the overhead was huge, leading to significantly lower performance on older hardware. In general, for best performance it's recommended to take the oldest stable drivers that support a given GPU, with 'stable' defined as having the worst bugs removed.

Regarding the SBLive, I share your opinion of the overhyping of the cards and Creative's very cynical marketing and compatibility choices. But the one thing those cards were good at was high SNR audio output, so I think something is wrong in your setup if that's failing.

Reply 26 of 38, by MattRocks

User metadata
Rank Newbie
dionb wrote on Yesterday, 09:40:

the one thing those cards were good at was high SNR audio output, so I think something is wrong in your setup if that's failing.

You have to be kidding me. Professionals need high SNR and professionals do not use Creative!

Creative sound cards are for home entertainment, with varying (low to medium) SNR. Sometimes Creative teamed up with a professional audio company, such as Burr-Brown, and those "tuned" boards came with a lot of marketing hype and slightly less bad SNR.

Professionals used M-Audio or Digigram or something else with actual high SNR. Audiophiles would consider something in the middle, like Turtle Beach or TerraTec. Only gamers considered Creative, and from a field of generally poor-SNR competition.

I have a Creative Sound Blaster X-Fi Titanium HD, and that is probably the highest-SNR card Creative ever released - it's not professional, and it came years later so it's not a fair comparison, but for SNR it's better than other Creative cards. And if you check actual magazine reviews of the first generation of PCI sound cards - when Ensoniq was a competitor - Creative were in the doldrums! That era started ~2 years before the first SB Live.

That's it, I accidentally stumbled on what the D stood for.. AWE64Doldrum 😉

Reply 27 of 38, by MattRocks

User metadata
Rank Newbie
dionb wrote on Yesterday, 09:40:

Slightly offtopic, but that's not true either. In particular, nVidia used the same Detonator driver family from the early TNT era deep into the GeForce family. You could still run those old cards on much newer drivers, but the overhead was huge...

Apples and oranges. Those Detonator driver jumps were huge because they were adding new features (e.g. DirectX 8 compatibility for a DirectX 7 GPU), and doing that pushed a lot of work to the CPU. That isn't the same as fixing bugs in the originally shipped drivers.

But if we take a step back, your point is a bit like saying you shouldn't put a high-end GPU in a low-end system, because you will thrash the system with extra messages it was not designed to accommodate. That lens is relevant.

Last edited by MattRocks on 2025-11-25, 10:27. Edited 1 time in total.

Reply 28 of 38, by Joseph_Joestar

User metadata
Rank l33t++
MattRocks wrote on Yesterday, 10:21:

Apples and oranges. Those Detonator driver jumps were huge because they were adding new features (e.g. DirectX 8 compatibility for a DirectX 7 GPU), and doing that pushed a lot of work to the CPU. That isn't the same as fixing bugs in the originally shipped drivers.

Watch this video by Phil. You'll see performance decline on older cards when using newer Nvidia drivers, even while benchmarking DX6 games and such.

Personally, I like using (retro) Nvidia drivers that are about 6-12 months newer than the card itself. That gave the developers enough time to squash any prominent bugs, while still keeping performance optimized for the current GPU architecture.

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Core 2 Duo E8600 / Foxconn P35AX-S / X800 / Audigy2 ZS
PC#4: i5-3570K / MSI Z77A-G43 / GTX 980Ti / X-Fi Titanium

Reply 29 of 38, by MattRocks

User metadata
Rank Newbie
Joseph_Joestar wrote on Yesterday, 10:27:

Watch this video by Phil. You'll see performance decline on older cards when using newer Nvidia drivers, even while benchmarking DX6 games and such.

Personally, I like using (retro) Nvidia drivers that are about 6-12 months newer than the card itself. That gave the developers enough time to squash any prominent bugs, while still keeping performance optimized for the current GPU architecture.

I don't need to watch the video - partly because I have watched some of Phil's other videos and he overlooks a lot of engineering nuance.

Those newer drivers support DX7, or DX8, or maybe even DX9 (I didn't watch the video), and that will slow down a DX6 card, because the software drivers keep filtering/translating between DX6/7/8/9 (lots of extra CPU overhead) before dispatching any command to the DX6 card.

And that switching logic is optimised for the new GPU being tested by reviewers - not for the old, discontinued GPUs. And if there is a bug in that switching, the drivers might never dispatch the native command to the old GPU!
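Purely to sketch what I mean (this is illustrative C, not Detonator's actual internals - those are not public): a unified front-end that has to check and translate every call does per-call work that a single-API driver never did.

```c
/* Illustrative only - not Detonator's real internals. */
#include <stdio.h>

enum api_level { DX6 = 6, DX7 = 7, DX8 = 8 };

struct draw_cmd {
    enum api_level issued_as;   /* API level the game used */
    int opcode;
};

/* all a single-API driver had to do: hand the command to the card */
static void dispatch_to_gpu(int opcode)
{
    printf("gpu <- opcode 0x%x\n", opcode);
}

/* unified front-end: every command pays for version checks and,
 * when needed, translation before it reaches the card */
static void unified_dispatch(struct draw_cmd c, enum api_level hw)
{
    if (c.issued_as > hw) {
        /* stand-in for real DX8->DX6 semantic translation, which in a
         * real driver also means extra CPU-side state tracking */
        c.opcode &= 0xff;
    }
    dispatch_to_gpu(c.opcode);
}

int main(void)
{
    struct draw_cmd c = { DX8, 0x1234 };
    unified_dispatch(c, DX6);   /* a DX8-era runtime driving a DX6 card */
    return 0;
}
```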

There is a huge amount of stuff I don't know, but some things I actually do know 😉

Last edited by MattRocks on 2025-11-25, 10:54. Edited 1 time in total.

Reply 30 of 38, by Joseph_Joestar

User metadata
Rank l33t++
MattRocks wrote on Yesterday, 10:31:

I don't need to watch the video - partly because I have watched some of Phil's other videos and he overlooks a lot of engineering nuance.

Those newer drivers support DX7, or DX8, or maybe even DX9 (I didn't watch the video), and that will slow down a DX6 card, because the software drivers keep filtering/translating between DX6/7/8/9 (lots of extra CPU overhead) before dispatching any command to the DX6 card.

You should have watched the video. Performance decreases even on a GeForce 3 (DX8 card) when using drivers which were released only a year apart, and which support the same DirectX version.

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Core 2 Duo E8600 / Foxconn P35AX-S / X800 / Audigy2 ZS
PC#4: i5-3570K / MSI Z77A-G43 / GTX 980Ti / X-Fi Titanium

Reply 31 of 38, by MattRocks

User metadata
Rank Newbie
Joseph_Joestar wrote on Yesterday, 10:41:

You should have watched the video. Performance decreases even on a GeForce 3 (DX8 card) when using drivers which were released only a year apart, and which support the same DirectX version.

Ok - deal. I'll look at the video before commenting on GeForce cards. But for the points above, it depends very strongly on which driver versions are in that video - you didn't rule out drivers adding support for newer GPUs (e.g. DX9 support).

Whoa.. within minutes Phil jumps into results without explaining the host system. Which CPU? What caching does that CPU have access to? Where are the drivers being cached? Without that detail the whole test is.. not science! This is what I meant by not needing Phil's videos - Phil does not report key facts, hence the results are misleading.

So I checked some split-second frames..

Phil is (apparently) using an Intel board, so we know immediately that Phil's findings cannot be transposed to Super Socket 7 systems, where drivers have to work within a Level 2 cache bottleneck.
From the ISA slots, the IDE, the slot CPU.. and no SATA, we can deduce this must be an LX or BX chipset. The CPU is not on a riser board, and the Pentium II had a factory-fitted black cooler - so I'm guessing Phil is actually using a Pentium III on a BX chipset.

That test bed is strictly MMX/SSE, so any Detonator software drivers optimised for 3DNow!/SSE2/SSE3/SSE4/AMD64 etc. could worsen performance. Could Phil actually be testing Detonator code paths against his CPU's instruction set?
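For illustration, this is the standard mechanism a driver can use to pick code paths per instruction set - CPUID feature detection. I'm not claiming Detonator does exactly this; it's just the textbook approach (GCC/Clang on x86):

```c
/* CPUID feature detection - the textbook way a driver picks per-ISA
 * code paths. What Detonator actually did internally is not public.
 * Requires GCC/Clang on x86 for <cpuid.h>. */
#include <cpuid.h>
#include <stdio.h>

int main(void)
{
    unsigned eax, ebx, ecx, edx;

    if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx))
        return 1;
    int has_sse  = (edx >> 25) & 1;
    int has_sse2 = (edx >> 26) & 1;

    int has_3dnow = 0;
    if (__get_cpuid(0x80000001, &eax, &ebx, &ecx, &edx))
        has_3dnow = (int)((edx >> 31) & 1);   /* AMD extended feature bit */

    /* a driver would branch here to install the matching code path;
     * a Pentium III reports SSE but not SSE2 or 3DNow! */
    printf("SSE=%d SSE2=%d 3DNow=%d\n", has_sse, has_sse2, has_3dnow);
    return 0;
}
```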

Now I'll watch and find out..

Last edited by MattRocks on 2025-11-25, 11:23. Edited 1 time in total.

Reply 32 of 38, by Joseph_Joestar

User metadata
Rank l33t++
MattRocks wrote on Yesterday, 10:55:

Ok - deal. I'll look at the video before commenting on GeForce cards. But for the points above, it depends very strongly on which driver versions are in that video - you didn't rule out drivers adding DX9 support.

If you want a specific example, the 8.05 driver (July of 2001) often outperforms the 30.82 driver (July of 2002) when using a GeForce 3. Both driver versions came out during the DX8 era, and performance is still (mostly) worse on the newer one.

MattRocks wrote on Yesterday, 10:55:

Whoa.. first minute Phil jumps straight into results without explaining the host system. Which CPU? What caching does that CPU have access to? Where are the drivers being cached? Without that detail the whole test is.. junk.

He mentioned a Pentium 3 running at 1 GHz at the start of the video, but not much else. I agree that the lack of full system specs is odd though.

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Core 2 Duo E8600 / Foxconn P35AX-S / X800 / Audigy2 ZS
PC#4: i5-3570K / MSI Z77A-G43 / GTX 980Ti / X-Fi Titanium

Reply 33 of 38, by MattRocks

User metadata
Rank Newbie
Joseph_Joestar wrote on Yesterday, 11:09:

He mentioned a Pentium 3 running at 1 GHz at the start of the video, but not much else. I agree that the lack of full system specs is odd though.

I didn't hear him say that, and I worked out the same from the 1-second snapshot of Phil's rig (I edited my post above).

The one good thing about Phil is that he doesn't make stuff up, so we can challenge his findings. The GeForce256 takes a hit with Detonator 21.81 - that is, DX8 optimisations that the GeForce256 does not support. That amounts to loading software fixes for GeForce3 cards onto a system where no GeForce3 is present. The CPU/GPU still need to figure out between them whether those fixes should be applied, so the logical expectation is that the CPU will be running extra code paths. Comparing later GPUs on Detonator 81.98, it looks to me like the Pentium III is the bottleneck for those extra code paths.

But with sound cards (which is what this thread is about), the variations are not within a single chip. The same EMU10K1 is used on all SBLive cards, but those SBLive cards are all different. The changes are in the other silicon surrounding the EMU10K1, such as the DACs and capacitors. The sound card drivers cannot query the capacitors: "are you a 5V or a 10V?" So the software drivers for a sound card are either running blind or querying a blanket board identifier - and that is what I meant when I suggested sound cards are dealing with a more complicated problem space.
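To illustrate what "blanket board identifier" means in practice: the PCI subsystem IDs are about the only board-level handle the drivers get. A minimal sketch on modern Linux (the sysfs path and bus address are just examples; Win9x drivers read the same IDs from PCI config space):

```c
/* The "blanket board identifier" in practice: PCI subsystem IDs are the
 * board-level handle a driver gets - nothing identifies the analog parts
 * (DACs, capacitors) around the EMU10K1. This reads sysfs on modern
 * Linux; Win9x drivers read the same IDs out of PCI config space. */
#include <stdio.h>

int main(void)
{
    /* hypothetical bus address - the actual device path varies per machine */
    const char *path = "/sys/bus/pci/devices/0000:01:05.0/subsystem_device";
    FILE *f = fopen(path, "r");
    if (!f) { perror("fopen"); return 1; }

    unsigned id = 0;
    if (fscanf(f, "%x", &id) == 1)
        printf("subsystem device id: 0x%04x\n", id); /* one ID per board model */
    fclose(f);
    return 0;
}
```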

Reply 34 of 38, by The Serpent Rider

User metadata
Rank l33t++

As much as I think the Live! was a pretty meh series and the Audigy is much better, I can't deny that VIA had a shitty PCI implementation, which is really bad on some specific boards, depending on resource allocation and/or BIOS bugs. It never got better until VIA introduced a south bridge that decoupled the PCI bus from other stuff (IDE, USB, LAN, etc.), so the first DDR chipsets, basically.

Whoa.. within minutes Phil jumps into results without explaining the host system. Which CPU?

That's an issue with a lot of Phil's old videos, and it makes cross-comparisons a nightmare.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 35 of 38, by MattRocks

User metadata
Rank Newbie
The Serpent Rider wrote on Yesterday, 12:38:

That's an issue with a lot of Phil's old videos, and it makes cross-comparisons a nightmare.

It was actually one of Phil's sound card videos on YouTube that indirectly led me to start this thread. Phil reviewed some old ES-based Creative cards, but missed the single most important differentiator on the very cards being reviewed: the crystal oscillator!

In the real world there were two main sample rates to target:

  • 44.1kHz: CD audio, Quake III Arena, etc.
  • 48kHz: DVD audio, Unreal Tournament, etc.

The ES chipsets (all of them) only support one sample rate, and that rate is set by the crystal oscillator. It might be 44.1kHz or it might be 48kHz, and if it was the "wrong" rate then the ES software drivers had the CPU resample the stream.

So the ES chipsets were brilliant and sucky at the same time. If you had a rate match, the system->ES communication was a simple FIFO (like a 3Dfx card), after which the ES chip performed mixing in hardware and piped the results to your speakers with zero CPU overhead and zero aliasing artefacts. If you had a rate mismatch, the software drivers pushed resampling through the CPU before passing the stream down the FIFO pipe - and that cost frame rates on a stressed gaming platform.
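To make the "wrong rate" cost concrete, here's a minimal sketch of CPU-side resampling using linear interpolation - the real ES drivers' filter is unknown to me, but the shape of the work is the same: one interpolation per output sample, on a CPU that is also running the game.

```c
/* Minimal sketch of CPU-side sample-rate conversion, 44.1 kHz -> 48 kHz,
 * using linear interpolation. The real ES drivers' filter is unknown;
 * this just shows the per-sample work the matched-rate FIFO path avoids. */
#include <stdint.h>
#include <stdio.h>
#include <stddef.h>

/* 'out' must hold at least in_len * 48000 / 44100 + 1 samples */
static size_t resample_44k1_to_48k(const int16_t *in, size_t in_len, int16_t *out)
{
    const double step = 44100.0 / 48000.0;    /* input samples per output sample */
    size_t out_len = (size_t)(in_len / step);

    for (size_t i = 0; i < out_len; i++) {
        double pos = i * step;                /* fractional read position */
        size_t j = (size_t)pos;
        double frac = pos - (double)j;
        int16_t a = in[j];
        int16_t b = (j + 1 < in_len) ? in[j + 1] : a;
        /* one multiply-add per output sample - trivial today, not on a
         * late-90s CPU that is also running the game engine */
        out[i] = (int16_t)(a + frac * (b - a));
    }
    return out_len;
}

int main(void)
{
    static int16_t in[44100], out[48001];    /* one second of audio */
    size_t n = resample_44k1_to_48k(in, 44100, out);
    printf("%zu output samples\n", n);       /* prints 48000 */
    return 0;
}
```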

Phil ran a whole review that completely ignored the things that actually mattered: the crystal oscillator and the audio stream's sample rate!

The alternatives were: one chip that resampled everything to 48kHz (the low-cost approach), one chip that supported multiple crystals (44.1kHz and 48kHz, at higher cost), or a system with multiple sound chips and hence multiple crystals (e.g. ES1370 at 44.1kHz + ES1371 at 48kHz - but who wants to reboot and swap cards?). Remembering this real-world problem led me to remembering the popping and crackling.

Reply 36 of 38, by MattRocks

User metadata
Rank Newbie
chinny22 wrote on Yesterday, 05:11:

At the time, hardware acceleration, especially at the consumer level, was pretty revolutionary.
Aureal and their A3D API were really the only other contender.

The actual contenders were:

1996 QSound 3D - accelerated by Trident 4DWave
1996 Microsoft DirectSound 3D - version 1.0 accelerated by S3 Sonic Wave
1997 Aureal A3D - accelerated by Aureal Vortex
1998 Sensaura - accelerated by Yamaha, ESS, NVidia, and others
1998 EAX - an extra layer accelerated by Creative Labs' EMU
2000 IC Ensemble - a mysterious dark horse that needs a mention

Only A3D and Sensaura delivered true positional sound. QSound and EAX are just ambient sound. Microsoft DirectSound3D is a big-tent wrapper with lots of versions (v1.0 is ambient sound). EAX makes it very confusing, because EAX is ambient sound on top of other sounds, which could in theory be 3D sounds - but you cannot put one consumer hardware sound card on top of another consumer hardware sound card!

It's the impossibility that confuses people. In theory the pathway could work for 3D EAX, but in reality the pathway doesn't exist. So what happened to the actual 3D sound (A3D and Sensaura)? Creative bought them and buried both technologies. Thus, Creative did not deliver 3D positional sound - Creative actually erased 3D positional sound while promising and owning 3D sound!
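To make the ambient-vs-positional distinction concrete: an ambient effect (EAX-style reverb) is direction-blind, while a positional renderer weights the output by source direction. The sketch below uses constant-power stereo panning, the simplest positional cue - real A3D/Sensaura used HRTFs, which this does not attempt.

```c
/* Ambient vs positional in miniature: constant-power stereo panning is
 * the simplest positional cue. Real A3D/Sensaura used HRTFs, which this
 * does not attempt; an EAX-style reverb would ignore direction entirely. */
#include <math.h>
#include <stdio.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

/* azimuth: -pi/2 = hard left, 0 = centre, +pi/2 = hard right */
static void positional_gains(double azimuth, double *gl, double *gr)
{
    double a = (azimuth + M_PI / 2.0) / 2.0;  /* map to 0..pi/2 */
    *gl = cos(a);                             /* gl^2 + gr^2 == 1 */
    *gr = sin(a);
}

int main(void)
{
    double gl, gr;
    positional_gains(-M_PI / 2.0, &gl, &gr);  /* source hard left */
    printf("L=%.2f R=%.2f\n", gl, gr);        /* L=1.00 R=0.00 */
    positional_gains(0.0, &gl, &gr);          /* source centred */
    printf("L=%.2f R=%.2f\n", gl, gr);        /* L=0.71 R=0.71 */
    return 0;
}
```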

Creative Labs, in my opinion, played us consumers like a fiddle! In my opinion, Creative even sued Aureal for marketing effect - not because Aureal did anything technically wrong. In fact, in my opinion, Creative behaved a bit like a narcissist: they lost, they were exposed, and they went brutally aggressive! Are opinions safe things to have?

And that's not all. On the previous page I mentioned how, in my opinion, the Creative Labs CSW 4.1 speaker kit muffles mid frequencies and breaks 3D sound in DVD movies. Maybe muffling the mid frequencies would muffle A3D/Sensaura also, but I don't know because I did not test that particular combination.

The dark horse? That is the only sound processor that actually accelerates anything consistently from Win9x through Win11. Microsoft killed DSP audio acceleration when it dropped DS3D from Vista, and Creative quit after the EMU20K2, so today all audio is done in software. There is one exception. Long before the anti-sound movements, a company called IC Ensemble (acquired by VIA) decided to sidestep the whole problem of resampling bitstreams by making a hardware audio processor that switches between multiple crystal oscillators - and that hardware switching still works under Win11!

Last edited by MattRocks on 2025-11-25, 23:26. Edited 1 time in total.

Reply 37 of 38, by MadMac_5

User metadata
Rank Member
MattRocks wrote on Yesterday, 14:44:
The actual contenders were: […]
Show full quote
chinny22 wrote on Yesterday, 05:11:

At the time, hardware acceleration, especially at the consumer level, was pretty revolutionary.
Aureal and their A3D API were really the only other contender.

The actual contenders were:

1996 QSound 3D - accelerated by Trident 4DWave
1996 Microsoft DirectSound 3D - version 1.0 accelerated by S3 Sonic Wave
1997 Aureal A3D - accelerated by Aureal Vortex
1998 Sensaura - accelerated by Yamaha, ESS, NVidia, and others
1998 EAX - an extra layer accelerated by Creative Labs' EMU

Only A3D and Sensaura delivered true positional sound. QSound and EAX are just ambient sound. Microsoft DirectSound3D is a big-tent wrapper with lots of versions (v1.0 is ambient sound). EAX makes it very confusing, because EAX is ambient sound on top of other sounds, which could in theory be 3D sounds - but you cannot put one consumer hardware sound card on top of another consumer hardware sound card!

It's the impossibility that confuses people. In theory the pathway could work for 3D EAX, but in reality the pathway doesn't exist. So what happened to the actual 3D sound (A3D and Sensaura)? Creative bought them and buried both technologies. Thus, Creative did not deliver 3D positional sound - Creative actually erased 3D positional sound while promising and owning 3D sound!

Creative Labs, in my opinion, played us consumers like a fiddle! In my opinion, Creative even sued Aureal for marketing effect - not because Aureal did anything technically wrong. In fact, in my opinion, Creative behaved a bit like a narcissist: they lost, they were exposed, and they went brutally aggressive! Are opinions safe things to have?

And that's not all. On the previous page I mentioned how, in my opinion, the Creative Labs CSW 4.1 speaker kit muffles mid frequencies and breaks 3D sound in DVD movies. Maybe muffling the mid frequencies would muffle A3D/Sensaura also, but I don't know because I did not test that particular combination.

Everything you said lines up with my experience/knowledge; EAX 1.0 was simply reverb, based on a rough approximation of what the developer thought the effect should be for where the player was in the world. I will say that it did have 4-speaker positional sound (Need For Speed III, Descent: Freespace, and Half-Life were especially impressive to me at the time), though it was next to useless at positional audio with 2 speakers or headphones, which its competition could do. EAX 2.0 added occlusion effects that were pretty convincing in the right games; I know the EAX 2.0 patch for Unreal Tournament makes the positional audio sound like it's in the same league as my Vortex 2 with A3D in the same game with headphones, although it needs special versions of the maps to work properly. If I want to use other mods that tie into the Unreal Tournament audio stack, or other maps that didn't come with the game, those effects just aren't there and we're left with 4-speaker positional audio and a bunch of reverb effects.

Creative was pretty sketchy as a company (can't beat your competition? Sue them until they go bankrupt, then buy the remains!), but at the time more of the games I played supported EAX than A3D, so I bought an SB Live! Value in 1999. I ran it with a 440BX chipset and then a VIA KT266A, and it was surprisingly stable under Windows 98 and later XP. So while the Live! wasn't as good as Creative hyped it up to be (48 kHz resampling for everything! What's this distortion you speak of?), it wasn't as bad as some say.

Reply 38 of 38, by leileilol

User metadata
Rank l33t++
MattRocks wrote on Yesterday, 13:34:
In the real world there were two main sample rates to target: […]
Show full quote

In the real world there were two main sample rates to target:

  • 44.1kHz: CD audio, Quake III Arena, etc.
  • 48kHz: DVD audio, Unreal Tournament, etc.

Both of those games mixed at 22 kHz. It was the Windows Sound System and AC'97 pushing for 48 kHz, and the ES1371 was focused on AC'97 compliance.

long live PCem