First post, by Scali
As you may know, early Sound Blasters did not support the so-called 'Auto-Initialize' mode of playing samples via DMA.
They only supported what Creative called 'Single-Cycle' transfers.
This is basically the DSP transferring N bytes via DMA and playing them on the DAC, signaling an interrupt when the transfer is complete.
The problem here is that if you want to play samples that are longer than a single DMA buffer (max 64k), you'll have to restart the DSP immediately when the interrupt fires.
If you cannot do this fast enough (because you get the interrupt too late, or it takes too long to reprogram the DSP for the next transfer), this results in a tiny 'glitch' in the output.
Now, this has been the subject of various workarounds, myths and bugs in various software and hardware (think later SBs and clones), so I'd like to investigate exactly what is going on.
I've created a simple program to test various replay methods: https://www.dropbox.com/s/a38bc9ap8w5fqlv/SBDMA.zip?dl=0
I'd like you all to try it, and tell me which methods work, and which methods do not (or don't work well, e.g. clicks and pops in the output).
I have tested these on my 8088 machine with an SB Pro 2.0, with a v3.01 DSP.
It plays a short 64k 8-bit mono sample at 22050 Hz, in a loop. The sample is stored in the file 'sample.raw', and you can replace it with any sample you like. It will just load the first 64k of that file.
It plays the file in transfers of 1024 bytes each, so it restarts about 21.5 times per second. So you should listen for a 'rumble' of about 21.5 Hz in the signal.
I have created the following tests:
Single-cycle
This one is 'by the book', according to the Creative hardware programming manual:
Start a single-cycle transfer, and set up an interrupt handler for when it ends.
Inside the interrupt handler, I start the next single-cycle transfer as quickly as possible.
Single-cycle, no busy-wait for DSP after interrupt
This method is the same as above, with a small twist:
According to the programming manual, you need to check the status of the DSP and wait until it is not busy, before you send a new command.
During testing, I found that when the interrupt fires, I don't need to wait before I send the first command, because the DSP isn't busy: it has just signaled us that it's done.
I do still have to wait when I send the bytes containing the length of the transfer.
However, skipping the wait for that first write shaves off some cycles, and restarts the transfer more quickly.
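The handler with this twist can be sketched as follows (again with the hardware access mocked out; the ack via a read of base+0xE and the EOI to the PIC at port 0x20 are the usual housekeeping, and the busy-wait is the standard poll of bit 7 of base+0xC):

```c
#include <assert.h>

/* Mocked hardware access; real code would use inportb()/outportb().
   We log the writes to the DSP write port (base+0xC) to check the order. */
static unsigned char wr_log[16];
static int wr_n = 0;

static int  dsp_write_busy(void) { return 0; } /* bit 7 of base+0xC, mocked */
static void dsp_out(unsigned char v) { wr_log[wr_n++] = v; }
static void dsp_ack_irq(void) { /* inportb(base+0xE) on real hardware */ }
static void pic_eoi(void) { /* outportb(0x20, 0x20) on real hardware */ }

/* Interrupt handler: the DSP has just told us it's done, so it can accept
   a command right away - skip the busy-wait on the command byte only. */
static void sb_irq_handler(unsigned int len)
{
    unsigned int n = len - 1;
    dsp_out(0x14);                   /* no busy-wait: DSP is idle right now */
    while (dsp_write_busy()) ;       /* but do wait before each length byte */
    dsp_out((unsigned char)(n & 0xFF));
    while (dsp_write_busy()) ;
    dsp_out((unsigned char)(n >> 8));
    dsp_ack_irq();                   /* acknowledge the DSP interrupt */
    pic_eoi();                       /* and the interrupt controller */
}
```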
Single-cycle hack
This method uses a common 'hack' from early software, as documented on the DOSBox-X site: https://github.com/joncampbell123/dosbox-x/wi … Nagging-the-DSP
In short, you simply don't wait for the interrupt to occur, but you restart the transfer in mid-flight.
Apparently this works on most hardware.
My implementation uses a timer interrupt to restart the DSP roughly every 1024 samples, like the other methods.
I have found that on my SB Pro 2.0 I really do need to wait until the DSP is not busy before I send any commands, else I get very obvious pops and clicks.
So I have implemented it with full busy-waiting.
I have found that on my hardware it does not play quite seamlessly: you hear a bit of 'flutter' in the signal; it actually seems slightly worse than the second method.
Auto-init
This method should be seen as a 'reference' implementation.
It uses the Auto-Initialize mode supported by DSP v2.00 and higher, and is designed to be seamless. The DSP just continues playing endlessly (even if you don't acknowledge the interrupt it sends), and is not dependent on any CPU intervention.
If your card supports it, you can take this as a reference, and compare to the others, to see if you hear any subtle 'flutter', 'rumble' or other distortions caused by the other methods not being entirely seamless.
Any Sound Blaster 2.0 and newer will support this out-of-the-box.
Some Sound Blaster 1.0 and 1.5 cards may also support this, since Creative sold the DSP v2.00 as an aftermarket upgrade for these cards.
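The start sequence for this mode differs from single-cycle: you set the block size once with command 0x48 (again length minus one, low byte first) and then issue command 0x1C, after which the DSP interrupts once per block and keeps going by itself. A sketch, with the port writes mocked into a log as before:

```c
#include <assert.h>

/* Mocked DSP write; real code busy-waits on base+0xC bit 7, then outportb. */
static unsigned char dsp_log[8];
static int dsp_n = 0;
static void dsp_write(unsigned char v) { dsp_log[dsp_n++] = v; }

/* Start auto-init 8-bit DMA output: set the block size once (command
   0x48, length minus one, low byte first), then command 0x1C.  The DSP
   then interrupts every 'len' samples and keeps playing on its own. */
static void start_auto_init(unsigned int len)
{
    unsigned int n = len - 1;
    dsp_write(0x48);
    dsp_write((unsigned char)(n & 0xFF));
    dsp_write((unsigned char)(n >> 8));
    dsp_write(0x1C);
}
```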
My conclusion so far is that I simply can't get perfectly seamless playback on my 8088 with my SB Pro 2.0. The second method is close, but not perfect. I have yet to try it on a faster machine. A faster CPU may handle the interrupt and reprogram the DSP just that bit faster, making it work.
This issue is also related to the Sound Blaster 16 and certain clones, see also this earlier topic on the matter: Sound Blaster 16 Bugs and Deficiencies Summary
And this blog by Great Hierophant: http://nerdlypleasures.blogspot.nl/2015/05/16 … t-playback.html
His recordings show that the Sound Blaster 16's DSP seems to be even worse at handling the single-cycle technique than the earlier cards were.
So if you have a real Sound Blaster (any kind), or any clone, I'd like to invite you to try my program, and post your findings here.
Thanks in advance!
PS: If you happen to have an SB1.x with DSP v1.xx, you could also help to settle the myth on OSDev: http://wiki.osdev.org/ISA_DMA
"Some expansion cards do not support auto-init DMA such as Sound Blaster 1.x. These devices will crash if used with auto-init DMA. Sound Blaster 2.0 and later do support auto-init DMA."
I don't think that is true. I don't think the DMA controller can crash any expansion card at all.
I think they may be confusing terminology here. You can put the DMA controller in 'auto-init', which is something different from the 'Auto-Initialize' mode of the SB.
Namely, putting the DMA controller in auto-init just means it resets itself to the same buffer at the end of a transfer. I don't see how that could crash the DSP: to the DSP it is no different from having the CPU reprogram the controller for the next transfer.
Setting the SB in auto-init means you send it a different command. Perhaps the old DSPs crash on that command, because that command didn't exist yet.
My program checks the DSP version before it tries to run the auto-init test. So I don't expect it to crash any SB.
It has the DMA controller in auto-init mode for the entire test.
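In code the distinction is clear: the controller side of auto-init is just one bit in the 8237 mode byte, nothing the card ever sees as a command. Below is a sketch of how my test sets up channel 1 (the port numbers and mode values are standard 8237 programming; the outportb() is mocked into a log here so the sketch runs anywhere):

```c
#include <assert.h>

/* Mocked outportb(): log (port, value) pairs instead of touching hardware. */
static unsigned short io_port[16];
static unsigned char  io_val[16];
static int io_n = 0;
static void outportb(unsigned short port, unsigned char v)
{
    io_port[io_n] = port; io_val[io_n] = v; io_n++;
}

/* Program 8-bit DMA channel 1 for SB playback.  Mode 0x59 = single mode,
   auto-init, read (memory -> card); 0x49 is the same without auto-init.
   'phys' is the 20-bit physical address of a buffer that does not cross
   a 64k page boundary. */
static void dma1_setup(unsigned long phys, unsigned int len, int autoinit)
{
    unsigned int count = len - 1;
    outportb(0x0A, 0x05);                  /* mask channel 1 */
    outportb(0x0C, 0x00);                  /* clear the byte flip-flop */
    outportb(0x0B, autoinit ? 0x59 : 0x49);/* mode: auto-init is bit 4 */
    outportb(0x02, phys & 0xFF);           /* offset, low then high byte */
    outportb(0x02, (phys >> 8) & 0xFF);
    outportb(0x03, count & 0xFF);          /* count = length - 1 */
    outportb(0x03, count >> 8);
    outportb(0x83, phys >> 16);            /* page register for channel 1 */
    outportb(0x0A, 0x01);                  /* unmask channel 1 */
}
```

Whether that mode byte is 0x59 or 0x49 is invisible to the DSP; only the 0x1C-style commands sent to the DSP's own write port involve its Auto-Initialize mode.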