VOGONS


Reply 20 of 53, by Scali

elianda wrote:

What if you increased the gap to a known time, cut the 'skipped' samples from the digital part, and replaced the output during the DSP re-init with some OPL-generated signal?
I mean, you just want the voltage level at the output, and its source can be the OPL as well.
Since the card has no mixer, there is a fixed relation between the digital and OPL levels.

I think that will be too difficult to pull off in practice. There are two problems here:
1) OPL can only play 6-bit samples, so not every PCM sample can be played back at exactly the same level on the OPL. I think the maximum output on OPL is also lower (you play on only one channel, while the card is designed with enough headroom to play at least 8 channels at a time).
Sure, you could reduce the PCM samples so that they match the available OPL levels, but that may hurt quality more than the occasional glitch does (see the toy sketch below).
2) The timing will be extremely difficult to pull off, if it is possible at all. If your timing is off by a fraction, you still get a glitch. And that's just on a single PC. If you want this technique to work across multiple configurations (different CPU speeds, different types of Sound Blasters, OPL chips etc.), you'd have to re-tune it for every possible configuration.
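
For illustration only, a toy sketch of the level-matching loss from point 1, assuming an idealised 64-level (6-bit) OPL output; this is a simplification and ignores the lower maximum output mentioned above:

    #include <stdio.h>

    /* Toy illustration: collapse 8-bit unsigned PCM onto 64 levels and
       measure the worst-case rounding error introduced by the reduction. */
    int main(void)
    {
        int s, worst = 0;
        for (s = 0; s < 256; s++) {
            int opl  = s >> 2;      /* nearest-below of the 64 available levels */
            int back = opl << 2;    /* the level that would actually be output */
            if (s - back > worst)
                worst = s - back;
        }
        printf("worst-case level error: %d out of 255 (about %.1f%%)\n",
               worst, worst * 100.0 / 255.0);   /* 3 out of 255, roughly 1.2% */
        return 0;
    }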

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 21 of 53, by Scali


I've timed the code to send commands to the DSP (on an SB Pro 2.0).
Results:
Sending a single sample takes 216 CPU cycles. So you literally can't get more than ~22 kHz out of a 4.77 MHz 8088; you can't poke the samples into the DSP any faster than that. And the bottleneck is probably the DSP rather than the 8088, because the code is just busy-waiting on the DSP.

Setting up a new sample buffer took about 316 cycles.
So there's no way to make this seamless: the DSP simply can't set up a new transfer in less time than one 22 kHz sample period takes. Polling the DMA controller won't solve this either.
At 316 cycles on a 4.77 MHz machine you arrive at effectively ~15 kHz, so perhaps it is seamless below that rate.
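
For what it's worth, the arithmetic is easy to sanity-check. A trivial sketch; the 4.77 MHz clock and the two measured cycle counts are the only inputs, the rest is plain division:

    #include <stdio.h>

    int main(void)
    {
        const double cpu_hz            = 4772727.0;  /* 4.77 MHz PC/XT clock */
        const double cycles_per_sample = 216.0;      /* measured: one direct sample write to the DSP */
        const double cycles_per_setup  = 316.0;      /* measured: setting up a new transfer */

        printf("max direct-output rate: %.0f Hz\n", cpu_hz / cycles_per_sample); /* ~22 kHz */
        printf("seamless only below:    %.0f Hz\n", cpu_hz / cycles_per_setup);  /* ~15 kHz */
        return 0;
    }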

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 22 of 53, by Cloudschatze

Scali wrote:

The Microsoft article is very specific about the clicking on DSP v1.xx cards.

I like that it also mentions the change from half-duplex to full-duplex MIDI I/O in v2.00. This particular limitation of the v1.05 DSP doesn't seem to get much attention, but was yet another (ridiculous) problem that required additional workarounds on the part of SB-supporting sequencer developers, and limited the appeal and usefulness of the early Sound Blaster for any serious MIDI work.

If not for "kingmaker" Microsoft, perhaps the Sound Blaster might not have evolved into a standard at all, given how half-baked the original hardware was.

The other quote is also interesting. It specifically says that Wing Commander II can crash DSP v1.05. Question remains: why?

Wing Commander II uses single-cycle DMA playback, so I'm inclined to think you're probably right about it being related to general interrupt processing/generation.

Reply 23 of 53, by Scali

Cloudschatze wrote:

If not for "kingmaker" Microsoft, perhaps the Sound Blaster might not have evolved into a standard at all, given how half-baked the original hardware was.

In my recollection, Sound Blaster was the standard before Windows was 'a thing'.
When I got my SB Pro 2.0, I was still running Windows 3.0 and had to install the multimedia extensions separately before I could even use a sound card at all.
I did all my gaming and music editing in DOS (my SB Pro 2.0 came with the MIDI kit and a copy of Voyetra).

I guess it was mainly the lack of competition that made the SB a standard. You either had really expensive options like the IBM Music Feature Card or the Roland LAPC-I, or other 'budget' options like the AdLib and the Covox stuff, and the SB was clearly superior to those.
By the time other serious competition arrived in the form of the Pro Audio Spectrum, the SB's issues had already been solved with DSP v2.00, and the SB was an established standard.

What Creative did really well was to look closely at the AdLib and see what it was missing: a DAC, a game port and MIDI. So they added those. Perhaps they didn't add them all that well at first, but still, it was better than what the competition was offering: nothing.
Heck, the whole PC standard is a collection of devices that were 'not done that well at first', so the SB fits right in there.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 24 of 53, by Scali


The jury is still out on what exactly happens on an SB 1.x with a DSP v1.xx chip (so far we've only tested on DSP v2.00 and up; the others seem very rare)...
But while investigating and working on the program, I also kept track of my undertakings in a blog.
I've just completed it and published it, for those who may be interested: https://scalibq.wordpress.com/2017/03/12/dma-activation/
(I suppose it is mostly just a summary of things already discussed in this thread).
Thanks for everyone's help so far.
And I hope to update it with some actual DSP v1.xx data at some point. That would complete this particular journey.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 25 of 53, by TheGreatCodeholio


What I found while debugging Crystal Dream in DOSBox-X is that on SB hardware you can use DSP command 0x14 to start playing a non-auto-init DMA block, and then, even before the block ends, issue command 0x14 again to effectively start the block over. So you can (as DOSLIB calls it) "nag" the DSP every so often without waiting for the command to finish, and the DSP will keep playing audio continuously.
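
At the port level a "nag" looks roughly like this. A minimal sketch only, not DOSLIB's actual code; it assumes base address 0x220, Watcom/MSC-style outp()/inp() from <conio.h>, and a DMA controller that has already been programmed with the buffer:

    #include <conio.h>

    #define SB_BASE 0x220                  /* assumed base I/O address */

    static void dsp_write(unsigned char v)
    {
        /* wait for the write-buffer status bit (bit 7 of base+0xC) to clear */
        while (inp(SB_BASE + 0xC) & 0x80)
            ;
        outp(SB_BASE + 0xC, v);
    }

    /* (Re)issue the 8-bit single-cycle DMA output command for 'len' bytes.
       Calling this again before the current block runs out restarts the block,
       which is the "nag" described above. */
    static void dsp_nag(unsigned int len)
    {
        unsigned int n = len - 1;          /* the DSP takes length minus one */
        dsp_write(0x14);                   /* 8-bit single-cycle DMA output */
        dsp_write((unsigned char)(n & 0xFF));
        dsp_write((unsigned char)(n >> 8));
    }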

SB16 hardware emulates this quite well; however, doing the same with DSP commands 0xB0-0xCF (the SB16 playback commands) causes glitches or makes the DSP stop playing. Some clones support this trick too.

DOSBox-X project: more emulation better accuracy.
DOSLIB and DOSLIB2: Learn how to tinker and hack hardware and software from DOS.

Reply 26 of 53, by TheGreatCodeholio


What I described was tested on an old SB 2.0, SB Pro, SB16, and a Gallant SC-6000 clone that I got with a CD-ROM drive way back in the day. I don't have a Sound Blaster with a DSP 1.xx to test on. I discovered it because a bug in DOSBox-X caused Crystal Dream to continuously reissue DSP command 0x14 from its timer interrupt, long before the DSP block finished playing. I later found out that Crystal Dream doesn't use the SB IRQ but watches the DSP busy bit cycle to know when the DSP has finished cycling DMA, so that it can reissue DSP command 0x14. It's a neat trick to avoid having to detect the IRQ, I think.
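
The polling can be sketched roughly like this (not Crystal Dream's actual code; the base port 0x220, the Watcom/MSC-style inp() and the 1000-read threshold are all assumptions):

    #include <conio.h>

    #define SB_BASE 0x220                  /* assumed base I/O address */

    /* Returns 1 once the busy bit has stopped toggling for a while (block
       presumably over), 0 as soon as any activity is seen. */
    static int dsp_block_finished(void)
    {
        unsigned char last = (unsigned char)(inp(SB_BASE + 0xC) & 0x80);
        unsigned int  i;

        for (i = 0; i < 1000; i++) {       /* arbitrary "no activity" window */
            unsigned char now = (unsigned char)(inp(SB_BASE + 0xC) & 0x80);
            if (now != last)
                return 0;                  /* still toggling: DSP still fetching DMA */
        }
        return 1;                          /* quiet: treat the block as finished */
    }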

DOSBox-X project: more emulation better accuracy.
DOSLIB and DOSLIB2: Learn how to tinker and hack hardware and software from DOS.

Reply 27 of 53, by NewRisingSun


I have a Sound Blaster card with a v1 DSP. Post a test program, and I'll report the results. (The card is installed in a Tandy 1000 TX, so real mode 286 instructions are okay, but 386 instructions are not.)

Reply 28 of 53, by TheGreatCodeholio

NewRisingSun wrote:

I have a Sound Blaster card with a v1 DSP. Post a test program, and I'll report the results. (The card is installed in a Tandy 1000 TX, so real mode 286 instructions are okay, but 386 instructions are not.)

DOSLIB has binary releases here:

https://github.com/joncampbell123/doslib/releases

I recommend the latest one:

https://github.com/joncampbell123/doslib/rele … a-binary.tar.xz

The Sound Blaster test code is in hw/sndsb.

dos86l is the recommended 16-bit binary.
dos386f is the recommended 32-bit binary.

EDIT: Hold on, just noticed a bug in enabling "nag" mode. It only triggers from the idle command if Goldplay playback mode is enabled. I'll fix that shortly.
EDIT: Never mind, it checks out.

DOSBox-X project: more emulation better accuracy.
DOSLIB and DOSLIB2: Learn how to tinker and hack hardware and software from DOS.

Reply 30 of 53, by TheGreatCodeholio


Since you mentioned a 286 Tandy, I recommend using hw/sndsb/dos86l/test.exe.

DOSBox-X project: more emulation better accuracy.
DOSLIB and DOSLIB2: Learn how to tinker and hack hardware and software from DOS.

Reply 32 of 53, by Scali

NewRisingSun wrote:

I have a Sound Blaster card with a v1 DSP. Post a test program, and I'll report the results. (The card is installed in a Tandy 1000 TX, so real mode 286 instructions are okay, but 386 instructions are not.)

You could try my test program (I only target the 8088). Program here: https://www.dropbox.com/s/a38bc9ap8w5fqlv/SBDMA.zip?dl=0
And this sine wave is a better test sample than the one I included: ftp://ftp.oldskool.org/pub/misc/temp/bigsin.raw

Basically, just run it, press a key to move to the next test, and listen for any discontinuities when it loops the sample.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 33 of 53, by TheGreatCodeholio

NewRisingSun wrote:

If you don't want to provide a proper answer, then just forget about it.

What?

Just use the Sound Blaster test program in HW/SNDSB/DOS86L/TEST.EXE.

That's the 16-bit large model build, which should work on a 286.

EDIT: TEST.EXE is a program that lets you pick a WAV file and then use it to toy around with your Sound Blaster's playback modes. Menus are provided to manually switch between playback modes, Goldplay mode, buffer sizes, IRQ intervals, etc. It's manual testing, not automated, sorry.

EDIT: Unpack the .tar.xz; binary builds of DOSLIB are inside. There's a HW/SNDSB subdirectory where the Sound Blaster code is located, and that contains TEST.EXE. You said you wanted to test it on a 286 Tandy, so you'll need the 16-bit large model build in HW/SNDSB/DOS86L/TEST.EXE.


DOSBox-X project: more emulation better accuracy.
DOSLIB and DOSLIB2: Learn how to tinker and hack hardware and software from DOS.

Reply 34 of 53, by NewRisingSun


I didn't ask you to vomit the correct .EXE file on me. What am I supposed to do with that mystery program? What menu options must be selected to test the seamless playback issue? What are these "hacks" and "nags"? I have programmed Sound Blasters myself, and I have no idea what half of the menu options are supposed to mean.

Reply 35 of 53, by TheGreatCodeholio

NewRisingSun wrote:

I didn't ask you to vomit the correct .EXE file on me. What am I supposed to do with that mystery program? What menu options must be selected to test the seamless playback issue? What are these "hacks" and "nags"? I have programmed Sound Blasters myself, and I have no idea what half of the menu options are supposed to mean.

HW/SNDSB/DOS86L/TEST.EXE is a "text user interface" that lets you pick a WAV file to play (File -> Load) and then play it. The menus are full of options to then adjust playback speed, format, and how it uses the DMA and DSP. It's full of hacky options to see what the Sound Blaster can do.

In your case, load a WAV file with the program, and then use the Playback menu to switch off auto-init DSP playback (if it's on). Then, use the Playback menu again to adjust the IRQ interval, and again to enable DSP nag mode to see how well it handles it. It's a general tool to toy around with your Sound Blaster to see what happens.

If all you need is something to just step through the tests, then yeah, perhaps my text UI is too much fun to play with and you just might use Scali's program instead.

EDIT: Full source code for HW/SNDSB/DOS86L/TEST.EXE is on Github if you'd like to see what it's doing:

https://github.com/joncampbell123/doslib/tree … master/hw/sndsb

TEST.EXE is a text UI:

[attachment: test_000.png (4.56 KiB), screenshot of the TEST.EXE text UI]

DOSBox-X project: more emulation better accuracy.
DOSLIB and DOSLIB2: Learn how to tinker and hack hardware and software from DOS.

Reply 36 of 53, by TheGreatCodeholio


In all seriousness though, perhaps it's time for DOSLIB to have more formal testing programs, rather than just UIs to toy with hardware.

DOSBox-X project: more emulation better accuracy.
DOSLIB and DOSLIB2: Learn how to tinker and hack hardware and software from DOS.

Reply 37 of 53, by NewRisingSun


I have successfully run SBDMA on the Tandy 1000 TX, using the Sound Blaster 1.5 with DSP version 1.05. In addition to recording the SB 1.5's audio output, I also (lamely) captured the video so you can see what is going on. Lacking a means of capturing the RGBI output, I had to resort to capturing the composite output, which is why the 80-column text is soaked in cross-color artifacts. It's still readable, though. Here is the video (about 11 MB).

Basically, normal single-cycle playback sounds good enough, with and without the "busy wait", whatever that means exactly. The hack, on the other hand, exhibits audible pauses. So, what does that tell us? 😀

Feel free to request any other SB 1.5 DSP v1.05 tests, as long as I am told exactly what to do, every step of the way. 😉 The card also has the CMS chips mounted and working.

Reply 38 of 53, by James-F

Scali wrote:

That still leaves us with 2 open questions:
1) Does a v1.xx DSP also work correctly when not waiting for busy after the interrupt?
2) Will a v1.xx DSP crash when the DMA controller is in auto-init mode?

I expect the answer to 1) to be a 'yes', and 2) to be a 'no', but I'd like to make sure.

Thanks, NewRisingSun.
You guessed right, Scali.


my important / useful posts are here

Reply 39 of 53, by NewRisingSun


I have repeated the procedure, this time:

  1. recording the 500 Hz file instead of the piano piece
  2. recording at both of the speeds that the Tandy 1000 TX supports
  3. choosing the 40-column text mode for improved legibility on the composite output.

While the piano piece did not sound so bad, the sine wave exposes the choppiness of the output without mercy.

Scali wrote:

Will a v1.xx DSP crash when the DMA controller is in auto-init mode?

Why would anyone think that this might happen? The DSP does not know the internal state of the DMA controller. In fact, the DSP cares so little about the DMA controller's internal state that the CT1320C does not even connect the terminal count (T/C) pin from the ISA bus, as seen in this picture that I made.

As it happens, setting the DMA controller into auto-init mode even when the DSP is programmed for single-cycle mode is a good workaround for yet another hardware error of the Sound Blaster 16 DSP, which occasionally requests one more byte than it should, causing sound effect dropouts in Wing Commander II, Wolfenstein 3-D and Jill of the Jungle.
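
For reference, that workaround can be sketched like this: program 8237 DMA channel 1 with the standard auto-init mode byte (0x59) while the DSP itself is still driven with the single-cycle 0x14 command (see the earlier sketch). The channel number, port addresses and the Watcom-style outp() are assumptions about the setup, not something stated above.

    #include <conio.h>

    /* Program 8237 DMA channel 1 for auto-init playback from a physical buffer
       that does not cross a 64 KB page. Mode byte 0x59 = single mode + auto-init
       + read transfer (memory to card) + channel 1. If the DSP then asks for one
       byte too many, the count has simply reloaded and the address wrapped back
       to the start of the buffer, so the extra request fetches a valid sample
       instead of leaving the transfer dead. */
    static void dma1_autoinit(unsigned long phys, unsigned int len)
    {
        unsigned int n = len - 1;                          /* 8237 takes count minus one */

        outp(0x0A, 0x05);                                  /* mask channel 1 */
        outp(0x0C, 0x00);                                  /* clear the byte flip-flop */
        outp(0x0B, 0x59);                                  /* auto-init, read, channel 1 */
        outp(0x02, (unsigned char)(phys & 0xFF));          /* address, low then high */
        outp(0x02, (unsigned char)((phys >> 8) & 0xFF));
        outp(0x83, (unsigned char)((phys >> 16) & 0xFF));  /* page register, channel 1 */
        outp(0x03, (unsigned char)(n & 0xFF));             /* count, low then high */
        outp(0x03, (unsigned char)(n >> 8));
        outp(0x0A, 0x01);                                  /* unmask channel 1 */
    }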