VOGONS


First post, by Jan3Sobieski


I honestly didn't want to ask for this, but I've been reading this awesome forum for the past few months and have read through most of it. Unfortunately I'm more confused now than I was before, especially when it comes to sound.

Can someone please post (and I know this is asking a lot), or point me to the right thread if it already exists, an explanation of all these terms: what they are, what they do, why they're good/bad, what the point of them is:

FM synthesis, wavetables, soundfonts, MPU-401, UART, daughterboards, sound vs. music (I never knew they were different, are they?), the OPL3 chip (what's so special about it?), the EMU8000, and other terms I'm forgetting right now.

I think I understand emulation, but I don't get how you sometimes write that, let's say, a sound card doesn't emulate something correctly. What does that actually mean? That it sounds bad?
Same thing with daughterboards. I know you plug them into the WaveBlaster pins or whatever, but how are they handled in DOS and Windows? Do I need drivers for that? Does the card use the daughterboard automatically, or do I have to enable something?

I know this is a lot, but I'm getting frustrated that I don't understand most of this. I've been gaming since ancient times; I just never took PC sound/music too seriously. It always either worked or didn't, and that was that.

Reply 1 of 2, by swaaye


FM (frequency modulation) synthesis uses mathematics to simulate musical instruments. Some instruments are easier to recreate than others (e.g. bells are very easy while strings are nigh impossible). The Roland MT-32 uses another form of simulation, called linear arithmetic synthesis.
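
To make the "mathematics" part concrete, here's a rough two-operator FM sketch in plain C. It's only the principle, not how you'd actually program an OPL chip (those wrap the same idea in registers, envelopes and more operators):

#include <math.h>
#include <stdio.h>

#define PI 3.14159265358979

/* Minimal two-operator FM: a "modulator" sine wobbles the phase of a
   "carrier" sine. The frequency ratio and modulation depth decide the
   timbre; non-integer ratios tend to sound bell-like. */
int main(void)
{
    const double rate    = 44100.0;  /* output sample rate */
    const double carrier = 440.0;    /* note frequency in Hz */
    const double ratio   = 3.5;      /* modulator frequency = carrier * ratio */
    const double depth   = 2.0;      /* modulation index */
    int n;

    for (n = 0; n < 200; n++) {
        double t   = n / rate;
        double mod = sin(2.0 * PI * carrier * ratio * t);
        double s   = sin(2.0 * PI * carrier * t + depth * mod);
        printf("%f\n", s);
    }
    return 0;
}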

Wavetable (aka sample synthesis) uses recordings of real instruments. It does some interpolation/extrapolation to create the full breadth of their sound across keys. It also frequently uses various effects to enhance the sound, such as reverb and chorus.
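
The "across keys" part is roughly this: step through the same recording at a rate proportional to the wanted pitch and interpolate between samples. A tiny sketch of the idea (hypothetical buffer; real wavetable hardware adds loop points, envelopes, filters and effects on top):

/* ratio = wanted_pitch / recorded_pitch; n = output sample index */
float play_resampled(const float *sample, int length, double ratio, int n)
{
    double pos = n * ratio;
    int    i   = (int)pos;
    double fr  = pos - i;

    if (i + 1 >= length)
        return 0.0f;                 /* ran past the end of the recording */
    return (float)((1.0 - fr) * sample[i] + fr * sample[i + 1]);
}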

Soundfonts are Creative Labs' proprietary way of storing sample patch sets for wavetable synths. They can be a single instrument or a set of instruments, or even a full General MIDI-compliant pack.

MPU-401 is an interface that was used for MIDI devices to connect to computers. I think Roland created it but I'm not sure.

Daughterboards are little cards that attach to larger boards. MIDI daughtercards are essentially a MIDI synthesizer attachment to add such functionality to a sound card. They usually connect via MPU-401 interfaces.

Sound vs. Music - basically synthesis vs. PCM sound playback. Some old games do use synthesis for sounds instead of real digital audio samples.

Yamaha OPL3 is an FM synthesis chip. It was cheap and super popular on sound cards of olden times. There was also the OPL2, which is sort of half of an OPL3.
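
For what it's worth, DOS software talked to the OPL2/OPL3 through a pair of I/O ports at the classic AdLib address. A rough sketch of a register write (Borland-style port I/O; port numbers and delay loops from memory, so treat it as an illustration):

#include <dos.h>   /* Borland-style outportb()/inportb() */

/* Write one value into an OPL2/OPL3 register: 0x388 takes the register
   index, 0x389 the data. The old chips need a short pause after each
   write; reading the status port a few times is the usual cheap delay. */
static void opl_write(unsigned char reg, unsigned char val)
{
    int i;
    outportb(0x388, reg);
    for (i = 0; i < 6; i++)  inportb(0x388);   /* settle after index write */
    outportb(0x389, val);
    for (i = 0; i < 35; i++) inportb(0x388);   /* settle after data write  */
}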

Bad emulated FM = a simplistic recreation of the OPL3, usually. Emulation is computationally demanding, and old computers couldn't run a fully authentic OPL3 FM emulator and still be able to play games. So they simplified things, and the results generally sucked compared to the real hardware.

EMU8000 is a sample synthesis chip from E-Mu Systems, used by Creative for MIDI synthesis on their products. See AWE32, AWE64, SB32, and WaveBlaster. It's a predecessor of the synth hardware in the SB Live! and later.

etc. 😁 You can read a lot of interesting history on other sites. Wikipedia is pretty darn good. Also look up Quest Studios.

And this
http://www.crossfire-designs.de/?lang=en&what … cle=soundcards/

Reply 2 of 2, by gravitone


MIDI was introduced as a way to digitally transport music over a serial connection. Instead of actual sound, it transports the notes, velocity, instruments, etc.
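
To make that concrete: a single note is just a few bytes on the wire, something like this (standard MIDI note-on; values picked as an example):

/* 0x90 = note-on, channel 1; 60 = middle C; 100 = velocity (how hard
   the key was hit). A 0x80 status (or velocity 0) releases the note. */
unsigned char note_on[3]  = { 0x90, 60, 100 };
unsigned char note_off[3] = { 0x80, 60, 0 };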

Because MIDI itself didn't standardize which instruments were available, by which ID numbers they were known, which effects were supported, how many channels there were, etc., a lot of different instruments and synths came to market that all talked MIDI but were still incompatible with each other. Therefore, a number of "standards" arose. Roland standardized their own equipment early on around the MT-32-style layout and features: their keyboards, external synths, ISA cards, etc. all had the same capabilities, so each could accurately reproduce the others' sound. Since the composers who worked on games usually used an MT-32, for example, playback of that MIDI data obviously sounded best on the same device.
The early 90s saw the rise of the General MIDI (GM) standard, which was adopted by many manufacturers. It specified a fixed set of instruments, their ID numbers, available effects, etc. So a piece written with one MIDI device playing a horn, for example, would be played back with the same instrument on other devices.

However, exactly identical sound never results, because of differences in the quality of the sample recordings used by each wavetable, which come from physically different instruments being recorded, different recording conditions, and so on. The volume balance between instruments in a sample set can also make things sound different. FM synthesis can even be used to approximate the GM instrument and feature set instead of wavetable synthesis: instrument 1 on both devices might be a grand piano, but they will sound completely different from each other. So playing back GM material on a GM-compatible device does not automatically give you the sound the composer intended.

Matters were complicated even further with the introduction of the additional GS standard, which greatly expanded the available instruments. A GS MIDI file played back on plain GM equipment will therefore sound wrong.
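
For example, GM instruments are selected with an ordinary MIDI program change, and the numbers are what the standard pins down, not the sound itself (example bytes, from memory):

/* 0xC0 = program change on channel 1; data byte 0 = GM program 1,
   defined as Acoustic Grand Piano. Every GM device maps this to *some*
   grand piano, but the actual samples (or FM patch) behind it differ. */
unsigned char select_piano[2] = { 0xC0, 0 };
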
Now for sound cards: the MPU interface was introduced pretty early on, and on sound cards it appears as the wavetable header. It is nothing more than the standard set of MIDI IN/OUT connections you find on keyboards and external synths, coupled with return pins that carry the audio from the daughterboard back to the sound card's mixer. MPU is nothing more than a serial port that transports MIDI. It was tied directly into the x86 system bus through the ISA slot and let a program simply write the MIDI bytes to I/O port 330, for example. The external MIDI port, which usually shared pins with the gameport, was wired up to the same I/O port, so the daughterboard connector and external MIDI devices receive the same data. That makes the only real difference between daughterboards and external MIDI modules the price and the number of extra wires needed to tie it all together.
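
In code, "writing the MIDI information to port 330" amounts to something like this (DOS, Borland-style port I/O; the port numbers, the 0x3F UART-mode command and the status bit are from memory, so consider it a sketch):

#include <dos.h>   /* Borland-style outportb()/inportb() */

#define MPU_DATA   0x330   /* default MPU-401 data port        */
#define MPU_STAT   0x331   /* status on read, command on write */

/* Bit 0x40 of the status port reads 0 when the MPU-401 is ready to
   accept a byte. The same bytes end up at the wavetable header and
   at the gameport's MIDI pins. */
static void mpu_wait_ready(void)
{
    while (inportb(MPU_STAT) & 0x40)
        ;                              /* busy: not ready yet */
}

static void mpu_uart_mode(void)
{
    mpu_wait_ready();
    outportb(MPU_STAT, 0x3F);          /* put the MPU-401 into dumb UART mode */
}

static void mpu_send(unsigned char b)
{
    mpu_wait_ready();
    outportb(MPU_DATA, b);             /* raw MIDI byte goes out the wire */
}
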
The EMU chip was used as a replacement for the chips found on wavetable daughterboards. Instead of a fixed set of instrument samples stored in ROM, it provided RAM that could be loaded with a sample set of your choice. For games this meant they could ship their own unique set of instruments for more variety. However, because of the established GM/GS standards (and to a lesser extent XG), and because most game music was composed on Roland Sound Canvas hardware at the time, game support for the EMU8000, and thus for the AWE32/64 as a MIDI synth, was quite poor. Some games just loaded a generic GM set, or relied on the Creative-supplied GM soundfont that was loaded when the card was initialized during boot. In effect, the EMU chip ended up doing what the Gravis Ultrasound had done years before.
Because CD-ROMs were becoming well established as a medium, MIDI music was quickly phased out in later DOS games and almost all Windows games to make way for CD-DA, making the whole MIDI portion of sound cards practically obsolete for gaming.