VOGONS


When will protection be broken?


Reply 20 of 30, by Nicht Sehr Gut

Rank: l33t

jez, I'm struggling with my response here. My mind is filled with sarcastic replies. I'll try to keep them under control.

Originally posted by jez: Gravis Ultrasound? Hahahahah.

An excellent card that I loved. Only went to a SoundBlaster Live when I was forced to (had a couple of applications that required some DirectX support that the GUS didn't have). Imagine my surprise when I found all my MIDI files now sounded terrible. I managed to improve it using some of the various soundfonts out there, but I was never really satisfied with the results.

Put simply, the Ultrasound easily thrashed the SoundBlaster, which at the time only supported 1 or 2 simultaneous digital sounds, whereas the GUS handled 32 and had 16-bit stereo well before the SoundBlasters. In general, SB cards, for me, have been the epitome of bland averageness.
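(For anyone who never did DOS audio programming, "simultaneous digital sounds" just means mixing voices into one output stream. On a plain SB, the CPU had to do that summing itself; the GUS summed up to 32 voices in hardware. Here's a minimal C sketch of that CPU-side mixing loop -- the voice_t struct and the hard-clipping policy are my own illustration, not anything from a real driver:

#include <stdint.h>
#include <stdio.h>

/* One playing sound: a cursor into a block of 16-bit PCM samples. */
typedef struct {
    const int16_t *samples;
    int pos, len;
    int active;
} voice_t;

/* Sum every active voice into one output stream -- the work the CPU
   had to do for a card with a single digital channel, and the work
   the GUS did in hardware for up to 32 voices. */
static void mix_block(voice_t *v, int nvoices, int16_t *out, int nframes)
{
    for (int i = 0; i < nframes; i++) {
        int32_t acc = 0;
        for (int j = 0; j < nvoices; j++) {
            if (!v[j].active) continue;
            acc += v[j].samples[v[j].pos++];
            if (v[j].pos >= v[j].len) v[j].active = 0;
        }
        if (acc > 32767) acc = 32767;    /* clamp the sum back */
        if (acc < -32768) acc = -32768;  /* into 16-bit range  */
        out[i] = (int16_t)acc;
    }
}

int main(void)
{
    static const int16_t a[4] = { 1000, 2000, 3000, 4000 };
    static const int16_t b[4] = { 500, 500, 500, 500 };
    voice_t v[2] = { { a, 0, 4, 1 }, { b, 0, 4, 1 } };
    int16_t out[4];

    mix_block(v, 2, out, 4);
    for (int i = 0; i < 4; i++)
        printf("%d ", out[i]);           /* prints: 1500 2500 3500 4500 */
    printf("\n");
    return 0;
}

Doing that inner loop for every output sample, on top of the game itself, is exactly why most SB-era games stuck to one or two digital sounds.)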

Just so you know, my AWE32 (in the 486-66) likes to pick its IRQ, DMA, and address at random after every reboot. The settings it chooses are always useless and require that I change them manually every time. I have yet to find a fix for this.

Maybe you should have done some research before buying a PC.

*cough*
*cough**cough*
*cough**cough**cough*
I DID.

Too many consumers expect stuff to 'just work', ya know.

*stares*
Do you have any idea how arrogant that statement sounds?

Let's see: Video card says it has 3D hardware acceleration and will support Microsoft's upcoming DirectX standard.
The video card company then resists releasing drivers that have proper DirectX support, claiming that it was very difficult to implement and not conducive to the design of the card.

Perhaps you can explain how my requesting proper support was unreasonable?

Very rarely did I have a problem getting a game to work. Maybe that was because we had *very standard hardware*.

Same standard story. Same condescending tone. If you really know the PC's history, then you know that after IBM actually started selling licenses to make "clones", "the standards" were decided by little more than mob rule. You know about the artificial 640K barrier, the fight between the EMS and XMS memory formats, EISA, IBM's PS/2, OS/2, etc...

The logic that I should sit back and wait for everyone else to make up their minds on what they want and copy them...just for a machine to work...is nonsense.

As for your rarity of trouble: consider yourself lucky. I was still living in a dorm at the time of my entry into the PC world and found that it was filled with guys who all had their share of troubles...and yes, they got what was considered "the standard" hardware. The only difference was that we each had different problems with different programs.

The "standard"GUS wasn't SoundBlaster-compatible, and so inevitably had problems.

Well, this is a shock, as I was emulating SoundBlasters from the word go (well, actually, from when I started trying out DOS titles that didn't have GUS support). Basic SB in DOS, SB Pro when run in a Windows command prompt.

Glad I didn't have one.

Glad I did.

Reply 21 of 30, by jez

Rank: Member

Nicht:

Put simply, the Ultrasound easily thrashed the SoundBlaster, which at the time only supported 1 or 2 simultaneous digital sounds, whereas the GUS handled 32 and had 16-bit stereo well before the SoundBlasters.

Most games didn't support that though, did they? So you spent a load of money for support from a handful of games. It's like the Betamax: technically superior, but not worth it because it's not the standard.

Just so you know, my AWE32 (in the 486-66) likes to pick its IRQ, DMA, and address at random after every reboot.

*randomly*?!? I had an AWE32 (or was it an AWE64, I can never remember), and its settings always seemed to be fixed at something like P330, I7, D1, high DMA 3. I used SET BLASTER with those values and most games bought it. What you're describing sounds very strange.
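(For reference, that's the BLASTER environment variable, usually set in AUTOEXEC.BAT. A line matching the values I remember would look something like this -- note that the A220 base address and T6 card type are my assumption for an AWE32, since I only listed the other values:

SET BLASTER=A220 I7 D1 H3 P330 T6

where A is the base I/O address, I the IRQ, D the low DMA, H the high DMA, P the MPU-401 MIDI port, and T the card type.)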

Video card says it has 3D hardware acceleration and will support Microsoft's upcoming DirectX standard.

Ah, but that's much more modern. So modern that it only has to support ONE standard to be compatible with pretty much any Windows game. Back in the day, the soundcard had to provide its own standard... and then hope that the game developers would develop for it. Of course, making it support an already-entrenched standard (SB) tended to help.
I know that you already know this, just articulating it.

I was still living in a dorm at the time of my entry into the PC world and found that it was filled with guys who all had their share of troubles

Back in the day, the only people that should've been messing with DOS were geeks 😀 GUIs and 'standards' are for normal people 😉

And true geeks work out how to hack their soundcards' settings to work with a DOS game or 2, rather than just 'expecting it to work'. I know I did 😜 Very rewarding when it did work.

Well, this is a shock, as I was emulating SoundBlasters from the word go

I think 'emulating' is the word. Was it hardware emulation, or software?

== Jez ==

Reply 22 of 30, by Nicht Sehr Gut

Rank: l33t

Originally posted by jez: Most games didn't support that though, did they?

Most of the ones I had/was getting did. Of course, that's like you not having problems with your programs...a very subjective view.

So you spent a load of money...

You're presuming...it was about the same price as the SoundBlasters of the time.

...for support from a handful of games.

Native or add-on support for about 100 or so titles at the time, peaking at about 150 when DOS games dried up. For all other titles, you used SBOS (for basic SoundBlaster + AdLib) or, preferably, Mega-Em (SoundBlaster + General MIDI or SoundBlaster + Roland). Mega-Em worked like VDMSound, pumping the MIDI or Roland audio through your wavetable, but rearranging things a bit for Roland. Since the Roland emulation wasn't an exact match, you had the option of creating a custom "patch", much like using a custom soundfont nowadays.

Oh, and the demos, of course...

Obviously, support in Windows was like any other sound card, except I really liked the standard wavetable in the GUS when compared to other standard wavetables.

It's like the Betamax: technically superior,

Actually, Beta was the standard. Sony foolishly forgot the basics of capitalism in thinking it "owned" the home video market, and it was quite stingy about selling video licenses for its format to competitors.

Meanwhile, JVC was practically giving VHS licenses away. That meant they lost money to increased competition, but it caused a flood of VHS VCRs to hit the market; basically, they drowned Sony. JVC wasn't all that honest about licensing, either. The first VHS Hi-Fi VCRs to hit the market didn't all match the specs properly, causing some tapes to play in Hi-Fi on one VCR but not on another.

They spread a rumor around the industry that Sony's SuperBeta (increased-resolution Beta) was incompatible with previous Beta VCRs, casually forgetting that the only way it was "incompatible" was that playing a SuperBeta tape on the very first Beta VCRs released produced some minor video noise. Meanwhile, they implied their "VHS HQ" format would do the same thing with perfect compatibility, again casually forgetting to tell people that it didn't increase the resolution in any way and simply added four chips that acted like one of those old "video enhancement" boxes. To add insult to injury, even though JVC advertised the four-chip method, they would still allow you to place the "HQ" stamp on your VCR even if it used only two of the four chips.

...but not worth it because it's not the standard.

Actually, they were quite worth it. It was the VHS Hi-Fis that were the rip-off.

Yes, I fought in the clone wars and the format wars...

*randomly*?!?

Yes. It does it in the 486, but doesn't do it in the Celeron test rig. Which kind of brings me back to my point: it's the "architectural and software history" that I had the problem with... I also had a PCI "Lightspeed 128" video card that appeared to be busted on my first motherboard (plain blue screen). Exchanged it for another, same result...piddled around and forgot to take it back within the 30-day limit. Around a year later, I had migrated to a new AMD machine and it worked fine. This was normality.

Ah, but that's much more modern. So modern that it only has to support ONE standard to be compatible with pretty much any Windows game.

The point being that I was explicitly promised support for DirectX, including D3D, and still didn't get proper support (they finally released a set of beta drivers that had partial D3D support, complete with a ReadMe explaining why I shouldn't expect too much from it). That was the day I swore I would never get another NVidia video card...

Then 3dfx got stupid and decided that 32Bit color was frivolous...

...thus causing me to head back to NVidia. At least they seemed to have learned their lessons...

Back in the day, the only people that should've been messing with DOS were geeks

Well "back in the day", it was DOS6 + Win31. Avoiding DOS was an illusion and Win31 was simply a "frontend" for the OS. I was one of the first in the general area to get Win95. All the reports I was hearing from Win3.1 users was how they hated it and would never use it...that's when I knew it had to be pretty good.

GUIs and 'standards' are for normal people

Well considering the "PC" was the last of the computers at the time to have a standardized GUI OS... (yes, the Mac, Atari ST, and Amiga had them first).

Besides, if we all stuck to this logic, ALL of us would be using Intel processors and AMD would be dead.

And true geeks work out how to hack their soundcards' settings to work with a DOS game or 2, rather than just 'expecting it to work'.

I think you may have misunderstood something. I didn't just type the name of the game executable and expect sound to come out. I configured everything properly: matched the IRQs (no easy job, as just about all of them were already in use), the DMA, etc...

The problem was a bug in the card's firmware. It was their first PnP card, and they (like everyone else) were learning why PnP was nicknamed "Plug and Pray". With hindsight, that's one thing I would've done differently: I would've gotten the MAX version of the GUS instead. It wasn't as advanced as the PnP, but it was cheaper and already had a good track record of service.

I think 'emulating' is the word. Was it hardware emulation, or software?

Software, and the SB quality was the same as any other generic sound card out at the time. Again, you're fixating on the sound card...I'd like to see you and HunterZ duke it out...*heh*

Just tell him, "Ensoniq sucks, Creative rules!".

Reply 23 of 30, by jez

Rank: Member

Then 3dfx got stupid and decided that 32Bit color was frivolous...

32-bit color *is* frivolous. If you're using 32-bit, that card is very likely ignoring 8 bits. 24 bits are all that's needed for 16,777,216 colors (the standard for 'true color' at the moment).

== Jez ==

Reply 24 of 30, by oneirotekt

Rank: Member

jez wrote:

32-bit color *is* frivolous. If you're using 32-bit, that card is very likely ignoring 8 bits. 24 bits are all that's needed for 16,777,216 colors (the standard for 'true color' at the moment).

32-bit vs. 24-bit is largely a technical distinction; 3D cards incur very little extra overhead for passing along the extra 8 bits of alpha info. The important distinction is that 8 bits are used for each of the R, G, and B channels, versus 5, 6, and 5 respectively for 16-bit.
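To make the packing concrete, here's a toy C sketch (my own illustration, not any card's actual code) of why 5-6-5 loses information while 8-8-8-8 doesn't:

#include <stdint.h>
#include <stdio.h>

/* 16-bit 5-6-5: the low bits of each 8-bit channel are thrown away. */
static uint16_t pack565(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}

/* 32-bit 8-8-8-8: nothing is lost; the fourth byte carries alpha. */
static uint32_t pack8888(uint8_t r, uint8_t g, uint8_t b, uint8_t a)
{
    return ((uint32_t)a << 24) | ((uint32_t)r << 16) |
           ((uint32_t)g << 8)  |  (uint32_t)b;
}

int main(void)
{
    /* Two reds that differ only in their low 3 bits... */
    printf("565:  %04x %04x\n", pack565(100, 50, 25), pack565(103, 50, 25));
    /* ...collapse to the same 16-bit pixel (the source of banding),
       while the 32-bit packing keeps them distinct. */
    printf("8888: %08x %08x\n", pack8888(100, 50, 25, 255),
                                pack8888(103, 50, 25, 255));
    return 0;
}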

16-bit color is noticeably inferior in quality to 32-bit for realtime 3D scenes; nasty dithering occurs in almost every case. 3dfx was foolish to write that off. They finally did acknowledge the problem with their "22-bit" technique, but that was nothing more than a funky dithering algorithm for a 16-bit image. The "22-bit" distinction was pure marketing hogwash.
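(If anyone's curious what a dithering algorithm for this actually does, here's the textbook 2x2 ordered-dither version in C -- emphatically my own toy example, NOT 3dfx's actual filter: add a position-dependent bias before truncating an 8-bit channel to 5 bits, so the truncation error averages out across neighboring pixels instead of forming bands.

#include <stdint.h>
#include <stdio.h>

/* 2x2 Bayer matrix scaled to the 3 bits being discarded (step of 8). */
static const uint8_t bias2x2[2][2] = { { 0, 4 }, { 6, 2 } };

static uint8_t to5bit_dithered(uint8_t v, int x, int y)
{
    int biased = v + bias2x2[y & 1][x & 1];
    if (biased > 255) biased = 255;
    return (uint8_t)(biased >> 3);   /* keep the top 5 bits */
}

int main(void)
{
    /* A flat 8-bit value of 100 is exactly halfway between 5-bit
       levels 12 and 13; dithering makes it alternate so the eye
       averages the two instead of seeing a hard band edge. */
    for (int y = 0; y < 2; y++) {
        for (int x = 0; x < 2; x++)
            printf("%d ", to5bit_dithered(100, x, y));
        printf("\n");                /* prints: 12 13 / 13 12 */
    }
    return 0;
}
)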

Soon even 8 bits per color channel might not be enough, as 3D scenes with lots (> 8) of rendering passes begin to show banding / dithering artifacts... which is why John Carmack is talking about using floating point color for Doom3.
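And here's a toy C model of why the pass count matters (the numbers and the per-pass rounding policy are my own illustration): with an 8-bit framebuffer, a contribution dimmer than half a color step rounds away to nothing on every pass, while a floating-point buffer keeps it.

#include <stdio.h>

int main(void)
{
    const int passes = 300;
    const float contribution = 0.001f;  /* very dim per-pass light */

    unsigned char fb8 = 0;  /* 8 bits per channel: rounds every pass */
    float fbf = 0.0f;       /* float color: rounds only at the end  */

    for (int i = 0; i < passes; i++) {
        fb8 = (unsigned char)(fb8 + contribution * 255.0f + 0.5f);
        fbf += contribution;
    }
    printf("8-bit: %u   float: %u (out of 255)\n",
           fb8, (unsigned)(fbf * 255.0f + 0.5f));
    /* prints: 8-bit: 0   float: 77 -- the dim light vanished
       completely in the 8-bit buffer */
    return 0;
}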

Reply 25 of 30, by Nicht Sehr Gut

Rank: l33t

Originally posted by jez: 32-bit color *is* frivolous.

No, it's not...and even if that were true, 3dfx was sticking to 16-bit video, refusing to even consider implementing 24-bit. That's what killed them. It's one thing to say that you will eventually implement 24-bit video but will concentrate on 16-bit for speed...but 3dfx stubbornly ignored 24-bit/32-bit.

With NVidia you could choose; with 3dfx you were stuck with 16-bit. By the time they released cards with 32-bit support, they were already in serious trouble.

Reply 27 of 30, by HunterZ

Rank: l33t++

Nicht Sehr Gut wrote:

Software, and the SB quality was the same as any other generic sound card out at the time. Again, you're fixating on the sound card...I'd like to see you and HunterZ duke it out...*heh*

Just tell him, "Ensoniq sucks, Creative rules!".

Heh, they both suck. Ensoniq developed first-generation PCI soundcard technology and sold it to the monkeys at Creative before it was refined enough, and Creative made flaky drivers and stupid frilly applications for the cards instead of improving the technology. To be fair, I hear the Audigy 2 is good, but that's probably just compared to everyone else (because everyone else who makes good stuff gets bought out by Creative). 😜

I also hear that the GUS was a nifty card, although mostly from religious fanatics resembling some certain Amiga nuts I happen to know... 😉

Reply 30 of 30, by Nicht Sehr Gut

Rank: l33t

Originally posted by HunterZ: Heh, they both suck.

Ooo. The bitterness runs deep. *heh*

Ensoniq developed first-generation PCI soundcard technology and sold it to the monkeys at Creative

If only they had sold it to the humans there instead...ah, what might have been.

...that's probably just compared to everyone else (because everyone else who makes good stuff gets bought out by Creative).

Not true. There are others; you just don't hear much about them. If you have a boatload of cash, there are some audiophile cards available.

I also hear that the GUS was a nifty card, although mostly from religious fanatics resembling some certain Amiga nuts I happen to know...

You clone drone *heh*. PC people were getting GUS cards because they were tired of 8-bit SoundBlasters that couldn't play more than two digital sounds at once. If it weren't for competition from the GUS (and cards like it), Creative would've almost certainly rested on their laurels for a good, long time.