VOGONS


First post, by infiniteclouds

Rank: Oldbie

Which programming language(s) were used to write the majority of games from the early 80s to the early 90s, on PCs, Commodores, Amigas, etc.? I'm particularly curious because we are beginning to see some folks write new games for old hardware -- what languages are most useful for this?

Reply 1 of 18, by DracoNihil

Rank: Oldbie

Early 80s? It's all done in the raw assembly language of the processor architecture itself, though the titles written in some dialect of BASIC are few and far between.

The 90s was around the time C started becoming widespread, but a lot of it still has inline assembly for critical tasks.

“I am the dragon without a name…”
― Κυνικός Δράκων

Reply 3 of 18, by SpeedySPCFan

Rank: Member

What DracoNihil said. Assembly was the most popular, with C growing in popularity in the 90s, but ASM was still used a lot for efficiency's sake.

Musician & music gear/game reviewer.

MIDI hardware: JD-990, SC-55, SC-880, SD-90, VL70-m, Motif ES, Trinity, TS-10, Proteus 2000, XK-6, E6400U

Reply 4 of 18, by vladstamate

Rank: Oldbie

A good book related to the subject: Racing the Beam.

YouTube channel: https://www.youtube.com/channel/UC7HbC_nq8t1S9l7qGYL0mTA
Collection: http://www.digiloguemuseum.com/index.html
Emulator: https://sites.google.com/site/capex86/
Raytracer: https://sites.google.com/site/opaqueraytracer/

Reply 5 of 18, by vvbee

Rank: Oldbie

The languages used back then aren't necessarily the most useful ones to use for that hardware now. E.g. some C++ compilers that can excrete DOS binaries have partial support for up to C++11, which, depending on the goals, may or may not make the work easier.

Inline assembly, sure, but that assumes you've exhausted algorithmic optimizations, for which you'll first need to have rethought many of your modern intuitions about programming, and not just the choice of language. For something like a 486, and depending on the task, fairly modern pure C++ sans assembly may give decent enough performance.

Reply 6 of 18, by gca

Rank: Member

There were also some dedicated languages for creating adventure games back in the 8-bit era that might have survived into the PC age. The only titles I can remember are GAC (Graphic Adventure Creator), Quill and Adlan.

You also had a couple of dedicated game-dev libraries/languages like Sprites Alive and Pandora (which is horrific to my modern programming sensibilities).

In a pinch, I guess 3D Construction Kit could have been used to create your own Freescape games using its built-in scripting system.

Reply 7 of 18, by Scali

Rank: l33t
vvbee wrote:

Inline assembly, sure, but that assumes you've exhausted algorithmic optimizations, for which you'll first need to have rethought many of your modern intuitions about programming, and not just the choice of language. For something like a 486, and depending on the task, fairly modern pure C++ sans assembly may give decent enough performance.

A 486 and a semi-modern C/C++ compiler, yes. But the question was more about what they used back then.
Back in the days of early 8088 systems, however, the compilers didn't do a great job of optimizing, neither for speed nor for size. And you didn't have that much memory to begin with (early PCs had only 16-64k of memory).
So one reason to do everything in assembly was to have total control over the size and placement of code and data (remember, we are also dealing with 16-bit segmented memory here, making it all the more difficult for compilers to be efficient).

On the PC, Turbo Pascal was also reasonably popular, mainly for two reasons:
1) It had quite sophisticated support for inline assembly.
2) It had a very efficient system for compiling and linking directly from the editor. This made the turnaround from editing to compiling to running in the debugger much faster. These days we are accustomed to just being able to press the 'run in debugger' button and expect our program to start in a few seconds. Turbo Pascal was the first environment that made this somewhat possible. C generally had minutes of compile time on these old machines, every time you changed a single line of code, so this method of development was not very feasible.
See also here: http://prog21.dadgum.com/47.html

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 8 of 18, by collector

Rank: l33t

Some companies used their own scripting languages. LucasArts had SCUMM. Sierra had AGI, and later SCI, with tools and its compiler written in C.

The Sierra Help Pages -- New Sierra Game Installers -- Sierra Game Patches -- New Non-Sierra Game Installers

Reply 9 of 18, by vvbee

Rank: Oldbie
Scali wrote:

A 486 and a semi-modern C/C++ compiler, yes. But the question was more about what they used back then.

Sure. There were two questions - which languages were used, and which languages are most useful - so I think caution is warranted. That said, choosing an obsolete language, or an obsolete subset of one, even when it's not needed for the task - assuming you do have a choice - can still be useful for getting immersed in a period-correct coding mindset.

Reply 10 of 18, by infiniteclouds

Rank: Oldbie

If your goal isn't necessarily just to make new software, but also to improve upon early DOS games that might've been inferior to the Apple and Commodore versions, by adding EGA/TGA support and 3-voice/AdLib sound -- then you would want to be very familiar with the original languages, no?

Reply 11 of 18, by Scali

Rank: l33t
vvbee wrote:

Sure. There were two questions - which languages were used, and which languages are most useful - so I think caution is warranted.

I think it depends a lot on whether you are targeting 32-bit systems or not.
32-bit compilers were used for many years after DOS became obsolete, and as such, you can still find reasonably new 32-bit compilers that still have DOS support as some kind of vestigial feature.
16-bit compilers, however, stopped being developed in the early 90s, so most tools never evolved beyond what you would use in the era of the 486. The only 'modern' 16-bit compiler I know of is OpenWatcom.
So if you want to target a 286 or lower, you won't have much choice beyond the tools that people used back in the day.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 12 of 18, by vladstamate

Rank: Oldbie
Scali wrote:

16-bit compilers, however, stopped being developed in the early 90s, so most tools never evolved beyond what you would use in the era of the 486. The only 'modern' 16-bit compiler I know of is OpenWatcom.

That is true; however, I did a bit of research and it seems you can squeeze 16-bit code (although maybe not entirely out of the box) out of both gcc and clang/LLVM. This will not get you DOS binaries, however.

I work with LLVM a lot in my job. I wonder if adding a target for DOS executables (16-bit) to LLVM/clang would be useful for anyone?

YouTube channel: https://www.youtube.com/channel/UC7HbC_nq8t1S9l7qGYL0mTA
Collection: http://www.digiloguemuseum.com/index.html
Emulator: https://sites.google.com/site/capex86/
Raytracer: https://sites.google.com/site/opaqueraytracer/

Reply 13 of 18, by BeginnerGuy

Rank: Oldbie

Not to ignore the previous convo, just throwing in my 2 bits from my own experiences.

BASIC was king of the 8-bit systems in terms of ease of use; millions of simple games were made with it, and that includes "professional"-level games you bought on disk from the store. 6502 assembly was used for highly optimized games, typically anything beyond your basic Pong-level game that demanded higher performance. Generally 6502 was needed to do anything impressive on the 8-bit systems at a reasonable frame rate, hence why most of the "professional" games we smile upon 30+ years later are written in pure assembly.

MS-DOS was a battleground of programming languages, but Pascal and then later C are the big standouts. Compilers in those times weren't optimized to the level they are today (as mentioned above). For example, if you used C to move memory to your VGA ("plot pixels") with MS C/C++ 7.0, it would be moved in byte-sized (8-bit) writes. Programmers generally would write 286 assembly modules for this, which allowed them to do word-sized (16-bit) writes to VGA, resulting in massive performance gains. There are plenty of examples of that; typically, when you needed to move memory as quickly as possible, you used 286 assembly.

These assembly modules would be assembled into object files and later linked into the executable. Later, compilers allowed "inline assembly" to be used (the -G2 flag for 286 instructions in Microsoft compilers), which would let you throw assembly straight into your C code with FAR less hassle. That was a GLORIOUS thing!
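For illustration, here is a minimal sketch of that kind of routine, assuming mode 13h (320x200, 256 colours) and Microsoft-style 16-bit inline assembly (Borland's asm blocks look much the same); the function name and constants are mine, not from any real game:

/* Fill the mode 13h screen (64000 bytes at segment A000h) with one
   colour, using word-sized writes via REP STOSW. Sketch only. */
void fill_screen(unsigned char color)
{
    unsigned int pair = color | (color << 8);  /* two pixels per 16-bit word */

    _asm {
        mov ax, 0xA000     /* VGA segment in mode 13h */
        mov es, ax
        xor di, di         /* start at offset 0 */
        mov ax, pair
        mov cx, 32000      /* 64000 bytes = 32000 words */
        cld
        rep stosw          /* word-sized stores, 2 pixels per write */
    }
}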

----------------------------------------------

Today, if you're looking to play around with the old systems, I'd suggest learning BASIC until you can push it to its limits, then switch to 6502. For MS-DOS machines, go with straight C and later dig into some 80x86 assembly for optimizations (I do love QuickBASIC though; nice little games can be made with it). There is a minor learning curve of irrelevancies with the dated compilers, but it should only take a little bit of your time.

As for newer compilers making 16-bit binaries... dunno... I always used inline assembly, even with Win98 (my brain is dated, however; I decided to go back to college for accounting and left my spirited roots behind (until now)). I think it's still a good thing to learn some assembler and the 80x86 microprocessor in general, if anything for a better understanding of what your C or C++ code will do, and for debugging. I always find that people who are only HLL (high-level language) oriented have a difficult time forming solutions to simple problems. So dig into those books anyway 😎

infiniteclouds wrote:

If your goal isn't necessarily just to make new software, but also to improve upon early DOS games that might've been inferior to the Apple and Commodore versions, by adding EGA/TGA support and 3-voice/AdLib sound -- then you would want to be very familiar with the original languages, no?

If you don't understand the original languages, you'll have a very difficult time understanding the source code, one would imagine.

There are two ways of "improving" early DOS games that I can see:
1) a full rewrite in a lower-level language (which may provide no benefit at all if you use the same algorithms);
2) rewriting functions for higher performance (i.e. blitting functions, line drawing, collision, clipping, etc.).

If you don't have the original source code.. then it's up to your imagination to emulate the game.

As for AdLib support, that can be an entire technical conversation in and of itself. You'll either have to learn how to program the AdLib, or include somebody else's sound library in your own project. Both of those are going to require you to understand the codebase you're working in.
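Just to give a taste of what "learning how to program the AdLib" means, here is a rough sketch of the basic OPL2 register write (index port 388h, data port 389h, with the customary delay reads between writes); the function name and the exact loop counts are illustrative, not from this thread:

#include <conio.h>   /* inp()/outp() in the old Microsoft/Borland headers */

#define ADLIB_INDEX 0x388
#define ADLIB_DATA  0x389

/* Write one value to one OPL2 register; the status-port reads burn the
   time the chip needs between accesses. Sketch only. */
void adlib_write(unsigned char reg, unsigned char val)
{
    int i;
    outp(ADLIB_INDEX, reg);          /* select the register */
    for (i = 0; i < 6; i++)          /* short delay (roughly 3.3 us) */
        inp(ADLIB_INDEX);
    outp(ADLIB_DATA, val);           /* write the value */
    for (i = 0; i < 35; i++)         /* longer delay (roughly 23 us) */
        inp(ADLIB_INDEX);
}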

Last edited by BeginnerGuy on 2017-12-27, 14:37. Edited 2 times in total.

Sup. I like computers. Are you a computer?

Reply 14 of 18, by Scali

Rank: l33t
vladstamate wrote:

I work with LLVM a lot in my job. I wonder if adding a target for DOS executables (16-bit) to LLVM/clang would be useful for anyone?

Reenigne is working on a 16-bit DOS version of gcc: https://blogs.mentor.com/embedded/blog/2017/0 … -lite-for-ia16/

The biggest problem here is the segmented memory model: if you just adapt a compiler designed for flat memory models, then the best you can do is use 'huge' pointers (gcc ia16 is currently limited to 64k of memory at best, so it is a 'flat memory model' in that it only uses one segment).
Optimizing for 'far' and 'near' pointers, as well as being able to place code, stack etc. in the most efficient segments, is a problem very specific to 16-bit x86. So it is going to take a lot more than just a few tweaks to support 16-bit instructions and a DOS-compatible linker.
Otherwise you'll end up with a compiler that's probably far less efficient than the old compilers from Watcom, Microsoft and Borland, which do understand segments, and allow you to control them with all sorts of directives.
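For readers who never used those old compilers, here is a tiny sketch of what the segment-aware pointer types look like in Borland/Microsoft/Watcom-style 16-bit C; the 0xA000 video segment is just a familiar example, not something from the post:

/* 'far' pointers carry an explicit segment:offset pair; 'near' pointers
   are plain 16-bit offsets into the default data segment. Sketch only. */
unsigned char far  *screen = (unsigned char far *)0xA0000000L; /* segment A000h, offset 0 */
unsigned char near *scratch;                                   /* offset into DGROUP only */

void clear_screen(void)
{
    unsigned int i;
    for (i = 0; i < 64000U; i++)   /* whole 320x200 mode 13h frame */
        screen[i] = 0;
}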

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 15 of 18, by clueless1

Rank: l33t

Matt Barton (Matt Chat) has done a number of interviews with programmers and developers of our favorite MS-DOS games. Many times they talk about the programming limitations and how they worked around them. Good resources from the horses' mouths.
https://www.youtube.com/user/blacklily8

Another good resource is David Schroeder's blog:
https://www.gamasutra.com/blogs/DavidHSchroed … arn_at_Yale.php
He's the author of Dino Eggs and Crisis Mountain for the Apple II.

The more I learn, the more I realize how much I don't know.
OPL3 FM vs. Roland MT-32 vs. General MIDI DOS Game Comparison
Let's benchmark our systems with cache disabled
DOS PCI Graphics Card Benchmarks

Reply 16 of 18, by Joey_sw

Rank: Oldbie
Scali wrote:

It had a very efficient system for compiling and linking directly from the editor. This made the turnaround from editing to compiling to running in the debugger much faster. These days we are accustomed to just being able to press the 'run in debugger' button and expect our program to start in a few seconds. Turbo Pascal was the first environment that made this somewhat possible. C generally had minutes of compile time on these old machines, every time you changed a single line of code, so this method of development was not very feasible.
See also here: http://prog21.dadgum.com/47.html

The quoted article has an interesting line: "zero compilation speed eventually became standard, with the rise of interpreted languages like Perl, Ruby, and Python."

The 'interpreted languages' phrase reminds me of QBasic, which also had a decent IDE that was better for small projects, since you could manage 'Subs' modularly with the F2 key.
I was hoping that kind of feature for managing procedures and functions would be available when Pascal for Windows, aka 'Delphi', finally arrived, but for some reason I never learned, it was never there.
We got separate unit file views instead.

-fffuuu

Reply 17 of 18, by Falcosoft

Rank: Oldbie

The only 'modern' 16-bit compiler I know of is OpenWatcom.

Maybe Free Pascal can be mentioned as a 'modern' cross-compiler for the real-mode DOS target.
http://wiki.freepascal.org/DOS
ftp://ftp.freepascal.org/pub/fpc/snapshot/trunk/i8086-msdos/
It is still not perfect, but it supports many memory models (Tiny, Small, Medium, Compact, Large, Huge).

Website, Facebook, Youtube
Falcosoft Soundfont Midi Player + Munt VSTi + BassMidi VSTi
VST Midi Driver Midi Mapper

Reply 18 of 18, by Kerr Avon

Rank: Oldbie

On the 8-bits (ZX Spectrum, Commodore 64, BBC Model B, etc.), it was mainly machine code (mostly written in assembly language), though BASIC was usual for the type-in programs that early computer magazines often listed. And commercial software was sometimes written in BASIC, and either distributed as such or as a compiled version of the BASIC, to speed it up. But most 8-bit games, including almost all of the good ones, had to be written in machine code for speed reasons; plus BASIC often wouldn't have the ability to perform specific machine-code-only functions, such as waiting for the screen refresh to pass before drawing to the screen (to avoid tearing or flickering), or interrupt-driven sound.

Also, well-written machine code was more efficient, space-wise, than BASIC, which was a real concern when the machine you wrote the game for only had 16kb or so free (even the Commodore 64, with its 64kb of RAM, only had about 37kb free for programs, if I remember correctly).