VOGONS

First post, by Calaver


When I run DOSBox in XP in 24-Bit graphics mode it gives me an error:

"Exit to error: Failed to create a rendering output"

However, when I run in 16-bit mode it works fine. Is there a fix to this, or should I just run it in 16-bit mode from now on?

Reply 2 of 10, by eL_PuSHeR

Rank l33t++

What video hardware do you have?

16bpp should be enough, and it's faster. For true colour you would need a 32bit-capable video card. Try updating your video drivers, but if it's already working in 16bpp I wouldn't bother.
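To illustrate why 16bpp is the lighter format: a 16-bit pixel packs all three channels into one word, commonly in RGB565 layout. This is a minimal sketch; the exact bit layout is an assumption and depends on the driver and SDL surface in use.

```python
# Sketch: packing an 8-bit-per-channel colour into a 16bpp pixel.
# RGB565 layout (5 bits red, 6 green, 5 blue) is assumed here;
# the real layout depends on the video driver / SDL surface.
def pack_rgb565(r: int, g: int, b: int) -> int:
    """Truncate 8-bit channels to 5/6/5 bits and pack into one 16-bit word."""
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

# Pure white sets every bit in every field.
assert pack_rgb565(255, 255, 255) == 0xFFFF
```

Half the bytes per pixel of a 32bpp surface means half the memory traffic per frame, which is where the speed difference on slow machines comes from.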

Intel i7 5960X
Gigabyte GA-X99-Gaming 5
8 GB DDR4 (2100)
8 GB GeForce GTX 1070 G1 Gaming (Gigabyte)

Reply 3 of 10, by Calaver

Rank Newbie

Good point... my laptop only does 24-bit max, so that might not be compatible.

If I can't run it on my laptop, that's no big deal, because I have a much more powerful desktop (2GHz as opposed to 400MHz). I was just experimenting with it for the time being anyway, until I get back home to try it out more in-depth.

Reply 4 of 10, by `Moe`

Rank Oldbie

It depends largely on the video drivers, what output mode the SDL library uses, and so on. For a 400MHz box, definitely use 16bit; everything else wastes your precious CPU. 24-bit isn't worth it (and even if DOSBox worked on your machine in 24bit, it would be a lot slower than 16bit; 24bit is even slower than 32bit).
I had a 300MHz laptop some time ago and used DOSBox on it with quite some success. But honestly, the visual difference between 16 and 24 bit on such a machine isn't worth it. DOS games don't even use 24bit colour (original VGA hardware shows at most 256 colours out of an 18-bit colourspace), so you don't really lose anything.
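The VGA numbers above work out as follows: the original VGA DAC uses 6 bits per channel, and a 256-entry palette selects which of those colours are on screen at once. A quick sketch of the arithmetic:

```python
# Sketch: the original VGA DAC uses 6 bits per colour channel,
# giving an 18-bit colourspace, from which a palette picks 256
# simultaneous colours (e.g. in mode 13h).
BITS_PER_CHANNEL = 6
colourspace = 2 ** (3 * BITS_PER_CHANNEL)  # total distinct colours
palette_size = 256                         # simultaneous on-screen colours

assert colourspace == 262144               # 2**18
```

So even a 16bpp (65536-colour) display can show every colour any single VGA palette can use at once.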

Reply 5 of 10, by Guest


The only programs I'm interested in running are actually FastTracker and Impulse/Scream Tracker. A friend of mine also runs demos like Second Reality just fine on his 2GHz Athlon regardless of colour depth.

As far as I know, 24-bit is just a compressed buffer for 32-bit, or vice-versa...I forget which is which. But I'm fine with running my laptop exclusively in 16-bit mode, especially with Debian/KDE. It's a lot smoother than, say, Windows 98 or XP.

Funny how I'm so interested in taking a step backward with DOS.

Reply 7 of 10, by eL_PuSHeR

Rank l33t++

If I am not mistaken, 32bpp is 24bpp plus an alpha channel (which controls transparency).
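That interpretation can be sketched as packing 24 bits of RGB plus an 8-bit alpha into one 32-bit word. The ARGB byte order below is an assumption for illustration; real pixel formats vary by platform.

```python
# Sketch: 32bpp as 24bpp RGB plus an 8-bit alpha channel.
# ARGB ordering (alpha in the top byte) is assumed here;
# actual byte order differs between platforms and APIs.
def pack_argb(a: int, r: int, g: int, b: int) -> int:
    """Pack four 8-bit channels into one 32-bit pixel."""
    return (a << 24) | (r << 16) | (g << 8) | b

opaque_red = pack_argb(0xFF, 0xFF, 0x00, 0x00)
assert opaque_red == 0xFFFF0000
```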

EDIT: Corrected the 'alfa' typo. I wrote it in Spanish. 😁

Last edited by eL_PuSHeR on 2005-03-26, 15:19. Edited 2 times in total.

Reply 9 of 10, by `Moe`

Rank Oldbie

In the context of most frame buffers, 32bpp is 24bpp plus padding. Using the extra 8 bits as an alpha channel is something done for, say, PNG images or OpenGL textures, but (usually) not for application windows. The extra 8 bits go unused most of the time, but they make access faster, as the CPU can handle 4-byte (32-bit) chunks a lot faster than 3-byte (24-bit) chunks.
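The addressing difference is easy to see in the pixel-offset arithmetic. This is a simplified sketch (it ignores row stride and caching): with 4 bytes per pixel the byte offset is a plain shift and every pixel is word-aligned, while 3 bytes per pixel needs a multiply and regularly straddles 32-bit boundaries.

```python
# Sketch: byte offset of pixel x within a scanline.
def offset_32bpp(x: int) -> int:
    return x << 2   # x * 4: every pixel starts on a 4-byte boundary

def offset_24bpp(x: int) -> int:
    return x * 3    # frequently misaligned for 32-bit loads/stores

# Only one in four 24bpp pixels is 4-byte aligned.
aligned_24 = sum(1 for x in range(100) if offset_24bpp(x) % 4 == 0)
assert aligned_24 == 25
```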

Reply 10 of 10, by Calaver

Rank Newbie

That makes a whole lot of sense. Running it in 32-bit mode on my desktop, there are no problems with the OpenGL driver output. I can still run it on my laptop, just not as often as on my desktop, so I won't mind switching from 24 to 16 when needed. I consider the matter closed for myself, but I guess it can stay open if anyone else has a related question.

Thanks guys.