Reply 127 of 386, by thedoctor45
looking good so far in Windows 95:
I'll install DirectX real quick to give Croc a try.
awesome work kekko & gulikoza 😉
If I set the CPU core to anything but "normal", DOSBox crashes with a segmentation fault when trying to launch Tomb Raider; at the normal setting it works, but it's unplayably slow.
Also, the 16-bit colour bug gives me a headache again.
Neither the bug nor the crash happens with the OpenGlide wrapper build.
I think you can change the colour order in voodoo_data.h:
#ifdef LSB_FIRST
Bit8u b, g, r, a;
First of all, thanks to everybody for their effort in making this possible, and special thanks to thedoctor45 for helping me with some compiling problems. But I have a really strange problem right now, I'm using MinGW + MSYS to compile everything:
c:/dosbox/dosbox/dosbox_org/src/dosbox.cpp:449: undefined reference to `PCI_Init(Section*)'
c:/dosbox/dosbox/dosbox_org/src/dosbox.cpp:450: undefined reference to `VOODOO_Init(Section*)'
hardware/libhardware.a(memory.o): In function `Z18MEM_GetPageHandlerj':
c:/dosbox/dosbox/dosbox_org/src/hardware/memory.cpp:147: undefined reference to `voodoo_pagehandler'
ints/libints.a(bios.o): In function `Z13INT1A_Handlerv':
c:/dosbox/dosbox/dosbox_org/src/ints/../../include/callback.h:47: undefined reference to `pci_callback'
collect2: ld returned 1 exit status
I don't know what to do about it. Thanks again to everybody.
I used gulikoza's patch and removed all the `inline` keywords from the voodoo files, and it works (it has a memory leak, btw).
wrote: it has a memory leak btw
Where?
In gulikoza's patch.
When I run any 3D game, DOSBox's memory usage rises by about 200 kB/s, even in Tomb Raider 1 when I'm just standing still, looking at a wall.
It doesn't happen without gulikoza's patch.
Well, that's what I said in the description of the patch 😀
The code around there needs to be cleaned up; I just did it the quickest way I could to get it working, because I wanted to see it work with gcc 😀
SET SST_TMUMEM_SIZE=2
fixes carmageddon
I found this in voodoo2 drivers in V2-auto.inf file:
;-----------------------------------------------------------------------
; INF file for Voodoo2 based 3D Accelerators running under Windows 95/98
; (c) 1998-2000 - 3dfx Interactive, Inc.
;
; Updates autoexec.bat for backwards compatiblities with older games
;
;-----------------------------------------------------------------------

[version]
;Class=MEDIA
signature="$CHICAGO$"

; Install sections
;----------------------------------------
[DefaultInstall]
UpdateAutoBat=Voodoo2.autobat

[Voodoo2.autobat]
CmdAdd=Rem,"Added for Voodoo2"
CmdAdd=set,"SST_GRXCLK=90"
CmdAdd=set,"SST_FT_CLK_DEL=0x4"
CmdAdd=set,"SST_TF0_CLK_DEL=0x6"
CmdAdd=set,"SST_TF1_CLK_DEL=0x6"
CmdAdd=set,"SST_VIN_CLKDEL=0x1"
CmdAdd=set,"SST_VOUT_CLKDEL=0x0"
CmdAdd=set,"SST_TMUMEM_SIZE=2"
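For reference, running that [DefaultInstall] section would effectively append lines like these to AUTOEXEC.BAT (reconstructed from the CmdAdd entries above, not copied from an actual installed system):

```
Rem Added for Voodoo2
set SST_GRXCLK=90
set SST_FT_CLK_DEL=0x4
set SST_TF0_CLK_DEL=0x6
set SST_TF1_CLK_DEL=0x6
set SST_VIN_CLKDEL=0x1
set SST_VOUT_CLKDEL=0x0
set SST_TMUMEM_SIZE=2
```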
btw, I have noticed that dxdiag detects 8 MB, but 3DMark99 Max detects only 4 MB.
btw2, how is it possible that your emulator has 8 MB of RAM? The wiki says there were only Voodoo1 cards with 4 MB or 6 MB.
Impressive work mate, keep it up - a super-solid Voodoo 1 implementation will be appreciated by many.
I was thinking again about threading the renderer to increase performance, but threading makes the whole thing way too complex.
Perhaps using OpenGL for the triangle and fastfill commands would be easier, fast enough, and portable, without the need for threading either.
The current software renderer is still needed for many aspects of an accurate and working emulation and would be left in place; the user could also switch back to the software renderer on demand.
I'm not an expert in OpenGL or graphics libraries in general, but from what I've read, off-screen rendering may fit our needs.
By rendering to the voodoo back-buffer, we should be able to integrate it with the rest of the emulation; moreover, fullscreen switching and video capture should not break.
Please let me know your comments on this; if anyone has suggestions that might help, please post them here.
As a rule of thumb, anything not done with polygons (e.g. rendered by the CPU to some framebuffer) is slow in any 3D engine. First you lose all the advantages of hardware 3D rendering (the CPU does all the work), plus you add all the overhead (which is not to be underestimated... directly writing bitmaps is horribly slow compared to polygon processing). Glide was a nice exception since it allowed framebuffer locking; Direct3D and OpenGL do not. You can see how OpenGL is one of the slowest outputs in DOSBox unless special extensions can be used to speed the whole thing up 😀
You know the code and codepaths best... perhaps you could estimate how much polygon processing would really be offloaded to the GPU, and how much would still have to be done by the CPU...
From what I've seen, most of the time consumed by the emulation is spent by the scanline rasterizer.
The direct lfb access should not be a big issue; of the few games that actually use it, many seem to write to the lfb only after the 3D is done, for HUD info and such things.
Unlike frame-buffer access through D3D or OpenGL, the emulation just reads/writes a memory area directly; no locking or other strange things happen, so there isn't much overhead.
The scanline rasterizer is slow not just because it is software, but also because it was not written with performance in mind; it is oriented towards accuracy and code readability (it's part of a hardware documentation project).
It could be greatly optimized. One of the first things you notice is that many of the checks it does during pixel-pipeline processing could be moved out of the scanline loop, but that would mean having tens of different scanline renderers, from a basic solid-colour one up to a shaded, filtered, textured, dithered, alpha-blended, fogged, z-clipped one, plus all the combinations in between. Many things could be rewritten or assembly-optimized.
The idea is to use OpenGL just for triangles, make it render off-screen to our frame buffer instead of to the screen as usual, and leave lfb handling as-is.
I wasn't talking specifically about (voodoo) lfb access, but more generally about what you said that only some functions could be offloaded to the GPU. The rest would need to be rendered by the CPU and sent to OpenGL as finished bitmaps so that the final frame could be composited...
Yes, you've been clear 😀 I think that just the triangle (and maybe fastfill) commands could be offloaded to the GPU.
About direct access: I was hoping to avoid writing to a bitmap for 2D. The render-to-buffer technique would in theory let us still write triangles to the voodoo back-buffer, just as the software renderer does, while leaving lfb writes as-is; but I was asking for comments on this because I'm not fully aware of how it works.
I have a question. Forgive me if it's redundant with something mentioned in the numerous pages here..
I assume that the goal here is to create a higher-compatibility, partially accelerated layer for Glide - i.e., you're implementing more than most Glide wrappers do as far as hardware emulation goes, while also accelerating what you can using the CPU/GPU of the host - correct?
This is, of course, as opposed to a pure software emulation in which none of the processing is outsourced beyond DOSBox to the host, which would be a bit less efficient.
As I said, I assume you're doing the former, but just wanted to ask.
Hmm, Kekko, I have one question. Can you increase the memory limit in your DOSBox SVN up to 128 MB?
I've launched Ultima IX. The game works, but very slowly. Maybe my processor is too slow (I have an Athlon64 2GHz)... I want to know how fast this DOSBox SVN runs on other testers' machines.
wrote:Hmm, Kekko, and I have one question. Can you increase the memory limit in your DOSBOX SVN up to 128 MB?
I've launched Ultima IX. Game works, but very slow. Maybe I have too slow processor (I have Athlon64 2GHz)...I want to know, how fast run this DOSBOX SVN on machines other testers?
Good idea, I remember when I played Thief: The Dark Project and Deus Ex on a 64mb machine with a Voodoo 2 card and using Windows 98, and the swapfile had to work like mad, which slowed down things a lot.
On your average 4gig system the swap file of the pmode extender surely is kept in memory.
wrote:I've launched Ultima IX. Game works, but very slow. Maybe I have too slow processor
It's Ultima IX. What are you expecting, stable and proper performance? Today's computers still struggle with it. 😀