I'm trying to use the DDraw wrapper with Humvee Assault (2003), but the game doesn't recognize dgVoodoo. The config is OK and the ddraw DLL is in the root folder.
The game engine is DirectX 8.1 and it's perfectly supported on new systems (widescreen, full HD, AA, etc.), but all videos, menus and loading screens are rendered in horrible 640x480 DirectDraw (I guess); they look awfully scaled.
I'm not sure which executable is the main one, though. There is a "launcher" in the root folder and you must run it in order to play, but in the bin folder there's another exe that opens a command line. That one is supposed to be the main executable, but that behaviour is strange.
Yes, a new "Direct3D 11 MS WARP (software)" output API in the list, without killing the GPU drivers.
However, it's a bit confusing. I still have my usual GPU adapters, but the WARP renderer cannot be tied to any of them, so the best would be to have an empty adapter list with an empty display output list when WARP is chosen. However, dgVoodoo still needs a valid adapter with valid outputs to handle some tasks needed here and there, like querying the current video mode for a given emulated DD device (which needs a valid adapter with a monitor attached; a DD or D3D8 device is basically bound to a given monitor). So, I cannot have empty lists, but that's not so good either, because the user may be misled into thinking that WARP utilizes the chosen adapter for the rendering itself. 🙁
According to the MS docs, starting with Win8 (if I got it right), an adapter called "Microsoft Basic Render Driver" is always present but it's unusable for dgVoodoo as it never has outputs.
Do resolutions appear in the setup app when 10.0 is selected? Because if they do, then D3D11 can be initialized and the problem is something else on the DLL side.
If it's DirectX 8.1, you need D3D8.dll, not ddraw.dll.
A question: why is the "All of them" entry named that way? (Only in D3D12 can multiple/different GPUs work together, if I'm not wrong.)
How about changing "All of them" to a "Default" adapter (maybe the one actually used by the Windows desktop), so you can bind it to WARP (and grey out the field while WARP is selected as the output API)?
Only the "gameplay" is rendered in Direct3D; everything else is DirectDraw.
It was a common practice back in the day; I'm not sure what the advantage of it was.
I don't want to use the Direct3D wrapper, since the game is already perfectly supported on new systems; there's no reason to pass it through a wrapper. All dgVoodoo options are already supported by the game.
Here it is, Dege. Some save game files for the SC: PT colored light issue. Just drop the folder/profile into the Save directory. http://www.megafileupload.com/nloP/JCH.rar
Spot2 and Spot5 are perfect examples of the issue. The rest are just other areas for you to check the fix in.
Hi! Just writing again to remind you about the problem with text boxes in Star Wars: Galactic Battlegrounds. And I found another problem with this game: since version 2.51, after starting a new game level I have 2 cursors on the screen - the game's one and the system's. If I press Esc to open the pause menu, then close it, the system's cursor disappears and then I can play normally. On v2.5 this bug doesn't happen.
De-M-oN wrote:
I just wish for a bit more stable 60 fps. It's swapping between 57 and 60, and the 3 fps swing feels a bit stronger than the number difference would suggest. But that's complaining at a high level - I'm happy. Still, maybe you have a tip to make it a more stable 60 fps? Using less MSAA didn't help, and the game apparently uses vsync/a frame limit too.
This turned out to be a bigger problem than first thought.
Dege wrote:
As for the FPS drop, what about a slightly smaller resolution? If it's still jumping between 57/60 fps, then the issue probably is not resolution related.
Definitely not.
I could work out where the drops come from. It's the effects like tire smoke, dust/sand clouds, and such.
You especially notice it when you drive in traffic and end up behind a water truck.
I mean this one:
The water it sprays behind it reduces the fps to 30-40 - the closer you are to the water cloud, the worse the fps gets.
Any chance to fix it?
I'd like to add that the problem is not only with the water truck alone, in case it sounded like that; it is just the worst case.
It's very irritating, especially with a steering wheel, when the game suddenly starts to lag while you drive around turns, just because you pass a water truck.
Even the water sprayed by vehicles ahead of you in rain races causes lower fps, and it's a bit annoying.
A fix for this would be nice.
It's not the game itself; it runs smoothly with Direct3D, and it runs smoothly even with zeckensack's wrapper.
teleguy wrote:
I downloaded the demo and checked with Afterburner. The menus are DirectX 8.
What do you want to accomplish with dgVoodoo? Currently it can't do anything about low resolutions.
I want to keep the 1920x1080 resolution all the time and get proper scaling.
Are you sure this software is accurate? I still think there is ddraw involved. I have seen it lots of times, and this game looks pretty much like that.
The Direct3D 8.1 wrapper doesn't work (in my test) anyway.
Reshade's DirectX 8 to 9 wrapper works, so it must be DirectX 8.
Edit: Actually dgVoodoo (D3D8.dll) works as well (at least sometimes), but the menu only takes up a small portion in the upper left corner of the screen.
A question: why is the "All of them" entry named that way? (Only in D3D12 can multiple/different GPUs work together, if I'm not wrong.)
How about changing "All of them" to a "Default" adapter (maybe the one actually used by the Windows desktop), so you can bind it to WARP (and grey out the field while WARP is selected as the output API)?
Because:
- For Glide, one chooses an adapter (and a connected output) on which the rendering goes
- For DirectX, one chooses which adapters to enable for taking part in the DDraw/D3D8 device enumeration
Even old DX can drive multiple GPUs. D3D9Ex and above supports limited video memory sharing between GPUs; even the Desktop Window Manager needs this feature when drawing the desktop in Aero mode (on multimonitor systems).
What DX12 adds to all of this is "just" efficient, page-level video memory sharing through mapping between GPUs, so the needless copying from one to another - or reusing a texture without a copy but with slow access due to system bus speed - is avoided.
I added 'Microsoft Basic Render Driver' as an adapter for the WARP output API, with a 'default' display output.
-------------
Ok, now Pandora Tomorrow:
Here is a debug screenshot of light polygons drawn with alpha blending and depth testing disabled.
Clearly there is a yellow zone (RGB: 21, 21, 19), a red zone (RGB: 21, 19, 19) and a white zone (RGB: 19, 19, 19).
The game simply puts out the color coming from the vertex shader (multiplied by a shadow map, so that even the light can be in shadow 😁), so
it's as simple as hell. The vertex shader used for the light polygons is:
Direct3DDeviceImpl8::CreateVertexShader
Vertex Shader Validator

    vs.1.0

    mov  r0, v0
    max  r0.x, v0.xxxx, c0.xxxx
    mul  r0.x, r0.xxxx, c86.xxxx
    add  r0.x, r0.xxxx, c86.wwww
    mul  r0.z, r0.zzzz, c86.zzzz
    mul  r0.y, r0.yyyy, c86.yyyy
    mul  r0.y, r0.yyyy, r0.xxxx
    mul  r0.z, r0.zzzz, r0.xxxx
    mov  r0.w, c1
    m4x4 oPos, r0, c2
    m4x4 r1, r0, c18
    mov  oT0, r1
    mov  oT0.z, c1
    dp4  oT2.x, r0, c22
    dp4  oT2.y, r0, c23
    dp4  oT2.z, r0, c24
    dp4  oT2.w, r0, c25

    mov  r1, c87        <----- this part is calculating the color
    mov  r2, v0.xxxx
    add  r2, c1, -r2
    mul  r3, r2, r1
    mov  oD0, r3        <----- color output is in r3
    mov  oD1, r3

Vertex declaration:

    stream:0, v0: FLOAT3
    stream:0, v1: FLOAT3
    stream:0, v3: FLOAT2

Generated VS 4.0:
It shows that the game calculates the light vertex color based on its incoming position-like data (which explains why the glitch can be viewpoint-dependent) and then multiplies it by a single constant (c87). From the debug log, c87 is set to:
So, red is the most dominant color (its factor is 0.003922). It seems the game tries to compensate for something coming from the position to get a white color, but it doesn't work accurately in all cases. The difference between the calculated red, green and blue components is minimal (max 1/256), but if they are drawn multiple times and alpha-blended onto each other (additive blending is used), then the error accumulates, like in this case, and causes that color glitch (21-19 is 2/256). With 16 bit rendering and dithering enabled this is much less visible, if visible at all, at the expense of color banding.
So, it's the game; it's not a bug.
But why doesn't the Komat wrapper have this problem, even with newer GPUs? (It has it, but I only saw it in 3 places.)
Why doesn't this issue occur when I play on the correct GPU?
EDIT: When I played this on Win XP on the correct GPU, I saw color banding pretty often. Is there a way for an end user to fix this?
Would a dithering option in dgVoodoo help this game/other games? What about 16 bit rendering?
PS: I'm not sure what I'm asking... the idea is to find a way to have this game run without these issues. 😀 Options or settings.
Probably the Komat wrapper works internally in 16 bit mode, emulating the old GPUs' computing units.
Last edited by lowenz on 2016-04-22, 09:54. Edited 1 time in total.
I don't think the problem is the 16 bit *COLOUR* output, but the 16 bit *processing* of the data (int16? fp16?).
So using WinXP in 16 bit *COLOUR* mode can't make a difference (it can only ruin the output).
Sorry, I thought you were talking about a *recent* try with WinXP and today's hardware to force 16 bit colours 😁 (in Win8/8.1/10 there's no 16 bit output any more).
I can see the issue with the Komat wrapper too, on my AMD HD7850. I tried it quickly at 2 different positions.
If there is a difference between dgVoodoo and the Komat wrapper, then it may be some precision difference.
The DX8 standard specifies 22 bit floating point precision as a minimum for vertex shaders v1.0 and v1.1.
dgVoodoo obviously uses IEEE floats through vs_4_0. I don't know how modern hardware handles old VS code coming through DX9; I think it's up to the driver.
Some may emulate the old, lower precision; some may map the code to modern float processing units - I don't really know.