
dgVoodoo 2 for DirectX 11


Reply 1800 of 3949, by Dege

Rank l33t
lowenz wrote:

If you want a baseline testbed, I'm ready to test a 2009 D3D10 system (a good notebook with a dual-core CPU @ 2.5 GHz + 4 GB RAM + a GeForce 9600M, but with Windows 10 😁) at the classic 1280x800 / 1024x768 resolutions.

OK, thanks, then I'm going to include 10.0 support in the next WIP.

Peixoto wrote:

What do we lose with that? Won't textures actually look better without mipmaps?

Generally speaking, I think not. Some look better, but a Moiré-like effect shows up with high-resolution textures.
But, as I said, we would only lose mipmapping for colorkeyed textures (with colorkeying on), which, I think, aren't even multilevel ones in practice.
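For illustration only (not dgVoodoo's actual code): colorkeying is commonly emulated on shader hardware by having the texture loader turn key-colored texels transparent and the pixel shader discard them. Since mip generation averages keyed and non-keyed texels together, the lower levels no longer match the key exactly, which is why mipmapping and colorkeying combine badly. A minimal SM 4.0 sketch:

static const char* g_colorKeyPS = R"(
    Texture2D    tex : register(t0);
    SamplerState smp : register(s0);

    float4 main(float4 pos : SV_Position, float2 uv : TEXCOORD0) : SV_Target
    {
        float4 c = tex.Sample(smp, uv);
        clip(c.a - 0.5f); // kill texels that came from the colorkey
        return c;
    }
)";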

Peixoto wrote:

I can understand a depth blt, but how many games do you know that actually lock the depth buffer, multisampled or not?

Not many, actually. I can only cite Druuna: Morbus Gravis, which has Alone In The Dark-like rendering: a static background with live characters rendered in 3D on top of it. In this game the z-buffer is also part of the background; the game uploads the background z-data to the z-buffer manually. There is also BlairWitch 3, which is very similar, but I can't remember whether its rendering method is exactly the same.

Edit: Back in the ancient times, locking the z-buffer was a common method for visibility testing, typically used for lens-flare effects. The game rendered the scene and then locked the z-buffer to see what depth values were in the buffer at the 'sun position'. If they were the same as the value used for clearing the z-buffer, then the sun was treated as visible; otherwise some object had been drawn in front of it. (Later, 'occlusion query' was invented to avoid locking the z-buffer.)
It was very typical for Glide games; I don't know how common it was among DX games, but I can't remember encountering one doing that.
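A minimal sketch of that old technique, assuming a lockable 16-bit DirectDraw 7 z-buffer and a known screen-space sun position (manual z-uploads, like the ones Druuna does, go through the same Lock call, just writing instead of reading):

#include <ddraw.h>

// Hypothetical lens-flare visibility test: read back the depth value
// at the sun's screen position and compare it with the clear value.
bool IsSunVisible(IDirectDrawSurface7* zBuffer, int sunX, int sunY, WORD zClearValue)
{
    DDSURFACEDESC2 desc = {};
    desc.dwSize = sizeof(desc);

    // Locking the z-buffer stalls the GPU - the expensive step that
    // occlusion queries later made unnecessary.
    if (FAILED(zBuffer->Lock(NULL, &desc, DDLOCK_READONLY | DDLOCK_WAIT, NULL)))
        return false;

    const BYTE* row = (const BYTE*)desc.lpSurface + sunY * desc.lPitch;
    WORD depth = ((const WORD*)row)[sunX];

    zBuffer->Unlock(NULL);

    // Still the clear value => nothing was drawn over the sun there.
    return depth == zClearValue;
}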

Peixoto wrote:

Can SM 4.0 pixel shaders output depth (SM 3.0 can)? Wouldn't that be enough for depth blts?

Yes, they can emit depth values, but the problem is on the input side: the "z to z" shader has to read its input data from another z-buffer.
On 10.0 the input buffer cannot be multisampled; what's more, I found yesterday that 10.0 doesn't even support raw memory copying (without shaders) from z-buffer to z-buffer. So, for 10.0, it all has to be substituted with a shader z-copy that only works in the non-AA cases.
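A minimal sketch of such a shader z-copy (my naming, not dgVoodoo's actual shader): the source depth buffer is created with a typeless format so it can be bound as a shader resource view, and a fullscreen pass re-emits each value through SV_Depth. Reading a multisampled depth buffer this way needs MSAA depth SRVs, which only arrive with 10.1 - exactly the non-AA limitation above:

// Source bound as an SRV over a D24S8 depth buffer created with
// DXGI_FORMAT_R24G8_TYPELESS and viewed as R24_UNORM_X8_TYPELESS.
static const char* g_zToZPS = R"(
    Texture2D<float> srcDepth : register(t0);

    float main(float4 pos : SV_Position) : SV_Depth
    {
        // Copy the depth value 1:1; pos.xy is the target pixel center.
        return srcDepth.Load(int3(int2(pos.xy), 0));
    }
)";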

Reply 1801 of 3949, by Dege

Rank l33t

I have a new WIP:

http://dege.fw.hu/temp/dgVoodooWIP20.zip

I finished the first version of the 10.0 support and, contrary to what I said previously, it has full precompiled shader and vertex tweening support after all.
But, quoting from the readme:

WIP20 readme wrote:

What you should know about the new features compared to v2.51:

1. Various D3D8 bugfixes; I tried a lot of new D3D8 demos from Pouet.

2. Internal DirectShow hooking support for DirectDraw, to avoid blank screens when a game movie is played back through DShow.

3. Window handling is partly rewritten to avoid rendering deadlocks.

4. Added feature level 10.0 support as an output API. Precompiled shaders are available for it too.

However 10.0 currently has the following limitations:

- No texture mipmapping with Glide
- No mipmapping in DX(<=7) when multilevel colorkeyed textures are used (seems to be a very rare case)
- No Phong shading is available (not much loss)

10.0 is so strict on Z-buffer usage that:

- *Z-buffer copy with forced MSAA doesn't work
- *Z-buffer lock doesn't work at all
- **3D Rendering into cube textures doesn't work with single Z-buffer

*These cases can, and most likely will, cause problems in games with Alone-In-The-Dark-style rendering like Druuna Morbus Gravis and BlairWitch 3.

**I think it's not a problem in practice; it only affects 1-2 sample applications from the DX7/8 SDK. Games don't typically do such operations.

I'm sharing this mainly for testing 10.0; I still haven't checked several problems you reported (Star Wars, the DX8 thrash driver, the stuck resolution, ...).
For 10.0, don't forget to select it in the setup as the output API because the default is 10.1.
If 10.0 proves to be useful then I may put an auto-mechanism into the runtimes for choosing 10.0 when 10.1 is unavailable.
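An auto-mechanism like that is cheap to do with D3D11CreateDevice, which returns the first entry of the requested feature-level array that the hardware supports; a sketch of the idea (my illustration, not dgVoodoo's code):

#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

// Ask for 10.1 first, then 10.0: the runtime picks the first level in
// the array that the adapter supports, so 10.0-only hardware (like a
// GeForce 9600M or an HD 3200) automatically falls back to 10.0.
HRESULT CreateBestDevice(ID3D11Device** device, D3D_FEATURE_LEVEL* chosenLevel)
{
    const D3D_FEATURE_LEVEL levels[] = {
        D3D_FEATURE_LEVEL_10_1,
        D3D_FEATURE_LEVEL_10_0,
    };
    return D3D11CreateDevice(
        NULL, D3D_DRIVER_TYPE_HARDWARE, NULL, 0,
        levels, ARRAYSIZE(levels), D3D11_SDK_VERSION,
        device, chosenLevel, NULL);
}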

Reply 1802 of 3949, by teleguy

Rank Member

I tried the new WIP on my laptop (Windows 7 64-bit, HD 3200) with 10.0 selected.

So far I have tested:

Jedi Knight - the dgVoodoo watermark doesn't show and the hardware acceleration button is missing; software mode works

Incoming - shows the error message "Unable to find suitable device"

MDK - doesn't even launch

Reply 1805 of 3949, by teleguy

Rank Member
lowenz wrote:

Maybe it's a good idea to reset ALL the options (deleting the old config files, the general one AND the ones in the game folders) due to the virtual device changes.

The laptop GPU is DirectX 10.0 only, so I hadn't used dgVoodoo before that; hence no old config files.

Edit: If you mean the game config files, all tested games were freshly installed.

Reply 1806 of 3949, by lowenz

Rank Oldbie
lowenz wrote:
Dege wrote:

- No Phong shading available (tolerable, as it didn't prove to be very useful in practice + performance issues again; it's very computation-heavy)

Why is the option still there (in the configurator)? 😐

I'll answer myself 😁
The old 10.1 feature level is still there AND we got 10.0 as an alternative, a good thing! With 10.0 selected, the Phong setting is in fact greyed out!

Reply 1808 of 3949, by lowenz

Rank Oldbie

Without creating a dedicated profile (see the problem below), the emulation goes well on this old and very weak - by today's standards 😁 - 9600M.

The classic Unreal Gold NaliCastle flyby runs @ 92 FPS vs 180 FPS with "Native DirectDraw" (in a 1024x768 window to avoid Windows 10's VSync limitations), all details maxed out.
IT'S GOOD!

Reply 1809 of 3949, by lowenz

Rank Oldbie

W-O-W, a *helluva* 😁 new problem in Splinter Cell Versus (both the 10.1 and 10.0 rendering paths), on my Radeon 7850! Never ever seen before this release, so it's strictly related to the new D3D8 changes.

*With normal/night vision there's some serious FPS killing going on in open spaces when looking into the distance (1.5 FPS!!!!!!)
SCCT_Versus_2016_04_16_03_12_54_961.png

*With thermal vision the FPS rate goes back up! (30 FPS)
SCCT_Versus_2016_04_16_03_12_49_474.png

It seems to me like an occlusion-related problem! And look at the CPU utilisation hitting its ceiling! (quad-threaded CPU -> 25% = 1 virtual CPU fully loaded)

Reply 1812 of 3949, by Dege

Rank l33t
lowenz wrote:

W-O-W, a *helluva* 😁 new problem in Splinter Cell Versus (both the 10.1 and 10.0 rendering paths), on my Radeon 7850! Never ever seen before this release, so it's strictly related to the new D3D8 changes.

*With normal/night vision there's some serious FPS killing going on in open spaces when looking into the distance (1.5 FPS!!!!!!)
SCCT_Versus_2016_04_16_03_12_54_961.png

*With thermal vision the FPS rate goes back up! (30 FPS)
SCCT_Versus_2016_04_16_03_12_49_474.png

It seems to me like an occlusion-related problem! And look at the CPU utilisation hitting its ceiling! (quad-threaded CPU -> 25% = 1 virtual CPU fully loaded)

Now all dynamically compiled pixel shaders target 4_0, and it turned out that something isn't 4_0-compliant when it comes to sampling depth buffers for the nv shadow hack, so the shader(s) wouldn't compile. Anyway, I fixed it, thanks.
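For context, a hedged sketch of what 4_0-compliant depth-buffer sampling for shadow mapping looks like in general (my illustration; the fixed dgVoodoo shader isn't public): the depth buffer is read through a dedicated comparison sampler with SampleCmpLevelZero, where SM 3.0-era code could get away with looser mixes of samplers and resource types:

// Depth buffer bound as an R24_UNORM_X8_TYPELESS SRV; the comparison
// sampler performs the per-texel depth test (PCF-style) in hardware.
static const char* g_shadowPS = R"(
    Texture2D<float>       shadowMap  : register(t0);
    SamplerComparisonState cmpSampler : register(s0);

    float4 main(float4 pos : SV_Position,
                float4 shadowCoord : TEXCOORD0) : SV_Target
    {
        float3 sc  = shadowCoord.xyz / shadowCoord.w;
        float  lit = shadowMap.SampleCmpLevelZero(cmpSampler, sc.xy, sc.z);
        return float4(lit, lit, lit, 1.0f);
    }
)";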

lowenz wrote:

The general profile "finds" the correct VGA but a dedicated profile doesn't

What profiles are these?

teleguy wrote:

I tried the new WIP on my laptop (Windows 7 64-bit, HD 3200) with 10.0 selected.

So far I have tested:

Jedi Knight - the dgVoodoo watermark doesn't show and the hardware acceleration button is missing; software mode works

Incoming - shows the error message "Unable to find suitable device"

MDK - doesn't even launch

According to the symptoms, it seems as if dgVoodoo couldn't initialize D3D11 at all. Did you set the output API to 10.0 in the setup?

Reply 1813 of 3949, by lowenz

Rank Oldbie

As you can see in the screenshot, I simply added the Unreal System folder path (where the exe is located) -> "Unavailable" adapter.
With the general (generic?) profile the VGA is found correctly.

I'll try launching as admin.

Last edited by lowenz on 2016-04-16, 08:56. Edited 1 time in total.

Reply 1814 of 3949, by Dege

Rank l33t
lowenz wrote:

As you can see in the screenshot, I simply added the Unreal System folder path (where the exe is located) -> "Unavailable" adapter.
With the general (generic?) profile the VGA is found correctly.

I'll try launching as admin.

Ok, thanks. I almost messed it up with some nVidia profiles. I'll check it out.

Reply 1815 of 3949, by lowenz

Rank Oldbie
Dege wrote:
lowenz wrote:

As you can see in the screenshot, I simply added the Unreal System folder path (where the exe is located) -> "Unavailable" adapter.
With the general (generic?) profile the VGA is found correctly.

I'll try launching as admin.

Ok, thanks. I almost messed it up with some nVidia profiles. I'll check it out.

?

Unreal (1998) is my custom profile, but how do "nVidia" profiles play a role here? 😐

Reply 1816 of 3949, by teleguy

Rank Member
lowenz wrote:

As you can see in the screenshot, I simply added the Unreal System folder path (where the exe is located) -> "Unavailable" adapter.
With the general (generic?) profile the VGA is found correctly.

I'll try launching as admin.

I had the same problem. Simply clicking OK will create a local dgVoodoo.conf, and the next time you run dgVoodooSetup from inside the game folder, the adapter will be selectable.

Reply 1818 of 3949, by teleguy

Rank Member
Dege wrote:

According to the symptoms, it seems as if dgVoodoo couldn't initialize D3D11 at all. Did you set the output API to 10.0 in the setup?

Yes, 10.0 was selected. I had already suspected myself that it was something like dgVoodoo not being able to access the GPU at all. I checked dxdiag, but it didn't find any issues; the installed DirectX version was 11 and the DDI version was 10.

Reply 1819 of 3949, by lowenz

Rank Oldbie

Performance report!

dgVoodoo 2.52 WIP (GeForce Ti 4800 profile + 128 MB) + Windows 10 64-bit + 9600M (a 9600M! A 2008/2009 notebook VGA!)

Unreal 2.27i (but with the original 2001 DDRAW) fullscreen 1280x800, maxed out with NO AA -> 113 FPS
Unreal 2.27i (but with the original 2001 DDRAW) fullscreen 1280x800, maxed out with MSAA 4x -> 101 FPS (!!!!!)

It's good! It's really good man!