First post, by Squall Leonhart
DgVoodoo leaks D3D12 resources when presentation and rendering are not on the same GPU (muxless switchable graphics behavior) on Intel + GeForce configurations.
Squall Leonhart wrote on 2026-02-01, 09:13:
DgVoodoo leaks D3D12 resources when presentation and rendering are not on the same GPU (muxless switchable graphics behavior) on Intel + GeForce configurations.
No, it does not. If you mean that guru3dforums (or whatever) thread, that was actually me. But I have a plain desktop machine, so it's not even a muxless case; it's something they just imagined for themselves. But even with a muxless setup, dgVoodoo runs on the same path, so it should not matter.
Dege wrote on 2026-02-01, 16:29:
Squall Leonhart wrote on 2026-02-01, 09:13:
DgVoodoo leaks D3D12 resources when presentation and rendering are not on the same GPU (muxless switchable graphics behavior) on Intel + GeForce configurations.
No, it does not. If you mean that guru3dforums (or whatever) thread, that was actually me. But I have a plain desktop machine, so it's not even a muxless case; it's something they just imagined for themselves. But even with a muxless setup, dgVoodoo runs on the same path, so it should not matter.
I can assure you, yes it does.
It even happens on Intel + Intel (Iris IGP presentation + Iris Max dGPU rendering).
How?
What are the repro steps?
Btw, do you mean leaking D3D12 resources or plain memory?
Because what I posted on guruwhatever was about leaking system memory, and that occurred with an NV dGPU + Intel iGPU configuration when running the rendering on the iGPU.
MSIA hooked the D3D12 Present call and always called into the NV API from the hook, so I got an internal error from an OS component on every frame presentation call, accompanied by a constantly growing memory leak. When I closed MSIA without closing the demo doing the rendering, the leaking stopped. When I restarted MSIA, it continued.
Now that I happen to have an Intel dGPU + Intel iGPU setup, I "interestingly" cannot reproduce the leak, because there is no NV API to call into.
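To make the suspected mechanism concrete, here is a purely illustrative toy model of the pattern described above: an overlay hooks the swap chain's Present, allocates a per-frame helper object, and its error path (a failing driver API call) skips the release. This is a sketch of the concept only; the class and method names are made up, and it is not MSIA's, NVIDIA's, or dgVoodoo's actual code.

```python
class FakeOverlay:
    """Toy stand-in for an overlay that hooks D3D12 Present (hypothetical names)."""

    def __init__(self, nv_api_fails):
        # nv_api_fails models the iGPU-rendering case where the NV API
        # call inside the hook returns an internal error every frame.
        self.nv_api_fails = nv_api_fails
        self.live_resources = []

    def hooked_present(self):
        res = object()                  # stand-in for a per-frame helper resource
        self.live_resources.append(res)
        if self.nv_api_fails:
            # Error path: cleanup is skipped, so the resource leaks.
            return "internal error"
        self.live_resources.pop()       # Normal path releases the resource.
        return "ok"


# One leaked object per presented frame while the hook is active:
overlay = FakeOverlay(nv_api_fails=True)
for _ in range(100):
    overlay.hooked_present()
print(len(overlay.live_resources))      # -> 100

# With the API call succeeding (e.g. rendering on the NV dGPU), nothing leaks:
healthy = FakeOverlay(nv_api_fails=False)
for _ in range(100):
    healthy.hooked_present()
print(len(healthy.live_resources))      # -> 0
```

This also matches the observation that closing the overlay stops the growth: once the hook stops running per frame, no new unreleased objects are created.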
(Btw, as a side note, an NV 5060 Ti is practically unusable with 32-bit D3D12. The GPU just hangs or crashes when GPU usage reaches 85-90%.)