First post, by RetroJunkHoarder88
Hey there, I'm trying to build a gaming rig out of components I have lying around, to take to LAN parties and play retro games.
Because this is meant to play 2000s-era games, I have decided to use a CRT: a Dell Trinitron P992.
Specs:
Video: NVIDIA GTX 970 with no VGA output, just DVI, DisplayPort and HDMI.
Monitor is connected via a passive DVI-to-VGA adapter (the kind with the 4 analog pins)
Motherboard: ASUS P8Z77-V with latest BIOS (1402)
CPU: Core i7-2700K
Using Windows 10, I noticed that the screen constantly changes its brightness depending on whether a dark or bright window is being displayed. I am running the latest drivers this card supports. After googling it, it seems there is some automatic brightness setting that is just flat-out missing from my Windows and NVIDIA display settings. I also checked the Power settings for this option, but no luck; it's just missing. Some people suggested manually editing the registry, but I can never find the exact entry that others say works for them. I just don't have it.
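For anyone following along: the setting those threads are talking about is the "Enable adaptive brightness" power option, which Windows hides by default. A sketch of the commands usually suggested to unhide and disable it is below; the GUIDs are the standard Microsoft ones for the video subgroup and the adaptive brightness setting, but on a desktop board with no ambient light sensor the entry may simply not exist (which matches what I'm seeing), and then these commands just error out.

```shell
:: Unhide "Enable adaptive brightness" in Power Options
:: (first GUID = video settings subgroup, second = adaptive brightness setting)
powercfg -attributes 7516b95f-f776-4464-8c53-06167f40cc99 fbd9aa66-9553-4097-ba44-ed6e9d65eab8 -ATTRIB_HIDE

:: Force it off (0 = off) for the active power plan, on AC and DC
powercfg -setacvalueindex SCHEME_CURRENT 7516b95f-f776-4464-8c53-06167f40cc99 fbd9aa66-9553-4097-ba44-ed6e9d65eab8 0
powercfg -setdcvalueindex SCHEME_CURRENT 7516b95f-f776-4464-8c53-06167f40cc99 fbd9aa66-9553-4097-ba44-ed6e9d65eab8 0

:: Reapply the current plan so the change takes effect
powercfg -setactive SCHEME_CURRENT
```

Either way, since the problem follows the video card and not the OS (see below), this setting probably isn't the culprit here.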
Here is where it gets weird.
I thought "oh, this is just some Win 10 nonsense," so I installed Windows 7 SP1 and, lo and behold, same problem.
I then pulled out the video card and switched to the integrated Intel HD Graphics (3000, I think?). The issue went away.
OK, so it's the video card. I then tried other adapters: DVI (1 pin) to VGA, active DisplayPort to VGA, active HDMI to VGA, and active DVI to VGA. None worked.
So now I either use integrated graphics, drop the CRT, or find some other video card with a VGA port and pray.
It seems like some data is not being transmitted through the adapter, but I'm not sure. Has anyone else seen this, and if so, any suggestions on something else I could try to fix it?