VOGONS


First post, by RetroJunkHoarder88

Rank Newbie

Hey there, I'm trying to build a gaming rig out of components I have lying around, to take to LAN parties and play retro games.

Since this machine is meant to play 2000s-era games, I've decided to use a CRT: a Dell Trinitron P992.
Specs:
Video: NVIDIA GTX 970 (no VGA output, just DVI, DisplayPort and HDMI)
Monitor connected via a passive DVI to VGA adapter (4-pin)
Motherboard: ASUS P8Z77-V with the latest BIOS (1402)
CPU: Core i7-2700K

Using Windows 10, I noticed that the screen constantly changes brightness depending on whether a dark or a bright window is being displayed. I'm running the latest drivers this card supports. After some googling, it seems there is some automatic brightness setting that is just flat out missing from my Windows and NVIDIA display settings. I also checked the Power settings for this option, but no luck; it's just missing. Some people suggested manually editing the registry, but I can never find the exact entry that others say works for them. I just don't have it.
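
In case anyone wants to try the same thing, here is roughly what I ran from an elevated prompt, wrapped in a little Python script. Fair warning: SUB_VIDEO and ADAPTBRIGHT are the standard powercfg aliases for the display subgroup and the adaptive brightness setting, but on a desktop with no ambient light sensor the setting may simply not exist, which would match it being missing from the UI.

import subprocess

def run(cmd):
    # Echo and run a powercfg command; don't abort if the setting is absent.
    print(">", " ".join(cmd))
    subprocess.run(cmd, check=False)

# Unhide the adaptive brightness setting in Power Options (if it exists).
run(["powercfg", "-attributes", "SUB_VIDEO", "ADAPTBRIGHT", "-ATTRIB_HIDE"])

# Force adaptive brightness off for the active scheme, on AC and DC power.
run(["powercfg", "-setacvalueindex", "SCHEME_CURRENT", "SUB_VIDEO", "ADAPTBRIGHT", "0"])
run(["powercfg", "-setdcvalueindex", "SCHEME_CURRENT", "SUB_VIDEO", "ADAPTBRIGHT", "0"])
run(["powercfg", "-setactive", "SCHEME_CURRENT"])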

Here is where it gets weird.

I thought, oh, this is just some Win 10 nonsense, so I installed Windows 7 SP1, and lo and behold: same problem.

I then pulled the video card and switched to the integrated Intel HD Graphics (3000, I think?). The issue went away.

OK, so it's the video card. I then tried other adapters: DVI (1-pin) to VGA, active DisplayPort to VGA, active HDMI to VGA and active DVI to VGA. None of them helped.

So now I either use integrated graphics, drop the CRT, or find some other video card with a native VGA port and pray.

It seems like some data is not being transmitted through the adapter, but I'm not sure. Has anyone else seen this, and if so, any suggestions on something else I could try to fix this?
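
On the "data not transmitted" theory: the obvious candidate would be the EDID, which travels over the VGA DDC pins (12 and 15) and which a flaky adapter can drop. Below is a rough Python sketch to dump whatever EDID Windows has cached for each monitor; the registry path is the standard one, but the model/instance key names will differ from machine to machine.

import winreg

BASE = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

def subkeys(key):
    # Yield all subkey names of an open registry key.
    i = 0
    while True:
        try:
            yield winreg.EnumKey(key, i)
            i += 1
        except OSError:
            return

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, BASE) as display:
    for model in subkeys(display):
        with winreg.OpenKey(display, model) as mkey:
            for instance in subkeys(mkey):
                try:
                    with winreg.OpenKey(mkey, instance + r"\Device Parameters") as params:
                        edid, _ = winreg.QueryValueEx(params, "EDID")
                        # A valid EDID starts with 00 ff ff ff ff ff ff 00.
                        print(model, instance, "EDID header:", edid[:8].hex())
                except OSError:
                    pass  # no EDID cached for this instance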

Reply 1 of 2, by bZbZbZ

Rank Member

Interesting. I have a very similar system (Core i7-3770, GTX 980 driving a CRT using a passive DVI-I to VGA adapter), and I've never experienced any automatic brightness changes. I'm running a dual boot of Windows 10 x64 and Windows XP 32-bit.

When I search online (you have probably searched longer than me, so you might have come across the same material already), I see suggestions that this might be a monitor color profile issue (Start > type 'color management')... and some other people reporting that the auto-brightness issues stopped when they disabled their motherboard's (MSI) overclocking software.
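
If you want a quick look at what profiles are even installed before digging through that UI, a tiny Python sketch like this lists the ICC/ICM files in the standard Windows profile store (which profile is actually assigned to the CRT still has to be checked in Color Management itself):

import os

# Standard Windows color profile store; assumes a default install.
color_dir = os.path.join(os.environ["SystemRoot"], "System32", "spool", "drivers", "color")
for name in sorted(os.listdir(color_dir)):
    if name.lower().endswith((".icc", ".icm")):
        print(name)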

Do you have any older LCD flat panel displays lying around? If you have an LCD monitor with VGA input, do you get auto-brightness when you connect the GTX 970 to it via VGA through the passive DVI-I adapter? How about when you connect the GTX 970 to an LCD flat panel using DVI-D? HDMI? Just some ideas that might help you narrow down which combination of parts is involved... good luck...

Reply 2 of 2, by RetroJunkHoarder88

Rank Newbie

Thank you for the reply and for sharing your experience. Since you have similar specs and no issues, I'll try to see whether the monitor itself is the cause. It would be weird, though, because the problem definitely stopped when I switched to Intel graphics. To add some more info: this monitor was purchased on eBay, and the seller performed a WinDAS calibration before selling it, so maybe he messed something up.

I do have some older LCDs with VGA inputs, so I'll try them and report back.