VOGONS


Bought this (Modern) hardware today


Reply 2300 of 2320, by Ozzuneoj

Rank l33t
clueless1 wrote on 2026-01-17, 12:07:
Ozzuneoj wrote on 2026-01-16, 08:16:

I haven't gone too crazy with the tuning, but I have settled on an undervolt curve that peaks at 2715 MHz at 935 mV, with a +1500 MHz memory overclock (tested stable using Memtest Vulkan). Power consumption is down about 50-60 watts from stock, and between the undervolt and the memory overclock, performance is actually up around 5%! In the 3DMark Speed Way test I scored 6820 at these settings, topped out between 227-236 watts and only hit 58C, while the GPU fans don't even have to spin up much past idle due to the huge cooler. It idles at 18 watts too... which just seems crazy.

What frequency is your monitor? Are you vsyncing or not? I used to have a 144Hz monitor but it started dying (wigging out at 144Hz, but acting normal when I drop to 60Hz). So currently I've got a 60Hz monitor and I just tried vsyncing everything in Nvidia Settings. Two things:
1) power draw is even lower. I'm seeing 40-150 watts depending on the complexity of the scene with the majority of the time being <100 watts. All the while, frames are locked at 60 fps.
2) gameplay feels even smoother when locked at 60 than when unlocked and running up above 100.

I was able to get even more aggressive with my overclock/undervolt, knowing that the vsync will still keep the power in check. I feel like increased power on the bottom end (in the most framerate-taxing scenarios) will be of more help to the glass-smooth feeling. My current settings are: overclock the core by 150 MHz and the memory by 1200 MHz, then flatten the curve at 995 mV to 2880 MHz. Doing the overclock first raises the entire curve between 0 and 995 mV by 150 MHz. Honestly, 995 mV was already at 2700 MHz at stock, so I only raised the 995 mV point from 2850 to 2880 after the 150 MHz overclock.

edit: just plugged my old 144Hz monitor in and I could run it at 120Hz or 100Hz. Tried both, but there were too many instances of game framerates dipping below those two numbers and it ended up feeling less smooth when locking vsync to 120Hz or 100Hz. I actually like locking it at 60Hz best. I can easily run any game at a setting that will *never* dip below 60fps, and it seems to me that when the monitor is locked at a single refresh rate that never deviates from the game's framerate, the experience of fluidity is at its peak.
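
(For anyone trying to follow that curve edit, here is a rough sketch of what the two steps do. The stock voltage/frequency points are invented for illustration; only the +150 MHz offset, the 995 mV flatten point and the 2880 MHz target come from the post above, so treat it as a sketch rather than a real RTX table.)

```python
# Rough illustration of the "+offset, then flatten" curve edit described above.
# The stock curve points are made-up example numbers, not a real V/F table.

STOCK_CURVE = {  # mV -> MHz (hypothetical points)
    900: 2550,
    950: 2625,
    995: 2700,
    1050: 2790,
    1100: 2850,
}

CORE_OFFSET_MHZ = 150    # step 1: "+150 core" shifts every point on the curve up
FLATTEN_AT_MV = 995      # step 2: every point at or above this voltage...
FLATTEN_TO_MHZ = 2880    # ...gets pinned to this frequency (the flat top)

def tuned_curve(stock):
    tuned = {}
    for mv, mhz in sorted(stock.items()):
        if mv >= FLATTEN_AT_MV:
            tuned[mv] = FLATTEN_TO_MHZ         # flat top: the card never runs past ~995 mV
        else:
            tuned[mv] = mhz + CORE_OFFSET_MHZ  # below the flatten point, just the offset
    return tuned

print(tuned_curve(STOCK_CURVE))
# {900: 2700, 950: 2775, 995: 2880, 1050: 2880, 1100: 2880}
```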

I have two 240Hz monitors, and I looooove high refresh rates and high frame rates. Sadly, my eyes were spoiled many years ago by running games at 1600x1200 @ 109Hz on my old CRT. I got a 60Hz LCD in ~2010 and the consistent geometry and clarity were nice, but the motion clarity and lower refresh rate just never felt good. I got a 120Hz BenQ gaming LCD in 2015 and it was a huge improvement. Going to 240Hz a couple years ago was also very noticeable for me, so I try to keep my frame rates as high as possible. I really don't play the latest AAA games though because they're just not my thing. And I guess that works out, because most of them would absolutely not run anywhere near 240Hz... or possibly even 120Hz. 🤣

I try to cap frame rates or use Vsync where possible. I have found that in some games vsync or in-game caps destroy performance. And it isn't the usual half-vsync frame rate thing... it's something engine related. However, if I set a universal 240Hz maximum frame rate in the nvidia control panel it seems to not have these issues and I still don't experience enough screen tearing to be an issue. For example, in the game Teardown I am now able to run the game at a solid 240fps during "normal" gameplay with basically no fluctuations at all. When I start bringing down buildings and tens of thousands of voxels are flying in all directions, it comes down of course, but that is to be expected. With the in-game vsync enabled it would either lock it at 120fps or it would fluctuate a lot between 100 and 160 or so, which was weird.

Power consumption definitely comes down with lower frame rates though. That was the main reason I enabled the cap. I can't remember what the GPU's power consumption was when I was playing Teardown at 240fps... I feel like it was in the 150-170 watt range but I will have to check again. I am very glad to be able to cap all games universally though... Some don't have a cap in the menus and loading screens and my counter will show 3500fps+, which feels unnecessary somehow. 😅

Now for some blitting from the back buffer.

Reply 2301 of 2320, by pete8475

Rank Oldbie

Late last night I made a silly decision and ordered a 12GB RTX4070 for ~$600 Canadian on ebay. It's a used card pulled from an Acer computer.

Hopefully it arrives during the week so I can test it out next weekend.

Reply 2302 of 2320, by H3nrik V!

Rank l33t
Ozzuneoj wrote on 2026-01-17, 20:03:

I have two 240Hz monitors, and I looooove high refresh rates and high frame rates. Sadly, my eyes were spoiled many years ago by running games at 1600x1200 @ 109Hz on my old CRT.

Wow, that CRT must've cost an arm and a leg back then? Maybe even a kidney. Must have looked downright beautiful!

If it's dual it's kind of cool ... 😎

--- GA586DX --- P2B-DS --- BP6 ---

Please use the "quote" option if asking questions to what I write - it will really up the chances of me noticing 😀

Reply 2303 of 2320, by bestemor

Rank Oldbie
H3nrik V! wrote on 2026-01-18, 17:13:
Ozzuneoj wrote on 2026-01-17, 20:03:

I have two 240Hz monitors, and I looooove high refresh rates and high frame rates. Sadly, my eyes were spoiled many years ago by running games at 1600x1200 @ 109Hz on my old CRT.

Wow, that CRT must've cost an arm and a leg back then? Maybe even a kidney. Must have looked downright beautiful!

I don't remember any kidneys or other body parts being involved, though?
A 21-inch CRT of decent quality with a 340 MHz pixel clock (Samsung 1100DF/ViewSonic P227f) only cost around $400 pre-inflation (or ca. 4000 DKK at the old conversion rates) back in 2005-2006, when the last few remaining boxed/new retail specimens were available for purchase online. Max resolution was 2048x1536 @ 75 Hz.
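
(Quick back-of-the-envelope check that those numbers hang together: with typical GTF-style blanking, which is an assumption here rather than the monitor's actual timings, 2048x1536 @ 75 Hz lands right around that 340 MHz pixel clock.)

```python
# Does 2048x1536 @ 75 Hz fit in a ~340 MHz pixel clock?
# The blanking overheads below are rough GTF-style guesses, so this is an estimate only.

h_active, v_active, refresh = 2048, 1536, 75
h_total = h_active * 1.35   # assume ~35% horizontal blanking
v_total = v_active * 1.05   # assume ~5% vertical blanking

pixel_clock_mhz = h_total * v_total * refresh / 1e6
print(f"{pixel_clock_mhz:.0f} MHz")   # ~334 MHz, so a 340 MHz pixel clock is just enough
```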

Reply 2304 of 2320, by Ozzuneoj

Rank l33t
H3nrik V! wrote on 2026-01-18, 17:13:
Ozzuneoj wrote on 2026-01-17, 20:03:

I have two 240Hz monitors, and I looooove high refresh rates and high frame rates. Sadly, my eyes were spoiled many years ago by running games at 1600x1200 @ 109Hz on my old CRT.

Wow, that CRT must've cost an arm and a leg back then? Maybe even a kidney. Must have looked downright beautiful!

Nope, I bought it as a refurb in 2005 for under $200 US. It is an HP P1230, which is a rebadged Mitsubishi Diamond Pro 2070SB.

https://crtdatabase.com/crts/mitsubishi/mitsubishi-2070sb

2048x1536 at 86Hz
1600x1200 at 109Hz
1440x1080 at 120Hz
1024x768 at 160Hz

I still have it and it seems to still work as well as it did the day I bought it over 20 years ago. And yes, it has been interesting to lug it around with me every time I have moved, and I have had it set up on the 2nd, 3rd and 4th floors of a few different houses in that time. I will say though, it weighs almost 30 pounds less than the much hyped Sony GDM-FW900, takes up less space and supports higher refresh rates at 4:3 resolutions. I think this is a contender for the best CRT ever made, honestly. Even the Iiyama Vision Master Pro 514 seems to have almost the same specs, so probably uses the same Mitsubishi tube.

I cracked mine open a few years ago to see how the caps looked, and I couldn't really get to them easily but everything I could see looked perfect, so they apparently managed to keep plague caps out of these too.

Last edited by Ozzuneoj on 2026-01-19, 09:30. Edited 1 time in total.

Now for some blitting from the back buffer.

Reply 2305 of 2320, by H3nrik V!

Rank l33t

Holy smoke, I've been too late to the game buying CRTs 😭 Back then, when everybody was eager to get on the LCD wave, we all just scrapped our Trinitron-based monitors. D'oh!

If it's dual it's kind of cool ... 😎

--- GA586DX --- P2B-DS --- BP6 ---

Please use the "quote" option if asking questions to what I write - it will really up the chances of me noticing 😀

Reply 2306 of 2320, by Nexxen

Rank l33t
pete8475 wrote on 2026-01-17, 23:43:

Late last night I made a silly decision and ordered a 12GB RTX4070 for ~$600 Canadian on ebay. It's a used card pulled from an Acer computer.

Hopefully it arrives during the week so I can test it out next weekend.

Sounds about right, it's like 380€. USD is 430.
Unless 600 is a lot in Canada.

PC#1 Pentium 233 MMX - 98SE
PC#2 PIII-1Ghz - 98SE/W2K

- "One hates the specialty unobtainium parts, the other laughs in greed listing them under a ridiculous price" - kotel studios
- Bare metal ist krieg.

Reply 2307 of 2320, by pete8475

Rank Oldbie
Nexxen wrote on 2026-01-19, 08:08:
pete8475 wrote on 2026-01-17, 23:43:

Late last night I made a silly decision and ordered a 12GB RTX4070 for ~$600 Canadian on ebay. It's a used card pulled from an Acer computer.

Hopefully it arrives during the week so I can test it out next weekend.

Sounds about right, it's like 380€. USD is 430.
Unless 600 is a lot in Canada.

No, not a huge amount by any means, but prices on video cards have been creeping up lately, so I ended up getting this 12GB card now rather than a 16GB card in a few months as I had planned.

Reply 2308 of 2320, by Nexxen

Rank l33t
pete8475 wrote on 2026-01-19, 10:58:
Nexxen wrote on 2026-01-19, 08:08:
pete8475 wrote on 2026-01-17, 23:43:

Late last night I made a silly decision and ordered a 12GB RTX4070 for ~$600 Canadian on ebay. It's a used card pulled from an Acer computer.

Hopefully it arrives during the week so I can test it out next weekend.

Sounds about right, it's like 380€. USD is 430.
Unless 600 is a lot in Canada.

No, not a huge amount by any means, but prices on video cards have been creeping up lately, so I ended up getting this 12GB card now rather than a 16GB card in a few months as I had planned.

Well, having an RX 5600 XT (like a 2060) with 6 GB of RAM, I can't really see the difference between 12 and 16 GB 😀
Anyway have fun with it!

PC#1 Pentium 233 MMX - 98SE
PC#2 PIII-1Ghz - 98SE/W2K

- "One hates the specialty unobtainium parts, the other laughs in greed listing them under a ridiculous price" - kotel studios
- Bare metal ist krieg.

Reply 2309 of 2320, by BitWrangler

Rank l33t++

Historically, I don't think doubling up on GPU RAM ever paid off. By the time the minimum spec reached the higher RAM amount for that card, the GPU was too slow or didn't have the features. It also never boded well for overclocking: games could end up more demanding of GPU and memory speed than of memory size, and your buddy who bought the middle-of-the-road RAM loadout could overclock into smoothness while you sat there in stutter. So from a "future proofing" point of view, it could actually be detrimental.

The most blatant example I can remember was the GeForce 4 class: the 128MB ones actually got a slower spec of RAM, as well as being hampered by the statistical whammy of "you have to flip twice as many heads in a row to win the silicon lottery" to get a good overclock. That resulted in 128MB cards getting hung up around a 15k 3DMark score while 64MB cards could blast past them into the 20k-and-above range (given stout CPU and motherboard performance as well).

Unicorn herding operations are proceeding, but all the totes of hens teeth and barrels of rocking horse poop give them plenty of hiding spots.

Reply 2310 of 2320, by robertmo3

Rank Oldbie

No need to worry, only the 5090 has doubled RAM 😉

Reply 2311 of 2320, by Mandrew

Rank Member

Total modern hardware noob here: Are these 9060XT cards really that picky when it comes to displays hooked to the HDMI port?
I tried three: one is an old full-HD TV that is confirmed working with a Radeon 270X, but there was no image with the 9060XT (signal yes, image no). The second is a 5-year-old 4K Hisense TV that displays the BIOS in green, and in Windows 11 the colors are a messed-up bluish pink. The last one is an ancient Sharp HD-ready LCD TV that works perfectly.
I tried two HDMI cables, no idea what version they are. Is it really possible that a modern GPU doesn't work with a 2020 4K TV but does with a mediocre TV from 2009?
I know, guys, I'm supposed to get a modern monitor to use with a new GPU, and I'm planning to do so. I just wanted to test with these until then, and I'm really surprised that it didn't go smoothly at all.

Reply 2312 of 2320, by Ozzuneoj

Rank l33t
Mandrew wrote on 2026-01-19, 18:08:

Total modern hardware noob here: Are these 9060XT cards really that picky when it comes to displays hooked to the HDMI port?
I tried three: one is an old full-HD TV that is confirmed working with a Radeon 270X, but there was no image with the 9060XT (signal yes, image no). The second is a 5-year-old 4K Hisense TV that displays the BIOS in green, and in Windows 11 the colors are a messed-up bluish pink. The last one is an ancient Sharp HD-ready LCD TV that works perfectly.
I tried two HDMI cables, no idea what version they are. Is it really possible that a modern GPU doesn't work with a 2020 4K TV but does with a mediocre TV from 2009?
I know, guys, I'm supposed to get a modern monitor to use with a new GPU, and I'm planning to do so. I just wanted to test with these until then, and I'm really surprised that it didn't go smoothly at all.

That doesn't sound right to me. If you google 9060XT HDMI problems and don't see an explanation of what you're experiencing, sadly the card might be defective. Granted, it's pretty rare these days for a card to have problems like that right out of the box.

Now for some blitting from the back buffer.

Reply 2313 of 2320, by Munx

Rank Oldbie
BitWrangler wrote on 2026-01-19, 16:36:

Historically, I don't think doubling up on GPU RAM ever paid off. By the time the minimum spec reached the higher RAM amount for that card, the GPU was too slow or didn't have the features.

Historically, graphics cards used to come with more VRAM than you needed at the time and would double that amount every couple of generations or so. We have not gotten much of a VRAM bump for almost a decade now. Had the 3070 Ti come with 16GB of memory instead of 8, I would still be very happy with it and would not have upgraded.

My builds!
The FireStarter 2.0 - The wooden K5
The Underdog - The budget K6
The Voodoo powerhouse - The power-hungry K7
The troll PC - The Socket 423 Pentium 4

Reply 2314 of 2320, by Mandrew

Rank Member
Ozzuneoj wrote on 2026-01-19, 18:40:

sadly the card might be defective. Granted, it's pretty rare these days for a card to have problems like that right out of the box.

Entirely possible, but the card is running as expected with the Sharp display. Games are working, colors are OK, temps are fine. It's hard to believe that a card would be bad with one display but fine with another.

Reply 2315 of 2320, by Ozzuneoj

Rank l33t
Mandrew wrote on 2026-01-19, 19:08:
Ozzuneoj wrote on 2026-01-19, 18:40:

sadly the card might be defective. Granted, it's pretty rare these days for a card to have problems like that right out of the box.

Entirely possible, but the card is running as expected with the Sharp display. Games are working, colors are OK, temps are fine. It's hard to believe that a card would be bad with one display but fine with another.

If the old HD Sharp is only 720P, then it could be that it just isn't affected by whatever is not working on the card, while the ones demanding higher resolutions are. You could try hooking up two displays so that you can (in Windows) select one that isn't working and change it to a lower resolution to see if that makes an image appear. That could narrow down the issue at least.

A five year old 4k TV should work on any recent video card. Heck, my LG C1 OLED is almost a 5 year old model now, and I still feel like it's practically new. heh

Again, if you Google it and people aren't complaining in droves about HDMI compatibility problems then it is more likely to be a defect with something. Maybe try getting a brand new HDMI cable to be sure it isn't a couple of inadequate or defective cables.

Now for some blitting from the back buffer.

Reply 2316 of 2320, by agent_x007

Rank Oldbie
Mandrew wrote on 2026-01-19, 18:08:

Total modern hardware noob here: Are these 9060XT cards really that picky when it comes to displays hooked to the HDMI port?
I tried three: one is an old full-HD TV that is confirmed working with a Radeon 270X, but there was no image with the 9060XT (signal yes, image no). The second is a 5-year-old 4K Hisense TV that displays the BIOS in green, and in Windows 11 the colors are a messed-up bluish pink. The last one is an ancient Sharp HD-ready LCD TV that works perfectly.
I tried two HDMI cables, no idea what version they are. Is it really possible that a modern GPU doesn't work with a 2020 4K TV but does with a mediocre TV from 2009?
I know, guys, I'm supposed to get a modern monitor to use with a new GPU, and I'm planning to do so. I just wanted to test with these until then, and I'm really surprised that it didn't go smoothly at all.

My guess would be (U)EFI boot or 4G Decode/ReBAR doing weird things.
Does it do it with CSM enabled (or disabled)?
I had a black screen issue on the first boot of my 9060 XT 16GB (Sapphire Pure), but after disabling 4G Decode and enabling CSM it started working (however, I had this issue on DP, not HDMI). After installing the AMD driver inside Windows, I could re-enable both ReBAR and 4G Decode and disable CSM again on the next restart, and everything worked fine.

Reply 2317 of 2320, by Mandrew

Rank Member

I just remembered: when the Hisense 4K TV is connected to the 9060XT the TV says 4K@30Hz, meaning that the cable is probably an older 1.4 version. Could that cause this? I have to buy a better cable before I RMA this thing.
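
(Rough math on why an older cable would top out at 4K@30Hz; this assumes the standard CTA-861 4K timings and the commonly quoted ~340 MHz / ~600 MHz TMDS limits for HDMI 1.4 vs 2.0, so take the exact figures as approximate.)

```python
# Why a 1.4-era HDMI link tends to negotiate 4K at only 30 Hz:
# HDMI 1.4 tops out around a 340 MHz TMDS clock, HDMI 2.0 around 600 MHz.
# The standard 4K frame is 4400 x 2250 total pixels including blanking.

h_total, v_total = 4400, 2250
for refresh_hz in (30, 60):
    pixel_clock_mhz = h_total * v_total * refresh_hz / 1e6
    print(f"4K @ {refresh_hz} Hz -> {pixel_clock_mhz:.0f} MHz pixel clock")

# 4K @ 30 Hz -> 297 MHz pixel clock  (fits under the HDMI 1.4 ceiling)
# 4K @ 60 Hz -> 594 MHz pixel clock  (needs an HDMI 2.0-capable, 18 Gbps-rated cable)
```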

agent_x007 wrote on 2026-01-19, 20:27:

My guess would be (U)EFI boot or 4G Decode/ReBAR doing weird things.

I'll check all of those things soon, new rig is under construction.

Reply 2318 of 2320, by Munx

Rank Oldbie
Mandrew wrote on Yesterday, 09:20:

I just remembered: when the Hisense 4K TV is connected to the 9060XT the TV says 4K@30Hz, meaning that the cable is probably an older 1.4 version. Could that cause this? I have to buy a better cable before I RMA this thing.

agent_x007 wrote on 2026-01-19, 20:27:

My guess would be (U)EFI boot or 4G Decode/ReBAR doing weird things.

I'll check all of those things soon, new rig is under construction.

It's also worth updating to a new motherboard BIOS - it's not uncommon for newer GPUs to just not talk with older motherboards properly, even if it's just a year or so difference.

My builds!
The FireStarter 2.0 - The wooden K5
The Underdog - The budget K6
The Voodoo powerhouse - The power-hungry K7
The troll PC - The Socket 423 Pentium 4

Reply 2319 of 2320, by Mandrew

Rank Member

Got a new flashy "8K" HDMI 2.1 cable and the problem went away. I checked the other cable just for fun: it didn't show the green BIOS anymore either, and the 4K Hisense TV now reports 4K@60Hz. So the HDMI cable was old, but it wasn't what caused the problem. I tried it with Doom Eternal and it worked fine (capped at 60 FPS). Then I went to get the kid from school, and when I got back and turned the PC on, it was a green BIOS again no matter which cable I used. Windows is fine, Doom Eternal plays fine, and I even cooked the GPU for 10 minutes with FurMark trying to get an error out of it: nothing. Changing CSM settings doesn't make a difference. A BIOS update is next.
Basically, the PC turns on, the white ASRock logo appears and the screen is green, the BIOS is also green, and then Windows 11 loads and it's all nice and dandy from there.
The green thing only happens with the 4K Hisense TV. I guess I have to find someone with a decent monitor to make the final decision because I don't want to deal with the RMA if it's all caused by a shitty TV and not a faulty GPU.