VOGONS


Reply 20 of 28, by Almoststew1990

User metadata
Rank Oldbie
ZellSF wrote:

Increasing the resolution to 2560x1440 would be more beneficial to image quality than increasing settings. You can probably get a 1070 Ti to work hard at 1920x1080@60Hz; it just isn't a good use of resources.

Well, I got an Acer Nitro Something 1440p 144Hz monitor during Black Friday, and I have to run it at 0.75x resolution scale at medium settings to get 40 to 60 fps. I tried to run natively at 1440p but had to use low settings and, in my opinion, it looked a lot worse like that than medium settings at 1080p on a 1080p monitor. For me, things like higher draw distance are a better use of the GPU than higher resolution when it comes to "better" graphics, which for me means "more believable" graphics.

Edit: this is RDR 2 with my 1070 Ti and my 3700X!

In all games I've had to drop settings dramatically to get 100 fps at 1440p - after all, I am asking the GPU to render nearly twice as many pixels, nearly twice as fast!
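
As a rough sanity check on that, a few lines of Python with purely illustrative numbers (just the standard resolutions and the frame-rate targets mentioned above):

# Back-of-the-envelope: 1440p@100fps vs 1080p@60fps workload
pixels_1080p = 1920 * 1080          # 2,073,600 pixels per frame
pixels_1440p = 2560 * 1440          # 3,686,400 pixels per frame
print(pixels_1440p / pixels_1080p)  # ~1.78x the pixels per frame
print(100 / 60)                     # ~1.67x the frames per second
print((pixels_1440p / pixels_1080p) * (100 / 60))  # ~2.96x the pixel throughput overall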

Reply 21 of 28, by SPBHM

User metadata
Rank Oldbie
Almoststew1990 wrote:
SPBHM wrote:

yes but, I think this sort of setup is going to drop massively in terms of viability for gaming in a year, when the new consoles are out a lot of upgrading will have to happen, for now it's mostly fine.

What you say sounds sensible, but this sort of setup survived the current gen of consoles going very multi-threaded. I would imagine its gaming suitability isn't going to suddenly drop, in the same way quad cores didn't suddenly become useless when the PS4 launched.

SPBHM wrote:

I'm still using my budget Sandy Bridge solution from early 2012; it still feels better than the higher-end CPUs I had in the early 2000s did after a year of use.

Yes, I think there has been a huge slowdown in CPU tech over the last 10 years, compared to the 20 years before.

The difference is that the PS4 had a very slow CPU with just lots of cores (single-thread performance can drop below the original Athlon 64 from 2003, so even with 6-7 cores available for gaming, multi-threaded performance is not very impressive, often behind a Sandy Bridge i3 with only 2 real cores). This time the PS5 is going to have a CPU likely comparable to at least a Ryzen 1700X, which is still fairly decent, so I expect it to be very different.

Reply 22 of 28, by ZellSF

User metadata
Rank l33t
Almoststew1990 wrote:

Well, I got an Acer Nitro Something 1440p 144Hz monitor during Black Friday, and I have to run it at 0.75x resolution scale at medium settings to get 40 to 60 fps. I tried to run natively at 1440p but had to use low settings and, in my opinion, it looked a lot worse like that than medium settings at 1080p on a 1080p monitor. For me, things like higher draw distance are a better use of the GPU than higher resolution when it comes to "better" graphics, which for me means "more believable" graphics.

Edit: this is RDR 2 with my 1070 Ti and my 3700X!

If I've heard correctly, that's one of the more demanding games out there and not representative even of how other AAA games run. Games that can't maintain 60 FPS at 1080p with medium settings (which 1440p with a 0.75 resolution scale effectively is) on a 1070 Ti are rare.
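
To spell out the resolution-scale arithmetic, here is a minimal Python sketch, assuming the 0.75 factor is applied to both axes, which is how most in-game render-scale sliders work:

# 0.75x render scale on a 2560x1440 display
scaled_w = int(2560 * 0.75)                # 1920
scaled_h = int(1440 * 0.75)                # 1080
print(scaled_w, scaled_h)                  # 1920 1080
print(scaled_w * scaled_h == 1920 * 1080)  # True -> same pixel count as native 1080p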

Almoststew1990 wrote:

In all games I've had to drop settings dramatically to get 100 fps at 1440p - after all, I am asking the GPU to render nearly twice as many pixels, nearly twice as fast!

Though I apparently didn't actually say it, I meant the 1070 Ti was suited for a higher resolution OR a higher refresh rate, although for a lot of games both are achievable.

Thankfully you don't need to reach the "native" refresh rate of your screen; sadly, you do have to reach the native resolution, unless you're OK with interpolation blur.

Reply 23 of 28, by Davros

User metadata
Rank l33t

I have an i5-2400 running at stock (3.1 GHz); the OP's scores are in brackets:
CPU-Z Single Thread - 325 (299)
CPU-Z Multi Thread - 1259 (1550)

Guardian of the Sacred Five Terabytes of Gaming Goodness

Reply 24 of 28, by oeuvre

User metadata
Rank l33t

Try pairing an RTX 2080 with a Core 2 Quad Q9400.

HP Z420 Workstation Intel Xeon E5-1620, 32GB, RADEON HD7850 2GB, SSD + HD, XP/7

Reply 25 of 28, by Almoststew1990

User metadata
Rank Oldbie
Davros wrote:

I have an i5-2400 running at stock (3.1 GHz); the OP's scores are in brackets:
CPU-Z Single Thread - 325 (299)
CPU-Z Multi Thread - 1259 (1550)

I'd be interested to know if you've got any benchmarks of the games above, to see what CPU utilisation you get!

Reply 26 of 28, by pixel_workbench

User metadata
Rank Member

I'm still using the i7 860 that I bought in 2009, and it still runs everything I want. However, I disagree about the "clunky base clock overclocking" - there's nothing clunky about it. What's clunky is the way Intel intentionally disabled overclocking on most CPUs that followed, while also disabling some virtualization features on the special K models.

My Videos | Website
P2 400 unlocked / Asus P3B-F / Voodoo3 3k / MX300 + YMF718

Reply 27 of 28, by Shagittarius

User metadata
Rank Oldbie
oeuvre wrote:

Try pairing an RTX 2080 with a Core 2 Quad Q9400.

Not exactly related, but here's what happens when you pair a low-end Celeron with a 2080 Ti:

https://www.3dmark.com/compare/spy/4867464/spy/4933837

Only a 3-10% difference graphically when the card is the bottleneck.

A different story when the CPU is the bottleneck:

https://www.3dmark.com/compare/spy/4856189/spy/4934089

Reply 28 of 28, by oeuvre

User metadata
Rank l33t

Interesting.

HP Z420 Workstation Intel Xeon E5-1620, 32GB, RADEON HD7850 2GB, SSD + HD, XP/7