VOGONS



How over powered are modern PCs?


Reply 20 of 103, by Jasin Natael

User metadata
Rank Oldbie
clueless1 wrote on 2021-11-09, 01:05:

Even for modern gaming, GPUs are usually a much bigger bottleneck than CPUs. Most of the most demanding games will play just fine on a several generation old i7 with a current high end GPU. Whereas the opposite is not true -- demanding games will play bad on the newest, fastest CPU with a several generation old GPU. So in the gaming world, CPUs can be seen as overpowered compared to GPUs.

Yes, this is true. And it becomes more evident the higher the resolution people are playing at.
With marketing constantly pushing 4K ultrawide gaming monitors and TVs, the CPU at a certain point becomes almost irrelevant.
And with the current GPU market it isn't getting any better any time soon.

Even my own gaming rig has an overpowered CPU, an R9 3900X, but due to crazy pricing it is still paired with my old GTX 1080.
Quite an unbalanced system, but it wasn't done on purpose.

Reply 21 of 103, by DaveJustDave

User metadata
Rank Member

I have a 3900X too. I just can't bring myself to upgrade to a 5900X/5950X. What's 10-14% more performance going to do for me vs another $600-700 in my pocket?

I think I'll be sitting out the next gen too. Eventually the hardware gets crippled because someone somewhere exploited the CPU and new BIOS patches render it unusable. That's what happened to the last 6-core Intel setup I had after the whole Spectre fiasco.

And I think at that point you'll really just be upgrading to handle the bloat associated with a) preventing you from getting hacked while b) simultaneously tracking you and everything you do.

I have no clue what I'm doing! If you want to watch me fumble through all my retro projects, you can watch here: https://www.youtube.com/user/MrDavejustdave

Reply 22 of 103, by DracoNihil

User metadata
Rank Oldbie

Meanwhile, I'm just here with an Intel NUC6i7KYK and everything runs just fine, including old games that only want a 16-bit display mode (the dithering actually works rather than being absent entirely).

All this modern hardware with 128 threads and whatever doesn't apply to me, since most of what I play isn't even multicore aware, let alone multithread aware. And the few games that are don't make efficient use of multiple cores and threads.

The i7 this thing has is of the Skylake generation, with 4 cores and HyperThreading. As far as I'm aware, the only things I do on here that benefit from the "8 cores" are compiling software and libx264 video encoding. Not a whole lot of the games I play seem to benefit from it.

“I am the dragon without a name…”
― Κυνικός Δράκων

Reply 23 of 103, by Jasin Natael

User metadata
Rank Oldbie

Yes, totally.
I only have the 3900X because it was a gift. I had a Ryzen 7 1700 that I built pretty close to launch; last year I finally bit the bullet and upgraded to a 3700X, which I didn't really need but kind of wanted.
Then I was generously gifted the 3900X, so I gave my 1700 to my nephew, sold the 3700X, and added an RX 580 to my living room HTPC/gaming PC that I honestly can't remember ever using. That machine is overpowered as well for what I intended to use it for, but it is getting old too (i7-4770), and the RX 580 is still perfectly usable for the light gaming I wanted to do with it.

But I've got a full-time job that sees me working an absolute minimum of 40 hours, plus two kids under the age of 5, so time is a precious thing.

Reply 24 of 103, by The Serpent Rider

User metadata
Rank l33t++
clueless1 wrote:

Even for modern gaming, GPUs are usually a much bigger bottleneck than CPUs. Most of the most demanding games will play just fine on a several generation old i7 with a current high end GPU.

That's because we had a very anemic console generation, and the current generation with decent CPUs is still not the main target.
Although you have to remember that standards have changed over the past 10 years too. Now a PC is expected to pump out 100+ Hz refresh rates.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 25 of 103, by liqmat

User metadata
Rank l33t
BitWrangler wrote on 2021-11-09, 02:32:

Though it could also be said that the GPU progression is kinda pointless past about 2014, because all it pushes is resolution, and by the time you're sitting close enough to a 4k screen to tell the difference, you've only got a tiny circle of it in focus and the rest is in peripheral vision and may as well be quarter VGA res. 1080p was the high water mark, much past that is just for specification masturbators, it doesn't really get you anything.

Reminds me of those video game screenshot comparisons I've seen over the last few years. I'm convinced they just copied and pasted the exact same screenshot over and over. Whereas back in the 80s, 90s and even the early 00s you could actually tell the difference.

DaveJustDave wrote on 2021-11-09, 17:02:

I have a 3900X too. I just can't bring myself to upgrade to a 5900X/5950X. What's 10-14% more performance going to do for me vs another $600-700 in my pocket?

I think I'll be sitting out the next gen too. Eventually the hardware gets crippled because someone somewhere exploited the CPU and new BIOS patches render it unusable. That's what happened to the last 6-core Intel setup I had after the whole Spectre fiasco.

And I think at that point you'll really just be upgrading to handle the bloat associated with a) preventing you from getting hacked while b) simultaneously tracking you and everything you do.

I recently upgraded to a 5900X. The only thing so far that has efficiently used all 12 cores is a virus scan. 🤣

Reply 26 of 103, by Jasin Natael

User metadata
Rank Oldbie
The Serpent Rider wrote on 2021-11-09, 17:23:
clueless1 wrote:

Even for modern gaming, GPUs are usually a much bigger bottleneck than CPUs. Most of the most demanding games will play just fine on a several generation old i7 with a current high end GPU.

That's because we had a very anemic console generation, and the current generation with decent CPUs is still not the main target.
Although you have to remember that standards have changed over the past 10 years too. Now a PC is expected to pump out 100+ Hz refresh rates.

Pretty much.

And with the latest generation of consoles being the same hardware (with optimizations like dynamic resolution) that the games are being written for, these newer consoles are going to be expected to deliver those crazy high refresh rates as well.
Likely not yet to the degree that PCs are, but they are closing the gap.
With cheaper and cheaper 4K high-refresh-rate TVs hitting the market, all the kids think they have to have that crazy high resolution and refresh rate or they are somehow missing out on gameplay.
It's like the old megahertz adage: higher numbers will always sell, regardless of whether or not it makes any sense.

Reply 27 of 103, by mothergoose729

User metadata
Rank Oldbie
rmay635703 wrote on 2021-11-08, 23:45:

We are somewhere near 99% bloat.
And it's unlikely to get better as we continue down the rabbit hole of new code on top of antique pseudo-code that becomes redundant.

Speed is relative. I work with big, chunky, bloated frameworks, and when optimized correctly they feel as responsive as, or more responsive than, bare no-frills JavaScript. The bottlenecks also change. Network latency is often more impactful than render time, for example, so pushing more of the work down into the browser often makes a lot of sense.

They are also infinitely easier to manage for large projects, to integrate new browser features with, and to optimize for mobile devices. You optimize for the users you have.

Reply 28 of 103, by Jasin Natael

User metadata
Rank Oldbie
mothergoose729 wrote on 2021-11-09, 17:43:
rmay635703 wrote on 2021-11-08, 23:45:

We are somewhere near 99% bloat.
And it's unlikely to get better as we continue down the rabbit hole of new code on top of antique pseudo-code that becomes redundant.

Speed is relative. I work with big, chunky, bloated frameworks, and when optimized correctly they feel as responsive as, or more responsive than, bare no-frills JavaScript. The bottlenecks also change. Network latency is often more impactful than render time, for example, so pushing more of the work down into the browser often makes a lot of sense.

They are also infinitely easier to manage for large projects, to integrate new browser features with, and to optimize for mobile devices. You optimize for the users you have.

This is what I find interesting, and one big reason that I started the thread. I know next to nothing about code in general, even though I do work with some software engineers (I work as an IT analyst), and I find that we often approach solutions to the same problem from completely opposite directions. Which is always educational for me.

It is helpful to know the reasoning behind why the hardware guys sometimes see the code as "unoptimized" when in reality it is simply optimized in a different way than they would like. Optimized for a different user.

Reply 29 of 103, by DaveJustDave

User metadata
Rank Member

At this point, I think they should just optimize GPUs/CPUs for Unity, which is what everything is written in now.

Really the only thing pushing me to upgrade these days is frame rate. I have a 144Hz 38" G-SYNC ultrawide, and a 3900X + 3080 with DLSS gets me a pretty smooth experience in multiplayer FPS games. I'm more than willing to trade detail (which I don't notice AT ALL) for low latency/lag.

I have no clue what I'm doing! If you want to watch me fumble through all my retro projects, you can watch here: https://www.youtube.com/user/MrDavejustdave

Reply 30 of 103, by zapbuzz

User metadata
Rank Oldbie

PCs aren't really overpowered, but they seem so for about 5 years because they are under-supported. Every time new tech comes out, the software lags behind. For example, MMX II got better support in XP than in 2000, even though it was around during the 2000 support phase.

Reply 31 of 103, by Big Pink

User metadata
Rank Member
TheMobRules wrote on 2021-11-09, 03:51:

one must not underestimate the ability of modern ultra-bloated JavaScript to bring powerful computers to their knees

I ended up retiring my old Core 2 Quad when simply opening Twitter would freeze the system for ten seconds. A site designed around the almost utilitarian concept of displaying simple 140-character messages (SMS!) should be fairly lightweight.

How many lines of JavaScript are there?
Bout two million.

I thought IBM was born with the world

Reply 32 of 103, by clueless1

User metadata
Rank l33t
Jasin Natael wrote on 2021-11-09, 15:49:
clueless1 wrote on 2021-11-09, 01:05:

Even for modern gaming, GPUs are usually a much bigger bottleneck than CPUs. Most of the most demanding games will play just fine on a several generation old i7 with a current high end GPU. Whereas the opposite is not true -- demanding games will play bad on the newest, fastest CPU with a several generation old GPU. So in the gaming world, CPUs can be seen as overpowered compared to GPUs.

Yes, this is true. And it becomes more evident the higher the resolution people are playing at.
With marketing constantly pushing 4K ultrawide gaming monitors and TVs, the CPU at a certain point becomes almost irrelevant.
And with the current GPU market it isn't getting any better any time soon.

Even my own gaming rig has an overpowered CPU, an R9 3900X, but due to crazy pricing it is still paired with my old GTX 1080.
Quite an unbalanced system, but it wasn't done on purpose.

My son and I both have Haswell i7s. I guess this is now 8 generations old, yet our GPUs are still the bottleneck. He has an RTX 2060 KO and I have a GTX 1650 Super. Crazy.

The more I learn, the more I realize how much I don't know.
OPL3 FM vs. Roland MT-32 vs. General MIDI DOS Game Comparison
Let's benchmark our systems with cache disabled
DOS PCI Graphics Card Benchmarks

Reply 34 of 103, by Almoststew1990

User metadata
Rank Oldbie

I think the high end is much higher than it used to be. It used to be that the top end was maybe 500MHz faster than the mid-range, single-core CPU. Then dual-core CPUs came along and the high end was dual core, but you had a trade-off of slower clock speeds on the dual cores. Then the high end was quad core and dual core was low-to-mid, but with fairly similar clock speeds.

But now clock speeds are all 4GHz plus (and kind of hard to compare due to different types of turbo boost etc., not to mention Alder Lake's performance and efficiency cores), while at the low end you have quad cores and at the high end you have 16 cores!

However, people tend to do more with their high-end CPUs these days, like video editing and encoding, which I think happened less before YouTube, Twitch, etc.

I personally am quite happy navigating the web on a 3GHz Q6600 with 8GB of RAM and various gaming GPUs, but I know my Dad would somehow manage to make a system like that grind to a halt within 6 months.

Ryzen 3700X | 16GB 3600MHz RAM | AMD 6800XT | 2Tb NVME SSD | Windows 10
AMD DX2-80 | 16MB RAM | STB LIghtspeed 128 | AWE32 CT3910
I have a vacancy for a main Windows 98 PC

Reply 35 of 103, by antrad

User metadata
Rank Member
Big Pink wrote on 2021-11-09, 21:06:

I ended up retiring my old Core 2 Quad when simply opening Twitter would freeze the system for ten seconds. A site designed around the almost utilitarian concept of displaying simple 140-character messages (SMS!) should be fairly lightweight.

I have an old computer from 2005 (which I don't use anymore) and it now struggles to display even a Reddit page that is only simple text, for which a Pentium 1 should be enough. It was a perfectly usable computer until around 2013 and could manage much more complex websites and tasks, but even simple modern websites have become ridiculously demanding. And don't forget Internet browsers. The whole of Windows XP used 200-250 MB of RAM on that computer, but modern Firefox uses 900 MB as I write this on this simple VOGONS page.

https://antonior-software.blogspot.com

Reply 36 of 103, by gaffa2002

User metadata
Rank Member

In my view, we got past the "sweet spot" for personal computers a while ago. Even the most affordable computers (mobile devices included) can easily render a nice UI at a resolution high enough for our eyes to perceive as "great", and provide the same user experience as high-end devices 99% of the time.

And then we have capitalism... the companies that make those devices and their software are having a hard time convincing the end user to buy their new stuff. Putting very low "minimum specifications" is also part of the plan: they let you install brand new software on weaker systems just so you can see how badly it runs on your outdated machine, making you consider an upgrade.
To add insult to injury, the technology industry makes a lot of money from our data. So along with the usual software bloat, we also have the overhead of each action we take being associated with a user account and sent to some server for whatever reason (in other words, your brand new hardware is working against you 😁). And I totally agree with what other users said here... in such times of pandemic this is just disgusting.

Even if you are a PC gamer the situation is similar (even worse, maybe?): you basically need exponentially more powerful hardware for logarithmic gains. I remember seeing a video from NVIDIA a few days back explaining how much better 240Hz is compared to 60Hz, showing some examples in very slow motion to explain concepts like input delay, ghosting and screen tearing. That's all true, but they "forgot" to mention that you need 4x the computing power to run a game at 240fps to see such minimal gains.
Same thing with resolutions. The difference between 4K and 2K is not worth the 4x extra computing power required, much less the 16x extra computing power required for 8K (and no, I don't consider upscaling a justifiable reason for such screens).
Holy crap, if you want true 8K@240fps you need 64x more power! How does that make any sense?! To play a game at 2K@60fps you need one computer, but then you need to buy another 63 similar computers to play the same game, with the same graphical effects, at 8K@240fps.
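
To make that arithmetic explicit, here is a rough back-of-the-envelope sketch in Python, assuming rendering cost scales roughly linearly with pixels drawn per second and treating "2K" as 1920x1080 (the reading under which 4K works out to 4x):

# Back-of-the-envelope scaling: treat rendering cost as roughly
# proportional to pixels drawn per second (resolution x frame rate).
# Assumption: "2K" = 1920x1080, which is what makes 4K come out to 4x.

RESOLUTIONS = {
    "2K (1920x1080)": 1920 * 1080,
    "4K (3840x2160)": 3840 * 2160,
    "8K (7680x4320)": 7680 * 4320,
}

BASE = 1920 * 1080 * 60  # 2K @ 60 fps is the 1x reference point

for name, pixels in RESOLUTIONS.items():
    for fps in (60, 240):
        ratio = pixels * fps / BASE
        print(f"{name} @ {fps} fps: {ratio:.0f}x the pixel throughput")

Under those assumptions it prints 1x for 2K@60, 4x for 2K@240 or 4K@60, 16x for 4K@240 or 8K@60, and 64x for 8K@240, which is where the "another 63 computers" figure comes from.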

Last edited by gaffa2002 on 2021-11-10, 15:14. Edited 2 times in total.

LO-RES, HI-FUN

My DOS/ Win98 PC specs

EP-7KXA Motherboard
Athlon Thunderbird 750mhz
256Mb PC100 RAM
Geforce 4 MX440 64MB AGP (128 bit)
Sound Blaster AWE 64 CT4500 (ISA)
32GB HDD

Reply 37 of 103, by The Serpent Rider

User metadata
Rank l33t++

And then we have capitalism...

That's actually the reason we had CPU stagnation in the mainstream market from the first Core i7 release up to the first Ryzen release.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 38 of 103, by Jasin Natael

User metadata
Rank Oldbie
The Serpent Rider wrote on 2021-11-10, 15:12:

And then we have capitalism...

That's actually the reason we had CPU stagnation in the mainstream market from the first Core i7 release up to the first Ryzen release.

Yes, we had a monopoly.
Intel had zero need to innovate; they could keep puking out the same chip year after year with a 100MHz faster turbo speed and an "improved" iGPU and call it a generational leap.
Once Ryzen turned the tide for AMD, all of a sudden they had to actually try and innovate. It took them the better part of 5 years to actually do so.
We finally have some stiff competition again, and this should be great for the average consumer, but that remains to be seen.
On the GPU side of things it largely doesn't matter, as no one is selling anything for close to list price even if there is stock.
We are all screwed on that front, except for the companies making and selling the cards.

But hey, what are you going to do?

Reply 39 of 103, by gaffa2002

User metadata
Rank Member

Well, if we are talking strictly about consumer CPUs, then it's the other way around: in order to improve something you must have a need, a demand for improvement... CPUs nowadays do far more than the average user needs, and the only thing making people consume newer technology is capitalism, which keeps creating needs that didn't exist before (e.g. being able to connect my fridge to the internet, or using a complex AI to recognize and interpret my voice just to flip a light switch). So speaking of CPU/GPU speed evolution, capitalism is actually the thing keeping development in those areas from stagnating even more.
Now if we look outside the PC gaming niche and talk about technology in a broader view, then I agree that capitalism is holding technological evolution back. Everything invented nowadays, unfortunately, must be a product above all things, which means very little interest in progress on other fronts, like making technology more accessible or expanding existing infrastructure to provide connectivity to remote areas.

LO-RES, HI-FUN

My DOS/ Win98 PC specs

EP-7KXA Motherboard
Athlon Thunderbird 750mhz
256Mb PC100 RAM
Geforce 4 MX440 64MB AGP (128 bit)
Sound Blaster AWE 64 CT4500 (ISA)
32GB HDD