VOGONS



How overpowered are modern PCs?


Reply 40 of 103, by Jasin Natael

Rank Oldbie
gaffa2002 wrote on 2021-11-10, 18:40:

Well, if we are talking strictly about consumer CPUs, then it's the other way around: in order to improve something you must have a need, a demand for improvement... CPUs nowadays do far more than the average user needs, and the only thing that keeps people consuming newer technology is capitalism, which keeps creating needs that didn't exist before (e.g. being able to connect my fridge to the internet, or using a complex AI to recognize and interpret my voice to flip a simple light switch). So speaking of CPU/GPU speed evolution, capitalism is actually the thing keeping development in those areas from stagnating even more.
Now if we look outside the PC gaming niche and talk about technology in a broader view, then I agree that capitalism is holding technological evolution back. Everything invented nowadays, unfortunately, must be a product above all things, which means very little interest in evolving on other fronts, like making technology more accessible or expanding existing infrastructure to provide connectivity to remote areas.

Well, I agree with you, basically.
Capitalism is the driving force for many things, technology not the least of them. Medical advancements, manufacturing, infrastructure, investments, real estate, you name it.
Capitalism isn't by its nature a negative thing at all; I think some people have that idea, but that isn't the case.
The world wouldn't be anywhere near as developed as it is today without it. Of course some people and companies exploit it for their own ends, but that is going to happen. The nature of things.

Reply 41 of 103, by antrad

Rank Member
gaffa2002 wrote on 2021-11-10, 14:33:

Same thing with resolutions. The difference between 4k and 2k is not worth the 4x extra computing power required.

You also need to take into account the increase in screen size. Monitors used to be 15" and less, but now 24" is the minimum and people want 27" and more. TVs are also much larger than they used to be. If you want to use a bigger screen you need a higher resolution, and then 4K is worth it. I have a 24" monitor, and not long ago I tried a 32" 1080p TV as a monitor; it was a whole new level of immersion and I instantly fell in love, but with such a big screen so close the picture is a bit blurry and aliased, and you need a higher resolution.
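The blur antrad describes is really a pixel-density effect: a 32" panel at 1080p spreads the same pixels over far more glass than a 24" one. A quick sketch using the standard diagonal-based PPI formula (the sizes are the ones from the post):

```python
import math

def ppi(diagonal_inches, width_px, height_px):
    """Pixels per inch: pixel diagonal divided by physical diagonal."""
    return math.hypot(width_px, height_px) / diagonal_inches

print(round(ppi(24, 1920, 1080)))  # 92  -> typical desktop sharpness
print(round(ppi(32, 1920, 1080)))  # 69  -> noticeably soft at desk distance
print(round(ppi(32, 3840, 2160)))  # 138 -> what 4K buys on the same panel
```

So going from 24" to 32" without going 4K drops the density by about a quarter, which matches the "a bit blurry" impression.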

https://antonior-software.blogspot.com

Reply 42 of 103, by gaffa2002

Rank Member
antrad wrote on 2021-11-10, 19:28:
gaffa2002 wrote on 2021-11-10, 14:33:

Same thing with resolutions. The difference between 4k and 2k is not worth the 4x extra computing power required.

You also need to take into account the increase in screen size. Monitors used to be 15" and less, but now 24" is the minimum and people want 27" and more. TVs are also much larger than they used to be. If you want to use a bigger screen you need a higher resolution, and then 4K is worth it. I have a 24" monitor, and not long ago I tried a 32" 1080p TV as a monitor; it was a whole new level of immersion and I instantly fell in love, but with such a big screen so close the picture is a bit blurry and aliased, and you need a higher resolution.

I got your point, but I wasn't talking about the screens themselves; I was talking about the computing power required to keep up with them. In the example you gave, if you want to use a 32'' 4K screen as a computer monitor, you also need a much more powerful computer to render everything at 4K. That means a hefty investment just to get rid of "a bit of blur and aliasing". Some people may think it's worth the investment, but for the average user there is no point, and each resolution jump gets more expensive for smaller gains.
In addition, screen sizes are also reaching some kind of "sweet spot". I don't see people using 70'' PC monitors, because it would be extremely uncomfortable, nor people buying 1000'' TV screens, because house sizes do not grow at the same pace as TV screens.
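That "more expensive for smaller gains" point falls straight out of the pixel counts; a rough sketch, assuming render cost scales with pixels drawn (which ignores geometry and shader load):

```python
resolutions = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = 1920 * 1080  # 1080p as the reference load
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name:>5}: {px / 1e6:4.1f} MP, {px / base:.2f}x the pixels of 1080p")

# 4K is exactly 4.00x 1080p, while 1440p is only 1.78x, so each step up
# costs disproportionately more GPU for a subtler visual change.
```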

LO-RES, HI-FUN

My DOS/ Win98 PC specs

EP-7KXA motherboard
Athlon Thunderbird 750MHz
256MB PC100 RAM
GeForce4 MX 440 64MB AGP (128-bit)
Sound Blaster AWE64 CT4500 (ISA)
32GB HDD

Reply 43 of 103, by RaiderOfLostVoodoo

Rank Member
Almoststew1990 wrote on 2021-11-10, 07:37:

I personally am quite happy navigating the web on a 3GHz Q6600 with 8GB of RAM and various gaming GPUs, but I know my dad would somehow manage to make a system like that grind to a halt within 6 months.

That's the bare minimum I would suggest these days.
The biggest bottleneck would be the 8GB of RAM; for me, 8GB won't do it anymore. I have a tendency to keep a lot of tabs open, and when I still had 8GB, I had to close Firefox or games wouldn't even start for lack of RAM.

clueless1 wrote on 2021-11-09, 23:55:

My son and I both have Haswell i7s. I guess this is now 8 generations old, yet our GPUs are still the bottleneck. He has an RTX 2060 KO and I have a GTX 1650 Super. Crazy.

Yeah, I'm using Haswell as well, maxed out with 32GB @ 2400MHz now.
My GPU is a bit dated, but with the current prices for both new and used GPUs, I probably won't be upgrading it for a while. Damn kids with their crypto crap: investing in this waste of resources and then complaining about price scalping for GPUs. And the worst part: they don't even realize that they're causing this problem themselves. If nobody bought these shit coins, the Chinese wouldn't mine them and there wouldn't be a shortage.

I plan to upgrade to an i9-9900K(S) as soon as the kids start flocking to DDR5 and flooding the market with their old DDR4 rigs. It's the most powerful CPU that still has Win7 support. Nice to have the ability to go back.
But I won't use RGB lights. Do kids really like this, or is it just for posing? I prefer mono blue. I remember when Enermax released their T.B.Vegas fans and everyone thought it was ridiculous. How has this become the standard in just 10 years?

gaffa2002 wrote on 2021-11-10, 14:33:

In my view, we got past the "sweet spot" for personal computers a while ago. Even the most affordable computers (mobile devices included) can easily render a nice UI at a resolution high enough for our eyes to perceive as "great", and provide the same user experience as high-end devices 99% of the time.

In my opinion that sweet spot was 2nd-gen Core i. They overclocked so well that it took several generations to beat them; you can't go that high with Ivy Bridge or Haswell.
They've got all you need: USB 3.0, SATA 6Gb/s, and they're even backward compatible with Windows XP!

I recently built a rig for my parents, because they were constantly complaining that their old Athlon is too slow.
I already had a few i5 2500Ks, because I did some binning for my XP rig. I gave them my second-best sample, which is now running at 4.5GHz in the 70°C range. Slapped it onto an Asus P67, added 16GB @ 1600MHz (which I will probably upgrade to 32GB at some point), a Samsung 840 Pro 256GB, and a semi-passive 550W PSU with 80 PLUS Gold, in a Cooler Master CM690 II which I got for 15 bucks on Kleinanzeigen (Craigslist in Krautland).
Also slapped a GeForce GTX 670 into it, so my dad can play the sequels to his favorite game from the 90s. Doom (2016) runs fine at 2K resolution and ultra settings.
The whole rig cost a bit over 200€.

Last edited by RaiderOfLostVoodoo on 2021-11-10, 22:08. Edited 1 time in total.

Reply 44 of 103, by Jasin Natael

Rank Oldbie
Almoststew1990 wrote on 2021-11-10, 07:37:

I personally am quite happy navigating the web on a 3GHz Q6600 with 8GB of RAM and various gaming GPUs, but I know my dad would somehow manage to make a system like that grind to a halt within 6 months.

8GB is simply not enough these days.
My work machine pretty much eats 12GB constantly, with Outlook/Excel/Word/Teams, and always Chrome and Edge both open with multiple tabs.
If I throw in LogMeIn and BeyondTrust, it easily chews up over half my 16GB of RAM.
If I open PowerShell or something, it gets even worse.
I wouldn't say I'm really a "power user" either, but I pretty much use those apps all day, every day.
And Microsoft says you can "use" Windows 10 with a 1GHz CPU and 2GB of RAM...

Reply 45 of 103, by bestemor

Rank Oldbie
RaiderOfLostVoodoo wrote on 2021-11-10, 21:54:

I plan to upgrade to an i9-9900K(S) as soon as the kids start flocking to DDR5 and flooding the market with their old DDR4 rigs. It's the most powerful CPU that still has Win7 support. Nice to have the ability to go back.

What motherboard do you plan on combining that with, for Windows 7 I mean?

Reply 46 of 103, by antrad

Rank Member
RaiderOfLostVoodoo wrote on 2021-11-10, 21:54:

If nobody would buy these shit coins, the chinese wouldn't mine them and there wouldn't be a shortage.

You're out of date. The Chinese government cracked down on them, so now most mining is happening in the USA.

https://www.forbes.com/sites/roberthart/2021/ … sh=3d28738b6af7

The USA is now both printing money and mining Bitcoin out of thin air; who needs manufacturing anymore...


Reply 47 of 103, by pixelatedscraps

Rank Member

I would point out that this same question has been floating around repeatedly for the last ~30 years, and the answer remains the same: the everyday user doesn't need and has never needed much (word processing, email, Solitaire, and now Netflix, etc.), while the class-leading creative professionals in their respective industries will always push for and require more speed, capacity or throughput. As client demands and standards progress, so too does the computing power required to deliver.

I own a photography studio and we upgrade the 5 machines in our studio probably every 2-3 years; previously it was perhaps every 4-5 years. As the software providers we use switch to subscription models (Adobe / Capture One in particular), we find the relentless feature updates are either poorly coded (i.e. lazy code rather than streamlined code) or require ever-accelerating hardware upgrade cycles in order to run at the desired speeds...

Once you identify what people are doing with their machines (animation / motion graphics, 4K-8K video editing, commercial photography, competitive gaming, crypto mining, what have you), you realise what, and who, the bleeding edge is created for. For the average Netflix/YouTube/email user, a mid-range 2014 Mac or PC is probably still enough.

My ultimate dual 440LX / Voodoo2 SLI build

Test bench: Asus P3B-F | 1.3GHz Tualeron w/ Powerleap | Geforce 2 Ti500 | SB Live! 5.1 CT4760

Reply 48 of 103, by zyzzle

Rank Member
rmay635703 wrote on 2021-11-08, 23:45:

We are somewhere near 99% bloat
And it's unlikely to get better as we continue down the rabbit hole of new code on top of antique pseudo-code that becomes redundant.

Yes, this is the key. Not that modern CPUs are overpowered (they are, indeed), but that modern bloat is "over"-bloated! It's gotten so absurd and ridiculous that I almost refuse to run any application, game, or software written later than about 2010 or so, sometimes even 2005.

The travesty is that NO money or time is currently spent on any kind of optimization. Even 10% of development time going toward code optimization would reduce modern bloat GREATLY, probably by at least 50% and perhaps as much as 80%. Less reliance on DLLs, libraries, the .NET Framework, "packaged code", and bloated, inefficient JavaScript, and even using better compression techniques (throw out .ZIP, make large-dictionary LZMA (.7z) the standard), would pay GREAT dividends.
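The large-dictionary point is easy to try with Python's standard library. A minimal, synthetic sketch: a 64 KB block that repeats after 48 KB of unrelated data lies outside DEFLATE's 32 KB window but well inside LZMA's much larger dictionary (real-world ratios depend entirely on the input):

```python
import os
import zlib
import lzma

block = os.urandom(64 * 1024)    # incompressible 64 KB block
filler = os.urandom(48 * 1024)   # 48 KB of unrelated data in between
data = block + filler + block    # the repeat is 112 KB back

deflated = zlib.compress(data, level=9)  # DEFLATE, as used by .zip
packed = lzma.compress(data, preset=9)   # LZMA2, as used by .7z/.xz

# DEFLATE cannot see the repeated block (it is outside the 32 KB window);
# LZMA encodes it as a cheap back-reference.
print(f"raw {len(data)}, zlib {len(deflated)}, lzma {len(packed)}")
assert len(packed) < len(deflated)
```

7-Zip's advantage on large inputs comes largely from this dictionary size, not from magic.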

But no one cares any more, because modern CPUs are overpowered. It's a vicious, disastrous cycle which feeds upon itself, and in the long run hurts everyone and degrades software in general.

Reply 49 of 103, by The Serpent Rider

Rank l33t

8GB is simply not enough these days.

You can improve that by putting the pagefile on an NVMe SSD, Intel Optane, or an LSI WarpDrive. The latter even has official XP drivers.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 50 of 103, by 386SX

Rank l33t

I'd think it's not only an optimization problem but also the GUI design itself. In the past, even productivity or professional applications had quite smart GUIs with zero effects and many options sorted in smart, memorable ways, maybe because the OS GUI had that layout (Win95-style apps, for example); they were really clean, with high-contrast, well-designed buttons, menus etc., and they were light and fast. Nowadays I see most apps (besides needing heavy dependencies/libraries to work) with absurd, cheap-looking GUIs where it's even difficult to understand how to open a file. And this problem shows up in almost any OS GUI, even some Linux distributions, where many apps have the same modern design, maybe to appear futuristic, but in the end heavy and difficult to understand.
Similar GUIs in the 2000s, imho, would have been considered not professional at all, back when apps either used the classic Win 9x/2000 style or tried some new proprietary idea. Some, like Winamp-style interfaces, had success, but others did not.

Last edited by 386SX on 2021-11-12, 18:16. Edited 4 times in total.

Reply 51 of 103, by gaffa2002

Rank Member
zyzzle wrote on 2021-11-12, 00:22:
rmay635703 wrote on 2021-11-08, 23:45:

We are somewhere near 99% bloat
And it's unlikely to get better as we continue down the rabbit hole of new code on top of antique pseudo-code that becomes redundant.

Yes, this is the key. Not that modern CPUs are overpowered (they are, indeed), but that modern bloat is "over"-bloated! It's gotten so absurd and ridiculous that I almost refuse to run any application, game, or software written later than about 2010 or so, sometimes even 2005.

The travesty is that NO money or time is currently spent on any kind of optimization. Even 10% of development time going toward code optimization would reduce modern bloat GREATLY, probably by at least 50% and perhaps as much as 80%. Less reliance on DLLs, libraries, the .NET Framework, "packaged code", and bloated, inefficient JavaScript, and even using better compression techniques (throw out .ZIP, make large-dictionary LZMA (.7z) the standard), would pay GREAT dividends.

But no one cares any more, because modern CPUs are overpowered. It's a vicious, disastrous cycle which feeds upon itself, and in the long run hurts everyone and degrades software in general.

Sad but true 🙁.
Back then we upgraded to gain new or improved functionality; today we upgrade to avoid losing functionality we already have 🙁.

This seems to be a problem with the IT industry in general... all this modularizing and bloating and recursive crap not only justifies overpowered machines, it also makes IT professionals easier to find and replace.
It's a bit sad to see IT professionals in that never-ending cycle of learning "new" stuff (which 99% of the time is just existing concepts with a cool name) because this is what the industry asks for, with no time to focus on learning the fundamentals.
I feel that a lot of younger IT guys are slaves to current trends because they lack the fundamentals; it's not their fault, it's just the way the industry likes them to be.


Reply 52 of 103, by Caluser2000

Rank l33t

I predict MS Windows 12 will bring back 3D window buttons...

And a new startup tune.

There's a glitch in the matrix.
A founding member of the 286 appreciation society.
Apparently 32-bit is dead and nobody likes P4s.
Of course, as always, I'm open to correction...😉

Reply 54 of 103, by 386SX

Rank l33t

About video editing: when even phones seem to need 4K resolution just because some new TVs are 4K, I wonder what the benefit is of such resolutions becoming "a standard" on camera sensors as well. A 1080p video from a reflex-camera-sized sensor gives a huge improvement in per-pixel quality and far less demanding encoding work for the CPU/GPU. With HD 720p/1080i TVs, I find those lower resolutions are hardly a problem. Imho, this market acceleration in resolutions, when hardware encoders/decoders/3D accelerators aren't even common, speaks for itself. Many DVD or Blu-ray players might not even be compatible with H.264 High Profile decoding, not to mention H.265, VP9 or AV1, which are already in use; consider how heavy web browsers end up being if, in addition to JavaScript, they have to software-decode such CPU-demanding codecs.

Reply 55 of 103, by Unknown_K

Rank Oldbie
MAZter wrote on 2021-11-12, 18:06:

Try editing video on a slow computer and the question will answer itself; today all users are uploading video to YouTube, and this is the average user.

Average teen maybe.

Collector of old computers, hardware, and software

Reply 56 of 103, by Jasin Natael

Rank Oldbie
The Serpent Rider wrote on 2021-11-12, 02:36:

8GB is simply not enough these days.

You can improve that by putting the pagefile on an NVMe SSD, Intel Optane, or an LSI WarpDrive. The latter even has official XP drivers.

Well, I suppose I could do that. But here's the thing... number one, it's my work PC and I don't get to pick and choose what hardware to upgrade.
I was issued the bog-standard Dell OptiPlex with the bog-standard i5 CPU and had to fight to even get the 8GB RAM upgrade that I did get.
Secondly, while I did eventually get to upgrade from the standard 500GB spinning rust to a 240GB Samsung SATA SSD, this was only after the PC's 36-month warranty was past and my company was willing to pony up to replace the failing HDD.
This machine has no NVMe support. I'm not even sure it has an M.2 slot of any kind, but if it does, it's merely SATA III and offers me no speed benefit.
Also, there are no additional SATA ports on this thing to add a second drive, not without removing the optical drive anyway.
And while the 7th-gen platform does support Optane (I think, anyway), this particular OptiPlex does not.

In short, if this were my PC, those options would be viable.
But it's almost ALWAYS going to be easier AND cheaper to just upgrade or increase the amount of RAM.
Optane/StoreMI was an interesting idea, but kind of pointless as flash prices rapidly came down. Outside the enterprise space, mechanical drives have very little point anymore. I use them every day in NAS/SAN and RAID applications, but most desktop/workstation users are better off just buying faster storage and more RAM, in my humble opinion.

Reply 57 of 103, by Jasin Natael

Rank Oldbie
Caluser2000 wrote on 2021-11-12, 18:03:

I predict MS Windows 12 will bring back 3d window buttons.....

And a new startup tune.

We have come full circle from 2001.
Windows 11 looks like Windows XP had a bastard child with Windows 10.

Reply 58 of 103, by Jasin Natael

Rank Oldbie
Unknown_K wrote on 2021-11-12, 19:05:
MAZter wrote on 2021-11-12, 18:06:

Try editing video on a slow computer and the question will answer itself; today all users are uploading video to YouTube, and this is the average user.

Average teen maybe.

Not really. Many people from all age groups use YouTube as their primary platform.
I think you would be surprised.

Reply 59 of 103, by Jasin Natael

Rank Oldbie
gaffa2002 wrote on 2021-11-12, 13:35:
zyzzle wrote on 2021-11-12, 00:22:
rmay635703 wrote on 2021-11-08, 23:45:

We are somewhere near 99% bloat
And it's unlikely to get better as we continue down the rabbit hole of new code on top of antique pseudo-code that becomes redundant.

Yes, this is the key. Not that modern CPUs are overpowered (they are, indeed), but that modern bloat is "over"-bloated! It's gotten so absurd and ridiculous that I almost refuse to run any application, game, or software written later than about 2010 or so, sometimes even 2005.

The travesty is that NO money or time is currently spent on any kind of optimization. Even 10% of development time going toward code optimization would reduce modern bloat GREATLY, probably by at least 50% and perhaps as much as 80%. Less reliance on DLLs, libraries, the .NET Framework, "packaged code", and bloated, inefficient JavaScript, and even using better compression techniques (throw out .ZIP, make large-dictionary LZMA (.7z) the standard), would pay GREAT dividends.

But no one cares any more, because modern CPUs are overpowered. It's a vicious, disastrous cycle which feeds upon itself, and in the long run hurts everyone and degrades software in general.

Sad but true 🙁.
Back then we upgraded to gain new or improved functionality; today we upgrade to avoid losing functionality we already have 🙁.

This seems to be a problem with the IT industry in general... all this modularizing and bloating and recursive crap not only justifies overpowered machines, it also makes IT professionals easier to find and replace.
It's a bit sad to see IT professionals in that never-ending cycle of learning "new" stuff (which 99% of the time is just existing concepts with a cool name) because this is what the industry asks for, with no time to focus on learning the fundamentals.
I feel that a lot of younger IT guys are slaves to current trends because they lack the fundamentals; it's not their fault, it's just the way the industry likes them to be.

Forced obsolescence is in full swing. As I type this, my company has about 40 machines heading to the recycler.
There are 5-6 Dell servers with Bloomington-based quad- and hexa-core CPUs, all intact minus the drives.
Most of the workstations are i5s, but there are a handful of i3s as well, all 3rd and 4th gen for the most part. Maybe one or two 2nd-gen systems.
All of them are perfectly "capable" machines with legit Windows 7 or 8 licenses, and while we did strip the 4GB RAM modules, by and large they still have 2GB sticks in them.
They aren't "officially" supported, and none of them will upgrade to 11 when the time comes. So off to the recycler they go.

A pretty regular occurrence. Perfectly capable hardware, if a bit long in the tooth.
We are a throwaway society. At least these machines will be responsibly recycled.