UCyborg wrote on 2025-09-13, 06:19:
Jo22 wrote on 2025-09-13, 05:32:
How is this difference being justified? 😟
Why did Raspbian of 2012 need so much less, while having the same feature set?
A crazy number of programmers think it's NORMAL for the CPU cycles used to scale with CPU advancements, for the same task! They're nuts!
Adding to that, old hardware is simply irrelevant for the most part outside of community visiting obscure forums such as this one.
It's not just about old hardware, I think, but a fundamental problem. 😟
I noticed the bloat back in the year 2000 already:
While a Windows 3.x program was 200 KB in size and ran quickly,
a similar Win32 application on 98SE was 2 MB in size and ran less snappily.
There's a saying for this phenomenon: "software is getting slower more rapidly than hardware is becoming faster."
https://en.wikipedia.org/wiki/Wirth%27s_law
If the Linux or open-source community really was advancing,
then there would be at least an attempt to break this cycle somehow.
Because otherwise, I'm afraid, the true capabilities of modern computers may never be utilized.
The ratio between bloat and fast hardware may always remain the same or similar.
Then there's the scalability problem.
Managing many processes and cores requires computing power in itself.
Just adding more cores, more threads and more memory doesn't fix the issue.
It rather increases the pressure on the operating system's scheduler and the virtual memory management.
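To put rough numbers on the "more cores doesn't fix it" point: Amdahl's law gives the best-case speedup when some fraction of the work stays serial. This is my own illustration of the principle, not something from the quoted posts, and real scheduler/coordination overhead makes the picture even worse than this simple model:

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Best-case speedup for a workload where only part is parallel."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Even with 95% of the work parallelizable, the speedup flattens out
# well below the core count; it can never exceed 1/0.05 = 20x.
for cores in (2, 8, 64, 1024):
    print(cores, round(amdahl_speedup(0.95, cores), 1))
# 2 -> 1.9x, 8 -> 5.9x, 64 -> 15.4x, 1024 -> 19.6x
```

So piling on cores mostly buys diminishing returns unless the software itself gets leaner and more parallel.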
PS: Then there's the design of Linux/Unix: everything is a file.
Seems fine at first, but there's a catch. Millions of (tiny) files put pressure on the filesystem and the mass storage devices.
If, for example, there's an SSD which must erase block-wise, then lots of read-modify-write cycles may happen.
It's not just about wrong alignment, but also about physical sector size.
In order to have things running smoothly,
the OS should work with sector sizes that match the internal page size used by the flash memory (4 KB, for example).
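A quick sketch of both points, assuming a Unix-like system (the filesystem path and the 4 KiB page size are just example values): you can query the filesystem's block size with Python's standard `os.statvfs`, and the write amplification from sub-page writes is simple arithmetic:

```python
import os

# Query the filesystem's preferred block size; ideally it matches the
# flash page size (commonly 4 KiB) so small writes don't force extra
# read-modify-write cycles inside the SSD controller.
st = os.statvfs("/")
print("filesystem block size:", st.f_bsize, "bytes")

# A 512-byte write to a drive with 4 KiB internal pages makes the
# controller read 4 KiB, patch in 512 bytes, and write 4 KiB back:
page_size, write_size = 4096, 512
amplification = page_size / write_size
print("worst-case write amplification:", amplification)  # 8.0
```

With millions of tiny files, that amplification (plus the per-file metadata) is exactly the kind of pressure described above.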
Edit: Speaking of Linux, it's kinda funny.
Every few years Linux users look back and say "how far we have come!".
Back in 2000, Linux users told how far it had advanced since the early '90s and that modern Linux couldn't be compared with it.
In the 2010s, the same was said about "early" Linux from 2000: "how far we have come!"
Now in 2025, the same thing is happening with Linux from ~2008 (when Ubuntu was rising).
In 2050 someone will surely say that now-current Linux was in its infancy and that it can't be compared to then-current Linux.
By 2050, Linux will have "moderate" resource requirements of 64 CPU cores, 16 TB of RAM and a 500 TB holo drive. In the name of "progress".
It's the same principle as farther, faster, higher, perhaps.
Or the fairy tale of unlimited economic growth (even if resources aren't endless).
People assume that there is no limit and act accordingly.
That explains why software is so bloated and unnecessarily complex.
"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel
//My video channel//