Ozzuneoj wrote on 2025-12-23, 04:34:
Overall though, things have stagnated quite a bit, yes. I think we can blame that on the fact that nearly everything the average non-gamer needs to accomplish on a computer could be done plenty fast enough on a Windows XP computer with an SSD, a couple gigs of RAM and a Core 2 Duo... if not for outdated security updates and web browser support. That has taken a lot of the urgency out of advancing the speed of general computing.
Indeed. Though I personally think the smartphone "revolution" fueled a big portion of this stagnation too.
Just think about how many people out there don't even have a stand-alone computer at home anymore (I'm talking about laptops and desktops here). To many, a phone - really the most common portable form of a computer these days - is good enough for everything they need. The tech industry's refocus on smartphones and tablets probably slowed the development of PC hardware, or at least the demand for it.

Also, due to the small size and limited battery power of phones and tablets, their hardware is a lot more limited in performance compared to a "regular" computer (be it desktop or laptop). As such, the need for more optimized / less bloated software re-emerged for many software devs - not just for desktop "apps", but in the sphere of web development too. (On the other hand, don't get me started on how everything on the 'net has become so vertical-view-oriented - ugh!)

So a good deal of the stagnation / slowdown in software bloat from the late 2000s onwards is probably directly linked to the popularity of smartphones... and that's probably why PC hardware from that era (~ Core 2 Duo) can still be usable today. As you said, if it weren't for the lack of security updates and abandoned OS support, Win XP and 7 could still handle 90% of today's office tasks, if not more.
That said, I can understand why software companies can't keep supporting their old products forever - after all, they need to sustain themselves financially. Relying solely on tech-support revenue from an old product would likely force a company to downsize aggressively once the product enters its "support/updates" phase. On the other hand, I think we can all agree it's very -wrong- when a software company forcefully obsoletes its old software purely to remove competition with its current product and rake in larger profits.

Ideally, software companies should adjust a product's lifecycle to match market demand. For Windows, I think we can safely say that's now around the 10-year mark, seeing how XP and 7 (and even W10!) turned out. That would also allow a longer lifecycle for PC hardware... which would probably cause some stagnation in the development of new hardware tech. But on the other hand, imagine if your current rig were "just fine" for the next 10+ years, with no need for upgrading whatsoever (even for the latest AAA games). Wouldn't that be nice? Not only would it be more budget-friendly for the consumer, it would certainly be better for the environment too. It even has the potential to be better for the tech companies, as it would mean slow but stable growth.

Of course, having said all that, it's probably quite obvious why tech companies aren't doing this. After all, no one wants small profits over a long period of time. Large profits over a short period - yeah baby, hit me up!
And that pretty much sums up why we are where we are... with AI, RAM prices, and all that jazz.
luckybob wrote on 2025-12-22, 18:49:
it will collapse. The system will /eventually/ self-correct.
That's what people said about cryptocurrency many times... and yet, here we are.
As such, I don't think this AI bubble will burst either. *Maybe* after a few years (or hopefully less) it will just level out. But who knows?!
If anyone wants to do anything about it, start by not "feeding the monkeys/trolls" anymore.
Or to put it more directly: avoid using AI or any products that heavily rely on it. I know that might seem almost impossible these days, given how integrated it's become. But it's not impossible - it just takes a lot of adjustments to one's everyday technology habits.
Hoping wrote on 2025-12-22, 12:42:
Nowadays, 64 GB is considered a lot, but in my opinion, that's ridiculous. 64 GB should be considered the minimum today, 128 GB should be acceptable, 256 GB should be standard, and 512 GB should be for enthusiasts.
Why?! So that we can have even more sloppy/lazy software taking tens of gigabytes to run the same basic crap that the old version could?
In the same sense, should we keep adding more wheels to cars just because technology has made them cheaper than they were in the early automobile era (roughly the equivalent of the '80s and '90s in computing)?
Perhaps the more direct question is: at what point is "good enough" finally good enough?
Yes, if we do stop at "good enough", that would lead to stagnation in the development of new and better technologies.
... and imagine if we had stopped at 640 KB of memory (which someone famously said "ought to be enough for anybody"). On the other hand, did we really need all of this excess we have today? And is anyone even asking what price we had to pay for it?
Hoping wrote on 2025-12-22, 12:42:
It is normal for such a strong software advance as AI to suffer from this stagnation.
I wouldn't call LLMs (what everyone refers to as "AI") a software advance anymore.
It _had_ a lot of potential to be a great and very useful tool. Instead, it fell too quickly into the hands of the masses and greedy companies... and now, most of its use is a waste of energy and resources.