Does the perfect computer exist?
Technology is always changing, i.e. getting more sophisticated and faster, and new standards are always being developed and implemented.
Some of this is actual innovation, and the rest of it is just companies trying to artificially render old hardware obsolete so that everyone has to buy new. It also never results in a faster UI experience, because the software gets slower at about the same rate that Moore's Law makes the hardware faster. In fact, DOS 6.22 booted up faster in 1994 than Windows 10 boots up today on a typical machine, just as an example.
I think one of my goals is to break out of this neverending and expensive cycle and only ever use one computer for the rest of my life. Hahaha. What a pipe dream, right?
Everything degrades and breaks over time, eventually.
I have this idea of a sort of minimalist computer, probably a laptop. Its feature set is clearly defined and set in stone. There is no planned obsolescence involved. The focus is on robustness, stability, simplicity of use, and longevity. The software is developed to satisfy the predetermined feature set, ruthlessly beta tested, and then code-frozen except for bugfixes and security updates. The user interface is simple and built with the highest respect for the end user. The hardware is sufficient for the tasks it is meant for, with no gimmicks that don't do anything useful, no strange proprietary ports, and no arrogant corporate illusions about being "innovative" when it's really just different for no real reason. It has a modular design, with each component discrete and easily replaceable (to accommodate external changes in technology, such as new Wi-Fi standards). Also, the battery lasts 24 hours, and the whole thing costs $250.
At least, if I owned a computer company, this is what I would tell the engineering team to make. Wow. Maybe I'll also find a million dollars in my closet.
I also think hardware has gotten so cheap and so fast nowadays, yet we are not taking as much advantage of it as we could because of software bloat. Suppose all I want my computer to do is word processing and web browsing, just like most consumers today who aren't gamers or developers. If the software were efficient enough, you could do that on hardware similar to what was offered 10 years ago, at much lower cost thanks to advances in manufacturing that have made everything cheaper.
There's also an issue with "development for its own sake": for example, Gmail changing its look every couple of years without changing functionality, just so the developers have something to do.
My view is that something like a mail client, or an operating system (on the same hardware, that is), has a theoretical state somewhere out there in which it is "mature": fully functional, free of bugs and security holes, doing what it's supposed to do and doing it well. Very little software ever gets to this point and STAYS THERE, because even once it gets there, the incentive to continue development doesn't go away. That leads to change for its own sake without real improvement. A good example is Microsoft Office. It's had the same core functionality, the features that everyone uses and very few people step outside of, since about Office '97. But MS has an incentive to keep releasing new versions whether the product needs them or not, just to dangle the shiny apple of something new in front of people and make them think the old version is somehow deficient. Obviously sometimes it is. But honestly, as a normal semi-developer user, I can use Office 2000 with no issues as my daily driver office suite, because I simply am not among the 3% or whatever it is of people who need the new version.
Anyway, rant over... just some things that have been going through my head lately. Any thoughts?