VOGONS


First post, by songo

Rank: Newbie

We live in an era where computers and high tech in general are ubiquitous - yet so many people behave like illiterate ignoramuses. There is still social pressure to buy newer hardware without any real reason.

Back in the day, when PCs were expensive, hardware was a luxury item and you were forced to squeeze every bit out of it. I spent almost a decade on a Pentium 200 MMX / 32 MB RAM and guess what? I never really had any urge to upgrade besides a 3dfx card, because it was still functional as a modern office / multimedia device even in 2007.

- Gaming? I NEVER ran out of games. Even on newer hardware I still played games that were able to run on my old rig; hell, there were so many titles for which even a P200 was overkill! And there are plenty I have never touched yet.

- Listening to music? Check
- Watching movies? OK, the quality is rather unimpressive, but it works!
- Printing? Check
- Graphics? Check

And so on, and on. Now I get upset when some 'tech' YouTube celebrities whine that their new smartphone is weak for having ONLY 4 GB of RAM, or that some budget i7 CANNOT handle gaming. WTF? I know it's a kind of mental shortcut for 'that stuff cannot run modern games', but it's annoying as hell anyway.

Or calling a high-end Pentium 4 a 'typewriter'.

Seriously, I cannot understand why the average Joe should replace his PC more frequently than once a decade.

Oh, and this mentality that treats computers like horses, getting old and losing their strength as time passes. Those PCs didn't lose their functionality; just use software dedicated to them, or install Linux, and you are good to go.

And don't even get me started on all this talk about 'weak' consoles... Maybe my old P200 left me stuck in some mental limbo, but I refuse to call something like the Xbox 360 or PlayStation 3 'poor' or 'outdated' hardware.

Does anyone share a similar feeling?

Reply 1 of 97, by Repo Man11

Rank: Oldbie

I agree that there are many, many people out there who would be perfectly happy with a new hard drive and a reinstall of their OS instead of buying some new Thunderbolt Grease-Slapper PC. But marketing types will have their way.

"I'd rather be rich than stupid" - Jack Handey

Reply 2 of 97, by Doornkaat

Rank: l33t

SSDs changed the game quite a bit. A fifteen-year-old PC can still feel snappy for multimedia, web browsing, many desktop applications, and older or less demanding games. And if you keep doing the same things with the same software, PCs do not become slower.
If you need newer features, however, like hardware encoding/decoding for new codecs, an upgrade may be due, because your PC may lack the computing power to do that in software.

Also, most tech YouTubers should be considered to sit somewhere on the entertainment/infotainment/advertisement spectrum. It's in the nature of their business to exaggerate the speed differences between cutting-edge and mature tech. Their contributions aren't meant to be objective; they're meant to produce clicks. If you tell your viewers that most users won't notice the technical improvements of the latest smartphones over the last couple of generations, those viewers will often rather watch somebody who emphasises the differences. That is bad for business.

Reply 3 of 97, by user33331

Rank: Member

Some games absolutely need to be photorealistic, and virtual reality (VR) needs to be photorealistic too.
Modern work involving virtual reality needs very heavy-duty, powerful machines.

I have wanted, and waited, since 1980 for games to start looking as real as possible, and that needs constant development in GPU and CPU hardware.

If computer hardware stopped developing, we would be stuck in a horrible limbo, similar to the Dark Ages.
That would mean horrible times.

Reply 4 of 97, by luckybob

Rank: l33t

[Attachment: 297.jpg]

songo wrote:

Seriously, I cannot understand why the average Joe should replace his PC more frequently than once a decade.

The biggest reason is gaming. I could run Cyberpunk 2077 on a 2011-era machine; it ran okay-ish. I recently went balls deep into a new system and played the same game again. The difference was mind-boggling.

The second reason is compatibility. I had to fight with the old system regularly to keep it plodding along. MY TIME IS WORTH SOMETHING. Not having to fight with old drivers saves me an hour or two a week. The hacked NVMe drive added another hour. But a new system, where all the parts are designed to work together, with regular support? That's frustration I no longer have in my life.

Using Linux to get more life out of an older machine is NOT something for the 'average Joe'. Hell, I'd argue it's not for 9/10 of the computer literate. I'd go so far as to say using Linux outside of the server environment is for masochists. Linux people are the vegans of the computer world. Congratulations, you can compile your own kernel. Nobody outside your particular circle-jerk gives a shit.

Third, what if hardware just dies? SSDs have a limited lifespan. The LEDs in monitors fade and die with age. I've literally worn out keyboards and mice. Fans die, causing overheating and premature failure. Hell, I've seen people who have never cleaned the inside of their computers, and it looked like someone had emptied a vacuum cleaner into the case.

Software changes MUCH too quickly. Look at emerging AI software and the subsequent hardware acceleration. Where was that five years ago? Granted, it's not ubiquitous yet, but a real-world example is video card upscaling: AI is used to train the upscale from 1080p to 1440p. AI is used to learn your voice for those voice assistant things. On the flip side, what if a technology just doesn't work out? One could argue VR hasn't really gotten far off the ground, though it's certainly a light year ahead of the 3D glasses of the 2000s.

I do, however, agree that people should not be expected to upgrade just for the sake of it. And I agree with your feelings about getting every scrap of use out of something. That said, the common rabble want more. They want their TikTok videos to be instant and in 4K resolution. American companies are right there to profit from it.

It is a mistake to think you can solve any major problems just with potatoes.

Reply 5 of 97, by user33331

Rank: Member

I think in the early 2000s there was some talk about even planet Earth expiring, and that people should start thinking about finding and colonizing planets with conditions similar to Earth's.

That should be the ultimate goal: develop that tech as fast as possible to make sure we can build spacecraft that can reach those far corners of space.

Space is like the old Wild West of the USA, a real unknown final frontier.
So sad that the USA stopped manned space launches in 2011.

At least North Korea understands the need for space rocket development.

So many valuable elementary substances are waiting in space: metals and minerals.

Reply 6 of 97, by Deksor

Rank: l33t
luckybob wrote on 2022-03-25, 07:57:

Linux people are the vegans of the computer world. Congratulations, you can compile your own kernel. Nobody outside your particular circle-jerk gives a shit.

While I agree with everything else you've said, I don't agree with this.

There are definitely people who get annoying about this, but the primary reason (at least for me) is just having the choice of what I want in my computer.
Did I want ads in my system? Did I want a different GUI because Microsoft rushed their new Windows and tried to make it look "more appealing"? Did I want to retire all my computers without TPM by 2025, when Windows 10 becomes unsupported?

The list is very long, and many complain about this because sometimes we can't even do anything about it.

To me there's no pride or shame to be had for the OS you use. It's just part of a very complex tool called a computer.

I also think that Linux is dominant not just because it's Unix-based, but because it's the most advanced/polished open-source OS project and the community around it is huge.
If you want a free, open-source Windows you can use ReactOS, but I'm not sure it's really usable yet (at least for what I'd want to do).

Yes, the community has annoying people, but as with every community, it's the loudest minority that causes issues.

Trying to identify old hardware? Visit The Retro Web - Project's thread The Retro Web project - a stason.org/TH99 alternative

Reply 7 of 97, by Joseph_Joestar

Rank: l33t
luckybob wrote on 2022-03-25, 07:57:

Using Linux to get more life out of an older machine is NOT something for the 'average Joe'. Hell, I'd argue it's not for 9/10 of the computer literate. I'd go so far as to say using Linux outside of the server environment is for masochists.

I would agree that Linux in its current state is not exactly appealing to the masses as a desktop environment. At the same time, I think that a tech savvy person can get used to it fairly quickly, if needed.

I was using Linux as my secondary OS on and off for about a decade before switching to it permanently in 2019. Windows 7 support was ending, and I didn't like the privacy aspects and the constant feature updates of Windows 10, so I skipped it.

However, my use case is such that I rarely play modern games, and when I do, it's on consoles. For people who primarily game on their PC, Linux probably isn't the best solution at this time. We'll see if the emergence of the Steam Deck improves that.

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Athlon64 3400+ / Asus K8V-MX / 5900XT / Audigy2
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi

Reply 8 of 97, by Jo22

Rank: l33t++
user33331 wrote on 2022-03-25, 08:13:

I think in the early 2000s there was some talk about even planet Earth expiring, and that people should start thinking about finding and colonizing planets with conditions similar to Earth's.

That should be the ultimate goal: develop that tech as fast as possible to make sure we can build spacecraft that can reach those far corners of space.

Yes, I agree and feel the same.
That would actually be a proper use of technology, imho.

I once read on the web that the space station crew once (jokingly?) wished for a big flat screen, so they'd have a proper "bridge" viewscreen like Captain Kirk had. :)

The irony, though, is that space travel doesn't really need sophisticated computers.
A C64, an Apple II, or a VT100 terminal with a CP/M card would totally suffice.

Space travel is not so much more difficult than sailing... once you're in space.
In essence, it's all about navigation and resource management.

Other aspects can be controlled by independent logic circuits that don't necessarily need to be digital or microprocessor-controlled.

Old space probes didn't even have i4004 microprocessors yet.
They used custom processor boards based on discrete digital logic (Mariner, Pioneer, Voyager).

Or they were mostly analogue (the early Luna/Lunik probes) and sent telemetry using FM transmissions that may have contained information divided into "channels" (logical channels rather than RF channels), provided by a VCO and a mechanical commutator.
Anyway, I'm just a layman here (that stuff is described in Karamonolis' OSCAR book).

https://en.wikipedia.org/wiki/Voltage-controlled_oscillator

https://en.wikipedia.org/wiki/Commutator_(electric)

user33331 wrote on 2022-03-25, 08:13:

Space is like the old Wild West of the USA, a real unknown final frontier.
So sad that the USA stopped manned space launches in 2011.

+1

I watched it on TV when the last STS "flew"... into the museum.
It was a heartbreaking experience, indeed. And I'm not even from the US.

The last time I felt like this was when Mir de-orbited in 2001, almost exactly 10 years earlier. 😔

Attachments: kommutator.jpg

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 9 of 97, by brian105

Rank: Member
songo wrote on 2022-03-25, 05:07:
We live in the era where computers and high-tech overall are ubiquitous - yet so many people behaves like some illiterate ignora […]

I think you're way too biased by old hardware. Go take an Ivy Bridge computer, not upgraded in any way, and put Windows 10 on it, since 7 is dead for updates and 8.1's demise is right around the corner. It's not pretty without at least an SSD upgrade. That 10-year-old hardware won't take you very far, and even a newer low-end PC would absolutely annihilate it. Most consumers don't even know how to swap a hard drive for an SSD, so already there's more incentive for them to buy newer hardware.

And Linux? Nobody sane wants to touch that with a 10-foot pole. For the people who know how to keep the beast alive without it keeling over because a kernel upgrade decided to implode, it's a decent experience, but still inferior in many ways to Windows, which most consumers use. (And they won't learn Linux, because why would they need to?)

Also, the Xbox 360/PS3 are definitely outdated. They might not seem outdated to people who exclusively play retro games on a Voodoo 3, but that stuff looks BAD compared to anything remotely new (i.e. from the last decade). I would see your argument if you had mentioned the PS4 or Xbox One, as the visual improvements between those and the PS5/Xbox One X Box Series X Box Box X (they have good names) were minimal, but those consoles from the mid-2000s are stuck in the interlaced era.

Presario 5284: K6-2+ 550 ACZ @ 600 2v, 256MB PC133, GeForce4 MX 440SE 64MB, MVP3, Maxtor SATA/150 PCI card, 16GB Sandisk U100 SATA SSD
2007 Desktop: Athlon 64 X2 6000+, Asus M2v-MX SE, Foxconn 7950GT 512mb, 4GB DDR2 800, Audigy 2 ZS, WinME/XP

Reply 10 of 97, by Jo22

Rank: l33t++

^To be fair, that's why things like the ECDL exist.
In the 21st century, digital technology is everywhere - and not just since yesterday, but since the 70s, 50 years ago. It's time people finally lifted their bottoms and started learning about it. I mean, internet access is a human right by now. Basic digital knowledge is mandatory. Period.
People should be required to pass a basic test to be able to purchase a new PC.
Just like there are car driving licenses, ham radio licenses, ship radio licenses, licenses for cooks, taxi drivers, etc.
Being ignorant is a luxury that only people in first-world countries can afford.
People in other parts of the global village won't survive long without such knowledge.

https://en.wikipedia.org/wiki/European_Comput … Driving_Licence

Last edited by Jo22 on 2022-03-25, 10:36. Edited 1 time in total.

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 11 of 97, by chiveicrook

Rank: Newbie

songo wrote:

Seriously, I cannot understand why the average Joe should replace his PC more frequently than once a decade.

Influencers/youtubers/hardcore gamers aside, that's exactly what's happening among reasonable average Joes. Surprisingly little has changed for the average Joe over the last 10 years and, at least in my community, barely anyone upgraded during that time.

To put it in my personal perspective:
Back in the 90s tech moved extremely quickly, but one could still keep old hardware for quite some time. My Pentium 90 system was enough for me from 1994 to 1999. By 1999, however, it was extremely inadequate for anything but office work.
That's 5 years between upgrades.

My P3-450 system from 1999 lasted until 2006 with minimal upgrades.
That's 7 years.

My P4 system from 2006 lasted only until 2010. The dual- and quad-core revolution changed so much that tech started to move at an early-90s pace again, and the system quickly became cumbersome.

I had to buy a laptop in 2010, and that lasted only until 2015 (due to the hardware simply giving up), but many of my friends built Westmere- or Sandy Bridge-based PCs, and some of them are still using those without issue. Systems from the Sandy Bridge era absolutely fly under Windows 10 after an SSD upgrade (and they tend to work better with Win10 than with Win7 even without one). Only now are they starting to upgrade, because of the multi-core revolution happening now (which is reminiscent of the first Intel Core revolution).
That's 10-12 years between upgrades.
One could argue that the tech progress the average Joe could see and use almost stagnated between 2010 and 2020.
I suspect that people who buy PCs now won't upgrade for quite some time.

To contrast with all of the above, there is also one new type of consumer in the world: the fashionable, trend-chasing moron.
They buy the newest iPhones every year not because they need to but because they want to. They buy random laptops and discard them the moment they start to run "slow" (usually because of the user's stupidity). Their whole life revolves around consumption and "living in the now". They are an example of a dangerous trend in modern society: the progressive stupidification of the masses. Instead of colonizing Mars, we are arguing about whether the Earth is, in fact, flat.

So in essence:

songo wrote:

Does anyone share a similar feeling?

Yes.
I guess I'm getting old and grumpy.

Reply 12 of 97, by spiroyster

Rank: Oldbie

You can cross the Atlantic in a canoe with a sextant, powered by your arms and guided by the stars... or you could hop on something with thousands more horsepower, heating, far more detailed knowledge of your surroundings beyond the visible eye, and less likelihood of being toppled by a rogue wave - increasing your probability of survival, staying warm, and getting there quicker, giving you more time on holiday, or whatever it is you are doing at your destination.

You can commute to work in a horse and cart, which would be great at first, but the novelty would soon wane, and if a rainy day doesn't put you off, not being able to spend as much time at home with your family and the things that perhaps matter more to you would soon have you questioning why you persist in spending ten times as long commuting to work on a cold, wet day rather than using some other form of transportation, like a car.

You can write software with a simple text editor and command-line tools to build and debug (many of us did, and some sado-masochists still do), or you could use much more sophisticated, complex environments which help you produce and debug solutions of far greater complexity and scale, using resources your old hardware simply could not handle in any reasonable time, to build software beyond anything we could have imagined 10 or 20 years ago.

Yes, some tasks feel like they are wasted on the technology we have today, but there is so much more that goes on under the hood, and computers are general-purpose machines designed for general-purpose tasks, not just what a single user wants. While there is a lot of bloat in many systems, and it sometimes feels like they are not working in the most efficient manner, there is a lot of software out there which is incredibly well optimised and is still limited by our current technology. It could be argued that until we have hardware which can perform our task instantaneously, it's not powerful enough yet... so why stop developing?

It does go both ways, though: creativity and ingenuity can come about from the tools you have and how you choose to apply them, or you can make new tools and develop the ones you have to open up new avenues of creativity which simply would not have been possible with your original tools. Personally, I'm not one for standing still. Novelty is great and fun, but we should not restrict ourselves to something just because it worked.

There is a saying you have probably heard: bad workers blame their tools, better workers make new tools... but the best software developers make tools to do the work for them.

Jo22 wrote on 2022-03-25, 10:14:

People should be required to pass a basic test to be able to purchase a new PC.
Just like there are car driving licenses, ham radio licenses, ship radio licenses, licenses for cooks, taxi drivers, etc.

Should you know how to write to be able to buy a pen? Should you know how to fix a nuclear power station to be able to use the electricity it generates?

Should you know how a certain program is written in order to be able to use it (this would certainly make my life easier, as users would be able to fix their own problems and I could spend more time writing new features rather than swatting bugs 😉)?

Jo22 wrote on 2022-03-25, 10:14:

Being ignorant is a luxury that only people in first-world countries can afford.
People in other parts of the global village won't survive long without such knowledge.

I do kinda agree, but at the same time I would rather listen to a doctor about a medical ailment than some person down the pub who gives me free advice and is good at the weekly pub quiz. After all, the doctor has devoted what is arguably a measurable period of their life to studying and understanding what my problem may be, and they diagnose every day. I wouldn't want to (nor should I have to) study medicine for X number of years just to be able to live and fix myself when I break.

Reply 13 of 97, by javispedro1

Rank: Member
brian105 wrote on 2022-03-25, 09:38:

I think you're way too biased by old hardware. Go take an Ivy Bridge computer, not upgraded in any way, and put Windows 10 on it, since 7 is dead for updates and 8.1's demise is right around the corner. It's not pretty without at least an SSD upgrade. That 10-year-old hardware won't take you very far, and even a newer low-end PC would absolutely annihilate it. Most consumers don't even know how to swap a hard drive for an SSD, so already there's more incentive for them to buy newer hardware.

I disagree there. I actually upgraded a family member's computer with a Clarkdale chip to Windows 10, and it runs perfectly. Perfectly as in "I just cannot distinguish the performance from my own workstation, which is at this point 7 years old but much more powerful".
Sure, it's not entirely "without upgrades", since I doubled the RAM (up to 8 GiB). The biggest pain was the damn WDDM1 drivers.
And the machine already came with an SSD. It was explicitly configured like that ~11 years ago because, even in the Windows 7 era, my advice was "go SSD". It didn't matter which one; for office use you don't really need more than 32-48 GiB, and the SSD made the computer so fast to boot that you no longer needed to suspend/hibernate it at all. (Off-topic: they actually got one of those often-maligned SandForce compressing SSDs, and to be honest I don't know why there was such discontent with SandForce at the time. It works perfectly OK, and even if it isn't competitive with other SSDs, it is in the same league and therefore delivers on its primary purpose: being several orders of magnitude more responsive than an HDD.)
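
For what it's worth, SandForce's signature trick was compressing data on the fly before writing it to flash, so how much benefit you got depended on your data. A crude sketch of that effect, using gzip as a stand-in for the controller's proprietary scheme (the file names and sample data here are made up for illustration):

```shell
# Repetitive office-style data shrinks dramatically...
yes "Quarterly report: revenue flat, costs flat, outlook flat." | head -c 100000 > office.txt
gzip -c office.txt | wc -c    # a small fraction of the 100000 input bytes

# ...while already-compressed data (random bytes as a stand-in) does not.
head -c 100000 /dev/urandom > media.bin
gzip -c media.bin | wc -c     # roughly 100000, often slightly more
```

Office documents and OS files compress well, which is presumably why such a drive felt fine in that build; a disk full of JPEGs and MP3s would have seen little of the benefit.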

I think the reason for this surprising situation, which definitely didn't happen before (try running Windows XP on a 1990-era computer), is that for the past 1-2 decades performance improvements in LOW-END hardware have been much milder than in previous decades (because vendors have focused on power, or price, or something else).
I am not talking about high-end hardware; it is definitely much better than it was in 2005. But since 2005 a couple of "interesting" things happened that kept low-end hardware... very low-end:
* Netbooks
* Ultraportables
* Tablets/convertibles
* El-cheapo no-name tablets

All of them have ridiculously underpowered CPUs (think ULP or even - god - Atom, which is not necessarily any faster than a Clarkdale chip), low amounts of RAM (<= 4 GiB!), low amounts of disk space (64 GiB), etc. And they were/are sold with the latest Windows.

Software can no longer get away with increasing the minimum requirements significantly if they want to run on these devices, so they don't.

One example is Windows 11. You may think the entire purpose of Windows 11's new CPU requirements is to force the sale of new hardware (I do!), but even then, they still support low-end Atom processors that are not much better than a 12-year-old Clarkdale CPU.
And just think of how much pain MS goes through to support Windows 10/11 on devices with only a 64 GiB disk, with the entire "Compact OS" initiative and the like.
It basically means Windows just CANNOT really grow its minimum requirements.
They can (and do) mandate newer CPU generations for no technical reason whatsoever, but in practice the low end of these newer CPU generations is not much faster than the older generations (performance per watt is another story), so they can't actually increase how much Windows consumes.

I am quite sure Windows 11 will run on that Clarkdale computer just as Windows 10 does (once you skip the requirements checker).

Reply 14 of 97, by Shreddoc

Rank: Oldbie

The decades-out-of-date Linux memes are strong in this thread. What's next, bulbs of garlic around the neck, sign of the evil eye, hiss hiss?? 😁 Folks - they (desktop Windows/Linux/Mac) are all just basic, 8-yr-olds-learn-em-with-ease, simple GUIs. You click on liddle picture-y thingys with obvious names on them, and pretty much identical stuff happens across the board.

It's endlessly amusing that people learn 50 different games per year without even giving it a second thought - all with entirely different interfaces, key controls, symbols, fonts, styles, feature placements.... but slide that concept sideways to operating systems, and suddenly it's "oh no the little pictures are in slightly different places, this is an existential threat!". 😀

And yeah, of course people are hooked on the commercial upgrade bug, the drive to be at the technical cutting edge. It's quite understandable, and comes with its own type of thrills. Some mistake that for universal superiority in all things, which of course no era ever has - otherwise we wouldn't all be gathered here at this forum literally dedicated to old games - but such opinions aren't uncommon; they don't call it "mainstream" for nothing!

Last edited by Shreddoc on 2022-03-25, 11:21. Edited 1 time in total.

Reply 15 of 97, by chinny22

Rank: l33t++

But that's how consumerism works!

Back in the day hardware was expensive, so unless you had a lot of disposable income you had no choice but to stick with your PC, or maybe upgrade 1 or 2 components to stretch it out a few more years.
But then the internet came along and everyone wanted a PC; hardware became much more mass-produced, which reduced manufacturing costs and made the PC somewhat of a disposable item.

Then you have to take into account repair/maintenance costs.
9 times out of 10 a wipe/reinstall of Windows will "fix" a slow PC, but that's labour-intensive: backing up everything, reinstalling everything, and digging up licences for software that the user will naturally have lost over time.
If you're paying someone to do this, it's not cheap.
Maybe you can just upgrade the RAM, but in both cases you still have a PC that has no warranty and a higher risk of hardware failure (which the average Joe will need to pay someone to diagnose).
Or
just go out and buy a lower-spec PC (which is what the average Joe does). £300 is the cheapest at currys.co.uk, a big retailer here. If it only lasts 5 years, that's £60 per year - less than a full tank of petrol these days 🙁
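
The £60-a-year figure is just the purchase price amortised over the years of use; a quick sketch of the arithmetic (the price and lifespan are the hypothetical figures from the post, not real retail data):

```shell
price=300   # £, the cheap-PC example above
years=5     # guessed lifespan
echo "£$(( price / years )) per year"   # £60 per year

# The OP's once-a-decade schedule halves that:
echo "£$(( price / 10 )) per year"      # £30 per year
```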

It's not just computers though.
Last year our washing machine broke; the landlord paid someone to check it, and it only needed a £20 part. I would have thought something like that would have been cheaper to replace. And when was the last time anyone took a pair of shoes to the shoe repair place?!

Rather than being annoyed, benefit from people's ignorance! I haven't purchased a new computer in years; laptops with dead HDDs are great value and an easy fix.
My daily PC is a Socket 1366 workstation from 2011. Apart from the CPU, which cost £15, I've managed to upgrade every other part for free; the case, MB and PSU are original.

Reply 16 of 97, by Cuttoon

Rank: Oldbie

Boy, are you opening a can of worms there.

It's called the hardware-software-cycle.

[Attachment: The-Hardware-Software-Cycle.png]

I stole that from here, where it's called a "virtuous cycle", bringing forth innovation and progress.
In a conventional sense, it is.

But while capitalism has its merits, this is one of its systemic failures: commercial, closed-source software is an obvious insult to human understanding.
Why would we subject a product with absolutely no marginal cost to the same production environment as a toothbrush? It's bound to deliver an inferior product, clearly rewarding unfavourable aspects of commerce like fraud and monopoly - which is what we've sadly witnessed time and again during two decades of Microsoft dominance.

Why does it happen nonetheless? Because systemic failure is, well, systemic. Manufacturers shipping their hardware with a free copy of Windows may not have planned for it, but it ensured their customers would soon buy a more potent system, giving the company an evolutionary edge over their competition. And so the cycle begins. It's the IT equivalent of (planned) obsolescence - the few parts of a system that actually do age physically won't come near that momentum.

While professional IT is mainly driven by productivity, innovation in private machines is a way of turning a mere tool into a conventional consumer product with a defined life cycle.

Why does open source dominate the server market? Because setting them up is the service rendered. And there's no point to artificially maxing out the hardware, it's scaled to purpose as it is, energy efficiency becoming the main criterion today.

For Apple and Microsoft, Dell and Intel, it remains good business and apart from huge losses to consoles, tablets and smart phones, TVs, speakers, the desktop/conventional laptop remains in technological deadlock.
On a macroeconomic level, abolishing all legal protection for closed-source software would yield a huge benefit, turning software development from its head onto its feet. So far, coercing customers into buying a copy gets rewarded while the actual service of development remains opaque.

In practice: while luddism is fun, there are certain hard limits, no user being an island. Beyond the artificial need for an up-to-date Windows OS with an inherently unsafe architecture, the web and HD video do have some merit. I'd estimate those limits today at around 1 GHz and 2 GB RAM, or double that. 4 GB would mean Intel chipsets from the mid-2000s. Sure, it's nice to do real-time HD video editing at home, but how many really do? With 99.9% of actual, real-world screen time out there, anything that does not run on such hardware is cheap programming.

Personally, anecdotal:
As an Ubuntu user since, IIRC, 8.04, I tend to get rather annoyed by the preachiness of Windows users. I can't even buy a Playboy magazine without dozens of headlines screaming at me things like "the new Windows xx and why it's awesome", "how to optimize Windows beyond a steaming shitpile" or "how to make your shrink understand your nightmares about that insufficient swap partition".
Can't these people just use their strange bloatware in peace, without constant proselytizing?

I don't think I could reproduce a single unix shell command without mistake right away.
(Maybe 'll', but I'm not sure what it does.)
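Small aside on the 'll' mystery: on a stock Ubuntu install it's simply an alias shipped in the default ~/.bashrc (other distros may define it differently, or not at all). A minimal sketch, assuming bash:

```shell
# In non-interactive scripts, alias expansion is off by default; enable it:
shopt -s expand_aliases

# Ubuntu's default ~/.bashrc ships this alias (other distros may differ):
alias ll='ls -alF'

# 'type -t' reports what a name resolves to (alias, builtin, file, ...):
type -t ll    # prints "alias"

ll /tmp       # long listing: permissions, owner, size, modification time
```

So 'll' is just 'ls -alF' under the hood, and 'type <name>' is the quick way to find out what any mystery command actually is.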
But since roughly 2007, or since XP SP3 ultimately annoyed the shit out of me, I download a boot CD or thumbdrive, hook up the DSL cable and install a free OS with a modern GUI and that's it.
Browser, E-Mail, Office and multimedia tools, right out of the box. Rock stable, secure, fast and not a single EULA box to check.
I've yet to find a random printer that won't work right away, or a screen with the wrong resolution.

I guess Windows is for nerds who need certain videogames on a machine made for work and communication or who can't do without some strange one-platform tools of their particular circle jerk.

Last time, mommy's Windows refused to acknowledge the duplex option of an HP LaserJet 2xxx (read: THE most common printer on earth) despite a 200 MB dedicated driver package from some convoluted clusterfuck of a website. Fixing that was a near-death experience. I'm getting too old for that shit.

And I seriously doubt the overall sanity of anyone hating on Bill Gates because of some weird conspiracy tale about vaccines or GMO seeds.
OK, his daydreams about nuclear power show that he's detached from reality and in love with bullshit, yes. And he's rich, so there's room for a lot of Bond villain fantasy.
But why? That vile f*cker has quite openly visited endless suffering upon this world in his day job! And no amount of money and philanthropy for the rest of his life is going to atone for that.
Why would you make up some more?

Why does any of that matter?
Isn't that all just fun and games in a free society?
Because we live on a finite planet and there's no planet B in sight. IT especially runs on something like 60% coal power.
Also because no average, non-nerd user I've ever met was happy about moving to a newer system. People are ever so alienated.

I like jumpers.

Reply 17 of 97, by Jo22

User metadata
Rank l33t++
Rank
l33t++
spiroyster wrote on 2022-03-25, 10:48:
Jo22 wrote on 2022-03-25, 10:14:

People should be required to pass a basic test to be able to purchase a new PC.
Just like there are car driving licenses, ham radio license, ship radio licenses, licenses for cooks, taxi drivers etc etc.

Should you know how to write to be able to buy a pen? Should you know how to fix a nuclear power station to be able to use the electricity it generates?

Should you know how a certain program is written in order to be able to use it (this would certainly make my life easier, as users would be able to fix their own problems and I could spend more time writing new features rather than swatting bugs 😉)?

Heh. 😁 Let me put it this way :
People without digital competence are like pedestrians walking on the highway.. On the wrong side.

spiroyster wrote on 2022-03-25, 10:48:
Jo22 wrote on 2022-03-25, 10:14:

Being ignorant is a luxury that only people in first-world countries can afford.
In other parts of the world, people won't survive for long without such knowledge.

I do kinda agree, but at the same time I would rather listen to a doctor about a medical ailment than some person down the pub who gives me free advice and is good at the weekly pub quiz. After all, the doctor has devoted what is arguably a measurable period of their life to studying and understanding what my problem may be, and they diagnose every day. I wouldn't want to (nor should I have to) study medicine for X number of years just to be able to live and fix myself when I break.

Doctors.. *sigh*. 😒 That's a tricky topic for me. It always sends my emotions running high and low.
Mainly because of all the injustice that's related to them.
I think they are like politicians and vary in quality.
A doctor is a master of speech and knows about matter, but not necessarily substance.
In my unimportant experience, they sometimes have no idea what's going on inside a patient. Most of their work is based on speculation, stereotypes or outdated medical information.

For example: since eternity, healers have known that psyche and physiology are one.
From a scientific point of view (without including the soul or the bio-electric field etc.), processes in the brain cause hormones and other substances to be sent out into the body.
The other way round works, too: physical harm causes harm to the psyche.
Unfortunately, doctors with their dated conventional medicine traditionally don't take both into account. They essentially threw hundreds of thousands of years of human empathy and common sense to the wind. They try to separate what is one, examining a forest under a microscope.
Likewise, psychiatrists don't care about physical harm, either.

I've met quite a few arrogant doctors who didn't really know what they were talking about, and a few very good ones who actually did (kudos to them).
Alas, the first type didn't care about the experience of their patients or their relatives.
What's even worse, they casually overstep their competence. Like that internal specialist who thought he was a psychiatrist and
started changing medication at will without asking a proper psychiatrist by phone first. 🙄
That dude even contradicted the official information about the medicine that the manufacturer itself had published online.
Looking into such doctors' eyes, I figured they didn't truly want to heal or find out the truth. They simply wanted to be right and felt the urge to exercise power over their patients.
It's like a drug to them, so the whole behavior makes sense.
Such doctors were like "I get the feeling you might be right, but that makes me feel offended and I still prefer my opinion".
And that's what makes them so dangerous, I think.
They're blinded by their own confidence. They lack cognitive self reflection. Alas, that way, you can't fix an underlying problem. You can merely fix the symptoms.

That's why I rather trust the experience of a random person in public (say, in a pub) who went through a certain illness himself/herself
than a doctor who never suffered from that illness or never took the medication he constantly talks about.
Unfortunately, people who haven't been in my situation won't believe me. And that's okay, they don't have to.

So all I can offer as general advice is: keep thinking for yourselves, listen to your hearts, and get multiple sources of information if needed. It's your health and your life.
If you're not satisfied with the treatment/analysis of one doctor, get second opinions from other doctors. And if you can, double-check what the doctor diagnosed. Ask questions.
Ask the people at the pharmacy. Maybe some nurse, too. Or other people with life experience. Remember that a gut feeling is a signal from your body, too.
So if you get the feeling that something is wrong, don't ignore it too long.
But don't get brainwashed by people from a church or sect. 😉

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 18 of 97, by Cuttoon

User metadata
Rank Oldbie
Rank
Oldbie
spiroyster wrote on 2022-03-25, 10:48:

Should you know how to write to be able to buy a pen?

No, but until recently, there were certain checks that might keep you from using that pen to reach anyone who wouldn't open your letter. That was called an "editor".
And seeing quite a few people on facebook today, that idea had its merits. 😉

spiroyster wrote on 2022-03-25, 10:48:

Should you know how to fix a nuclear power station to be able to use the electricity it generates?

No, but you should feel responsible that there is someone there who does, and you have to pay taxes for the insurance. As it turns out, both failed in Chernobyl, and the latter didn't quite work in Fukushima either.

spiroyster wrote on 2022-03-25, 10:48:

Should you know how a certain program is written in order to be able to use it (this would certainly make my life easier, as users would be able to fix their own problems and I could spend more time writing new features rather than swatting bugs 😉)?

Of course anyone should be free to use his game boy, kitchen knife or TV set at home.
In fact, anyone can own a car and drive it on his own property.
It's using public roads where it gets interesting, agreed?

So, few people die of internal hemorrhages because of a Facebook post or an Outlook virus.
(Although some stuff on fb invokes similar sensations.)
But it does cause actual harm to some people when others fuck up with their networked machines. And social media abuse is slowly turning us into fascists.
(I do appreciate the irony that I'm lamenting that on a social medium linking to another one.)
Apart from all the simple environmental implications of overconsumption.

Now - about my constitutional right to bear doomsday devices...

I like jumpers.

Reply 19 of 97, by Tree Wyrm

User metadata
Rank Newbie
Rank
Newbie

What a fun little thread. There is a familiar sense here that's been gnawing at me for some time, though not to the extent of claiming retro computers are of practical use: while they're fun to tinker with, I'm under no illusion that I could use my old Socket 7 system as a daily driver.

That being said, six to eight years ago I was chasing the latest and greatest, making fancy builds with custom-loop liquid cooling. Thankfully that quickly died out after the third build.

When SSDs became affordable and reliable, that was definitely a tangible improvement. Still, it was just an improvement, not a radical change. Same with high-refresh-rate displays: sure, I like them, but it's not as if they let me do something I couldn't do before.

My older X99 rig remains quite capable of handling all my tasks and I see no point in upgrading it much. A 3700X is the current main, but really that was just an excuse to build a mini-ITX system. On the oldest (but not retro) system I've installed ESXi and turned it into a VM host. Retired rigs can have their uses.

Various issues with audio interfaces (MOTU gear and a Roland MX-1) in Windows prompted me to get a Mac Mini, so I'm not looking forward to upgrading to Windows 11 at all. I used a Linux desktop at my previous job, so switching over won't be a problem; it's probably long overdue that I finally did.

Most games I play now tend to be indie titles with modest or even toaster requirements, or just mods for older games. I can overlook issues in a fan production when there is creativity and fun to be found, both sorely missing in many commercial productions. Looking over the last few years, I can't name a single big title that I played that didn't disappoint me. A lot seem adamant to be the next milestone in a train wreck of escalating issues, severe mismanagement and disgraceful practices. Nothing surprising there, though, and no expectations either.

Limited living space constrains what I can have, but I've come to appreciate the limitation itself: simple space concerns safeguard against going on a hoarding spree. I have to be picky, define the limits of what I actually want to have, and not let gear acquisition syndrome get hold of my purchases.