VOGONS


Golden age of computing for personal computers


Reply 20 of 123, by Jorpho

User metadata
Rank l33t++
RandomStranger wrote on 2021-04-02, 16:45:

I think there is no one golden age. The 70s and early 80s were interesting and exciting times, but PCs weren't particularly user-friendly and virtually nothing was compatible with anything else. Sometimes not even with itself.

Exactly. Sure, it might have been "fun" to wrangle with obscure technical glitches and incompatible standards and uncooperative hardware – but it's also pretty nice not to have to deal with those things. And sure, maybe Windows 10 prevents you from getting direct access to the hardware or something, but does it really matter when the hardware is still powerful enough to accurately simulate just about everything from days gone by?

It is a little disconcerting that things seem to be trending towards taking even more control away from the end user, and maybe having even more power won't be worth the tradeoff. But we're not there yet.

Reply 21 of 123, by Bancho

User metadata
Rank Oldbie
imi wrote on 2021-04-02, 22:31:
Bancho wrote on 2021-04-02, 21:27:
Almoststew1990 wrote on 2021-04-02, 15:28:

In some ways, it's now, for a typical user.

PCs are so simple for the user that my gran can use Windows 10 for emails, YouTube, all that stuff. Windows 10 updates itself, it protects itself, and I didn't need to install any drivers etc. Being "the computer nerd" in my family, it makes my life easier that it just works for her, and my Dad, and my fiancé, and my colleagues...

Also think about how crap cheap laptops and PCs used to be. Now a cheap laptop is entirely adequate. I have a two year old £200 12.1" netbook that is quite happy browsing, youtube, films, office stuff etc. I play old games on it too. Desktop CPUs are (/were, pre-COVID) cheap, RAM is cheap, storage is cheap. GPUs are stupid at the moment (even pre-COVID for the RTX2000 series) but a typical user does not need a GPU. You can buy a lot of PC for not much money (well, in 2019 early 2020) that will last a typical user many years.

A typical user has moved away from unreliable and physically large storage (DVDs, floppies and the drives themselves) and now downloads everything. Your hard drive is just a stick that sits on the motherboard. You can have a quite usable PC that is the size of a stick of chewing gum.

And let's not forget your typical user does most of their Personal Computing on a smartphone. If in 1997 you told me I could do gaming, emails, music, films, and find information on literally anything I wanted, on a device that had (I dunno) 40x the processing power of my PC back then, that fits in my pocket, doesn't run out of battery for a day and a half, is silent, cool, and has a screen maybe 5x higher res than my PC monitor, I would be amazed!

Gaming and advanced users on the other hand, that's a different story...

I would say Windows XP Core 2 Duo, 8800GT era was the best. Fast, cheap CPUs that are quite capable of running the net (which itself is expanding rapidly but before the scourge of social media) and games. They can be overclocked to give nearly a decade of future proofing. Late XP is a slim, reliable OS that advanced users can navigate with ease. There are exciting developments in graphics cards whilst not featuring the current 🤣 pricing. Games themselves are evolving in a good direction - Bioshock, Crysis, Far Cry (retaining depth whilst the graphics and sound have improved rapidly), whilst pre-dating predatory DLC practices and Live Service bullcrap.

Got to agree with this. This is the age where anyone can access a computer and the resources and materials to learn about all the different technologies at play, and even specialise in them. (It's not just about the nerds and geeks now.) Hardware is simple and just works... (and is incredibly powerful) most of the time! Don't get me wrong, it's nice to mess about with the old retro gear, because it reminds me of my youth, but there's no way in hell I would wish that on people, given how systems work today.

Even gaming today. The choice is crazy, and platforms like Steam and Xbox Game Pass are mind-blowing. Imagine seeing Game Pass even 15 years ago; you would be like... woah.

this is not the age of computing though, it is an age of digital creation, digital work, and digital consumption, and computers are merely a tool.
barely anyone today understands what is behind the technology they are using, what is required to provide them the tools that they are using, how the games produce graphics that they see or what makes the software they use every day function... this is not "computing", this is just digital life.

https://en.wikipedia.org/wiki/Computing

What do you think makes all that happen? Magic? People understand computers more than ever. People are learning to code in all sorts of languages and using Raspberry Pis to do all manner of things. That's how we got to where we are today! The thread title is the golden age of personal computing. Young kids are doing their school work on platforms like Google Classroom and Teams. I'm working with colleagues from all over the world in real time. I watch a friend play a game on Twitch on my TV. Was any of this possible before? Of course not. Personal computing has never been better. I can multi-boot any number of systems on really fast storage. I can have as many VMs as I can fit on my middle-of-the-road Lenovo Ryzen laptop. Hell, I can run Linux directly from the Windows command line. I can stop/start a Nutanix cluster from my mobile phone.

Computing today is awesome.

Reply 22 of 123, by brostenen

User metadata
Rank l33t++

I would say that the golden era is from around 1978/1980 to around 1999/2000. Those twenty years are the span that saw the basics of what we are using today get invented and introduced. Affordability, local buses, perfect audio, 3D GFX and so much more. Things had to be invented, and after that we have only seen some extreme refinement of what already was.

Except for Unix. However, that was not really for home/amateur use before somewhere around the mid-1990s (FreeBSD). Unless one counts early Linux as Unix.

Don't eat stuff off a 15 year old never cleaned cpu cooler.
Those cakes make you sick....

My blog: http://to9xct.blogspot.dk
My YouTube: https://www.youtube.com/user/brostenen

001100 010010 011110 100001 101101 110011

Reply 23 of 123, by brostenen

User metadata
Rank l33t++
Bancho wrote on 2021-04-02, 22:58:

Computing today is awesome.

Yup. Computing today is awesome. However, most people don't care for anything other than touchscreen stuff. You know, the usual standard dumb user who knows nothing beyond how to use the product. People with technical know-how are quite rare if you look at all users in the broadest picture. They will never ever understand even 1% of what we are talking about on this forum.

One example of a dumb user is one who still thinks that sending and receiving data means downloading and nothing else. Or one who thinks that wifi means internet.

Don't eat stuff off a 15 year old never cleaned cpu cooler.
Those cakes make you sick....

My blog: http://to9xct.blogspot.dk
My YouTube: https://www.youtube.com/user/brostenen

001100 010010 011110 100001 101101 110011

Reply 24 of 123, by imi

User metadata
Rank l33t
Bancho wrote on 2021-04-02, 22:58:
imi wrote on 2021-04-02, 22:31:
Bancho wrote on 2021-04-02, 21:27:

Got to agree with this. This is the age where anyone can access a computer and the resources and materials to learn about all the different technologies at play, and even specialise in them. (It's not just about the nerds and geeks now.) Hardware is simple and just works... (and is incredibly powerful) most of the time! Don't get me wrong, it's nice to mess about with the old retro gear, because it reminds me of my youth, but there's no way in hell I would wish that on people, given how systems work today.

Even gaming today. The choice is crazy, and platforms like Steam and Xbox Game Pass are mind-blowing. Imagine seeing Game Pass even 15 years ago; you would be like... woah.

this is not the age of computing though, it is an age of digital creation, digital work, and digital consumption, and computers are merely a tool.
barely anyone today understands what is behind the technology they are using, what is required to provide them the tools that they are using, how the games produce graphics that they see or what makes the software they use every day function... this is not "computing", this is just digital life.

https://en.wikipedia.org/wiki/Computing

What do you think makes all that happen? Magic? People understand computers more than ever. People are learning to code in all sorts of languages and using Raspberry Pis to do all manner of things. That's how we got to where we are today! The thread title is the golden age of personal computing. Young kids are doing their school work on platforms like Google Classroom and Teams. I'm working with colleagues from all over the world in real time. I watch a friend play a game on Twitch on my TV. Was any of this possible before? Of course not. Personal computing has never been better. I can multi-boot any number of systems on really fast storage. I can have as many VMs as I can fit on my middle-of-the-road Lenovo Ryzen laptop. Hell, I can run Linux directly from the Windows command line. I can stop/start a Nutanix cluster from my mobile phone.

Computing today is awesome.

I think we have a starkly different understanding of what "Golden Age" means then :p

Reply 25 of 123, by Horun

User metadata
Rank l33t++
brostenen wrote on 2021-04-02, 23:08:

I would say that the golden era is from around 1978/1980 to around 1999/2000. Those twenty years are the span that saw the basics of what we are using today get invented and introduced. Affordability, local buses, perfect audio, 3D GFX and so much more. Things had to be invented, and after that we have only seen some extreme refinement of what already was.

Except for Unix. However, that was not really for home/amateur use before somewhere around the mid-1990s (FreeBSD). Unless one counts early Linux as Unix.

Agree! To me the Golden era was the early 80's to mid 90's, where everything started and grew for small businesses and home users... Yes, FreeBSD came out in 1994, the earliest Unix/Linux for the average intelligent computer geek ;p

Hate posting a reply and then have to edit it because it made no sense 😁 First computer was an IBM 3270 workstation with CGA monitor. Stuff: https://archive.org/details/@horun

Reply 26 of 123, by brostenen

User metadata
Rank l33t++
Bancho wrote on 2021-04-02, 21:27:
Almoststew1990 wrote on 2021-04-02, 15:28:

In some ways, it's now, for a typical user.

PCs are so simple for the user that my gran can use Windows 10 for emails, YouTube, all that stuff. Windows 10 updates itself, it protects itself, and I didn't need to install any drivers etc. Being "the computer nerd" in my family, it makes my life easier that it just works for her, and my Dad, and my fiancé, and my colleagues...

Also think about how crap cheap laptops and PCs used to be. Now a cheap laptop is entirely adequate. I have a two year old £200 12.1" netbook that is quite happy browsing, youtube, films, office stuff etc. I play old games on it too. Desktop CPUs are (/were, pre-COVID) cheap, RAM is cheap, storage is cheap. GPUs are stupid at the moment (even pre-COVID for the RTX2000 series) but a typical user does not need a GPU. You can buy a lot of PC for not much money (well, in 2019 early 2020) that will last a typical user many years.

A typical user has moved away from unreliable and physically large storage (DVDs, floppies and the drives themselves) and now downloads everything. Your hard drive is just a stick that sits on the motherboard. You can have a quite usable PC that is the size of a stick of chewing gum.

And let's not forget your typical user does most of their Personal Computing on a smartphone. If in 1997 you told me I could do gaming, emails, music, films, and find information on literally anything I wanted, on a device that had (I dunno) 40x the processing power of my PC back then, that fits in my pocket, doesn't run out of battery for a day and a half, is silent, cool, and has a screen maybe 5x higher res than my PC monitor, I would be amazed!

Gaming and advanced users on the other hand, that's a different story...

I would say Windows XP Core 2 Duo, 8800GT era was the best. Fast, cheap CPUs that are quite capable of running the net (which itself is expanding rapidly but before the scourge of social media) and games. They can be overclocked to give nearly a decade of future proofing. Late XP is a slim, reliable OS that advanced users can navigate with ease. There are exciting developments in graphics cards whilst not featuring the current 🤣 pricing. Games themselves are evolving in a good direction - Bioshock, Crysis, Far Cry (retaining depth whilst the graphics and sound have improved rapidly), whilst pre-dating predatory DLC practices and Live Service bullcrap.

Got to agree with this. This is the age where anyone can access a computer and the resources and materials to learn about all the different technologies at play, and even specialise in them. (It's not just about the nerds and geeks now.) Hardware is simple and just works... (and is incredibly powerful) most of the time! Don't get me wrong, it's nice to mess about with the old retro gear, because it reminds me of my youth, but there's no way in hell I would wish that on people, given how systems work today.

Even gaming today. The choice is crazy, and platforms like Steam and Xbox Game Pass are mind-blowing. Imagine seeing Game Pass even 15 years ago; you would be like... woah.

One of the keys to an all-digital society is to take the power away from the user and present the machine as an easy-to-use device. One where you do not have to understand anything other than the user interface. Making the stuff more accessible to everyone makes more dumb users with no understanding of hardware at all. It is just an easy-to-use tool these days. If we had stayed in those old days and still had to know about IRQs and all that jumper jazz... well, then we would not have so many users today, and people would still think that we are those strange nerds who sit in a dark basement and never see the sun.

Just saying.

Don't eat stuff off a 15 year old never cleaned cpu cooler.
Those cakes make you sick....

My blog: http://to9xct.blogspot.dk
My YouTube: https://www.youtube.com/user/brostenen

001100 010010 011110 100001 101101 110011

Reply 27 of 123, by Bancho

User metadata
Rank Oldbie
brostenen wrote on 2021-04-02, 23:15:
Bancho wrote on 2021-04-02, 22:58:

Computing today is awesome.

One example of a dumb user is one who still thinks that sending and receiving data means downloading and nothing else. Or one who thinks that wifi means internet.

But why is the user dumb? Because that's how they perceive how they use their personal computer? The fact that a user doesn't have to know how to fuck around with shit means that technology is doing its job, right? Personal computing shouldn't mean that to use this thing you need to fully understand its deep fundamentals to get the most out of it. I bet most of us don't know the gap size of the spark plugs in our car. I bet some of us don't even realise their car doesn't even have spark plugs!

I can sit on the toilet having a shit, think of something and wonder how it works... jump on Google and find out. Just because most users are not that way inclined doesn't mean that facility doesn't exist.

Reply 28 of 123, by Bancho

User metadata
Rank Oldbie
imi wrote on 2021-04-02, 23:19:
Bancho wrote on 2021-04-02, 22:58:
imi wrote on 2021-04-02, 22:31:

this is not the age of computing though, it is an age of digital creation, digital work, and digital consumption, and computers are merely a tool.
barely anyone today understands what is behind the technology they are using, what is required to provide them the tools that they are using, how the games produce graphics that they see or what makes the software they use every day function... this is not "computing", this is just digital life.

https://en.wikipedia.org/wiki/Computing

What do you think makes all that happen? Magic? People understand computers more than ever. People are learning to code in all sorts of languages and using Raspberry Pis to do all manner of things. That's how we got to where we are today! The thread title is the golden age of personal computing. Young kids are doing their school work on platforms like Google Classroom and Teams. I'm working with colleagues from all over the world in real time. I watch a friend play a game on Twitch on my TV. Was any of this possible before? Of course not. Personal computing has never been better. I can multi-boot any number of systems on really fast storage. I can have as many VMs as I can fit on my middle-of-the-road Lenovo Ryzen laptop. Hell, I can run Linux directly from the Windows command line. I can stop/start a Nutanix cluster from my mobile phone.

Computing today is awesome.

I think we have a starkly different understanding of what "Golden Age" means then :p

What, in your opinion, is the "Golden Age" then, and why?

Reply 29 of 123, by brostenen

User metadata
Rank l33t++
Horun wrote on 2021-04-02, 23:21:

Agree! To me the Golden era was the early 80's to mid 90's, where everything started and grew for small businesses and home users... Yes, FreeBSD came out in 1994, the earliest Unix/Linux for the average intelligent computer geek ;p

There was a saying in the mid-90's... It goes something like: "The second you pay for your brand new top of the line computer, it will be obsolete."

It was of course not true. It was just a way of saying that technology moved forward faster than anything else. Back then, you could buy a new type of 3D GFX technology, and the next day a new technology for producing 3D was introduced. I clearly remember the summer of 1996 or 1997, when the local shop changed the fastest CPU on the shelf every week. One week it was something like 180, the next week 200 and so on. I have no idea what exactly the numbers were, so that's just an example I wrote. Heck, I don't even remember if that was P1 or P2.

Don't eat stuff off a 15 year old never cleaned cpu cooler.
Those cakes make you sick....

My blog: http://to9xct.blogspot.dk
My YouTube: https://www.youtube.com/user/brostenen

001100 010010 011110 100001 101101 110011

Reply 30 of 123, by brostenen

User metadata
Rank l33t++
Bancho wrote on 2021-04-02, 23:26:
brostenen wrote on 2021-04-02, 23:15:
Bancho wrote on 2021-04-02, 22:58:

Computing today is awesome.

One example of a dumb user is one who still thinks that sending and receiving data means downloading and nothing else. Or one who thinks that wifi means internet.

But why is the user dumb? Because that's how they perceive how they use their personal computer? The fact that a user doesn't have to know how to fuck around with shit means that technology is doing its job, right? Personal computing shouldn't mean that to use this thing you need to fully understand its deep fundamentals to get the most out of it. I bet most of us don't know the gap size of the spark plugs in our car. I bet some of us don't even realise their car doesn't even have spark plugs!

I can sit on the toilet having a shit, think of something and wonder how it works... jump on Google and find out. Just because most users are not that way inclined doesn't mean that facility doesn't exist.

A dumb user is a user who just uses the machine without knowing anything about it. I am not talking about how creative or not creative the user is with the software. I am talking about actual computer knowledge. You know, exactly as I have tried to describe it. People who will never ever understand even 1% of what goes on here on Vogons. Those are the dumb users.

EDIT:
A dumb user is also one of those who call you and me a nerd and look down on us, and at the same time beg for our help, on their bleeding knees, once their computer does not work. They are the ones who can start the computer and launch Fortnite and nothing else. You perfectly well know who they are. 😉

Don't eat stuff off a 15 year old never cleaned cpu cooler.
Those cakes make you sick....

My blog: http://to9xct.blogspot.dk
My YouTube: https://www.youtube.com/user/brostenen

001100 010010 011110 100001 101101 110011

Reply 31 of 123, by Namrok

User metadata
Rank Oldbie

It's really hard for me to perceive the present as any sort of computing golden age. Is the hardware better than ever? Sure. Is the thought behind the software better than ever? IMHO, it's worse than it has ever been. It's all about training the user into compulsive behavior to drive "engagement", harvest data, and hopefully rake in a mountain of service fees.

Does Destiny 2 look a lot better than Doom 3? Oh hell yeah. And it runs at 4K at 144 Hz in ultrawide with HDR. And yet I found Doom 3 a more enjoyable experience running at 30-60 fps on a 1280x1024 CRT, because it isn't trying to be the last game I ever play, constantly throwing me from Skinner box to Skinner box, hoping I get addicted and reach for my wallet. And it's like that all the way down the software stack. It's atrocious.

Win95/DOS 7.1 - P233 MMX (@2.5 x 100 FSB), Diamond Viper V330 AGP, SB16 CT2800
Win98 - K6-2+ 500, GF2 MX, SB AWE 64 CT4500, SBLive CT4780
Win98 - Pentium III 1000, GF2 GTS, SBLive CT4760
WinXP - Athlon 64 3200+, GF 7800 GS, Audigy 2 ZS

Reply 32 of 123, by Horun

User metadata
Rank l33t++

Namrok, you are thinking of it too deeply, IMHO. It is not about the power of a computer but the ease of early access at a reasonable price.
Going by the OP:

jasa1063 wrote on 2021-04-02, 01:53:

What do you consider the Golden Age of computing for personal computers? For me it starts in 1977 with the release of the Apple II, Commodore Pet, and TRS-80 Model I and runs through 1999, ending in the Windows 9x era. Computers and the Operating Systems that followed put more and more software layers between the end user and the hardware. This, for me, took a lot of the "Personal" out of the Personal Computer.

And considering the historical perspective on the "Golden Age of" something, and looking at radio, TV, cars, etc., it is easy to see that the accepted Golden Age of radio, TV and cars is right after each first became cheap enough and good enough for the average person to be able to afford one, during the early general increase in usage AFTER it became economically available.
Cars from 1920 to 1950. TV from the 1950s through the 70s. Radio from the early 1920s through the 1950s. (Mostly from Wiki, but still valid!)
Then the Golden Age of computers is from the 1980s through about 2000 or thereabouts... just my opinion, following the typical "Golden Age of" historical data...

Hate posting a reply and then have to edit it because it made no sense 😁 First computer was an IBM 3270 workstation with CGA monitor. Stuff: https://archive.org/details/@horun

Reply 33 of 123, by kolderman

User metadata
Rank l33t

It was *obviously* the 90s. PCs went from hobbyist toys to an essential home device. The mainstream introduction of VGA, the 386 and the CD-ROM allowed games to quickly enter new territory. First practical examples of many game genres, notably the FPS, RTS and online gaming. Geekdom became cool thanks mainly to id Software, and the first widespread phenomenon of fan-made maps, mods and online communities bloomed. Games went from primitive to (by the end of the 90s) games that still hold up and play well today (Quake, Unreal, HL). 3D acceleration. DOS to Windows. The internet (kind of a big deal). 3dfx vs Nvidia. MIDI music. Laptops. The great RAM crash. C++. Optical mice. Webcams and digital cameras.

The amount of change and pioneering that occurred in that decade will not be repeated.

Reply 34 of 123, by Jo22

User metadata
Rank l33t++
Jorpho wrote on 2021-04-02, 22:56:
RandomStranger wrote on 2021-04-02, 16:45:

I think there is no one golden age. The 70s and early 80s were interesting and exciting times, but PCs weren't particularly user-friendly and virtually nothing was compatible with anything else. Sometimes not even with itself.

Exactly. Sure, it might have been "fun" to wrangle with obscure technical glitches and incompatible standards and uncooperative hardware – but it's also pretty nice not to have to deal with those things. And sure, maybe Windows 10 prevents you from getting direct access to the hardware or something, but does it really matter when the hardware is still powerful enough to accurately simulate just about everything from days gone by?

It is a little disconcerting that things seem to be trending towards taking even more control away from the end user, and maybe having even more power won't be worth the tradeoff. But we're not there yet.

I tend to agree, but there's an anomaly in history.

In the CP/M era, things were compatible. Initially.

CP/M had an official 8" floppy format that was universally supported by all PCs with 8" floppy drives.
However, that changed when the 8" format became obsolete.

The new 5,25" floppies used numerous different formats (hard sectored, soft sectored, X tracks/side, Y tracks/side, single sided, double sided and so on).
(Eventually, formats like the ones of TRS-80, Osborne etc. became de-facto standards. For a short time, before CP/M died.)

But still... for file transfers, the Kermit protocol was available. The RS-232 serial port was available and had existed since the early days of Telex machines.

Then IBM released the Model 5150 and MS-DOS became more widespread.
But not everywhere. In Europe, the Sirius-1 was available before it, for example.

Initially, the MS-DOS platform also still followed the programming guideline of using high-level ABIs, like CP/M.
The specialized, low-level stuff was handled by a portion of the OS.
In CP/M-80 nomenclature, this was the "BIOS" part.
In our time frame, this would be called a "HAL" - Hardware Abstraction Layer.

So as long as the applications were calling the operating system, they continued to function.

In that era, makers of PCs could make an agreement with Microsoft and develop their own custom versions of DOS that took advantage of their PCs' unique features (OEM releases of DOS).
Without losing application compatibility for well-written programs.

However, when the PC became ubiquitous, programmers became perverted and "optimized" their software through low-level code, assembler routines and other Model 5150 dependencies (4.77 MHz timing, register modifications), etc.

Of course, without spending a single thought on the poor souls with generic MS-DOS compatibles.
No alternate code-paths, no alternate binaries, no nothing.
That's when the compatibility mess started.

Using the PC BIOS directly was the least of a "crime", though. 😉
It was, while hardware-dependent, another form of HAL.
Through BIOS emulators it was technically possible to maintain compatibility among different PC models, even afterwards.
Provided that the API/ABI calls were used in a standardized fashion.
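
As a rough sketch of that layering (assuming a 16-bit real-mode DOS compiler in the Turbo C / Borland C mould, with its dos.h helpers int86() and MK_FP(); this is purely illustrative and will not build with a modern 32/64-bit compiler), the three levels look something like this. Each function puts a character on the screen, but each one depends on progressively more IBM-specific behaviour:

/* Three ways to put a character on the screen under DOS,
   from most portable to most hardware-dependent. */
#include <dos.h>

/* DOS API: INT 21h, AH=02h - works on any machine running MS-DOS,
   whether or not the video hardware is IBM-compatible. */
void dos_putchar(char c)
{
    union REGS r;
    r.h.ah = 0x02;
    r.h.dl = c;
    int86(0x21, &r, &r);
}

/* BIOS: INT 10h, AH=0Eh (teletype output) - needs a PC-compatible BIOS,
   but not any particular video card; a BIOS emulator can still satisfy it. */
void bios_putchar(char c)
{
    union REGS r;
    r.h.ah = 0x0E;
    r.h.al = c;
    r.h.bh = 0;          /* display page 0 */
    int86(0x10, &r, &r);
}

/* "Optimized": poke the character straight into colour text memory at
   B800:0000 - fast, but it only works on true IBM-compatible video
   hardware, which is exactly the compatibility mess described above. */
void poke_video(char c)
{
    unsigned char far *vram = (unsigned char far *)MK_FP(0xB800, 0);
    vram[0] = c;         /* character in cell 0,0 */
    vram[1] = 0x07;      /* attribute: light grey on black */
}

int main(void)
{
    dos_putchar('A');
    bios_putchar('B');
    poke_video('C');
    return 0;
}

The first two keep working through OS or BIOS emulation; the last one is the kind of "optimization" that tied software to genuine IBM-style hardware.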

That being said, no offense meant. I just want to point out that the "devil is in the details". 😉

Edit: Edited. Sorry, haven't slept in a while. 😴

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 36 of 123, by 386SX

User metadata
Rank l33t
imi wrote on 2021-04-02, 22:31:

this is not the age of computing though, it is an age of digital creation, digital work, and digital consumption, and computers are merely a tool.
barely anyone today understands what is behind the technology they are using, what is required to provide them the tools that they are using, how the games produce graphics that they see or what makes the software they use every day function... this is not "computing", this is just digital life.

https://en.wikipedia.org/wiki/Computing

I'd agree. This is a "user" life, and anything technical is not important to users anymore; sometimes I think it must not be. Users just have to use their digital devices, because the modern consumer logic is to buy and buy and buy new devices as soon as a new one is released, and a new OS makes the older version obsolete without any real technical reason for it to be obsolete at all.
For example, I could install a latest-generation Linux kernel on a Pentium 4 with at least SSE2, and it would still be a nice home office machine if pushed with maximum RAM, an SSD and the latest (for example AGP) GPU. With Win 8, for example, I already found I couldn't install it on Socket 478 anymore; it failed with an error during the initial installation boot.
Not to mention the mobile world, where after a couple of updates anything should theoretically be trashed, considering bugs, app compatibility, non-removable batteries etc...
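
For anyone curious whether an old box actually has the SSE2 mentioned above, a minimal sketch (assuming GCC or Clang on x86 and the cpuid.h helper __get_cpuid(); purely illustrative) would be:

#include <stdio.h>
#include <cpuid.h>   /* GCC/Clang helper for the CPUID instruction */

int main(void)
{
    unsigned int eax, ebx, ecx, edx;

    /* Leaf 1 holds the basic feature flags; returns 0 if unsupported. */
    if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx)) {
        puts("CPUID leaf 1 not available");
        return 1;
    }

    /* SSE2 is reported in bit 26 of EDX on leaf 1. */
    puts((edx & (1u << 26)) ? "SSE2: supported" : "SSE2: not supported");
    return 0;
}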

Reply 37 of 123, by Joseph_Joestar

User metadata
Rank l33t
Shagittarius wrote on 2021-04-03, 07:54:

Golden Age = Pre Internet

I would say pre social networks.

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Athlon64 3400+ / Asus K8V-MX / 5900XT / Audigy2
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi

Reply 38 of 123, by Tetrium

User metadata
Rank l33t++
jasa1063 wrote on 2021-04-02, 01:53:

What do you consider the Golden Age of computing for personal computers? For me it starts in 1977 with the release of the Apple II, Commodore Pet, and TRS-80 Model I and runs through 1999, ending in the Windows 9x era. Computers and the Operating Systems that followed put more and more software layers between the end user and the hardware. This, for me, took a lot of the "Personal" out of the Personal Computer.

This is an interesting question.
The Apple II, Commodore Pet and TRS-80 were not actually considered PCs back in those days, but I presume that is not what you mean.

For me the golden age would be roughly from 1995 to 2005, before laptops and later mobiles started to eat PC market share.
It was also the start of the internet for the masses, before it matured, ripened and started to get mushy and smelly with corporate monetisation, "having to be online all the time" and so on.
Just my opinion on this; don't take it as absolute truth.

But imo PCs were at their height and most popular during those years.

Before 1995, PCs were still a somewhat rare sight and most people didn't even know how to operate one. They were also very expensive.
Post-2005, laptops and other smaller devices took over some of the roles that had originally been the PC's (like chat services or social media, which is now more of a mobile thing). Post-2005, PC sales even started to decline and PC evolution decelerated. It's as if PC evolution these days is perhaps at most one fifth of what it was back in the late 90s.
And post-2010, retro computing started to become more and more expensive, more popularized and "hip", but also less accessible due to natural causes (aging and fewer available parts) and other influences (including, but not exclusively, scalpers practicing on retro hardware the skillz they have since perfected on modern equipment).

For me the peak of retro computing was from before 2005 to around 2010, before the masses started catching on. Virtually no PC component seller back then would even consider putting a value on parts that were simply obsolete, not even the people selling old parts almost exclusively.
Those were the best times, even though some amazing stuff is being done these days that was not, or was barely, happening back then (like XTIDE, for instance).

Whats missing in your collections?
My retro rigs (old topic)
Interesting Vogons threads (links to Vogonswiki)
Report spammers here!

Reply 39 of 123, by jasa1063

User metadata
Rank Oldbie

You can only have a Golden Age of something through the lens of history. For comic books this was 1938 to 1956, which was then followed by the Silver Age (1956-1970) and finally the Bronze Age (1970-1984). I think we can all agree there was definitely a Golden Age of PC computing; when and for how long is a point of debate and personal perspective in this thread. So this leads to the next questions: have we already had a Silver Age of PC computing, and if so, when and for how long? I am less sure on this one. We may still be in it. I will leave it up to everyone else here to give their opinion on that question.