VOGONS



Reply 40 of 56, by Errius

User metadata
Rank l33t

Yes, my history is similar: a 486 in 1994, then a P2 in 1999 with a Matrox Millennium G200. However, I DID get into Quake in a big way. It was my most-played game until the cheaters ruined it.

Is this too much voodoo?

Reply 41 of 56, by Bruninho

User metadata
Rank Oldbie
Gmlb256 wrote on 2021-09-17, 21:21:

For me it was the era between DOS and early 3D accelerators, as it was the golden age of the PC. Although I'm not nostalgic about 486s or anything below, I was familiar with some popular DOS games such as Wolf3D and DOOM back then.

Good choice! Me too. I don't remember much from the 486 era and before (okay, I remember playing Wolf3D and DOOM II* on a 486 PC), but I do remember things from 1994 onwards. Like some other members, I also missed the Voodoo/Glide thing; to be honest, it practically passed me by like a blazing fast Ferrari. I can only remember having a Diamond card installed with a Pentium (III? I never had an MMX one though), somewhere in 1999, before I moved on to "heavier drugs" like Core 2 Duo PCs and ATI Radeon cards for gaming stuff like GP4, Counter-Strike and FIFA 2000 for the next decade of gaming.

* = I played it with my dad over a serial connection. The cable ran from my bedroom to his computer next door. I remember my mum complaining about the cable every time she walked over it... 🤣 This was probably the only game I ever played over serial back in those days.

"Design isn't just what it looks like and feels like. Design is how it works."
JOBS, Steve.
READ: Right to Repair sucks and is illegal!

Reply 42 of 56, by zyzzle

User metadata
Rank Member

The only period I've "missed" -- by intention -- is from 2010 to the present. I mean, that whole era has been a time of vastly diminishing returns, especially with regard to Intel CPUs. It has been a terrible time of castration as well, i.e. the elimination of compatibility with older code and functions, and the needless abandonment of same. The bloat, the "walled garden" aspects, the control exerted by two or three companies (I'm looking at the big A, the big G and the big M) have made the hobby of being a computerist very frustrating indeed. That, coupled with the runaway greed of those selling older hardware and the rampant commercialization of nearly *any* "retro" hardware, whether it actually has value or not, has been enough for me to almost give up the hobby completely. I am glad to have been a very active participant from the late '70s until 2010 or so, when the first Intel Core systems were released (I stopped at the i7 2600K, when compatibility of DMA, sound, and video was castrated and DOS compatibility of those Intel Core systems was very poor indeed, so Intel could save a few pennies).

As far as gaming goes, I feel that I've missed *nothing* by not following the last ~15 years. In fact, I've only benefitted, since I've missed all the bloat, "emoting", $3000 video cards, and ridiculous system requirements, while the quality of games has vastly diminished since its heyday in the early-to-late '90s.

Reply 43 of 56, by Gmlb256

User metadata
Rank l33t
zyzzle wrote on 2021-09-19, 00:22:

The only period I've "missed" -- by intention -- is from 2010 to the present. I mean, that whole era has been a time of vastly diminishing returns, especially with regard to Intel CPUs. It has been a terrible time of castration as well, i.e. the elimination of compatibility with older code and functions, and the needless abandonment of same. The bloat, the "walled garden" aspects, the control exerted by two or three companies (I'm looking at the big A, the big G and the big M) have made the hobby of being a computerist very frustrating indeed. That, coupled with the runaway greed of those selling older hardware and the rampant commercialization of nearly *any* "retro" hardware, whether it actually has value or not, has been enough for me to almost give up the hobby completely. I am glad to have been a very active participant from the late '70s until 2010 or so, when the first Intel Core systems were released (I stopped at the i7 2600K, when compatibility of DMA, sound, and video was castrated and DOS compatibility of those Intel Core systems was very poor indeed, so Intel could save a few pennies).

As far as gaming goes, I feel that I've missed *nothing* by not following the last ~15 years. In fact, I've only benefitted, since I've missed all the bloat, "emoting", $3000 video cards, and ridiculous system requirements, while the quality of games has vastly diminished since its heyday in the early-to-late '90s.

Compatibility with DOS was already "broken" a long time before Intel released the Sandy Bridge CPUs, and consumers no longer cared about it at that point. 😜

Bitcoin mining really ruined the discrete graphics card market. My modern computer currently has a GeForce GTX 1070, and I bought it before this whole thing really started to take off. However, I don't play modern games (with some exceptions) as much as I used to, for the reasons you mentioned, such as the quality of current games.

VIA C3 Nehemiah 1.2A @ 1.46 GHz | ASUS P2-99 | 256 MB PC133 SDRAM | GeForce3 Ti 200 64 MB | Voodoo2 12 MB | SBLive! | AWE64 | SBPro2 | GUS

Reply 44 of 56, by HangarAte2nds!

User metadata
Rank Newbie

I missed out on PII and PIII. There are about 3 years from 1999 to 2002 where I didn't play any new PC games. During that time, I had an N64 and Dreamcast. I also didn't play many PC games between 2009 and 2019. But I had new PC hardware in that time.

Reply 45 of 56, by Shreddoc

User metadata
Rank Oldbie

I missed late 1995 until late 1999. In hindsight, that sucked because I missed out on arguably the most momentous societal change of the past 50 years, and probably a landmark time in human history itself - the advent of the www internet. I was perfectly positioned to take full advantage of that time, as an established DOS nerd + fledgling programmer entering adulthood, but other areas of life took over and instead I look back now on a parallel universe where things turned out a lot different. Too much Wayne's World.

I think this topic highlights what a challenge it is to lose (say) 5 years in an industry that moves incredibly fast. You can't get that time or those experiences (in their original context of history) back once they're missed. If I had children looking to be into IT, I'd be teaching them that lesson in advance. You're taking notes, right?? 🤣

Reply 46 of 56, by SarahWalker

User metadata
Rank Member

I missed late 1995 to mid 1998, mostly due to only having a 486SX at the time, therefore missing out on late DOS, early Windows 95 and most of the "multimedia revolution". Then I somewhat lost interest in late 2001 and have only sporadically picked it up since.

Reply 47 of 56, by BitWrangler

User metadata
Rank l33t++
Shreddoc wrote on 2021-09-19, 05:04:

I think this topic highlights what a challenge it is to lose (say) 5 years in an industry that moves incredibly fast. You can't get that time or those experiences (in their original context of history) back once they're missed. If I had children looking to be into IT, I'd be teaching them that lesson in advance. You're taking notes, right?? 🤣

Yeah, seeing "all this" happen makes you wonder if any focus on a specific career direction in school is worth it, as half the types of jobs will just be gone, and half of the ones they might end up with don't exist yet. Sorry kid, I know you were low on natural talents and big into movies, but theatre projectionist or video rental store manager is out now; we can offer you Uber driver or Tesla mechanic.

Unicorn herding operations are proceeding, but all the totes of hens teeth and barrels of rocking horse poop give them plenty of hiding spots.

Reply 48 of 56, by pixelatedscraps

User metadata
Rank Member

I missed out on 2014 - 2020, pretty much. I was balls deep in the OS X / macOS environment both in my studio and at home, had broken off my engagement and met someone new (now my wife), and was falling head over heels in love with the world, travel, music festivals - the lot. It wasn't until Covid hit, and I found myself with nothing to do, nowhere to go and no desire to party during lockdowns, that I rediscovered an old flame in the PC world.

My most active years would have been 1994-1998 and 2002-2008, I would say.

My ultimate dual 440LX / Voodoo2 SLI build

Reply 49 of 56, by gerry

User metadata
Rank Oldbie
zyzzle wrote on 2021-09-19, 00:22:

The only period I've "missed" -- by intention -- is from 2010 to the present. I mean, that whole era has been a time of vastly diminishing returns, especially with regard to Intel CPUs. It has been a terrible time of castration as well, i.e. the elimination of compatibility with older code and functions, and the needless abandonment of same. The bloat, the "walled garden" aspects, the control exerted by two or three companies (I'm looking at the big A, the big G and the big M) have made the hobby of being a computerist very frustrating indeed. That, coupled with the runaway greed of those selling older hardware and the rampant commercialization of nearly *any* "retro" hardware, whether it actually has value or not, has been enough for me to almost give up the hobby completely. I am glad to have been a very active participant from the late '70s until 2010 or so, when the first Intel Core systems were released (I stopped at the i7 2600K, when compatibility of DMA, sound, and video was castrated and DOS compatibility of those Intel Core systems was very poor indeed, so Intel could save a few pennies).

As far as gaming goes, I feel that I've missed *nothing* by not following the last ~15 years. In fact, I've only benefitted, since I've missed all the bloat, "emoting", $3000 video cards, and ridiculous system requirements, while the quality of games has vastly diminished since its heyday in the early-to-late '90s.

True that: since 'retro' computing became a recognised 'market', the prices have gone up - things got rarer too.

Diminishing returns indeed, but on the plus side a 10-year-old PC is still usable online etc. with the latest OSes, without much sense of being out of date - just not suited to the latest games.

DOS compatibility was quite rightly not a priority in the last decade

I'd say that many games are as good as, and sometimes better than, those of the past - there is much in the way of semi-pro gaming out there on GOG, Steam etc. that perhaps wouldn't have got published back when physical media and distribution made that more expensive. It's worth exploring.

Reply 50 of 56, by gerry

User metadata
Rank Oldbie
Shreddoc wrote on 2021-09-19, 05:04:

I missed late 1995 until late 1999. In hindsight, that sucked because I missed out on arguably the most momentous societal change of the past 50 years, and probably a landmark time in human history itself - the advent of the www internet.

you certainly missed one of the most amazing times for computer use!

In my view, the decade from 1990 to 1999 saw more radical changes in technology application and 'globalism' generally than the previous three decades; the '80s, which seemed so modern at the time, have more in common with the 1950s and '60s than with what happened in the late '90s. In 1990 you'd likely not have a mobile, certainly have no 'www', and things like microwaves were expensive; in 1999 you'd probably have a mobile, likely have a computer and go 'surfing the web', and things like microwaves were becoming relatively cheap.

Reply 51 of 56, by Tetrium

User metadata
Rank l33t++
Namrok wrote on 2021-09-15, 12:57:
Tetrium wrote on 2021-09-15, 12:28:

I basically missed everything after LGA775 and everything after the 8800GTS. I did keep up a bit better with the AMD stuff but mostly due to a friend of mine keeping me updated.
I did not miss the era from before the Pentium 1, and I did get to actually use a PC or other computer from before that era back when those were still new (like an Amiga 500, for instance), but I often had no idea about the internals beyond "oh, it's a 486 with a color graphics card but no sound card", and I learned most of this after I started the hobby.
But actually the biggest significant period of computer development which I missed is the current one. A PC that is 10 years old now can still do most basic stuff I need from it today (and quite literally so). Such a feat would have been impossible if you tried something like a 60 MHz Pentium from 1993 with software made for an Athlon XP 3200+ from 2003. That difference is just insane! Now 10 years of PC evolution seems rather stale to me in comparison.

Punching out around the Core 2 / GeForce 8000 series was good timing. For years and years, my Core 2 / GeForce 8800 GT rig was good enough for everything. I upgraded to an SSD at one point. Then parts would fail, like the mobo and CPU in 2013 or so, and I'd upgrade those. But I almost never felt compelled by its sluggishness to upgrade to the latest and greatest, except for the GPU, which I would upgrade perfunctorily whenever I'd go to play a new game and the framerate couldn't stay above 30. I believe it got upgraded twice over 10 years? Maybe three times?

It wasn't until 2019, 12 years later, that I finally built a brand new computer totally from scratch again. The Zen 2 architecture and the RTX 2000 series cards were just too tempting. A part of me regretted not waiting for the RTX 3000 series cards and the Zen 3 CPUs I knew were around the corner. But that regret was short-lived, as the initial shortages of those parts turned out to be permanent, especially for the GPUs.

Over the intervening years, nothing was exciting about computer hardware: Intel dominating the space, releasing what felt like the same chips over and over again. And I honestly couldn't tell you what major feature advances GPUs made between the GeForce 8000 series and the GTX 1000 series. Speed, obviously. But hard, on-or-off qualitative features? At least none that I noticed. A little cursory research shows the 8000 series was Nvidia's first unified shader architecture, with more generalized programmable cores - a model that didn't change in a revolutionary way until we got AI upscaling and ray tracing.

The years in which Intel dominated were definitely boring from my perspective. I basically viewed it as a socket-hopping experience, with each consecutive socket offering barely any real-world gain whatsoever. I mean, even the sockets themselves look so similar that it is hard to tell which socket it even is (I'm looking at you, LGA1150/1151/1155/1156! -_- ).
For me it seemed that the differences were less technical, less real-world, and more about marketing, so I was never really motivated to learn about them. AMD definitely had the more interesting socket designs, even though they were lagging behind in the performance department.

Another thing that played a role for me personally is that games themselves also changed: lootboxes, 'surprise mechanics' (which was just laughable when I first learned of it), games being less about skill, less ruthless, and more about fancy graphics and trying to get gamers addicted to gambling instead of gaming. It feels kinda reminiscent of the old multimedia hype we had around the time of the Pentium and Windows 95 or something 😜.

And thus I tended to stay with older and fewer games.

Whats missing in your collections?
My retro rigs (old topic)
Interesting Vogons threads (links to Vogonswiki)
Report spammers here!

Reply 52 of 56, by Gmlb256

User metadata
Rank l33t
Tetrium wrote on 2021-09-20, 12:37:

The years in which Intel dominated were definitely boring from my perspective. I basically viewed it as a socket-hopping experience, with each consecutive socket offering barely any real-world gain whatsoever. I mean, even the sockets themselves look so similar that it is hard to tell which socket it even is (I'm looking at you, LGA1150/1151/1155/1156! -_- ).
For me it seemed that the differences were less technical, less real-world, and more about marketing, so I was never really motivated to learn about them. AMD definitely had the more interesting socket designs, even though they were lagging behind in the performance department.

Most recent CPU designs have focused on performance per watt and laptops, so it's not surprising that the performance gains have been smaller since Ivy Bridge. Even though x86 CPUs aren't in the smartphone and tablet market anymore, there have been improvements on laptops, which previously used bulky designs.

AMD even used an LGA socket on their Ryzen Threadripper and server CPUs for higher pin density, because a PGA package on these CPUs would have been too big for motherboards.

Another major factor was Intel's 10 nm manufacturing delays, along with their current marketing (comparing themselves to AMD, WTH?), which damaged their reputation in recent years.

VIA C3 Nehemiah 1.2A @ 1.46 GHz | ASUS P2-99 | 256 MB PC133 SDRAM | GeForce3 Ti 200 64 MB | Voodoo2 12 MB | SBLive! | AWE64 | SBPro2 | GUS

Reply 53 of 56, by brostenen

User metadata
Rank l33t++

The period that I missed out on, when we are talking personal computers?

Well...
I was born in 1976, and the first computer I saw in person was around 1985. So the period between 1976 and 1985 is where I missed out.

Don't eat stuff off a 15 year old never cleaned cpu cooler.
Those cakes make you sick....

My blog: http://to9xct.blogspot.dk
My YouTube: https://www.youtube.com/user/brostenen

001100 010010 011110 100001 101101 110011

Reply 54 of 56, by comp_ed82

User metadata
Rank Newbie

I wasn't really around for 8-bit micro gaming or PC gaming before the "Shareware Era", so I never really did any PC gaming before getting a modem and discovering BBSes in the early 90s.
I skipped most of the early 3d-accelerated-card era, mostly because I was into console emulation at the time. (Additionally, I suck at first person shooters, so I never really needed 3d much.)
These days, my latest system is a Sandy Bridge i5 that I used to use for CPU mining cryptocurrency and currently have set up for work-from-home tasks on Windows 10.
Most of my daily driving is done on older Windows OSes and older platforms (Windows XP and 8.1, Intel Core 2, AMD K10), and I'm not likely to upgrade until absolutely forced to do so.

Reply 55 of 56, by Tetrium

User metadata
Rank l33t++
Namrok wrote on 2021-09-15, 12:57:
Tetrium wrote on 2021-09-15, 12:28:

I basically missed everything after LGA775 and everything after the 8800GTS. I did keep up a bit better with the AMD stuff but mostly due to a friend of mine keeping me updated.
I did not miss the era from before the Pentium 1, and I did get to actually use a PC or other computer from before that era back when those were still new (like an Amiga 500, for instance), but I often had no idea about the internals beyond "oh, it's a 486 with a color graphics card but no sound card", and I learned most of this after I started the hobby.
But actually the biggest significant period of computer development which I missed is the current one. A PC that is 10 years old now can still do most basic stuff I need from it today (and quite literally so). Such a feat would have been impossible if you tried something like a 60 MHz Pentium from 1993 with software made for an Athlon XP 3200+ from 2003. That difference is just insane! Now 10 years of PC evolution seems rather stale to me in comparison.

Punching out around the Core 2 / GeForce 8000 series was good timing. For years and years, my Core 2 / GeForce 8800 GT rig was good enough for everything. I upgraded to an SSD at one point. Then parts would fail, like the mobo and CPU in 2013 or so, and I'd upgrade those. But I almost never felt compelled by its sluggishness to upgrade to the latest and greatest, except for the GPU, which I would upgrade perfunctorily whenever I'd go to play a new game and the framerate couldn't stay above 30. I believe it got upgraded twice over 10 years? Maybe three times?

It wasn't until 2019, 12 years later, that I finally built a brand new computer totally from scratch again. The Zen 2 architecture and the RTX 2000 series cards were just too tempting. A part of me regretted not waiting for the RTX 3000 series cards and the Zen 3 CPUs I knew were around the corner. But that regret was short-lived, as the initial shortages of those parts turned out to be permanent, especially for the GPUs.

Over the intervening years, nothing was exciting about computer hardware: Intel dominating the space, releasing what felt like the same chips over and over again. And I honestly couldn't tell you what major feature advances GPUs made between the GeForce 8000 series and the GTX 1000 series. Speed, obviously. But hard, on-or-off qualitative features? At least none that I noticed. A little cursory research shows the 8000 series was Nvidia's first unified shader architecture, with more generalized programmable cores - a model that didn't change in a revolutionary way until we got AI upscaling and ray tracing.

I ended up using my PC for 10 years, until it died and had to be replaced. Its replacement was not even that much faster, even though its hardware was much younger. It was upgraded a few times though: mostly the graphics card, the memory, and the addition of a couple more hard drives (plus some other minor things like swapping some fans and upgrading the PSU, which of course wouldn't have much effect on performance anyway, except perhaps for adding reliability).

Whats missing in your collections?
My retro rigs (old topic)
Interesting Vogons threads (links to Vogonswiki)
Report spammers here!

Reply 56 of 56, by Tetrium

User metadata
Rank l33t++
Gmlb256 wrote on 2021-09-20, 13:32:
Tetrium wrote on 2021-09-20, 12:37:

The years in which Intel dominated were definitely boring from my perspective. I basically viewed it as a socket-hopping experience, with each consecutive socket offering barely any real-world gain whatsoever. I mean, even the sockets themselves look so similar that it is hard to tell which socket it even is (I'm looking at you, LGA1150/1151/1155/1156! -_- ).
For me it seemed that the differences were less technical, less real-world, and more about marketing, so I was never really motivated to learn about them. AMD definitely had the more interesting socket designs, even though they were lagging behind in the performance department.

Most recent CPU designs have focused on performance per watt and laptops, so it's not surprising that the performance gains have been smaller since Ivy Bridge. Even though x86 CPUs aren't in the smartphone and tablet market anymore, there have been improvements on laptops, which previously used bulky designs.

AMD even used an LGA socket on their Ryzen Threadripper and server CPUs for higher pin density, because a PGA package on these CPUs would have been too big for motherboards.

Another major factor was Intel's 10 nm manufacturing delays, along with their current marketing (comparing themselves to AMD, WTH?), which damaged their reputation in recent years.

Intel's CPUs did seem to have gotten cooler (along with smaller dies), but tbf I was not really following hardware development anymore at that time, so I don't know the details as well as I know those of older generations.

I always liked PGA-socketed motherboards better (i.e. boards that have a socket for a PGA CPU rather than an LGA one). It makes it easier not to end up with a damaged one, and perhaps also easier to identify (especially when looking at bad-quality pics online).

Intel was clearly milking it for years and basically got lazy, complacent and fat.

Whats missing in your collections?
My retro rigs (old topic)
Interesting Vogons threads (links to Vogonswiki)
Report spammers here!