VOGONS


Reply 60 of 161, by MattRocks

User metadata
Rank Member
BitWrangler wrote on 2026-01-22, 16:24:

By 1997, total PC sales were about 100 million units, but the top 10 laptop vendors hadn't even shipped tens of millions; it's hard to nail down, but something like 5 million is a reasonable guess. Game companies were just not going to consider a platform with less than double-digit market penetration. They barely considered Macs...

Including laptops is not targeting laptops. Once you accept that distinction, the market-share argument simply collapses.

What is really going on is that game publishers don't want support calls or complaints. If their game might trigger support calls from laptop owners, they will print a big warning that stops that from ever happening. And if the game works on a laptop, they'll happily take the sale.

Last edited by MattRocks on 2026-01-22, 16:37. Edited 1 time in total.

Reply 61 of 161, by BitWrangler

User metadata
Rank l33t++

No it doesn't. It's gonna cost them money and gain them nothing due to tiny market share. There was never a "don't suck on LCD" box they forgot to check.

Unicorn herding operations are proceeding, but all the totes of hens teeth and barrels of rocking horse poop give them plenty of hiding spots.

Reply 62 of 161, by Shagittarius

User metadata
Rank Oldbie

This whole thread reminds me of a long-lost joy: the joy of getting a CPU/video card upgrade and going back to play games at higher settings. What fun I used to have revisiting old titles and seeing them get the crap kicked out of them by new hardware. It seems to me that joy is forever gone, as all games now target current hardware for their maximum settings. Gone are the days of Crysis and others. I think it's an admission that you can't sell games profitably over an extended time frame anymore. Unless of course you're doing GaaS (games as a service), which I hate just as much - the shoe-horning of all genres into dumb online implementations for subscription fees. Just give me a good game that's complete as a standalone and let me finish it.

Too bad, yet another great thing gone by the wayside in the modern gaming world.

/oldmanrant off

Last edited by Shagittarius on 2026-01-22, 16:44. Edited 1 time in total.

Reply 63 of 161, by MattRocks

User metadata
Rank Member
BitWrangler wrote on 2026-01-22, 16:34:

No it doesn't. It's gonna cost them money and gain them nothing due to tiny market share. There was never a "don't suck on LCD" box they forgot to check.

Because LCD behaviour did not break games.

It doesn't cost them anything to sell an extra 5M copies.

However, it will cost them if they DO NOT sell those copies. It will cost them even more if the most popular kid in school says their business sucks because it won't support their one fancy Alienware laptop!

Reply 64 of 161, by MattRocks

User metadata
Rank Member
Shagittarius wrote on 2026-01-22, 16:41:

This whole thread reminds me of a long-lost joy: the joy of getting a CPU/video card upgrade and going back to play games at higher settings. What fun I used to have revisiting old titles and seeing them get the crap kicked out of them by new hardware. It seems to me that joy is forever gone, as all games now target current hardware for their maximum settings. Gone are the days of Crysis and others. I think it's an admission that you can't sell games profitably over an extended time frame anymore. Unless of course you're doing GaaS (games as a service), which I hate just as much - the shoe-horning of all genres into dumb online implementations for subscription fees. Just give me a good game that's complete as a standalone and let me finish it.

Too bad, yet another great thing gone by the wayside in the modern gaming world.

/oldmanrant off

Very true.

PC gaming was entirely unique. Pre-release orders, pirated copies, trading, social gatherings, technical knowledge, historical continuity - all gone with authenticated subscriptions.

And I suspect subscriptions will ultimately starve the industry and drive people back to consoles, including Xbox. Microsoft knew exactly what it was doing when it began to dismantle PC gaming, but that's just sales optimisation.

For me, the more interesting detail is that when Microsoft saw that a Windows PC could defeat gaming consoles like SEGA's, that's when Microsoft realised it owned gaming - and what happened next was deeply strategic.

Reply 65 of 161, by douglar

User metadata
Rank l33t
MattRocks wrote on 2026-01-22, 16:49:

For me, the more interesting detail is that when Microsoft saw that a Windows PC could defeat gaming consoles like SEGA's, that's when Microsoft realised it owned gaming - and what happened next was deeply strategic.

Windows only owned gaming as long as the technological advancement was so fast that console developers couldn't keep up with PC hardware. When PC hardware was 4x-10x more powerful, the Windows games were going to have an advantage. That's where the cutting edge was going to be.

These days, you need to spend over $1000 just to match the power of a PS5, and the percentage of computers in the wild that are legitimately more powerful than a $400 Switch 2 isn't large enough to lead the market. I feel like PC games have become an afterthought for hardware reasons now that single-core CPU performance stalled with the Ryzen 5xxx CPUs and mainstream graphics performance stalled after the Nvidia 3xxx GPUs.

Things stalled 5-6 years ago. Consoles caught up.

Reply 66 of 161, by Unknown_K

User metadata
Rank Oldbie

There are still massive hardware ranges for people who game on PC (CPU/RAM/storage/GPU), but a console is a 100% known hardware platform you can optimize for, which is why consoles caught up.

I think quite a few people play older games on PC, esports titles that run on potato hardware, and that weird indie stuff. People running top hardware for AAA titles on release day are probably a minority now, with costs as they are.

Collector of old computers, hardware, and software

Reply 67 of 161, by MattRocks

User metadata
Rank Member
Unknown_K wrote on 2026-01-22, 18:52:

There are still massive hardware ranges for people who game on PC (CPU/RAM/storage/GPU), but a console is a 100% known hardware platform you can optimize for, which is why consoles caught up.

This aligns closely with my thoughts. The main difference is that, in my view, Microsoft owns a console platform and service, meaning Microsoft wants games on that console first - not necessarily on Windows!

The biggest threat to Xbox is the reconfigurable IBM PC re-emerging as a good gaming platform. That is why, in my opinion, Microsoft Windows withdrew support for hardware audio mixing and other areas of competitive innovation that fixed hardware platforms like Xbox cannot accommodate.

What killed Sega (and almost killed Nintendo) was PC expansion cards changing every few months - their fixed platforms assumed multi-year time horizons, not monthly hardware upgrades.

douglar wrote on 2026-01-22, 17:58:

Windows only owned gaming as long as the technological advancement was so fast that console developers couldn't keep up with PC hardware.

Yes, but the way I see it, it was Microsoft that actively clamped down on those technological advancements.

Microsoft killed hardware audio mixing on Vista+. Through Xbox contracts, Microsoft heavily re-engineered NVidia’s FX microcode and slowed GPU competition. Those changes were both directly attributable to Microsoft’s strategies. There may be many more examples.

Reply 68 of 161, by douglar

User metadata
Rank l33t
MattRocks wrote on 2026-01-22, 19:20:

Yes, but the way I see it, it was Microsoft that actively clamped down on those technological advancements.

Microsoft killed hardware audio mixing on Vista+. Through Xbox contracts, Microsoft heavily re-engineered NVidia’s FX microcode and slowed GPU competition. Those changes were both directly attributable to Microsoft’s strategies. There may be many more examples.

Microsoft's influence seems extremely minor. If AMD or Nvidia could make a faster GPU that didn't consume 250 watts, they would. And it's not like Intel's Arc GPUs are affordable, available, or even especially desirable. The limits are more related to the laws of physics than to market segmentation.

10 years ago, you could buy that year's flagship GPU for $500, last year's flagship at $200, and a viable product for $80, and play a AAA game. Today, a 6-year-old refurb flagship GPU costs almost $500, and a minimally viable product is over $300 if you want to buy new. That's where we are today. PC gaming just isn't a mainstream product anymore. Devs will treat it accordingly.

Reply 69 of 161, by leileilol

User metadata
Rank l33t++

This thread is stupid. Thanks ChatGPT and CRT twitter

MattRocks wrote on 2026-01-22, 19:20:

What killed Sega (and almost killed Nintendo) was PC expansion cards changing every few months - their fixed platforms assumed multi-year time horizons, not monthly hardware upgrades.

No. The laceration to exit from console hardware happened in 1994. The GeForce2 GTS did not kill the Dreamcast as you are insinuating. The original PlayStation still saw new releases after that point, because clearly there's a market segment that doesn't give a fuck about the 'gamers want 100fps' thing. PC Moore's Law did not endanger consoles. There were still plenty of popular games on multiple platforms locked to 30fps throughout the 2000s.

MattRocks wrote on 2026-01-22, 16:43:

Because LCD behaviour did not break games.

Syncing to refresh will. Try playing Cyber Sphere on a laptop. Also, LCDs won't prevent uncapped-framerate issues and exploits; they can still be done with tearing. YOU DO NOT NEED A CRT (OR EVEN HIGH REFRESH RATES) TO DO THAT. Quarantine can still be as unplayably fast on a mid-90s laptop as it would be on a mid-90s desktop. LCD persistence won't prevent the severe flickering either.

Early PC games' low framerate targets were less about 'inferior LCDs' and more about the ISA bus and timers.
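
To illustrate the timer point, here's a minimal sketch, invented for this post (clock() stands in for the 8253/8254 PIT a DOS game would actually program): a loop that only advances the world when a timer says so runs at the same speed on any CPU, while the uncapped version speeds up with the hardware - CRT or LCD makes no difference.

    /* Minimal sketch, invented for this post: why uncapped game loops run
       unplayably fast on faster hardware, and how a timer cap fixes it.
       clock() stands in for the 8253/8254 PIT a DOS game would program. */
    #include <stdio.h>
    #include <time.h>

    #define TICK_HZ 70  /* hypothetical target: one world update per 70 Hz frame */

    int main(void)
    {
        const clock_t step = CLOCKS_PER_SEC / TICK_HZ;
        clock_t next = clock() + step;
        int ticks = 0;  /* stand-in for game state */

        while (ticks < TICK_HZ) {  /* simulate one second, then stop */
            /* Uncapped version: advancing the world HERE, every iteration,
               ties game speed directly to CPU speed - a 10x faster CPU
               means a 10x faster game, on any display. */

            /* Capped version: only advance when the timer says so. */
            if (clock() >= next) {
                next += step;
                ticks++;  /* update_world() would go here */
            }
        }
        printf("advanced %d ticks in ~1 second\n", ticks);
        return 0;
    }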

long live PCem

Reply 70 of 161, by MattRocks

User metadata
Rank Member
douglar wrote on 2026-01-22, 19:50:
MattRocks wrote on 2026-01-22, 19:20:

Yes, but the way I see it, it was Microsoft that actively clamped down on those technological advancements.

Microsoft killed hardware audio mixing on Vista+. Through Xbox contracts, Microsoft heavily re-engineered NVidia’s FX microcode and slowed GPU competition. Those changes were both directly attributable to Microsoft’s strategies. There may be many more examples.

Microsoft's influence seems extremely minor. If AMD or Nvidia could make a faster GPU that didn't consume 250 watts, they would.

Microsoft is a software company. NVidia is a hardware company.

The NVidia FX series underperformed not because of the hardware, but because of an extra instruction-translation layer that Microsoft added to abstract reprogrammable shader logic and present an interface that is easier for general software engineers to exploit.

The FX5200 is the exemplar of crippled slowness because Microsoft's architecture required the FX to hit the CPU with extra translation work, converting DX bytecode to NV internal instructions.
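
To make that concrete, here is a purely illustrative sketch of what a CPU-side translation pass looks like. Every opcode and struct below is invented - this is not real DX bytecode or NV microcode - but it shows the kind of per-instruction CPU work an abstraction layer adds before the GPU ever sees the shader:

    /* Purely illustrative: invented opcodes, NOT real DX bytecode or NV
       microcode. The point is only that each API instruction costs extra
       CPU work to rewrite into the hardware's native form. */
    #include <stddef.h>
    #include <stdint.h>

    enum dx_op { DX_MOV, DX_ADD, DX_MUL };  /* hypothetical API bytecode */
    enum nv_op { NV_MOV, NV_FMA };          /* hypothetical native ops   */

    typedef struct { uint8_t op, dst, src0, src1; } dx_instr;
    typedef struct { uint8_t op, dst, a, b, c; }    nv_instr;

    /* Rewrite API instructions into native form on the CPU. A real driver
       would also re-allocate registers, reschedule, and patch constants. */
    static size_t translate(const dx_instr *in, size_t n, nv_instr *out)
    {
        size_t emitted = 0;
        for (size_t i = 0; i < n; i++) {
            switch (in[i].op) {
            case DX_MOV:
                out[emitted++] = (nv_instr){ NV_MOV, in[i].dst, in[i].src0, 0, 0 };
                break;
            case DX_ADD: /* add = fma(a, 1.0, b); pretend reg 255 holds 1.0 */
                out[emitted++] = (nv_instr){ NV_FMA, in[i].dst, in[i].src0, 255, in[i].src1 };
                break;
            case DX_MUL: /* mul = fma(a, b, 0.0); pretend reg 254 holds 0.0 */
                out[emitted++] = (nv_instr){ NV_FMA, in[i].dst, in[i].src0, in[i].src1, 254 };
                break;
            }
        }
        return emitted; /* every pass through here is CPU time, not GPU time */
    }

    int main(void)
    {
        dx_instr shader[] = { { DX_MUL, 0, 1, 2 }, { DX_ADD, 0, 0, 3 } };
        nv_instr native[2];
        return (int)translate(shader, 2, native); /* emits 2 native instrs */
    }

A toolchain that compiles straight to the native format pays this cost once, offline; a translation layer pays it on the host, which is exactly the kind of overhead a slow CPU exposes.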

That reflects power and control. Microsoft wanted game studios to target Microsoft APIs, not NVidia APIs. NVidia just wanted to survive its XBox contract.

Microsoft’s influence on the PC gaming market was huge. It effectively stalled IBM PC market competition and focussed gaming studios on Microsoft’s XBox roadmap.

NVidia survived/adapted, but that all comes later.

Reply 71 of 161, by MattRocks

User metadata
Rank Member
leileilol wrote on 2026-01-22, 21:02:

This thread is stupid. Thanks ChatGPT and CRT twitter

MattRocks wrote on 2026-01-22, 19:20:

What killed Sega (and almost killed Nintendo) was PC expansion cards changing every few months - their fixed platforms assumed multi-year time horizons, not monthly hardware upgrades.

No. The laceration to exit from console hardware happened in 1994. The GeForce2 GTS did not kill the Dreamcast as you are insinuating. The original PlayStation still saw new releases after that point, because clearly there's a market segment that doesn't give a fuck about the 'gamers want 100fps' thing. PC Moore's Law did not endanger consoles. There were still plenty of popular games on multiple platforms locked to 30fps throughout the 2000s.

MattRocks wrote on 2026-01-22, 16:43:

Because LCD behaviour did not break games.

Syncing to refresh will. Try playing Cyber Sphere on a laptop. Also, LCDs won't prevent uncapped-framerate issues and exploits; they can still be done with tearing. YOU DO NOT NEED A CRT (OR EVEN HIGH REFRESH RATES) TO DO THAT. Quarantine can still be as unplayably fast on a mid-90s laptop as it would be on a mid-90s desktop. LCD persistence won't prevent the severe flickering either.

Early PC games' low framerate targets were less about 'inferior LCDs' and more about the ISA bus and timers.

I made no such bizarre insinuation and you have totally lost the plot: GeForce2 didn’t exist when SEGA entered collapse!

SEGA and Nintendo were both pressured by Sony bringing a PC-style ecosystem to consoles.

Sony used CDs to deliver tools that would evolve mid-generation, middleware that would improve year to year, portable engines, etc.

Adding a CD drive to a SEGA or Nintendo console could only ever be a superficial change, and a liability, because the rest of their ecosystems assumed nothing would change at the platform level. Their entire business model relied on the assumption that customers would wait for a whole new platform every ~5 years. Their customers were not competitive gamers.

PCs were the polar opposite. PCs were adapted continuously, and could even be reconfigured between sessions.

Sony took the principle of continuous adaptation and applied it to consoles so that their PlayStation behaved like a slowly changing PC ecosystem, while SEGA behaved like a sealed appliance. The PlayStation wasn't as adaptable as a PC, but it didn't need to be - it only needed to be more adaptable than a Nintendo/SEGA.

GPUs are relevant here only if you consider how GPU churn ships options first, with some options unlocked/exploited later. That philosophy matches how Sony approached consoles. SEGA took the opposite view. SEGA consoles all had good graphics processors on launch day, but their capabilities are irrelevant - what is relevant is that those capabilities could not be changed after launch day.

Side note: Out of desperation, SEGA did add superficial CD drives to their fixed hardware platforms. Why do that? It was really sad to watch SEGA dying because they were dying with dementia.

Reply 72 of 161, by BitWrangler

User metadata
Rank l33t++

BS. Pilotwings, 1990: Nintendo put a math-acceleration chip in the cartridge.

edit: Also, if you had the first clue what you were talking about, you would know that Sega introduced CD drives 3 years before the release of the PlayStation.

Unicorn herding operations are proceeding, but all the totes of hens teeth and barrels of rocking horse poop give them plenty of hiding spots.

Reply 73 of 161, by bakemono

User metadata
Rank Oldbie

Wow. That is a whole lot of wrongness that bears no resemblance to the history that many posters on this board lived through.

GBAJAM 2024 submission on itch: https://90soft90.itch.io/wreckage

Reply 74 of 161, by MattRocks

User metadata
Rank Member
BitWrangler wrote on 2026-01-23, 15:20:

BS. Pilotwings, 1990: Nintendo put a math-acceleration chip in the cartridge.

edit: Also, if you had the first clue what you were talking about, you would know that Sega introduced CD drives 3 years before the release of the PlayStation.

I keep saying "Slow down" because I know a lot of people don't take the time to read properly before writing erroneous responses.

I know exactly what I am talking about: on a SEGA, the CD was used only for data content. On a PC, and on a PS1, the CD could also be used to deploy platform updates - that is the key difference on which you need to slow down and read again.

Reply 75 of 161, by MattRocks

User metadata
Rank Member
bakemono wrote on 2026-01-23, 16:29:

Wow. That is a whole lot of wrongness that bears no resemblance to the history that many posters on this board lived through.

Really? I'm pretty sure SEGA exited the hardware console industry on every planet - even mine! 😉

I didn't own a Dreamcast. I just watched the sale prices slide week-on-week.

Reply 76 of 161, by BitWrangler

User metadata
Rank l33t++

You are doing it again, looking at what the PlayStation was in the 2000s and projecting it back to 1994. NO!

You are getting cause and effect totally backwards; nobody was thinking about little Johnny's Alienware that might exist in five years' time when they were programming things in the mid-1990s.

Unicorn herding operations are proceeding, but all the totes of hens teeth and barrels of rocking horse poop give them plenty of hiding spots.

Reply 77 of 161, by MattRocks

User metadata
Rank Member
BitWrangler wrote on 2026-01-23, 16:39:

You are doing it again, looking at what the PlayStation was in the 2000s and projecting it back to 1994. NO!

You are getting cause and effect totally backwards; nobody was thinking about little Johnny's Alienware that might exist in five years' time when they were programming things in the mid-1990s.

That's a deal! I should stop thinking of the PS1 as PC-like. The last console I owned was a Genesis. I never even used a PS1.

Reset to facts: SEGA and Nintendo took a beating in the market because of PCs and the PS1 - that much is absolutely correct. The PS1 targeted an older age group - basically people who grew up with Nintendo and had outgrown it. That is the reality that needs explaining. It had nothing at all to do with Nvidia, and I didn't suggest it had.

Reply 78 of 161, by BitWrangler

User metadata
Rank l33t++

The way in which the early PlayStation was PC-like was in how its CPU, graphics, and memory were arranged, such that development (which Sony was also more open about) became more like PC development, rather than the very closed, our-way-or-the-highway development that was the norm on Nintendo and Sega. The PC-ness of having an operating system, driver upgrades, and persistent storage didn't come along until later. While this openness may have attracted a lot of developers in the 90s, they found themselves walled in again in the 2000s through different methods, like the online-store model and increasingly sophisticated encryption and authentication.

Conceptually, independent devs were broken out of Sega/Nintendo jail and allowed to run free until Sega and Nintendo suffered from a lack of titles, then were rounded up and imprisoned in new compounds. They were somewhat "free" on the PC until the 2010s, when Microsoft began to put more limits and conditions on what was and wasn't allowed to run on Win10++. Hence breakout attempts like the Steam Deck. However, I think we need to see hardware costs plummet before the freer, more open platform cycle repeats, as locked-in but subsidised hardware is going to look more attractive in the short to mid term.

Unicorn herding operations are proceeding, but all the totes of hens teeth and barrels of rocking horse poop give them plenty of hiding spots.

Reply 79 of 161, by RetroPCCupboard

User metadata
Rank Oldbie

Is this thread supposed to be "Rage Bait"? If so, then mission accomplished! If not, then poor OP is taking a beating for trying to start a discussion.