leileilol wrote on 2026-01-22, 21:02:
This thread is stupid. Thanks ChatGPT and CRT twitter
MattRocks wrote on 2026-01-22, 19:20:
What killed Sega (and almost killed Nintendo) was PC expansion cards changing every few months - their fixed platforms assumed multi-year time horizons, not monthly hardware upgrades.
No. The fatal wound that led to the exit from console hardware happened in 1994. The GeForce2 GTS did not kill the Dreamcast as you are insinuating. The original PlayStation still saw new releases after that point, because clearly there's a market segment that doesn't give a fuck about the 'gamers want 100fps' thing. PC Moore's Law did not endanger consoles. There were still plenty of popular games on multiple platforms locked to 30fps throughout the 2000s.
MattRocks wrote on 2026-01-22, 16:43:
Because LCD behaviour did not break games.
Syncing to refresh will. Try playing Cyber Sphere on a laptop. Also, LCDs won't prevent uncapped-framerate issues and exploits; those can still happen with tearing. YOU DO NOT NEED A CRT (OR EVEN HIGH REFRESH RATES) TO DO THAT. Quarantine can still be as unplayably fast on a mid-90s laptop as it would be on a mid-90s desktop. LCD persistence won't prevent the severe flickering either.
Early PC games having low framerate targets was less about 'inferior LCDs' and more about the ISA bus and timers.
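On the timing point above, no argument from me: the speed bug lives in the game loop, not the display. Here's a minimal C sketch of both patterns (all names here are hypothetical illustrations, not code from Cyber Sphere, Quarantine, or any real game): the first loop advances game logic once per rendered frame, so a faster machine or higher refresh rate literally speeds up gameplay; the second advances logic at a fixed tick from a timer, so CRT vs LCD vs anything else is irrelevant.

```c
#include <time.h>

#define TICK_HZ 35  /* hypothetical fixed logic rate, Doom-era style */

/* Hypothetical stubs so the sketch compiles; a real game fills these in. */
static void update_game(void)  { /* advance the simulation one tick */ }
static void render_frame(void) { /* draw and present one frame */ }

static double now_seconds(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return (double)ts.tv_sec + (double)ts.tv_nsec / 1e9;
}

/* The bug: one logic step per rendered frame. A faster machine, vsync off,
   or a higher refresh rate makes the game itself run faster. Nothing about
   the display technology causes or cures this. */
static void frame_locked_loop(void)
{
    for (;;) {
        update_game();
        render_frame();
    }
}

/* The fix: advance logic at a fixed tick measured by a timer, render as
   fast as you like. Tearing, vsync, CRT, LCD: gameplay speed never changes. */
static void fixed_timestep_loop(void)
{
    const double dt = 1.0 / TICK_HZ;
    double accumulator = 0.0;
    double last = now_seconds();

    for (;;) {
        double t = now_seconds();
        accumulator += t - last;
        last = t;

        while (accumulator >= dt) {  /* catch up in whole ticks */
            update_game();
            accumulator -= dt;
        }
        render_frame();  /* free-running; does not affect game speed */
    }
}

int main(void)
{
    fixed_timestep_loop();  /* swap in frame_locked_loop() to feel the bug */
    return 0;
}
```

On DOS the timer role was played by the 8254 PIT instead of clock_gettime, which is exactly why the ISA-era timer, not the monitor, set the speed. Now, to the rest of it: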
I made no such bizarre insinuation, and you have totally lost the plot: the GeForce2 didn't even exist when SEGA entered its collapse!
SEGA and Nintendo were both pressured by Sony bringing a PC-style ecosystem to consoles.
Sony used CDs to deliver tools that would evolve mid-generation, middleware that would improve year to year, portable engines, etc.
Adding a CD drive to a SEGA or Nintendo console would be a superficial change, and a liability, because the rest of their ecosystems assumed nothing would change at the platform level. Their entire business model relied on the assumption that customers would wait for a whole new platform every ~5 years. Their customers were not competitive gamers.
PCs were the polar opposite: they were adapted continuously, and could even be reconfigured between sessions.
Sony took the principle of continuous adaptation and applied it to consoles so that their PlayStation behaved like a slowly changing PC ecosystem, while SEGA behaved like a sealed appliance. The PlayStation wasn't as adaptable as a PC, but it didn't need to be - it only needed to be more adaptable than a Nintendo/SEGA.
GPUs are relevant here in only one sense: GPU churn ships options first, and some of those options get unlocked or exploited later. That philosophy matches how Sony approached consoles. SEGA took the opposite view. SEGA consoles all had good graphics processors on launch day, but their capabilities are irrelevant - what is relevant is that those capabilities could not be changed after launch day.
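You can see that "ship options first, unlock later" churn directly in PC code of the era: games probed at runtime for capabilities nobody shipped against. A rough sketch of the classic OpenGL 1.x probe (has_extension and choose_render_path are hypothetical names; glGetString(GL_EXTENSIONS) and GL_ARB_multitexture are real):

```c
#include <string.h>
#include <GL/gl.h>

/* Classic OpenGL 1.x probe: ask the driver what it can do *today*, on
   whatever card the user installed after the game shipped. The naive
   strstr substring match was common in 90s code despite its flaws. */
static int has_extension(const char *name)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    return ext != NULL && strstr(ext, name) != NULL;
}

/* Hypothetical render-path selection; assumes a current GL context. */
void choose_render_path(void)
{
    if (has_extension("GL_ARB_multitexture")) {
        /* newer card: single-pass multitexture path */
    } else {
        /* older card: fall back to multiple blended passes */
    }
}
```

A sealed console can't do anything like this: the answer to that query is identical on every unit forever, which is the whole point about fixed capabilities.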
Side note: out of desperation, SEGA did bolt superficial CD drives onto their fixed hardware platforms. Why do that? It was really sad to watch SEGA die, because they were dying with dementia.