386SX wrote on 2022-03-27, 11:17:
I agree that if we compare a common '80s CRT TV to a modern 4K TV panel there's not much sense in the comparison, but in the end they both do the same task from a conceptual point of view. I'm not saying the progress that led to the 4K OLED TV panel isn't welcome; of course they are awesome from any technical point of view. But in the end I suppose the average consumer will not even configure that TV for the native source resolution. Maybe they feed it some old 576i or 1080i signal and just say "..colors are saturated and the image is brighter than ever..", but do they use that panel for something it would deserve to be used for? Are average consumers real cinema fans with, for example, hundreds of original genuine releases that would benefit from that panel? Or are disc-based movies even still a common option, instead of the "on demand" modern TV services (which of course means not physically owning anything, if that's progress)?
Then, I understand that even in the '90s PCs got upgraded quickly, but they were the real "future" in those days, and I understand that we all probably felt that upgrading them, beyond any real need, was interesting. But I also remember keeping my cheap second-hand 80386SX machine from 1994/95 until late 1998/99, and I didn't even care about tech upgrades in those times. I jumped from MS-DOS/Win 3.1 to a Win 98 K6-2 machine, but without a real need, just like nowadays I see people buy an entire new TV just because of the DVB-T2 (or whatever) switch, when a $20/€20 decoder would do the same.
TV sets:
It depends: the European PAL standard at 576i was quite a bit better than NTSC. Compared to mature CRT screens, the first flatscreens were more compact, had lower power consumption and higher resolution, and didn't flicker. In every other respect (lighting, contrast, colors, motion) they were abysmal. But as you wrote: both served the same concept, and both were perfectly sufficient to let children rot away in stupid ignorance in front of them.
From a mere production point of view, if you still were to buy a factory new TV set in 2022, for whatever strange reason, it might as well be 4k. But,
- how much of today's real-world content was actually produced anywhere near 4K quality, never mind the distribution format? Most cinema reels in the 1980s were closer to 480p.
- how many users will actually be able to tell the difference between 1080p and 4K, per se or even on a large 50/50 split screen? AFAIK, beyond a fairly short viewing distance, 4K is already past the average resolution of the human eye.
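To put a rough number on that last point: here's a back-of-the-envelope sketch (my own assumed figures, not from any standard) using the common ~1 arcminute estimate for the eye's resolving power and a hypothetical 50" 16:9 panel. It estimates the distance beyond which a single pixel becomes too small to resolve:

```python
import math

# Assumption: ~1 arcminute is a commonly quoted estimate of the
# angular resolution of the average human eye.
ARCMIN = math.radians(1 / 60)  # ~2.9e-4 rad

def max_useful_distance_m(diag_inch, h_pixels, aspect=(16, 9)):
    """Distance (meters) beyond which individual pixels subtend
    less than one arcminute and can no longer be resolved."""
    w, h = aspect
    width_m = diag_inch * 0.0254 * w / math.hypot(w, h)  # panel width
    pixel_pitch_m = width_m / h_pixels                   # one pixel's width
    return pixel_pitch_m / ARCMIN  # small-angle approximation

for name, px in (("1080p", 1920), ("4K", 3840)):
    print(f'50" {name}: pixels blend together beyond '
          f"~{max_useful_distance_m(50, px):.1f} m")
```

By this crude metric, on a 50" set the 4K pixel grid vanishes at about one meter and the 1080p grid at about two; sit farther back than that and the split screen should look identical, which matches the point above.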
More importantly, yes - initial color TV broadcasting was a signal that piggybacked on the legacy monochrome one, backwards compatible.
Because many people would not have bought a new TV set just to keep watching.
That spirit has passed.
If anything, there will be a new BS DVB standard every few years, so that people will buy a new TV set or set-top box.
And that is the real "planned obsolescence" of IT and entertainment electronics.
The other, conventional kind of "planned physical failure" after "warranty time plus ten minutes"...
- I'd say that's rather tough to pull off in stuff without moving parts. By contrast, dedicated, highly specialized engineers have, since the 1970s, made great strides in ensuring that a mid-range internal combustion car engine will produce some kind of "total write-off" after 150k to 200k km.
But a semiconductor? Nope.
First-hand account from a dude who designs molds for plastic case parts, during a seminar: That idea of "chrome plating on plastic" for mobile devices, that's popular for a reason. It will always chip off after a certain time. And it will look crappy AF.
What kills off devices is neither age nor mileage; it's the user's compulsive consumption and FOMO.
BTW, the light bulb story is probably a rather bad example of planned obsolescence.
Yes, there may have been some fishy things in the past, and quality varied.
But most of the claims out there are folklore and conspiracy tales.
Any incandescent light is a compromise between longevity and energy efficiency, subject to the rules of thermodynamics and black-body radiation.
Want them to last forever? Put two of them in series. They will hardly ever burn out, but together they will also only deliver about half of the usable light that one of them alone produces at normal voltage.
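For the curious, the series trick can be sketched with the rule-of-thumb scaling exponents often quoted for incandescent lamps (light output ~ V^3.4, power ~ V^1.6, filament life ~ V^-13; these are rough engineering approximations, not exact values). If anything, they suggest the light penalty is even harsher than "about half":

```python
# Back-of-the-envelope: two identical incandescent bulbs in series,
# so each one sees roughly half its rated voltage.
# Assumed rule-of-thumb exponents (approximate, widely quoted):
#   luminous flux ~ (V/V_rated)^3.4
#   power draw    ~ (V/V_rated)^1.6
#   filament life ~ (V/V_rated)^-13
def bulb_in_series(v_fraction=0.5):
    light = v_fraction ** 3.4   # relative light output per bulb
    power = v_fraction ** 1.6   # relative power draw per bulb
    life = v_fraction ** -13    # relative filament life per bulb
    return light, power, life

light, power, life = bulb_in_series()
print(f"Two bulbs together: {2 * light:.0%} of one bulb's light "
      f"at {2 * power:.0%} of one bulb's power; "
      f"each filament lasts roughly {life:.0f}x longer")
```

With these assumed exponents the pair delivers under 20% of a single bulb's light while the filaments last thousands of times longer, which is exactly the longevity-versus-efficiency trade-off described above.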
Name-brand light bulbs have for many years clearly stated data which could be confirmed independently: a certain number of watts, lumens and hours, 1000 hours for conventional bulbs, 2000 for halogen.
There would have been quite a market for longer-lasting ones, because in many commercial settings the effort of changing a bulb far exceeded its sales price.
Meanwhile, if I'm not mistaken, there actually were printers that had a preset page limit in their firmware, without the CEO ending up blindfolded against a wall...