Grzyb wrote on 2024-04-13, 19:17:
Jo22 wrote on 2024-04-13, 17:41:
Seriously, though, with EGA IBM made some design decisions that I don't understand.
As usual, you fail to understand that computer technology in the 80s was EXPENSIVE. [..]
I know, but I don't consider that an argument. Computing has always been expensive. 🙁
Whoever you ask, everyone complained about computers being expensive at the time.
Original IBM hardware was expensive for its IBM branding alone.
IBM's EGA wasn't ingenious, either, just overly complicated. Just like Micro Channel.
IBM could have gone another route and produced something similar to the Hercules InColor card (EGA's short-lived rival).
Not very backwards compatible (MDA support, but no CGA support), but less complicated to produce.
Anyhow, it was the clones that made the IBM PC successful, I think.
Because IBM did one thing right and decided to give away proper hardware and software documentation for the platform, describing every part.
Not that this was unusual, I think. In the 70s, most appliances still had a schematic inside the owner's manual. TVs and radios, most notably, I think.
It was just a positive and noble move that IBM continued with this tradition in the 80s.
On the other hand, IBM likely had to do that to uphold its good reputation.
Because good documentation and the ability for users to fix
expensive equipment throughout the years was a key element of IBM's reputation.
Other companies didn't care so much about long-lasting products.
That's something that made IBM stand out; probably it was that way because of IBM's typewriter background.
And maybe because IBM never was a company specialized in working with end users/private customers.
IBM rather was a company that did business with other big companies/corporations/governments.
In such an environment, no one was acting small-minded. IBM made equipment and mentioned a price.
If things met the requirements, an agreement was being made. Or something along these lines.
Commodore aka CBM tried to be the same, I guess, but wasn't in practice.
Out of curiosity, I've just read a book, "The Downfall of Commodore", and the things mentioned in there were really bizarre at times.
I never thought that a company, especially one in such a position, could act so unprofessionally at times.
In the book, several situations also showed how being small-minded, too greedy and money-oriented all the time can ruin everything.
Especially the cost-saving measures almost broke the company multiple times (the saying "penny wise and pound foolish" applies here, imho).
Anyway, that's a different story likely and doesn't belong here.
I've intended this thread to be about the Amiga graphics and the creation process of games/graphics, rather than, um, PC vs Amiga. 😉
Speaking of the Amiga, the Amiga 1000, and later on the A2000, also was expensive.
The A1000 had OCS graphics (NTSC model) and 256 KB of RAM initially, for example.
Those 256 KB of RAM were barely enough for anything;
later Amiga software of the late 80s wouldn't run on a stock A1000 unless it was heavily upgraded.
The 256 KB of RAM didn't allow sample-based music, either, because samples need RAM.
The AdLib with its FM synthesis didn't have this issue.
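To put rough numbers on that, here's a back-of-the-envelope sketch (my own illustration with typical MOD-era figures, not something from the thread):

```python
# Rough RAM cost of sampled instruments vs. FM synthesis
# (illustrative figures I picked; 8-bit mono samples as on the Amiga).
sample_rate = 8363          # classic MOD-era playback rate in Hz (approx.)
seconds_per_sample = 1.0    # a short one-second instrument
bytes_per_instrument = int(sample_rate * seconds_per_sample)  # 8-bit = 1 byte/sample

instruments = 15            # the original MOD format allowed 15 samples
total = instruments * bytes_per_instrument
print(total)                # 125445 bytes, about 122 KB of instrument data alone
# That's nearly half of a stock A1000's 256 KB, before code and graphics.
# An AdLib FM patch, by contrast, is just a handful of register bytes per voice.
```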
So the platform wasn't really an exception here. Like the others, it had to fight those high prices, too.
But at least there were two models that turned out to be successful:
A500 for home users/kids and the A2000 for professionals.
The latter was even used in offices, like a real computer.
The graphics were even used for telecommunications.
Like video conferences and such. Very interesting. 😀
A picture of an A2000 in an 80s newspaper can be seen here.
It's an article about "telemedicine", about how doctors could communicate via a fibre connection.
Edit:
megatron-uk wrote on 2024-04-13, 20:44:
Grzyb wrote on 2024-04-13, 19:17:
Yes, it was only a little step forward, but...
those who tried to make a Quantum Leap(TM) instead, failed miserably 🤣
I see what you did there! 🤣
Um, is that a reference to that old TV show of the same name, by any chance? 😁
If so, it aired under a different name where I live (one that translates to "back to the past"). I merely know the English title through sheer coincidence.
Another show, but from the 90s, was Time Trax. Also not bad.
In the place where I live, the title was changed to something that translates to "Time Trax - back to the future".
Edit: There also was the Sinclair QL (Quantum Leap).. Hm.
Edit:
In 1984, it was possible to make something better than EGA, but it wasn't possible to mass-sell it.
EGA was supposed to be a mass-selling product, so compromises were necessary.
But it failed, it never was. Probably because only a few people owned the required EGA monitors?
The original EGA by default only had 64 KB of RAM.
And that was enough to be better than MDA, CGA, and Hercules - and without the need to purchase the new monitor.
Sure, but then what was the purpose of EGA? EGA had worse text output than MDA/Hercules.
I mean, that's exactly the point here. If something is expensive, it should at least be functional.
But stock EGA wasn't. It was simply expensive, without a purpose.
To do anything meaningful, EGA needs its own colour monitor with 350 lines.
That's why multisync monitors and EGA/Super EGA clones -even VGA systems- made EGA popular (postmortem).
Also, EGA had the ability to display its high-resolution mode with insufficient RAM.
It would reduce the colours from 16 to 4 (?), I vaguely remember.
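The framebuffer math backs that memory up (my own back-of-the-envelope sketch, not from any IBM document):

```python
# Rough framebuffer math for EGA's 640x350 high-resolution mode
# (my own sketch; EGA stores pixels as separate 1-bit-per-pixel planes).
WIDTH, HEIGHT = 640, 350
plane_bytes = WIDTH * HEIGHT // 8      # one bitplane at 1 bit per pixel

need_16_colours = 4 * plane_bytes      # 16 colours need 4 bitplanes
need_4_colours = 2 * plane_bytes       # 4 colours need only 2 bitplanes

print(need_16_colours, need_4_colours)  # 112000 56000
# A stock 64 KB card (65536 bytes) can't hold four 350-line planes
# (112000 bytes), but two planes (56000 bytes) fit - hence 4 colours.
```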
Edit: Another point is that IBM was rich, very rich. It didn't have to be so profit-oriented when introducing a new standard.
If you produce something of quality, something that finds favor with the industry, it will pay back manyfold.
At the time, IBM could have lived with smaller profits in the initial period and given PC users time to adopt it.
The original IBM PC had sold for years after its introduction.
Or to quote the founder of Bosch, “It’s better to lose money than trust”. (source)