Just noticed something.
None of the Sharp MZ series computers used an i8088.
They all used full versions of the processors, be it Z80, i8086 or iAPX 286.
That's what I meant to say: the i8088 was sort of a mistake (technically, not economically). But it wasn't the only one.
The i8080 and i8085 are inferior to the Z80 and Z80 compatibles/derivatives/supersets.
That's why some of the Intel developers who worked on the 8080 left to found Zilog and created the Z80, the processor the 8080 originally was meant to be.
Edit: That's why I believe that the NEC V20 was the best thing that ever happened to the IBM PC.
It compares to the 8088 like a Z80 compares to the i8080.
Both the Z80 and the NEC V20 are much more elegant/intelligent in their design than their predecessors.
The V20/V30 can even replace the 8080 functionally, thanks to their built-in 8080 emulation mode.
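For anyone curious how that mode is used: a native program switches the V20 into 8080 emulation with the BRKEM instruction, and the emulated 8080 code returns with RETEM. This is just a rough sketch from memory of the NEC µPD70108 documentation, untested on real hardware; the vector number 80h is my own arbitrary choice:

```
; NEC V20/V30 only - entering/leaving 8080 emulation mode (sketch, untested)
; native 8086 side: interrupt vector 80h must point at the 8080 program
        brkem   80h             ; switch to 8080 mode, jump via vector 80h

; 8080 side, at the end of the emulated program:
        db      0EDh, 0FDh      ; RETEM - leave 8080 mode, resume 8086 code
```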
Edit: I forgot to mention, some of the 16-Bit MZ models ran MS-DOS, too.
There's even a video, albeit about an 80286 model.
Edit: Or let me put it this way, I wonder if the IBM PC was designed to be weak on purpose.
When the IBM PC 5150 was released, IBM was doing well with its minicomputer business etc.
If the IBM PC was any more powerful than it was, say by using an 8086 or 68000, it would have been a threat to the other business fields.
Ok, maybe I'm just imagining things, but the development of Windows was similarly affected.
Both Windows 2.x and OS/2 1.1 shared the same visuals.
Same goes for Windows 3 and OS/2 1.2 and 1.3, with the difference that OS/2 got the new GUI earlier.
So why did Windows 3.0 look so crude/depressing in comparison to OS/2 at the time?
I think that was intentional, due to pressure from IBM.
IBM was afraid of Windows stealing the show, so to speak.
So Windows 3 was allowed to be an upgrade over Windows 2, but at the same time had to look inferior to OS/2.
Then, after the split-up, Windows 3.0 MME and Windows 3.1 were allowed to look much friendlier.
Edit: Okay, so how does all of this relate to the thread's topic?
Well, I just wonder if the IBM PC truly deserves to be considered a reference for how to build a good x86 PC.
It's a nostalgic and historically relevant piece of hardware, but was it ever "leading edge"?
Edit: I hope I'm not making people mad by thinking out loud here, but I believe that without this question being answered, it's hard to think about designing a (new) 8086 PC.
Edit: I forgot to mention, I'm a young XT owner, too.
I have a Siemens Nixdorf M35 8810 PC with a very large, old mainboard and a Hitachi CRTC for its internal CGA.
I've upgraded it with a V20, a second graphics card (Hercules) and really love that old behemoth. ^^
So it's not that I hate XTs or anything. It's just that I find it interesting to get rid of bottlenecks.
A new, all 16-Bit PC wouldn't lose its PC personality whatsoever.
All the XT style components would still be there, software would still see an XT.
If done with care, the original IBM PC BIOS could mostly run unmodified on such hardware (maybe with a V20 patch).
CGA cards could still be used, too.
Even better, a new 16-Bit CGA card could be designed, this time with dual-ported RAM.
The timings of such a PC would be above original XT timings, of course.
But maybe a "Turbo Button" functionality could be implemented. Wait states, a second crystal oscillator etc., not sure.
"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel
//My video channel//