First post, by NautilusComputer
So I grew up with PCs of the late 80's as my earliest computer memories, and I really remember machines of the early 90's much better. I've been catching up on PC history through that time period, and in everything I've read and watched (and searched for) I haven't found a good answer for this question.
I can follow how we got bus speeds from 8 MHz (8.33333 MHz) up - it doubled to 16.67 (16), and doubled to 33.333 (33), and then doubled to 66.67 (66). Doubling the clock is very easy to describe and easy for the general public to understand.
People are very familiar with 'fourths' or 'quarters,' so even 25/50/75 MHz makes enough sense for me to follow why they'd make those - they're numbers that are familiar to people and also easy to follow; and easy to derive from their "binary exponential" brethren - 25 is 16.67x1.5; 50 is 33.33x1.5; 75 is a little weird because it would be 66x1.125, which is a weird multiplier - but 50x1.5 is nice and easy (I guess 50 could be 25x2 OR 16.67x3 as well).
But 83 MHz...I mean, there's no seemingly "good" way to get there. 33x2.5 works, or 66x1.25, or 41.67x2. Is that why it never caught on, because there wasn't an 'even way' to multiply something to get there, and by the time we did have the capability to handle things on what I'll call 'in-between steps,' we had 100 MHz, which could be any or ALL of the above, practically (100x1, 66x1.5, 50x2, 40x2.5, 33x3, 25x4, 20x5, 16.67x6)?
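Just to lay the arithmetic out, here's a quick throwaway sketch that enumerates which (base clock x multiplier) pairs land on the common speeds. The lists of base clocks and multipliers are just the familiar PC-era values I picked for illustration, not anything from a datasheet:

```python
# Which (base clock x multiplier) pairs hit common bus/CPU speeds?
# Bases and multipliers below are illustrative picks, not an authoritative list.
bases = [8.333, 16.667, 25.0, 33.333, 41.667, 50.0, 66.667]
multipliers = [1, 1.25, 1.5, 2, 2.5, 3, 4, 5, 6]
targets = [50.0, 66.667, 75.0, 83.333, 100.0]

for t in targets:
    combos = [(b, m) for b in bases for m in multipliers
              if abs(b * m - t) < 0.1]  # tolerance for the repeating decimals
    print(f"{t:7.3f} MHz: " + ", ".join(f"{b:.3f}x{m}" for b, m in combos))
```

Interestingly, running something like this shows 83.33 isn't as orphaned as it looks (16.67x5 also lands there), while 100 really does collect the most combinations.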
Was it something to do with oscillators just not working at frequencies that fit well with 83? Manufacturing technology just not having tolerances tight enough for it? Something else?
Who was the first to try 83 MHz - as CPU *or* bus speed? When in history was this?
I'd love to read/hear whatever tidbits of light anyone can shed on this.