Scali wrote:
spiroyster wrote:Just out of curiosity... Do you remember all this, or have to remind yourself? If it's the former, I applaud your synapses.
I remember most of it myself... the 486/Pentium era was when I was maturing as a graphics programmer, and wrote various 3d renderers in assembly myself.
Learning every new architecture was very important at the time, to learn how to best design a renderer around its strengths and weaknesses.
When the Pentium MMX arrived it was quite a big thing in my world.
Up until the PIII/Athlon era more or less, I was still very much involved with microarchitectural optimizations, so I knew these CPUs up close and personal.
After that, things shifted to GPUs and shader programming instead, so I became more GPU-centered, and CPU-specifics became less important. On the other hand, the Core2 was more or less a derivative of the PIII, and Intel has been doing evolutionary steps since, so there wasn't all that much to learn about new CPUs anyway.
Have you heard of FlipCode by chance? That was my mecca as a budding young games developer; then karma probably kicked in and I found my professional self in the realm of CAD, and I've been there ever since S:
I read your blog towards the end of last year, and it is a project that went into the big box of projects to attempt at some point in my life. I was an ardent Amiga user (not developer) for a number of years (right up until about '96, when I got the Pentium, which cost me an arm and a leg at the time), and later in life it was something I always wanted to return to. I've had the AMOS books and a big pile of 'Storm' disks for years, but never found the enthusiasm for it until I read your Amiga blog (nice write-up, btw). Then I read your rundown of the 8088 and the box got bigger. Please stop.
I've been stuck IRL for the past few years, and in a vague attempt to relive some nostalgia, decided to venture here. It has quickly occurred to me that things I remember may have happened +/- 3 years from when I remember them happening S: Which is something I thought I would never be saying. </rant>
In the interests of mitigating thread derailment, and in relation to my Amiga experiences: I too have wondered, for years in fact, why I was only 20-30% less productive on a 7.5 MHz 68K than on a Pentium at ten times the speed (clock for clock). As far as the PC platform goes, I think OS bloat certainly applies. It can creep in through a number of indirect routes: the choice of language and compiler, and heavy use of libraries which are themselves complex due to the rich feature sets many carry these days, to name a few. While the core execution paths of the OS may well be super-optimised for the platform, the same does not necessarily hold for the extended and dynamic resources an application pulls in. As an example, has anyone written a GUI with raw Win32, then the same one in WinForms or even XAML, and compared the differences (or even C# p/invoking the Win32/GDI calls and comparing that)? I would be interested in the results. All three programs essentially achieve the same thing, with slight visual differences but quite a lot of change in what is required under the hood.
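The cost of stacking wrapper layers on top of a native call (the p/invoke-style overhead alluded to above) can be sketched in miniature. This is only an illustrative micro-benchmark in Python, standing in for the Windows-specific comparison; the "wrapper layers" here are made up to mimic the validation and indirection a managed framework adds around a raw C call, not any real framework's code:

```python
import ctypes
import ctypes.util
import timeit

# Load the C runtime directly, analogous to calling a flat native API.
libc = ctypes.CDLL(ctypes.util.find_library("c"))
libc.strlen.argtypes = [ctypes.c_char_p]
libc.strlen.restype = ctypes.c_size_t

DATA = b"hello, world"

def direct():
    """Call the native function with no extra layers."""
    return libc.strlen(DATA)

# Hypothetical "rich" wrapper layers: each one adds a function call
# and a type check, like a managed framework wrapping a native API.
def checked_strlen(s):
    if not isinstance(s, bytes):
        raise TypeError("expected bytes")
    return libc.strlen(s)

def friendly_strlen(s):
    return checked_strlen(s)

def layered():
    """Same result, reached through two wrapper layers."""
    return friendly_strlen(DATA)

if __name__ == "__main__":
    n = 100_000
    t_direct = timeit.timeit(direct, number=n)
    t_layered = timeit.timeit(layered, number=n)
    print(f"direct:  {t_direct:.3f}s")
    print(f"layered: {t_layered:.3f}s ({t_layered / t_direct:.1f}x)")
```

Both paths return the same answer; the layered path just does more work per call. The absolute numbers are tiny for one call, which is exactly why the cost is easy to ignore until it is multiplied across an entire GUI stack.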
Phrases like "premature optimisation is the root of all evil" and "don't re-invent the wheel" have fostered a development culture in which using off-the-shelf components for application development is the norm, with little consideration given to optimisation. In fact, broad platform support is much preferred, and assembly is not portable in most cases, so the requirement for platform-specific optimisation has diminished. I speak from a higher-level application point of view only; this most certainly does not apply to games, where compatibility targets are much more tightly defined and the prerequisites are higher. Compare Office 95 to Office 365 (an application) with something like Doom to whatever the latest release is (a game): 20 years of evolutionary development is far more apparent in the gaming world than in the application world. We have just got lazy as developers (imo, sorry to say). Older developers had a much broader range of skills that could be applied across many disciplines of development; I've seen this diminish over the years as developers become specialised in narrower fields. Given the lack of resources in smaller dev teams and the increasing size of the products they maintain, we need 'jacks of all trades' rather than 'masters'. Add to that the power of today's hardware, combined with (as the OP suggests) our perception of how fast a task should be carried out, and you arrive at the notion of inefficient computing (which probably is present, but rather hard to quantify).
This is all just my 2 cents. Perhaps it is also us, the user, who expects more, because our experience from years ago tells us how long something should take "just as long to do", despite ~20 years of technological difference?