You know, I have no financial interest in this, and no reason to be engaging in this discussion other than to try to make a gentle correction on something you've got slightly wrong.
For the last time, I don't have anything wrong.
Memory refreshing *IS* a side-effect of accessing memory. Do you know how the PC refreshes its memory with its DMA controller? Yes, by having it periodically read memory.
Clearly the primary reason for accessing memory is to read or write it, not to refresh it. Hence the fact that it refreshes is a side-effect rather than the primary effect of accessing memory.
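That refresh-by-access point can be shown with a toy model (the row/column split and sizes below are illustrative assumptions, not any real part's datasheet figures): a DRAM stays valid as long as every row gets activated within the refresh interval, and any read of an address in a row counts. A sequential video scanout happens to touch every row, so the refresh falls out for free:

```python
# Toy DRAM model: a chip stays refreshed as long as every row is
# "activated" within the refresh interval, and ANY access to an
# address in that row counts. Refresh is a side-effect of access.
# ROWS/COLS are illustrative, not taken from a real datasheet.

ROWS = 128                  # number of distinct row addresses
COLS = 128                  # bytes per row -> 16 KB of cells total

def rows_touched_by_scanout(num_bytes):
    """Rows activated when video hardware reads num_bytes sequentially."""
    touched = set()
    for addr in range(num_bytes):
        touched.add((addr // COLS) % ROWS)   # row = high-order address bits
    return touched

# One frame of sequential scanout over 16 KB of video memory
# activates every row, so no separate refresh cycles are needed.
touched = rows_touched_by_scanout(16 * 1024)
print(len(touched) == ROWS)
```

The same toy model explains the PC's DMA scheme: the dummy reads issued by the DMA controller are just another access pattern that walks the row addresses.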
The fact that you keep insisting that I'm wrong is an insult.
I am relaying to you that the people who designed the machine considered it a design feature. You are flat out rejecting that.
I'm not. I'm saying that I think it's a silly view to have, I'm not denying that they may actually think this.
And I said why it's silly: video chips had been designed so that their scanout access patterns refresh all of video memory, with no need for any additional refreshing, for years before the PCjr arrived. And home computers had been designed to share video memory with system memory, therefore not requiring additional refresh, for years before the PCjr arrived.
So if those IBM engineers are like "Wow, look at this cool feature we've just designed!", using this off-the-shelf Motorola chip which tons of people have used in the exact same manner before them, then yes, I think they're being silly.
Remember, these people conspicuously decided not to share video memory with system memory when they designed the CGA, MDA, and the superior EGA (which was in development at the same time as the PCjr). They broke with their own design pattern to do shared memory.
Yes, and likewise consoles have been using regular PC parts for years, yet they also share video memory with main memory while regular PCs do not.
There's a reason why the original PC didn't share memory... A big part of why the PCjr was such a failure is that sharing makes the whole system slower (at least, the way they did it... a C64, for example, is a completely different story, but hey, they didn't use off-the-shelf parts; you're in the big leagues now). For the regular PC, graphics performance wasn't very important. Heck, they even had a text-only adapter. So it wasn't a good trade-off.
PCjr was aimed more at games and such, so the trade-off would be different.
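The slowdown argument boils down to simple cycle-stealing arithmetic. A minimal sketch with made-up numbers (these are assumptions for illustration, not measured PCjr timings):

```python
# Toy cycle-stealing arithmetic for the shared-memory trade-off:
# when video scanout takes a fixed share of memory cycles, the CPU
# only sees the remainder. Numbers are illustrative assumptions,
# not measured PCjr figures.

bus_cycles_per_second = 1_000_000   # assumed total memory cycle budget
video_share = 0.5                   # assume scanout steals every other cycle

cpu_cycles = bus_cycles_per_second * (1 - video_share)
print(f"CPU gets {cpu_cycles:.0f} of {bus_cycles_per_second} cycles")
```

With dedicated video RAM (`video_share = 0`), the CPU would get the whole budget, which is the trade-off the original PC made.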
By the time a team breaks with two established design patterns and then chooses to create a custom ASIC (the Video Gate Array), I think it's pretty clear that they considered it a design feature, not a side effect.
If that design was actually any good, perhaps...
But it was just a poor copy of what other computers had been doing for years already.
Those IBM 'designs' were some of the worst in the history of computing. Both the PC and the PCjr are laughable designs. But hey, if you suck that much, perhaps you think that this memory refresh solution is actually a 'cool feature'.
I mean, how many systems other than CGA even have the snow problem at all? It's not like the PCjr is the only design to have worked around it, nor is their workaround particularly impressive.
Now please drop the arrogance, and just let it go. You're getting horribly annoying, to the point of being insulting, and I am not impressed in the least.