VOGONS


First post, by RetroGeorge

Rank: Newbie

Hey!

I'm a newbie here, so please be gentle 😀

So, I'm following @root42's MS-DOS video series.
It's been going quite well, but I ran into a funny problem while coding the episode in which
we do the fade_in effect.

It seems that setting VGA mode 0x13 takes some time, and during that time the CPU keeps executing my code,
so by the time it reaches the fade_in effect I can only see the end of it - practically the point where all the background colors have already reached their final values.
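For reference, here is a minimal sketch of the kind of fade_in I mean - not the series' actual code, just an illustration assuming Borland-style inportb/outportb from dos.h and a target palette stored as 256 RGB triplets of 6-bit values:

#include <dos.h>   /* inportb, outportb (Borland/Turbo C) */

#define INPUT_STATUS_1  0x3DA
#define DAC_WRITE_INDEX 0x3C8
#define DAC_DATA        0x3C9

/* Wait for the start of the vertical retrace so palette writes don't tear. */
void wait_vretrace(void)
{
    while (inportb(INPUT_STATUS_1) & 0x08);    /* wait until any current retrace ends */
    while (!(inportb(INPUT_STATUS_1) & 0x08)); /* wait for the next retrace to begin */
}

/* Ramp the DAC from all black up to target_pal (256 * 3 bytes, values 0..63). */
void fade_in(const unsigned char *target_pal)
{
    int step, i;
    for (step = 0; step <= 63; step++) {
        wait_vretrace();
        outportb(DAC_WRITE_INDEX, 0);
        for (i = 0; i < 256 * 3; i++)
            outportb(DAC_DATA, (unsigned char)(target_pal[i] * step / 63));
    }
}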

No problems with the fade_out effect, because, as I suspect, it runs long after the VGA mode has been set.
I worked around the problem by adding a

delay(3000);

call right after setting mode 0x13 (see the sketch further down).
But I would like to know programmatically exactly when I can continue, rather than relying on a fixed delay that could be wrong on another system.
I'm also not sure whether the problem is on the VGA card's side or in my monitor's mode switching.
It does seem that the monitor runs at 720x400@70Hz in both text mode and mode 0x13 - yeah, how current LCD monitors behave with old VGA cards is another topic.
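For completeness, a minimal sketch of the workaround described above, assuming Borland/Turbo C (int86 and delay from dos.h); the function name is my own, not from the series:

#include <dos.h>   /* int86, delay, union REGS */

/* Set a VGA video mode via INT 10h, AH=00h. */
void set_video_mode(unsigned char mode)
{
    union REGS regs;
    regs.h.ah = 0x00;
    regs.h.al = mode;
    int86(0x10, &regs, &regs);
}

int main(void)
{
    set_video_mode(0x13);  /* 320x200, 256 colors */
    delay(3000);           /* crude workaround: give the monitor time to re-sync */
    /* ... draw the background, then run the fade_in effect ... */
    set_video_mode(0x03);  /* back to text mode before exiting */
    return 0;
}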

So, to sum up: how can I programmatically know that the system is ready for the fade_in effect?

Reply 1 of 1, by vstrakh

Rank: Member
RetroGeorge wrote on Yesterday, 17:41:

So, to sum up: how can I programmatically know that the system is ready for the fade_in effect?

There is no such thing.
The code that sets up the VGA registers when changing the video mode takes a negligible amount of time. All the rest is spent by your monitor detecting the signal parameters, locking onto the frequencies and pixel phases, etc. CRTs were faster: being analog and knowing nothing about pixels, they had little to worry about. They would still blank the beam while the horizontal frequency changed, and the VGA card would pause signal generation until the mode was fully set, but that time was much shorter.