First post, by RetroGeorge
Hey!
I'm a newbie here, so please be gentle 😀
So, I'm following @root42's MS-DOS video series.
It's going quite well, but I ran into a funny thing while coding the episode in which we do the fade_in effect.
It seems that setting VGA mode 0x13 takes some time, and during that switch the CPU keeps executing my code,
so by the time the fade_in effect runs I can only see the end of it, when all the background colors have practically reached their final values.
There's no problem with the fade_out effect, because, as I suspect, it runs long after the mode switch.
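For context, here's roughly the structure of the code in question. It's a minimal sketch in Turbo C style, with fade_in done as the usual DAC palette ramp; the names and details are paraphrased from memory, not copied from the series:

#include <dos.h>

/* Switch video modes via BIOS int 10h, function 0x00. */
void set_mode(unsigned char mode)
{
    union REGS regs;
    regs.h.ah = 0x00;
    regs.h.al = mode;
    int86(0x10, &regs, &regs);
}

/* Ramp the VGA DAC from black up to the target palette.
   target = 256 RGB triplets, 6 bits per component (0..63). */
void fade_in(const unsigned char *target)
{
    int step, i;
    for (step = 0; step <= 63; step++) {
        /* sync each step to the vertical retrace (port 0x3DA, bit 3) */
        while (inportb(0x3DA) & 0x08);     /* wait for current retrace to end */
        while (!(inportb(0x3DA) & 0x08));  /* wait for the next one to start */
        outportb(0x3C8, 0);                /* start writing at DAC index 0 */
        for (i = 0; i < 256 * 3; i++)
            outportb(0x3C9, target[i] * step / 63);
    }
}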
I worked around the problem by putting a
delay(3000);
right after setting mode 0x13.
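With the helpers sketched above, that looks like this (palette being the 768 target DAC bytes; that name is just made up here):

set_mode(0x13);   /* switch to 320x200, 256 colors */
delay(3000);      /* from dos.h: fixed ~3 second wait, which is exactly the ugly part */
fade_in(palette); /* now the whole ramp is actually visible */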
But I would like to know programmatically exactly when I can continue, rather than relying on a fixed delay that may be wrong on other systems.
Also, I'm not sure whether the problem is on the VGA card's side or in my monitor switching modes.
However, it seems that the monitor stays at 720x400@70Hz in both text mode and mode 0x13 (yeah, how current LCD monitors work with old VGA cards is another topic).
So, to sum up: how can I programmatically know when the system is ready for the fade_in effect?