VOGONS

What's the difference between...


First post, by Duffman


What's the difference between a linear framebuffer and a banked framebuffer?

MB: ASRock B550 Steel Legend
CPU: Ryzen 9 5950X
RAM: Corsair 64GB Kit (4x16GB) DDR4 Veng LPX C18 4000MHz
SSDs: 2x Crucial MX500 1TB SATA + 1x Samsung 980 (non-pro) 1TB NVMe SSD
OSs: Win 11 Pro (NVMe) + WinXP Pro SP3 (SATA)
GPU: RTX2070 (11) GT730 (XP)

Reply 2 of 57, by Schadenfreude

Harekiet wrote:

speed

Yes, speed!!

But the reason there's such a fuss about LFB support in Win2K/XP is this:
If you run NOLFB, it tells the game that your video card does not support an LFB, so the game will use older, non-LFB modes...
But ONLY if the game has a fallback to non-LFB modes! Some games support LFB or nothing!

That is why it is important for Win2000/XP.

BUT - we are in the DOS forum, so it's not important info here.

Reply 3 of 57, by Duffman


With Blood on an ATI machine I have: the linear framebuffer mode on WinXP exits without an error, and a banked framebuffer works after I run NOLFB, but it's all distorted! I want to know if there is any solution to this distortion short of a new graphics card.

BTW, I have a Win98SE boot disk, and when I run the VESA modes from that disk they all work. So it's XP, not my graphics card.


Reply 5 of 57, by Schadenfreude

Duffman wrote:

With Blood on an ATI machine I have: the linear framebuffer mode on WinXP exits without an error, and a banked framebuffer works after I run NOLFB, but it's all distorted! I want to know if there is any solution to this distortion short of a new graphics card.

Could be firmware; try different revisions, if BIOS files and an update utility for your ATI card exist online somewhere. Of course, don't try that unless you know what you are doing.

Could be drivers, but I really doubt it. It might be fun to experiment, if you are into that sort of thing.

But yes, it is a combination of the graphics card and XP. Even different graphics cards from the same vendor can give different results with NOLFB.

Reply 6 of 57, by Duffman


I've already got Catalyst 3.2 and all that stuff. I guess I'm stuck with this 🙁 But anyway, what is the purpose of the different framebuffers? Enlighten me 😕 😕


Reply 7 of 57, by Schadenfreude


Going back in time, not all video cards supported all buffer formats! That's the reason there are different versions of VESA...

As to what they are...
http://labs.google.com/glossary?q=Frame+Buffer
http://labs.google.com/glossary?q=Linear%20Frame%20Buffer

I hope someone can make a better posting...

If you experiment, maybe you'll have more luck. The most recent version of everything (drivers, firmware, etc.) may not solve your problem - maybe you need an older version!

Also, you might want to try a different video card.

Question: try a DOS boot disk and run the game. (LFB will work.) Then try it with NOLFB - NOLFB does the same thing in DOS as it does in Windows. Do you still get weird colors? If so, then you have the right to contact ATI tech support about this, as it cannot be blamed on XP! (I doubt they would do anything about it.)

Reply 8 of 57, by Duffman


Just tried what you asked. It all works fine from the boot disk, so it must be XP interacting with my graphics card.

I'm just wondering, does anyone else get this distortion problem like I do? I still get framebuffer writes, but they come out wrong on XP!!

I wish there was a patch 🙁


Reply 9 of 57, by MajorGrubert

Duffman wrote:

What's the difference between a linear framebuffer and a banked framebuffer?

The original VGA specification was created quite a while ago (in computer terms), when the 8086/8088 CPUs could only address 1MB of memory and PCs used the ISA bus. Inside that 1MB address space, 128kB were reserved for video memory: the segments at A000h and B000h (physical addresses A0000h-BFFFFh), and most cards used only 64kB or 32kB of the available space to map their RAM.
This setup led to a problem: a VGA card already had 256kB of video memory, SVGA cards had more, and the frame buffer for a high-resolution mode simply did not fit in a 64kB window - 640x480 with 256 colors, for example, needs 300kB. (Standard VGA sidestepped this for its 640x480 mode with 16 colors, the best a plain VGA card could do, by splitting the 150kB frame buffer into four bit planes; the packed-pixel 256-color SVGA modes had no such trick.) The answer was banked memory access: when a program wanted to use a video mode such as 640x480x256, it had to ask the video card to map small portions of the frame buffer (called "banks") into the available address window, one at a time. When the VESA 1.2 standard came out, the ISA bus was still in use, and this banked approach was the only way to access larger frame buffers, even when the cards had 512kB or 1MB of video RAM.
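To make that concrete: here is roughly what a banked pixel write looks like, as a minimal sketch in 16-bit Borland-style DOS C (untested; it assumes a 640x480 256-color mode is already set and that the window granularity is 64kB, while real cards report their actual granularity in the VBE info block):

    #include <dos.h>

    static unsigned cur_bank = 0xFFFFU;

    void set_bank(unsigned bank)
    {
        union REGS r;
        if (bank == cur_bank) return;   /* bank switches are slow - skip when possible */
        r.x.ax = 0x4F05;                /* VBE function 4F05h: display window control */
        r.x.bx = 0;                     /* BH=0 (set window), BL=0 (window A) */
        r.x.dx = bank;                  /* window position, in granularity units */
        int86(0x10, &r, &r);
        cur_bank = bank;
    }

    void putpixel_banked(int x, int y, unsigned char color)   /* 640x480, 256 colors */
    {
        unsigned long offset = (unsigned long)y * 640 + x;
        set_bank((unsigned)(offset >> 16));                   /* pick the right 64kB bank */
        *(unsigned char far *)MK_FP(0xA000, (unsigned)offset) = color;
    }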

Later, when new I/O buses designed to work with the 386 and newer CPUs appeared (the VL-Bus, PCI and finally AGP), it became possible for the video card to map large chunks of memory into the CPU address space above 1MB. At this point the VESA 2.0 standard set the rules for the use of linear frame buffers, allowing programmers to "see" the entire frame buffer at once inside a single block of memory. Obviously, writing a program to use a linear frame buffer is simpler and faster than using a banked mode, since you don't have to worry about switching banks while you draw, constantly checking whether you are about to cross the boundary between two memory areas.
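For comparison, with a linear frame buffer the same write collapses to a single flat store. A sketch for a 32-bit DPMI program (DJGPP-style, untested), assuming the buffer has already been mapped into the program's address space from the PhysBasePtr field that the VBE 2.0 mode info returns:

    /* lfb points at the mapped linear frame buffer, e.g. obtained through
       __dpmi_physical_address_mapping() from the PhysBasePtr the card reports */
    unsigned char *lfb;

    void putpixel_linear(int x, int y, unsigned char color)   /* 640x480, 256 colors */
    {
        lfb[(unsigned long)y * 640 + x] = color;              /* no bank bookkeeping at all */
    }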

Update: after the VESA 2.0 standard was published, it took a while before programmers started to use the new linear modes. There was also a significant installed base of VESA 1.2-capable cards, so several game engines were written with detection routines and fallback options: they would try to use a linear mode if available, but would revert to a 1.2-compatible banked mode otherwise.
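The detection part is the interesting bit for this thread. A sketch of the idea in the same 16-bit Borland-style C (untested): ask VBE for mode information (function 4F01h), check bit 7 of the ModeAttributes word (which advertises a linear frame buffer), then set the mode with function 4F02h, requesting the LFB via bit 14 of the mode number only if it is available:

    #include <dos.h>

    int set_mode_lfb_or_banked(unsigned mode)
    {
        union REGS r;
        struct SREGS s;
        unsigned char info[256];              /* VBE mode information block */
        void far *p = (void far *)info;

        segread(&s);                          /* initialize segment registers */
        r.x.ax = 0x4F01;                      /* VBE: return mode information */
        r.x.cx = mode;
        s.es   = FP_SEG(p);
        r.x.di = FP_OFF(p);
        int86x(0x10, &r, &r, &s);
        if (r.x.ax != 0x004F) return -1;      /* mode not supported at all */

        r.x.ax = 0x4F02;                      /* VBE: set video mode */
        if (*(unsigned far *)p & 0x0080)      /* ModeAttributes bit 7: LFB available */
            r.x.bx = mode | 0x4000;           /* bit 14 set: use the linear frame buffer */
        else
            r.x.bx = mode;                    /* fall back to the banked version */
        int86x(0x10, &r, &r, &s);
        return (r.x.ax == 0x004F) ? 0 : -1;
    }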

Regards,

Last edited by MajorGrubert on 2003-04-22, 17:45. Edited 1 time in total.

Major Grubert

Athlon 64 3200+/Asus K8V-X/1GB DDR400/GeForce FX 5700/SB Live! 5.1

Reply 10 of 57, by Schadenfreude

MajorGrubert wrote:

The original VGA specification was created quite a while ago (in computer terms), when the 8086/8088 CPUs could only address 1MB of memory and PCs used the ISA bus. […]

Excellent explanation! Someone make this man a moderator! (If he wishes it.)


Update: after the VESA 2.0 standard was published […] several game engines were written with detection routines and fallback options: they would try to use a linear mode if available, but would revert to a 1.2-compatible banked mode otherwise.



From a friend of a friend about the occasional LACK OF fallback switches in games that use UVBELib:


So many of the developers who licensed UVBELib back in the day simply didn't read the documentation. We had sample code and clear documentation telling them what switches we thought they should implement to allow disabling of the built-in UVBELib, but they didn't do it. Too many of the developers came running to us for a license to ship UVBELib about a week before the CD was due at the duplicators. Back in the day, most team leaders and developers didn't believe how bad the situation was until they had the game finished and got the results back from their test teams. Then they came rushing to us needing our stuff 'yesterday', and unfortunately they never read the manual!



But for those games, we can always try the UVBELib solution Stiletto suggested and gil22 sort of proved:
showthread.php?s=&postid=11974#post11974

Reply 11 of 57, by Schadenfreude

Duffman wrote:

Just tried what you asked. It all works fine from the boot disk, so it must be XP interacting with my graphics card.

I'm just wondering, does anyone else get this distortion problem like I do? I still get framebuffer writes, but they come out wrong on XP!!

I wish there was a patch 🙁

So - options:
1. It is Windows XP, or some part of it, interfering.
2. It is ATI drivers or BIOS in Windows (but not BIOS in DOS).
3. It is the game.
4. It is NOLFB.
5. It is something else.

First to tackle? #2.
A. Find someone on the forum, or someone you know, who has the exact same video card, down to the BIOS version number and driver version number. Make sure you are both running XP. Have them install the game, making sure it is exactly the same version with matching CRC32s for the game files. Run the game with NOLFB and see if they get the same problem.
B. Find someone who can lend you another video card that works in XP. Install the latest drivers for it and put the card in place of your old one. See if this makes the problem go away.
C. Keep your video card and try many driver versions. Test the game and see if any one version handles it better than the rest.

Once you've done all of these, if the results seem to point to the video card, I suggest you contact ATI. I doubt they'll help you, but it would be fun to see what they say.

Reply 12 of 57, by Duffman


Who else has a Radeon 7000 AGP?


Reply 13 of 57, by Schadenfreude

Duffman wrote:

Who else has a Radeon 7000 AGP?

Judging by who has mentioned one in the forum: a guest named Hitler (yes, I know - but hey, maybe his real name is Samuel Hitler? 😁) and a registered user named "say days ago".

It would take a moderator/admin searching the registered users database by video card. Actually, I am not sure they can even do that.

Send a private message to "say days ago", I guess.

Or maybe the moderators know someone.

Reply 14 of 57, by MajorGrubert


Originally posted by Duffman
With Blood on an ATI machine I have: the linear framebuffer mode on WinXP exits without an error, and a banked framebuffer works after I run NOLFB, but it's all distorted! I want to know if there is any solution to this distortion short of a new graphics card.


Duffman, the last time I ran some tests with VBETEST on my GeForce4 under Win2000, I got the same results as you: nothing happens when I select a linear frame buffer, and a distorted screen shows up in the banked ones. It is impossible to get a linear frame buffer under any NT-based OS (NT/2000/XP), so eventually you may have success with a banked mode, but I believe this depends a lot on the built-in VGA driver from NT.

When a program makes a VESA BIOS call (supposing your card supports it; virtually all post-1998 cards do), the BIOS on your card will set the video mode by sending I/O instructions to the card. Theoretically, if the BIOS can set the appropriate video mode using only the I/O ports that "pass through" the vga.sys driver (roughly the standard VGA ranges around 3B0h-3DFh), you can get a banked mode. However, if the card uses ports that do not exist in the original VGA standard, the OS will interfere and you won't get the card to work in the desired mode.
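To give an idea of what "sending I/O instructions" means: standard VGA registers are programmed through index/data port pairs in that range, and those accesses are what vga.sys lets through. A trivial example in the same Borland-style C (the register values a mode-set would write are card- and mode-specific, so none are shown here):

    #include <dos.h>

    /* write one VGA sequencer register: index port 3C4h, data port 3C5h */
    void write_seq(unsigned char index, unsigned char value)
    {
        outportb(0x3C4, index);    /* select the sequencer register */
        outportb(0x3C5, value);    /* write its new value */
    }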

Unfortunately, this behaviour is caused by the combination of Windows NT/2000/XP and your card, and I can't say in advance which card will work in which mode. Is anyone out there willing to run VBETEST and report which modes work (or don't) with your card?


I wish there was a patch


So do I. I guess the card manufacturer could rewrite part of the standard vga.sys driver from NT, making a card-specific version that would work at least for the banked modes (the linear modes need some help from the VDM, and maybe from DOSX, the DPMI host). I am trying to get a copy of the Windows 2000 DDK to check some rumors I heard about the source code of vga.sys and how it really works, but it's been a while since I worked on a device driver, so I can't promise anything.

Regards,

Major Grubert


Reply 15 of 57, by Schadenfreude

MajorGrubert wrote:

Unfortunately, this behaviour is caused by the combination of Windows NT/2000/XP and your card, and I can't say in advance which card will work in which mode. Is anyone out there willing to run VBETEST and report which modes work (or don't) with your card?

Check this old thread out, Grubert!
showthread.php?s=&threadid=517

Of course, it's been a while since anyone added to it. Maybe it is time again for more posts... We could make a database of entries, etc. Could be interesting.

Reply 16 of 57, by Duffman


Hey Grubert?

You know how safe mode in XP uses "VGASAVE" -
is it possible to get VGASAVE to work in regular Windows (like by editing the registry)?
Or to get vga.sys to run every time the command prompt is opened?

Or would this just be useless and cause a conflict?


Reply 17 of 57, by MajorGrubert

Duffman wrote:

Hey Grubert?

You know how safe mode in XP uses "VGASAVE" -
is it possible to get VGASAVE to work in regular Windows (like by editing the registry)?
Or to get vga.sys to run every time the command prompt is opened?

Or would this just be useless and cause a conflict?

Duffman, the vga.sys driver is designed to serve two different purposes. First, if you boot into safe mode, or some problem with your card prevents its driver from loading (suppose you swap your video card for a different model), then vga.sys is loaded as the VGASAVE driver and you get standard VGA resolution (640x480) for the Windows desktop.

If everything is OK with your card (normal boot, driver working), Windows still needs a different driver for full-screen command prompts. Usually this driver is also vga.sys, acting as a standard VGA driver only when a command prompt goes full screen. In other words, it is already running when you hit Alt-Enter. The problem is that vga.sys knows nothing about higher resolutions, extra VESA modes or the specific details of your card, so in most cases it will interfere with any attempt to set a video mode not supported by the original VGA cards.

As I mentioned before, there is one possibility left to the card manufacturers: they could write their own replacement for vga.sys for full-screen mode, a driver tied specifically to each card model that would allow applications to set different video modes. But I have never seen such a driver. It seems that most (maybe all) manufacturers take the easy path and simply rely on the original vga.sys for full-screen modes, so you are stuck with the modes it supports.

Regards,

Major Grubert


Reply 18 of 57, by Duffman


What about the "/basevideo" switch?
What does that do?


Reply 19 of 57, by MajorGrubert

Duffman wrote:

What about the "/basevideo" switch?
What does that do?

Straight from the Microsoft KB (http://support.microsoft.com/default.aspx?sci … kb;en-us;170756):

/BASEVIDEO

The /basevideo switch forces the system into standard 640x480 16-color VGA mode. This is used to enable the system to load if the wrong video resolution or refresh rate had been selected.

In other words, it forces Windows to load vga.sys as the video driver, ignoring any other video driver. It is supposed to help in case you have trouble after changing drivers or video cards. Selecting either "safe mode" or "vga mode" during Windows 2000/XP boot has the same effect on the video driver, as if you had inserted this option in your boot.ini file.
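For example, a boot.ini entry with the switch added would look something like this (the ARC path shown is just the usual single-disk case - keep whatever path your own boot.ini already contains):

    [boot loader]
    timeout=30
    default=multi(0)disk(0)rdisk(0)partition(1)\WINDOWS

    [operating systems]
    multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional" /fastdetect /basevideo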

Regards,

Major Grubert
