So, from what you're saying, I'm getting something like 320x200 times 5 equals 1600x1000, and then the extra lines are made up with edge interpolation until it's stretched to 1200 vertically?
Honestly, it looks good and I can't tell it apart from the same thing with the PPP.
No, the scaling amounts for each axis are independent. So upscaled 5x horizontally and 6x vertically using integer scaling (to 1600x1200), then the bilinear filtering stage does nothing because it's already at the target resolution.
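To make the arithmetic concrete, here is a tiny shell sketch of the independent per-axis integer scaling described above (the resolutions are the ones from this exchange):

```shell
# Independent integer scale factors per axis, as described above.
src_w=320; src_h=200      # DOS mode 13h source resolution
dst_w=1600; dst_h=1200    # 4:3 target resolution
hscale=$((dst_w / src_w)) # 1600 / 320 = 5
vscale=$((dst_h / src_h)) # 1200 / 200 = 6
echo "${hscale}x horizontal, ${vscale}x vertical -> $((src_w * hscale))x$((src_h * vscale))"
```

Because both divisions come out exact, the bilinear stage has nothing left to do.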
I think I love you.
All the consoles and emulators (Analogue, MiSTer, RetroArch) have proper sharp interpolation, which is an absolute must for upscaling low resolutions while retaining the correct pixel aspect ratio and sharpness, and DOS games are no exception.
I am immensely thankful for the addition of this shader.
No, the scaling amounts for each axis are independent. So upscaled 5x horizontally and 6x vertically using integer scaling (to 1600x1200), then the bilinear filtering stage does nothing because it's already at the target resolution.
Ok, great. That's what I originally thought was going on. I misinterpreted your previous comment:
It will not scale to an arbitrary aspect ratio.
I took that to mean that it can't scale by an arbitrary aspect ratio. In other words, my setup is actually scaling at 5:6, and I thought you were saying that Sharp can only scale 1:1. It seems you were saying that the target resolution must be 4:3 and that there must be some integer path to get there, which is what I had thought.
the target resolution must be 4:3, and it must have some integer path to get there.
Yep, it looks to me like that's the big difference with the PPP: the PPP does not require the aspect ratio to be exactly 4:3. It will try to get as close as possible to that aspect ratio, but it also takes into account the max resolution of the monitor and tries to use as much of it as possible, using integer scaling that's independent for horizontal and vertical, with a resulting aspect ratio that's as close to 4:3 as possible.
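A rough sketch of that PPP-style search (my own toy interpretation, not the actual PPP code; the 1920x1080 monitor size is just an example): try every integer scale pair that fits the monitor, prefer the aspect ratio closest to 4:3, and break ties by screen area.

```shell
src_w=320; src_h=200       # source resolution
mon_w=1920; mon_h=1080     # example monitor resolution
best_err=-1; best_h=1; best_v=1
for h in $(seq 1 $((mon_w / src_w))); do
  for v in $(seq 1 $((mon_h / src_h))); do
    w=$((src_w * h)); ht=$((src_h * v))
    # distance from 4:3, cross-multiplied to stay in integers: |3*w - 4*ht|
    err=$((3 * w - 4 * ht))
    if [ $err -lt 0 ]; then err=$((-err)); fi
    if [ $best_err -lt 0 ] || [ $err -lt $best_err ]; then
      best_err=$err; best_h=$h; best_v=$v
    elif [ $err -eq $best_err ] && [ $((h * v)) -gt $((best_h * best_v)) ]; then
      best_err=$err; best_h=$h; best_v=$v   # tie: prefer the larger image
    fi
  done
done
echo "${best_h}x horizontal, ${best_v}x vertical -> $((src_w * best_h))x$((src_h * best_v))"
```

On a 1080p monitor this picks 4x/5x (1280x1000), since the exact 5x/6x pair would need 1200 lines.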
Maybe there's a scaler out there for DOSBox ECE or SVN that does exactly this? I can't write it myself, unfortunately... Maybe it's best if I start a separate thread about this (it's not necessarily ECE-dependent).
Maybe there's a scaler out there for DOSBox ECE or SVN that does exactly this?
You can always go back to the newest version of ECE with the PPP. You'll have to go back through this thread; it's mentioned here when it happened. As for SVN, I think you can just add the patch? Not sure if it's still compatible; check in the PPP thread.
No, the developer of the patch stopped development. The latest version is not compatible with the machine=vgaonly setting, unfortunately... I guess a shader (these all seem to be compatible with vgaonly) that does the exact same thing would be my best option...
I guess a shader (these all seem to be compatible with vgaonly) that does the exact same thing would be my best option...
I would just use the sharp shader at 1440x1080. It only does interpolation at the edges of the pixels. With something like Lemmings, with its large areas of contiguous color, it shouldn't even be noticeable, and it will give you the right-sized pixels.
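For the record, 1440x1080 is still exactly 4:3 but the per-axis factors are no longer integers, which is exactly the case where the sharp shader's edge-only interpolation comes into play. A quick check of the numbers (no shader involved):

```shell
# Scale factors for 320x200 -> 1440x1080 (non-integer, but 4:3 preserved)
awk 'BEGIN {
  sw = 320; sh = 200; dw = 1440; dh = 1080
  printf "%.1fx horizontal, %.1fx vertical\n", dw / sw, dh / sh
}'
```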
No, the developer of the patch stopped development. The latest version is not compatible with the machine=vgaonly setting, unfortunately... I guess a shader (these all seem to be compatible with vgaonly) that does the exact same thing would be my best option...
Another option (assuming your video card supports it) is to temporarily turn on integer scaling in your video card's control panel (it has a large performance impact if you keep it on all the time).
Then keep DOSBox set up with output=opengl, fullresolution=original, and no scaler or glshader.
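For reference, the corresponding dosbox.conf fragment might look like the sketch below. Note that in stock DOSBox the option is actually fullresolution=original (there is no fullscreen=original); key names and sections can differ between forks, so check your build's reference config.

```ini
[sdl]
fullscreen=true
fullresolution=original
output=opengl

[render]
scaler=none
```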
This worked for me, anyways.
Last edited by Crimson_Zero on 2021-01-18, 16:27. Edited 1 time in total.
Another option (assuming your video card supports it) is to temporarily turn on integer scaling in your video card's control panel (it has a large performance impact if you keep it on all the time).
Then keep DOSBox set up with output=opengl, fullresolution=original, and no scaler or glshader.
I'm not aware that video cards support arbitrary scaling. AFAIK, doing that would result in a squished 1600x1000 image (320×5 = 1600, 200×5 = 1000).
I'm not aware that video cards support arbitrary scaling. AFAIK, doing that would result in a squished 1600x1000 image (320×5 = 1600, 200×5 = 1000).
Which brand of card do you use?
It's an Nvidia card. Here's the option in the control panel.
Without actually measuring, it looks correct to my eyes when running Lemmings in DOSBox with the black bars on both the top/bottom and sides in fullscreen mode.
It's an Nvidia card. Here's the option in the control panel.
Without actually measuring, it looks correct to my eyes when running Lemmings in DOSBox with the black bars on both the top/bottom and sides in fullscreen mode.
Again, fullresolution=original must be set.
Thanks.
I am familiar with the Nvidia integer scaling option. What I don't get is the output: DOSBox should be outputting either 320x200 or 320x400, depending on which machine type you choose.
The only way I can see what you are doing as "working" is if Lemmings is a 640x400 game, which I guess it could be, because I don't know it. Then you would be getting a perfect 1280x800 with bars on the top and bottom, as you say. If so, it wouldn't work for any game that outputs at 320x400 (or any other oddball resolution). It would output, but the shape of the screen would be off.
Also, just to be completely clear about it, even if everything works okay with the Nvidia card, the requirements for getting it to work are quite hefty. You have to have a 2000-series RTX card or newer, it doesn't work on laptops unless there is no Intel iGPU, and you have to turn off a bunch of other features and manage the colorspace your monitor operates in just to get it to work. It's not exactly a universal solution.
Also, just to be completely clear about it, even if everything works okay with the Nvidia card, the requirements for getting it to work are quite hefty. You have to have a 2000-series RTX card or newer, it doesn't work on laptops unless there is no Intel iGPU, and you have to turn off a bunch of other features and manage the colorspace your monitor operates in just to get it to work. It's not exactly a universal solution.
Oh, I agree. That's why I originally said to temporarily turn it on if your video card supports it; it isn't an everyday solution. My usual DOSBox setup is to go with pixel_perfect.glsl: simple, easy, and it works without putting too much strain on the computer. I was surprised to see that the shader did not give me the expected results in vgaonly mode, though, so I thought I'd test it with the GPU's scaler and see what happened!
I acknowledge up front that ECE is officially for Windows and Linux only. However, I often find that if something compiles on Linux, it is usually possible to make a few tweaks to get it working on macOS as well, since macOS is a fellow Unix-based system. I spent a considerable amount of time teaching myself how to compile DOSBox SVN on macOS (with various patches), so I thought I would give it a go with ECE.
I downloaded the source zip file from GitHub. Right off the bat, running autogen.sh spat out a permissions error, so I ran this command:
chmod +x autogen.sh
Running autogen again then resulted in another error: bad interpreter: /bin/sh^M:
After some research, I learned that I needed to install dos2unix via Homebrew and then run the following command on autogen.sh to convert its line endings to a format compatible with Unix/macOS:
dos2unix autogen.sh
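If dos2unix isn't handy, the same CRLF fix can be done with stock tr. Here is a self-contained demo on a throwaway script (the file name is made up; for the real case, substitute autogen.sh):

```shell
# Create a script with DOS (CRLF) line endings, like the ones in the zip.
printf '#!/bin/sh\r\necho hello\r\n' > demo.sh
# Strip the carriage returns (the ^M in the "bad interpreter" error).
tr -d '\r' < demo.sh > demo.tmp && mv demo.tmp demo.sh
chmod +x demo.sh
./demo.sh   # prints: hello
```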
I was then able to run ./autogen.sh, followed by ./configure, both of which completed successfully with no errors. I have attached the terminal output of each process.
Lastly, I ran the make command. Everything was working well despite some warnings being generated (which also happens during vanilla DOSBox and SVN builds). Ultimately, the make command failed due to 3 errors and 2 warnings, all related to the single "drive_iso.cpp" file. The last part of the terminal output was:
It seems that this *almost* worked and there should not be anything fundamentally incompatible with getting ECE to compile on the Mac. The full terminal output for the make command is also attached for reference.
Can anyone familiar with the compilation process provide me with some advice or assistance in relation to the above error? I have searched around online but nothing useful has emerged.
It would be great to get this branch working on the Mac, as I often see comments requesting it.
Hi almeath,
I don't know much about DOSBox (ECE or otherwise), but that warning is about a C++11 extension... so maybe you just need to add -std=c++11 or something like that to the g++ command line?
(...yeah, this idea probably doesn't fly if your gcc is too old... check the manual)
The root problem is that there is some initialization done which is not supported without that extension.
Removing the initialization would probably allow compilation to go through, but it would be a good idea to do that init somewhere...
That file is several months old; I uploaded it when I created my GitHub account, mainly for testing purposes. I haven't updated the GitHub repository since then, so I recommend getting the latest source code I provide on the homepage.
Thanks - I am using the source downloaded from the website and I get the same result.
I am currently trying to work out whether my system has all the required components to build with c++11. All the comments I can find suggest that Mojave 10.14.6 already supports that and even newer versions of c++, but others say there are sometimes problems with not all components being enabled by default.
The comments also suggest that -std=c++11 should probably work, but I cannot figure out the correct syntax for using it on the command line with make.
It is probably something like:
make CXXFLAGS=
...but I do not know what to put after that.
I will have to ask for help on Stack Overflow before reporting back here if I am successful.
almeath wrote on 2021-01-23, 09:09:
Thanks - I am using the source downloaded from the website and I get the same result.
I am currently trying to work out whether my system has all the required components to build with c++11. All the comments I can find suggest that Mojave 10.14.6 already supports that and even newer versions of c++, but others say there are sometimes problems with not all components being enabled by default.
The comments also suggest that -std=c++11 should probably work, but I cannot figure out the correct syntax for using it on the command line with make.
It is probably something like:
make CXXFLAGS=
...but I do not know what to put after that.
I will have to ask for help on Stack Overflow before reporting back here if I am successful.
Yes, there are multiple ways to test it:
- You can add it on the command line. I'm not 100% sure, but I think it's something like you suggested (just add the actual switch). I think it should work the other way around too, so: CXXFLAGS="-std=c++11" make
- The Makefile has a line where these flags (CPPFLAGS, CXXFLAGS, etc.) are defined, so another way is to edit the Makefile and add the switch there.
- You can also try adding the missing switch to only the failing g++ line first and running that one command (the one in your previous post starting with g++ -DHAVE_CONFIG_H ...)
(You need to cd into src/dos so that it finds what it needs.)
I see that Mac is a bit special: that configure log doesn't even tell which compiler it's actually using (clang is the default, but is gcc also used?). What does "g++ --version" report?