First post, by Crank9000

Rank: Newbie

I've got three options:

A: Get a 1280x1024 LCD monitor and enjoy a sharp native-resolution picture for 95% of the games I play. Problem: I'd have to make room for that stuff, and that would require making changes I REALLY don't want to make.
B: Move somewhere I have the room to do what I want. Problem: I don't want to move yet, so this isn't actually an option at all.
C: Get a 32" 1440p monitor for my main PC, connect the Windows XP PC to it too, and get a big-ass 4:3 picture for my old games.

The C option sounds best to me, and I've been looking at the BenQ PD3200Q, a 32" 1440p monitor with all kinds of scaling options (1:1, aspect ratio, different screen size/resolution "emulation"); the thing even has a built-in KVM. But since it's 1440p, the good old "will my older games look like crap on it" question rears its ugly mug, and for that I'd like to get input from people here who use a 1440p monitor for older games. I'm well aware that scaling 1280x1024 or under to 1440p will not look as sharp as native 1440p, but is the picture tolerable? All right? Horrendous? Most of my games max out at 1600x1200 or 1280x1024; only a couple max out at 800x600 or under.

For some games I could use 720p and have it integer-scaled, but this isn't something I can do for most of my games, as they are natively 4:3 and I really can't tolerate a messed-up aspect ratio, even in UI elements. Some of them I might be able to play at 1600x1200 with 1:1 scaling. I'm also aware of dgVoodoo2 and similar tools that let you increase the internal resolution, but I'd like to avoid using them if at all possible. I don't think dgVoodoo2 even works on Win XP.

So how 'bout it, am I about to shoot myself in the foot, or is this actually something worth considering?

Reply 1 of 23, by Wolfus

Rank: Member

I think 1280*1024 at 1:1 on a 32" display will still be big enough to use. It should also be "native sharp".


Reply 2 of 23, by Crank9000

Rank: Newbie
Wolfus wrote:

I think 1280*1024 at 1:1 on a 32" display will still be big enough to use. It should also be "native sharp".

Big enough to use, probably, yes, but it's very likely I would find it too small to enjoy, and it would bother me constantly. 1600x1200 at 1:1 should be big enough for me, but my 7950GT won't be able to handle that resolution in all games.

Reply 3 of 23, by Wolfus

Rank: Member

It would be like a smaller 18" display. 1600*1200 should be better (21", probably). And you will still be able to stretch the picture while preserving the aspect ratio. It's not as bad as it used to be.
For me it's still better than an additional display.

Reply 4 of 23, by cde

Rank: Member

I'd be interested to know if the BenQ monitor you mention boxes 720x400 into 4:3. My previous 1080p iiyama screen's 4:3 aspect-ratio option worked as intended for 640x480, 800x600, etc., but 720x400 was stretched to 1920x1080, which results in a distorted image. I've since switched to the HP EliteDisplay E190i (1280x1024), which does the right thing regardless of the input resolution and has no issues with 720x400@70 Hz. Granted, the aspect ratio is 1.25 instead of 1.33, but that is good enough for me.

Reply 6 of 23, by Crank9000

Rank: Newbie
an81 wrote:

I am currently using an LG 34UM88C-P 34" 21:9, and 1024x768 and 720p look really, really good on it. I am using a DVI-to-HDMI converter, and you only get 3 inputs with it; the converter, I think, causes major issues with some higher resolutions not being detected properly (1920x1200, for instance: the screen loses signal), and the input switch is not too convenient, you always have to go into the menu. But as far as the upscaler goes, playing old games on old GPUs wasn't the slightest consideration when I bought it over a year ago now, and I was pleasantly surprised by its performance in this regard, so much so that I've almost given up on the idea of getting myself an old CRT. I also have a BenQ EW2420, and the upscaler on that one is nowhere near as good; anything but the native resolution looks bad.

That's encouraging to hear! That BenQ I've been looking at has this funky USB puck controller thingy you can use to swap presets with the press of a button, so it should be easy to switch between scaling modes and KVM inputs. Hopefully the larger BenQ monitors have better scalers than that EW2420, though...

Any others with first hand experience?

Reply 7 of 23, by ZellSF

Rank: l33t

I use 1440p and 2160p monitors for older games, but unlike most people here, I also use a modern computer. That means I can use dgVoodoo2 (and other tools) to force the rendering resolution to native for most games. I can probably count on one hand the 3D games that make me wish I had a lower-resolution monitor.

Even those games run at 1920x1080, which looks OK on a 1440p or 2160p monitor. Not great, but it's not like you'll be bothered by it while playing.

Crank9000 wrote:

For some games I could use 720p and have it integer scaled

Integer scaling isn't magic; 720p is still a blurry, low resolution even if you scale it perfectly. 1024p would be far preferable, even if it's scaled worse.

Also, the blurry scaling has a purpose: it smooths out pixels. That's desirable where there should be smooth lines (the edges of 3D objects), but undesirable where pixels should be clear and pronounced (pixel art). I don't use integer scaling for 3D games. It looks ugly.
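
To make the difference concrete, here's a minimal Python sketch (using Pillow; the 720p source file is a made-up example of mine, not anything specific from this thread) that produces both kinds of scaling from the same frame:

    # Nearest-neighbour is what integer scaling does at an exact 2x factor:
    # every source pixel becomes a hard-edged 2x2 block. Bilinear blends
    # neighbouring samples, which softens pixel edges and, perceptually,
    # the stair-stepping on 3D object outlines.
    from PIL import Image

    src = Image.open("frame_720p.png")   # hypothetical 1280x720 capture
    target = (2560, 1440)                # exactly 2x on a 1440p panel

    src.resize(target, Image.NEAREST).save("scaled_nearest.png")
    src.resize(target, Image.BILINEAR).save("scaled_bilinear.png")

Compare the two outputs on pixel art versus a 3D screenshot and the trade-off is obvious.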

Finally, integer scaling isn't really all that achievable for you: two important tools (the Windows Vista+ magnification API, dgVoodoo2) don't work on Windows XP.

Crank9000 wrote:

but this isn't something I can do for most of my games as they are natively 4:3 and I really can't tolerate messed up aspect ratio even in UI elements.

Um, what are you talking about? No old games support 1280x720 natively, so if you're able to add one custom resolution (1280x720), you'll surely be able to add one with the correct aspect ratio (960x720).
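
The arithmetic is just width = height * 4 / 3; a throwaway Python check (purely illustrative, nothing tool-specific):

    # 4:3 width for a given custom-resolution height
    def width_4_3(height):
        return height * 4 // 3

    print(width_4_3(720))    # 960  -> add 960x720, not 1280x720
    print(width_4_3(1200))   # 1600 -> the familiar 1600x1200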

Crank9000 wrote:

C: Get a 32" 1440p monitor for my main PC, also connect Windows XP PC to it too and get a big ass 4:3 picture for my old games

OK, here's an obvious question: the "main PC", is that something that can actually run the tools needed to force rendering resolution for 3D games and integer-scale 2D games?

Because I really don't see why you would insist on using the Windows XP PC when you could get better image quality otherwise.

Reply 8 of 23, by Crank9000

Rank: Newbie
ZellSF wrote:

Um, what are you talking about? No old games support 1280x720 natively, so if you're able to add one custom resolution (1280x720), you'll surely be able to add one with the correct aspect ratio (960x720).

Depends on what you call old. Probably the only games I might play at 720p would be Tiberian Sun and Red Alert 2. They don't support it natively, but all resolutions can be unlocked, and they should scale well to 720p. I might also play Dragon Age: Origins at 720p, as its UI stops scaling well after 960p, and that game supports 720p straight out of the box.

ZellSF wrote:

OK, here's an obvious question: the "main PC", is that something that can actually run the tools needed to force rendering resolution for 3D games and integer-scale 2D games?
Because I really don't see why you would insist on using the Windows XP PC when you could get better image quality otherwise.

Simply because I would enjoy using an XP machine. Some want to use period-correct original hardware, others fully embrace emulation and wrappers; I'm personally somewhere in the middle. Also, I don't really share your view of blurry scaling being a good thing; I personally like my pixels sharp. I can deal with some softness due to scaling, up to a point, as long as the image doesn't start looking like there's a layer of vaseline on the screen. I don't like things like FXAA at all. That's why I made this topic: I'm trying to figure out whether there are 1440p monitors out there with scalers I would consider good enough for my taste.

Reply 9 of 23, by an81

Rank: Newbie

About the 34UM88C-P, if anyone's interested: the upscaler produces a soft image, and most of the low, non-native resolutions look about the same. I mostly use 1024x768 with AA for Win98 gaming, which looks great in my opinion. I also have an old netbook connected to it, used as a 24/7 torrent machine running Win7 at 1920x1080, and the text on it looks great too; unless you specifically know what resolution the desktop is set to, it's really hard to guess. Integer-scaled resolutions like 720p look almost identical to 1024x768 in terms of sharpness. I very much prefer this uniform soft scaling to a wavy pattern like on the EW2420, where at 720p you get a mix of blurry and sharp pixels.

Reply 10 of 23, by ZellSF

Rank: l33t
Crank9000 wrote:
ZellSF wrote:

Um, what are you talking about? No old games support 1280x720 natively, so if you're able to add one custom resolution (1280x720), you'll surely be able to add one with the correct aspect ratio (960x720).

Depends on what you call old. Probably the only games I might play at 720p would be Tiberian Sun and Red Alert 2. They don't support it natively, but all resolutions can be unlocked, and they should scale well to 720p. I might also play Dragon Age: Origins at 720p, as its UI stops scaling well after 960p, and that game supports 720p straight out of the box.

Those games all support 960x720 just fine. There's no reason they would have any distorted aspect ratio, as they support any aspect ratio.

Though depending on your setup, Tiberian Sun might crash more often at both 1280x720 and 960x720. So I wouldn't recommend those resolutions for that game.

Also, if Dragon Age: Origins' UI stops scaling well after 960p, why would you not, you know, use 960p rather than significantly reducing the resolution to 720p?

Crank9000 wrote:

I don't like things like FXAA at all.

Have you tried it on a 2160p monitor, or for that matter even a 1440p monitor like you're considering? The damage FXAA does to image quality is greatly reduced when you go to higher resolutions.

Not particularly relevant to the discussion, but you brought it up and I thought it important to mention that FXAA's performance is resolution-dependent.

Crank9000 wrote:

Also I don't really share your view for blurry scaling being a good thing

"View" makes it sound like I'm being subjective, I'm not. When you have clearly visible pixel stepping where there obviously should be a smooth curve, the image quality is objectively inferior.

Enabling integer scaling on games with lots of smooth curves (i.e. any 3D-rendered game) is like turning the sharpness setting on your monitor to max: it'll look sharper. It will not look as intended.

Crank9000 wrote:

I'm trying to figure out whether there are 1440p monitors out there with scalers I would consider good enough for my taste.

I wouldn't make the quality of the built-in scaler any part of my purchasing decision for a monitor. There are too many criteria for monitors as it is; adding a really obscure one is going to give you all sorts of problems. I'd just get used to GPU scaling.

Reply 11 of 23, by Crank9000

Rank: Newbie
ZellSF wrote:

Also, if Dragon Age: Origins' UI stops scaling well after 960p, why would you not, you know, use 960p rather than significantly reducing the resolution to 720p?

Widescreen, you know. It's something I would have to try out and see what I think looks best.

ZellSF wrote:

Have you tried it on a 2160p monitor, or for that matter even a 1440p monitor like you're considering? The damage FXAA does to image quality is greatly reduced when you go to higher resolutions.

No, I haven't, just 1080p, where it doesn't look good at all in my opinion. Good if it gets less blurry at higher resolutions.

ZellSF wrote:

"View" makes it sound like I'm being subjective, I'm not. When you have clearly visible pixel stepping where there obviously should be a smooth curve, the image quality is objectively inferior.

I'm sorry, but that sounds a bit arrogant. I don't know the technical "correctness" of this, if there is one, but I do know what I like and don't like. And I like a sharp picture more than a blurry one.

Reply 12 of 23, by ZellSF

Rank: l33t
Crank9000 wrote:
ZellSF wrote:

Also, if Dragon Age: Origins' UI stops scaling well after 960p, why would you not, you know, use 960p rather than significantly reducing the resolution to 720p?

Widescreen, you know. It's something I would have to try out and see what I think looks best.

1706x960 is widescreen.

Crank9000 wrote:
ZellSF wrote:

"View" makes it sound like I'm being subjective, I'm not. When you have clearly visible pixel stepping where there obviously should be a smooth curve, the image quality is objectively inferior.

I'm sorry, but that sounds a bit arrogant. I don't know the technical "correctness" if there is one for this, but I do know what I like and don't like. And I like sharp picture more than blurry picture.

There's nothing arrogant about pointing out that I wasn't stating an opinion, but talking objectively about how to get the image to look the closest to how it was intended.

You can obviously say "but I like distorting the image by doing X" and that's perfectly fine.

Reply 13 of 23, by foil_fresh

Rank: Member

I play a lot of DOS games (at 320x200) on an AOC 24.5" 240 Hz screen; using the hardware scaling preset mode of 22" 16:10 is very neat, as 320x200 is 16:10. Can't fault it, all of the games I've played so far look great. There's no 5:4 mode, sadly! 😢

It's fairly sharp at 1024x768 or 1280x1024, plus most of my video cards can output 100 or 120 Hz at these resolutions, so there are some benefits. My little Radeon 9600XT can do 120 Hz at 1080 resolution over DVI. I didn't expect that at all. 🤣

Reply 14 of 23, by VileR

Rank: l33t
ZellSF wrote:

When you have clearly visible pixel stepping where there obviously should be a smooth curve, the image quality is objectively inferior.

As long as we're throwing qualifiers around, there are two objective points you're obviously missing:

- There never was a 'smooth curve' to begin with, only an aliased (stepped) rendering of a curve *at a lower resolution*. Integer scaling produces an identically stepped curve, so that the underlying division into physical pixels doesn't make a difference. You may want it to be smoother because you have more pixels, but that doesn't turn identical into inferior, let alone objectively.

- 'Blurry' (bilinear/trilinear/etc.) interpolation doesn't smooth out curves in any case - it softens the edges of every sampled point, which is a completely different thing. Actually smoothing curves is the domain of perceptual rule-oriented filters like hq#x, xBRZ and so on, which obviously have their own downsides. Luckily, reducing the question of a game's "intended" appearance to "smooth curves" doesn't mean much.

ZellSF wrote:

Enabling integer scaling on games with lots of smooth curves (i.e. any 3D-rendered game) is like turning the sharpness setting on your monitor to max: it'll look sharper. It will not look as intended.

Integer scaling most certainly isn't like maximizing sharpness on the monitor, because the latter is basically 'blurry' filtering in reverse - it artificially increases the contrast around edges, whereas blurry scaling artificially decreases it. They're two equivalent (but opposite) ways of interpolating between sampled points to add image data that wasn't there.

Another problem these methods have is that they tend to introduce errors by blending their point samples as if the color space were linear (no gamma law). The results are full of hue/luminance errors, which bring them even farther from 'intended', if anything.
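
A quick way to see the size of that error (plain Python, my own illustration rather than any scaler's actual code): blend a black pixel and a white pixel the naive way versus through linear light.

    # Standard piecewise sRGB <-> linear-light conversions
    def srgb_to_linear(c):
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

    def linear_to_srgb(c):
        return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

    black, white = 0.0, 1.0

    naive = (black + white) / 2   # blending gamma-encoded values directly
    correct = linear_to_srgb((srgb_to_linear(black) + srgb_to_linear(white)) / 2)

    print(f"naive: {naive:.3f}, gamma-aware: {correct:.3f}")  # 0.500 vs ~0.735

Every blended pixel along an edge is darkened the same way, which is exactly where those luminance errors come from.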


Reply 15 of 23, by ZellSF

Rank: l33t
VileRancour wrote:

There never was a 'smooth curve' to begin with, only an aliased (stepped) rendering of a curve *at a lower resolution*.

Sounds like you're basing assumptions only on the source 1600x1200 image (for example), when for 3D art we can for the most part make solid assumptions about what it's supposed to look like at a higher resolution.

3D art is (usually) trying to portray realistic images.

3D art usually would have more smoothed out edges at higher resolution.

Smoothed edges are more accurate to both realism and how the image would look rendered at a higher resolution than pixelated ones.

Yes, you can make the claim that Quake III didn't support higher than 1600x1200 because the artists thought it looked wrong at 2880x2160, or that the 360-era consoles went for 1280x720 because of artistic intent. In 99% of cases you would be wrong.

VileRancour wrote:

'Blurry' (bilinear/trilinear/etc.) interpolation doesn't smooth out curves in any case - it softens the edges of every sampled point, which is a completely different thing.

Technically, yes. Perceptually, no, which is the important thing when discussing image quality.

VileRancour wrote:

Integer scaling most certainly isn't like maximizing sharpness on the monitor, because the latter is basically 'blurry' filtering in reverse - it artificially increases the contrast around edges, whereas blurry scaling artificially decreases it. They're two equivalent (but opposite) ways of interpolating between sampled points to add image data that wasn't there.

You're right, but that wasn't the point I was sarcastically making. It was that if you prioritize sharpness above all else, that's not going to get you good image quality. You have to consider which scaling algorithm would produce a better image, not which would produce more sharpness.

The OP asked how good scalers are now, and if his only criterion is how sharp the end result is, dialing up sharpness would work. It would not look good. Likewise, integer scaling will look sharp. It won't always look good. Some people think it's a magical solution to image scaling, and I wanted to point out clearly that it isn't.

Not that even your technically correct way of scaling the image would do what the OP wants, which is to replicate a native-resolution picture, because that sharpness also comes from the pixel structure of the display. CRTs, for all the praise they get for being native resolution all the time, basically do the same thing as blurry upscaling on a high-DPI monitor: they add a ton of image information that wasn't actually there.

Reply 16 of 23, by SPBHM

Rank: Oldbie

On Windows you can run 1600x1200 centered, with black bars around the image, but it's still going to use most of the screen and be "without" scaling. Probably the best solution if the game supports this res.

1440 SHOULD be a good res to scale to: 2x 720, 3x 480, 1.5x 960.
But who knows if the monitor is going to scale it properly or just blur everything (more likely).
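
Spelling those factors out (a throwaway Python check of the vertical scale into 1440 lines, purely illustrative):

    for h in (720, 480, 960, 1024, 1200):
        f = 1440 / h
        tag = "integer" if f.is_integer() else "fractional"
        print(f"{h} lines -> {f:g}x ({tag})")
    # 720 -> 2x, 480 -> 3x (integer); 960 -> 1.5x, 1024 -> 1.40625x,
    # 1200 -> 1.2x (fractional)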

Reply 17 of 23, by ZellSF

Rank: l33t
SPBHM wrote:

1440 SHOULD be a good res to scale to: 2x 720, 3x 480, 1.5x 960.

What? No. The others are fine, but a 1.5x scale gives you no advantages whatsoever. Also worth noting that 16:9 480p is slightly problematic (2560/3 doesn't divide evenly). Not a big deal if you can use custom resolutions, though.

SPBHM wrote:

But who knows if the monitor is going to scale it properly or just blur everything (more likely).

What the proper scaling method is depends on the content (as explained, "blur everything" is pretty nice for a lot of content), and I don't think there's a single monitor that allows you to change scaling modes to suit the content.

Which is why I use GPU scaling plus software-based scaling to get some flexibility, but since the OP is using XP, that limits his options considerably.

Reply 19 of 23, by ZellSF

Rank: l33t
lvader wrote:

1.5x scales very well; it basically line-doubles every other pixel.

So half the pixels are too tall and half are too short. The image is supposed to be scaled evenly; by line-doubling every other pixel, you distort the entire image pretty severely.
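
You can see the uneven row heights directly (a throwaway Python sketch of how nearest-neighbour maps source rows to target rows at 1.5x, purely illustrative):

    import math

    # At 1.5x, source row y covers target rows floor(1.5*y) .. floor(1.5*(y+1)) - 1
    for y in range(6):
        top = math.floor(1.5 * y)
        bottom = math.floor(1.5 * (y + 1)) - 1
        print(f"source row {y}: {bottom - top + 1} px tall")
    # prints 1, 2, 1, 2, 1, 2 -- alternating row heights, i.e. the distortion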

lvader wrote:

You don’t get the softness associated with poor scaling.

Softness ≠ poor scaling.

But no, you don't get softness that way; then again, you don't get softness when you do nearest scaling from ANY resolution.