VOGONS

Reply 20 of 29, by BEEN_Nath_58

Rank l33t
ZellSF wrote on 2022-06-17, 12:39:
BEEN_Nath_58 wrote on 2022-06-17, 04:15:

Can it mean linedoubling isn't supported on displays anymore, or are they too afraid they'll lose users after they see an "Input out of range"?

1) Linedoubling itself doesn't need support on displays. As far as the display is concerned, it's just getting a higher resolution image from the GPU. More compatibility problems ("Input out of range") would be caused by the GPU outputting low resolutions without linedoubling.

2) Linedoubling lower resolutions has always been the standard. There's literally no reason to drop it.

There is of course a possibility that a GPU sends the signal straight to the monitor (who knows what the internal logic is for deciding which resolutions to linedouble), but the crux of the issue here is: you don't know, and neither will most people submitting data to answer you. If you really want to know the lowest resolutions many modern displays support, you need a thorough testing methodology.

1) What are the alternatives to linedoubling? Problems such as "Input out of range" happen to this day on my 17-inch HD monitor, not only when trying to output 8K (yes, I am aware of this) but even at 1152x576(?). That is not even a low resolution. Does it mean the GPU stops linedoubling at will?

2) Another question arises here: why is linedoubling so infrequent, then? If I consider "AMD upscaling" (they take any resolution < 480p and render at screen resolution) and "linedoubling" as two separate things, what modern DirectDraw wrappers follow is the former. The majority of them take a game at 320x200 (take Wing Commander 1, for example) and render it at the screen resolution, but stop doing so as soon as a game uses 640x480.

"Linedoubling", on the other hand, as far as I understand it, still preserves that pixelated look. My 65-inch TV, which has scaling off, renders the same game centered in a tiny box that isn't visible from 1 m away. AMD doesn't do it, and neither does Intel Iris.

ZellSF wrote on 2022-06-17, 12:39:
BEEN_Nath_58 wrote on 2022-06-17, 04:15:

Supposing linedoubling still exists, AMD still renders the screen at native resolution for application resolutions < 640x480. They could've used linedoubling, but they didn't!

Again, you do not know that. Without any measuring equipment, you have no idea what the GPU does internally with the signal before sending it to your screen.

It does a GPU upscale. Even NVIDIA does it, but I can't ascertain in exactly which cases it does.

ZellSF wrote on 2022-06-17, 12:39:
BEEN_Nath_58 wrote on 2022-06-17, 04:15:

(they don't support less than 640x480 as I said) ?

Aren't you claiming you're getting 100x100 output? And you're also claiming they don't support less than 640x480? That seems rather contradictory.

No contradiction here; the talk was about two different GPU vendors.

previously known as Discrete_BOB_058

Reply 21 of 29, by ZellSF

Rank l33t
BEEN_Nath_58 wrote:

1) What are the alternatives to linedoubling? Problems such as "Input out of range" happen to this day on my 17-inch HD monitor, not only when trying to output 8K (yes, I am aware of this) but even at 1152x576(?). That is not even a low resolution. Does it mean the GPU stops linedoubling at will?

Linedoubling is there to make the signal meet the minimum horizontal frequency required by 31 kHz CRTs, so anything below that frequency will likely be linedoubled. Though as I mentioned earlier, I'm not an expert on how CRTs work, at all. There's no chance 1152x576 is linedoubled.

That said, 576 is a magic number: it's the vertical resolution of the PAL TV standard. Your GPU, your monitor, or both might make assumptions based on that which cause incompatibility.

GPUs don't have any built-in way of disallowing resolutions that are too high (monitor drivers, and later EDID information and the like, prevented the average user from damaging their monitor this way).

BEEN_Nath_58 wrote:

"Linedoubling", on the other hand, as far as I understand it, still preserves that pixelated look.

Yes, line doubling is just duplicating lines, so it gives a pixelated look, as opposed to game consoles, which used lower resolutions directly on 15 kHz displays, where there would just be a longer distance between lines.
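For illustration, line doubling in this sense is nothing more than emitting every scanline twice. A minimal sketch in Python (the list-of-scanlines frame representation is hypothetical, purely to show the idea):

```python
# Line doubling: each scanline of the low-res frame is emitted twice,
# doubling the vertical resolution without interpolating anything,
# which is why the pixelated look survives.
def line_double(frame):
    """frame: list of scanlines (each scanline is a list of pixels)."""
    doubled = []
    for scanline in frame:
        doubled.append(scanline)
        doubled.append(list(scanline))  # exact duplicate, no filtering
    return doubled

# A 200-line frame (e.g. 320x200) becomes 400 lines after doubling.
frame = [[y] * 320 for y in range(200)]
out = line_double(frame)
assert len(out) == 400 and out[0] == out[1]
```

Horizontal pixel doubling (320 to 640) happens analogously on the VGA side, which is how 320x200@70Hz leaves the card as a 640x400@70Hz signal.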

Reply 22 of 29, by BEEN_Nath_58

Rank l33t
ZellSF wrote on 2022-06-17, 21:42:
BEEN_Nath_58 wrote:

1) What are the alternatives to linedoubling? Problems such as "Input out of range" happen to this day on my 17-inch HD monitor, not only when trying to output 8K (yes, I am aware of this) but even at 1152x576(?). That is not even a low resolution. Does it mean the GPU stops linedoubling at will?

Linedoubling is there to make the signal meet the minimum horizontal frequency required by 31 kHz CRTs, so anything below that frequency will likely be linedoubled. Though as I mentioned earlier, I'm not an expert on how CRTs work, at all. There's no chance 1152x576 is linedoubled.

That said, 576 is a magic number: it's the vertical resolution of the PAL TV standard. Your GPU, your monitor, or both might make assumptions based on that which cause incompatibility.

GPUs don't have any built-in way of disallowing resolutions that are too high (monitor drivers, and later EDID information and the like, prevented the average user from damaging their monitor this way).

BEEN_Nath_58 wrote:

"Linedoubling", on the other hand, as far as I understand it, still preserves that pixelated look.

Yes, line doubling is just duplicating lines, so it gives a pixelated look, as opposed to game consoles, which used lower resolutions directly on 15 kHz displays, where there would just be a longer distance between lines.

Conflicting information regarding monitors overall. Maybe we have to settle for what the GPU reports it is delivering in the driver.

previously known as Discrete_BOB_058

Reply 23 of 29, by brostenen

Rank l33t++

Why not just go this route if you cannot find any CGA or EGA monitor?
I am using one of the Amiga-specific internal versions in two of my Amiga 500s. And then there are the RetroTink and OSSC.
Get all three solutions and you are pretty much ready to go with almost everything on modern HDMI-capable monitors.
I use both a Samsung SyncMaster T22A300 television and a Dell 2001FP monitor as my only two displays as of now.

Don't eat stuff off a 15 year old never cleaned cpu cooler.
Those cakes make you sick....

My blog: http://to9xct.blogspot.dk
My YouTube: https://www.youtube.com/user/brostenen

001100 010010 011110 100001 101101 110011

Reply 24 of 29, by ZellSF

Rank l33t
brostenen wrote on 2022-06-18, 21:51:

Why not just go this route if you cannot find any CGA or EGA monitor?
I am using one of the Amiga-specific internal versions in two of my Amiga 500s. And then there are the RetroTink and OSSC.
Get all three solutions and you are pretty much ready to go with almost everything on modern HDMI-capable monitors.
I use both a Samsung SyncMaster T22A300 television and a Dell 2001FP monitor as my only two displays as of now.

I don't think this topic was made with solving any practical problem in mind.

Reply 25 of 29, by Standard Def Steve

Rank Oldbie

I do a ton of NES to PS3/Xbox 360-era console gaming on a 64-inch Samsung F8500 plasma panel. Its native resolution is 1080p, but fortunately it has a great scaler and gladly accepts 240p. And man, is it a colorful display! Stare at a game like Super Mario 64 for too long and your eyes may suffer from burn-in before the actual panel does! The AR filter and super-low black floor just increase that intense "bursty" color effect... I love it!

And yeah, it's a TV, but it's the only flat-panel display that I still have a bunch of low-res sources hooked up to.

94 MHz NEC VR4300 | SGI Reality CoPro | 8MB RDRAM | Each game gets its own SSD - nooice!

Reply 26 of 29, by darry

Rank l33t++

The lowest pixel clock supported by DVI is 25.175 MHz.

That translates to 800x449 total pixels (including horizontal and vertical blanking intervals) at 70 Hz (more accurately, about 70.09 Hz), which is what the 256-color VGA 320x200@70Hz mode (line doubled to 640x400@70Hz by the VGA card) and the 720x400@70Hz VGA text mode use. These two modes differ in the size of their blanking intervals.

One could presumably try to coax a video card into outputting non-standard modes with unusually large blanking intervals, to lower the number of active pixels (the effective resolution) of a given mode without lowering the pixel clock. I have no idea how tolerant monitors are of such manipulations, either over analogue RGBHV (VGA) or over a digital link (DVI/HDMI).

Online modeline calculator:
https://www.epanorama.net/faq/vga2rgb/calc.html
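The arithmetic above can be sanity-checked with a few lines of Python; the 800x449 totals and the 25.175 MHz clock are taken straight from the post:

```python
# Refresh rate implied by DVI's minimum pixel clock and the
# standard VGA frame totals (active pixels plus blanking).
PIXEL_CLOCK_HZ = 25_175_000  # lowest pixel clock DVI supports
H_TOTAL = 800                # pixels per line, incl. horizontal blanking
V_TOTAL = 449                # lines per frame, incl. vertical blanking

refresh_hz = PIXEL_CLOCK_HZ / (H_TOTAL * V_TOTAL)
print(f"{refresh_hz:.2f} Hz")  # 70.09 Hz
```

This is exactly the familiar ~70 Hz of the VGA 320x200 and 720x400 text modes, which is why those modes sit right at the DVI floor.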

Reply 28 of 29, by darry

Rank l33t++
Plasma wrote on 2022-06-19, 04:56:

On the Raspberry Pi (at least) it is possible to output lower resolutions over HDMI using CVT. For example.

Interesting. I wonder what the pixel clock and blanking intervals are for the mode in that example. Does the Pi transparently line double as needed, does it set huge blanking intervals to stay above 25.175 MHz, or does it set a lower pixel clock (maybe HDMI allows that)?

I have a couple of Pi boards lying around, so I could test it.

According to this https://learn.adafruit.com/using-weird-displa … everything-else , setting blanking precisely can be done too.

EDIT: See also https://forums.raspberrypi.com/viewtopic.php?t=304378
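One way to reason about this question without hardware: given a mode's frame totals and refresh rate, the implied pixel clock tells you whether the source must line double or pad blanking to stay above the DVI floor. A rough sketch; the 320x240@60 totals below are hypothetical example numbers, and HDMI sinks may tolerate clocks DVI would not:

```python
# Check whether a mode's implied pixel clock clears the DVI minimum.
DVI_MIN_CLOCK_HZ = 25_175_000  # lowest pixel clock DVI allows

def pixel_clock_hz(h_total, v_total, refresh_hz):
    """Pixel clock implied by the frame totals (active + blanking)."""
    return h_total * v_total * refresh_hz

# Hypothetical 320x240@60 mode with modest blanking (400x262 totals):
clock = pixel_clock_hz(400, 262, 60)  # 6,288,000 Hz
print(f"clock = {clock / 1e6:.3f} MHz, fits DVI: {clock >= DVI_MIN_CLOCK_HZ}")
# Far below the floor: the source must line double (or more),
# inflate the blanking intervals, or rely on the sink
# accepting a lower clock than DVI nominally permits.
```

By contrast, the 800x449@70.09Hz totals from the earlier post land almost exactly on 25.175 MHz, which is why 320x200 is carried line doubled rather than raw.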

Reply 29 of 29, by brostenen

Rank l33t++
ZellSF wrote on 2022-06-18, 22:06:
brostenen wrote on 2022-06-18, 21:51:

Why not just go this route if you cannot find any CGA or EGA monitor?
I am using one of the Amiga-specific internal versions in two of my Amiga 500s. And then there are the RetroTink and OSSC.
Get all three solutions and you are pretty much ready to go with almost everything on modern HDMI-capable monitors.
I use both a Samsung SyncMaster T22A300 television and a Dell 2001FP monitor as my only two displays as of now.

I don't think this topic was made with solving any practical problem in mind.

But the talk seems to be heading towards what replacements are out there, you know, reading between the lines. I thought it would be better to suggest modern replacements at this point, as old CRT monitors cost more than the best modern solutions. And because CRTs seem to stop working before the machines do.

Don't eat stuff off a 15 year old never cleaned cpu cooler.
Those cakes make you sick....

My blog: http://to9xct.blogspot.dk
My YouTube: https://www.youtube.com/user/brostenen

001100 010010 011110 100001 101101 110011