VOGONS


Reply 20 of 20, by darry

User metadata
Rank l33t++
Scythifuge wrote on 2021-06-04, 02:05:
darry wrote on 2021-06-03, 23:07:
It looks like a nice device whose existence I somehow managed to miss until seeing this thread. Full range chroma sampling, arbi […]
cyclone3d wrote on 2021-06-03, 19:22:

Yeah, it is a really sweet device. You can have it letterbox the input resolution on the output, so you get the correct aspect ratio no matter the monitor you are using. Pretty sure you can also have it stretch the image to a higher resolution so you don't end up with a tiny box when inputting a low resolution and outputting to a high-resolution monitor.

Input and output refresh rates can be different as well.
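The letterboxing described here boils down to a best-fit scale factor plus centred borders. A rough sketch of my own (not anything taken from the device's firmware):

```python
def letterbox(in_w, in_h, out_w, out_h):
    """Scale (in_w, in_h) to fit inside (out_w, out_h) while preserving
    aspect ratio; returns the scaled size and the centred border offsets."""
    scale = min(out_w / in_w, out_h / in_h)  # best fit in both dimensions
    w, h = round(in_w * scale), round(in_h * scale)
    x, y = (out_w - w) // 2, (out_h - h) // 2  # black-bar offsets
    return w, h, x, y

# 640x480 (4:3) into a 1920x1080 (16:9) panel -> pillarboxed
print(letterbox(640, 480, 1920, 1080))  # (1440, 1080, 240, 0)
```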

It looks like a nice device whose existence I somehow managed to miss until seeing this thread. Full range chroma sampling, arbitrary refresh rate support (in and out) is nice.

Caveats/questions:
a) Conversion back to analogue on output may be a non-issue from a quality point of view if well implemented on the converter, but may be problematic because
a1) analogue RGBHV (VGA) input on new/upcoming monitors will likely no longer be a thing soon-ish
a2) even on those monitors that still have such an input, support for odd/old/uncommon refresh rates/resolutions will not necessarily be very good (or as wide as over HDMI/DP), and neither might the implementation quality of the analogue input. This is admittedly conjecture on my part, but as analogue RGBHV becomes an afterthought, why would manufacturers bother much with it?
b) I wonder what its scaling looks like? Fans of integer scaling may be disappointed here, but the wealth of video-centric features (reverse-pulldown and deinterlacing options) will certainly be appreciated by people who need/want them
c) How good or bad is latency? This info is not disclosed in the manual (unless I missed it). Other similar devices (described in the manual) that also support audio input have audio delay options, so if this specific device is meant for the broadcast industry, maybe the expectation is that the user will adjust audio delay manually, as needed, downstream of the device. If latency, in at least some scenarios, is big enough to cause perceptible audio-sync issues, it would be a deal-breaker for me. To be clear, I am not saying that it is, I am just asking the question.
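For reference on point b), integer scaling just means using the largest whole-number multiple of the input that fits the output: perfectly sharp pixels, but bigger borders than fractional scaling. A quick sketch of my own, not how any particular scaler implements it:

```python
def integer_scale(in_w, in_h, out_w, out_h):
    """Largest integer factor by which the input fits in the output;
    returns the factor and the resulting image size."""
    n = min(out_w // in_w, out_h // in_h)
    if n == 0:
        raise ValueError("output is smaller than input")
    return n, in_w * n, in_h * n

# 320x200 into 1920x1080: 5x fits (1600x1000); 6x would be 1920x1200, too tall
print(integer_scale(320, 200, 1920, 1080))  # (5, 1600, 1000)
```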

There is definitely potential for greatness here but, as they say, the devil is in the details. Ideally, if somebody with access to one of these, an Extron RGB-DVI 300 (or similar), an OSSC, and high-quality DVI and analogue RGBHV capture devices also had the time and motivation to run some comparisons and post results, it would be very useful/interesting to see. Anybody know of a qualified YT content producer who would want to take up such a project? 😉

I was going to buy an OSSC. However, after you and I exchanged posts on a similar thread a little while back, I looked into it and saw that a new version is coming out. I will probably want the updated version, or perhaps the original OSSC will come down in price. Ironically, I planned on creating a YT channel to discuss my retro hobbies and show off my setup and some games, until my KDS wigged out (which set all of my projects back until I can resolve the issue).

Another issue I am having is that the HP2065 that I am using for now has "noise," like a wavering or shimmering on the characters during the POST and boot process. The issue goes away once Windows is loaded. This has me concerned about using MS-DOS. I don't know if it is my monitor, setup, or environment that is causing it. I am going to unplug it from the surge protector and plug it directly into the wall. If the noise goes away, I am going to wonder if the surge protector had anything to do with my NOS CRT KDS wigging out...

I too have my eye on the eventual OSSC Pro, but I believe there are a few points to consider.

Cons:
a) the OSSC Pro will not come cheap; the BOM (bill of materials) alone will be significantly higher than the original OSSC's. Prices might go down a bit with higher-volume production, but it will still be expensive
b) the original OSSC's price can only drop so much, as it still costs money to make (of which the BOM is only a part) and the people making/selling it need to make a living, so I don't feel that pricing has much room left to go down (I could be wrong)
c) with parts shortages and supply chain issues still being a thing in our covidic world, I fear prices may actually go up, at least in the short term (due to rising component costs)
d) the original OSSC works so well for what it can do that, while the Pro version will have a lot more to offer in terms of features/flexibility, outside of specific use cases the original may be good enough for many of us

Pros:
a) MiSTer compatibility
b) a frame buffer and all the advantages that can potentially provide (scaling to arbitrary resolutions, refresh rate conversion)
c) decoupling sampling rate and output rate https://github.com/marqs85/ossc/issues/56
d) a multitude of other features made possible by the more powerful hardware
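On point b), refresh rate conversion with a frame buffer amounts to repeating or dropping frames in a steady ratio between input and output rates. A rough sketch of my own (real scalers track fractional rates frame by frame; this just shows the repeating cycle):

```python
from math import gcd

def frame_pattern(in_hz, out_hz):
    """With a frame buffer, a scaler can convert refresh rates by repeating
    or dropping frames; return the steady-state (input, output) frame counts
    per repeating cycle, using integer Hz for simplicity."""
    g = gcd(in_hz, out_hz)
    return in_hz // g, out_hz // g

# Classic 70 Hz VGA text mode into a 60 Hz panel:
# every 7 input frames become 6 output frames (one dropped per cycle)
print(frame_pattern(70, 60))  # (7, 6)
```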

No idea about the noise issue, especially if it completely goes away once Windows is loaded and is not present even in a full-screen DOS window running under Windows (presumably using the same screen mode as the one showing issues during boot).