Reply 80 of 81, by Kaztrofy
At the workplace they probably had broadcast-quality video monitors next to the PC monitor,
but video game testers probably used various domestic TVs connected to stock consoles.
I recently learned that developers increased the color warmth of old games on purpose, because the PVMs they were using had a cold default setting, so the warmer palette balanced it out. That explains why those games look too yellow/red on modern displays, which are often calibrated to 6500 K.
I remember the Super NES I had shipped merely with an RF cable, an AV cable and a Cinch-to-SCART adapter.
It was connected to a video monitor with AV ports (a Commodore 1702), so I avoided SCART for quite a while. Though I also remember that friends at the same time had their consoles hooked up to a big SCART TV.
Via the supplied composite AV adapter, probably. Big meant 27" or so (70 cm diagonal). The Amiga platform was different, of course: RGB was the default video output method that users used (rather than composite video).
My A500 had an ordinary video port, too, but it was monochrome only.
Probably for use with a green monitor or amber monitor.
I live in Europe, so at some point I remember RGB SCART being available for both my Amiga and SNES. It was such a huge upgrade, but I also remember it was the first time I noticed things I think I wasn't supposed to see, like the compression artifacts in FMVs on the PlayStation 1. It made me wonder whether having the best signal and picture quality could sometimes be a grey area.