VOGONS


First post, by 386SX

User metadata
Rank l33t

Hi,

Since the K6-2 era I have always liked all the different solutions where graphics cards and/or dedicated hardware decoders tried to offload the MPEG2 decoding process from the CPU. From the Pentium II/III onwards the old PCI decoders obviously weren't necessary any more, but the final image quality has always been variable. I've sometimes seen better results from some later ATi Rage chips paired with their rare original DVD software player than from much faster and more modern video cards that often had only motion compensation to accelerate part of the process, not to mention their sometimes awful VGA outputs.
The RealMagic Hollywood+ and the X-Card PCI were probably the best dedicated consumer solutions, while most GPUs started out with motion compensation only and gained iDCT acceleration later (apart from ATi, which offered it earlier); either way they required third-party player software that benefited from the hardware acceleration but increasingly added software post-processing filters and other CPU-side work on top.
Nowadays I am testing this subject on an old LCD with a good analog input. The old Hollywood+ cards really demand the best possible native VGA output out there (the G450 and G550, let's say) to compensate for their rare and not-so-good VGA pass-through cable, and I am comparing them with a modern GT610 PCI-E card (in Linux) over DVI using VDPAU acceleration.
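(If you want to try the same thing on Linux, a quick check, assuming vdpauinfo and mpv are installed, is something like:

vdpauinfo | grep -i mpeg        <- confirms the GPU exposes an MPEG2 decoder profile
mpv --hwdec=vdpau --vo=vdpau test.vob        <- forces VDPAU decode and presentation, with no extra software filters

The player and flags above are just one example; any VDPAU-aware player should behave similarly.)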
The impressions were somewhat mixed. The VDPAU path on these "modern" GPUs obviously offloads the CPU almost completely, down to around 10% even on a Celeron 420; nothing new here, MPEG2 is not nearly as heavy a task as it was in the Pentium 1xx days. But I was expecting impressive automatic GPU processing: a sharper image, antialiasing, etc. Even though the image is indeed impressively stable, with no artifacts and some generic noise reduction, the result looks a bit "soft" and variable, tested on a 720p (1280x768) 26" monitor. I don't have modern software media players that may add post-processing, but I was more interested in the internal hardware part alone, not the added CPU-side extras (like those 60fps interpolation modes, "smart" software upscaling, etc.).
While still a clear step up from the early-2000s GPUs, I can't see incredible progress in this area. The H+ PCI cards really were the state of the art back then, and maybe, seen with a nostalgic eye, they still hold up today depending on the analog VGA-oriented configuration; the only downsides are the VGA cable's loss of detail and a general image stability that is not always easy to set up. I know that in the home theatre world "upscalers" are a big and expensive subject in themselves, but considering modern GPU capabilities I was expecting better, even if the GT610 is perhaps itself too old to serve as a comparison.
What's your experience with DVD decoding acceleration quality over the last two decades?
Thanks