First post, by Standard Def Steve

Rank: Oldbie

Just noticed this today as I was playing a 34 Mb/s 1080p MPEG-2 encoded MKV file ripped from a Blu-ray disc. MPC-HC indicated that DXVA was being used, and sure enough, CPU usage was at around 8%. I was using a dual core S939 Opteron overclocked to 3.0GHz. The video card tested was a 640MB 8800GTS running 341.44 on Win7 x64.
I always thought that the old G80 was incapable of full bitstream decoding even with MPEG-2, so this was kind of a pleasant surprise!
A 32 Mb/s H.264 stream ripped from another BD was a different story. With this format, MPC-HC did not use DXVA, and CPU usage was around 35-62%.
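
If anyone wants to double-check what the driver actually exposes instead of eyeballing CPU graphs, something like this minimal sketch should do it. To be clear, this is just my own rough idea for a quick check, not anything MPC-HC does internally; it assumes a Vista/Win7 box with the Windows SDK, and the GUID constants are the stock ones declared in dxva2api.h.

// Minimal sketch: ask the driver which DXVA2 decoder profiles it exposes.
// If DXVA2_ModeMPEG2_VLD is in the list, the GPU accepts the full MPEG-2
// bitstream; DXVA2_ModeH264_E/F are the full-VLD profiles for H.264.
#include <windows.h>
#include <initguid.h>   // instantiate the GUIDs declared in dxva2api.h
#include <d3d9.h>
#include <dxva2api.h>
#include <cstdio>
#pragma comment(lib, "d3d9.lib")
#pragma comment(lib, "dxva2.lib")
#pragma comment(lib, "ole32.lib")

int main() {
    IDirect3D9 *d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    // A throwaway windowed device is enough to query the video service.
    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed = TRUE;
    pp.SwapEffect = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferFormat = D3DFMT_UNKNOWN;
    pp.hDeviceWindow = GetDesktopWindow();

    IDirect3DDevice9 *dev = nullptr;
    if (FAILED(d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                 GetDesktopWindow(),
                                 D3DCREATE_SOFTWARE_VERTEXPROCESSING,
                                 &pp, &dev)))
        return 1;

    IDirectXVideoDecoderService *svc = nullptr;
    if (FAILED(DXVA2CreateVideoService(dev, IID_IDirectXVideoDecoderService,
                                       (void **)&svc)))
        return 1;

    UINT count = 0;
    GUID *guids = nullptr;
    if (SUCCEEDED(svc->GetDecoderDeviceGuids(&count, &guids))) {
        for (UINT i = 0; i < count; ++i) {
            WCHAR s[64];
            StringFromGUID2(guids[i], s, 64);   // print the raw profile GUID
            wprintf(L"%s%s\n", s,
                    IsEqualGUID(guids[i], DXVA2_ModeMPEG2_VLD)
                        ? L"  <-- full MPEG-2 bitstream decode" : L"");
        }
        CoTaskMemFree(guids);
    }
    svc->Release();
    dev->Release();
    d3d->Release();
    return 0;
}

If the MPEG-2 VLD GUID shows up on the G80 but not on the older cards, that would line up with the CPU numbers here; an H.264 check is the same idea with DXVA2_ModeH264_E/F.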

Interestingly, ATI cards of that era don't appear to support full bitstream decoding. Playing the same 34 Mb/s MPEG-2 stream with a Radeon X1950 XTX resulted in significantly higher CPU usage: around 15-30%. MPC-HC did not enable DXVA at all on that card.

I wonder if GeForce 7 can perform full MPEG-2 bitstream decoding as well? Apparently it uses the same video engine as the G80-based cards. I'm gonna have to dig up my old 7950GT and find out!

94 MHz NEC VR4300 | SGI Reality CoPro | 8MB RDRAM | Each game gets its own SSD - nooice!

Reply 1 of 5, by Standard Def Steve

Rank: Oldbie

Hmm, GeForce 7 does not seem to perform full MPEG-2 bitstream decoding. With the GeForce 7950GT, CPU utilization was slightly lower than what I saw with the X1950 XTX: around 12-28%. Still, that's a far cry from the 6-8% I saw with the G80 8800GTS. So it appears that G80's video processor is more advanced than G70's, unless MPC-HC is just being wonky and doesn't allow DXVA on anything below GF8.

Anyway, I know it's largely useless info, but I thought it was kinda interesting. 🙄

94 MHz NEC VR4300 | SGI Reality CoPro | 8MB RDRAM | Each game gets its own SSD - nooice!

Reply 4 of 5, by elianda

Rank: l33t

I think Wikipedia has a quite comprehensive list: https://en.wikipedia.org/wiki/Nvidia_PureVideo
Also check this table: https://www.nvidia.com/docs/CP/11036/PureVide … _Comparison.pdf

Be aware that there are different capabilities even within the same generation. Also, what you are testing is only the feature set that is exposed through DXVA.
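
To make the "exposed through DXVA" point concrete: each decoder GUID the driver reports corresponds to a particular split of the decode pipeline, so two cards can both "do DXVA" while handing very different amounts of work to the GPU. Here is a hedged companion helper for the enumeration sketch in the first post (drop it in next to the print loop); the GUID constants are the stock ones from dxva2api.h, and the descriptions are the standard DXVA decode levels, not measurements from any particular card.

// Companion helper: map a few well-known DXVA2 decoder GUIDs to the
// pipeline stage they hand off to the GPU.
#include <windows.h>
#include <initguid.h>
#include <dxva2api.h>

static const char *DescribeProfile(const GUID &g) {
    if (IsEqualGUID(g, DXVA2_ModeMPEG2_VLD))
        return "MPEG-2 VLD: GPU takes the whole bitstream";
    if (IsEqualGUID(g, DXVA2_ModeMPEG2_IDCT))
        return "MPEG-2 IDCT: CPU still does the variable-length decode";
    if (IsEqualGUID(g, DXVA2_ModeMPEG2_MoComp))
        return "MPEG-2 MoComp: CPU does VLD + IDCT, GPU only motion comp";
    if (IsEqualGUID(g, DXVA2_ModeH264_E))
        return "H.264 VLD (no film grain)";
    if (IsEqualGUID(g, DXVA2_ModeH264_F))
        return "H.264 VLD (with film grain)";
    return "other/unknown profile";
}

A driver that only exposes the IDCT or MoComp profile still leaves the variable-length decoding on the CPU, which would fit the partial-acceleration numbers reported for the 7950GT earlier in the thread.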

Retronn.de - Vintage Hardware Gallery, Drivers, Guides, Videos. Now with file search
Youtube Channel
FTP Server - Driver Archive and more
DVI2PCIe alignment and 2D image quality measurement tool

Reply 5 of 5, by mirh

Rank: Member

havli wrote:

G80 is 6 months older than R600; comparing it to R580 is perfectly valid.

Except that was launched like 9 months earlier.
And honestly, it seemed obvious to compare it with R600, considering that technologically they are the first unified-shader architectures.

elianda wrote:

I think Wikipedia has a quite comprehensive list: https://en.wikipedia.org/wiki/Nvidia_PureVideo
Also check this table: https://www.nvidia.com/docs/CP/11036/PureVide … _Comparison.pdf

Cool! And it seems there was even something down to GeForce4.
Although it predates the ATi efforts I know of, Wikipedia does mention some kind of already-existing competing solution.

It would be nice to have a similarly in-depth comparison for UVD, anyway.

pcgamingwiki.com