VOGONS


Reply 40 of 65, by swaaye

Rank l33t++
Standard Def Steve wrote:

I remember SSE giving software MPEG-2 a huge boost back in the day

I had a Pentium 3 Katmai @ 300 MHz (66 MHz FSB x 4.5) decoding a DVD in software on a Voodoo3 with perfect fluidity the other day. I think you're right. I think SSE has some instructions specifically for MPEG-2 assistance.

Reply 41 of 65, by obobskivich

Rank l33t
Standard Def Steve wrote:

Anyway, I *believe* VP9 uses SSE4 at the very least because the Core i series CPUs are far more efficient at decoding it than previous CPUs. I "benchmarked" a few of my CPUs by playing back VP9 encoded YouTube video using Chrome's HTML5 player, and the results were quite interesting.

Opteron 185 (2C/2T, 3GHz OC, SSE3, GTX 560)
720p: 40% average CPU utilization
1080p: 85%
720p/60: 95% and still completely smooth
1080p/60: 100%, playing back at roughly half the frame rate.

Core 2 Quad Q6700 (4C/4T, 3.33GHz OC, S-SSE3, GTX 560)
720p: 20%
1080p: 45%
720p/60: 35% (Not sure why 720p/60 uses less CPU time than 1080/30 in this case, but it is what it is)
1080p/60: 92%

Core i7 4930K (6C/12T, 4.5GHz OC, SSE4 & AVX, GTX 970)
720p: 0-1%
1080p: 1-2%
720p/60: 1-2%
1080p/60: 2-3%

The very low CPU usage on the i7 is what made me initially believe that VP9 was hardware accelerated. However, GPU-Z shows zero load on the GPU's Video Engine during playback, and according to everything I've read, no GPU currently offloads VP9. I remember SSE giving software MPEG-2 a huge boost back in the day, perhaps SSE4 is doing the same thing with modern codecs.

Interesting. Just to note, 45nm Core 2 ('Penryn') also has SSE4 support. It'd be interesting to compare that as well, if possible. I don't recall my Q9550 ever having trouble with various codecs, though (but that CPU has never been paired with a GPU that doesn't help). I think multi-threading may also be a factor here: even comparing my dual Socket 604 box to a single Socket 478 changes things significantly for handling HD content, even though the CPU in the 478 is faster than either of the CPUs in the 604. In your comparison, the i7 is also a 6-core with HT, which may be helping things out too. 😀

Reply 43 of 65, by Standard Def Steve

Rank Oldbie
obobskivich wrote:
Interesting. Just to note, 45nm Core 2 ('Penryn') also has SSE4 support. It'd be interesting to compare that as well, if possible. I don't recall my Q9550 ever having trouble with various codecs, though (but that CPU has never been paired with a GPU that doesn't help). I think multi-threading may also be a factor here: even comparing my dual Socket 604 box to a single Socket 478 changes things significantly for handling HD content, even though the CPU in the 478 is faster than either of the CPUs in the 604. In your comparison, the i7 is also a 6-core with HT, which may be helping things out too. 😀

I forgot about those! It just so happens that I have a Core 2 Duo E7500 nestled snugly in an awesome little Optiplex 780 USFF I rescued not too long ago. However, CPU-Z reports that this processor only supports "half" of SSE4; SSE4.1 support is listed, but SSE4.2 is absent. Still, I hooked it up, made sure Chrome was up to date, and ran my unscientific little YouTube benchmark.

Core 2 Duo E7500 (2C/2T, 2.93GHz, SSE4.1, GMA X4500HD)
720p: 32%
1080p: 64%
720p/60: 52%
1080p/60: 100% and dropping frames.

So it appears that I was wrong. 😊 SSE4 isn't helping VP9 decode, unless the optimizations are specific to SSE4.2 (highly unlikely). It must be the combination of HT, clock speed, and higher performance per core helping the Core i CPUs. That, or AVX instructions are lending a hand.

candle_86 wrote:

well i can test my E6420 with a PCIe 6200TC and see if i can still play netflix and youtube

If my Opteron can handle it, so can your C2D. 😀

94 MHz NEC VR4300 | SGI Reality CoPro | 8MB RDRAM | Each game gets its own SSD - nooice!

Reply 45 of 65, by Skyscraper

Rank l33t
candle_86 wrote:

so got an EVGA 750 FTW SLI and Q6600 for 40 bucks

Nice! If it overclocks decently that rig should run any game but perhaps not with all the video settings maxed out.

New PC: i9 12900K @5GHz all cores @1.2v. MSI PRO Z690-A. 32GB DDR4 3600 CL14. 3070Ti.
Old PC: Dual Xeon X5690@4.6GHz, EVGA SR-2, 48GB DDR3R@2000MHz, Intel X25-M. GTX 980ti.
Older PC: K6-3+ 400@600MHz, PC-Chips M577, 256MB SDRAM, AWE64, Voodoo Banshee.

Reply 47 of 65, by Skyscraper

Rank l33t
candle_86 wrote:

its at 3.6ghz @ 4V right now easy peasey, its a G0

@ 4V it should run pretty hot 😉

3.6 @ 1.4V is really good. What is the voltage under load? Perhaps the EVGA board doesn't droop as much as the Intel chipset boards?

New PC: i9 12900K @5GHz all cores @1.2v. MSI PRO Z690-A. 32GB DDR4 3600 CL14. 3070Ti.
Old PC: Dual Xeon X5690@4.6GHz, EVGA SR-2, 48GB DDR3R@2000MHz, Intel X25-M. GTX 980ti.
Older PC: K6-3+ 400@600MHz, PC-Chips M577, 256MB SDRAM, AWE64, Voodoo Banshee.

Reply 48 of 65, by SPBHM

Rank Oldbie

I don't really consider the Q6600 and such to be proper old hardware (but I guess they are), because they are mostly fine for current software and faster than some new stuff.

But if you have a 775 board with 45nm support, it's interesting to get a cheap 771 Xeon from eBay, like the E5440; with the right board you can go over 4GHz.

Anyway, my oldest PC being used daily as a modern PC is my Athlon 64 X2 2.4GHz with nForce 6100 IGP. It struggles to stream HD videos in the web browser, but other than that it seems OK. Unfortunately, using a 4670 for video acceleration didn't work well; it's too dependent on software (web browser, Flash version, driver version) and it simply won't work most of the time (thanks in part to the HD 4000 series not getting new drivers). Using Kodi to run YouTube/Twitch streams works beautifully, though. The same was the case with my PCI 8400GS and single-core 1.8GHz K8: the web browser was not usable for videos (even 480p was bad), while Kodi worked great for streaming 720p. From what I've noticed here, if you play 60FPS streams in the web browser, you need at least a Core 2 Quad if you don't have GPU acceleration working.

Reply 51 of 65, by Chaniyth

Rank Member

If you're looking to go with Ubuntu, there's an Ubuntu MATE edition; since MATE has a similar footprint in resources to XP, it would be ideal for that Opteron. Xubuntu would be a decent edition to use too. If you upgrade the RAM and want a pretty GUI, then go with a KDE/Plasma-based distro.

All the world will be your enemy, Prince with a Thousand Enemies, and when they catch you, they will kill you... but first they must catch you. 😁

Reply 53 of 65, by ODwilly

Rank l33t
candle_86 wrote:

nah running 7 Pro 🤣

Even as a single core, that Opteron should make a fine web-browsing machine. Now that the Q6600 machine is your primary setup, I'd say you're good for a good long time!

Main pc: Asus ROG 17. R9 5900HX, RTX 3070m, 16gb ddr4 3200, 1tb NVME.
Retro PC: Soyo P4S Dragon, 3gb ddr 266, 120gb Maxtor, Geforce Fx 5950 Ultra, SB Live! 5.1

Reply 54 of 65, by swaaye

Rank l33t++

Firefox seems to use H.264 for its HTML5 videos. Does anyone know if this decodes on a modern GPU, then? I wish it were easier to determine exactly what is going on with web video and GPU decode...

Something else I've noticed is that Firefox tends to get better network utilization than Chrome on Windows XP. I have a 50 Mbps internet connection and Firefox will get that (assuming I've tweaked XP with TCPOptimizer). Chrome does not, for some reason.

Another tidbit: I've noticed with my Atom Z3770 tablet that IE11 is the fastest browser, with the smoothest video playback as well. It's impressive how much MS has improved IE.

Reply 56 of 65, by SPBHM

Rank Oldbie

In-browser video is extremely complicated, but yes, IE11 seems to make the best use of GPU acceleration here for HTML5 and Flash, which should be helping the Atom.

I would say Firefox is very inefficient for video compared to Chrome and IE11.

The problem is that different browsers, OSes, and drivers break GPU video acceleration very easily, and even just going in and out of full screen or changing resolutions can make things difficult to understand...

Reply 58 of 65, by SPBHM

Rank Oldbie
swaaye wrote:

It is pretty easy to determine if Flash is using the GPU. HTML5 is mysterious.

CPU usage is a good indication, and DXVA Checker shouldn't show anything in the trace log if hw acceleration is not being used, I think (manually disabling Flash hw acceleration in Firefox confirms that).

But even when GPU acceleration is in use, the CPU load and performance can differ from browser to browser; I think IE's tighter integration might help here...

Reply 59 of 65, by King_Corduroy

Rank Oldbie

DON'T USE UBUNTU OR MINT. Ubuntu steals your information and infringes on your rights in other ways; use a real distro like Fedora 21. I've been using Fedora for over a year now and I haven't touched my Windows 7 partition in all that time. With the MATE desktop environment, Fedora is my number one choice. Not only is it stable, it's cutting edge, and the package manager is very modern and intuitive.

Mint is based on Ubuntu and in general is not all that great for that reason. Their desktop environment, Cinnamon, is kind of power-hungry also.

Check me out at Transcendental Airwaves on Youtube! Fast-food sucks!