VOGONS


Reply 60 of 133, by feipoa

User metadata
Rank l33t++

Standard Def Steve: Would you be willing to try it with Chromium 54.20.6530.0? That seems to be the latest Chromium build for XP.

Plan your life wisely, you'll be dead before you know it.

Reply 61 of 133, by Standard Def Steve

User metadata
Rank Oldbie

Hmm, looks like Advanced Chrome 54 on XP is decoding on the CPU. But official Chrome on the Win7 partition is definitely using hardware decoding. 1080p streaming only uses 5-10% of the CPU once YouTube slows the rate of background caching. That's at 3 GHz.

94 MHz NEC VR4300 | SGI Reality CoPro | 8MB RDRAM | Each game gets its own SSD - nooice!

Reply 62 of 133, by feipoa

User metadata
Rank l33t++

Great, thanks a lot for testing this. Were you using XP 32-bit or XP 64-bit? It's really starting to sound like browser video acceleration is a no-go on XP. I may just switch to Win7 and let the graphics card take the load off the CPU. Unfortunately, my only other PCIe slot sits right next to the PCIe x16 slot, so I can't fit a double-wide PCIe card. It seems ELSA made a single-wide GTX 750, but it's impossible to find. eVGA has a 128-bit GT 740 GDDR5 card.

3 GHz? Opteron 185? How did you pull that off?

Plan your life wisely, you'll be dead before you know it.

Reply 63 of 133, by The Serpent Rider

User metadata
Rank l33t++

Lucky CPUs can do 3 GHz easily. I think I've achieved a stable 2.9-something GHz with my FX-60 on the Big Typhoon cooler.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 64 of 133, by feipoa

User metadata
Rank l33t++

I tried overclocking my FX-60 to 2.9 GHz, but it just wasn't stable. Perhaps it's my cooler and IHS paste issue... I'll find out soon enough.

Plan your life wisely, you'll be dead before you know it.

Reply 65 of 133, by Standard Def Steve

User metadata
Rank Oldbie
feipoa wrote:

Great, thanks a lot for testing this. Were you using XP 32-bit or XP 64-bit? It's really starting to sound like browser video acceleration is a no-go on XP. I may just switch to Win7 and let the graphics card take the load off the CPU. Unfortunately, my only other PCIe slot sits right next to the PCIe x16 slot, so I can't fit a double-wide PCIe card. It seems ELSA made a single-wide GTX 750, but it's impossible to find. eVGA has a 128-bit GT 740 GDDR5 card.

XP 32-bit.

The GTX 750 is a fine card, but keep in mind that its browser video acceleration is limited to H.264. It won't accelerate VP9 or HEVC, which may become more important in the future. Since browser video acceleration seems to be your main goal, you may want to look at the GT 1030 instead. All of them are single-wide, support VP9 and HEVC even at 4K/60fps, and use less power than the GTX 750. Just please don't get the gimped DDR4 version.

3 GHz? Opteron 185? How did you pull that off?

I got lucky with mine. All I had to do was increase the bus speed to 231MHz. Didn't even need any extra voltage. This CPU will also do a fully stable 3.13GHz, but that extra 133MHz requires a hefty increase in voltage:

[Attachment: Opt185-GTX560-3D01-XP.PNG (870.87 KiB, 846 views, fair use/fair dealing exception)]

But since I'm a little paranoid about electromigration/degradation, I usually just run it at 3GHz/1.35v.
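The arithmetic behind those numbers is just reference clock times CPU multiplier; a quick sketch, assuming the Opteron 185's stock 13x multiplier was left unchanged (the multiplier isn't stated above):

```python
# K8 core clock = reference clock (stock 200 MHz) x CPU multiplier.
MULTIPLIER = 13  # assumption: Opteron 185 stock multiplier, untouched

def core_clock_mhz(ref_clock_mhz: int, multiplier: int = MULTIPLIER) -> int:
    """Resulting CPU core clock in MHz."""
    return ref_clock_mhz * multiplier

print(core_clock_mhz(200))  # stock: 2600 MHz (2.6 GHz)
print(core_clock_mhz(231))  # 231 MHz reference -> 3003 MHz, the ~3 GHz quoted above
```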

94 MHz NEC VR4300 | SGI Reality CoPro | 8MB RDRAM | Each game gets its own SSD - nooice!

Reply 66 of 133, by feipoa

User metadata
Rank l33t++

Thanks for the information. The GT 1030 has no XP drivers, right? So I couldn't buy it now, use it with XP for another year, then upgrade to some other Windows OS? I'd have to be fully committed to ditching XP, and I'm just not there yet. From your response, it sounds like the industry is migrating away from H.264.

Maybe once I fix the Opteron heating issue, I can get 3 GHz out of it like you. Have you tried using an FX60? From what I understand, you should be able to just increase the multiplier.

Plan your life wisely, you'll be dead before you know it.

Reply 67 of 133, by Koltoroc

User metadata
Rank Member
feipoa wrote:

Thanks for the information. The GT 1030 has no XP drivers, right? So I couldn't buy it now, use it with XP for another year, then upgrade to some other Windows OS? I'd have to be fully committed to ditching XP, and I'm just not there yet. From your response, it sounds like the industry is migrating away from H.264.

Maybe once I fix the Opteron heating issue, I can get 3 GHz out of it like you. Have you tried using an FX60? From what I understand, you should be able to just increase the multiplier.

Correct, the GT 1030 has no XP drivers, or Vista drivers for that matter. Windows 7, 8, and 10 only.

Reply 68 of 133, by matze79

User metadata
Rank l33t

I have an Athlon 64 X2 6400+ and also can't get it properly cooled. 130W TDP.

I put an Arctic Cooling Freezer 7 Pro heatpipe cooler on top; it's not enough.

The Extreme version looks interesting, but it seems I can't find one cheap in Germany. And spending €40 on an old rig? 😳

https://www.retrokits.de - blog, retro projects, hdd clicker, diy soundcards etc
https://www.retroianer.de - german retro computer board

Reply 69 of 133, by ph4nt0m

User metadata
Rank Member

I have an Opteron 180 running at 3.0 GHz at almost the default voltage. The mainboard is an Abit AT8-32X with 8GB of DDR PC3200R, i.e. ECC registered. The board has had its fair share of MOSFET/capacitor mods, but they are irrelevant to CPU temperature. The cooler is a Zalman CNPS7500-Cu: no heat pipes, but there's a 120mm dual-ball-bearing fan spinning up to 2500rpm, and 800g of copper. Even so, I had to delid the CPU and put some liquid metal there. I recall the temps were about 60°C in LINPACK after the mod, without much noise from the fan.

My Active Sales on CPU-World

Reply 70 of 133, by agent_x007

User metadata
Rank Oldbie

FX-60 and an unlocked multiplier isn't that easy 😀
The CPU multiplier is used to clock the memory as well.
So an even multiplier will clock the memory lower than an odd multiplier (memory frequency dividers are integer-only).
Things get even more complicated with half multipliers 😁


Reply 71 of 133, by shiva2004

User metadata
Rank Member
agent_x007 wrote:

FX-60 and an unlocked multiplier isn't that easy 😀
The CPU multiplier is used to clock the memory as well.
So an even multiplier will clock the memory lower than an odd multiplier (memory frequency dividers are integer-only).
Things get even more complicated with half multipliers 😁

Nope, what's used to clock the memory (and PCI, and more) is the bus speed; CPU speed/multiplier is irrelevant.

Reply 72 of 133, by agent_x007

User metadata
Oldbie
shiva2004 wrote:

Nope, what's used to clock the memory (and PCI, and more) is the bus speed; CPU speed/multiplier is irrelevant.

Sure...
I guess my CPU-Z is simply mistaken, then, for showing "FSB:DRAM" as "CPU/number" 😁
[CPU-Z screenshot: 8sRrcQR.png]


Reply 73 of 133, by ph4nt0m

User metadata
Rank Member
shiva2004 wrote:
agent_x007 wrote:

FX-60 and an unlocked multiplier isn't that easy 😀
The CPU multiplier is used to clock the memory as well.
So an even multiplier will clock the memory lower than an odd multiplier (memory frequency dividers are integer-only).
Things get even more complicated with half multipliers 😁

Nope, what's used to clock the memory (and PCI, and more) is the bus speed; CPU speed/multiplier is irrelevant.

Agent is right. There is no FSB or base clock on K8. The memory clock is derived from the core clock using an integer divisor. When Cool'n'Quiet changes the core clock, the memory clock changes with it.
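That divisor behavior can be sketched roughly like this (a simplification, not the exact BIOS algorithm; the 200/166 MHz targets stand in for DDR400/DDR333):

```python
import math

def k8_memory_clock(multiplier: int, ref_clock: int = 200, mem_target: int = 200) -> float:
    """Approximate K8 memory clock: the core clock divided by the smallest
    integer divisor that keeps memory at or below its target frequency."""
    core = multiplier * ref_clock
    divisor = math.ceil(core / mem_target)
    return core / divisor

# With a DDR333 (166 MHz) target, the odd multiplier lands closer to target:
print(round(k8_memory_clock(10, mem_target=166), 1))  # even multi: 153.8 MHz
print(round(k8_memory_clock(11, mem_target=166), 1))  # odd multi:  157.1 MHz
```

This is also why a half multiplier complicates things: the divisor still has to be an integer, so the memory clock can end up noticeably below its target.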

My Active Sales on CPU-World

Reply 75 of 133, by Standard Def Steve

User metadata
Oldbie
feipoa wrote:

Thanks for the information. The GT 1030 has no XP drivers, right? So I couldn't buy it now, use it with XP for another year, then upgrade to some other Windows OS? I'd have to be fully committed to ditching XP, and I'm just not there yet. From your response, it sounds like the industry is migrating away from H.264.

Maybe once I fix the Opteron heating issue, I can get 3 GHz out of it like you. Have you tried using an FX60? From what I understand, you should be able to just increase the multiplier.

Looks like the only cards with hardware VP9/HEVC support and official XP drivers are the GTX 950 and GTX 960.
Google definitely wants you to use VP9, though they still provide H.264 as a fallback. Hard to say for how much longer, as all major browsers support VP9 now. Premium content providers like Netflix and Amazon use HEVC for 4k stuff, but I believe they still use H.264 for 1080p and below.

94 MHz NEC VR4300 | SGI Reality CoPro | 8MB RDRAM | Each game gets its own SSD - nooice!

Reply 76 of 133, by feipoa

User metadata
Rank l33t++

Sounds like I won't be upgrading the graphics card until I update the OS, then. It would be nice if the firmware and/or drivers for graphics cards could be updated to support new codecs, sort of like how a desktop CPU can decode new video formats in software. Whether that's possible or not, I don't know, but it certainly wouldn't be good business for hardware manufacturers.

Plan your life wisely, you'll be dead before you know it.

Reply 77 of 133, by Standard Def Steve

User metadata
Rank Oldbie

Hmm, looks like Safari (on both iOS and macOS) doesn't natively support VP9. The iOS userbase is huge, so Google/YouTube is definitely not going to be ditching H.264 anytime soon. Apple just might be your saviour here. 😁

94 MHz NEC VR4300 | SGI Reality CoPro | 8MB RDRAM | Each game gets its own SSD - nooice!

Reply 78 of 133, by ph4nt0m

User metadata
Rank Member
Standard Def Steve wrote:
feipoa wrote:

Thanks for the information. The GT 1030 has no XP drivers, right? So I couldn't buy it now, use it with XP for another year, then upgrade to some other Windows OS? I'd have to be fully committed to ditching XP, and I'm just not there yet. From your response, it sounds like the industry is migrating away from H.264.

Maybe once I fix the Opteron heating issue, I can get 3 GHz out of it like you. Have you tried using an FX60? From what I understand, you should be able to just increase the multiplier.

Looks like the only cards with hardware VP9/HEVC support and official XP drivers are the GTX 950 and GTX 960.
Google definitely wants you to use VP9, though they still provide H.264 as a fallback. Hard to say for how much longer, as all major browsers support VP9 now. Premium content providers like Netflix and Amazon use HEVC for 4k stuff, but I believe they still use H.264 for 1080p and below.

You can even use a 980 Ti or Titan X with the same drivers if you hack the INF file.

My Active Sales on CPU-World

Reply 79 of 133, by feipoa

User metadata
Rank l33t++

Standard Def Steve, you mentioned that you are dual-booting XP and Win7. Are you using a third-party boot manager, or is the Win7 installation smart enough to provide its own? I haven't played with Win7 much, so I'm a bit of a novice here. I might dual boot XP and Win10 to help ease myself off XP. I'd prefer Win7, but extended support ends in Jan 2020, so it might not be worth the effort to set up. Unless there is an XP/POSReady 2009-style fix to keep updates going on Win7?

Looks like my Opteron cooler has just shipped. That took a while!

Plan your life wisely, you'll be dead before you know it.