Reply 60 of 133, by feipoa
Standard Def Steve: Would you be willing to try it with Chromium 54.20.6530.0? This seems to be the latest Chromium for XP.
Plan your life wisely, you'll be dead before you know it.
Hmm, looks like Advanced Chrome 54 on XP is decoding on the CPU. But official Chrome on the Win7 partition is definitely using hardware decoding. 1080p streaming only uses 5-10% of the CPU once YouTube slows down the rate of background caching. That's at 3 GHz.
"A little sign-in here, a touch of WiFi there..."
Great, thanks a lot for testing this. Were you using XP 32-bit or XP 64-bit? It is really starting to sound like browser video acceleration is a no-go in XP. I may just switch to Win7 and let the graphics card take the load off. Unfortunately, my only other PCIe slot sits right next to the PCIe x16 slot, so I cannot use a double-wide PCIe card. It seems ELSA made a single-wide GTX 750, but it is impossible to find. eVGA has a 128-bit GT 740 GDDR5 card.
3 GHz? Opteron 185? How did you pull that off?
Plan your life wisely, you'll be dead before you know it.
Lucky CPUs can do 3 GHz easily. I think I achieved a stable 2.9-something GHz with my FX60 on the Big Typhoon cooler.
I must be some kind of standard: the anonymous gangbanger of the 21st century.
I tried overclocking my FX60 to 2.9 GHz, but it just wasn't stable. Perhaps it's my cooler and IHS paste issue... I'll find out soon enough.
Plan your life wisely, you'll be dead before you know it.
wrote:Great, thanks a lot for testing this. Were you using XP 32-bit or XP 64-bit? It is really starting to sound like browser video acceleration is a no-go in XP. […]
XP 32-bit.
The GTX 750 is a fine card, but keep in mind that its browser video acceleration is limited to H.264. It won't accelerate VP9 or HEVC, which may become more important in the future. Since browser video acceleration seems to be your main goal, you may want to take a look at the GT 1030 instead. All of them are single-wide, support VP9 and HEVC even at 4K/60fps, and use less power than the GTX 750. Just please don't get the gimped DDR4 version.
3 GHz? Opteron 185? How did you pull that off?
I got lucky with mine. All I had to do was increase the bus speed to 231 MHz. Didn't even need any extra voltage. This CPU will also do a fully stable 3.13 GHz, but that extra 133 MHz requires a hefty increase in voltage.
But since I'm a little paranoid about electromigration/degradation, I usually just run it at 3 GHz/1.35 V.
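For anyone curious how the bus speed maps to those core clocks, here's a quick back-of-the-envelope check in Python. It assumes the Opteron 185's stock 13x multiplier (13 x 200 MHz = 2.6 GHz), and the 241 MHz bus figure for the 3.13 GHz setting is inferred from the numbers above, not something stated in the post:

```python
# Rough sanity check of the overclocks above. On K8, core clock is simply
# bus speed times the CPU multiplier. The 13x multiplier is assumed from
# the Opteron 185's stock 2.6 GHz (13 x 200 MHz); 241 MHz is inferred.
STOCK_MULT = 13

for bus_mhz in (200, 231, 241):
    print(f"{bus_mhz} MHz bus x {STOCK_MULT} = {bus_mhz * STOCK_MULT} MHz core")

# 200 MHz bus -> 2600 MHz (stock)
# 231 MHz bus -> 3003 MHz (the ~3 GHz setting)
# 241 MHz bus -> 3133 MHz (the ~3.13 GHz setting)
```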
"A little sign-in here, a touch of WiFi there..."
Thanks for the information. The GT 1030 has no XP drivers, right? So I couldn't buy it now, use it with XP for another year, then upgrade to some other Windows OS? I'd have to be fully committed to ditching XP, and I'm just not there yet. From your response, it sounds like the industry is migrating away from H.264.
Maybe once I fix the Opteron heating issue, I can get 3 GHz out of it like you. Have you tried using an FX60? From what I understand, you should be able to just increase the multiplier.
Plan your life wisely, you'll be dead before you know it.
wrote:Thanks for the information. The GT 1030 has no XP drivers, right? So I couldn't buy it now, use it with XP for another year, then upgrade to some other Windows OS? […]
Correct, the GT 1030 has no XP drivers, or Vista drivers for that matter. Windows 7, 8, and 10 only.
I have an Athlon 64 X2 6400+ and also can't get it properly cooled. 130 W TDP.
I put an Arctic Cooling Freezer 7 Pro heatpipe cooler on top; it's not enough.
The Extreme looks interesting, but it seems I can't find one cheap in Germany.
And spending €40 on an old rig? 😳
https://www.retrokits.de - blog, retro projects, hdd clicker, diy soundcards etc
https://www.retroianer.de - german retro computer board
I have an Opteron 180 running at 3.0 GHz at almost the default voltage. The mainboard is an Abit AT8-32X with 8 GB of DDR PC3200R, that is, ECC registered. The board has had its fair share of MOSFET/capacitor mods, but they are irrelevant to the CPU temperature. The cooler is a Zalman CNPS7500-Cu: no heat pipes, but there's a 120 mm dual-ball-bearing fan running at up to 2500 rpm and 800 g of copper. Still, I had to delid the CPU and apply liquid metal. I recall the temps were about 60°C in LINPACK after the mod, without much noise from the fan.
wrote:FX-60 and unlocked multiplier isn't that easy 😀
The CPU multiplier is used to clock the memory as well.
So, an even multi will clock memory lower than an odd one (integer-only memory frequency dividers).
Things get even more complicated with half multipliers 😁
Nope, what's used to clock memory (and PCI, and more) is the bus speed; CPU speed/multiplier is irrelevant.
wrote:FX-60 and unlocked multiplier isn't that easy 😀 The CPU multiplier is used to clock the memory as well. So, an even multi will clock memory lower than an odd one. […]
wrote:Nope, what's used to clock memory (and PCI, and more) is the bus speed; CPU speed/multiplier is irrelevant.
Agent is right. There is no FSB or base clock on K8. The memory clock is derived from the core clock using an integer divisor, so when Cool'n'Quiet changes the core clock, the memory clock changes too.
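To make the divisor rule concrete, here is a minimal sketch, assuming the commonly described K8 behaviour that the memory divisor is ceil(core clock / target memory clock); the function name and the DDR400 target value are mine, for illustration only:

```python
# Minimal sketch of the K8 memory-clock derivation described above,
# assuming divisor = ceil(core clock / target memory clock).
import math

def k8_mem_clock(core_mhz: int, target_mem_mhz: int = 200) -> float:
    """Effective memory clock: core clock divided by an integer divisor."""
    divisor = math.ceil(core_mhz / target_mem_mhz)
    return core_mhz / divisor

for core in (2600, 2800, 2900, 3000, 3100):
    print(f"{core} MHz core -> {k8_mem_clock(core):.1f} MHz memory")

# 2600 -> 200.0   2800 -> 200.0   2900 -> 193.3   3000 -> 200.0   3100 -> 193.8
# Core clocks that are exact multiples of 200 MHz keep DDR400 at full speed.
```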
OK, so is it best to run the core clock in 200 MHz steps, e.g. 2600, 2800, 3000 MHz?
Plan your life wisely, you'll be dead before you know it.
wrote:Thanks for the information. The GT 1030 has no XP drivers, right? […] From your response, it sounds like the industry is migrating away from H.264. […]
Looks like the only cards with hardware VP9/HEVC support and official XP drivers are the GTX 950 and GTX 960.
Google definitely wants you to use VP9, though they still provide H.264 as a fallback. Hard to say for how much longer, since all major browsers support VP9 now. Premium content providers like Netflix and Amazon use HEVC for 4K content, but I believe they still use H.264 for 1080p and below.
"A little sign-in here, a touch of WiFi there..."
Sounds like I won't be upgrading the graphics card until I update the OS, then. It would be nice if the firmware and/or drivers for graphics cards could be updated to support new codecs, sort of like how a desktop CPU can fall back to software decoding for new video formats. Whether that is possible or not, I don't know, but it certainly wouldn't be good for hardware manufacturers.
Plan your life wisely, you'll be dead before you know it.
Hmm, looks like Safari (on both iOS and macOS) doesn't natively support VP9. The iOS userbase is huge, so Google/YouTube is definitely not going to be ditching H.264 anytime soon. Apple just might be your saviour here. 😁
"A little sign-in here, a touch of WiFi there..."
wrote:Looks like the only cards with hardware VP9/HEVC support and official XP drivers are the GTX 950 and GTX 960. […]
You can even use a 980 Ti or Titan X with the same drivers if you hack the INF file.
Standard Def Steve, you mentioned that you are dual-booting XP and Win7. Are you using a boot manager, or is the Win7 installation smart enough to provide its own? I haven't played with Win7 much, so I'm a bit of a novice here. I might dual-boot XP and Win10 to help ease my way out of XP. I'd prefer Win7, but extended support ends in Jan 2020, so it might not be worth the effort to set up. Unless there is an XP/POSReady 2009-style fix to keep updates going on Win7?
Looks like my Opteron cooler has just shipped. That took a while!
Plan your life wisely, you'll be dead before you know it.