VOGONS


Reply 20 of 87, by RandomStranger

User metadata
Rank Oldbie
TrashPanda wrote on 2022-08-05, 10:32:

🤣 the P4 EE isn't really that power heavy, I have modern CPUs that use more at stock clocks... now if it were heat we were discussing, then you'd have a point 😁

Operating temperatures went down, but 105 W is 105 W: it's still the same amount of heat, only dissipated faster and more efficiently.


Reply 21 of 87, by TrashPanda

User metadata
Rank l33t
RandomStranger wrote on 2022-08-05, 11:07:
TrashPanda wrote on 2022-08-05, 10:32:

🤣 the P4 EE isn't really that power heavy, I have modern CPUs that use more at stock clocks... now if it were heat we were discussing, then you'd have a point 😁

Operating temperatures went down, but 105 W is 105 W: it's still the same amount of heat, only dissipated faster and more efficiently.

It's not a CPU whose power use I'd personally worry about; it's not as bad as it could have been... I mean, if you want bad, just go fire up an FX 9590: a power-hungry, heat-generating waste of silicon that even a tower cooler has trouble keeping cool.

Reply 22 of 87, by RandomStranger

User metadata
Rank Oldbie

Well, that's 220 W if my memory serves me right. Definitely difficult to cool by air. Though its main problem is that it's still only marginally faster overall than a 65 W i5 from its time.


Reply 23 of 87, by 386SX

User metadata
Rank l33t
TrashPanda wrote on 2022-08-05, 10:32:

Not that I'm aware of, it would for sure be interesting to see the results of. I'm not the one to do this however, as I don't have a big Athlon XP collection and the board I have loves its +5 V rail... the bigger the better.

What I would love to see is an adapter that takes a SATA power connector and boosts the +5 V line of ATX PSUs with weak +5 V rails. Think of a breakout board that the SATA and 20-pin ATX power connectors plug into, with an ATX pigtail on it for output to the motherboard. A simple converter should be able to step the SATA connector's 12 V down to 5 V, combine that with the PSU's own +5 V rail, and output a stronger +5 V rail with more amps behind it.

Not sure how feasible such a device would be, but it would solve the issue of weak +5 V rails, and could even provide a -5 V rail too. (Perhaps with an AT pigtail it could even let an ATX PSU power an AT setup.)

I also don't have any of those late Athlon XP mainboards, only the usual nForce2-era boards that stress the +5 V rail of 20-pin ATX PSUs. Considering the classic capacitor problems of that era and how many capacitors those mainboards carry, I don't have much patience to collect and repair them all. I prefer either older, less power-demanding boards, which means fewer problems anyway (capacitors, MOSFETs, heat, the PSU of course, etc.), or more modern ones, so I don't have to repair much. 😉
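
As a rough sanity check on the booster idea quoted above, here is a back-of-envelope power budget in Python. The SATA current limit, converter efficiency, and the PSU's native +5 V rating are all assumed figures for illustration, not measurements or specs.

```python
# Back-of-envelope budget for the hypothetical +5 V booster adapter
# described above. All figures are illustrative assumptions, not specs.

def boosted_5v_amps(sata_12v_amps=4.5, buck_efficiency=0.90,
                    native_5v_amps=14.0):
    """Total +5 V current if a buck converter steps the SATA 12 V line
    down to 5 V and its output is combined with the PSU's own +5 V rail."""
    watts_in = 12.0 * sata_12v_amps          # power borrowed from +12 V
    watts_out = watts_in * buck_efficiency   # minus conversion losses
    return native_5v_amps + watts_out / 5.0  # extra amps at 5 V

# A weak 14 A +5 V rail plus 4.5 A borrowed from +12 V:
print(f"{boosted_5v_amps():.1f} A total on +5 V")  # ~23.7 A
```

In practice the two 5 V sources could not simply be wired in parallel; the converter's output would need to track the rail voltage or be diode-ORed with it, which is where the real design work lies.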

Reply 24 of 87, by TrashPanda

User metadata
Rank l33t
RandomStranger wrote on 2022-08-05, 13:03:

Well, that's 220 W if my memory serves me right. Definitely difficult to cool by air. Though its main problem is that it's still only marginally faster overall than a 65 W i5 from its time.

Yeah .. I want one for shits and giggles 😁

I wonder if it's really possible to cook a steak on one.

Reply 25 of 87, by 386SX

User metadata
Rank l33t

Sometimes I think it's not all about the CPU or SoC TDP, but also about how the chip was designed internally for its manufacturing node. For example, I recently tried an old APU, the E-350, which integrates a Radeon HD 6310 (80 shaders @ 500 MHz) and two 1.6 GHz x64 cores into one SoC. Despite theoretically not being very power demanding, the default heatsink and fan clearly have a hard time keeping it cool, with the temperature climbing even before the boot process finishes and the fan at full speed holding it around 50 to 60 °C with no way to get it lower. And that's with fresh thermal paste and all the on-demand clock/voltage logic enabled. It integrated too much for the 40 nm node, IMHO, which is why the Atom SoCs went for the slow and problematic GMA series: it came from a portable-device design, required 10-15 W less at the wall, and ran at 50 °C in summer with a passive heatsink.

Reply 26 of 87, by timw4mail

User metadata
Rank Newbie
TrashPanda wrote on 2022-08-05, 13:26:
RandomStranger wrote on 2022-08-05, 13:03:

Well, that's 220 W if my memory serves me right. Definitely difficult to cool by air. Though its main problem is that it's still only marginally faster overall than a 65 W i5 from its time.

Yeah .. I want one for shits and giggles 😁

I wonder if it's really possible to cook a steak on one.

I doubt it. The CPU will thermal-throttle, and I don't think 100 °C is hot enough to sear meat.

Reply 27 of 87, by TrashPanda

User metadata
Rank l33t
timw4mail wrote on 2022-08-05, 13:46:
TrashPanda wrote on 2022-08-05, 13:26:
RandomStranger wrote on 2022-08-05, 13:03:

Well, that's 220 W if my memory serves me right. Definitely difficult to cool by air. Though its main problem is that it's still only marginally faster overall than a 65 W i5 from its time.

Yeah .. I want one for shits and giggles 😁

I wonder if it's really possible to cook a steak on one.

I doubt it. The CPU will thermal-throttle, and I don't think 100 °C is hot enough to sear meat.

As in, put a small iron skillet on top of it to act as a sink (with a little thermal compound between them), let it get hot, which it should, then add a little butter and throw a small piece of steak in there. In theory the whole thing should act quite well as a heat sink, keeping the CPU from throttling, and given enough time it should cook the steak.

It won't be great, but it'll certainly be cooked enough to eat 🤣. I've seen it done with bacon; you have to remember that food doesn't require a super high temperature to cook, just a constant source of heat. (We are not talking about porterhouse here but rather sizzle steak, which is pretty thin.)
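
For what it's worth, a rough energy balance supports the "given enough time" claim. The steak mass, target temperature, and the fraction of heat lost to the air below are all assumptions for illustration:

```python
# Rough energy balance for the skillet-on-CPU idea above. Ignores the
# skillet's own warm-up time; every number is an illustrative assumption.

def minutes_to_cook(cpu_watts=220.0, loss_fraction=0.6,
                    steak_grams=100.0, start_c=5.0, done_c=70.0):
    """Minutes to bring a thin steak up to temperature, assuming most
    of the CPU's heat escapes to the air instead of entering the meat."""
    specific_heat = 3.5  # J/(g*K), roughly right for lean meat
    joules_needed = steak_grams * specific_heat * (done_c - start_c)
    useful_watts = cpu_watts * (1.0 - loss_fraction)
    return joules_needed / useful_watts / 60.0

print(f"~{minutes_to_cook():.0f} minutes")  # ~4 minutes under these assumptions
```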

Reply 28 of 87, by alvaro84

User metadata
Rank Member

Well, regarding the opening post, it's more than easy for me. My most loved era is DOS-capable (ISA and all) hardware, and most of it is actually less power hungry than my (already almost vintage) main Ivy Xeon/GTX 650 rig. I just measured the 486 I have on the test bench now, an MS-4138 with an IBM-made i486DX-33, an Octek Ark1000 VLB VGA, the sound card actually under test, and HDDs (one 2.5", one 3.5"), and it drew in the low 30s, in watts of course.

So using it, or my FM2 config (in the 20s-30s when idle / playing Oolite / watching YouTube, with no dVGA), instead of the Ivy Xeon (60-ish at idle) actually saves power.

On the other hand, my Phenom II X2@X4 with a Radeon 5870 or 7870 XT (it depends) - well, I keep it in stock for the heating season, when it'll run basically "for free". Sure, the AC is way more efficient, but we only have one unit for the entire house, so the attic needs some extra heat anyway...
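
The "for free" logic is that a PC is effectively a 100%-efficient resistive heater, while a heat pump moves roughly three units of heat per unit of electricity. A minimal sketch; the PC's draw and the heat pump's COP are assumed, illustrative values:

```python
# Why the winter rig is "free" only relative to resistive heating.
# The PC's draw and the heat pump's COP are illustrative assumptions.

pc_draw_w = 400        # assumed gaming load of the Phenom II + Radeon rig
heat_pump_cop = 3.0    # roughly typical for an air-source heat pump

pc_heat_w = pc_draw_w * 1.0  # essentially every watt ends up as room heat
heat_pump_electricity_w = pc_heat_w / heat_pump_cop

print(f"PC: {pc_heat_w:.0f} W of heat for {pc_draw_w} W of electricity")
print(f"Heat pump: the same {pc_heat_w:.0f} W of heat for "
      f"~{heat_pump_electricity_w:.0f} W of electricity")
```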

Shame on us, doomed from the start
May God have mercy on our dirty little hearts

Reply 29 of 87, by debs3759

User metadata
Rank Oldbie
TrashPanda wrote on 2022-08-05, 09:15:

1 kW GPUs honestly worry me... modern PCs already have heat issues with 450 W GPUs; no idea how 1 kW GPUs will be cooled.

If I score one, it'll be cooled with a water chiller, with the heat vented outside in the summer and heating the room in winter (instead of gas central heating). I can live with that 😀

See my graphics card database at www.gpuzoo.com
Constantly being worked on. Feel free to message me with any corrections or details of cards you would like me to research and add.

Reply 30 of 87, by TrashPanda

User metadata
Rank l33t
debs3759 wrote on 2022-08-05, 23:49:
TrashPanda wrote on 2022-08-05, 09:15:

1 kW GPUs honestly worry me... modern PCs already have heat issues with 450 W GPUs; no idea how 1 kW GPUs will be cooled.

If I score one, it'll be cooled with a water chiller, with the heat vented outside in the summer and heating the room in winter (instead of gas central heating). I can live with that 😀

Eventually we are going to need wall sockets that plumb the PC cooling system/rad into the HVAC system, so that the PC's cooling becomes part of a cooling system you already have and use. (Technically you wouldn't need a rad in the PC with such a setup, but it could be used to dump some heat from the cooling gas before returning it to the HVAC loop.)

Reply 31 of 87, by RandomStranger

User metadata
Rank Oldbie
TrashPanda wrote on 2022-08-06, 04:25:
debs3759 wrote on 2022-08-05, 23:49:
TrashPanda wrote on 2022-08-05, 09:15:

1 kW GPUs honestly worry me... modern PCs already have heat issues with 450 W GPUs; no idea how 1 kW GPUs will be cooled.

If I score one, it'll be cooled with a water chiller, with the heat vented outside in the summer and heating the room in winter (instead of gas central heating). I can live with that 😀

Eventually we are going to need wall sockets that plumb the PC cooling system/rad into the HVAC system, so that the PC's cooling becomes part of a cooling system you already have and use. (Technically you wouldn't need a rad in the PC with such a setup, but it could be used to dump some heat from the cooling gas before returning it to the HVAC loop.)

I don't know. I'm less and less interested in current games, and at the same time I'm not planning to buy a GPU with a TDP higher than 200 W. My sweet spot is currently around the RX 6600 XT and RTX 3060. That's a level of gaming performance I'm perfectly satisfied with, and the TDP is still manageable. I'm actually thinking of downsizing my monitor. I don't need a 40" 4K monitor; I bought it to be my TV in the first place and had it stand in as a monitor in the meantime. A sub-30" 1440p monitor perfectly satisfies my needs.

I think high resolutions on small screens are almost like a placebo. You need more computational power to drive them at native res without benefiting all that much. Cellphones are even worse in this regard, with resolutions that would suit a 24" monitor on a sub-6" screen.
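
The placebo point is easy to put in numbers with pixel density; a quick sketch (the panel sizes and resolutions are example values):

```python
# Pixel density comparison behind the "placebo" point above.
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch from resolution and panel diagonal."""
    return math.hypot(width_px, height_px) / diagonal_inches

print(f'24" 1080p monitor: {ppi(1920, 1080, 24):.0f} PPI')  # ~92
print(f'6" 1440p phone:    {ppi(2560, 1440, 6):.0f} PPI')   # ~490
```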


Reply 32 of 87, by TrashPanda

User metadata
Rank l33t
RandomStranger wrote on 2022-08-06, 04:49:
TrashPanda wrote on 2022-08-06, 04:25:
debs3759 wrote on 2022-08-05, 23:49:

If I score one, it'll be cooled with a water chiller, with the heat vented outside in the summer and heating the room in winter (instead of gas central heating). I can live with that 😀

Eventually we are going to need wall sockets that plumb the PC cooling system/rad into the HVAC system, so that the PC's cooling becomes part of a cooling system you already have and use. (Technically you wouldn't need a rad in the PC with such a setup, but it could be used to dump some heat from the cooling gas before returning it to the HVAC loop.)

I don't know. I'm less and less interested in current games, and at the same time I'm not planning to buy a GPU with a TDP higher than 200 W. My sweet spot is currently around the RX 6600 XT and RTX 3060. That's a level of gaming performance I'm perfectly satisfied with, and the TDP is still manageable. I'm actually thinking of downsizing my monitor. I don't need a 40" 4K monitor; I bought it to be my TV in the first place and had it stand in as a monitor in the meantime. A sub-30" 1440p monitor perfectly satisfies my needs.

I think high resolutions on small screens are almost like a placebo. You need more computational power to drive them at native res without benefiting all that much. Cellphones are even worse in this regard, with resolutions that would suit a 24" monitor on a sub-6" screen.

I don't think I could live without my 32" 4K panel. I don't game on it, but for everything else it's a damn life saver. I've tried to go back to 1080p for normal day-to-day use and it drives me insane how little desktop space there is. Even 1440p is hard to adapt to after being at 4K for so long.

You really don't need a super powerful GPU to run at 4K if you are not gaming; the crazy amount of desktop space simply makes 4K too useful not to have.

Though 40" is far too big for a desktop 🤣.

Reply 33 of 87, by The Serpent Rider

User metadata
Rank l33t++
RandomStranger wrote:

but 105 W is 105 W: it's still the same amount of heat, only dissipated faster and more efficiently.

Well, no. You can easily cool a Pentium 4 EE with a Big Typhoon even in Prime95, while you'll run into thermal throttling on a modern CPU with an "identical" rating and the aforementioned cooler, especially with AVX instructions involved.
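
The difference is easiest to see as heat flux density: the same watts coming off a much smaller die are much harder to extract. A minimal sketch; the Gallatin die area is approximate and the modern figure is a hypothetical chosen for comparison:

```python
# Same TDP, very different cooling difficulty: heat flux density.
# Die areas are approximate/illustrative, not exact specifications.

def heat_flux_w_per_mm2(watts, die_area_mm2):
    """Average heat flux through the die surface."""
    return watts / die_area_mm2

print(f"P4 EE (~237 mm^2):  {heat_flux_w_per_mm2(105, 237):.2f} W/mm^2")  # ~0.44
print(f"Modern (~100 mm^2): {heat_flux_w_per_mm2(105, 100):.2f} W/mm^2")  # ~1.05
```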

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 34 of 87, by Jo22

User metadata
Rank l33t++

Uhm, energy isn't 'consumed', it just changes form. ;)
(It's more about current draw/current drain...)

Edit: Never mind. In winter, a retro PC can be a nice alternative to a heater! :D

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 35 of 87, by 386SX

User metadata
Rank l33t

Reading about modern monitor resolution requirements, I suppose there aren't many people who use old 4:3 LCD monitors like my main one. 😁 Seriously, I can see only upsides to using those early 2000-2005 monitors with low resolutions: low GPU requirements and power demands, faster speed almost everywhere, even in GUI management and 2D apps, and low-resolution panels mean pixel accuracy with low-resolution media and less bandwidth everywhere. Monitor power demand depends on the model and can often be lower than the more modern LED 1080p monitor I've got but mostly never use, since it just means ever-increasing hardware requirements in every app with no real upside.

About devices heating a whole room: a long time ago I had the very first Xbox 360 revision, which indeed required a lot of power and created a lot of heat; I remember it could seriously warm up a whole room after a few hours. I've never found anything similar since, but I never tried those FX CPUs or similar things, so there might be even worse cases, I suppose. 😁

Reply 36 of 87, by TrashPanda

User metadata
Rank l33t
386SX wrote on 2022-08-06, 08:50:

Reading about modern monitor resolution requirements, I suppose there aren't many people who use old 4:3 LCD monitors like my main one. 😁 Seriously, I can see only upsides to using those early 2000-2005 monitors with low resolutions: low GPU requirements and power demands, faster speed almost everywhere, even in GUI management and 2D apps, and low-resolution panels mean pixel accuracy with low-resolution media and less bandwidth everywhere. Monitor power demand depends on the model and can often be lower than the more modern LED 1080p monitor I've got but mostly never use, since it just means ever-increasing hardware requirements in every app with no real upside.

About devices heating a whole room: a long time ago I had the very first Xbox 360 revision, which indeed required a lot of power and created a lot of heat; I remember it could seriously warm up a whole room after a few hours. I've never found anything similar since, but I never tried those FX CPUs or similar things, so there might be even worse cases, I suppose. 😁

I actually wonder about older LCD panels having lower power requirements. Modern LED-backlit panels don't use much power at all, whereas old LCDs usually used CCFL or early LED backlights that were not energy efficient.

I doubt the old 20" LCD/CCFL panels were more energy efficient than a modern 27/32" LED panel; I fully expect the modern panel to beat them in every area for efficiency.

And if it's a CRT... well, I've got some bad news for you: not even close to efficient 🤣.

Edit - The FX 9590 was a 220 W TDP CPU... it was volcanic, ate power like a fat guy at an all-you-can-eat buffet, and required a super beefy cooler or an AIO/CLC setup to handle it.

Reply 37 of 87, by 386SX

User metadata
Rank l33t
TrashPanda wrote on 2022-08-06, 09:25:
386SX wrote on 2022-08-06, 08:50:

Reading about modern monitor resolution requirements, I suppose there aren't many people who use old 4:3 LCD monitors like my main one. 😁 Seriously, I can see only upsides to using those early 2000-2005 monitors with low resolutions: low GPU requirements and power demands, faster speed almost everywhere, even in GUI management and 2D apps, and low-resolution panels mean pixel accuracy with low-resolution media and less bandwidth everywhere. Monitor power demand depends on the model and can often be lower than the more modern LED 1080p monitor I've got but mostly never use, since it just means ever-increasing hardware requirements in every app with no real upside.

About devices heating a whole room: a long time ago I had the very first Xbox 360 revision, which indeed required a lot of power and created a lot of heat; I remember it could seriously warm up a whole room after a few hours. I've never found anything similar since, but I never tried those FX CPUs or similar things, so there might be even worse cases, I suppose. 😁

I actually wonder about older LCD panels having lower power requirements. Modern LED-backlit panels don't use much power at all, whereas old LCDs usually used CCFL or early LED backlights that were not energy efficient.

I doubt the old 20" LCD/CCFL panels were more energy efficient than a modern 27/32" LED panel; I fully expect the modern panel to beat them in every area for efficiency.

And if it's a CRT... well, I've got some bad news for you: not even close to efficient 🤣.

Edit - The FX 9590 was a 220 W TDP CPU... it was volcanic, ate power like a fat guy at an all-you-can-eat buffet, and required a super beefy cooler or an AIO/CLC setup to handle it.

I've got a classic 2003 Acer LCD monitor that should have the usual CCFL backlight, but I've measured it myself at the wall plug: the wattage drops almost 50% at the lowest brightness setting (which realistically isn't 0%, more like 30-40%). While the monitor's specification says 12 V @ 3 A more or less, the power demand seriously drops to 15 watts.
I wonder whether all monitors back then worked this way, because I've also got a 2004-2005 (still Acer) 26" HD Ready TV where decreasing the brightness didn't change the power demand by a single watt (around 120 watts). I suppose backlight design wasn't yet often thought out around power saving in most products, even though it could have been. The cheap low-end 2013 Samsung 1080p VGA PC monitor I've got has a 14 V @ 2.1 A specification, and at the wall, without activating the Eco modes (I didn't test those), the power demand is close to those maximum values. Clearly the higher pixel count and the 22" of LED backlight end up making the power demands more or less similar (though unfortunately this specific model was very cheap, with bad colors that make an acceptable balance very difficult to find, low contrast ratio, almost impossible viewing angles, etc...).
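
For reference, the nameplate figure quoted above works out like this (the measured values are from the post; the rest is simple arithmetic):

```python
# Acer CCFL monitor: nameplate spec vs. measured wall draw. Measured
# figures are from the post above; the "almost 50%" drop is as reported.

nameplate_w = 12.0 * 3.0                    # 12 V @ 3 A spec -> 36 W max
min_brightness_w = 15.0                     # measured at lowest brightness
full_brightness_w = min_brightness_w / 0.5  # ~50% drop implies ~30 W

print(f"Nameplate {nameplate_w:.0f} W, ~{full_brightness_w:.0f} W at full "
      f"brightness, {min_brightness_w:.0f} W at minimum")
```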

About CRT monitors: I wish I could still use them, but besides the fixed high power demand, their lifetime is the part I don't like much. All the ones I had, even good ones, began to have unexpected random problems. If it weren't for those problems I'd find this type of monitor better than most common LCDs I've tried: native resolutions, colors, grey-shade levels, powerful brightness, etc.

Reply 38 of 87, by Jo22

User metadata
Rank l33t++
386SX wrote on 2022-08-06, 09:49:

About CRT monitors: I wish I could still use them, but besides the fixed high power demand, their lifetime is the part I don't like much. All the ones I had, even good ones, began to have unexpected random problems. If it weren't for those problems I'd find this type of monitor better than most common LCDs I've tried: native resolutions, colors, grey-shade levels, powerful brightness, etc.

It's not the picture tube that ages quickly, it's the power supply. And the flyback transformer, maybe.
TFT/LCD monitors aren't any better, by the way.
They have dying power supplies, too. And aging inverters for the CCFLs.
That's the #1 issue with flat screens showing a black screen - the monitor as such is still intact, but not the lighting.

Edit: At home, we lost at least 5 LCD monitors in the last ~20 years that way. But not a single CRT monitor.

Edit: Picture tubes can be regenerated, by the way.
The process is known as "rejuvenation".
It's applied when the tube becomes weak and loses brightness, for example.
There are rejuvenator devices that can be used. In the past, these devices weren't unheard of.
TV and radio repairmen used them occasionally, I believe.

Here are two sample videos about rejuvenation.

https://www.youtube.com/watch?v=nWqqB_OnYzE

https://www.youtube.com/watch?v=YikOY8WTnLU

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 39 of 87, by 386SX

User metadata
Rank l33t
Jo22 wrote on 2022-08-06, 11:03:
386SX wrote on 2022-08-06, 09:49:

About CRT monitors: I wish I could still use them, but besides the fixed high power demand, their lifetime is the part I don't like much. All the ones I had, even good ones, began to have unexpected random problems. If it weren't for those problems I'd find this type of monitor better than most common LCDs I've tried: native resolutions, colors, grey-shade levels, powerful brightness, etc.

It's not the picture tube that ages quickly, it's the power supply. And the flyback transformer, maybe.
TFT/LCD monitors aren't any better, by the way.
They have dying power supplies, too. And aging inverters for the CCFLs.
That's the #1 issue with flat screens showing a black screen - the monitor as such is still intact, but not the lighting.

Edit: At home, we lost at least 5 LCD monitors in the last ~20 years that way. But not a single CRT monitor.

Edit: Picture tubes can be regenerated, by the way.
The process is known as "rejuvenation".
It's applied when the tube becomes weak and loses brightness, for example.
There are rejuvenator devices that can be used. In the past, these devices weren't unheard of.
TV and radio repairmen used them occasionally, I believe.

Interesting. I remember those problems felt more like random electrical noise that changed the screen geometry for a second. I've read about the possibility of dust interference inside, and I opened them up but didn't see anything wrong; still, the problem kept happening occasionally, and in the end I stopped using the monitor. I suppose in the past such problems could have been found and fixed easily, considering how long that CRT technology lasted, right up into the 2000s.