VOGONS



Reply 40 of 84, by Peter.Mengel

Rank: Member
TrashPanda wrote on 2022-08-16, 02:30:
Peter.Mengel wrote on 2022-08-16, 02:26:
TrashPanda wrote on 2022-08-16, 01:53:

When you grow up with higher mains voltages you learn to be pretty dang respectful of it, so we don't have any more electrical fires here than you would normally expect. Our codes here are also pretty stringent, and most if not all newer homes have RCD-based breakers.

And our ovens are 400 V, up to 8000 watts. As far as I know, it's a special electrical cable system found in all kitchens in Germany.
I guess this is the only system that could harm you if you're doing dumb things.

Doesn't Germany use 3-phase for high-amperage devices like ovens and HVAC?

HVAC here is usually wired directly to the mains with its own dedicated circuit, but the little window-mounted AC units run from 15 A sockets, and ovens run from their own dedicated 20 A circuit. AFAIK very few houses in Australia have access to 3-phase, as it's mostly used for industrial purposes here.

To be honest, I really have no clue, sir! I know some tip-of-the-iceberg facts, but the deeper knowledge just isn't there. xD
There is a separate fuse just for the high-voltage stuff (the on/off one); otherwise the other fuses are split up by room. But that's as far as my knowledge of it goes.

Reply 41 of 84, by Standard Def Steve

Rank: Oldbie
TrashPanda wrote on 2022-08-16, 03:02:
Standard Def Steve wrote on 2022-08-16, 02:45:
TrashPanda wrote on 2022-08-15, 17:58:

Australia is lucky: I can pull ~3000 watts from the wall without killing the breaker; 240-250 V AC @ 13 A is a wonderful thing. Though if I want to run a 1600 W PSU I'll need to use the 15 A wall plug, because they don't put 10 A power leads on 1600 W PSUs.

Hey, I can run a kettle and microwave off of my wimpy 20a/120v Canadian outlet at the same time. And yeah, that's well over 2400w and probably close to 3000w, so it should be tripping the breaker. But it doesn't, and it hasn't started a single fire in the 19 years I've owned this house, so I keep doing it. Huzzah, baby.

Being a 20 A line, it likely has some nice thick cabling, which would help it pull far more than it should.

So the microwave is a 1200w unit, and the kettle pulls in 1500w, making the combined load 300w over limit. But like most things in life, the breakers are probably designed to give us dumb users a fair bit of leeway. 😀
And you're probably right about the cabling!
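The arithmetic in that exchange is easy to sanity-check: a circuit's continuous rating is just volts times amps. A quick sketch in Python, using the figures from the posts above (120 V / 20 A circuit, 1200 W microwave, 1500 W kettle):

```python
# Sanity-check a household circuit load against its breaker rating (P = V * I).

def circuit_capacity_w(volts, amps):
    """Continuous capacity of the circuit in watts."""
    return volts * amps

def overload_w(loads_w, volts, amps):
    """How far the combined appliance load exceeds the rating (negative = headroom)."""
    return sum(loads_w) - circuit_capacity_w(volts, amps)

print(circuit_capacity_w(120, 20))        # 2400 W capacity
print(overload_w([1200, 1500], 120, 20))  # microwave + kettle -> 300 W over
```

In practice thermal-magnetic breakers tolerate modest overloads for a while before tripping (the trip time depends on how far over you are), which lines up with the "fair bit of leeway" observation.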

P6 chip. Triple the speed of the Pentium.
Tualatin: PIII-S @ 1628 MHz | QDI Advance 12T | 2GB DDR-310 | 6800GT | X-Fi | 500GB HDD | 3DMark01: 14,059
Dothan: PM @ 2720 MHz | MSI Speedster FA4 | 2GB DDR2-544 | GTX-280 | X-Fi | 500GB SSD | 3DMark01: 42,148

Reply 42 of 84, by TrashPanda

Rank: l33t
Standard Def Steve wrote on 2022-08-16, 06:23:
TrashPanda wrote on 2022-08-16, 03:02:
Standard Def Steve wrote on 2022-08-16, 02:45:

Hey, I can run a kettle and microwave off of my wimpy 20a/120v Canadian outlet at the same time. And yeah, that's well over 2400w and probably close to 3000w, so it should be tripping the breaker. But it doesn't, and it hasn't started a single fire in the 19 years I've owned this house, so I keep doing it. Huzzah, baby.

Being a 20 A line, it likely has some nice thick cabling, which would help it pull far more than it should.

So the microwave is a 1200w unit, and the kettle pulls in 1500w, making the combined load 300w over limit. But like most things in life, the breakers are probably designed to give us dumb users a fair bit of leeway. 😀
And you're probably right about the cabling!

Well, in the setup we have here the 10 A lines all have 16 A breakers, the 15 A line has a 20 A breaker, and the stove's 20 A line is on a 32 A breaker, IIRC. Now, I believe these are the hard limits of what the breaker will allow, but by rights it should trip before it hits them. (Yeah, this house is a weird setup; it was recently upgraded electrically, so all the wiring was updated along with both breaker panels.)

This gives the wall sockets roughly 4000 watts to play with, not that I would ever draw that much with one device.
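The "roughly 4000 watts" figure follows directly from those pairings: a 16 A breaker at 240 V allows 3840 W. A small sketch tabulating the socket-to-breaker pairings described above (figures from the post, at an assumed 240 V mains):

```python
# Socket-to-breaker pairings as described in the post, at 240 V mains.
MAINS_V = 240
breaker_for_socket = {10: 16, 15: 20, 20: 32}  # socket amps -> breaker amps

for socket_a, breaker_a in sorted(breaker_for_socket.items()):
    # The breaker, not the socket, sets the hard ceiling before it trips.
    ceiling_w = MAINS_V * breaker_a
    print(f"{socket_a} A socket on a {breaker_a} A breaker: up to {ceiling_w} W")
```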

Oh noes, the cap let the shmooo out 😁

Reply 43 of 84, by Hezus

Rank: Member
antrad wrote on 2022-08-15, 18:39:

I would like to upgrade to a 27" 1440p monitor and buy a GPU for it. Since I only play older games I was thinking of buying something like a used RX 580 or GTX 1070 which should be enough for playing older games on high resolutions.

I recently upgraded to a 27" 1440p screen with 120 Hz, and so far my RTX 1070 has been sufficient for playing games from a few years ago (like Battlefield 5) or newer, lighter titles. Playing Goblins of Elderstone now. Not sure how the 1070 will hold up with newer titles, since I don't really play that much.

I do keep an eye on the price fluctuations for the RTX 30 series, but honestly, I wouldn't know why I would need a new GPU. It's not going to give me any noticeable performance gain for the stuff I use my PC for nowadays. I've also been more mindful of my power usage lately, so slapping a watt-hungry beast in there just for bragging rights... meh.

Visit my YT Channel!

Reply 44 of 84, by Shagittarius

Rank: Oldbie
Hezus wrote on 2022-08-16, 08:05:
antrad wrote on 2022-08-15, 18:39:

I would like to upgrade to a 27" 1440p monitor and buy a GPU for it. Since I only play older games I was thinking of buying something like a used RX 580 or GTX 1070 which should be enough for playing older games on high resolutions.

I recently upgraded to a 27 1440p screen with 120 hz and so far my RTX 1070 has been sufficient playing games from a few years ago (like Battlefield 5) or newer lighter titles. Playing Goblins of Elderstone now. Not sure how the 1070 will hold up with newer titles, since I don't really play that much.

I do keep an eye on the price fluctuations for the RTX 30 series but honestly, I wouldn't know why I would need a new GPU. It's not going to give me any noticable performance gain for the stuff I use my pc for nowadays. I've also been more mindful of my power usage lately, so slapping a watt-hungry beast in there just for bragging rights... meh.

GTX 1070, and yeah, if you don't play modern games, why would you bother upgrading? Also, it's only watt-hungry if you utilize its power. It's likely at least as efficient for the things you do with your 1070, probably more so if you keep a frame rate limit on it.

I'm not the only one who limits their FPS to their monitor's refresh rate, right?
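Since FPS capping comes up a lot in this thread: conceptually, a frame limiter is just a paced loop that sleeps away the spare time each frame. A minimal, purely illustrative sketch in Python (real games do this in the engine or driver, with higher-resolution timers):

```python
import time

def frame_limiter(target_fps):
    """Yields once per frame, sleeping so the loop runs at most target_fps times per second."""
    frame_time = 1.0 / target_fps
    next_deadline = time.perf_counter()
    while True:
        yield
        next_deadline += frame_time
        sleep_for = next_deadline - time.perf_counter()
        if sleep_for > 0:
            time.sleep(sleep_for)
        else:
            # Fell behind (a slow frame); resync instead of sprinting to catch up.
            next_deadline = time.perf_counter()

# Usage: cap a render loop at a 144 Hz monitor's refresh rate.
limiter = frame_limiter(144)
for _ in range(5):
    next(limiter)
    # render_frame() would go here
```

Besides matching the display, capping like this keeps the GPU from burning power on frames that can never be shown.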

Reply 45 of 84, by Meatball

Rank: Oldbie
Shagittarius wrote on 2022-08-16, 14:22:
Hezus wrote on 2022-08-16, 08:05:
antrad wrote on 2022-08-15, 18:39:

I would like to upgrade to a 27" 1440p monitor and buy a GPU for it. Since I only play older games I was thinking of buying something like a used RX 580 or GTX 1070 which should be enough for playing older games on high resolutions.

I recently upgraded to a 27 1440p screen with 120 hz and so far my RTX 1070 has been sufficient playing games from a few years ago (like Battlefield 5) or newer lighter titles. Playing Goblins of Elderstone now. Not sure how the 1070 will hold up with newer titles, since I don't really play that much.

I do keep an eye on the price fluctuations for the RTX 30 series but honestly, I wouldn't know why I would need a new GPU. It's not going to give me any noticable performance gain for the stuff I use my pc for nowadays. I've also been more mindful of my power usage lately, so slapping a watt-hungry beast in there just for bragging rights... meh.

GTX 1070, and yeah, if you don't play modern games why would you bother upgrading. Also its only watt-hungry if you utilize its power. It's likely at least as efficient for the things you do with your 1070, probably more so if you keep a frame rate limit on it.

I'm not the only one who limits their FPS to their monitors refresh rate right?

I limit to the monitor's refresh rate also; currently 144 Hz on the bottom 16:9 one and 120 Hz on the 21:9 one.

***->WINNER, 1ST PLACE<-***
2022 #QUAKE3totheMAX -560.5fps-
Brain Drain Retro LAN https://discord.com/channels/799008837918261328
Windows ME
NForce2 A7N8X-E DLX
Athlon 848/154MHz
DDR@411MHz (2-3-3-3)
GeForce 256 DDR@144/344MHz
ESS Maestr0-1

Reply 46 of 84, by Shagittarius

Rank: Oldbie
Meatball wrote on 2022-08-16, 14:45:
Shagittarius wrote on 2022-08-16, 14:22:
Hezus wrote on 2022-08-16, 08:05:

I recently upgraded to a 27 1440p screen with 120 hz and so far my RTX 1070 has been sufficient playing games from a few years ago (like Battlefield 5) or newer lighter titles. Playing Goblins of Elderstone now. Not sure how the 1070 will hold up with newer titles, since I don't really play that much.

I do keep an eye on the price fluctuations for the RTX 30 series but honestly, I wouldn't know why I would need a new GPU. It's not going to give me any noticable performance gain for the stuff I use my pc for nowadays. I've also been more mindful of my power usage lately, so slapping a watt-hungry beast in there just for bragging rights... meh.

GTX 1070, and yeah, if you don't play modern games why would you bother upgrading. Also its only watt-hungry if you utilize its power. It's likely at least as efficient for the things you do with your 1070, probably more so if you keep a frame rate limit on it.

I'm not the only one who limits their FPS to their monitors refresh rate right?

I limit to monitor refresh rate also; currently 144Hz 16:9 on the bottom one and 120Hz on the 21:9 one.

I've got a true 10-bit 4K monitor, but only if I run it at 98 Hz. I limit to 90 FPS and find it smooth enough in combination with G-Sync.

Reply 47 of 84, by Meatball

Rank: Oldbie
Shagittarius wrote on 2022-08-16, 15:02:
Meatball wrote on 2022-08-16, 14:45:
Shagittarius wrote on 2022-08-16, 14:22:

GTX 1070, and yeah, if you don't play modern games why would you bother upgrading. Also its only watt-hungry if you utilize its power. It's likely at least as efficient for the things you do with your 1070, probably more so if you keep a frame rate limit on it.

I'm not the only one who limits their FPS to their monitors refresh rate right?

I limit to monitor refresh rate also; currently 144Hz 16:9 on the bottom one and 120Hz on the 21:9 one.

I've got a true 10bit 4k monitor but only if I run it at 98Hz. I limit to 90FPS and I find it smooth enough in combination with GSync.

Going from 60Hz to 120Hz (for me) was one of those magic technology moments; like when I first saw full motion video on the PSOne, or Super Mario 64 demo'ing inside Toys 'R Us. All going to 144Hz did was increase my self-inflicted fps snobbery, heh...


Reply 48 of 84, by Shagittarius

Rank: Oldbie
Meatball wrote on 2022-08-16, 15:11:
Shagittarius wrote on 2022-08-16, 15:02:
Meatball wrote on 2022-08-16, 14:45:

I limit to monitor refresh rate also; currently 144Hz 16:9 on the bottom one and 120Hz on the 21:9 one.

I've got a true 10bit 4k monitor but only if I run it at 98Hz. I limit to 90FPS and I find it smooth enough in combination with GSync.

Going from 60Hz to 120Hz (for me) was one of those magic technology moments; like when I first saw full motion video on the PSOne, or Super Mario 64 demo'ing inside Toys 'R Us. All going to 144Hz did was increase my self-inflicted fps snobbery, heh...

Time for my snobbery: HDR at 1100 nits is something you have to see in person to understand. Sunlight looks like actual sunlight. To me, HDR 1000+ is far more important than any other video consideration.

Reply 49 of 84, by chris2021

Rank: Oldbie

How many dookie GPUs can be plugged into your average motherboard, and would there be a benefit? SLI/Crossfire used to be a thing; not sure if it still is.

Reply 50 of 84, by appiah4

Rank: l33t++

Radeon 6600 is down as low as $270 where I live... Still patiently waiting for it to hit the $250 mark.

Retronautics: A digital gallery of my retro computers, hardware and projects.

Reply 51 of 84, by TrashPanda

Rank: l33t
chris2021 wrote on 2022-08-17, 11:40:

How many dookie GPUs can be plugged into your average motherboard, and would there be a benefit? SLI/Crossfire used to be a thing; not sure if it still is.

It's not. The only nVidia cards that still support it are the 3090s, and it got renamed to NVLink, since it's now positioned towards professional applications instead of gaming.

Even Crossfire is dead, and neither AMD nor nVidia make game profiles for dual GPUs anymore.

Oh noes, the cap let the shmooo out 😁

Reply 52 of 84, by Kallestofeles

Rank: Newbie

Going for the 48" LG C1 was probably one of the best, and consequently also one of the worst, things I could have done since ditching my nostalgic CRTs. Now I'm stuck always going for high-end parts, and the wallet has already been burnt to ashes thanks to the GPU boom.
Currently I have settled on the 3080 Ti (TIEEEEEEEE!) and it feeds the panel well. A bit of undervolting was necessary to bring the noise down, but the Strix can cool the chip quite reasonably without much coil whine (though that is always a lottery).
When it comes to next gen, though, I shall wait out the reviews, and whatever provides the best performance at a "reasonable" price is probably what I'll upgrade to, in order to keep feeding the panel with newer games that I'm hoping to eventually get to... yeah, maybe eventually... I mostly play late-90s/early-00s stuff and the backlog on the PC is HUGE; probably most of you can relate here.

That might raise a question - "Why upgrade at all if >75% of the gaming you do is retro?"
Answer: I am a sucker for both new and retro tech 😢

Reply 53 of 84, by bloodem

Rank: Oldbie

I will never buy a 600+ W GPU (which surely has 1 kW+ transient spikes). I have a Seasonic Focus GX 850 W PSU, which could maybe handle the load, and I also have a 230-240 V / 32 A electrical system at home, but... still, no thanks. Also, I live in Europe, and we are kind of going through a major energy crisis here (which will most likely get even worse).

Either way, I've never purchased a high-end GPU in my life; I always go for the midrange cards. So I'm hoping that the RTX 4060 Ti / 4070 will stay below 250 W (unlikely, I know). If not, I will jump ship to AMD (based on the rumours so far, it seems that RDNA 3 cards will have very good performance per watt).

2 x PGA132 / 5 x Socket 3 / 9 x Socket 7 / 12 x SS7 / 1 x Socket 8 / 14 x Slot 1 / 5 x Slot A
5 x Socket 370 / 8 x Socket A / 2 x Socket 478 / 2 x Socket 754 / 3 x Socket 939 / 7 x LGA775 / 1 x LGA1155
Current rig: Ryzen 5 3600X
Backup rig: Core i7 7700k

Reply 54 of 84, by TheMobRules

Rank: Oldbie

I've had a 5700 XT since 2019 and I'm not planning to upgrade soon. I only play a few select modern games, which all run fine at 1440p upscaled to 4K with Radeon Super Resolution; it looks great and plays fine. I have a 55-inch 60 Hz TV, so there's no reason for me to chase insane framerates, and I'm not really interested in getting a display with higher refresh rates either.

What annoys me is the noise when the fans on the card ramp up during gaming (I have a Gigabyte card); if I don't set an aggressive fan curve, this card gets really hot, even after replacing the stock thermal paste. I know AMD says they can run up to 110 °C without throttling, but I don't really like running something at around 90 °C for extended periods of time. I just think modern GPUs in general run way too hot...

Reply 55 of 84, by bloodem

Rank: Oldbie
TheMobRules wrote on 2022-08-17, 16:49:

What annoys me is the noise when the fans on the card ramp up during gaming (I have a Gigabyte card), if I don't set an aggresive fan curve this card gets really hot, even after replacing the stock thermal paste. I know AMD says they can run up to 110C without throttling but I don't really like running something at like 90C for extended periods of time. I just think modern GPUs in general run way too hot...

My Gigabyte RTX 2060 also ran quite hot (84-85 °C in very demanding games), but I undervolted it immediately after buying it in 2019, using the voltage/frequency curve editor in MSI Afterburner. Thankfully, it handled the undervolting beautifully (I also managed to keep the original overclocked frequency while doing so), and the card has stayed below 75 °C ever since.
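For anyone wondering why undervolting works so well: dynamic (switching) power scales roughly with frequency times voltage squared, so shaving voltage while keeping the clock cuts power disproportionately. A toy calculation; the voltages below are made-up illustrative numbers, not actual RTX 2060 curve points:

```python
# Dynamic power scales roughly as P ~ f * V^2 (leakage ignored for simplicity).

def relative_power(freq_mhz, volts, ref_freq_mhz, ref_volts):
    """Dynamic power relative to a reference operating point."""
    return (freq_mhz / ref_freq_mhz) * (volts / ref_volts) ** 2

# Same clock, 0.90 V instead of 1.00 V: about 19% less dynamic power.
print(relative_power(1900, 0.90, 1900, 1.00))
```

This is why a modest undervolt can drop temperatures noticeably without giving up any frequency, exactly as described above.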


Reply 56 of 84, by Kallestofeles

Rank: Newbie
bloodem wrote on 2022-08-17, 16:35:

I will never buy a 600+W GPU (that surely has 1kW+ transient spikes). I have a Seasonic Focus GX 850W PSU, which would maybe handle the load, I also have a 230-240V / 32Amp electrical system at home, but... still, no thanks. Also, I live in Europe, and we are kind of going through a major energy crisis here (which will most likely get even worse).
...

I hear you loud and clear. I tested the waters with a 6900 XT (Sapphire Nitro+), and whenever a transient spike happened (i.e. whenever a game engine decided to go full mental on GPU resources), it tripped my old 850 W BitFenix Whisper M 80 PLUS Gold PSU. I upgraded to a 1 kW be quiet! unit, but in all honesty the power draw of these cards (especially the high-end ones) is getting out of hand. During hot summer days, running anything on the computer easily adds 1 °C to the already 27 °C room in my household. 😅
On a similar note, greetings from Estonia... we just had the legendary electricity spike of 4 EUR per kWh. 🙇‍♂️
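At prices like that, the running cost of a big GPU is easy to put a number on: kWh consumed times price per kWh. A quick sketch; the 350 W draw and 3 hours of gaming are assumptions for illustration, while the per-kWh prices are the ones mentioned in the thread:

```python
# Cost of running an electrical load: energy in kWh times price per kWh.

def energy_cost_eur(watts, hours, eur_per_kwh):
    return watts / 1000 * hours * eur_per_kwh

print(energy_cost_eur(350, 3, 4.00))  # at the 4 EUR/kWh spike: about 4.2 EUR
print(energy_cost_eur(350, 3, 0.13))  # at ~0.13 EUR/kWh: about 0.14 EUR
```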

Reply 57 of 84, by The Serpent Rider

Rank: l33t

Nvidia, AMD and the card manufacturers are pretty much overclocking video cards to the max out of the box. So you have to underclock and undervolt them now, unlike in the good old days, when manufacturers tended to leave quite a lot of headroom in hardware capabilities.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 58 of 84, by gaffa2002

Rank: Member

Whenever I think about upgrading, I tend to make a list of games I want that can only run on the new machine, or would at least see a good improvement. Then five minutes later I give up on the idea, because the list hardly has any games on it.
Modern games really don't interest me anymore, nor do the technology advancements aimed at gaming (exponential processing power required for logarithmic gains; it's just not worth it anymore). IMHO technology has many other areas in which to progress, and I wouldn't mind at all if the gaming industry took a break from hardware advancements and focused on making better games for the existing hardware instead.

LO-RES, HI-FUN

My DOS/ Win98 PC specs

EP-7KXA Motherboard
Athlon Thunderbird 750mhz
256Mb PC100 RAM
Geforce 4 MX440 64MB AGP (128 bit)
Sound Blaster AWE 64 CT4500 (ISA)
32GB HDD

Reply 59 of 84, by bloodem

Rank: Oldbie
Kallestofeles wrote on 2022-08-17, 19:21:

On a similar note - greetings from Estonia... we just had the legendary electricity spike of 4EUR per KWH.🙇‍♂️

Greetings from Romania! But... damn, man! And to think that I was complaining about now paying 0.62 lei (~0.13 EUR)/kWh, when a year ago I was paying 0.24 lei (~0.04 EUR)/kWh.
I promise I won't complain anymore... 😁
