VOGONS


RAM prices have gone insane


Reply 280 of 298, by Hoping

User metadata
Rank Oldbie

I was lucky enough to buy three 64GB DDR4 3200 kits in February last year for €120 each, as well as a 128GB DDR4 2600 kit for another €120 and another 64GB DDR4 2400 kit for my two X99 motherboards, 32GB of DDR4 3200 for my laptop for €140, and 16GB of DDR3L for my mini PC router. As well as five 512GB NVMe SSDs and one 1TB NVMe SSD. It was a year of upgrading RAM and storage, and I can't believe how lucky I was. Now I'm worried that something might fail, because here warranty claims are refunded at the original purchase price; at current prices, replacing anything would be impossible for me.
It used to be normal for me to buy something at one price, only to see it cheaper the next day. I've never seen anything like what's happening now; even in 2011, with the floods, HDD prices didn't climb the way they have lately.

Reply 281 of 298, by megatron-uk

User metadata
Rank l33t

We've all seen spikes before due to natural disasters, temporary shortages or similar, but this is the first time I've seen an effect of this magnitude across so many different sectors, all because of the insane claims from one strand of technology... the incredible cost of all the equipment and power driving this hype machine isn't paying for itself according to any report that I've seen. Most services are running on venture capital and investor funding... but no-one appears to be actually covering their costs, or indeed making any money from this, once those sources of income are excluded.

We're seeing the cost of some workstations and high-end equipment double or triple in the space of 3-6 months... not component prices, but entire systems... and at the bottom end, if you're not willing to swallow the huge price rises, then entry-level laptops and desktops now read like manufacturer specs from 10 or 15 years ago. That's going to be great for productivity.

It is all certifiably insane.

My collection database and technical wiki:
https://www.target-earth.net

Reply 282 of 298, by Shponglefan

User metadata
Rank l33t
megatron-uk wrote on 2026-03-27, 18:00:

Most services are running on venture capital and investor funding... but no-one appears to be actually covering their costs, or indeed making any money from this, once those sources of income are excluded.

It feels like the dot-com boom all over again.

Pentium 4 Multi-OS Build
486 DX4-100 with 6 sound cards
486 DX-33 with 5 sound cards

Reply 283 of 298, by Ozzuneoj

User metadata
Rank l33t

This is hypothetical... I don't actually think this is going to happen... but what if this bubble burst SO BADLY that nearly all "AI" focused hardware was immediately rendered obsolete and functionally worthless? Imagine the crazy things people would come up with to do with that hardware as it ended up in various places around the world (E-waste, harvesting chips to make homebrew hardware, etc.).

I have seen the kind of ridiculous stuff that lands on scrap or second hand markets when businesses fold and have to liquidate immediately.

It kind of makes me ill to think about it, but some time last year I saw a scrap seller on a popular selling platform selling multiple piles of very large server CPUs that had the heat spreaders popped off. I almost had a heart attack because I knew what they were based on the distinctive pad layout on the bottom... and I knew because earlier that year I had found a single one of them (with heat spreader) by chance and sold it for $500 as-is... they regularly sold for $900+ in working condition, and sold new for thousands of dollars each just a couple of years before that. This seller had - HUNDREDS - of them being sold for scrap prices... delidded, of course. It was easily half a million dollars' worth of high-end ARM-based server processors. I contacted him to see if he still had the heat spreaders somewhere, but he was completely unresponsive. Around that time the seller ended a bunch of his listings, and later replied to me saying he no longer had them. I was... disappointed. But hey, they weren't mine. I hope he got something out of them.

Anyway... I think if this bubble bursts in the worst way possible, there is a slight chance we could see crazy things like this ending up on the junk markets. Imagine having several TB of high-end VRAM and dozens of 5090-level GPU cores just lying in heaps, attached to now-useless proprietary boards, waiting to be repurposed into gaming hardware by people with the tools and skills needed to cobble them into something useful.

... again, this very likely won't happen, but it is fun to think about.

Now for some blitting from the back buffer.

Reply 284 of 298, by rmay635703

User metadata
Rank l33t
Ozzuneoj wrote on 2026-03-27, 18:40:
This is hypothetical... I don't actually think this is going to happen... but what if this bubble burst SO BADLY that nearly all […]

Using your GPU's memory to supplement system RAM over x16 would be interesting, to say the least.

Reply 285 of 298, by Ozzuneoj

User metadata
Rank l33t
rmay635703 wrote on 2026-03-27, 19:03:
Ozzuneoj wrote on 2026-03-27, 18:40:
This is hypothetical... I don't actually think this is going to happen... but what if this bubble burst SO BADLY that nearly all […]

Using your GPU's memory to supplement system RAM over x16 would be interesting, to say the least.

I keep picturing all the goofy things that have turned up on Chinese markets in recent years... like Mobile RX 6600 GPUs soldered onto discrete PCI-E cards for bargain prices when GPUs were super expensive. Or, ATX\ITX motherboards with integrated high end laptop\mini-PC CPUs. Or even something as simple as mass produced adapters to make standard hardware work with proprietary stuff.

How about motherboards with a bunch of onboard high-speed, GPU-focused memory to act as an L4 cache of sorts? Or even put it on a removable module to make it expandable\replaceable. That would be slick.

It all sounds far-fetched, but... the industry has never seen anything like this before, with the sheer volume of high-end silicon being produced. It is hard to even comprehend the amount of absurdly fast memory and the millions of ludicrously powerful GPUs being pumped out for "AI" datacenters. I have read that some of these places are being set up with 100,000+ GB200 units. As far as I understand it, a GB200 incorporates dual B200 GPU dies, one Grace CPU, and as much as 864GB of memory in total (HBM3e on the GPUs plus LPDDR5X on the Grace side, not unified), with 8TB/sec of bandwidth available to each GPU. That means a datacenter like this could have over 86 petabytes of memory. Sadly, I think most or all of it is actually part of the GPU\CPU die\package itself, so it cannot be separated from it.
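If you want to sanity-check that 86-petabyte figure, the arithmetic is trivial to script (both inputs are just the numbers above, taken at face value):

```python
# Back-of-envelope check of the aggregate memory claim above.
# Both inputs are the rumored/spec numbers quoted in this post, not measurements.
modules = 100_000      # rumored GB200 units in one large "AI" datacenter
gb_per_module = 864    # GB per GB200 (HBM3e on the GPUs + LPDDR5X on Grace, not unified)

total_gb = modules * gb_per_module
print(f"{total_gb:,} GB = {total_gb / 1e6:.1f} PB")  # 86,400,000 GB = 86.4 PB
```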

Yet consumers can't even get Nvidia to sell a current-generation GPU with more than 8GB of mid-range peasant GDDR7 for under $429 MSRP (actually $550+ right now).

If the bubble burst and no one wanted this stuff, there's probably enough HBM3e out in the wild right now (attached to GPUs... 🤣) to replace all the system RAM and VRAM that all of the home computing\gaming users in the world could conceivably use for at least the next decade. (Again, I know this isn't how it works, it's just funny to think about.)

Last edited by Ozzuneoj on 2026-03-27, 20:53. Edited 2 times in total.

Now for some blitting from the back buffer.

Reply 286 of 298, by Law212

User metadata
Rank Member

I was lucky that I built my new PC right before the prices went up. Though I only got 32 gigs of DDR5, I now wish I'd gotten 64.
I hope this PC lasts a good long time. It's an i9 14900KF, Asus Z790, RTX 580, 32 gigs, and 4TB of M.2 storage.
I hate hearing about 60-series cards because it reminds me how fast cards become obsolete.

Reply 287 of 298, by Trashbytes

User metadata
Rank Oldbie
Law212 wrote on 2026-03-27, 20:29:

I was lucky that I built my new PC right before the prices went up. Though I only got 32 gigs of DDR5, I now wish I'd gotten 64.
I hope this PC lasts a good long time. It's an i9 14900KF, Asus Z790, RTX 580, 32 gigs, and 4TB of M.2 storage.
I hate hearing about 60-series cards because it reminds me how fast cards become obsolete.

If the rumors from Mooreslawisdead are true, the 6000 series will be full AI-rendering GPUs; they won't have raster or RT cores. So if you want all your graphics to be fake AI graphics, then the 6000 series will be exactly what you want.

If, however, you value raw raster power over AI fakery, then AMD or Intel will be the only options.
DLSS 5 was a preview of what these AI hallucinations of your graphics will look like... lovely Instagram-filtered content.

I have a 4090 and a 9070XT and I think they will both last a good long while before I need to think about looking at another GPU.

Reply 288 of 298, by Trashbytes

User metadata
Rank Oldbie
rmay635703 wrote on 2026-03-27, 19:03:
Ozzuneoj wrote on 2026-03-27, 18:40:
This is hypothetical... I don't actually think this is going to happen... but what if this bubble burst SO BADLY that nearly all […]

Using your GPU's memory to supplement system RAM over x16 would be interesting, to say the least.

PCIe is still not fast enough for that to work as well as you'd imagine, and GDDR isn't built for normal memory tasks either; it has far higher latency than normal DDR, with higher bandwidth as compensation.

I'd hate to see its already-high latency over PCIe 5... it's going to be slower for normal tasks than using your NVMe drive as system RAM.

Now, if you put your system NVMe drive on the GPU and cut out the PCIe bus... well, then things get interesting.
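To put some very rough numbers on it (round-figure assumptions, not benchmarks, and ignoring driver/protocol overhead):

```python
# Rough cost of one 4 KiB access: fixed latency + transfer time at peak bandwidth.
# All figures are round-number assumptions for illustration only.
tiers = {
    # name: (approx. access latency in ns, approx. bandwidth in GB/s)
    "DDR5, local":                   (100,    80),
    "GDDR on GPU over PCIe 5.0 x16": (1_000,  63),   # round trip over the bus dominates
    "NVMe SSD, PCIe 5.0 x4":         (20_000, 14),   # flash latency dominates
}

BLOCK = 4096  # bytes, a typical page

for name, (lat_ns, bw_gbs) in tiers.items():
    transfer_ns = BLOCK / (bw_gbs * 1e9) * 1e9  # time to move one page at peak
    print(f"{name:32s} ~{(lat_ns + transfer_ns) / 1000:8.2f} us per 4 KiB page")
```

On these naive numbers the PCIe hop is an order of magnitude worse than local DDR for small accesses; where it actually lands relative to NVMe in practice depends on the paging and driver overhead this sketch ignores.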

Reply 289 of 298, by rmay635703

User metadata
Rank l33t
Trashbytes wrote on 2026-03-27, 21:04:
rmay635703 wrote on 2026-03-27, 19:03:
Ozzuneoj wrote on 2026-03-27, 18:40:
This is hypothetical... I don't actually think this is going to happen... but what if this bubble burst SO BADLY that nearly all […]

Using your GPU's memory to supplement system RAM over x16 would be interesting, to say the least.

PCIe is still not fast enough for that to work as well as you'd imagine, and GDDR isn't built for normal memory tasks either; it has far higher latency than normal DDR, with higher bandwidth as compensation.

I'd expect x16 plus an HD in/out port on the card.

Not as fast as DDR5, but it would most definitely give my DDR3 memory a run for its money.
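For what it's worth, the peak-bandwidth math backs that up (nominal spec figures only, latency ignored):

```python
# Peak bandwidth only; nominal spec values, latency not considered.
GT = 1e9  # transfers per second

configs = {
    "DDR3-1600, dual channel": 2 * 1.6 * GT * 8,             # 8 bytes per channel
    "DDR5-6000, dual channel": 2 * 6.0 * GT * 8,
    "PCIe 4.0 x16":            16 * 16 * GT / 8 * (128/130), # 16 GT/s/lane, 128b/130b encoding
    "PCIe 5.0 x16":            16 * 32 * GT / 8 * (128/130),
}

for name, bytes_per_s in configs.items():
    print(f"{name:24s} ~{bytes_per_s / 1e9:5.1f} GB/s peak")
```

PCIe 4.0 x16 (~31.5 GB/s) already edges out dual-channel DDR3-1600 (~25.6 GB/s) on paper, and PCIe 5.0 x16 roughly doubles it; DDR5 still wins, and latency is another story entirely.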

Reply 290 of 298, by Shponglefan

User metadata
Rank l33t

I was checking SSD prices and was amazed at how much they have gone up as well.

About a year ago I bought some cheap 128GB Lexar SSDs. They were $20 CAD each. Now they are $70 CAD each, a 3.5x price increase.

Similarly, I bought a Samsung 990 Pro 1TB M.2 NVMe drive for $130 CAD last fall. Now they are listed at $440 CAD. About 3.4x more expensive.

Edited to add:

I just did the math on the computer upgrade I did last fall. If I had to pay today's prices, it would cost almost double: an extra $2,400 CAD. That is insane.

Pentium 4 Multi-OS Build
486 DX4-100 with 6 sound cards
486 DX-33 with 5 sound cards

Reply 291 of 298, by Jo22

User metadata
Rank l33t++

TF cards are next, it seems. Those made by SanDisk, at least.

https://www.techradar.com/computing/memory/bu … e-memory-crisis

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 292 of 298, by lti

User metadata
Rank Member

I noticed storage price increases a while back. The low-end SSDs don't seem to have increased in price as much as the nicer drives. HDD prices haven't gone up as much, but they're still climbing ($200 for an 8TB 5400RPM SMR drive).

Now I'm wondering what's going to happen if a drive fails. I already saw someone's RAM warranty claim end in a refund for the original purchase price.

Trashbytes wrote on 2026-03-27, 20:57:
If the rumors from Mooreslawisdead are true the 6000 series will be full AI rendering GPUs, they wont have Raster or RT cores, […]

I think GPUs would need backward compatibility, but games will inevitably be hardcoded to require those "AI" rendering features (so AMD and Intel will have to play catch-up). Of course, it's also going to drive price and power consumption up even higher, and performance gain will be insignificant in applications that don't use the new AI feature.

I agree with the comment earlier about everyone designing products like they want their business to fail. I can only hope that the RTX 6090 will use two 12VHPWR connectors instead of changing the spec to say that one connector is good for 1000W (we're in the Idiocracy world, so we won't get a new connector that doesn't melt).

Reply 293 of 298, by Trashbytes

User metadata
Rank Oldbie
lti wrote on Yesterday, 05:02:
I noticed storage price increases a while back. The low-end SSDs don't seem to have increased in price as much as the nicer driv […]

It's not a feature, it's the entire GPU: the AI cores will be handling everything, including pretending to be raster units for games that expect raster hardware. You really should go check out what Jensen is pushing here; the guy is so deep into AI he cannot pull out without making a huge mess. Also, I really don't believe for one second that nVidia even cares at this point about backwards compatibility; my guess would be that anything older than DX12 simply will not run on the new AI-only GPUs.

The funniest thing is that Nvidia needed two 5090s to even run DLSS 5, one for the real rendering and one to run the LLM doing the DLSS... they claim they will get it running on a 5060 before release, but... I remain unconvinced they can pull that feat off.

Personally, I hate the AI hallucination rendering nVidia wants to push. I've seen what DLSS 5 looks like, and it looks like typical Instagram AI-fakery slop. I, for one, will boycott any and all hardware/games/programs that require AI hardware to even function. I deal with enough fake BS in my life as is; I don't need more of it.

No, I don't care if that means I never buy another GPU or PC. I have plenty of older hardware to work with, and a backlog of Steam/retro games that I doubt I'll ever get through before I shuffle off this mortal coil.

Reply 294 of 298, by UCyborg

User metadata
Rank Oldbie

Worth pointing out in this topic: why can't there be a single OS where memory usage doesn't creep up over time? Be it the OS itself, the drivers, or bundled system utilities. I noticed both Win10 20H2 and Win11 23H2 start at about 1.6 GB on my PC; a couple of days later, that turns into 2 GB.

I noticed even bigger insanity on the supposedly noob-friendly Linux Mint, where System Monitor, if left running, creeps up to a whopping gigabyte of memory usage!

I remember reading anecdotes where certain drivers on Windows pushed non-paged kernel memory into the multi-gigabyte range.

Why isn't there a single entity that would slow down and actually make existing shit work properly, instead of piling new shit on top of existing broken shit?
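If anyone wants to actually measure the creep instead of eyeballing Task Manager, a trivial logger does the job. A minimal sketch, assuming the third-party psutil package (pip install psutil); the interval and log path are arbitrary choices:

```python
# Sample system memory every few minutes and append it to a log, so the
# growth described above becomes measurable rather than anecdotal.
import time
import psutil

LOG = "memory_creep.log"   # hypothetical log location
INTERVAL = 300             # seconds between samples

while True:
    vm = psutil.virtual_memory()
    line = f"{time.strftime('%Y-%m-%d %H:%M:%S')}  used={vm.used / 2**30:.2f} GiB ({vm.percent}%)"
    with open(LOG, "a") as f:
        f.write(line + "\n")
    time.sleep(INTERVAL)
```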

Arthur Schopenhauer wrote:

A man can be himself only so long as he is alone; and if he does not love solitude, he will not love freedom; for it is only when he is alone that he is really free.

Reply 295 of 298, by Hoping

User metadata
Rank Oldbie

All this was to be expected, and it's only going to get worse. It's as simple as that: Nvidia sells at those prices because there are people willing to pay them, and the same is happening with RAM and storage. YouTube has done a lot of damage in this regard.
If we stop to consider how the world works today, it is much more evident that money is worth more than even life itself.
It is the ‘money first’ model promoted in certain parts of the world.
As long as that mindset persists, things will continue to get worse.
This isn’t the fault of LLMs; it’s the fault of those with the most money and power, who want to use them to gain even more money and power at any cost.

Reply 296 of 298, by lti

User metadata
Rank Member
Trashbytes wrote on Yesterday, 06:26:

It's not a feature, it's the entire GPU: the AI cores will be handling everything, including pretending to be raster units for games that expect raster hardware. You really should go check out what Jensen is pushing here; the guy is so deep into AI he cannot pull out without making a huge mess. Also, I really don't believe for one second that nVidia even cares at this point about backwards compatibility; my guess would be that anything older than DX12 simply will not run on the new AI-only GPUs.

Having some kind of raster emulation is the minimum level of backward compatibility I was thinking of. On the other hand, game devs don't want you playing their older games, so having no backward compatibility would make sense in a twisted way.

Trashbytes wrote on Yesterday, 06:26:

The funniest thing is that Nvidia needed two 5090s to even run DLSS 5, one for the real rendering and one to run the LLM doing the DLSS... they claim they will get it running on a 5060 before release, but... I remain unconvinced they can pull that feat off.

Personally, I hate the AI hallucination rendering nVidia wants to push. I've seen what DLSS 5 looks like, and it looks like typical Instagram AI-fakery slop. I, for one, will boycott any and all hardware/games/programs that require AI hardware to even function. I deal with enough fake BS in my life as is; I don't need more of it.

So the 6090 is going to draw more power than two 5090s... and the masses will pretend it's okay and "the real problem destroying the environment" is that high-refresh-rate monitor or leaving your cell phone charger plugged in...

I'm also tired of seeing that kind of fake stuff and the social media influencers making Instagram filters something that people try to imitate in real life. There's also the problem with people blindly accepting the first answer they get.

Trashbytes wrote on Yesterday, 06:26:

No, I don't care if that means I never buy another GPU or PC. I have plenty of older hardware to work with, and a backlog of Steam/retro games that I doubt I'll ever get through before I shuffle off this mortal coil.

My fastest GPU is an Nvidia T1000, and that's in my laptop. My modern desktops have integrated graphics, and that would be good enough if the Intel graphics driver for Linux didn't suck.

Reply 297 of 298, by LSS10999

User metadata
Rank Oldbie
lti wrote on Yesterday, 05:02:

I agree with the comment earlier about everyone designing products like they want their business to fail. I can only hope that the RTX 6090 will use two 12VHPWR connectors instead of changing the spec to say that one connector is good for 1000W (we're in the Idiocracy world, so we won't get a new connector that doesn't melt).

Personally, I don't have any video card of the generations that introduced the 12VHPWR and 12V-2x6 connectors, and I don't think I'll need one. My latest nVidia card is an RTX A4000, which still works well with older Windows versions (Server 2012, for example), but apparently the latest driver for those OSes, 474.44, is broken, as it crashes when using Vulkan (OpenGL is okay, though). I'm currently using driver 463.15, with which Vulkan works fine.

From what I saw online, those new connectors are a bit smaller than the older 8-pin connector yet rated for much more power... something very unusual considering Joule's law, where one would want a larger contact surface (lower resistance) in the face of higher current to keep heat generation within limits.
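A quick per-pin P = I²R comparison makes the point; the contact resistance below is an assumed round figure, since real terminals vary with quality and wear:

```python
# Per-pin Joule heating (P = I^2 * R) for one 8-pin PCIe connector vs. one
# 12VHPWR connector at their rated loads. Contact resistance is assumed.
R_CONTACT = 0.005  # ohms per pin contact (assumption)

for name, watts, current_pins in [
    ("8-pin PCIe, 150 W over 3 12V pins", 150, 3),
    ("12VHPWR, 600 W over 6 12V pins",    600, 6),
]:
    amps = watts / 12.0 / current_pins
    heat = amps ** 2 * R_CONTACT
    print(f"{name}: {amps:.1f} A/pin, ~{heat:.2f} W of heat per contact")
```

With these assumptions each 12VHPWR contact runs at twice the current and dissipates roughly four times the heat of an 8-pin contact, which is exactly the wrong direction for a smaller connector.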

I'm afraid that in a few years these generations of video cards will become total e-waste, with little reusable/resellable value, as they keep frying and being rendered nonfunctional. I wonder if any nVidia board vendor has experimented with designing current video cards around the older 8-pin connectors?

Reply 298 of 298, by Trashbytes

User metadata
Rank Oldbie
LSS10999 wrote on Today, 05:15:
Personally I don't have any video card of the generations that introduced the 12VHPWR and 12V-2x6 connectors, and I don't think […]

They all have, and the cards work just fine with 3x or even 4x 8-pin. The issue with the new connectors isn't the size of the connector, it's load spreading across the pins. The 5090 typically draws between 500 and 600 watts across one connector, but can pull 1000 watts if allowed, with two HPWR connectors. They have found that if even one of the pins has a slightly bad connection, the card will draw an unbalanced load and put extra current through the pins with a better connection, which, as you can imagine, causes extra heat on already-stressed pins.

They are now putting active load monitoring and balancing in PSUs for the connector: if the PSU detects an unbalanced load where even one pin is beyond its rating, it will force the GPU into a 70-watt limp mode and alert the user.
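That failure mode can be modeled as a simple current divider: the pins are parallel resistances, so one degraded contact dumps its share of the current onto the rest. A sketch with assumed resistance values:

```python
# Six 12 V pins in parallel split current in proportion to 1/R, so one
# high-resistance contact pushes extra current (and I^2*R heat) onto the
# remaining good pins. Resistances are assumed round numbers.
TOTAL_AMPS = 50.0           # ~600 W at 12 V
good, bad = 0.005, 0.050    # ohms: healthy vs. degraded contact

def shares(resistances, total_amps):
    # Parallel branches carry current in proportion to their conductance.
    g = [1 / r for r in resistances]
    return [total_amps * gi / sum(g) for gi in g]

for label, pins in [("all pins good", [good] * 6),
                    ("one bad pin",   [bad] + [good] * 5)]:
    currents = shares(pins, TOTAL_AMPS)
    heats = [i * i * r for i, r in zip(currents, pins)]
    print(f"{label}: " + ", ".join(f"{i:.1f}A/{p:.2f}W" for i, p in zip(currents, heats)))
```

With one degraded contact, the five good pins jump from about 8.3 A to about 9.8 A each; no pin has failed outright, yet every remaining contact now runs hotter than the connector was rated for.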

That said... the 8-pin connector was... perfect, and the only reason nVidia went with the new one was forced obsolescence. They knew full well that a connector like that won't last long, and that they can change it at will. This naturally makes their GPUs much harder to use once the connector gets changed for a new format that won't work with the older cards.

But it's not hard to adapt the 8-pin connectors to fit the new ones, so... what's the point of having new connectors, really?