VOGONS

Nvidia adaptergate

Reply 20 of 150, by TrashPanda

Rank: l33t
The Serpent Rider wrote on 2022-11-01, 12:22:
TrashPanda wrote:

doesn't have physical power limits built into the VRM like RTX2000 and RTX3000 have

Voltage controllers still have OCP, but it has been disabled or set to a ludicrous level since the Fermi refresh days (GTX 5xx), which led to the infamous driver that killed GTX 590s. So nothing new here.

I never had a 590; I had a pair of 580s instead, which was likely the better option in hindsight. But I do have a thing for those dual-GPU video cards and own a number of them from both AMD and nVidia. While the AMD ones look great, they are power hogs, so I can imagine how bad the 590 would have been, given that the 580s were already pretty terrible for heat and power.

Edit - Oh no .. it let the magic smoke out, can't put the angry pixies back once they escape 🙁
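For what it's worth, the driver-enforced power limits mentioned above are queryable on modern cards through NVML. A minimal sketch, assuming the nvidia-ml-py bindings (imported as pynvml) and a working NVIDIA driver are installed:

```python
# Query the driver-enforced power limit on the first NVIDIA GPU via NVML.
# Assumes: pip install nvidia-ml-py (imports as pynvml) and a working driver.
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
    name = pynvml.nvmlDeviceGetName(handle)        # str on recent bindings
    enforced_mw = pynvml.nvmlDeviceGetEnforcedPowerLimit(handle)  # milliwatts
    min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
    print(f"{name}: enforced limit {enforced_mw / 1000:.0f} W "
          f"(allowed range {min_mw / 1000:.0f}-{max_mw / 1000:.0f} W)")
finally:
    pynvml.nvmlShutdown()
```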

Reply 21 of 150, by bloodem

Rank: Oldbie

With the information we have thus far, it seems that native 12VHPWR cables (from quality brands, at least) are not affected by this, so it does appear to be a screw-up on nVIDIA's part (or on the part of whichever company they partnered with to manufacture those adapters). However, it's also very likely that an overwhelming percentage of 4090 buyers ended up using the included adapter, so we still can't eliminate the native cables entirely.

Having said that, on a personal note, this adapter issue is still a minor problem compared to the power usage and pricing of these cards (in fact, the power usage is the undeniable root cause of the adapter issue in the first place).
Unless one of the cards in the 4060/4060Ti/4070 series turns out to be a 'killer' card that runs at no more than 200W, has decent performance per watt, and... oh, yeah, A NORMAL PRICE, I am definitely skipping yet another generation of GPUs and CPUs. I know, I'm clearly on drugs to even think that, so... no upgrade for me. 😁
Unfortunately, AMD GPUs are not an option in my case, since I absolutely need NVENC.

1 x PLCC-68 / 2 x PGA132 / 5 x Skt 3 / 9 x Skt 7 / 12 x SS7 / 1 x Skt 8 / 14 x Slot 1 / 5 x Slot A
5 x Skt 370 / 8 x Skt A / 2 x Skt 478 / 2 x Skt 754 / 3 x Skt 939 / 7 x LGA775 / 1 x LGA1155
Current PC: Ryzen 7 5800X3D
Backup PC: Core i7 7700k

Reply 22 of 150, by The Serpent Rider

Rank: l33t++

Meanwhile, Galax will make a 4090 with two 12VHPWR connectors...

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 24 of 150, by bloodem

Rank: Oldbie
appiah4 wrote on 2022-11-01, 13:53:

Just wait for Nov 3rd.

RDNA3 will be unto GPUs what Zen was to CPUs.

I hope so. I mean, there has to be a reason why Jensen pushed these cards to their absolute limit, and that reason is most likely competition (plus, I'm convinced that both nVIDIA and AMD are able to get pretty accurate inside information regarding what the other is up to).

However, IMO, nVIDIA still has a clear advantage when it comes to DLSS game support (and its performance and image quality compared to FSR, particularly at lower resolutions on more budget-oriented cards). Of course, in my case, I could definitely ignore DLSS (even though it's the number one reason, together with nVIDIA Reflex, why I'm currently able to play A Plague Tale: Requiem @ 30 - 50 FPS on my puny RTX 2060, and that's at 1440p with High/Ultra details), but I absolutely need NVENC... and this is unlikely to change anytime soon. Damn you, nVIDIA, damn you to hell! 😁

1 x PLCC-68 / 2 x PGA132 / 5 x Skt 3 / 9 x Skt 7 / 12 x SS7 / 1 x Skt 8 / 14 x Slot 1 / 5 x Slot A
5 x Skt 370 / 8 x Skt A / 2 x Skt 478 / 2 x Skt 754 / 3 x Skt 939 / 7 x LGA775 / 1 x LGA1155
Current PC: Ryzen 7 5800X3D
Backup PC: Core i7 7700k

Reply 25 of 150, by Shagittarius

Rank: Oldbie
TrashPanda wrote on 2022-11-01, 12:08:
darry wrote on 2022-11-01, 11:57:
TrashPanda wrote on 2022-11-01, 11:07:

The main issue is that, under full load for extended periods, the connector is simply unable to shed the huge amount of heat generated by the absurd power draw. On top of that, where nVidia has placed the connector on the card also contributes to the stupid amount of heat build-up. Instead of placing the connector at the rear of the card, mounted to the heat sink where it could get better cooling, they located it on the side of the card near the VRM, where it gets zero cooling, as very little airflow passes over that area.

Simply put, they went with looks and beauty over practicality and function, and it bit them fair in the arse when their crazy power draw essentially caused the plastic insulation to melt. (The connector itself is poorly designed for handling the 500+ watts the card can draw with transient spikes.)

Nvidia has done boneheaded shit in the past in regard to its failure to correctly account for heat dissipation... anyone remember Fermi? How about the overheating 8000 series GPUs?

Thanks for that perspective.

Potential firestarting ability aside, IMHO, this is why I now always steer clear of the highest-end GPU of a given series. They are always made to run at the limits of what card and chip manufacturers think they can get away with.

- Huge power draw
- Intense cooling needs
- Usually relatively hot when running "as designed"
- Noisy if air cooled
Result: often compromised lifespan due to the above

Then, even on well-designed and well-made cards, when something goes wrong (cooling solution degrades due to fan wear/failure, thermal paste degradation, dust, etc.), the effect is compounded.

When the GeForce FX 5800 Ultra was born, it was laughable, but over time the concept has basically become acceptable. IMHO, flagship GPUs these days are doing a great job of keeping the "hot leafblower" spirit alive and well.

If you look at what AMD has done with their 7000-series Radeons: they increased power draw by ~50 watts but doubled raster performance, and should end up roughly on par with, or a little faster than, the RTX 3000 series for RT. They realize that power draw is becoming a huge problem for GPUs, so they employed MCM and chiplets to get around it, much like they did with Ryzen. Sure, their RT isn't super amazing, but nVidia relies on DLSS a bit too much, and the vast majority of the RTX 4000's performance gain comes from DLSS 3 and the fake frames it adds to the output. (Fake frames are AI-generated frames and look fucking terrible.)

Nvidia won't have MCM till their RTX 5000 cards, as they are a bit behind AMD with that tech, but even then I don't see them backing off the power draw; it's how they brute-force their way to the top.

Latest leaks indicate raster performance increased by 50% in the best-case scenario. Not that it matters; the only thing that really matters now is RT performance increases. Raster is already under control for most use cases unless you have a legitimate need for 8K performance. The 3000 series was lacking in RT; I'd say the 4090 is really the first set-it-and-forget-it RT card, even at 4K.

BTW, I believe the claims of melting adapters are being overblown, but regardless, I have a new ATX 3.0 PSU on the way.
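To put rough numbers on the heat argument quoted above, here's a back-of-the-envelope sketch. The six live 12 V pins match the 12VHPWR design, but the 600 W draw and the 6 mΩ contact resistance are assumed illustrative figures, not measurements:

```python
# Rough I^2*R estimate of heat dissipated in each 12VHPWR contact.
# Assumptions (illustrative, not measured): ~600 W transient draw,
# six current-carrying 12 V pins, ~6 milliohm resistance per mated contact.

TOTAL_POWER_W = 600.0
RAIL_VOLTAGE_V = 12.0
CONTACT_RES_OHM = 0.006

def per_pin(total_power_w: float, live_pins: int) -> tuple[float, float]:
    """Return (amps per pin, watts of I^2*R heat per contact)."""
    pin_current = total_power_w / RAIL_VOLTAGE_V / live_pins
    return pin_current, pin_current ** 2 * CONTACT_RES_OHM

for pins in (6, 4):  # all pins seated vs. two contacts lifted / poorly seated
    amps, watts = per_pin(TOTAL_POWER_W, pins)
    print(f"{pins} live pins: {amps:.1f} A/pin, ~{watts:.2f} W of heat per contact")

# Output:
# 6 live pins: 8.3 A/pin, ~0.42 W of heat per contact
# 4 live pins: 12.5 A/pin, ~0.94 W of heat per contact
```

The heat per contact roughly doubles when two contacts drop out, which is consistent with the poorly-seated-adapter failure theory, and none of that heat has anywhere to go if little airflow passes over the connector.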

Reply 26 of 150, by appiah4

Rank: l33t++
Shagittarius wrote on 2022-11-01, 15:10:
Latest leaks indicate raster performance increased by 50% best case scenario. Not that it matters, the only thing that really […]

No, mark my words, it will be a >50% raster increase, probably 60%+.

Retronautics: A digital gallery of my retro computers, hardware and projects.

Reply 27 of 150, by Joseph_Joestar

Rank: l33t
TrashPanda wrote on 2022-11-01, 12:08:

7000 Radeons

Oh, we're back there again.

When they make a 9000 series in a couple of years, I wonder if we'll get another legendary 9700 Pro card. I can already see the confusion on eBay and such. That might be a good time to sell off my Radeon 9000 Pro. 😁

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Athlon64 3400+ / Asus K8V-MX / 5900XT / Audigy2
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi

Reply 28 of 150, by appiah4

Rank: l33t++

Posting because it is very relevant: https://www.youtube.com/watch?v=yAfwhR6Bfr4

Radeon RX8000 series will be the new Radeon 9000 series.

Fuck Intel, Fuck nVidia. Crash and burn motherfuckers.

Retronautics: A digital gallery of my retro computers, hardware and projects.

Reply 29 of 150, by ZellSF

Rank: l33t

I'm a bit confused about people blaming the cable problems on the 450W power draw. There were 450W power-limit 3090s, and they were fine.

Shagittarius wrote on 2022-11-01, 15:10:

Latest leaks indicate raster performance increased by 50% best case scenario. Not that it matters, the only thing that really matters now is RT performance increases, raster is already under control for most use cases unless you have a legitimate need for 8k performance.

I mean, I do, as much as I have a legitimate need for 4K, 1440p, or 1080p performance? For playing games, "legitimate need" is a weird phrasing.

8K is one of the reasons I would like to have an RTX 4090. Ideally I would want 8K@120Hz native, but Nvidia decided not to support that (and to hold back monitors) by not implementing DisplayPort 2.0, so I'm not terribly tempted.
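To put numbers on the DisplayPort point above, here's a rough bandwidth sketch (blanking overhead is ignored; the link capacities are the commonly cited effective payload rates):

```python
# Uncompressed bandwidth needed for 8K@120Hz vs. common link payload rates.

H, V, HZ = 7680, 4320, 120
for bpc in (8, 10):  # bits per color channel
    gbps = H * V * HZ * bpc * 3 / 1e9
    print(f"8K@120 {bpc}-bit RGB: ~{gbps:.0f} Gbps uncompressed")

need_gbps = H * V * HZ * 10 * 3 / 1e9  # 10-bit RGB
links_gbps = {
    "DP 1.4a (HBR3, 8b/10b)": 25.92,
    "HDMI 2.1 (FRL, 16b/18b)": 42.6,
    "DP 2.0 (UHBR20, 128b/132b)": 77.37,
}
for name, cap in links_gbps.items():
    print(f"{name}: would need ~{need_gbps / cap:.1f}:1 compression")

# 8K@120 10-bit RGB is ~119 Gbps uncompressed, so DP 1.4a would need ~4.6:1
# DSC while DP 2.0 UHBR20 gets there with a mild ~1.5:1 ratio.
```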

Reply 30 of 150, by Shagittarius

Rank: Oldbie
appiah4 wrote on 2022-11-02, 07:45:

Posting because it is very relevant: https://www.youtube.com/watch?v=yAfwhR6Bfr4

Radeon RX8000 series will be the new Radeon 9000 series.

Fuck Intel, Fuck nVidia. Crash and burn motherfuckers.

This is why I won't take you seriously. 🤣

Reply 31 of 150, by Jasin Natael

Rank: Oldbie

nVidia knew exactly the problems they were going to encounter.
They knew, and they considered the risk "acceptable".
They HAD to know. Think they didn't test these disasters before releasing them?

They knew that there would be issues, just like they knew people would buy them anyway.
They will continue to make problematic decisions until they can't afford to.
Just like Intel kept releasing the same CPU year after year until Zen forced them to actually advance their product again.

Time will tell whether or not the next-gen Radeon cards will give them the push, or if Intel can get their head in the game and write some decent drivers to help convince people to buy their stuff.
Hopefully it is both. Competition is good for everyone.

I bought my first nVidia card in YEARS when I bought my 1080. I am a self-admitted AMD fanboy, but I am also no fool. Good performance and good value aren't always mutually exclusive.
I don't care how awesome the 4090 performs; they are a TERRIBLE value. I have a hard time believing that even the 60-series cards from this generation won't be absurdly overpriced.
But time will tell.

Reply 32 of 150, by Shagittarius

Rank: Oldbie
Jasin Natael wrote on 2022-11-02, 14:26:
nVidia knew exactly the problems they were going to encounter. They knew and considered the risk as "acceptable" They HAD to k […]

Actually, in terms of performance-to-price ratio, the 4090 is the best value in years. The actual cost might be too much for some; well, they can buy other cards, and that's traditionally where AMD has positioned themselves: the "value" alternative. I only ever owned AMD cards twice, the 9800 and the X800. After that, with some arguable exceptions, there's never been an obvious reason to go AMD over NVidia unless you are pursuing "value", which I don't think is as obvious a win when you factor in driver issues.

Also, apparently the companies assembling the adapters used materials of varying quality, and some are not rated for an appropriate voltage. The adapter itself is sound; third-party cable builds are the culprit.

Just like you, I will decide what is priced appropriately and what is too much for me to buy. Thankfully, there are multiple choices in a free-market society.

Reply 34 of 150, by Jasin Natael

Rank: Oldbie
Shagittarius wrote on 2022-11-02, 14:32:
Actually performance to price ratio for the 4090 they are the best value in years. Actual cost might be too much for some, well […]

I honestly feel that the "bad AMD driver" thing is blown vastly out of proportion.
I would argue that their Adrenalin drivers are superior in pretty much every regard to nVidia's rather archaic, if mostly stable, driver set.
I would also say that both the Radeon 5000 and 6000 series cards were better bang for the buck than anything from nVidia, or at least they would have been if not for scalpers.
Unless you care about RTX, that is... and I just don't care, and likely never will until it's used exclusively.

I get what you are saying about price to performance, and while, yes, the 4090 is a pretty massive generational leap in raw compute and FPS, I don't think it justifies the crazy price jump. I feel that nVidia saw what people were willing to pay for a GPU during the mining/pandemic/chip-shortage phase... and knew they could get away with just about anything.

And to be honest, it is free enterprise; you can hardly blame them for that. A product is worth whatever consumers will pay for it.
But they have been steadily increasing prices for flagship cards since the introduction of the OG Titan.
And honestly, what is the 4090 supposed to be?
It's a replacement Titan, and its MSRP reflects that.

Still, here's hoping that AMD has a very competitive product.
The jury is out on Intel. I refuse to buy their unfinished garbage product to prop them up in the hope that the second-generation cards don't suck.
They have billions to spend on R&D; the general public shouldn't be left beta testing their half-baked silicon.
If they want us to buy their cards, then they need to make better cards, and better drivers.

Reply 35 of 150, by Shagittarius

Rank: Oldbie
Jasin Natael wrote on 2022-11-02, 16:24:
I honestly feel that the "Bad AMD driver" thing is taken vastly out of proportion. I would argue that their Adrenalin drivers a […]

I love to see all companies competing; as we know, that's what produces the healthiest market.

Reply 36 of 150, by Standard Def Steve

Rank: Oldbie

Yeah, AMD's drivers are fine these days, which is why I'm hoping and praying that the RX 7000 delivers at least RTX 3090-level ray-tracing performance. If it does, I will totally be picking one up! Ah hell, even if it's a little below the 3090 in RT, I'll still pick one up, assuming stellar rasterization performance. I mean, my current 1080 Ti is getting pretty old and doesn't support hardware RT at all. Plus, all of the recent Nvidia shenanigans (and their prickish attitude) have me feeling rather turned off.

I need something this generation, though. I've been using my new 4K/165 monitor as a 1440p/165 display because the old GTX 1080 Ti just ain't fast enough. Fortunately, the monitor has a decent scaler!

94 MHz NEC VR4300 | SGI Reality CoPro | 8MB RDRAM | Each game gets its own SSD - nooice!

Reply 37 of 150, by bloodem

Rank: Oldbie
Standard Def Steve wrote on 2022-11-02, 17:30:

Yeah, AMD's drivers are fine these days.

Yeah. It seems that, lately, nVIDIA has had more driver-related issues than AMD, the latest one being the Call of Duty MW2 instability/crashes with their latest drivers. Performance is also pretty bad compared to the AMD cards (unclear if that's also driver-related): https://www.youtube.com/watch?v=hZ4JDDCnOi0

I've also had my fair share of issues with my RTX 2060, but for the most part things have been pretty good (aside from nVIDIA intentionally nerfing the performance of their older-generation cards with each new driver update, like they have been doing since... forever, something that we in the retro community know all too well).

1 x PLCC-68 / 2 x PGA132 / 5 x Skt 3 / 9 x Skt 7 / 12 x SS7 / 1 x Skt 8 / 14 x Slot 1 / 5 x Slot A
5 x Skt 370 / 8 x Skt A / 2 x Skt 478 / 2 x Skt 754 / 3 x Skt 939 / 7 x LGA775 / 1 x LGA1155
Current PC: Ryzen 7 5800X3D
Backup PC: Core i7 7700k

Reply 38 of 150, by The Serpent Rider

Rank: l33t++

Apparently there is now a case of a melted connector from a native ATX 3.0 cable.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 39 of 150, by bloodem

Rank: Oldbie

Yeah, saw it on Tom's Hardware. This is getting better and better... 😁

1 x PLCC-68 / 2 x PGA132 / 5 x Skt 3 / 9 x Skt 7 / 12 x SS7 / 1 x Skt 8 / 14 x Slot 1 / 5 x Slot A
5 x Skt 370 / 8 x Skt A / 2 x Skt 478 / 2 x Skt 754 / 3 x Skt 939 / 7 x LGA775 / 1 x LGA1155
Current PC: Ryzen 7 5800X3D
Backup PC: Core i7 7700k