VOGONS

Nvidia adaptergate


Reply 40 of 150, by Shagittarius

Rank Oldbie

Looks like sub-standard cables, as pointed out in the very same comments thread on Tom's.

"MSI was using 18-24 AWG on the 600W 12VHPWR 12+4 PCIe.

nVIDIA's 12VHPWR adapters that they contracted out to "Astron" to design/manufacture were using 4x 14 AWG copper wire pairs to the 4x PCIe 6/8-pin receptacles.

CableMod cables were using 6x 16 AWG copper wire pairs in their adapters."
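
For a rough sense of what those gauge numbers mean in practice, here is a minimal sketch. All inputs are my own assumptions (copper resistivity, the standard AWG diameter formula, 600W split evenly across six 12V conductors, an arbitrary 0.3 m adapter length), purely to estimate how much the wires themselves dissipate:

import math

RHO_CU = 1.68e-8  # ohm*m, copper at roughly 20 C (assumed)

def awg_diameter_m(awg):
    # Standard AWG formula: d = 0.127 mm * 92^((36 - awg) / 39)
    return 0.127e-3 * 92 ** ((36 - awg) / 39)

def ohms_per_meter(awg):
    area = math.pi * (awg_diameter_m(awg) / 2) ** 2
    return RHO_CU / area

TOTAL_W, VOLTS, WIRES = 600.0, 12.0, 6
amps_per_wire = TOTAL_W / VOLTS / WIRES   # ~8.3 A per 12 V conductor

for awg in (14, 16, 18):
    r = ohms_per_meter(awg)
    loss = amps_per_wire ** 2 * r * 0.3   # I^2*R over an assumed 0.3 m run
    print(f"{awg} AWG: {r * 1000:.1f} mOhm/m, ~{loss:.2f} W per wire over 0.3 m")

Under those assumptions even 18 AWG dissipates under half a watt per conductor over the adapter's length, which lines up with the next reply's point that the wiring itself is not the weak spot.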

Reply 41 of 150, by The Serpent Rider

Rank l33t++

18 AWG is not a big issue by itself, because 6+2 pin connectors with 18 AWG wiring are rated for a 300W peak (100W per pin). Only the plastic at the connector end is melting, which suggests that reduced contact surface area per pin is raising resistance, reinforced by a positive feedback loop as temperature rises (more heat = more resistance).
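
To illustrate that feedback loop, here is a minimal sketch; the per-pin contact resistance, temperature coefficient and pin-to-ambient thermal resistance are all assumed round numbers, not measurements:

# 600 W over six 12 V pins is ~8.3 A per pin; dissipation heats the contact,
# heat raises its resistance, and the loop either settles or runs away.
AMPS_PER_PIN = 600.0 / 12.0 / 6
ALPHA = 0.004    # resistance rise per degree C, copper-like (assumed)
THETA = 40.0     # degrees C per watt, pin to ambient (assumed)

def settle(r_contact_mohm, ambient_c=35.0):
    temp = ambient_c
    for _ in range(200):
        r = r_contact_mohm / 1000 * (1 + ALPHA * (temp - 20.0))
        power = AMPS_PER_PIN ** 2 * r      # I^2 * R at the contact
        temp = ambient_c + THETA * power
    return temp

print(f"good contact (5 mOhm):  settles near {settle(5):.0f} C")
print(f"poor contact (25 mOhm): settles near {settle(25):.0f} C")

With these made-up numbers the good contact stays around 50 C while the poor one climbs well past a connector's typical rated operating temperature; push the contact resistance or the current a little further and the loop stops converging at all.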

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 42 of 150, by rmay635703

Rank Oldbie
TrashPanda wrote on 2022-11-01, 00:23:

An adaptor that was not designed for prolonged use at high power draw, leading to high temperatures in a confined space.

Critical failure is the only outcome, and NVidia is at fault here.

Rather stupid to use 14 gauge wire in a high power situation.

Plastic and contacts have always been a bad idea; now gold-plated contacts.

Reply 44 of 150, by The Serpent Rider

Rank l33t++

Here's a more scientific approach to the problem, with the resistance-through-the-roof issue recreated: https://www.youtube.com/watch?v=hkN81jRaupA

So it seems the new connector is not as foolproof as the old 8-pin ones, and some amount of paranoia is required during installation. There might be some manufacturing defects too, which would make things even worse.
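
As a quick sketch of why a badly seated plug is disproportionately bad, assume (purely for illustration) that the 12 V current simply redistributes across whichever pins still make good contact, and that each mated pin has about 6 mOhm of contact resistance:

TOTAL_AMPS = 600.0 / 12.0    # ~50 A total on the 12 V side
R_CONTACT = 0.006            # ohms per mated pin (assumed, illustrative)

for good_pins in (6, 4, 3, 2):
    i_pin = TOTAL_AMPS / good_pins
    p_pin = i_pin ** 2 * R_CONTACT    # dissipation scales with the square of current
    print(f"{good_pins} pins sharing the load: {i_pin:.1f} A and {p_pin:.1f} W per pin")

Halving the number of pins that actually carry current quadruples the heat in each of them, which is why a connector that tolerates being slightly unseated matters so much.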

Last edited by The Serpent Rider on 2022-11-06, 13:32. Edited 4 times in total.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 45 of 150, by TrashPanda

Rank l33t

I think I'm going to sit back and see where this all goes. It's pretty clear at this point that a redesign of the connector is required; it's too late for the already released cards, but I expect later revisions to have a redesigned connector. A good thing to note is that some AIBs were aware of this and already have a beefed-up connector.

Personally I think AMD's approach of improving efficiency and power use has huge benefits over nVidia's brute-force-to-the-top approach. Sure, nVidia has the fastest cards out there, but they are also fire hazards and work well as space heaters, and it'll also cost you a couple of hundred bucks extra a year to run one (see the rough arithmetic below).

The fact that the 7900XTX can get within spitting distance of the 4090 in pure raster at 350-400 watts is proof nVidia didn't need to go the path they did. Some will still claim AMD RT isn't good, but it's slightly better than a 3080 Ti, which, honestly, for how often you would use RT, is perfectly fine. (Assuming RT/AI matters at all to many out there.)
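
How much extra a higher-power card costs to run is simple arithmetic once you pick a power gap, hours at load per day, and an electricity price. Every input below is an assumption chosen only to show how the yearly figure is derived:

def yearly_cost(extra_watts, hours_per_day, price_per_kwh):
    # kWh per year times price: (W / 1000) * hours/day * 365 * price
    return extra_watts / 1000 * hours_per_day * 365 * price_per_kwh

print(yearly_cost(100, 4, 0.30))   # ~44 per year for a modest gap and light use
print(yearly_cost(200, 6, 0.40))   # ~175 per year for a big gap and heavy use

Whether it actually lands near "a couple of hundred bucks" depends almost entirely on those assumptions.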

Reply 46 of 150, by Shagittarius

Rank Oldbie
TrashPanda wrote on 2022-11-06, 13:18:

I think I'm going to sit back and see where this all goes. It's pretty clear at this point that a redesign of the connector is required; it's too late for the already released cards, but I expect later revisions to have a redesigned connector. A good thing to note is that some AIBs were aware of this and already have a beefed-up connector.

Personally I think AMD's approach of improving efficiency and power use has huge benefits over nVidia's brute-force-to-the-top approach. Sure, nVidia has the fastest cards out there, but they are also fire hazards and work well as space heaters, and it'll also cost you a couple of hundred bucks extra a year to run one.

The fact that the 7900XTX can get within spitting distance of the 4090 in pure raster at 350-400 watts is proof nVidia didn't need to go the path they did. Some will still claim AMD RT isn't good, but it's slightly better than a 3080 Ti, which, honestly, for how often you would use RT, is perfectly fine. (Assuming RT/AI matters at all to many out there.)

It's not even close to a 3080 Ti, unless you cherry-pick the very limited integrations. AMD tanks on any full RT implementation. You cannot say AMD has passable RT; it doesn't, and it might as well not have any. Also, I don't consider 25-30% slower than a 4090 on average spitting distance. I think it's going to be about 10% slower than a 4080 on average, and that's the card AMD said it's competing with.

Reply 47 of 150, by The Serpent Rider

Rank l33t++

AMD lags exactly one generation behind when it comes to RT. The 6900XT/6950XT are slightly faster than a 2080 Ti. But that's in pure/heavy RT games; games with more balanced workloads run much faster.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 48 of 150, by appiah4

Rank l33t++
Shagittarius wrote on 2022-11-06, 18:20:
TrashPanda wrote on 2022-11-06, 13:18:

I think I'm going to sit back and see where this all goes. It's pretty clear at this point that a redesign of the connector is required; it's too late for the already released cards, but I expect later revisions to have a redesigned connector. A good thing to note is that some AIBs were aware of this and already have a beefed-up connector.

Personally I think AMD's approach of improving efficiency and power use has huge benefits over nVidia's brute-force-to-the-top approach. Sure, nVidia has the fastest cards out there, but they are also fire hazards and work well as space heaters, and it'll also cost you a couple of hundred bucks extra a year to run one.

The fact that the 7900XTX can get within spitting distance of the 4090 in pure raster at 350-400 watts is proof nVidia didn't need to go the path they did. Some will still claim AMD RT isn't good, but it's slightly better than a 3080 Ti, which, honestly, for how often you would use RT, is perfectly fine. (Assuming RT/AI matters at all to many out there.)

It's not even close to a 3080 Ti, unless you cherry-pick the very limited integrations. AMD tanks on any full RT implementation. You cannot say AMD has passable RT; it doesn't, and it might as well not have any. Also, I don't consider 25-30% slower than a 4090 on average spitting distance. I think it's going to be about 10% slower than a 4080 on average, and that's the card AMD said it's competing with.

The 7900XTX will be 10-20% slower in raster than the 4090 depending on the game. It will do that while using 33% less power and costing 37% less. The 7900XTX will also be 20% faster than the 4080 while still costing less. So AMD not only kneecapped the 4090, it completely shot the 4080 dead.

In raytracing it will be comparable to a 3080 Ti, which is not great, but also not bad at all. This is the only part of your post that is true, and even then you can't just call it passable; it is WAY BETTER than that.

You either don't know what you are talking about here or you are just wishlisting with nVidia fanboy goggles. AMD may as well not have any raytracing? Well, I have a 6700XT in my system and Raytracing + FSR 2.0 works quite well in many games for me.

Retronautics: A digital gallery of my retro computers, hardware and projects.

Reply 49 of 150, by Tree Wyrm

Rank Newbie

I'll keep my 3070 for quite some time and watch how this all unfolds. I was already somewhat skeptical of the new connector, particularly after they announced the power draw of the 4090.
Seeing both the FE and AIB cooler designs, it's all just getting very silly at this point. Ridiculously big and stupidly expensive.

I'm cautiously optimistic about the upcoming AMD cards and would like to see what independent benchmarks and comparisons show. It's most certainly not going to be anything record-breaking, so it will all come down to the price-to-performance ratio.

Which title today is the "Crysis" of this generation, anyway?

Reply 50 of 150, by DosFreak

Rank l33t++

I don't know if any really qualifies. Crysis was badly optimized and demanded both CPU and GPU, whereas for the longest time now the CPU hasn't been much of a factor unless you're talking low resolutions at crazy-high fps.
No one is bothering to create games that take advantage of PC hardware (increase graphics incrementally ad infinitum, yay), and if you look at the reviews for the 4090 it is pathetic how old the games used in the benchmarks are.
Pick any game, bump it to 4K, require a bare minimum of 60 fps, and then look at the 1% lows. If you don't need (want?) 144+ Hz at 4K from games released in the last 3+ years or so, then you don't need a 4090.
The above doesn't factor in RT or DLSS; if you're running 4K and not using DLSS or similar on the cards that support it, then you're leaving performance on the table for barely any quality loss.

It's nice seeing these companies battle it out (too bad Intel dropped the ball, surprise surprise); we can only benefit, assuming video card prices are ever realistic. Glad I stuck with 3840x1600, a great resolution for desk gaming and non-gaming, and one of the reasons my 1080 Ti was able to last so long before I bumped up to a 3080 (which is now my living room PC... that's currently at my desk). My main desktop sits without a graphics card, and it might get an AMD card depending on how this goes; it's been a looooong time since I last went AMD for a GPU and gave up on them.

Last edited by DosFreak on 2022-11-06, 21:17. Edited 12 times in total.

How To Ask Questions The Smart Way
Make your games work offline

Reply 52 of 150, by Shagittarius

Rank Oldbie

Yes, Nvidia must do a recall over a handful of cases, some of which were no doubt people trying to recreate it.

The rosy AMD outlook will continue to wilt as launch day approaches. AMD lost this round.

Reply 53 of 150, by awgamer

Rank Oldbie

Weak; this has nothing to do with AMD. It's a fire hazard capable of burning down houses and killing people, and it's far more than a handful of cases, with nVidia having all of their board partners send all the cards to them. Also, the cost of connectors is way cheaper than lawsuits, let alone being held liable for burning places down and killing people, but maybe Jensen is a gambler.

Reply 54 of 150, by TrashPanda

Rank l33t
appiah4 wrote on 2022-11-06, 19:31:
Shagittarius wrote on 2022-11-06, 18:20:
TrashPanda wrote on 2022-11-06, 13:18:

I think I'm going to sit back and see where this all goes. It's pretty clear at this point that a redesign of the connector is required; it's too late for the already released cards, but I expect later revisions to have a redesigned connector. A good thing to note is that some AIBs were aware of this and already have a beefed-up connector.

Personally I think AMD's approach of improving efficiency and power use has huge benefits over nVidia's brute-force-to-the-top approach. Sure, nVidia has the fastest cards out there, but they are also fire hazards and work well as space heaters, and it'll also cost you a couple of hundred bucks extra a year to run one.

The fact that the 7900XTX can get within spitting distance of the 4090 in pure raster at 350-400 watts is proof nVidia didn't need to go the path they did. Some will still claim AMD RT isn't good, but it's slightly better than a 3080 Ti, which, honestly, for how often you would use RT, is perfectly fine. (Assuming RT/AI matters at all to many out there.)

It's not even close to a 3080 Ti, unless you cherry-pick the very limited integrations. AMD tanks on any full RT implementation. You cannot say AMD has passable RT; it doesn't, and it might as well not have any. Also, I don't consider 25-30% slower than a 4090 on average spitting distance. I think it's going to be about 10% slower than a 4080 on average, and that's the card AMD said it's competing with.

The 7900XTX will be 10-20% slower in raster than the 4090 depending on the game. It will do that while using 33% less power and costing 37% less. The 7900XTX will also be 20% faster than the 4080 while still costing less. So AMD not only kneecapped the 4090, it completely shot the 4080 dead.

In raytracing it will be comparable to a 3080 Ti, which is not great, but also not bad at all. This is the only part of your post that is true, and even then you can't just call it passable; it is WAY BETTER than that.

You either don't know what you are talking about here or you are just wishlisting with nVidia fanboy goggles. AMD may as well not have any raytracing? Well, I have a 6700XT in my system and Raytracing + FSR 2.0 works quite well in many games for me.

I have a 3080 Ti... it does Psycho RT in CP2077 perfectly fine using DLSS, and I would imagine the 7900XTX would be just as capable using FSR 2 or 3. I still don't get why some people think 3080 Ti levels of RT power are bad or somehow not passable.

Shagittarius wrote on 2022-11-06, 21:45:

Yes, Nvidia must do a recall over a handful of cases, some of which were no doubt people trying to recreate it.

The rosy AMD outlook will continue to wilt as launch day approaches. AMD lost this round.

Lost this round to nVidia, whose only way to win is to set fire to their GPUs and require their own power plant to run them. But hey, if you feel the need to disparage, then by all means go ahead if it helps you sleep at night.

I like nVidia GPUs, but the 4000 series is a mess, and for the first time in a long time I will be skipping a generation.

Reply 55 of 150, by appiah4

Rank l33t++
Shagittarius wrote on 2022-11-06, 21:45:

Yes, Nvidia must do a recall over a handful of cases, some of which were no doubt people trying to recreate it.

The rosy AMD outlook will continue to wilt as launch day approaches. AMD lost this round.

Yes, nVidia won its Pyrrhic victory. They went to the newest process node, used almost the largest die size they could, stuffed it with as much cache as they could, and then pushed the power limit to the max, ending up burning cards/connectors.

Now nVidia has almost no room left to go for the next generations aside from possible process node improvements, and they will probably end up losing badly over the next few generations with their monolithic designs, just as Intel did.

Retronautics: A digital gallery of my retro computers, hardware and projects.

Reply 56 of 150, by TrashPanda

Rank l33t
appiah4 wrote on 2022-11-07, 06:08:
Shagittarius wrote on 2022-11-06, 21:45:

Yes, Nvidia must do a recall over a handful of cases, some of which were no doubt people trying to recreate it.

The rosy AMD outlook will continue to wilt as launch day approaches. AMD lost this round.

Yes, nVidia won its Pyrrhic victory. Now they have almost no room left to go, and they will probably end up losing badly over the next few generations with their monolithic designs, just as Intel did.

nVidia's next GPU will be MCM; it simply wasn't ready this year. However, AMD has a huge advantage with MCM, so even if nVidia does get it out next year, they will already be several years behind.

Reply 57 of 150, by appiah4

Rank l33t++
TrashPanda wrote on 2022-11-07, 06:11:
appiah4 wrote on 2022-11-07, 06:08:
Shagittarius wrote on 2022-11-06, 21:45:

Yes, Nvidia must do a recall over a handful of cases, some of which were no doubt people trying to recreate it.

The rosy AMD outlook will continue to wilt as launch day approaches. AMD lost this round.

Yes, nVidia won its Pyrrhic victory. Now they have almost no room left to go, and they will probably end up losing badly over the next few generations with their monolithic designs, just as Intel did.

nVidia's next GPU will be MCM; it simply wasn't ready this year. However, AMD has a huge advantage with MCM, so even if nVidia does get it out next year, they will already be several years behind.

No, there is no gaming MCM for nVidia's next gen. Blackwell is experimental and for datacenters. Even if they bring it to desktop as a Titan-class card or something, they are WAY behind in terms of technology on that front.

Last edited by appiah4 on 2022-11-07, 06:15. Edited 1 time in total.

Retronautics: A digital gallery of my retro computers, hardware and projects.

Reply 58 of 150, by TrashPanda

Rank l33t++
appiah4 wrote on 2022-11-07, 06:11:
TrashPanda wrote on 2022-11-07, 06:11:
appiah4 wrote on 2022-11-07, 06:08:

Yes, nVidia won its Pyrrhic victory. Now they have almost no room left to go, and they will probably end up losing badly over the next few generations with their monolithic designs, just as Intel did.

nVidia's next GPU will be MCM; it simply wasn't ready this year. However, AMD has a huge advantage with MCM, so even if nVidia does get it out next year, they will already be several years behind.

No, there is no MCM for nVidia next gen.

I haven't heard of them ditching their Hopper MCM designs unless something changed; everything I'm reading still has them on track for next year.

Reply 59 of 150, by appiah4

Rank l33t++
TrashPanda wrote on 2022-11-07, 06:14:
appiah4 wrote on 2022-11-07, 06:11:
TrashPanda wrote on 2022-11-07, 06:11:

nVidia's next GPU will be MCM; it simply wasn't ready this year. However, AMD has a huge advantage with MCM, so even if nVidia does get it out next year, they will already be several years behind.

No, there is no MCM for nVidia next gen.

I haven't heard of them ditching their Hopper MCM designs unless something changed.

Hopper is monolithic, Blackwell will be MCM. See above.

Hmm, apparently Hopper may indeed be MCM, I'll check. Regardless, I don't think the gaming parts will use Hopper?

Retronautics: A digital gallery of my retro computers, hardware and projects.