VOGONS

Nvidia adaptergate


Reply 100 of 150, by Jasin Natael

Rank Oldbie
darry wrote on 2022-11-15, 12:10:
Jasin Natael wrote on 2022-11-14, 21:59:
bloodem wrote on 2022-11-14, 18:31:

No, no! Don't blame the engineers!
This has "Jensen" written all over it. 😀 He just wanted to ensure that Ada Lovelace comes out on top no matter what. It doesn't take a rocket scientist to realize that pushing these cards to their absolute limit will inevitably have unpredictable consequences outside of your test lab environment.

To be clear, I think the root cause of this issue will eventually be found and fixed. But these are still sad "PC Master Race" times we're living in. At this rate, very soon even mid-range systems will need a 1 kW PSU... Dennard scaling has truly gone down the drain.

I agree. It stinks of trying to make the architecture more than what it is capable of.
And charging a premium and a half "just because"

Intel pulled the same garbage with the 9th gen i9 crap.
And for that matter AMD did the same with the horrible FX-9590 chips...ask me how I know.

The difference is that they quite quickly discounted those parts.
Jensen won't do that. I guess those silly leather jackets are really expensive.

Were there cases of i9 9900K CPUs causing something to melt/burn that I missed? Or are you referring to the actual power usage/thermals being rather intense for a desktop part (as opposed to the quoted TDP)?

Yes, I was just referring to the i9 line being pushed too far, past its usable limit really. Intel really, really needed that die shrink and they just didn't have it yet.
AMD with their factory-overclocked 8350 was in much the same boat; all of these chips were trying to compete outside of their range.
4090 is no better.

Reply 101 of 150, by The Serpent Rider

Rank l33t++

But hey, the 4080 is a solid 3090 Ti killer... for a small fee of $1200 (actually $200-300 higher due to demand).

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 102 of 150, by Jasin Natael

Rank Oldbie
The Serpent Rider wrote on 2022-11-15, 20:55:

But hey, the 4080 is a solid 3090 Ti killer... for a small fee of $1200 (actually $200-300 higher due to demand).

Yeah, the 4080 appears to be way more reasonable. It also doesn't appear to be nearly as power-hungry or hot.
I am guessing that they are leaving themselves plenty of room to slot in a 4080ti variant later in the game.

I still think that the cards are ridiculously overpriced. But what isn't these days?
Still, I can't seem to get past the mentality of a "flagship" costing $499.99.

Reply 103 of 150, by appiah4

Rank l33t++

In what world do "4080" and "reasonable" go in the same sentence? The 3090 Ti is basically a travesty of a card that had no appeal whatsoever at its absurd price point for any buyer with an ounce of common sense. Being slightly better than pure shit is hardly anything to write home about.

The 7900 XTX will come on Dec 13th and destroy the 4080. It will also be within 10-15% of the 4090 at 60% of the cost.

nVidia is fucked through and through.

Retronautics: A digital gallery of my retro computers, hardware and projects.

Reply 104 of 150, by Jasin Natael

Rank Oldbie
appiah4 wrote on 2022-11-16, 06:01:

In what world do "4080" and "reasonable" go in the same sentence? The 3090 Ti is basically a travesty of a card that had no appeal whatsoever at its absurd price point for any buyer with an ounce of common sense. Being slightly better than pure shit is hardly anything to write home about.

The 7900 XTX will come on Dec 13th and destroy the 4080. It will also be within 10-15% of the 4090 at 60% of the cost.

nVidia is fucked through and through.

The 4080 only makes a semblance of sense when compared to the 4090; I guess I should have specified.
And by that I mean that the product doesn't seem to have been pushed to the point of thermal instability and ridiculous power draw.
It is still overpriced for what it is.

I hope that AMD can really steal some thunder this time around. Someone needs to upset the cart.

Reply 105 of 150, by bloodem

Rank Oldbie

I think that, if AMD truly wants to make a dent in nVIDIA's market share (and pride), it actually needs to pull an "nVIDIA in the late 90s/early 00s" move.
I.e.: nVIDIA had a 6-month product cycle back then, which is probably impossible to do now with modern GPUs, but I would imagine that a one-year product cycle should be doable with enough money and resources. If they catch nVIDIA by surprise and manage to release two next-gen products instead of just one within a two-year timeframe... it could allow them to recover lost ground in areas like ray tracing.
Of course, to be fair, something like this is probably VERY difficult to do - since, as I mentioned before, these guys surely have inside information about what the other is doing. So it's just wishful thinking on my part. 😀

As for nVIDIA, they are a business. If people are willing to pay A LOT of money for their cards (and it's obvious that MANY are... 2000+ € cards are still flying off the shelves), then they will increase the prices without any remorse.
I personally blame the people before blaming nVIDIA...

1 x PLCC-68 / 2 x PGA132 / 5 x Skt 3 / 9 x Skt 7 / 12 x SS7 / 1 x Skt 8 / 14 x Slot 1 / 5 x Slot A
5 x Skt 370 / 8 x Skt A / 2 x Skt 478 / 2 x Skt 754 / 3 x Skt 939 / 7 x LGA775 / 1 x LGA1155
Current PC: Ryzen 7 5800X3D
Backup PC: Core i7 7700k

Reply 106 of 150, by Shagittarius

Rank Oldbie

I don't blame anyone who is willing to pay the asking price. I spent most of my early life unable to afford ANY computer parts, and everything I had was a hand-me-down from friends. I got my first Sound Blaster in '94 for Christmas, which was a huge deal for me, and I ran that 386DX-25 all the way until 1996, when I got a DX4-100, which I had for years. Computer prices have come down a lot overall, and since I've been a working adult with no car/home payments, I make good money, and computers are my only hobby that really costs any money, the current cost of a 4090 is plenty worth it to me.

We are also lucky to live in a time when you can play any game out there with a low-tier video board from the current gen; that's something that was unthinkable when I had no money to buy computers. I missed out on so many games; anything 3D (non-hardware-accelerated) or relatively advanced up to the 1998 time frame, I would look at those games and wish, but I couldn't play them.

The top tier might not be for you, and it doesn't have to be, because the ability to play PC games at all levels is so much better than it used to be. Be thankful companies compete for performance; we don't have to stagnate performance-wise, and overall this creates a more powerful bottom end.

Don't let me stop the illogical fanboy orgy though.

Reply 107 of 150, by The Serpent Rider

Rank l33t++
bloodem wrote:

I.e.: nVIDIA had a 6-month product cycle back then, which is probably impossible to do now with modern GPUs

It is impossible due to a very simple thing: back then, Nvidia moved as fast as die shrinks of their successful chips physically allowed. TNT2 was not a new chip. GeForce 2 was not a new chip. Heck, the 9800 GTX and GTX 280 were hardly new chips, just optimizations of the same old 8800 GTX; they squeezed everything they could out of Tesla. Lithography improvement now just can't keep up, because GPU lithography is no longer one or two generations behind CPU lithography like it was back then. Nvidia can't play the die-shrinking game anymore.


Reply 108 of 150, by bloodem

Rank Oldbie
The Serpent Rider wrote on 2022-11-16, 16:19:
bloodem wrote:

I.e.: nVIDIA had a 6-month product cycle back then, which is probably impossible to do now with modern GPUs

It is impossible due to a very simple thing: back then, Nvidia moved as fast as die shrinks of their successful chips physically allowed. TNT2 was not a new chip. GeForce 2 was not a new chip. Heck, the 9800 GTX and GTX 280 were hardly new chips, just optimizations of the same old 8800 GTX; they squeezed everything they could out of Tesla. Lithography improvement now just can't keep up, because GPU lithography is no longer one or two generations behind CPU lithography like it was back then. Nvidia can't play the die-shrinking game anymore.

Yeah, I guess so... As I said, wishful thinking. 😀

Shagittarius wrote on 2022-11-16, 15:37:

I don't blame anyone who is willing to pay the asking price. I spent most of my early life unable to afford ANY computer parts, and everything I had was a hand-me-down from friends. I got my first Sound Blaster in '94 for Christmas, which was a huge deal for me, and I ran that 386DX-25 all the way until 1996, when I got a DX4-100, which I had for years. Computer prices have come down a lot overall, and since I've been a working adult with no car/home payments, I make good money, and computers are my only hobby that really costs any money, the current cost of a 4090 is plenty worth it to me.

We are also lucky to live in a time when you can play any game out there with a low-tier video board from the current gen; that's something that was unthinkable when I had no money to buy computers. I missed out on so many games; anything 3D (non-hardware-accelerated) or relatively advanced up to the 1998 time frame, I would look at those games and wish, but I couldn't play them.

The top tier might not be for you, and it doesn't have to be, because the ability to play PC games at all levels is so much better than it used to be. Be thankful companies compete for performance; we don't have to stagnate performance-wise, and overall this creates a more powerful bottom end.

Don't let me stop the illogical fanboy orgy though.

See, I was not referring to you or other 'de facto' high-end buyers. You guys are fine - as I already mentioned, I can totally understand that some people not only can easily afford the top tier card, but they are also willing to pay for it.
I was referring to (what appears to be) a worrying and growing number of 'traditional' gamers who, until yesterday, would only spend ~$500 on a graphics card, and now, all of a sudden... the sky's the limit. Many of them probably can't even truly afford these cards, and yet they keep buying them anyway.

Our 'gaming life stories' are quite similar, actually... Like you, I also couldn't afford any computer parts (well, more specifically, my single mother couldn't), which is why I received my very first PC at the end of 1998 (as a present for my first year of high school), and this PC was, at the time of purchase, an already outdated Pentium 1 MMX 166 MHz with 16 MB RAM and a 2 MB S3 Virge (which I only managed to "upgrade" in the summer of 2000... to an ATI Rage IIC 8 MB 😀). So, as one can imagine, I too had to endure (for many years) single-digit FPS in software rendering (and this was the best-case scenario, since many games wouldn't even run).
Fast forward to today: I'm also a working adult, also don't have car/home payments (anymore), and I also have a good software engineering job that pays very well (all thanks to my mother and that very first Pentium 1, which got me started in this field). This is where the similarities between you and me end, though. Because, even though I could easily afford to buy a 4090, I stubbornly choose not to. It doesn't mean that I'm 'better' or 'worse' than you in this regard. I just view things differently, that's all.


Reply 110 of 150, by bloodem

Rank Oldbie
The Serpent Rider wrote on 2022-11-16, 18:16:

If anyone could've done it... it had to be Steve.
So, bottom line, it mostly boils down to people who don't know how to plug in a connector (and, as a contributing factor, a less-than-ideal "not clicky" connector).
But the real cause is the insane current that the cable (and connector) needs to sustain... I really hope that, someday, modern GPUs will return to the 20-25 W power consumption of their (now) 22-year-old ancestors. 😀
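Just how much current that is can be put in numbers with a quick back-of-the-envelope sketch. The 600 W figure and the six 12 V current-carrying pins match the 12VHPWR design; the contact resistance values below are purely illustrative assumptions, not measured figures:

```python
# Back-of-the-envelope numbers for a 12VHPWR-style connector.
# 600 W over six 12 V pins matches the connector's rating; the
# contact resistance values are illustrative assumptions only.

def per_pin_current(total_watts, volts=12.0, pins=6):
    """Current per pin, assuming an even split across the 12 V pins."""
    return total_watts / volts / pins

def contact_loss(current_a, contact_ohms):
    """Heat dissipated in a single contact: P = I^2 * R."""
    return current_a ** 2 * contact_ohms

i_pin = per_pin_current(600)           # 600 W / 12 V / 6 pins ≈ 8.33 A per pin
healthy = contact_loss(i_pin, 0.005)   # well-mated contact (~5 mΩ, assumed)
degraded = contact_loss(i_pin, 0.050)  # poorly-mated contact (~50 mΩ, assumed)

print(f"per-pin current: {i_pin:.2f} A")
print(f"contact heat: {healthy:.2f} W healthy vs {degraded:.2f} W degraded")
```

Under these assumptions, a few tens of milliohms of extra contact resistance turn roughly a third of a watt into several watts of heat concentrated in one tiny pin, which is the margin problem in a nutshell.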


Reply 111 of 150, by The Serpent Rider

Rank l33t++

List by priority:
1) User error
2) Manufacturing imperfections (foreign objects inside the connector)
3) Design oversight which misleads the user

bloodem wrote:

it mostly boils down

Oh, it certainly is! 😉

P.S.
Nvidia and PCI-SIG are going to make another revision of the connector which minimizes user error.


Reply 112 of 150, by awgamer

Rank Oldbie

Induced user error caused by bad design, and questionable even when inserted properly, with a hint of degradation that will lead to failure. Something that didn't use to be a thing is now a thing.

Reply 113 of 150, by Jasin Natael

Rank Oldbie
The Serpent Rider wrote on 2022-11-16, 16:19:
bloodem wrote:

I.e.: nVIDIA had a 6-month product cycle back then, which is probably impossible to do now with modern GPUs

It is impossible due to a very simple thing: back then, Nvidia moved as fast as die shrinks of their successful chips physically allowed. TNT2 was not a new chip. GeForce 2 was not a new chip. Heck, the 9800 GTX and GTX 280 were hardly new chips, just optimizations of the same old 8800 GTX; they squeezed everything they could out of Tesla. Lithography improvement now just can't keep up, because GPU lithography is no longer one or two generations behind CPU lithography like it was back then. Nvidia can't play the die-shrinking game anymore.

If their non monolithic chiplet design proves as fruitful as it has with their Zen chips... it is more than possible. Maybe even probable.

Reply 114 of 150, by Jasin Natael

Rank Oldbie
Shagittarius wrote on 2022-11-16, 15:37:

I don't blame anyone who is willing to pay the asking price. I spent most of my early life unable to afford ANY computer parts, and everything I had was a hand-me-down from friends. I got my first Sound Blaster in '94 for Christmas, which was a huge deal for me, and I ran that 386DX-25 all the way until 1996, when I got a DX4-100, which I had for years. Computer prices have come down a lot overall, and since I've been a working adult with no car/home payments, I make good money, and computers are my only hobby that really costs any money, the current cost of a 4090 is plenty worth it to me.

We are also lucky to live in a time when you can play any game out there with a low-tier video board from the current gen; that's something that was unthinkable when I had no money to buy computers. I missed out on so many games; anything 3D (non-hardware-accelerated) or relatively advanced up to the 1998 time frame, I would look at those games and wish, but I couldn't play them.

The top tier might not be for you, and it doesn't have to be, because the ability to play PC games at all levels is so much better than it used to be. Be thankful companies compete for performance; we don't have to stagnate performance-wise, and overall this creates a more powerful bottom end.

Don't let me stop the illogical fanboy orgy though.

Nothing wrong with spending your own hard-earned money on what you want. Hell, if I didn't have other expensive hobbies (cars and guitars, mainly), I would likely be sorely tempted myself. Just because one guy says it isn't worth it to him, there might be ten guys behind him who think it is worth it. Market segmentation is critical for that reason. Like you said, we can all spend what we can afford, or are willing to afford, and still get a piece of the pie. Play some games at some nice frames.

Reply 115 of 150, by appiah4

Rank l33t++
Jasin Natael wrote on 2022-11-17, 02:13:
The Serpent Rider wrote on 2022-11-16, 16:19:
bloodem wrote:

I.e.: nVIDIA had a 6-month product cycle back then, which is probably impossible to do now with modern GPUs

It is impossible due to a very simple thing: back then, Nvidia moved as fast as die shrinks of their successful chips physically allowed. TNT2 was not a new chip. GeForce 2 was not a new chip. Heck, the 9800 GTX and GTX 280 were hardly new chips, just optimizations of the same old 8800 GTX; they squeezed everything they could out of Tesla. Lithography improvement now just can't keep up, because GPU lithography is no longer one or two generations behind CPU lithography like it was back then. Nvidia can't play the die-shrinking game anymore.

If their non monolithic chiplet design proves as fruitful as it has with their Zen chips... it is more than possible. Maybe even probable.

Monolithic chiplet design?..

Retronautics: A digital gallery of my retro computers, hardware and projects.

Reply 116 of 150, by bloodem

Rank Oldbie
appiah4 wrote on 2022-11-17, 07:00:
Jasin Natael wrote on 2022-11-17, 02:13:
The Serpent Rider wrote on 2022-11-16, 16:19:

It is impossible due to a very simple thing: back then, Nvidia moved as fast as die shrinks of their successful chips physically allowed. TNT2 was not a new chip. GeForce 2 was not a new chip. Heck, the 9800 GTX and GTX 280 were hardly new chips, just optimizations of the same old 8800 GTX; they squeezed everything they could out of Tesla. Lithography improvement now just can't keep up, because GPU lithography is no longer one or two generations behind CPU lithography like it was back then. Nvidia can't play the die-shrinking game anymore.

If their non monolithic chiplet design proves as fruitful as it has with their Zen chips... it is more than possible. Maybe even probable.

Monolithic chiplet design?..

"non-monolithic" chiplet design 😀


Reply 117 of 150, by ZellSF

Rank l33t

Good products are made with consideration for potential user errors. It does not absolve Nvidia at all; IMO it's even worse for them. It's one of the most basic things to test for, and Nvidia, with their profit margins, can't manage it? Pathetic.

bloodem wrote on 2022-11-16, 18:49:

But the real cause is the insane current that the cable (and connector) needs to sustain...

The 3090 Ti has the same TDP and has practically no reports of issues like this. That isn't the cause, even if you want it to be.

appiah4 wrote on 2022-11-16, 06:01:

The 7900 XTX will come on Dec 13th and destroy the 4080. It will also be within 10-15% of the 4090 at 60% of the cost.

The only data we have on that so far is from AMD marketing. I wouldn't say anything is for certain yet.

Don't get me wrong, I both hope and think that's going to be the case, but regurgitating corporate marketing without clearly stating that it is so is a very bad thing to do.

Reply 118 of 150, by appiah4

Rank l33t++
ZellSF wrote on 2022-11-17, 10:18:
appiah4 wrote on 2022-11-16, 06:01:

The 7900 XTX will come on Dec 13th and destroy the 4080. It will also be within 10-15% of the 4090 at 60% of the cost.

The only data we have on that so far is from AMD marketing. I wouldn't say anything is for certain yet.

Don't get me wrong, I both hope and think that's going to be the case, but regurgitating corporate marketing without clearly stating that it is so is a very bad thing to do.

There has never been a single case of AMD marketing numbers being off since the Polaris launch, and that includes the lacklustre results from Vega and VII. They are 100% transparent and never surprise.


Reply 119 of 150, by bloodem

Rank Oldbie
ZellSF wrote on 2022-11-17, 10:18:

The 3090 Ti has the same TDP and has practically no reports of issues like this. That isn't the cause, even if you want it to be.

You missed my point entirely. What I meant was that something like this would be very unlikely (if not actually impossible) with a GPU that has a TGP of 200-250 W or less. The more power you feed these GPUs, the more likely it is that each tiny engineering mistake (and mistakes DO happen, make no 'mistake') will result in unforeseen issues (some with potentially dramatic consequences).

As an analogy, if you drive a car at 30 miles per hour and you make a mistake... chances are you will be fine. If you drive it at 100 miles per hour and you make a mistake... the outcome will be quite different.

The 3090 Ti not having any issues is not statistically relevant in any way.
When playing Russian roulette, you can't say with 100% certainty that the gun is not loaded just because, in your case, it didn't fire.

So, I reiterate: the power consumption IS the real cause, and things will get worse as we crawl our way toward manufacturing 1 kW GPUs (and the CPU situation doesn't look much better).
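The scaling behind this argument is easy to sketch: at a fixed 12 V rail, the I²R loss in a given delivery path grows with the square of board power, so every extra watt of TGP costs disproportionately more heat in the cable and connector (the 10 mΩ path resistance here is an illustrative assumption):

```python
# Why higher power draw amplifies every small flaw: at a fixed 12 V
# rail, conduction loss in a given cable/connector path scales with
# the SQUARE of board power. The 10 mΩ path resistance is an assumption.

def conduction_loss(board_watts, path_ohms=0.010, volts=12.0):
    """I^2 * R heat dissipated in the delivery path for a given board power."""
    current = board_watts / volts
    return current ** 2 * path_ohms

low = conduction_loss(250)   # a ~250 W class card
high = conduction_loss(450)  # a ~450 W class card

# 1.8x the power -> 1.8^2 = 3.24x the heat in the same path
print(f"path loss: {low:.1f} W vs {high:.1f} W ({high / low:.2f}x)")
```

Under these assumptions, going from a 250 W card to a 450 W card is 1.8x the power but 3.24x the heat dissipated in the same wiring, which is why an imperfection that was harmless at 250 W can become a melted connector at 450 W.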
