Scali wrote: You realize that the claim that nVidia doesn't know at what voltages they can run the chips that they themselves designed is rather far-fetched, right?
When your chips are running at 100°C and failing left and right, I don't think it's a question of them deciding to run them at those voltages to "keep healthy margins". There was no margin at all at those voltages. Like I said, considering the poor engineering of the chips, they were being factory over-volted, and for no apparent reason.
Hey, the truth is sometimes stranger than fiction. I didn't make this up.
Scali wrote: Again, this doesn't make sense.
You dismiss the technical merits of the architecture based on the fact that the reliability wasn't that great.
I think my point was very salient. Bitboys' designs also had a lot of technical merit: in simulations they outperformed everything else, and I'm sure that with some deep-pocketed backers, not to mention some luck, they might have put out some pretty impressive silicon.
And we're not talking about nVidia cards of that era failing after three years; we're talking about cards dropping like flies after several months of use. Just look at the consumer-submitted follow-up reviews of GeForce cards from that era on Newegg to get a pretty good idea of how long these cards lasted on average.
And again, this wasn't limited to one series of cards. It went on for years, perhaps right up to the very last G9x silicon. And keep in mind that G9x silicon was still being sold even after Fermi was released: while the high-end GeForce 2xx cards were based on the newer GT200 silicon, the lower-end 2xx models were simply re-badged G9x parts, and those were still being sold well into 2010.