VOGONS


First post, by archsan

Rank: Oldbie

http://www.tomshardware.com/news/gtx-970-laws … idia,32358.html
http://www.guru3d.com/news-story/nvidia-settl … in-the-usa.html

Will the claimants actually receive the full usable $30? Stay tuned! 😁

P.S. Just for laughs! Don't take it to heart. No fanboy accusations allowed!

"Any sufficiently advanced technology is indistinguishable from magic."—Arthur C. Clarke
"No way. Installing the drivers on these things always gives me a headache."—Guybrush Threepwood (on cutting-edge voodoo technology)

Reply 1 of 15, by Munx

Rank: Oldbie

Good. I didn't buy one, but it's nice to know that people get at least minimal compensation for being misled about what they were spending their money on.

My builds!
The FireStarter 2.0 - The wooden K5
The Underdog - The budget K6
The Voodoo powerhouse - The power-hungry K7
The troll PC - The Socket 423 Pentium 4

Reply 3 of 15, by laxdragon

Rank: Member

The claim links are not available yet. They will make another announcement when they are ready.

Despite all the controversy over the RAM, I still rather like my 970. I've had mine for almost 2 years now, and it is still holding up, even with VR. I might get another year out of it before I think about upgrading.

laxDRAGON.com | My Game Collection | My Computers | YouTube

Reply 4 of 15, by Snayperskaya

Rank: Member

Why is this limited to the USA only? 😐

Reply 5 of 15, by ODwilly

Rank: l33t
Snayperskaya wrote:

Why is this limited to the USA only? 😐

The lawsuit was filed in the U.S., I bet. You could get together and start one in your own country. It just takes time, money, and lawyers!

Main pc: Asus ROG 17. R9 5900HX, RTX 3070m, 16gb ddr4 3200, 1tb NVME.
Retro PC: Soyo P4S Dragon, 3gb ddr 266, 120gb Maxtor, Geforce Fx 5950 Ultra, SB Live! 5.1

Reply 7 of 15, by archsan

Rank: Oldbie

The funny thing is that they used to be fine with "odd" numbers (i.e., anything not an integer in GB e.g. 768MB 8800GTX, 896MB 260/275 and 1792MB 295, 1280MB 470/570 and 2560MB version, 1536MB 480/580). So showing/advertising just "3584MB" would have been perfectly alright.


Reply 8 of 15, by Scali

Rank: l33t
archsan wrote:

The funny thing is that they used to be fine with "odd" numbers (i.e., anything not an integer in GB e.g. 768MB 8800GTX, 896MB 260/275 and 1792MB 295, 1280MB 470/570 and 2560MB version, 1536MB 480/580). So showing/advertising just "3584MB" would have been perfectly alright.

But there actually is 4 GB on the card. It's just that it's segmented in a 3.5 GB and a 0.5 GB part. But both can be used by applications. You can also use them at the same time, so technically you have both the 4 GB and the bandwidth they advertised.
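Scali's description can be illustrated with a toy model. This is not Nvidia's actual driver logic, just a sketch of the "fill the fast segment first, spill into the slow one" behavior that reviewers observed; the class name and heuristic are invented for illustration:

```python
class SegmentedVRAM:
    """Toy model of the GTX 970's partitioned memory: a full-speed
    3.5 GB segment plus a reduced-bandwidth 0.5 GB segment. Both are
    real, usable memory, so total capacity is still 4 GB."""
    FAST_MB = 3584  # full-speed segment
    SLOW_MB = 512   # reduced-bandwidth segment

    def __init__(self):
        self.used_fast = 0
        self.used_slow = 0

    def allocate(self, mb):
        """Return (mb_from_fast, mb_from_slow) for an allocation.

        Simplified heuristic: satisfy requests from the fast segment
        first, and spill into the slow segment only once the fast
        segment is exhausted.
        """
        from_fast = min(mb, self.FAST_MB - self.used_fast)
        from_slow = mb - from_fast
        if from_slow > self.SLOW_MB - self.used_slow:
            raise MemoryError("out of VRAM")
        self.used_fast += from_fast
        self.used_slow += from_slow
        return from_fast, from_slow
```

Under this model, a 3 GB allocation lands entirely in the fast segment, while a further 1 GB straddles the boundary (584 MB fast, 416 MB slow), which is why slowdowns would only appear once a game pushes past roughly 3.5 GB.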

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 9 of 15, by archsan

Rank: Oldbie

Yes, I mean, if the slower half gig were not advertised but came as a "bonus" instead, it would've gone in their favor.


Reply 10 of 15, by Snayperskaya

Rank: Member
archsan wrote:

Yes, I mean, if the slower half gig were not advertised but came as a "bonus" instead, it would've gone in their favor.

They lied about the available ROPs, too.

Reply 11 of 15, by Scali

Rank: l33t
Snayperskaya wrote:

They lied about the available ROPs, too.

From what I understood, the number of ROPs came from the reviewer's guide, where the specs were accidentally copy-pasted from the 980, without being adjusted in the right places.
NVidia does not specify this kind of information in the specs on its websites, so they never communicated this figure directly to their customers.
The reviewer's guide was not for publication.
They responded quite quickly when it became known: http://www.anandtech.com/show/8935/geforce-gt … ory-allocation/


Reply 12 of 15, by swaaye

Rank: l33t++

I'm still waiting for the 3.5GB issue to really surface in a game. Yeah you can test for it with CUDA or whatever, but in games the card has always been a great value for its performance level. Even Tech Report's fancy frame time charts always look good.

I think AFR-based multi-GPU is more of a scam, because frame latency cancels out the benefit. But apparently the super coolness of dual cards overrules any logic there.

Reply 13 of 15, by SquallStrife

Rank: l33t
archsan wrote:

The funny thing is that they used to be fine with "odd" numbers (i.e., anything not an integer in GB e.g. 768MB 8800GTX, 896MB 260/275 and 1792MB 295, 1280MB 470/570 and 2560MB version, 1536MB 480/580). So showing/advertising just "3584MB" would have been perfectly alright.

Ummmm

I mean I get what you're saying but... Yeah.

When I first read that, I processed it and was like "How do you have half a byte?"

I think I might be retarded.

VogonsDrivers.com | Link | News Thread

Reply 14 of 15, by archsan

Rank: Oldbie

@SquallStrife
That's why I said "in GB". (Though I should've said positive integer for the sake of accuracy & completeness!)

Also, half a byte = 4 bits, right? 😁
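It does check out: half of an 8-bit byte is a 4-bit "nibble", which happens to hold exactly one hex digit. A quick sanity check, just for fun:

```python
BITS_PER_BYTE = 8

# Half a byte is a "nibble": 4 bits.
NIBBLE_BITS = BITS_PER_BYTE // 2
assert NIBBLE_BITS == 4

# A nibble encodes exactly one hexadecimal digit (values 0-15).
assert 2 ** NIBBLE_BITS == 16
```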


Reply 15 of 15, by Scali

Rank: l33t
swaaye wrote:

I'm still waiting for the 3.5GB issue to really surface in a game. Yeah you can test for it with CUDA or whatever, but in games the card has always been a great value for its performance level. Even Tech Report's fancy frame time charts always look good.

Indeed, I bought my GTX970 after the 3.5 GB issue was already known. I just figured that none of the reviews showed anything weird, so the card apparently performs fine, regardless of how it does it.
I bet a lot of people did. I don't think they should be entitled to the $30 really. But it's 'Murica. I guess the point isn't so much that the customers are entitled to the money, but rather that vendors should be careful not to spread misleading information on their products.

swaaye wrote:

I think AFR-based multi-GPU is more of a scam. Because of frame latency canceling out the benefit. But apparently the super coolness of dual cards overrules any logic there.

Yea, I've always wondered about that. It shouldn't be THAT difficult to get rid of most of the micro-stuttering, and get a smooth gaming experience.
I mean, you can measure the average framerate, and you can measure how long it takes to emit draw calls for GPU #1 before moving to GPU #2. Since there generally is not that much of a difference in subsequent frames, the values measured from GPU #1 should be very close to what GPU #2 will need to do for its frame.
From that data, you can easily extrapolate a delay value that offsets GPU #2 to fall exactly halfway between the frames of GPU #1.

The only reason I can think of why they aren't doing this is because although it will give a smoother user experience, the delays would give a minimal drop in average framerate when benchmarked.
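The pacing scheme described above fits in a few lines. This is a sketch of the idea, not any real driver's implementation; the function name and the example numbers are made up for illustration:

```python
def afr_pacing_delay(gpu1_frame_time, measured_offset):
    """Delay (in seconds) to insert before submitting GPU #2's frame.

    gpu1_frame_time: GPU #1's recent frame time. Since consecutive
        frames rarely differ much, GPU #2's frame should take about
        the same.
    measured_offset: how long after GPU #1's frame GPU #2's frame is
        currently landing.

    The goal is to land GPU #2's frame exactly halfway between two
    consecutive GPU #1 frames, evening out the frame cadence and
    removing micro-stutter.
    """
    target_offset = gpu1_frame_time / 2.0
    delay = target_offset - measured_offset
    return max(0.0, delay)  # a frame that is already late cannot be hurried
```

For example, if GPU #1 renders a frame every 33 ms but GPU #2's frame arrives only 5 ms after it, delaying GPU #2 by 11.5 ms spaces the frames evenly. As noted above, that delay is exactly what would shave a little off the average framerate in benchmarks.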
