VOGONS


Are GeForce 256 DDR cards that rare?


Reply 120 of 311, by amadeus777999

Rank Oldbie
Scali wrote:
dirkmirk wrote:

For what the cards do, there is nothing special about them; there were gazillions of GeForce SDRs/DDRs/2 GTS/Ultra/Ti/2 MX/4 MX440/MX 4000. Basically any DX7 NVIDIA card you can buy dirt cheap.

No, but they were the first GeForce, the first 'GPU', the first with hardware T&L, the first with per-pixel lighting etc.
It's the dawn of a new era, so it's historical significance mostly. Building a system with a GeForce 256 gives you a nice hands-on feel of how this new era started.

Well put - the GeForce256, especially DDR, is quite a worthy collector's item.

Reply 121 of 311, by swaaye

Rank l33t++

Where were all you guys ~7 years ago, when this junk was still super cheap because nobody cared and "vintage" wasn't really a thing for the era yet? 😕 😀 It's the wrong time to join the party now.

We should speculate on what will be vintage and sought after in 10 years...

Reply 122 of 311, by silikone

Rank Member
swaaye wrote:

We should speculate on what will be vintage and sought after in 10 years...

Geforce 8800
Unified shading, D3D10, large leap from last generation, etc.

Do not refrain from refusing to stop hindering yourself from the opposite of watching nothing other than that which is by no means porn.

Reply 123 of 311, by Scali

Rank l33t
silikone wrote:

Geforce 8800
Unified shading, D3D10, large leap from last generation, etc.

Got that one covered 😀 I lost an 8800GTS320 to bumpgate, but I got an 8800GTX which still works.
I think you are skipping a generation though.
I'd say the Radeon 9700 was also a generational leap: SM2.0, floating point shaders, first practical implementation for MSAA and AF etc.
Also interesting because it was the first success-story for ATi in the era of 3d acceleration.

And then we could also say that the GeForce3 is another 'generation', although not quite the leap that the other cards in this list were. But it was the first card advertised as having 'programmable shaders', which was pretty much true for vertex shading at least. Pixel shading was still very primitive and not that different from the 'texture combiners' you had on the GeForce256.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 124 of 311, by swaaye

Rank l33t++

Yeah, I've been half-heartedly thinking of getting an 8800 GTX or Ultra again.

The 9700/9800 cards are certainly historically interesting, but they also tend to die for various reasons similar to bumpgate.

I agree about the GeForce 3, not only because of programmable shaders, but also because of the gigantic efficiency boost it has over the GeForce 2, the addition of MSAA and useful anisotropic filtering, and its being the first GeForce capable of EMBM. Loads of improvements. It still has the DXT1 banding caused by lack of dithering, however!

Reply 125 of 311, by silikone

Rank Member
Scali wrote:

I think you are skipping a generation though.
I'd say the Radeon 9700 was also a generational leap: SM2.0, floating point shaders, first practical implementation for MSAA and AF etc.

Well, he did say ten years, which would very roughly make it as old as the original Geforce. Of course, this is ignoring the likelihood that hardware ageing is logarithmic.

Last edited by silikone on 2017-06-16, 23:19. Edited 1 time in total.


Reply 126 of 311, by The Serpent Rider

Rank l33t
devius wrote:

I just wanted to mention that you guys are insane

Those who bid that much for a "cooked" video card (scorch marks on the back of the card near the GPU area) are probably insane.

swaaye wrote:

Yeah I've been halfheartedly thinking of getting an 8800 GTX or Ultra again.

Probably should invest in dual GPU cards. 9800GX2 and first GTX295 revision (sammich) especially.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 127 of 311, by dirkmirk

Rank Oldbie

How about the GeForce4 Ti 4200s? Still good prices, IMO; they were a hero card.

The GeForce 3s weren't significantly faster than top-of-the-line GeForce 2s, but they offered DX8 capabilities. The Ti200 models are common, but the original and the Ti500 are rarer, and people are asking big money for them.

A quick look at reviews suggests they weren't highly regarded when released because of their high prices, but I remember people touting them as "future proof" because of DX8 🤣 .

The Ti 4200s, on the other hand, were screaming buys and can still be had on eBay for like $20 or $30. It's like the Celeron 300A of 2002 for graphics cards.

Reply 128 of 311, by appiah4

Rank l33t++

Who cares about the Ti4200. It was all about the Radeon 8500.

Retronautics: A digital gallery of my retro computers, hardware and projects.

Reply 129 of 311, by The Serpent Rider

Rank l33t
dirkmirk wrote:

I remember people touting they were "Future Proof" because of DX8

They were, in fact. You can't play Silent Hill 3 or Prince of Persia on DX7 hardware.

dirkmirk wrote:

The geforce 3s werent significantly faster than top of the line Geforce 2s but offered DX8 capabilities

1.5x-2x increase in performance is a significant boost.


Reply 130 of 311, by Scali

Rank l33t
The Serpent Rider wrote:

1.5x-2x increase in performance is a significant boost.

I think his point is that it's not 1.5x-2x: http://www.anandtech.com/show/742/5
The GeForce2 Ultra wasn't all that much slower in the worst case, and in some games, such as UT, it was actually faster than a GF3.


Reply 131 of 311, by The Serpent Rider

Rank l33t
Scali wrote:

I think his point is that it's not 1.5x-2x

It is.
[Benchmark charts: q3max-gf3-32.png, g2-h-32.png]
http://ixbtlabs.com/articles/gf3ti500/index.html


Reply 132 of 311, by silikone

Rank Member

I think his point is that a 2x performance upgrade was the norm. It's at most the difference from GF 1 to GF 2 in fixed function applications.


Reply 133 of 311, by Scali

Rank l33t
The Serpent Rider wrote:
Scali wrote:

I think his point is that it's not 1.5x-2x

It is.

In cherry-picked (synthetic) benchmarks, yes.
In real life, absolutely not.


Reply 134 of 311, by The Serpent Rider

Rank l33t
Scali wrote:

In cherry-picked (synthetic) benchmarks, yes.
In real life, absolutely not.

You're wrong again.
http://www.ixbt.com/video2/over2k4-ss-1024.shtml
http://www.ixbt.com/video2/over2k4-ut-1024.shtml
http://www.ixbt.com/video2/over2k4-cd-1024.shtml

By the way, it's a really nice article (a huge number of old cards tested):
http://ixbtlabs.com/articles2/over2k4


Reply 136 of 311, by silikone

Rank Member
The Serpent Rider wrote:

Goodness, the GeForce 256 DDR bested that FX 5200.
I can't believe an acquaintance of mine played Half-Life 2 on such a thing.


Reply 137 of 311, by The Serpent Rider

Rank l33t
Scali wrote:

This is pointless.

Of course it is, since you have no proof. An AnandTech article with default settings and an old Athlon 1 GHz/SDRAM test bench is a joke.


Reply 138 of 311, by Scali

Rank l33t
The Serpent Rider wrote:
Scali wrote:

This is pointless.

Of course it is, since you have no proof. An AnandTech article with default settings and an old Athlon 1 GHz/SDRAM test bench is a joke.

See, this is why it's pointless. Reviews have been posted, and people can google other reviews if they like. Let everyone make up their own mind. This isn't worth my time.


Reply 139 of 311, by Reputator

Rank Member

To be fair, most of the GeForce 3's gains came from the GeForce 2 suffering from overdraw and poor bandwidth utilization. In older games you only really saw big gains at higher resolutions.

I actually compared the two when I looked at 4x2 NVIDIA architectures, and in most tests the NV20 architecture nearly doubled the NV15's performance.

https://www.youtube.com/c/PixelPipes
Graphics Card Database