VOGONS


3dfx Rampage alive

First post, by swaaye

User metadata
Rank l33t++

Gary Donovan (of here) posted some screens and videos of his efforts to bring the card to life. Interesting stuff.

http://forum.beyond3d.com/showthread.php?t=2289&page=9 (near the bottom of the page)

Reply 2 of 14, by F2bnp

User metadata
Rank l33t

Ah so he finally managed to get it working eh?
Interesting stuff indeed!

Reply 3 of 14, by cdoublejj

User metadata
Rank Oldbie

Not fully; it lacks proper drivers that unlock its full performance.

Reply 4 of 14, by leileilol

User metadata
Rank l33t++

That Quake3 Demo video looks strange. Vertex lighting? Unblended sky? How early is the OpenGL ICD for the Rampage?

long live PCem

Reply 5 of 14, by subhuman@xgtx

User metadata
Rank Oldbie
leileilol wrote:

That Quake3 Demo video looks strange. Vertex lighting? Unblended sky? How early is the OpenGL ICD for the Rampage?

Gary says that he had to disable several engine features via config.cfg in order to reduce the stress on the card
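Gary's actual config wasn't posted, but the symptoms leileilol noticed (vertex lighting, unblended sky) line up with standard Quake 3 cvars that cut per-pixel work. A hypothetical config.cfg fragment along those lines might look like this — these are real id Tech 3 cvars, but the specific values are guesses, not Gary's settings:

```
// Hypothetical config.cfg fragment -- standard Quake 3 cvars,
// not Gary's actual configuration.
seta r_vertexLight "1"   // per-vertex lighting instead of lightmaps (drops a texture pass)
seta r_fastsky "1"       // flat sky colour, skips the blended skybox
seta r_dynamiclight "0"  // disable dynamic lights
seta cg_shadows "0"      // disable shadows
seta r_picmip "2"        // lower-resolution textures to reduce memory traffic
```

Settings like r_vertexLight and r_fastsky would explain exactly the "vertex lighting" and "unblended sky" oddities visible in the video.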


Reply 6 of 14, by sliderider

User metadata
Rank l33t++

That's also the lowest configuration of Rampage. The full-blown version was to have had two GPUs and a separate T&L unit. That's why Rampage would have been doomed to failure even if it had come out on time: it would have been far too expensive to build and power hungry. The low-end units without the T&L unit would have been pretty useless, because T&L proved itself a desirable feature during the DX7 era, and these would have been DX8 cards, so most games would have assumed a T&L unit on the card by then. They would have needed several more years of development time to fix bugs and consolidate everything onto one chip, and the industry would have moved on again, just as it did when the V4 and V5 were released. DX9 cards would have been out by the time Rampage was ready.

Reply 7 of 14, by Putas

User metadata
Rank Oldbie

While Rampage would not be my favourite either, I would not write it off on paper specs alone. Sage was not just another T&L unit. I am interested in finding out how 3dfx improved 3D over the Voodoos, as this was their only complete overhaul; after all, the company had accumulated more talent by the end than in the 'rampaging' beginning. However, I do not believe there will ever be people able and willing to write a proper driver for any complex test, even if documentation were available.

Reply 8 of 14, by swaaye

User metadata
Rank l33t++

Gary's engineer contact said he figured the product was a year away from retail. This hardware is probably still quite bug-ridden. I'm surprised they didn't have a more complete driver ready, considering they had real hardware working. They obviously knew all about what they were building.

Rampage probably would have gone down like the VSA-100: too late and beaten by NV (and ATI, by that point). It probably would have come out after the Radeon 8500.

Reply 9 of 14, by Putas

User metadata
Rank Oldbie

One year? The only bug I read about was the colour inversion; that would be one respin, with possible availability in half a year. The chip should have far fewer transistors than NV20/R200, so it would not be disastrous if Rampage were slower than those.

Reply 11 of 14, by nforce4max

User metadata
Rank l33t
sliderider wrote:

That's also the lowest configuration of Rampage. The full-blown version was to have had two GPUs and a separate T&L unit. That's why Rampage would have been doomed to failure even if it had come out on time: it would have been far too expensive to build and power hungry. The low-end units without the T&L unit would have been pretty useless, because T&L proved itself a desirable feature during the DX7 era, and these would have been DX8 cards, so most games would have assumed a T&L unit on the card by then. They would have needed several more years of development time to fix bugs and consolidate everything onto one chip, and the industry would have moved on again, just as it did when the V4 and V5 were released. DX9 cards would have been out by the time Rampage was ready.

Not really power hungry, but very complex and expensive to produce. Very interesting though, and I am wondering if Nvidia used their share of samples for R&D in later projects. Did any of you see his latest video showing Rampage running DX tests? I was surprised that it ran DX9 and passed 😮
All 3dfx should have done was massively scale up their designs on the high end instead of depending on SLI alone.

On a far away planet reading your posts in the year 10,191.

Reply 12 of 14, by Putas

User metadata
Rank Oldbie
nforce4max wrote:

All 3dfx should have done was massively scale up their designs on the high end instead of depending on SLI alone.

You make it sound easy. Even if such a design had been made, both Napalm and Rampage were already pushing the transistor limits of their manufacturing process. I wonder why 3dfx was transitioning so slowly.

Reply 13 of 14, by nforce4max

User metadata
Rank l33t
Putas wrote:
nforce4max wrote:

All 3dfx should have done was massively scale up their designs on the high end instead of depending on SLI alone.

You make it sound easy. Even if such a design had been made, both Napalm and Rampage were already pushing the transistor limits of their manufacturing process. I wonder why 3dfx was transitioning so slowly.

Everything they did was done in software long before there was any tape-out, and I think they didn't invest enough in their R&D department. The more raw compute power there is to run the simulations, the quicker the bugs can be worked out and the ASIC validated. They also should have migrated to flip-chip designs instead of the older plastic-capped BGAs. Had they done that and allowed for slightly higher power consumption, scaling up would not have been as much of a problem, except for one thing: at the fab level they were still using older wafer tech (most likely 200 mm or less), so the larger dies would have been a big problem, though they could have harvested enough of the defective dies to cover some of the costs. Nvidia's, ATI's, Matrox's, and 3dfx's designs almost all required a fully functional chip to even go to market.

On a far away planet reading your posts in the year 10,191.

Reply 14 of 14, by Putas

User metadata
Rank Oldbie

I did not imply the problem was in their internal tools. Also, I doubt they underinvested, since their actual position was never that good and they lacked cash. More likely they just canned projects and failed to capitalize on acquisitions.