VOGONS


Reply 20 of 103, by 386SX

User metadata
Rank l33t

Back at that time I imagine everyone felt that most Voodoo cards since the Voodoo2 meant spending on something already seen too many times, apart from the fps. In the end, with the low-res CRT monitors around, there was little reason to upgrade, but the GeForce 256 was indeed a fast card anyway; it maybe promised more than it could deliver, but it was still impressive tech. And with the DDR version followed by the GeForce2 GTS, that was an impossible situation for any competitor.

Reply 21 of 103, by Hoping

User metadata
Rank Oldbie
rasz_pl wrote on 2024-04-11, 19:15:

like Amazon? :--) https://www.forbes.com/sites/jonmarkman/2017/ … its-no-problem/ https://247wallst.com/special-report/2022/06/ … -turn-a-profit/
When you are on the stock exchange, profit is not important; market share and growth are.

3Dfx failed to grow in 2000. They didn't survive Nvidia's relentless R&D assault and lost market share to Nvidia's low-end offerings. It wasn't the GF256 that killed 3Dfx, nor was it the GF2 or GF3. It was TNT2 M64s and Vantas flooding the market in huge volume at ridiculously good prices.
I don't recall a year-2000 game that would struggle on a V3 3000, but people saw GeForce benchmarks and picked the winning brand.

What if Amazon or Tesla had exploded like other things did in 2008? I don't think infinite growth exists; IBM was huge and, well, it had to change a lot, as did other companies that have maintained themselves over the years. 3dfx did not see the change coming with the appearance of T&L, but to their credit it can be said that they were not the only ones. Other companies besides 3dfx were also left behind for not being able to keep up with Nvidia and Ati. The PowerVR Kyro cards could have been good contenders if they had had true HW T&L.
At Nvidia they are very, very good at marketing; the slogan "it's meant to be played" attracted many people without real knowledge of the hardware they were buying - the FX line, for example - and they managed to overcome several fiascos throughout their history thanks to their fans. 3dfx also did good marketing in its time, but they did not bring their products to market on time. The story might have been different if they had not had so many delays; even though their products were not the best, they had many loyal fans.
3dfx still has many fans; the proof is in the prices of the 3dfx cards whose only unique feature is the Glide API. There are cheaper and more powerful alternatives for D3D and OpenGL than the 3dfx cards, even the Nvidia FX ones, but still, for some illogical reason that I don't understand, many believe that a 3dfx is the essential card. And the only special 3dfx is the Voodoo 1, maybe the Voodoo 2; once almost all games used D3D or OpenGL, the 3dfx cards lost a lot of relevance outside small details that, for me, do not justify their current price. On the contrary, they should be cheaper than their more powerful contemporaries, but it is what it is.
I only have the Voodoo 1/2/3; the later ones don't seem interesting to me because they are very far behind the Ati and Nvidia cards in D3D and OpenGL performance, and I think all the games that use Glide can be played without problems on the ones I have.

Reply 22 of 103, by The Serpent Rider

User metadata
Rank l33t++
rasz_pl wrote:

I have no proof one way or the other, but I have a suspicion both Banshee and Voodoo3 do the same - that both are just a miniaturized V2 architecture running at faster clocks, with modified memory controllers and no other optimizations.

Well yeah, Banshee/Voodoo 3 are essentially just a Voodoo 2 on steroids with nicer image output (no blurry third-party GENDAC, plus a new dithering filter).

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 23 of 103, by DrAnthony

User metadata
Rank Newbie
havli wrote on 2024-04-11, 17:43:
DrAnthony wrote on 2024-04-10, 14:03:

Mostly true there. The Spectre line was set up with a separate geometry chip (I vaguely remember Sage as the codename) and everything else on a separate chip (Rampage). The entry-level line was 1 Sage and 1 Rampage, and the high end was 1 Sage and 2 Rampage chips. From the benchmarks that have come out using ES hardware and pre-alpha drivers, it was absolutely going to be a monster if it had made it to market.

Perhaps you have different information, but from what I know the Rampage ES boards barely ran, and the drivers were horrible too.

Also, I think the slowest Rampage was meant to be just the single Rampage chip - so no T&L and no vertex shader.

Anyway, considering the basic hardware specs, these cards wouldn't be that impressive in my opinion: a 200-250 MHz 4 ROPs / 4 TMU chip made on a 180 nm process. I think we estimated many years ago that the middle variant of 1x Rampage + 1x Sage could be around Radeon 9000 performance at best. That is just barely above a GeForce2 Ultra and much slower than a GF3. The 2+1 variant could be comparable to a GF3 or Ti500 maybe, but at the cost of much higher power consumption and a huge PCB.

Also, considering the first Rampage samples were barely working in November 2000, the cards could perhaps have been ready for market in mid-2001. So too late for any significant success.

I'll have to see if I can drag it back up, but it sounds like the information you're referencing was from the work around 2012 or so, when a few people figured out the dongle needed to get proper output (the levels were inverted) on the A0 boards, with the very, very basic drivers that didn't really expose the 3D engine. These were the ones like in that infamous picture with the C-clamp holding the chip on the board at the bring-up station. There was another wave a few years later (I want to say 2019 or so) with the A1 boards packing 2 Rampage chips and a Sage, using the "late night" drivers talked about in the lead-up to everyone getting laid off. The performance was well beyond any of the other first-gen DX8 hardware, and some tests were closer to late-DX8 / early-DX9 hardware (I want to say fill rate, but I know that's not right; it was something related to one operation the hardware essentially did for free that could be exploited to tremendous success). The bummer is that there are only a handful of Rampage chips, let alone assembled boards, in existence, and the collectors holding them aren't really the type to mess around with them.
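For context, the raw numbers in the thread can be sanity-checked with a quick theoretical fill-rate comparison. This is only a sketch: the Rampage clock and pipeline count are the thread's estimates, not confirmed specs, while the GeForce figures are the retail cards' published clocks.

```python
# Theoretical peak pixel fill rate = core clock (MHz) * pixel pipelines.
# Rampage figures are the estimates discussed in this thread, not real specs.
def fill_rate_mpix(clock_mhz: int, pipelines: int) -> int:
    return clock_mhz * pipelines  # megapixels per second

cards = {
    "Rampage (est. 200 MHz, 4 pipes)":   fill_rate_mpix(200, 4),
    "Rampage (est. 250 MHz, 4 pipes)":   fill_rate_mpix(250, 4),
    "GeForce3 (200 MHz, 4 pipes)":       fill_rate_mpix(200, 4),
    "GeForce2 Ultra (250 MHz, 4 pipes)": fill_rate_mpix(250, 4),
}
for name, mpix in cards.items():
    print(f"{name}: {mpix} Mpixels/s")
```

On paper the single-Rampage card would land in the same fill-rate ballpark as the shipping DX8 cards, which is why any big wins in testing would have to come from per-clock efficiency rather than raw throughput.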

Reply 24 of 103, by hornet1990

User metadata
Rank Newbie
Hoping wrote on 2024-04-11, 20:35:

3dfx did not see the change in the appearance of T&L, but to their credit it can be said that they were not the only ones. Other companies besides 3dfx were also left behind for not being able to keep up with Nvidia and Ati. The PowerVR Kyro cards could have been good contenders if they had true HW T&L.

The Kyro line was going to get T&L - the STG4800 had software T&L in the driver (previewed to reviewers but never released). The follow-on STG5500 was to get hardware T&L, and the STG6500 full DX8 vertex and pixel shaders (as I recall). However, STMicro pulled the plug on the Graphics Products Division in 2002, and you can probably thank Nvidia's shady business practices for a good chunk of that decision... I also wonder whether, if 3dfx hadn't bought STB and pushed most of their partners towards Nvidia, Kyro would have had more of a chance.

Reply 25 of 103, by havli

User metadata
Rank Oldbie
DrAnthony wrote on 2024-04-12, 00:45:

I'll have to see if I can drag it back up but it sounds like the information you're referencing was from the work around 2012 or so when a few people figured out the dongle needed to get proper output (the levels were inverted) on the A0 boards and the very very basic drivers that didn't really expose the 3D engine. […]

Interesting, I would love to see some pictures or more info on the A1 board. Because as far as I know there is just one PCB design - with a single Rampage chip and 32/64 MB of DDR memory. There are more pieces, let's say 5-10. Some are socketed, others have the chip soldered, but it's still the same board (most likely the A0 you refer to). The VGA dongle just inverts the colors; it shouldn't have any impact on the performance or function in general. https://web.archive.org/web/20210805060535fw_ … ampage_2012.htm

Then there is this picture, which is known to be fake http://www.3dfx.cz/rampage/spectre3000.jpg

The SAGE chips were manufactured, supposedly. There was even a die shot, but whether it is legit, who knows. However, I seriously doubt it was ever mounted on a PCB and saw any action.

HW museum.cz - my collection of PC hardware

Reply 26 of 103, by 386SX

User metadata
Rank l33t

It'd be interesting to know what dates are printed on the real prototypes' PCBs and components, to get some idea about that. Even the quad VSA-100 boards ended up in an infinite testing cycle, ending up even more obsolete at the end than they were at the beginning.

Reply 27 of 103, by The Serpent Rider

User metadata
Rank l33t++

Realistically, the quad VSA-100 was never meant to be used in a regular desktop environment.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 28 of 103, by Bruno128

User metadata
Rank Member

Scalability applied to an older rendering technique had little practical use back then, with T&L around and DX8 programmable shaders looming later in the year.
It is only now, more than 20 years later, that we can enjoy antialiased Glide, because we can target (cherry-pick) the very specific titles benefitting from it and ignore the others from that day.

Now playing: Red Faction on 2003 Acrylic build


SBEMU compatibility reports

Reply 29 of 103, by 386SX

User metadata
Rank l33t
The Serpent Rider wrote on 2024-04-12, 19:14:

Realistically, quad VSA-100 were never meant to be used in regular desktop environment.

I agree, and I suppose at that point it didn't have, or never had, a real market position given the many limitations that design would meet in any real-world consumer config. But in the end it was maybe not that expensive to keep alive as a project, more for the look and idea of a quad-video-chip card than anything else.

Reply 30 of 103, by Ozzuneoj

User metadata
Rank l33t
hornet1990 wrote on 2024-04-12, 10:20:
The Kyro line was going to get T&L - the STG4800 had software T&L in the driver (previewed to reviewers but never released). […]

Yeah, this is interesting to think about. When 3dfx fell to Nvidia, it had to have an impact on all of the other players that left the 3D market shortly after. Both from a motivation standpoint ("If they bulldozed 3dfx within two years, we don't stand a chance in this market.") and from the standpoint of Nvidia having such a large portion of the graphics industry's revenue to reinvest.

Obviously it's impossible to know, but imagine if 3dfx had been able to compete with the OEM-market saturation of the TNT2 M64, then matched the features and performance of the GeForce 256, then put out a competitive product during the GeForce 3-4 days when ATi was also having a decent amount of success with the Radeon and Radeon 8500, and Nvidia had still taken the misstep of the FX series while getting trounced by ATi's 9500\9700 and then 9600\9800 series... that could have had a pretty huge impact on where things ended up. We'd probably have seen other players (PowerVR, SIS\XGI, S3, Matrox, maybe even Intel) stick around at least a little longer, unless having 3 successful players completely saturated the market and left the others with no market share anyway.

Judging from Nvidia's 20+ years of success, like it or not, the company just seems to be led by people who make good business decisions and good engineering decisions. I don't like the current pricing or the stuff that happened during the "dark days" a few years back, but it's hard to imagine any other company holding back that amount of momentum forever.

Last edited by Ozzuneoj on 2024-04-13, 04:41. Edited 1 time in total.

Now for some blitting from the back buffer.

Reply 31 of 103, by leileilol

User metadata
Rank l33t++

Never released? Could've sworn the last Kyro2 drivers had the SW T&L in there. Giants recognized a HW T&L D3D rendering device afterwards, and bumpmapping worked. However, it wasn't a stable feature; some games that required it crashed instead, and the drivers did have registry keys to disable it on a per-game basis.

HW T&L wasn't the only big feature being picked up by games at the time; cubemapping was gaining the same steam, since the GeForce 256 also did that. UT2003, NOLF2 and Doom 3 used cubemapping plenty, and the Voodoo5 and Kyro could not do it. DXTC was the other thing in that rush that 3dfx couldn't do.

long live PCem

Reply 32 of 103, by 386SX

User metadata
Rank l33t

Also, the "EnT&L" was later available even for the regular Kyro II through drivers, and probably worked just like the usual DX8 software feature. Didn't even the later 3dfx drivers have some geometry-assist option, from what I remember?

Reply 34 of 103, by hornet1990

User metadata
Rank Newbie
leileilol wrote on 2024-04-13, 04:40:

Never released? Could've sworn the last Kyro2 drivers had the SW T&L in there. […]

I was talking about the unreleased STG4800 hardware, a higher-clocked refresh of the Kyro2 that was sampled but never put into production in 2002, as part of the progression towards hardware T&L. I know the drivers had it for the released hardware, as it was a purely software solution implemented within the driver.

WRT cube mapping, again that was due in the STG5500; in fact, one of my last tasks at STM was adding support for dynamically generating cube maps to the demo framework, as the functionality was to be used in the 5500 tech demo.

So if everything had stuck to schedule, by the end of 2002 there would have been a Kyro with hardware T&L and cube mapping. Drivers were Imagination's area though, so we could only hope they'd have upped their game (no pun intended!) to resolve a lot of the issues.

Reply 35 of 103, by Kruton 9000

User metadata
Rank Newbie
Ozzuneoj wrote on 2024-04-13, 04:37:

... and then Nvidia still took the misstep of the FX series while getting trounced by ATi's 9500\9700 and then 9600\9800 series...

But the FX series was developed by ex-3dfx engineers...

Reply 36 of 103, by havli

User metadata
Rank Oldbie
hornet1990 wrote on 2024-04-13, 07:38:

I was talking about the unreleased hardware STG4800, a higher clocked refresh of the Kyro2 that was sampled but never put into production in 2002, as part of the progression towards hardware T&L. […]

There are a few STG4800s around the world in the hands of collectors. And as far as I know it is just a 200 MHz Kyro 2, nothing more - so it runs exactly the same as a regular overclocked Kyro 2.

HW museum.cz - my collection of PC hardware

Reply 37 of 103, by hornet1990

User metadata
Rank Newbie
havli wrote on 2024-04-13, 08:19:

There are few STG4800 around the world in the hands of the collectors. And as far as I know it is just a 200 MHZ Kyro 2, nothing more - so it runs exactly the same as regular overclocked Kyro 2.

IIRC samples were sent to reviewers, so I'm not surprised some of them survived - these would likely have been from a pre-production run of the chips and boards (this occurred after I left, so I can't be 100% sure). But yes, you've just exactly described the higher-clocked refresh of the STG4500 that I said it was 😁

Reply 38 of 103, by 386SX

User metadata
Rank l33t

I remember previews and announcements talking about a 15% faster "mixed intelligent" (whatever that means) software T&L solution using both CPU and 3D resources... I suppose no silicon was involved in that, so maybe it was just an optimized software T&L running on both the Kyro II and II SE versions.

Reply 39 of 103, by Dolenc

User metadata
Rank Member

Just want to add.

Nowadays you can pair a Voodoo5 with a "from the future!" CPU, like a Core 2 Duo at 3 GHz+, which is probably faster than what a hardware T&L unit would provide, and there's not much benefit to it. Not really my field, but my guess is that the main "bottleneck" for those cards is the connection between the chips, which runs at PCI-66 speeds.
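To put that guess in perspective, a back-of-envelope comparison of a PCI-66-style link against each chip's local memory bus. The 32-bit width of the inter-chip link is my assumption (matching PCI 66); the 128-bit / 166 MHz local SDRAM figures are the Voodoo5 5500's published memory specs.

```python
# Back-of-envelope bandwidth comparison:
# assumed inter-chip link: 32-bit bus at 66 MHz (PCI-66-style)
# local memory per VSA-100: 128-bit bus at 166 MHz (Voodoo5 5500)
link_bw = 66_000_000 * 4     # bytes/s: 66 MHz * 4 bytes
mem_bw  = 166_000_000 * 16   # bytes/s: 166 MHz * 16 bytes

print(f"inter-chip link: {link_bw / 1e6:.0f} MB/s")  # 264 MB/s
print(f"local memory:    {mem_bw / 1e6:.0f} MB/s")   # 2656 MB/s
```

Under those assumptions the link carries roughly a tenth of the bandwidth each chip has to its own memory, which would indeed make it the first suspect once the CPU stops being the limit.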