VOGONS


Curious about Voodoo 3 and on


Reply 20 of 31, by nforce4max

Rank l33t
Putas wrote:
idspispopd wrote:

Even Voodoo 2 SLI effectively only has 4 MB of texture memory (2 MB if you use 8 MB cards.)

Where do these claims come from?

Texture memory and frame buffer memory are two different things; the 4 MB figure refers to the frame buffer. The texture memory scales while the frame buffer mirrors: 4 MB is the maximum frame buffer, but the texture memory can scale up to 16 MB per TMU according to the documentation from 3dfx. In practice a V2 can be modded to 8 MB per TMU, and it scales in SLI. All the good that will do is make for better 1024x768 performance and higher settings, but the frame buffer still holds things back a little. The V3 typically uses between 4 and 5 MB for the frame buffer while the rest is texture memory; the V5 5500 will use 11 MB for the frame buffer and the rest for textures.
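
For anyone following the arithmetic, here is a minimal sketch of the split being described, using only the per-card figures quoted in this thread (note that a later reply disputes the claim that the frame buffer mirrors in SLI):

```python
# Minimal sketch of the memory split described above.  The figures are
# the ones quoted in this thread, not taken from a datasheet.

V2_CARDS = {
    # name: (total MB, frame buffer MB, TMUs)
    "Voodoo2 8MB":    (8, 4, 2),
    "Voodoo2 12MB":   (12, 4, 2),
    "Voodoo2 modded": (20, 4, 2),  # the hypothetical 8 MB/TMU mod mentioned above
}

for name, (total, fb, tmus) in V2_CARDS.items():
    tex_per_tmu = (total - fb) / tmus  # what's left is split across the TMUs
    print(f"{name}: {fb} MB frame buffer, {tex_per_tmu:.0f} MB texture per TMU")
```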

On a far away planet reading your posts in the year 10,191.

Reply 21 of 31, by sliderider

Rank l33t++
Forevermore wrote:

I suppose, as I am only going off benchmarks. And you're right about the quality of DACs in GF2 cards. Most of them are hopeless. TV-out is even worse.

The GF256 was highly overrated; a TNT2 Ultra had parity with it in most games of the era.

The GF256 was actually pretty revolutionary for the time. The problem with being revolutionary, though, is that it sometimes takes a while for others to accept your ideas as the new normal. It wasn't until the DX8 era was in full swing that hardware T&L finally caught on. Those early GeForce cards were also hamstrung by a poor memory architecture that didn't let them reach their full potential fill rate. So yes, some older cards with lower fill rates could still keep up in many instances, but that doesn't diminish the significance of the original GeForce on the video card market, because the new ideas it introduced were eventually accepted as mainstream even though they weren't very useful in their own time.

Reply 22 of 31, by Forevermore

Rank Member
sliderider wrote:

The GF256 was actually pretty revolutionary for the time. […]

Oh of course, I'm certainly not denying the GF256's significance in that respect (even though HW T&L wasn't a new idea at the time). I was just referring to how glorified it was at the time simply because it had HW T&L.

I personally think the G400 was more revolutionary for its day. Matrox had a lot of good ideas, just never really exploited them.

So many combinations to make, so few cases to put them in.

Reply 23 of 31, by Davros

Rank l33t

It's 4 MB of texture memory on 8 MB cards.

@Putas
From the wiki:
"RAM configured into 4 Megabytes for frame buffer(s) and Z-buffer and 4 or 8 Megabytes texture memory."
Remember, if you have two 12 MB cards you have 2x12 MB, not 24 MB.

Guardian of the Sacred Five Terabyte's of Gaming Goodness

Reply 24 of 31, by d1stortion

Rank Oldbie
Putas wrote:
idspispopd wrote:

Even Voodoo 2 SLI effectively only has 4 MB of texture memory (2 MB if you use 8 MB cards.)

Where do these claims come from?

You have a site about old GPUs and don't know about this? 😀

Carmack explained it well:

John Carmack's .plan, Feb 16 1998 wrote:

An 8mb v2 has 2 mb of texture memory on each TMU. That is not as general as the current 6mb v1 cards that have 4 mb of texture memory on a single TMU. To use the multitexture capability, textures are restricted to being on one or the other TMU (simplifying a bit here). There is some benefit over only having 2 mb of memory, but it isn’t double. You will see more texture swapping in quake on an 8mb voodoo 2 than you would on a 6mb voodoo 1. However, the texture swapping is several times faster, so it isn’t necessarily all that bad.

If you use the 8 bit palettized textures, there will probably not be any noticable speed improvement with a 12 mb voodoo 2 vs an 8 mb one. The situation that would most stress it would be an active deathmatch that had players using every skin. You might see a difference there.

A game that uses multitexture and 16 bit textures for everything will stress a 4/2/2 voodoo layout. Several of the Quake engine licensees are using full 16 bit textures, and should perform better on a 4/4/4 card.

The differences probably won’t show as significant on timedemo numbers, but they will be felt as little one frame hitches here and there.
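
Putting rough numbers on the 4/2/2 versus 4/4/4 point: in Quake the world textures are pinned to one TMU and the lightmaps to the other, so each class only has its own pool to live in. A minimal sketch (the working-set sizes below are made up purely for illustration):

```python
# Sketch of Carmack's point: with per-TMU texture pools, each pinned
# class of textures can only use its own pool, so an uneven working
# set swaps even when the total would fit in a unified pool.

def resident(world_mb, lightmap_mb, world_pool, lightmap_pool):
    """MB of each pinned class that stays resident; the rest gets swapped."""
    return min(world_mb, world_pool) + min(lightmap_mb, lightmap_pool)

world, lmaps = 3.5, 0.5  # say 3.5 MB of world/skin textures, 0.5 MB of lightmaps

print("6MB V1  (4 unified):", min(world + lmaps, 4.0), "MB resident")
print("8MB V2  (2+2 split):", resident(world, lmaps, 2, 2), "MB resident")
print("12MB V2 (4+4 split):", resident(world, lmaps, 4, 4), "MB resident")
```

With these numbers the 8 MB V2 keeps only 2.5 of 4 MB resident while the 6 MB V1 keeps all 4 MB, which is exactly the extra swapping Carmack describes.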

Reply 25 of 31, by idspispopd

Rank Oldbie

Based on d1stortion's post I did some more searching. I now think that it's not as simple as I stated, but that 4 MB of texture memory on an 8 MB Voodoo2 is not as valuable as 4 MB of texture memory on a card with a unified memory architecture (i.e. most other cards).
Putas, you even agreed here about the texture duplication issue on a single Voodoo2:
3DFX Voodoo2 vs Voodoo1 in Lower-End Pentiums - A Measure of Smoothness?

Putas wrote:

You are right Swaaye, I forgot they have it divided per tmu. If it has to duplicate everything then Voodoo2 is not very forward looking.

Some more interesting sources (referring to Voodoo1 which can also use two TMUs):
http://www.gamers.org/dEngine/xf3D/howto/3Dfx … TO-6.html#ss6.7

As each Texelfx can address 4MB texture memory, a dual Texelfx setup has an effective texture cache of up to 8MB. This can be true even if only one Texelfx is actually needed by a particular application, as textures can be distributed to both Texelfx, which are used depending on the requested texture. Both Texelfx are used together to perform certain operations as trilinear filtering and illumination texture/lightmap passes (e.g. in glQuake) in a single pass instead of the two passes that are required with only one Texelfx. To actually exploit the theoretically available speedup and cache size increase, a Glide application has to use both Texelfx properly.

The two Texelfx can not be used separately to each draw a textured triangle at the same time. A triangle is always drawn using whatever the current setup is, which can be to use both Texelfx for a single pass operation combining two textures, or one Texelfx for only a single texture. Each Texelfx can only access its own memory.

http://www.gamers.org/dEngine/xf3D/howto/3Dfx … -10.html#ss10.9

According to John Carmack: "If you are using multitexture, some textures have to be on opposite TMUs - in quake's case, all the environment textures must be on the bottom TMU and all the lightmaps must be on the upper TMU. Models skins could be in either, but it isn't optimally sorted out, so there is a definite packing loss."

(Couldn't find another source for this statement, though. Still sounds plausible.)

With SLI it's pretty obvious that textures have to be duplicated on both boards. One board renders the even scanlines, the other renders the odd scanlines, and the image is composed on one board, which receives the missing scanlines from the other board over the SLI cable as analog data. The TMUs on one board don't have access to the texture memory on the other board.
If one board rendered the upper half of the screen and the other the lower half, it might be possible for the two not to need the same textures, but with scan-line interleave that won't work.

@nforce4max: The frame buffer doesn't mirror. That's the reason why two Voodoo2s in SLI can display 1024x768, which a single Voodoo2 can't (see the sketch below).

See also the Wikipedia article on Voodoo2 for these statements:

Voodoo2 SLI not only doubled rendering throughput, it also increased the total framebuffer memory, and thus the maximum supported screen resolution increased to a then-impressive 1024×768. However, texture memory was not doubled because each card needed to duplicate the scene data.
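
A quick back-of-the-envelope check of the resolution claim, assuming 16-bit color with front, back, and Z buffers (the buffer count is an assumption on my part):

```python
# Why SLI unlocks 1024x768: each board only stores its own scanlines,
# so its frame buffer requirement is halved.

MB = 1024 * 1024

def fb_needed(width, height, buffers=3, bytes_per_pixel=2):
    return width * height * bytes_per_pixel * buffers / MB

for w, h in [(800, 600), (1024, 768)]:
    single = fb_needed(w, h)
    per_sli_board = fb_needed(w, h // 2)  # half the scanlines per board
    print(f"{w}x{h}: {single:.2f} MB on a single card, "
          f"{per_sli_board:.2f} MB per SLI board (4 MB limit)")
```

1024x768 comes out at 4.5 MB, over the 4 MB frame buffer of a single card, but only 2.25 MB per board in SLI.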

Reply 26 of 31, by Putas

Rank Oldbie

The duplication of textures in an SLI setup is a no-brainer. However, as seen from the Quantum3D FAQ, a single Voodoo card with multiple TMUs does not have to duplicate. I think the statement from JC was poorly worded or misinterpreted. Total usable capacity with typical textures will be just a hair below unified texture memory. Regarding my post from 2012-01-29, I have changed my mind since then.

Reply 27 of 31, by swaaye

Rank l33t++

I wonder if there is a texturing test that could demonstrate Voodoo2's effective capacity... The problem is that it isn't 8 MB, so you'd need some granularity to try lesser values, like 7 MB perhaps. Most benchmarks don't allow that AFAIK.

John Carmack's post seems to imply that getting the most effective storage space requires optimizing how the two TMUs interact, though.
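
The shape of such a test can at least be simulated: assume some fixed share of the working set is pinned to each TMU (the 70/30 split below is purely an assumption) and watch where a 12 MB card's 4+4 MB layout starts swapping.

```python
# Simulated capacity probe for a 4+4 MB split: the bigger pinned class
# starts swapping once it outgrows its own 4 MB pool, so effective
# capacity lands somewhere between 4 and 8 MB.

def swapped(total_mb, ratio=0.7, pool=4.0):
    big, small = total_mb * ratio, total_mb * (1 - ratio)
    return max(big - pool, 0.0) + max(small - pool, 0.0)

for total in [4, 5, 6, 7, 8]:
    print(f"{total} MB working set: {swapped(total):.2f} MB swapped "
          f"(a unified 8 MB pool would swap {max(total - 8, 0):.2f} MB)")
```

With a 70/30 split, swapping begins around a 5.7 MB working set, which is the kind of in-between value a real test would need the granularity to find.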

Reply 28 of 31, by IInuyasha74

Rank Newbie

Hello Everyone,

I have found my way here a few times in the past but never quite made out what this forum was. Reading through this thread I decided to go ahead and make an account.

I might be one of the youngest people to have a retro gaming system of this age. I was only 10 at most when the Voodoo3 came out, so I don't remember it, but I just wanted to support the idea of getting one if you are putting together a retro system for gaming. I have a Voodoo3 at the moment, and a big part of that is the price of the other Voodoo cards. I never owned one until this year, but with everything always being Nvidia, ATI, AMD, and Intel, I love finding parts from other companies. I think the 3dfx Voodoo line might be the most fondly remembered graphics cards of all time as far as nostalgia goes, because they were so revolutionary at first, and so expensive that it was a really big deal if you had one. I always wanted a Voodoo2 SLI pair, but they are really hard to find and expensive when you do. Still, after playing Monkey Island I couldn't pass on having a graphics card actually named "Voodoo", so I bought a Voodoo3 3000.

It is stuck in 16-bit color, with a kind of fake 20-bit-like color in truth. The card is pretty much the two TMUs off a Voodoo2, clocked higher and with a 2D graphics accelerator on board. From my understanding the Voodoo3 2000 is closest in 3D performance to a single Voodoo2, but still surpasses it as a result of roughly 1.5 times the clock speed, faster RAM, and a better architecture. Considering that, it makes sense that a Voodoo3 would be roughly the same as two Voodoo2 cards in SLI.
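
The rough fill-rate arithmetic behind that comparison, using the commonly cited core clocks (treat them as approximate, since memory bandwidth and drivers matter at least as much in practice):

```python
# One pixel pipe with two TMUs: core MHz ~ Mpixel/s single-textured,
# and twice that in Mtexel/s when both TMUs are used.

cards = {
    # name: (core MHz, boards)
    "Voodoo2 (single)": (90, 1),
    "Voodoo2 SLI":      (90, 2),
    "Voodoo3 2000":     (143, 1),
    "Voodoo3 3000":     (166, 1),
}

for name, (mhz, boards) in cards.items():
    print(f"{name}: ~{mhz * boards} Mpixel/s, ~{2 * mhz * boards} Mtexel/s")
```

By that crude measure a Voodoo3 3000 lands right around a Voodoo2 SLI pair, with the 2000 in between a single card and the pair.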

There are excellent third-party drivers for it on a dedicated webpage all about 3dfx cards. Mine also overclocks really well: without touching the voltage I raised it to a modest 175 MHz, and after installing RAM heatsinks I plan, once I start playing with it, to try to push it up to Voodoo3 3500 level. The Voodoo3 3500 has better-quality RAM, but with good cooling the Voodoo3 3000 can possibly match the stock 3500 clocks.

I actually built this system for playing games from around 1990, and I don't know what I was thinking, because it's so overpowered for what I use it for. I didn't realize the original Diablo was so old, but I was amazed I could play it with ease. If you are deciding on buying one, it is a really cheap way to get an excellent system for older games, but I would just say think about what you want to play. It plays everything I have tried really well, with the exception of Monkey Island 4, which runs terribly. I suspect that is because of the lack of true 32-bit color support.

Reply 29 of 31, by Putas

Rank Oldbie

The post filter makes the V3 my preferred card of the generation. You get quality between high color and true color while running at 16-bit speeds. For me, 1024x768x16 like that always beats 800x600x32.

Reply 30 of 31, by NitroX infinity

Rank Member
IInuyasha74 wrote:

Hello Everyone,
...

It is stuck in 16-bit color, with a kind of fake 20-bit-like color in truth.

...

It plays everything I have tried really well, with the exception of Monkey Island 4, which runs terribly. I suspect that is because of the lack of true 32-bit color support.

The graphics chip renders the image in 24-bit, but the output is converted to 16-bit. It should look better than 16-bit on any other card. If you want 32-bit, you'd have to go with a Voodoo4 4500, which has other benefits too (FSAA, support for larger textures, more memory, and AGP 4x, though only 3dfx's own V4 boards have AGP 4x).

NitroX infinity's 3D Accelerators Arena | Yamaha RPA YGV611 & RPA2 YGV612 Info

Reply 31 of 31, by leileilol

Rank l33t++

The Voodoo converts from 24-bit to 16-bit, and then applies a DAC filter to try and estimate a 24-bit picture out of the dither (leading to blurriness and unintentional edge smoothing in places). The Voodoo3 has extensive control over which filter to use in this case. Shame it doesn't support 24/32-bit textures...
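
A toy version of that pipeline, just to show the principle: quantize a smooth 24-bit gradient to 16-bit (565) with a tiny ordered dither, then run a small horizontal box filter over the output to estimate the intermediate shades again. The real filter is commonly described as a configurable 2x2 or 4x1 kernel on the output path; everything below is an illustrative sketch, not 3dfx's actual kernel.

```python
# Dither down to 565, then box-filter the result back toward the
# original gradient, roughly what the Voodoo post filter does.

BAYER2 = [[0, 2],
          [3, 1]]  # 2x2 ordered-dither thresholds

def to565(r, g, b, x, y):
    d = BAYER2[y % 2][x % 2] * 2  # small position-dependent bias
    def q(v, bits):
        return min(v + d, 255) >> (8 - bits) << (8 - bits)
    return q(r, 5), q(g, 6), q(b, 5)

def postfilter(row):
    """4x1 horizontal box filter, smoothing the dither pattern back out."""
    out = []
    for i in range(len(row)):
        window = row[max(0, i - 3):i + 1]
        out.append(tuple(sum(ch) // len(window) for ch in zip(*window)))
    return out

# A smooth 24-bit gray ramp: dither it to 16-bit, then filter it back.
row = [to565(v, v, v, x, 0) for x, v in enumerate(range(100, 132))]
print(postfilter(row)[:8])
```

The filtered values land near the original ramp again, which is the "estimate a 24-bit picture out of the dither" effect, along with the blur you'd expect from averaging neighboring pixels.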

long live PCem