VOGONS


First post, by candle_86

User metadata
Rank l33t

So I ran my Voodoo 3 through 3DMark2000 and it finished all the tests. So what's the actual hardware difference between DX6 and DX7 cards? Is it just that DX7-class hardware has T&L? And if so, why do some DX7 cards like the Radeon LE have no T&L?

Reply 1 of 8, by dionb

User metadata
Rank l33t++

DirectX is an API that defines drawing functions. Different versions do different things. A given card offers hardware support to some or all of those functions. Because it's not mandatory to support all functionality in hardware, it's possible that an old card can get support for a newer version of an API even if it doesn't natively support all the new functions. They then have to be handled in software, i.e. by the CPU.

The Voodoo 3 natively supports DirectX 6.0 in hardware, but there are (community-made) drivers available up to DirectX 9.0. The card itself isn't magically being upgraded; it just does what it can and the CPU does the rest. The consequence is that you end up CPU-limited very quickly when running applications that actually use the newer API versions.

The same applies to a Radeon VE or 7000 (the LE did have T&L): it provides hardware support for most of DX7, but not the T&L functions, which the CPU needs to handle. That will make it the same speed as a similarly clocked 'regular' Radeon if T&L isn't used, but much slower and CPU-dependent if T&L is used.

Reply 2 of 8, by Jo22

User metadata
Rank l33t++

I couldn't have said it any better. 😀
The only minor exception to this is/was DX 10 to 11 I believe.
There was a time when MS threw out the "cap bits" and required cards to support all DX (D3D?) features in hardware.
Anyway, I believe that changed again. That confusing "feature level" story is all water under the bridge now (or however it's said in English).

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 3 of 8, by candle_86

User metadata
Rank l33t

I understand that, but why can the Voodoo 3 complete 3DMark2000? I'm well aware of cap bits, new shaders etc. in later DX versions, but was there no new rendering technique required by DX7, which is why DX6-class cards can run the same tests?

Reply 5 of 8, by candle_86

User metadata
Rank l33t
spiroyster wrote:

DX7 requires TnL ... this can be done in hw or sw. Since Voodoo3 doesn't support hw TnL, I'm gonna guess the driver does it in sw.

Thank you, that's what I was trying to figure out. So basically my Pentium III 1.26 is my T&L 😁

Reply 6 of 8, by spiroyster

User metadata
Rank Oldbie

Yep, in OpenGL land you can remove the gfx card and have your CPU be your entire OpenGL implementation, courtesy of Mesa. It even supports Vulkan o.0 ... of course the phrase AZDO doesn't really apply in this case 😵

Reply 7 of 8, by silikone

User metadata
Rank Member

As far as I am aware, all DX7-class hardware also supports per-pixel lighting and dot product bump mapping.

Do not refrain from refusing to stop hindering yourself from the opposite of watching nothing other than that which is by no means porn.

Reply 8 of 8, by Scali

User metadata
Rank l33t
Jo22 wrote:

I couldn't have said it any better. 😀
The only minor exception to this is/was DX 10 to 11 I believe.
There was a time when MS threw out the "cap bits" and required cards to support all DX (D3D?) features in hardware.
Anyway, I believe that changed again. That confusing "feature level" story is all water under the bridge now (or how it is said in English).

Not entirely...
I made an overview here: https://scalibq.wordpress.com/2012/12/07/dire … -compatibility/

In short, all DX versions have a certain minimum requirement featureset.
DX1 through 7 were loose enough that basically any kind of 3D accelerator could support the minimum featureset. As such, they all ran on the same basic driver, and the DX runtime just presented capability bits to the application, so it could decide what and how to render.

DX10 is the only API where the minimum featureset is equal to the entire API featureset.
DX11 actually has a lower minimum featureset than DX10, which is somewhat strange, since DX11 is a superset of the DX10 API. It looks like DX10 just wasn't fully finished (in the SDK you could find some 'downlevel' flags for DX10, but they did not work yet).
DX11 has 'downlevel' support down to basic DX9 SM2.0 hardware (even with no hardware T&L or vertex shaders, with a CPU fallback).

Of course, in practice most applications did not support such a wide range of hardware, so e.g. for most DX9 games you need more than the absolute minimum hardware. Commonly you need SM2.0 or SM3.0 support, even though DX9 can run even on pre-shader hardware.
Half-Life 2 is a fine example of DX9 being used to its full potential, supporting hardware from the original GeForce and Radeon up to full SM3.0 DX9.0c hardware.

Another interesting tidbit of detail about the D3D API is that D3D8 was the first version to be completely 'standalone'.
D3D7 and earlier were extensions of DirectDraw: You would use DirectDraw to create the basic graphics buffers that you render to, and if you had a 3d accelerator, you could query the Direct3D acceleration API, and use it to render on the DirectDraw buffers.
DX8 dropped support for DirectDraw altogether, and Direct3D was now a 'complete' API, where it managed its own buffers and everything.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/