VOGONS

First post, by Presbytier

Rank: Member

I was looking at picking up the Kyro II-based Prophet 4500 for a Win98 gaming PC. One thing I am having a hard time finding is its exact DirectX support level. I couldn't care less about DX9, but I would like to get a card with at least 8.1 support. If anyone knows or can point me to documentation on this, it would be helpful.

"Never pay more than 20 dollars for a computer game" - Guybrush Threepwood

Reply 1 of 15, by kixs

Rank: l33t

From Wikipedia:

"Basically it is a 'false DirectX 7' card that supports just DirectX 6 features."

It has software implementation of T&L.

Requests are also possible... /msg kixs

Reply 2 of 15, by leileilol

Rank: l33t++

If you want 8.1 FEATURE support (pixel shading et al) head for a Radeon 8500 or a Geforce4 Ti at the least.

If you want 8.1 API support, Kyro 2 will be fine.

Note that the software T&L wasn't implemented until a 2003+ driver. It was common to run 3DAnalyze alongside some games (like BF1942 / NOLF2) just to get them to start with the card at all. Much of the T&L compromise was marketed around the then-unique tiled rendering, hidden surface removal, overdraw mitigation, and "true high colour": a 32-bit buffer put through a 16-bit post-dither+filter, making it as handy as a Voodoo3-5 for the eternally 16-bit 3D games.
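The "32-bit buffer put through a 16-bit post-dither" step can be pictured with a toy example. The sketch below is a generic ordered-dither in Python using a standard 4x4 Bayer matrix and RGB565 packing; these are textbook choices for illustration, not PowerVR's actual hardware algorithm:

```python
# Standard 4x4 Bayer threshold matrix (values 0..15).
BAYER_4X4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def dither_channel(value8, bits, x, y):
    """Reduce an 8-bit channel to `bits` bits with ordered dithering."""
    drop = 8 - bits                      # precision being thrown away
    threshold = BAYER_4X4[y % 4][x % 4]  # 0..15, depends on pixel position
    # Add a sub-quantum bias before truncating; neighbouring pixels get
    # different biases, so the average over an area preserves the tone.
    biased = value8 + ((threshold * (1 << drop)) >> 4)
    return min(biased, 255) >> drop

def rgb888_to_rgb565_dithered(r, g, b, x, y):
    """Pack one 32-bit pixel into 16-bit 5:6:5 with ordered dithering."""
    r5 = dither_channel(r, 5, x, y)
    g6 = dither_channel(g, 6, x, y)
    b5 = dither_channel(b, 5, x, y)
    return (r5 << 11) | (g6 << 5) | b5

# The same 32-bit grey can land on different 16-bit values depending on
# screen position; averaged over an area, that is what hides the banding.
print(hex(rgb888_to_rgb565_dithered(132, 132, 132, 0, 0)))
print(hex(rgb888_to_rgb565_dithered(132, 132, 132, 1, 0)))
```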

long live PCem

Reply 3 of 15, by Presbytier

Rank: Member

The majority of the games I'd be playing appear to be DX6 or OpenGL, so it doesn't seem DX8 support will be a big deal. I was mostly looking at this card because: 1) it's different (I had a Voodoo 3 back then); 2) the 32-bit dithering effect intrigued me; 3) it's pretty cheap, unlike the Voodoo cards, which cost a ridiculous amount IMO. I'm trying not to spend more than 150 on this project.

"Never pay more than 20 dollars for a computer game" - Guybrush Threepwood

Reply 4 of 15, by Scali

Rank: l33t
kixs wrote:

From Wikipedia:

"Basically it is a 'false DirectX 7' card that supports just DirectX 6 features."

It has software implementation of T&L.

I think that is somewhat of a misnomer.
Hardware T&L is only one of the features introduced in DX7.
The Kyro II supports various things only supported in DX7 (or later).
Also, hardware T&L is not a requirement for DX7 or higher (Intel even produced a number of chips with PS2.0, but no hardware T&L whatsoever), so 'false DirectX 7' is a weird way to describe the card.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 5 of 15, by Scali

Rank: l33t
leileilol wrote:

Note that the software T&L wasn't implemented until a 2003+ driver. It was common to run 3DAnalyze alongside some games (like BF1942 / NOLF2) just to get them to start with the card at all. Much of the T&L compromise was marketed around the then-unique tiled rendering, hidden surface removal, overdraw mitigation, and "true high colour": a 32-bit buffer put through a 16-bit post-dither+filter, making it as handy as a Voodoo3-5 for the eternally 16-bit 3D games.

The thing is, basically, that you got GeForce2-like performance for a fraction of the price. At least in games of that era, which didn't benefit much from T&L yet (initially, many reviewers questioned the usefulness of T&L on the GeForce series). In terms of fillrate, texturing and shading, the Kyro II was a good match for the GeForce2.
As games started to push more geometry (e.g. Far Cry), the Kyro II fell behind.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 6 of 15, by bakcom

Rank: Newbie

What do you want it for? If for general-purpose gaming, Kyros are non-standard cards that will likely have frequent compatibility problems. A GeForce 4 Ti is a much better idea for Win98 gaming. Maybe a Radeon 9500 Pro, but I don't know what the Win98 driver situation is there.

Check out this older thread on Win98 cards.

Reply 7 of 15, by Presbytier

Rank: Member
bakcom wrote:

What do you want it for? If for general-purpose gaming, Kyros are non-standard cards that will likely have frequent compatibility problems. A GeForce 4 Ti is a much better idea for Win98 gaming. Maybe a Radeon 9500 Pro, but I don't know what the Win98 driver situation is there.

Check out this older thread on Win98 cards.

From what I have gleaned, they had solid drivers without many compatibility issues. It uses standard OpenGL and DirectX, no weird proprietary API or wrappers. The main thing here is that 16-bit games will look better with this card, and it has solid 32-bit performance based on benchmarks from that period. I am not looking for the most powerful card, but these are pretty plentiful on eBay and don't cost nearly as much as the equivalent 3dfx cards, which are now way overpriced.

"Never pay more than 20 dollars for a computer game" - Guybrush Threepwood

Reply 8 of 15, by leileilol

Rank: l33t++

It does have a texture filter precision issue, though. Some extremely stretched-out textures may jitter and get blocky, which can happen in many '96-'98 games, on big blurry terrain for example.
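That blockiness can be illustrated with a toy 1-D filter: when the weight between two texels is snapped to only a few fractional bits, a heavily magnified texture turns into a visible staircase. The bit counts below are made up for illustration, not the Kyro's real figures:

```python
def sample_1d(texel_a, texel_b, u, frac_bits):
    """Linearly filter between two texels with `frac_bits` of sub-texel
    precision; u in [0, 1) is the position between them."""
    steps = 1 << frac_bits
    w = int(u * steps) / steps   # interpolation weight snapped to a grid
    return texel_a * (1 - w) + texel_b * w

# Magnify one texel pair (values 0 and 255) across 16 screen pixels.
hi = [round(sample_1d(0, 255, x / 16, frac_bits=8)) for x in range(16)]
lo = [round(sample_1d(0, 255, x / 16, frac_bits=2)) for x in range(16)]

print(hi)  # a smooth ramp: 16 distinct values
print(lo)  # only 4 distinct values: a blocky staircase
```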

long live PCem

Reply 9 of 15, by Arctic

Rank: Oldbie

I bought a Kyro 2 (3D Prophet 4500) when it came out.
I had to return the card 2 times because of a faulty fan but I never had to complain about the performance or the drivers.

The last drivers were really cool. Battlefield 1942 ran fine on a Duron 1200 😀

I had worse problems with ATi cards.
No matter which ATi generation (8500LE, 9600XT, X800) I threw at "Die Hard: Nakatomi Plaza", I had glitches every time.

Reply 10 of 15, by bakcom

Rank: Newbie
Presbytier wrote:

It uses standard OpenGL and DirectX, no weird proprietary API or wrappers.

Supporting D3D/OGL is not the same as supporting it well, particularly when game developers didn't check on this card.

The main thing here is that 16-bit games will look better with this card, and it has solid 32-bit performance based on benchmarks from that period. I am not looking for the most powerful card, but these are pretty plentiful on eBay and don't cost nearly as much as the equivalent 3dfx cards, which are now way overpriced.

This won't help with Glide games, if that's what you mean. For that you'd either need a Voodoo, or maybe modern Glide wrappers can do the job.

Reply 12 of 15, by Presbytier

Rank: Member

Also, there are very few Glide-only games worth playing, so I'm not too worried about Glide support, at least not worried enough to pay the prices people are asking for a Voodoo card.

"Never pay more than 20 dollars for a computer game" - Guybrush Threepwood

Reply 13 of 15, by bakcom

Rank: Newbie

What are some examples of 16-bit resource games?

But how's dithering helpful if the game uses 16-bit only resources? If they are 32-bit but the game sets the render output to 16-bit mode, isn't it better to just use some wrapper to force 32-bit output?

Reply 14 of 15, by Presbytier

Rank: Member
bakcom wrote:

What are some examples of 16-bit resource games?

But how's dithering helpful if the game uses 16-bit only resources? If they are 32-bit but the game sets the render output to 16-bit mode, isn't it better to just use some wrapper to force 32-bit output?

That is what the card did (like the later Voodoo cards, just better). All coloring is done internally in 32-bit and then down-sampled, so even a 16-bit color pattern can be enhanced, or a 32-bit game can get a bit of a speed bump by running in 16-bit with minimal visual impact.

EDIT: From the HardOCP review: "Also because of the nature of Imagination's Tile Based Rendering, everything is rendered at 32bit true color on the chip, and then only if you have 16bit color selected in the game does it dither it down to 16bit. Because of the internal 32bit color even it's 16bit colors are very very sharp and precise, producing excellent image quality and not much of a performance hit going from 16bit to 32bit colors."

"Never pay more than 20 dollars for a computer game" - Guybrush Threepwood

Reply 15 of 15, by Scali

Rank: l33t
bakcom wrote:

But how's dithering helpful if the game uses 16-bit only resources? If they are 32-bit but the game sets the render output to 16-bit mode, isn't it better to just use some wrapper to force 32-bit output?

16-bit doesn't have enough precision for doing actual alphablending. Therefore 16-bit renderers generally used dithering patterns to 'fake' alphablending.
The Kyro II doesn't need to do this, as it always renders with 32-bit precision internally. 16-bit resources are converted to 32-bit on-the-fly when textures are sampled. If you use a 16-bit videomode, then the resulting 32-bit rendered tiles are converted back to 16-bit on-the-fly when stored in VRAM.
This process gives you better image quality than traditional 16-bit rendering.
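As a toy illustration of that precision argument (my own numbers and a Python sketch, not actual Kyro internals): stacking several translucent layers in a 5-bit-per-channel framebuffer accumulates quantization drift, while blending at full 8-bit precision internally and quantizing once at the end stays close to the full-precision result:

```python
def quant5(v8):
    """8-bit channel -> the 5 bits an RGB565 framebuffer keeps."""
    return v8 >> 3

def expand5(v5):
    """5-bit channel -> 8 bits (replicate the top bits)."""
    return (v5 << 3) | (v5 >> 2)

def blend(src, dst, alpha256):
    """Integer alpha blend: alpha256/256 of src over dst."""
    return (src * alpha256 + dst * (256 - alpha256) + 128) >> 8

def layered_16bit(base8, layer8, alpha256, passes):
    """Traditional 16-bit pipeline: every blend reads and writes the
    quantized 5-bit framebuffer value."""
    v5 = quant5(base8)
    for _ in range(passes):
        v5 = quant5(blend(layer8, expand5(v5), alpha256))
    return expand5(v5)

def layered_32bit(base8, layer8, alpha256, passes):
    """Kyro-style pipeline: blend on-chip at 8-bit precision,
    quantize only once when the finished tile is written out."""
    v8 = base8
    for _ in range(passes):
        v8 = blend(layer8, v8, alpha256)
    return expand5(quant5(v8))

# Eight layers of ~10% translucency over one channel:
print(layered_16bit(200, 64, 26, 8))   # drifts away from the true value
print(layered_32bit(200, 64, 26, 8))   # stays close to it
```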

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/