VOGONS


First post, by villeneuve

Rank: Newbie

I just read about the internal true-color rendering the Kyro II had. Can anybody provide 16-bit screenshots of Incoming running on the Kyro II? And does anyone know whether Nvidia, ATI, or another company like Matrox implemented that feature in their products as well, or is the Kyro II's 16-bit rendering the best there ever was?
Since Christmas we've been doing a two-player Windows 9x LAN session every few days, mostly playing StarLancer in co-op, both of us using Quadro 4 980 XGL cards. When you look towards a bright sky, the dithering dots that can also be seen on the 16-bit TNT screenshots in this thread look pretty bad. I only used a Voodoo Banshee and a Voodoo5 back in the day and don't remember ever having that issue. So it looks like at least in the GeForce 4/Quadro 4 days Nvidia's 16-bit rendering still looked as bad. Considering that games with 16-bit rendering were already outdated by then, I wonder whether Nvidia and the others ever bothered to improve their 16-bit rendering quality at that point.
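For context on why those dots appear at all, here is a minimal sketch (not actual driver code; the Bayer matrix and gradient values are illustrative assumptions): a 16-bit RGB565 framebuffer keeps only 5 or 6 bits per channel, and ordered dithering trades the resulting banding for the dot pattern visible on smooth gradients like a bright sky.

```python
# Sketch: 8-bit channel -> 5-bit channel with 4x4 ordered (Bayer) dithering.
# Illustrative only; real hardware dither patterns vary by vendor.

BAYER_4X4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def quantize_channel(value8, bits, x, y):
    """Quantize an 8-bit channel value to `bits` bits, with the dither
    threshold chosen by the pixel's screen position (x, y)."""
    step = 256 // (1 << bits)                   # size of one quantization step
    threshold = BAYER_4X4[y % 4][x % 4] / 16.0  # position-dependent offset
    dithered = min(255, value8 + int(step * threshold))
    return dithered >> (8 - bits)               # keep the top `bits` bits

# A smooth horizontal sky gradient: neighbouring pixels land on different
# sides of a quantization step, producing the visible dot/checker pattern.
row = [quantize_channel(128 + x, 5, x, 0) for x in range(8)]
```

A card that renders internally at 32 bits (as the Kyro II reportedly did) only quantizes once, at final output, instead of accumulating these errors across every blending pass.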

Reply 1 of 6, by The Serpent Rider

Rank: l33t

Some drivers can force 32-bit color depth or a 24-bit Z-buffer. For example, the Voodoo 4/5 can force 32-bit color in Glide/OpenGL games. Enabling anti-aliasing can also drastically reduce the dithering effect.
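The anti-aliasing point can be illustrated with a small sketch (hypothetical values, not vendor code): supersampling averages several dithered samples per output pixel, and that average recovers precision the 16-bit quantization threw away, blending the dot pattern back toward the true intensity.

```python
# Sketch: why supersampling AA hides 16-bit dithering.
# Illustrative 2x2 Bayer pattern and 5-bit channel; values are assumptions.

BAYER_2X2 = [[0, 2], [3, 1]]

def dither5(value8, x, y):
    """8-bit -> 5-bit channel with a 2x2 ordered-dither threshold."""
    step = 256 // 32                            # one 5-bit quantization step
    dithered = min(255, value8 + step * BAYER_2X2[y % 2][x % 2] // 4)
    return dithered >> 3

def supersampled(value8, x, y):
    """Average a 2x2 block of dithered samples, as 2x2 SSAA would do
    when downsampling, then rescale back to the 8-bit range."""
    samples = [dither5(value8, 2 * x + dx, 2 * y + dy)
               for dy in (0, 1) for dx in (0, 1)]
    return sum(samples) / 4 * 8

# For intensity 132: plain truncation to 5 bits gives 16 (i.e. 128)
# everywhere, while averaging the four dithered samples lands on 132.
```

This is also why the dots are most visible on flat, bright areas: with no AA, nothing averages the pattern away except the viewer's eye (or a blurry CRT at high resolution, as noted below).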

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 2 of 6, by villeneuve

Rank: Newbie

As for AA, we have it cranked up to the max in the Nvidia drivers, though I'm not sure which options the drivers offer for the other things you mentioned. I'll have a look during our next session.
I did force 32-bit on my Voodoo5 back then. I don't remember any negative effects on picture quality, but are there any known ones?
Apart from all that, I'm still interested in whether any brand improved its native 16-bit rendering after the Kyro II's release, without having to rely on FSAA/SSAA or the like.

Reply 3 of 6, by 386SX

Rank: l33t

Even if, IMHO, the whole 16-bit vs. 32-bit debate back then was largely a marketing-driven, self-convinced need (though the transition was necessary sooner or later), the Kyro II definitely did a great job with native 16-bit games, and I remember the differences quite clearly. The games where I could immediately see it were the ones where high-contrast lighting exposed the palette limits, such as games with night environments, for example Thief/Thief II. The Kyro II was awesome: where other cards showed the usual 16-bit lighting "problems", with that card the difference was clearly visible.

Reply 4 of 6, by The Serpent Rider

Rank: l33t

That depends. You can crank up the resolution in 16-bit color with a GeForce card. Dithering is barely noticeable at 1600x1200, especially on a CRT display, and performance should still be good in most games.


Reply 5 of 6, by appiah4

Rank: l33t++

Could a mod please split the Kyro discussion into a new thread? The thread necromancy here is… just wow.

Retronautics: A digital gallery of my retro computers, hardware and projects.

Reply 6 of 6, by Stiletto

Rank: l33t++
appiah4 wrote on 2021-06-22, 19:03:

Could a mod please split the Kyro discussion into a new thread? The thread necromancy here is… just wow.

Done. 😀

"I see a little silhouette-o of a man, Scaramouche, Scaramouche, will you
do the Fandango!" - Queen

Stiletto