VOGONS


Retro 3D Accelerator Screenshot Collection


Reply 120 of 353, by leileilol

User metadata
Rank l33t++

Is it even slow in Motorhead? That game ran particularly smooth on Rage.

long live PCem

Reply 121 of 353, by TheLazy1

User metadata
Rank Member

No idea; if there's a demo I'll try it, but my D3D games library is pretty small.

Reply 122 of 353, by leileilol

User metadata
Rank l33t++

The demo is software-only and has no fun in it, so never mind.

long live PCem

Reply 123 of 353, by F2bnp

User metadata
Rank l33t

I tried the SiS 6326 and I was quite surprised by how compatible it is. All the Direct3D games I tried were displayed correctly, with almost no graphical bugs! Quake III also ran; at 320x240 with everything on the lowest settings it got around 13 fps.
I've taken some screenshots which I will upload later. Speed-wise, I think it's a little faster than the Millennium II.

Reply 124 of 353, by sprcorreia

User metadata
Rank Oldbie

It would be nice to see a review of the ATI Rage Fury (128). It seems that the Pro version was much better, but I would like to know how the Fury holds up against the TNT.

Reply 125 of 353, by swaaye

User metadata
Rank l33t++

I ran a Rage 128 Pro, which is the same chip as Rage Fury Pro. The 3D Rage Pro is different and I don't have one (not interested in them).

Actually there are two distinct Rage 128 chips. The Rage 128 and the Rage 128 Pro. The original chip has some problems and missing features compared to the Pro edition.

http://www.xbitlabs.com/articles/video/displa … ti-furypro.html

Reply 126 of 353, by HunterZ

User metadata
Rank l33t++

The TNT also had horrible 16-bit dithering, but it could run in true 32-bit before 3dfx cards could (and at a playable framerate).

Reply 127 of 353, by swaaye

User metadata
Rank l33t++

TNT's 16-bit is not even remotely as bad as what a Rage 128 puts out. The ATI chip causes a sort of noisy banding/posterization. Riva 128 isn't even as bad, and Riva TNT is much better than Riva 128.
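For anyone wondering where the banding itself comes from: a 16-bit framebuffer stores color as RGB565, so the 24-bit colors the game renders get quantized, and each chip then dithers that loss differently (some far more noisily than others). A minimal illustrative sketch in plain Python, not modeling any real card's dithering hardware:

```python
# Sketch: quantizing 24-bit color down to 16-bit RGB565, the root of
# the banding/dithering artifacts discussed above. Illustrative only;
# real chips (TNT, Rage 128) differ in how they dither the result.

def to_rgb565(r, g, b):
    """Pack 8-bit-per-channel color into a 16-bit RGB565 value."""
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

def from_rgb565(c):
    """Expand RGB565 back to 8 bits per channel (bit replication)."""
    r = (c >> 11) & 0x1F
    g = (c >> 5) & 0x3F
    b = c & 0x1F
    return ((r << 3) | (r >> 2), (g << 2) | (g >> 4), (b << 3) | (b >> 2))

# A smooth 0..255 gray ramp collapses to far fewer distinct shades
# after a round trip through 16-bit -- that's the visible banding.
levels = {from_rgb565(to_rgb565(v, v, v)) for v in range(256)}
print(len(levels))  # 64 distinct grays instead of 256
```

Dithering trades that banding for noise, which is why the quality difference between chips shows up as different noise patterns rather than different color counts.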

Reply 128 of 353, by HunterZ

User metadata
Rank l33t++

I remember that some games in 16-bit color had a gridlike pattern over the whole screen on my Viper V550, but it wasn't there when running in 32-bit color. It was much more distracting than your banding example. I can't find any examples, though, so maybe it was fixed at the driver level.

Reply 129 of 353, by swaaye

User metadata
Rank l33t++

Edited

Reply 130 of 353, by HunterZ

User metadata
Rank l33t++

Yep, that's why my first hardware-accelerated 3D video card was a TNT. People were talking about it as a serious competitor to the 3dfx empire.

I got it right around when Epic was first adding support for Direct3D rendering to Unreal, so Unreal had a lot of issues on my TNT for a while.

Reply 131 of 353, by sliderider

User metadata
Rank l33t++
DonutKing wrote:

Some cool info in this thread.

Kind of a bit newer than most of what is being discussed, but does anybody remember the Xabre series of cards? I owned the Xabre 400 for a while... they were terrible. They could barely outperform a GeForce 2, and had terrible texture filtering, so games were a soupy mess. As soon as you turned any sort of texture filtering on, everything started chugging.
They claimed to support DX8 vertex and pixel shaders, but this was done in software. The only game I saw it work in was Morrowind, and it was like rowing a boat through a sea of molasses when it was enabled.
They also didn't work with VIA chipsets, which is how I ended up with it: a mate with a KT333 board bought it, and when it didn't work I bought it off him cheap to upgrade my TNT2 Vanta (another dog of a card). I didn't hang on to it for long, though.

Prior to the Vanta I had an S3 ViRGE GX which never worked properly in hardware mode. A driver update might have helped it, but whenever I enabled hardware mode all the wall textures went white with a very thin strip of rainbow colours at the bottom edge: very strange. Character models seemed OK, though. All the DxDiag tests were fine, so I don't think it was a faulty card.

So basically I had 3 crappy cards in a row 😜

I've had a Xabre 600 on my "to get" list for a while now, but they don't come up for sale very often. SiS is better known for its integrated graphics than for its dedicated cards, so I don't think there are a lot of them out there.

On a side note, does anyone know if the Xabre II, the DX9 part SiS was supposed to be working on, was ever released in any form, integrated or otherwise?

Reply 132 of 353, by DonutKing

User metadata
Rank Oldbie

HunterZ wrote:

I remember that some games in 16-bit color had a gridlike pattern over the whole screen

I remember seeing this on my TNT2 Vanta in Unreal Tournament. That game let you pick different renderers, such as DirectX or OpenGL, and I got that grid pattern in one mode or the other. It was very faint: not black or white lines, but more like a semitransparent mesh placed over the screen. Swapping to the other renderer got rid of it. I never noticed it in any other games.

sliderider wrote:

On a side note, does anyone know if the Xabre II, the DX9 part SiS was supposed to be working on, was ever released in any form, integrated or otherwise?

Was that the same thing as the Volari?

If you are squeamish, don't prod the beach rubble.

Reply 133 of 353, by HunterZ

User metadata
Rank l33t++
DonutKing wrote:

I remember that some games in 16-bit color had a gridlike pattern over the whole screen

I remember seeing this on my TNT2 Vanta, in Unreal Tournament, that game let you pick different renderers such as DirectX or OpenGL and I got that grid pattern in one mode or the other. It was very faint, not like black or white lines but rather it was like a semitransparent mesh had been placed over the screen. Swapping to the other renderer got rid of it. Didn't ever notice it on any other games.

Yes, that sounds like what I remember!

Reply 134 of 353, by sprcorreia

User metadata
Rank Oldbie
swaaye wrote:

I ran a Rage 128 Pro, which is the same chip as Rage Fury Pro. The 3D Rage Pro is different and I don't have one (not interested in them).

Actually there are two distinct Rage 128 chips. The Rage 128 and the Rage 128 Pro. The original chip has some problems and missing features compared to the Pro edition.

http://www.xbitlabs.com/articles/video/displa … ti-furypro.html

My point exactly. I was hoping for a non-Pro review, but after reading it seems the regular 128 falls below the TNT.

Reply 135 of 353, by sliderider

User metadata
Rank l33t++
swaaye wrote:

I ran a Rage 128 Pro, which is the same chip as Rage Fury Pro. The 3D Rage Pro is different and I don't have one (not interested in them).

Actually there are two distinct Rage 128 chips. The Rage 128 and the Rage 128 Pro. The original chip has some problems and missing features compared to the Pro edition.

http://www.xbitlabs.com/articles/video/displa … ti-furypro.html

I got a bulk deal on a bunch of Rage 128 Pro Mac PCI cards for all my beige machines a while back: only $5 each, so I got like 7 or 8 of them. They are the ones that used to be available as an extra-cost option on the Blue and White Power Mac G3, with the DVD decoder card that only works under OS X. That was the last ATi chipset prior to the Radeon, which I can't afford even today. Mac cards you have to buy fast when you see them cheap, because they normally sell for way too much money and there's a lot of competition for them. Even a first-generation Radeon goes for at least $20 when you can find one, which is a lot when you have as many machines to upgrade as I do. I'll have to be satisfied with Rage 128 Pro performance until I can afford something better.

My G4 Power Mac came with an AGP Rage 128 Pro, but I didn't use it long because it couldn't handle the games I was playing. 16MB wasn't enough texture memory for EverQuest, which I was heavily into at the time. The objects would display, but they would be all white because the textures wouldn't load; other than that, the game was still playable. That's when I found a flashed Radeon 9700.

Reply 136 of 353, by HunterZ

User metadata
Rank l33t++

"flashed"?

Reply 137 of 353, by sliderider

User metadata
Rank l33t++
HunterZ wrote:

"flashed"?

Yes. Some PC video cards can have their BIOS flashed with a Mac ROM and will work in a Mac afterwards.

Reply 138 of 353, by HunterZ

User metadata
Rank l33t++

Didn't know Mac video cards needed different stuff in their BIOS, but I guess it makes sense as a way to force people to pay the Apple Tax.

Reply 139 of 353, by h-a-l-9000

User metadata
Rank DOSBox Author

A video BIOS has processor-specific code 😉
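To spell that out a bit: a PC video BIOS is a legacy option ROM, which starts with a 0x55AA signature and contains x86 machine code that the host BIOS jumps into, while a PowerPC Mac expects Open Firmware FCode instead; that's why the ROM has to be reflashed. A quick sketch of checking such a header (the helper name is made up for illustration):

```python
# Sketch: inspecting the header of a PC video BIOS image. A legacy
# option ROM begins with the bytes 0x55 0xAA; byte 2 gives its size
# in 512-byte blocks, and the body is x86 code that a PowerPC Mac
# can't execute (Macs expect Open Firmware FCode instead).

def describe_option_rom(data: bytes) -> str:
    """Report whether a ROM dump carries the legacy x86 option ROM header."""
    if len(data) < 3 or data[0] != 0x55 or data[1] != 0xAA:
        return "no legacy x86 option ROM signature"
    return f"x86 option ROM, {data[2] * 512} bytes"

# A fake 64 KB header, just to exercise the check:
print(describe_option_rom(bytes([0x55, 0xAA, 0x80])))  # x86 option ROM, 65536 bytes
```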

1+1=10