VOGONS


Reply 100 of 120, by leileilol

Rank: l33t++

That was for their then-shadered-up 1.09 menu; in the final release they didn't actually have a difference for the Rage Pro (all cards get the static backgrounds). They only force white smoke for the rocket/grenade trails instead, to work around ugly/non-functional alpha modulation in the Rage Pro drivers, and it's white because I believe they forgot to update the texture.

apsosig.png
long live PCem

Reply 101 of 120, by subhuman@xgtx

Rank: Oldbie
leileilol wrote:

especially this bullshit one!!! Stinks of 3dfx fan revisionist history.

Refusing to set Trilinear on a Voodoo.

😉

Edit: Aren't VSA-100 cards (besides specific revisions of Quantum3D AAlchemy cards) incapable of applying trilinear filtering to a scene whenever multitexturing is used?

7fbns0.png

tbh9k2-6.png

Reply 102 of 120, by Tetrium

Rank: l33t++
kithylin wrote:
leileilol wrote:

Yep. Let's look real close into a zoomed 32x32 32-bit texture (swizzled to rgba4444 don't tell anyone) for The Truth(tm)

(also full disclosure this engine is id tech3 and there's no overbrights enabled here therefore it's as accurate as accurate can be)

v3filter.png

Therefore by fact i'm right and everyone using these cards are idoits for using low quality filter product!!!!!!!!!!

Except of course no one can see the actual sky texture zoomed in that far. So your point is academic at best. In normal game play you view it from a distance. Which is all that matters. From normal view distance the sky looks perfectly fine, over on the voodoo side at least, in the above screenshot. Sorry but no one cares how your sky textures look when you zoom in under a microscope. All anyone actually cares about is how it looks when actually playing the game.

This is actually not true. The technical details are kinda interesting. Solving a "complex" problem in a very simple way is pretty much genius. I think it's actually good to dig deeper and figure out how stuff really works, and yes, there are people who care about this stuff without being academics, so to say 😜
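For reference, the "swizzled to rgba4444" detail leileilol mentions boils down to each 8-bit channel keeping only its top 4 bits, which is where the banding in that zoomed sky comes from. A minimal Python sketch (function name is illustrative, not from any real driver):

```python
def to_rgba4444(r, g, b, a):
    """Quantize 8-bit RGBA channels to 4 bits each (as in an RGBA4444
    texture format), then expand back to 8 bits for display.
    Expansion replicates the high nibble so 0x0 -> 0x00 and 0xF -> 0xFF."""
    def q(c):
        hi = c >> 4          # keep only the top 4 bits
        return hi << 4 | hi  # replicate the nibble: 4-bit -> 8-bit
    return q(r), q(g), q(b), q(a)

# A smooth 256-step gradient collapses to 16 levels: visible banding.
gradient = [to_rgba4444(c, c, c, 255)[0] for c in range(256)]
print(sorted(set(gradient)))  # only 16 distinct values survive
```

This is why a subtle sky gradient shows stair-stepping once the texture is stored at 4 bits per channel, regardless of how good the filtering afterwards is.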

What's missing in your collections?
My retro rigs (old topic)
Interesting Vogons threads (links to Vogonswiki)
Report spammers here!

Reply 103 of 120, by swaaye

Rank: l33t++

Regarding those UT shots: remember that Epic's D3D renderer has bugs that vary across GPUs. It was an endless work in progress, with regressions at every step. You might want to try UTGLR, which can run even on a GeForce 256 in Windows 98, though with Win98 you need UTGLR version 3.4.

And with respect to DXT1 uglies on GeForce 256 through GeForce 3, there is some info here:
https://web.archive.org/web/20040225122914/ht … blaster-3.shtml
GeForce 4 finally addressed this problem. I'm not sure if GF4 MX is included in that.

Oh, and the Radeon 8500/9000/9100/9200's AF only works with bilinear filtering and has extreme angle dependence, as mentioned. That isn't exactly ideal, and those cards' filtering can look pretty terrible compared to the R300 generation. On the other hand, the Radeon 9500/9700 isn't capable of speed-optimized trilinear filtering (aka "brilinear"), unlike the Radeon 9600/9800, so mipmap transitions might actually be smoother on the 9500/9700.
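The "brilinear" shortcut mentioned above can be sketched as a clamped mip blend: instead of always blending the two nearest mip levels by the fractional LOD, the hardware uses a single mip level for most of each mip band and only blends near the transition. A rough Python model (the `opt` parameter and its value are illustrative, not any vendor's actual setting):

```python
def trilinear_weight(lod_frac):
    """Full trilinear: the blend weight between mip N and mip N+1
    is simply the fractional part of the LOD."""
    return lod_frac

def brilinear_weight(lod_frac, opt=0.25):
    """'Brilinear' approximation: snap to pure bilinear within `opt`
    of each mip level and only blend across the middle of the band.
    This skips the second mip's samples for most pixels, saving fillrate,
    at the cost of more visible mip transitions."""
    if lod_frac < opt:
        return 0.0              # use mip N only
    if lod_frac > 1.0 - opt:
        return 1.0              # use mip N+1 only
    # remap the middle band to a full 0..1 blend
    return (lod_frac - opt) / (1.0 - 2.0 * opt)
```

The wider the snap region, the faster but blockier the result, which matches why forced full trilinear on the 9500/9700 can look smoother than the optimized path on later parts.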

Reply 104 of 120, by The Serpent Rider

Rank: l33t

remember that Epic's D3D renderer

Epic's D3D has nothing to do with it; I've used more modern D3D8 and D3D10 renderers. Unreal Tournament's lightmaps are very low-res, and any kind of filtering "optimisation" will be clearly visible.

It would probably be interesting to test a TNT2 card with and without linear filtering in Quake 3, to measure the performance loss and then compare it to the loss on a GeForce 256. Perhaps that would show why Nvidia dropped quality so drastically.

Get up, come on get down with the sickness
Open up your hate, and let it flow into me

Reply 106 of 120, by Reputator

Rank: Member
Tetrium wrote:
kithylin wrote:
leileilol wrote:

Yep. Let's look real close into a zoomed 32x32 32-bit texture (swizzled to rgba4444 don't tell anyone) for The Truth(tm)

(also full disclosure this engine is id tech3 and there's no overbrights enabled here therefore it's as accurate as accurate can be)

v3filter.png

Therefore by fact i'm right and everyone using these cards are idoits for using low quality filter product!!!!!!!!!!

Except of course no one can see the actual sky texture zoomed in that far. So your point is academic at best. In normal game play you view it from a distance. Which is all that matters. From normal view distance the sky looks perfectly fine, over on the voodoo side at least, in the above screenshot. Sorry but no one cares how your sky textures look when you zoom in under a microscope. All anyone actually cares about is how it looks when actually playing the game.

This is actually not true. The technical details are kinda interesting. Solving a "complex" problem in a very simple way is pretty much genius. I think it's actually good to dig deeper and figure out how stuff really works, and yes, there are people who care about this stuff without being academics, so to say 😜

I agree that digging deeper is interesting. It's something I enjoy doing as well. But there are some in this thread who are characterizing one pattern of banding as terrible compared to another pattern of equally apparent banding. I just don't buy it. It's a rather sensationalized way of looking at things, needlessly politicizing the issue.

https://www.youtube.com/c/PixelPipes
Graphics Card Database

Reply 107 of 120, by swaaye

Rank: l33t++

Yeah, I didn't realize this is the thread from a while back with Phil's VGA and DVI captures, which were quite contradictory to what is otherwise being presented. I somewhat want to put together some hardware to look at it myself. Time and effort, though.

Reply 108 of 120, by firage

Rank: Oldbie

Phil's captures were of a different level with a different texture, though. And people should be careful to differentiate between the comparisons made here and the shots someone posted from an old texture-compression comparison. There is a difference, but really it's pretty subtle: by the time you can usually see it on a texture, the texture looks like crap anyway from being stretched out too much.

My big-red-switch 486

Reply 109 of 120, by The Serpent Rider

Rank: l33t
swaaye wrote:

Have you tried a GeForce 4 and/or FX?

Yes, quality never improved until G80, most likely because Microsoft set foot in the house with strict DirectX 10 rules. Games on the PS3 also show the same pattern.

Reputator wrote:

But there are some in this thread that are characterizing one pattern of banding as terrible compared to another pattern of equally apparent banding.

Banding itself is a completely different issue and will always be present. The purpose of linear filtering is to remove the squares and make textures look more natural, but with the mentioned GeForce cards you get the same squares. The IQ of modern GPUs has also returned to the VSA-100 (Voodoo 4/5) level.
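To make the "remove squares" point concrete: bilinear filtering weights the four nearest texels by proximity instead of snapping to one of them, so texel edges blend smoothly. A minimal sketch for a grayscale texture (helper name and list-of-rows representation are illustrative):

```python
def bilinear_sample(tex, u, v):
    """Sample a 2D grayscale texture (list of rows, values 0..1) at
    normalized coordinates (u, v) with bilinear filtering: blend the
    four nearest texels instead of snapping to the closest one,
    which is what produces the blocky 'squares'."""
    h, w = len(tex), len(tex[0])
    x, y = u * (w - 1), v * (h - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    top = tex[y0][x0] * (1 - fx) + tex[y0][x1] * fx
    bot = tex[y1][x0] * (1 - fx) + tex[y1][x1] * fx
    return top * (1 - fy) + bot * fy

tex = [[0.0, 1.0],
       [0.0, 1.0]]
print(bilinear_sample(tex, 0.5, 0.5))  # 0.5: halfway between texels
```

A card that advertises bilinear but snaps or coarsely quantizes the blend weights reintroduces exactly those texel-edge squares, which is the complaint about the GeForce pattern here.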

Get up, come on get down with the sickness
Open up your hate, and let it flow into me

Reply 110 of 120, by The Serpent Rider

Rank: l33t

Found a related comparison in my archives, which I did 7 years ago: Radeon X1950XTX vs GeForce 8800 Ultra in Bioshock, both under Windows XP. The quality of shading was much improved with DX10 hardware.

Attachments (file license: fair use/fair dealing exception):
Bioshock Radeon X1950XTX 1.png (417.09 KiB)
Bioshock Radeon X1950XTX 2.png (518.04 KiB)
Bioshock GeForce 8800 Ultra 1.png (849.81 KiB)
Bioshock GeForce 8800 Ultra 2.png (789.82 KiB)

Get up, come on get down with the sickness
Open up your hate, and let it flow into me

Reply 111 of 120, by swaaye

Rank: l33t++

One thing I noticed with the X1950 is that in FEAR there are light-shaft-like beams near some windows in the beginning area, and they are very blocky on that card, like an unfiltered texture. Unfortunately Imageshack lost my old screenshots. Fellix on B3D told me that's because those cards don't support percentage-closer filtering.
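Percentage-closer filtering, as mentioned above, is the trick of comparing-then-averaging shadow-map depths rather than averaging-then-comparing; without it, shadow edges come out as hard blocks like the ones described. A minimal sketch under those assumptions (function and parameter names are illustrative):

```python
def pcf_shadow(depth_map, x, y, receiver_depth, radius=1):
    """Percentage-closer filtering, minimal form: compare the
    receiver's depth against several nearby shadow-map texels and
    average the binary lit/shadowed results. Returns a soft 0..1
    shadow factor instead of a hard blocky edge."""
    h, w = len(depth_map), len(depth_map[0])
    hits, total = 0.0, 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            sx = min(max(x + dx, 0), w - 1)   # clamp to map edges
            sy = min(max(y + dy, 0), h - 1)
            # 1.0 if this texel says the receiver is lit, else 0.0
            hits += 1.0 if receiver_depth <= depth_map[sy][sx] else 0.0
            total += 1
    return hits / total
```

Averaging the raw depths first and comparing once (what plain texture filtering would do) gives a meaningless threshold, which is why hardware lacking this path falls back to blocky unfiltered comparisons.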

Yeah, it is interesting how much improved with D3D10. No wonder the transistor budgets went crazy implementing all of that. Though it was unfortunate that they threw out some legacy support too.

Reply 112 of 120, by auron

Rank: Member

About DXT1 again, to really put this one to rest: I keep reading here that it was "fixed" on GF4, but what actually seems to be the case is that GF4 still uses 16-bit processing and just adds dithering on top, which in some instances is supposed to look worse than what GF1-3 were doing. One thread on it: https://forum.beyond3d.com/threads/dxt-issue- … gf4-series.611/

I stumbled upon this issue while testing a GF2 GTS with UT2004, where I immediately noticed the ugly green streaks on floors and many other textures, even after making sure that 16-bit textures were disabled. These shots illustrate it really well: GF2/GF3 look just the same, while GFFX is an improvement, yet the Kyro II seems to have even fewer of these green streaks. R200 looks very similar to GFFX color-wise; just the anisotropic filtering is rougher. So the Kyro II seems to be the winner here to me, but when exactly was this fixed on newer cards? GF6 or 8, maybe?
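For context on where those artifacts come from: an opaque DXT1 block stores two RGB565 endpoint colors, and the other two palette entries are 2/3-1/3 interpolations of them. The reported GF1-3 problem is computing those interpolants at 16-bit precision; the green channel's 6 bits make any quantization there stand out as green tinting. A Python sketch of the palette derivation (the low-precision path here is an illustrative model via bit masking, not the actual hardware math):

```python
def rgb565_to_888(c):
    """Expand a packed RGB565 color to 8-bit channels via bit replication."""
    r = (c >> 11) & 0x1F
    g = (c >> 5) & 0x3F
    b = c & 0x1F
    return (r << 3 | r >> 2, g << 2 | g >> 4, b << 3 | b >> 2)

def dxt1_palette(c0, c1, bits=8):
    """Build the 4-color palette of an opaque DXT1 block: the two
    endpoints plus two 2/3-1/3 blends. With bits=8 the blends keep
    full 8-bit precision; bits<8 masks off low bits to mimic hardware
    that interpolated at reduced precision (illustrative only)."""
    e0, e1 = rgb565_to_888(c0), rgb565_to_888(c1)
    def lerp23(a, b):
        return tuple((2 * x + y) // 3 for x, y in zip(a, b))
    p2, p3 = lerp23(e0, e1), lerp23(e1, e0)
    if bits < 8:
        mask = 0xFF << (8 - bits) & 0xFF
        p2 = tuple(v & mask for v in p2)
        p3 = tuple(v & mask for v in p3)
    return [e0, e1, p2, p3]
```

Since the blended palette entries cover most texels in smooth areas, truncating them shifts whole 4x4 blocks at once, which is why the error shows up as blocky streaks rather than fine noise.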

Another interesting (and possibly related) fact: while I could disable texture compression in this game on the GF2 GTS (fixing the ugly color streaks, but killing performance), doing the same on my GTX 1060 results in plain white textures on many things, so it seems only compressed textures will run on modern hardware. However, the driver has the caps for DXTC disabled, so what exactly is it doing here? I found this paper from around the GF8 time; maybe this is related?

Reply 113 of 120, by silikone

Rank: Member

So it took three whole generations of graphics hardware to address this? I wonder what made DXT1 so difficult to implement properly compared to DXT3.

Do not refrain from refusing to stop hindering yourself from the opposite of watching nothing other than that which is by no means porn.

Reply 114 of 120, by swaaye

Rank: l33t++

I'm sure they knew all about the issue while designing the GF256, but it apparently wasn't a trivial thing to fix, and they had very aggressive product cycles to meet as they continued to annihilate the competition. DXTC was perhaps not a good fit for that iteration of their architecture, and so it had compromises. Or perhaps making it better would have added too much complexity and reduced clock speeds, an unacceptable trade-off; see ATI's feature-rific Radeon that couldn't keep up. These GPUs would have been built by several design teams working in parallel, so the compromised design stuck around for a while.

Reply 115 of 120, by silikone

Rank: Member
The Serpent Rider wrote:

Found some related comparison in my archives, which I did 7 years ago. Radeon X1950XTX vs GeForce 8800 Ultra in Bioshock, both under Windows XP. Quality of shading was much improved with DX10 hardware.

I have a 7900 GTX lying around somewhere, a perfect match for the X1950XTX. I'll try to get it running and dump a pile of screenshots (and framerates) if there is any demand.

Do not refrain from refusing to stop hindering yourself from the opposite of watching nothing other than that which is by no means porn.

Reply 116 of 120, by The Serpent Rider

Rank: l33t

if there is any demand.

There is little to no demand for such a thing, because nobody cares. Which is quite ironic compared to something like the arguing over OPL3 quality.

Get up, come on get down with the sickness
Open up your hate, and let it flow into me

Reply 117 of 120, by appiah4

Rank: l33t++

I find it quite relevant, to be honest; mipmap levels, texture filtering and color precision were really hot topics at the time, whether anyone remembers or not, and I sure as hell think it's much more important than how accurate an FM synthesis implementation is to the original OPL3 (which I think is ass compared to other implementations like Crystal and ESS, but I digress).

Retronautics: A digital gallery of my retro computers, hardware and projects.

Reply 118 of 120, by Socket3

Rank: Member
kithylin wrote on 2017-07-23, 04:27:
The Serpent Rider wrote:

Meanwhile I've tested Radeon 9800 Pro at last. Filtering pattern is identical to Radeon 7500, so it's safe to say that filtering is identical from Radeon 256 and all the way to Radeon X1950XTX. DX10+ cards have better quality of filtering overall as mentioned before.
Here's macroshot of Radeon pattern:
Q3 wall macroshot Radeon 9800.png
Q3 wall macroshot Radeon 9800 edge.png
And here's GeForce pattern (now with edge detection to compare)
Q3 wall macroshot GeForce 3.png
Q3 wall macroshot GeForce 3 edge.png
Radeon pattern is not perfect, but at least trying to resemble smooth pattern of older cards. While GeForce pattern is just a horrible mess of squares.
In conclusion? Probably should stick to Radeon cards for Windows 98 retrogaming and avoid any old GeForce like a plague. Or go full retro and stick to Riva TNT/3Dfx/Matrox G400.

But nobody cares anyway 😈

Except the performance... for Win98SE gaming there's nothing on the AMD side that matches the performance of the 6800 Ultra that's 100% native Win98SE compatible. At least not that I'm aware of.

Uhhhmm... the X850XT says hi. And look! He brought his Win9x driver with him!

Sorry, I've been watching too many kids' shows with my son. Catalyst 5.6 to 6.2 for Windows 9x supports the X8xx series; the X1xxx series cards are the ones lacking Win9x support. I use an X850XT / Athlon 64 4000+ in my "ultimate Win9x build" and an X800XT + P4 631 in another 9x PC.

Reply 119 of 120, by appiah4

Rank: l33t++

I have a PCIe X800XT PE in my LGA775 Win98 PC. It more than matches a 6800 Ultra.

Retronautics: A digital gallery of my retro computers, hardware and projects.