VOGONS


First post, by sliderider

User metadata
Rank l33t++

Has anyone done this comparison yet? According to the specs here:

http://www.gpureview.com/show_cards.php?card1=23&card2=141

They should be identical in performance. The Radeon 8500 should also be equal to a GeForce4 Ti 4400.

http://www.gpureview.com/show_cards.php?card1=22&card2=136

Reply 2 of 11, by elfuego

User metadata
Rank Oldbie

On paper the Radeon 8500 also supports Pixel Shader 1.4, but in practice the card is much slower than any GF4 Ti. It's more in the range of a GF3 Ti 200 (not even a Ti 500). I know, because I had the card and it was pretty disappointing.

Edit: after following the link you posted, I can say it's BS. The R8500's overclocking maxes out at 290/290; any further than that and you would have to relax the memory timings. The absolute top models could maybe have reached 295/295, but that's already pushing it too far.

The GF4 Ti, on the other hand... now that's a video card generation built for overclocking. Especially the Ti 4200 and the later 4800 (the AGP 8x 4200). Just check the 3DMark scores from that time; the difference is huge.

Last edited by elfuego on 2011-11-09, 01:11. Edited 1 time in total.

Reply 3 of 11, by noshutdown

User metadata
Rank Oldbie
sliderider wrote:

Has anyone done this comparison yet? According to the specs here:

http://www.gpureview.com/show_cards.php?card1=23&card2=141

They should be identical in performance. The Radeon 8500 should also be equal to a GeForce4 Ti 4400.

http://www.gpureview.com/show_cards.php?card1=22&card2=136

Yeah, they are identical in theoretical specs, but actual performance is another thing, because:
1. The R200 architecture is less efficient than GeForce in actual games.
2. ATI's drivers sucked.
As a result, the actual performance ranks in this order:
Using drivers at release: Ti 200 < 8500 LE < GF3 < 8500 < Ti 500 << Ti 4200
Using the newest drivers, as late as 2006: GF3 < 8500 LE < Ti 500 < 8500 << Ti 4200

Reply 4 of 11, by Pippy P. Poopypants

User metadata
Rank Member

The GeForce4's multisampling also gives it much better anti-aliasing performance than the 8500's supersampling. The 8500's only real saving grace is that it doesn't take much of a performance hit when anisotropic filtering is enabled, but its technique does introduce some weird artifacts.
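For context, in Direct3D the game asks for an AA mode when it creates its device; a GF4 answers that request with multisampling, while the 8500's supersampling was usually just forced from the driver control panel. A minimal D3D9-style sketch of the request side, with placeholder names (d3d, hWnd, CreateDeviceWith4xAA) used purely for illustration:

    #include <windows.h>
    #include <d3d9.h>

    // Sketch: ask the driver for a 4x antialiased back buffer at device creation.
    IDirect3DDevice9* CreateDeviceWith4xAA(IDirect3D9* d3d, HWND hWnd)
    {
        DWORD quality = 0;
        D3DPRESENT_PARAMETERS pp = {};
        pp.Windowed               = TRUE;
        pp.hDeviceWindow          = hWnd;
        pp.SwapEffect             = D3DSWAPEFFECT_DISCARD;   // required for multisampled back buffers
        pp.BackBufferFormat       = D3DFMT_X8R8G8B8;
        pp.EnableAutoDepthStencil = TRUE;
        pp.AutoDepthStencilFormat = D3DFMT_D24S8;

        // Only enable 4x if the device actually exposes it.
        if (SUCCEEDED(d3d->CheckDeviceMultiSampleType(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                                      pp.BackBufferFormat, pp.Windowed,
                                                      D3DMULTISAMPLE_4_SAMPLES, &quality)))
        {
            pp.MultiSampleType    = D3DMULTISAMPLE_4_SAMPLES;
            pp.MultiSampleQuality = quality - 1;
        }

        IDirect3DDevice9* dev = NULL;
        d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
                          D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, &dev);
        if (dev)
            dev->SetRenderState(D3DRS_MULTISAMPLEANTIALIAS, TRUE);
        return dev;
    }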

GUIs and reviews of other random stuff

Reply 5 of 11, by sgt76

User metadata
Rank Oldbie
noshutdown wrote:

As a result, the actual performance ranks in this order:
Using drivers at release: Ti 200 < 8500 LE < GF3 < 8500 < Ti 500 << Ti 4200
Using the newest drivers, as late as 2006: GF3 < 8500 LE < Ti 500 < 8500 << Ti 4200

I have the Ti200, 4200, 4600 and Radeon 8500 and that rating looks about right.

Reply 6 of 11, by Putas

User metadata
Rank Oldbie

I bought an 8500 LE when its drivers were getting decent and the 4200 cost twice as much. I was very happy with the card and stuck with it for a long time. Too bad the pixel shader was rather slow; 1.4 was a big step up from the older profiles. Texturing performance should have been awesome, but something was holding back all that single-pass power in newer games. Six texels per pass was ideal for Doom 3, for example, but the R200 sucked at it. Not enough cache? Who knows.
Aniso helped even in such a simple implementation, and being free it finally got people to use it. AA performance was low, but supersampling is the best sampling. I had the best screenshots, and with colorfill that actually worked, everyone liked my visuals at LAN parties. The core reached 300 MHz easily.
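As a rough picture of what that single-pass texturing power means, the fixed-function way a game layers maps in one pass looks something like this in D3D9 syntax; dev, diffuseTex and lightmapTex are placeholder names:

    #include <windows.h>
    #include <d3d9.h>

    // Sketch: diffuse map * lightmap combined by the texture stage cascade,
    // so one draw call applies both textures in a single pass.
    void SetupDiffuseTimesLightmap(IDirect3DDevice9* dev,
                                   IDirect3DTexture9* diffuseTex,
                                   IDirect3DTexture9* lightmapTex)
    {
        dev->SetTexture(0, diffuseTex);
        dev->SetTextureStageState(0, D3DTSS_COLOROP,   D3DTOP_SELECTARG1);
        dev->SetTextureStageState(0, D3DTSS_COLORARG1, D3DTA_TEXTURE);

        dev->SetTexture(1, lightmapTex);
        dev->SetTextureStageState(1, D3DTSS_COLOROP,   D3DTOP_MODULATE);   // current colour * lightmap
        dev->SetTextureStageState(1, D3DTSS_COLORARG1, D3DTA_TEXTURE);
        dev->SetTextureStageState(1, D3DTSS_COLORARG2, D3DTA_CURRENT);

        dev->SetTextureStageState(2, D3DTSS_COLOROP,   D3DTOP_DISABLE);    // end of the stage chain
        // A chip with fewer texture units per pass has to split this into
        // multiple passes with frame buffer blending instead.
    }

(Doom 3 itself is OpenGL, but the idea is the same: the more textures you can sample in one pass, the fewer passes per surface.)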

Reply 7 of 11, by swaaye

User metadata
Rank l33t++

Yeah, there was some sort of internal cache issue (undersized, maybe) that limited how much R200 and friends could do per pass, regardless of what the specs said. Doom 3 also didn't work well with their hierarchical Z function, which meant wasted fillrate.

The 7000-8500's aniso was almost free, but that's because it only affected a small number of visible textures (it was angle-dependent) and it didn't work with trilinear filtering, meaning you had to accept visible mipmap transitions. NV's GF3/4 aniso was much better quality, but slow.
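The difference comes down to the mip filter. A rough D3D9-style sketch with a placeholder device pointer dev, just to show what "aniso without trilinear" means at the sampler level:

    #include <windows.h>
    #include <d3d9.h>

    // R100/R200-style "bilinear aniso": anisotropic minification, but point-sampled
    // mip selection, so the transitions between mipmap levels stay visible.
    void SetBilinearAniso(IDirect3DDevice9* dev)
    {
        dev->SetSamplerState(0, D3DSAMP_MINFILTER,     D3DTEXF_ANISOTROPIC);
        dev->SetSamplerState(0, D3DSAMP_MAGFILTER,     D3DTEXF_LINEAR);
        dev->SetSamplerState(0, D3DSAMP_MIPFILTER,     D3DTEXF_POINT);    // no blending between mip levels
        dev->SetSamplerState(0, D3DSAMP_MAXANISOTROPY, 16);
    }

    // GF3/GF4-style trilinear aniso: also blend between mip levels -- smoother, but slower.
    void SetTrilinearAniso(IDirect3DDevice9* dev)
    {
        dev->SetSamplerState(0, D3DSAMP_MINFILTER,     D3DTEXF_ANISOTROPIC);
        dev->SetSamplerState(0, D3DSAMP_MAGFILTER,     D3DTEXF_LINEAR);
        dev->SetSamplerState(0, D3DSAMP_MIPFILTER,     D3DTEXF_LINEAR);   // blend between mip levels too
        dev->SetSamplerState(0, D3DSAMP_MAXANISOTROPY, 8);
    }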

Reply 8 of 11, by Putas

User metadata
Rank Oldbie

All textures are affected, but the number of samples varies with angle. Bilinear aniso still has much better mip transitions than plain trilinear. I don't recall anyone using aniso on GeForce before FX because of the performance impact.

Reply 9 of 11, by swaaye

User metadata
Rank l33t++
Putas wrote:

I don't recall anyone using aniso on GeForce before FX because of the performance impact.

It might be more that not many people knew what AF was back then. Some people also sacrifice image quality to get more speed.

I think by far the biggest problem with the 8500 is the drivers. NV has a big edge here. The first year of the 8500's life was a total mess. Even the final drivers are troublesome, though, and the card is a bad choice if you want to play older games because there's no fog table support.

Reply 10 of 11, by sliderider

User metadata
Rank l33t++
swaaye wrote:
Putas wrote:

I don't recall anyone using aniso on GeForce before FX because of the performance impact.

It might be more that not many people knew what AF was back then. Some people also sacrifice image quality to get more speed.

I think by far the biggest problem with the 8500 is the drivers. NV has a big edge here. The first year of the 8500's life was a total mess. Even the final drivers are troublesome, though, and the card is a bad choice if you want to play older games because there's no fog table support.

Would this work with an 8500?

http://www.ehow.com/how_7524901_enable-fog-ta … -emulation.html

Update Your Drivers

1. Create a Windows restore point. Right-click the "My Computer" or "Computer" icon on your desktop and select "Properties." Select "System protection" from the menu on the left. Select the disk you wish to back up -- typically C: -- and click the "Create" button at the bottom of the window. If something goes wrong when you're updating your drivers, you now have a way to restore your computer to its former, working condition.

2. Update your video card drivers automatically. Open the Control Panel and select the "Device Manager" utility. Expand "Display adapters" and find your video card on the list. Right-click it and select "Update Driver Software," then select "Search automatically for updated driver software." If Windows finds a driver update on the Internet, it will install it automatically, then ask you to restart your computer.

3. Update your video card drivers manually if the automatic update doesn't work. Open the Start menu and run the program "dxdiag" to open the DirectX Diagnostic Tool. Click on the "Display" tab to see your video card's manufacturer and the current version of its driver. Visit the manufacturer's website and check that your driver's current version matches the latest version available for download. If it does not, download and install the update from the website. Restart your computer.

Enable Fog Table Emulation

1. Download and install RivaTuner (see Resources). Launch the program; it will create a database of your current registry settings.

2. Click on the arrow next to "Customize..." in the "Driver settings" area of the "Main" tab. Choose "DirectDraw and Direct3D Settings." The "Direct3D tweaks" window will open.

3. Click on the "Compatibility" tab. Check the box for "Enable table fog emulation" under "Compatibility Settings." Direct3D's fog table emulation is now enabled.

Reply 11 of 11, by swaaye

User metadata
Rank l33t++

From what I gather, table fog was mainly something 3dfx wanted in D3D. NVIDIA emulated it correctly, but ATI couldn't be bothered to. They did have unofficial partial support: you can enable it in older drivers with registry tweaks. I used Radeon Tweaker on a Radeon DDR. I don't remember if it works properly, though (I doubt it).

I wouldn't bother with this anymore. Just stick in a Voodoo 1-5 or any NV card and fog will just work, among other niceties. I'm not sure when NV dropped support for it, but I know the FX cards support it.
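For reference, the "table fog" in question boils down to a couple of Direct3D render states that older games set, and which a card or driver without fog table support simply doesn't honour. A rough D3D9-style sketch with a placeholder device pointer dev, showing the per-pixel path next to the per-vertex fallback:

    #include <windows.h>
    #include <d3d9.h>

    // The path old games expect: per-pixel (table) fog.
    void EnableTableFog(IDirect3DDevice9* dev)
    {
        float fogStart = 50.0f, fogEnd = 400.0f;
        dev->SetRenderState(D3DRS_FOGENABLE, TRUE);
        dev->SetRenderState(D3DRS_FOGCOLOR,  D3DCOLOR_XRGB(128, 128, 160));
        dev->SetRenderState(D3DRS_FOGTABLEMODE, D3DFOG_LINEAR);
        dev->SetRenderState(D3DRS_FOGSTART, *(DWORD*)&fogStart);   // float render states go in as raw DWORD bits
        dev->SetRenderState(D3DRS_FOGEND,   *(DWORD*)&fogEnd);
    }

    // The per-vertex alternative, computed at the vertices instead of per pixel.
    void EnableVertexFog(IDirect3DDevice9* dev)
    {
        float fogStart = 50.0f, fogEnd = 400.0f;
        dev->SetRenderState(D3DRS_FOGENABLE, TRUE);
        dev->SetRenderState(D3DRS_FOGCOLOR,  D3DCOLOR_XRGB(128, 128, 160));
        dev->SetRenderState(D3DRS_FOGTABLEMODE,  D3DFOG_NONE);
        dev->SetRenderState(D3DRS_FOGVERTEXMODE, D3DFOG_LINEAR);
        dev->SetRenderState(D3DRS_FOGSTART, *(DWORD*)&fogStart);
        dev->SetRenderState(D3DRS_FOGEND,   *(DWORD*)&fogEnd);
    }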