VOGONS


GeForce3 Ti 500 vs GeForce 6200


Reply 20 of 96, by AzzKickr

Rank: Member
candle_86 wrote:
AzzKickr wrote:

I already found the problem: the card was dying. After that benchmark I turned off the system, and the card is now sadly dead 🙁

Well I'd send you mine but I need a replacement first 🤣

Hahah, thanks buddy. I appreciate the gesture, but I already have my eye on an alternative 😉 (in case the oven-bake trick turns out badly)

Heresy grows from idleness ...

Reply 21 of 96, by matze79

Rank: l33t

An oven bake is no real solution; to do it properly you need to desolder the chip, reball it, and resolder it.

If you just reheat it in the oven, it only works for short periods.

Another method is to apply liquid flux to the BGA so it flows under the chip, then heat the chip with a heat gun.
Shield all other components from the heat! Aluminium foil helps greatly.
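
For reference, a heat-gun reflow still needs to follow roughly the same temperature curve as a proper rework station. A minimal sketch of a typical leaded-solder profile as a checklist; the figures below are generic rework-guide numbers, not from this post, so treat them as assumptions:

# Rough leaded (Sn63/Pb37) reflow profile. Typical figures from
# generic BGA rework guides, not measured values; eutectic leaded
# solder melts at ~183 C, and cards of this era predate RoHS.
PROFILE = [
    # (stage, start C, end C, approx. seconds)
    ("preheat", 25, 150, 90),
    ("soak", 150, 180, 90),
    ("reflow", 180, 215, 45),    # must clear the ~183 C liquidus
    ("cooldown", 215, 50, 120),  # gradual cooling avoids cracked joints
]

for stage, start, end, secs in PROFILE:
    print(f"{stage:9s} {start:3d} C -> {end:3d} C over ~{secs} s")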

https://www.retrokits.de - blog, retro projects, hdd clicker, diy soundcards etc
https://www.retroianer.de - german retro computer board

Reply 22 of 96, by candle_86

Rank: l33t

Well, my question is now irrelevant; my system will be getting a BFG FX 5900 on Monday, which I know is faster than either card 🤣

Reply 24 of 96, by candle_86

Rank: l33t
swaaye wrote:

Actually a 6200 would probably win with any pixel shader 2.0 usage. I know a 6600GT usually stomps 5900 Ultra across the board. 😁

Nah, I remember back in 2004 my FX 5900XT died and I could only afford a 6200 NV44 AGP card; I lost performance in FarCry, Call of Duty and Counter-Strike: Source.

Reply 25 of 96, by swaaye

Rank: l33t++
candle_86 wrote:

Nah, I remember back in 2004 my FX 5900XT died and I could only afford a 6200 NV44 AGP card; I lost performance in FarCry, Call of Duty and Counter-Strike: Source.

You need to try Oblivion, FEAR and Half Life 2 (in forced D3D9 mode). I think a 6200 will win. 5900 really chokes on those.

Yeah in simpler games the 5900 Ultra can pull ahead with its fillrate and bandwidth. But it is still somewhere in between a 6200 and 6600.

Reply 27 of 96, by RacoonRider

Rank: Oldbie
swaaye wrote:
candle_86 wrote:

Nah, I remember back in 2004 my FX 5900XT died and I could only afford a 6200 NV44 AGP card; I lost performance in FarCry, Call of Duty and Counter-Strike: Source.

You need to try Oblivion, FEAR and Half Life 2 (in forced D3D9 mode). I think a 6200 will win. 5900 really chokes on those.

Yeah in simpler games the 5900 Ultra can pull ahead with its fillrate and bandwidth. But it is still somewhere in between a 6200 and 6600.

Actually, Oblivion is so heavy and unoptimized that there are moments when it almost chokes my HD4870X2 on high. Playing on a 9600XT was impossible, and there were moments when an X1800GTO struggled on medium. There's no way a 6200 or any FX can handle Oblivion 😀 HL2 should work well though; at least Portal runs well on a 9800 Pro.

[offtopic] Btw, I think that Skyrim runs better than Oblivion on older systems and looks better at the same time [/offtopic]

Reply 28 of 96, by candle_86

Rank: l33t
swaaye wrote:
candle_86 wrote:

Nah, I remember back in 2004 my FX 5900XT died and I could only afford a 6200 NV44 AGP card; I lost performance in FarCry, Call of Duty and Counter-Strike: Source.

You need to try Oblivion, FEAR and Half Life 2 (in forced D3D9 mode). I think a 6200 will win. 5900 really chokes on those.

Yeah in simpler games the 5900 Ultra can pull ahead with its fillrate and bandwidth. But it is still somewhere in between a 6200 and 6600.

I've seen Oblivion choke on an X1900XTX my buddy had back in 2006, running at 1440x900 medium, so I wouldn't even try to run it on 2003 hardware if 2006 hardware can't handle it. It took the 8800GTX/GTX 280 to make Oblivion actually playable in all its glory.

And since most of the games I will be playing are DX5/DX6/DX7/OpenGL based, an FX 5900 will make a 6200 cry like a little girl.

The newest game I intend to run on this machine is Star Trek: Elite Force II.

Reply 29 of 96, by PhilsComputerLab

Rank: l33t++

I'd still love to see some benchmark results if possible 😀

Nothing beats a good old head-to-head...

YouTube, Facebook, Website

Reply 30 of 96, by candle_86

Rank: l33t

Sure, I'll run some next week when I have the Ti 500, FX 5900 and GeForce 6200.

I'll run:

3DMark2000
3DMark2001 SE
3DMark03
Quake 3 timedemo
Doom 3 timedemo

Any other requests? (The timedemo runs can be scripted; see the sketch below.)
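
For repeatable numbers, both timedemos can be launched from the command line. A minimal sketch in Python, assuming hypothetical install paths; "four" (the demo shipped with Quake 3 point release 1.32) and "demo1" (the stock Doom 3 demo) are the commonly used names, not something from this thread:

import subprocess

# Hypothetical install paths; adjust for the actual test machine.
QUAKE3 = r"C:\Games\Quake3\quake3.exe"
DOOM3 = r"C:\Games\Doom3\Doom3.exe"

# Console commands are passed with the engines' "+command" syntax;
# both games print an average FPS when the timedemo finishes.
RUNS = [
    [QUAKE3, "+set", "timedemo", "1", "+demo", "four"],
    [DOOM3, "+timedemo", "demo1"],
]

for cmd in RUNS:
    subprocess.run(cmd)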

Reply 33 of 96, by PhilsComputerLab

Rank: l33t++
candle_86 wrote:

I'll check out X2; I can also run the FarCry demo on this machine, as well as Halo.

Perfect!

YouTube, Facebook, Website

Reply 34 of 96, by swaaye

Rank: l33t++

Eh, a 5900U underperforms a 9600XT in Oblivion by quite a bit. I've run both, and I promise a 9600XT can play Oblivion acceptably at 640x480. I have video of the fun I recorded with my 5950 years ago.
https://www.youtube.com/watch?v=qzDZhb808ts

FEAR is the same story, though I should have run 640x480.
https://www.youtube.com/watch?v=0rlZb6YsJOQ

Here's Half-Life 2, D3D8 vs D3D9.
https://www.youtube.com/watch?v=KwUxxLzu-uk

STALKER in its static lighting mode (D3D8). This interested me because the game started development in the NV3x days and was initially advertised with NV3x in mind.
https://www.youtube.com/watch?v=OOneA51UdEo

The problem is that any game using Pixel Shader 2.0 is hopeless even on NV35. FEAR has a D3D8 shader option, HL2 defaults to D3D8 on NV3x, and Oblivion is entirely PS 2.0 with some extra 3.0 in places.

FarCry runs a mix of D3D8 and D3D9; the console tells all, I believe. Halo might default to D3D8 with FX, but I don't recall anymore.
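
For reference, the "forced D3D9 mode" mentioned above is done with the Source engine's -dxlevel launch option (81 selects the D3D8.1 path, 90 the D3D9 path). A minimal sketch, assuming a hypothetical install path:

import subprocess

# Hypothetical path; -dxlevel 90 forces the D3D9 path even on NV3x,
# where the engine would otherwise fall back to D3D8 (-dxlevel 81).
HL2 = r"C:\Games\Half-Life 2\hl2.exe"

subprocess.run([HL2, "-dxlevel", "90"])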

Reply 35 of 96, by Skyscraper

Rank: l33t
swaaye wrote:

Eh, a 5900U underperforms a 9600XT in Oblivion by quite a bit. I've run both, and I promise a 9600XT can play Oblivion acceptably at 640x480. I have video of the fun I recorded with my 5950 years ago.
https://www.youtube.com/watch?v=qzDZhb808ts

FEAR is the same story, though I should have run 640x480.
https://www.youtube.com/watch?v=0rlZb6YsJOQ

Here's Half-Life 2, D3D8 vs D3D9.
https://www.youtube.com/watch?v=KwUxxLzu-uk

The problem is that any game using Pixel Shader 2.0 is hopeless even on NV35. FEAR has a D3D8 shader option, HL2 defaults to D3D8 on NV3x, and Oblivion is entirely PS 2.0 with some extra 3.0 in places.

FarCry runs a mix of D3D8 and D3D9. Halo might default to D3D8 with FX, but I don't recall anymore.

I think the point is that no one wants to play games such as these on anything slower than a GeForce 7800 or Radeon X1900, so it doesn't matter if the FX5900 is slower than a donut running Pixel Shader 2.0 stuff. 640x480 or 800x600 doesn't cut it, at least not for me.

The FX5900 kicks ass running DX7 and DX8! 😀

Last edited by Skyscraper on 2016-03-17, 19:02. Edited 1 time in total.

New PC: i9 12900K @5GHz all cores @1.2v. MSI PRO Z690-A. 32GB DDR4 3600 CL14. 3070Ti.
Old PC: Dual Xeon X5690@4.6GHz, EVGA SR-2, 48GB DDR3R@2000MHz, Intel X25-M. GTX 980ti.
Older PC: K6-3+ 400@600MHz, PC-Chips M577, 256MB SDRAM, AWE64, Voodoo Banshee.

Reply 36 of 96, by swaaye

Rank: l33t++
Skyscraper wrote:

I think the point is that no one wants to play these games on anything slower than a GeForce 7800 or Radeon X1900, so it doesn't matter if the FX5900 is slower than a donut in these games. 640x480 doesn't cut it, at least not for me.

The FX5900 kicks ass running DX7 and DX8! 😀

Well, duh. The implication was the 5900U being faster than a 6200, which is not always the case. I'm also not sure it's a kick-ass D3D8 card, but hey, whatever. It's a bit faster than a Ti 4600 and that's about it.

Reply 37 of 96, by Skyscraper

Rank: l33t
swaaye wrote:

Well duh. The implication was 5900 U being faster than 6200. Not always the case.

Well it's faster when it comes to everything that matters.

New PC: i9 12900K @5GHz all cores @1.2v. MSI PRO Z690-A. 32GB DDR4 3600 CL14. 3070Ti.
Old PC: Dual Xeon X5690@4.6GHz, EVGA SR-2, 48GB DDR3R@2000MHz, Intel X25-M. GTX 980ti.
Older PC: K6-3+ 400@600MHz, PC-Chips M577, 256MB SDRAM, AWE64, Voodoo Banshee.

Reply 39 of 96, by Skyscraper

Rank: l33t
swaaye wrote:
Skyscraper wrote:

Well it's faster when it comes to everything that matters.

Oh ok. Thanks for your assistance with the obvious then.

You are welcome.

New PC: i9 12900K @5GHz all cores @1.2v. MSI PRO Z690-A. 32GB DDR4 3600 CL14. 3070Ti.
Old PC: Dual Xeon X5690@4.6GHz, EVGA SR-2, 48GB DDR3R@2000MHz, Intel X25-M. GTX 980ti.
Older PC: K6-3+ 400@600MHz, PC-Chips M577, 256MB SDRAM, AWE64, Voodoo Banshee.