VOGONS


Pentium 4 (478) need ideas?


Reply 20 of 42, by obobskivich

Rank l33t
FaSMaN wrote:

Funny thing you should say that, I just started installing Windows 2000 as I want to get the system to dual-boot WinME and Win2k. The thing is, according to the ATI/AMD website the latest driver for it is 6.10 for Windows 2000. If I try to install the latest version, 10.3, I get an incompatibility warning stating that I do not have a 64-bit version of Windows, yet if I look at the executable it's clearly for both 32-bit and 64-bit?

Any ideas? Or should I just go with the Omega drivers?

10.3 is too new. Legacy 10.2 is the final build for R300 even in Vista - make sure you're downloading the proper variant though (go through their little "driver finder" thing and it'll steer you right). 9x is limited to 6.something which means fog is likely broken (it was officially fixed in 7.10).

Reply 21 of 42, by Gamecollector

Rank Oldbie

The last Win9x Catalyst version is 6.2.
And there are two standard ATi problems: table fog and subtract blending.

Asus P4P800 SE/Pentium4 3.2E/2 Gb DDR400B,
Radeon HD3850 Agp (Sapphire), Catalyst 14.4 (XpProSp3).
Voodoo2 12 MB SLI, Win2k drivers 1.02.00 (XpProSp3).

Reply 22 of 42, by FaSMaN

Rank Member

Got 10.2 working. It's worth mentioning that AMD has the driver links wrong: if you select XP 32-bit as the OS, it gives you the 64-bit driver, which doesn't work 😒

Nonetheless I am more than happy with the result; it's been benching 3DMark2001 SE since Friday, in both the Windows ME and Windows 2000 installs :3

Really stable and super fast!! (who needs an SSD!!)

Reply 23 of 42, by RacoonRider

Rank Oldbie

The only downside of the 9600Pro might be signal quality. My 9800Pro shows a noticeably sharper picture than my Sapphire 9600Pro. Is yours OK?

Reply 24 of 42, by FaSMaN

Rank Member

Picture quality wise mine seems to be pretty good; I have it connected to a 15" CRT at the moment over VGA.

Are you using DVI or VGA?

Reply 25 of 42, by Scali

Rank l33t
FaSMaN wrote:

Are you using DVI or VGA?

With DVI there should be no image quality issues, as there is no RAMDAC involved at all.
I thought the RAMDAC in the 9x00-series was integrated anyway, so there would not be a whole lot that an OEM can mess up.

Mind you, I have one ATi card (I don't recall if it's my 9600XT or my 8500) that has one VGA output and one DVI-I output.
The VGA output is poor (noisy/flickery), but using a DVI-to-VGA adapter gives me an excellent image. So that would be worth a try perhaps?

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 26 of 42, by obobskivich

Rank l33t
Scali wrote:

With DVI there should be no image quality issues, as there is no RAMDAC involved at all.

Unfortunately this isn't universally true of cards from this era - TMDS transmitters from this era, especially integrated ones, were notorious for failing to meet spec. The results may not be noticeable in certain configurations, but can include signal dropouts and other connectivity issues. As far as I'm aware it wasn't until the GeForce 8 era that DVI/HDMI output from graphics cards was consistently "flawless" - this isn't to say that all models before that had problems. It was mostly the entry and mid-range products that had issues, while high-end and professional-grade adapters relied on external TMDS transmitters that generally met or exceeded spec. If you need to ensure flawless DVI support from an older machine, go with a professional card like a Quadro or Wildcat, as they'll have quality external transmitters and are generally validated for every resolution on their mode list.

I thought the RAMDAC in the 9x00-series was integrated anyway, so there would not be a whole lot that an OEM can mess up.

RAMDACs themselves have been integrated for a while (well before R300), but that doesn't mean signal quality is guaranteed. RAMDAC is only one part of the output chain (very near the beginning); the output filter can still be done poorly, which was a common issue on GeForce 2-4 cards, and those also have integrated RAMDAC. ATi cards tended to not have as many issues reported because ATi maintained much tighter control over board designs and manufacture during this era, but it's still entirely possible for an OEM to totally cook the VGA output by going cheap on the output filter. Whether or not the RAMDAC itself is good becomes irrelevant at that point - it could be the best signal in the world, but if it's being mangled by a bad filter the output is still garbage.

Generally the same rules from above apply here too - higher end cards will have better output sections.

Reply 27 of 42, by Scali

Rank l33t
obobskivich wrote:

Unfortunately this isn't universally true of cards from this era - TMDS transmitters from this era, especially integrated ones, were notorious for failing to meet spec. The results may not be noticeable in certain configurations, but can include signal dropouts and other connectivity issues.

I don't consider those image quality issues though (as in washed-out details, bad colours etc).
That's more a case of "it works" or "it doesn't work" (which may be intermittent, if you happen to be exactly at the dropoff point).
You may be able to solve such issues with shorter and/or better quality cables, to reduce signal degradation.

obobskivich wrote:

RAMDACs themselves have been integrated for a while (well before R300), but that doesn't mean signal quality is guaranteed. RAMDAC is only one part of the output chain (very near the beginning); the output filter can still be done poorly, which was a common issue on GeForce 2-4 cards, and those also have integrated RAMDAC.

Ah, well I meant the whole output stuff obviously, including the filters 😀
Which I believe is all included on R300 (because ATi indeed wanted better control over quality, given the mess that OEMs made with GeForce cards... I actually modded my GeForce2 by removing one stage of the filter, to keep it from cutting off at too low a frequency and causing visible blur at higher resolutions/refresh rates).
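
To put rough numbers on the bandwidth involved: the pixel clock the output filter has to pass is roughly h_total × v_total × refresh rate, so the required bandwidth climbs quickly with resolution and refresh rate. A quick back-of-the-envelope sketch in Python (the mode totals are approximate VESA timing figures, just for illustration):

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    # Pixel clock in MHz from the *total* timings (active pixels plus blanking).
    return h_total * v_total * refresh_hz / 1e6

print(pixel_clock_mhz(800, 525, 60))     # 640x480 @ 60 Hz   -> ~25.2 MHz
print(pixel_clock_mhz(1728, 1072, 85))   # 1280x1024 @ 85 Hz -> ~157.5 MHz

A filter whose cut-off is sized for the first mode will visibly soften the second, which is exactly the blur people see at high resolutions/refresh rates.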

On that one ATi card that has issues, I think the noise/flicker problem may be either because of interference issues, or because of a poor power supply to that section of the card.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 28 of 42, by RacoonRider

Rank Oldbie
FaSMaN wrote:

Picture quality wise mine seems to be pretty good; I have it connected to a 15" CRT at the moment over VGA.

Are you using DVI or VGA?

VGA, 19" 1280x1024 screen, blurring is noticeable, as opposed to 9800Pro or HD4870X2. DVI to VGA adapter uses the same analogue RGB signals a VGA. I doubt that could help.

Reply 29 of 42, by obobskivich

Rank l33t
Scali wrote:

I don't consider those image quality issues though (as in washed-out details, bad colours etc).
That's more a case of "it works" or "it doesn't work" (which may be intermittent, if you happen to be exactly at the dropoff point).
You may be able to solve such issues with shorter and/or better quality cables, to reduce signal degradation.

It's digital v analog signal quality in my mind. You're right in that "bad digital" usually just means signal drops though - still a "signal quality" problem imho. My whole point was that making the assumption that DVI = never problems is unfortunately not always accurate; especially with early DVI cards. 😊

I know there's an ExtremeTech article out there where they tested a bunch of early DVI cards for compliance and found that a number of cheaper models failed.

Ah, well I meant the whole output stuff obviously, including the filters 😀
Which I believe is all included on R300 (because ATi indeed wanted better control over quality, given the mess that OEMs made with GeForce cards... I actually modded my GeForce2 by removing one stage of the filter, to keep it from cutting off at too low a frequency and causing visible blur at higher resolutions/refresh rates).

The output filtering is not part of the GPU itself - it's still on the board. ATi never had the widespread problems with IQ that nVidia had to deal with from OEMs, though. That said, I wouldn't expect very cheaply built cards to be paragons of quality, especially more recent models that may not even be officially licensed.

RacoonRider wrote:

VGA, 19" 1280x1024 screen, blurring is noticeable, as opposed to 9800Pro or HD4870X2. DVI to VGA adapter uses the same analogue RGB signals a VGA. I doubt that could help.

Does lowering the resolution, refresh rate, etc have any effect?

Reply 30 of 42, by Scali

Rank l33t
obobskivich wrote:

It's digital v analog signal quality in my mind. You're right in that "bad digital" usually just means signal drops though

Well, that's not quite what I mean...
Bad digital signal usually means you just don't get anything at all. There's only a very narrow range where the signal is good enough that a connection is established for long enough to transport some visual information and then drop out again, pick back up etc.

obobskivich wrote:

- still a "signal quality" problem imho.

Signal quality, yes. Image quality, no. Namely, anything that is transmitted correctly, will appear just fine. So image quality is perfect for any part of any image you manage to transport. Anything that isn't, is just missing altogether.

obobskivich wrote:

My whole point was that making the assumption that DVI = never problems is unfortunately not always accurate; especially with early DVI cards. 😊

Well, I agree that it doesn't mean there aren't any problems.
I just meant that *if* you get an image with DVI, it's 'perfect', none of the usual analog problems.

obobskivich wrote:

The output filtering is not part of the GPU itself - it's still on the board.

I'm quite sure I read around that time that they started integrating the whole output filter in the GPU, to keep OEMs from messing it up with sub-par components. But I can't find a reference to any of that now. Oh well...

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 31 of 42, by obobskivich

Rank l33t
Scali wrote:

Well, that's not quite what I mean...
Bad digital signal usually means you just don't get anything at all. There's only a very narrow range where the signal is good enough that a connection is established for long enough to transport some visual information and then drop out again, pick back up etc.

Signal quality, yes. Image quality, no. Namely, anything that is transmitted correctly, will appear just fine. So image quality is perfect for any part of any image you manage to transport. Anything that isn't, is just missing altogether.

Well, I agree that it doesn't mean there aren't any problems.
I just meant that *if* you get an image with DVI, it's 'perfect', none of the usual analog problems.

We are in agreement and/or now talking past each other. 😀 🤣

I'm quite sure I read around that time that they started integrating the whole output filter in the GPU, to keep OEMs from messing it up with sub-par components. But I can't find a reference to any of that now. Oh well...

I did some looking as well and couldn't find anything conclusive in print either. You can still see filter components on the boards, though. 😊

Reply 32 of 42, by NJRoadfan

Rank Oldbie

My "Built By ATI" All-in-Wonder 9600XT has blurry analog output, very noticable at 1280x1024 at 72Hz. The "Built By ATI" All-in-Wonder X800XL in the other machine is slightly better, but it still can't come close to the sharpness of my Matrox G400. ATI's integrated DACs were always kinda crappy.

Reply 33 of 42, by Scali

Rank l33t
NJRoadfan wrote:

My "Built By ATI" All-in-Wonder 9600XT has blurry analog output, very noticable at 1280x1024 at 72Hz. The "Built By ATI" All-in-Wonder X800XL in the other machine is slightly better, but it still can't come close to the sharpness of my Matrox G400. ATI's integrated DACs were always kinda crappy.

Have you tried the DVI-I output with a DVI-to-VGA adapter? That gives me better image quality on some of my ATi cards from that era.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 34 of 42, by RacoonRider

Rank Oldbie
obobskivich wrote:
RacoonRider wrote:

VGA, 19" 1280x1024 screen, blurring is noticeable, as opposed to 9800Pro or HD4870X2. DVI to VGA adapter uses the same analogue RGB signals a VGA. I doubt that could help.

Does lowering the resolution, refresh rate, etc have any effect?

No idea, you know how LCD screens are, lowering the resolution gives you so much blur that the output quality does not really matter 😀

Reply 35 of 42, by obobskivich

Rank l33t
RacoonRider wrote:

No idea, you know how LCD screens are, lowering the resolution gives you so much blur that the output quality does not really matter 😀

Oh, right. 🤣

Reply 36 of 42, by Stermy57

Rank Newbie
Skyscraper wrote:

Use the socket 478 system for Windows 98 SE. It's great for games from 1997 - 2002.
The GeForce Ti4200 is perfect; add the Voodoo 2 for games that work better with Glide.

Use one of the socket 775 systems as a Windows XP box for games from 2003 - 2006.
For the XP box you will need a better video card. Buy a GeForce 7800/7900 GTX or a Radeon X1900/1950 XT(X).

Yes, I agree with you!
I love retro gaming and retro benchmarks! I have a lot of AGP GPUs (around 80-100 different models).
If I were you, I would use the GeForce4 Ti 4200,
because there are a few games that have problems starting if you use a GPU with Shader Model 2.0.
One example is Legacy of Kain: Soul Reaver! It will only run in software mode, without any 3D options.
I tested this situation some months ago!
Another point is driver efficiency: the Detonator drivers are better than the Catalyst 3.x or 4.x versions.
Use the GeForce4 Ti 4200 with Detonator 44.03!

Reply 37 of 42, by FaSMaN

Rank Member

Just thought I'd quickly pop in and update everyone on the project:

While I was testing system stability (running the computer non-stop for an entire week) I neglected to test the onboard audio, and when I finally got time to hook up some speakers it turned out the Realtek ALC650E is just terrible at handling audio: it had tons of pops and clicks in Black and White 2 and the 3DMark2001 demo. It would seem the buffer just isn't working properly, and the audio would degrade to complete garbage after 2 minutes or so.

This might be hardware related, or it could very well just be a drivers + Win2k problem, but I feel that the Realtek ALC650E just isn't up to the job, so I decided to look for another sound card. In the meantime I stuck in my prized Yamaha XG YMF724F; while that card works perfectly well, its amazing MIDI capabilities and sound really belong in a dedicated P1 build, so I will be looking for something like a Sound Blaster Live! for this build...

I decided that while the 3rd-party Zalman HSF I added to the 9600 Pro years ago was adequate as a passive cooler (even Sapphire released some 9800 cards with this exact cooler), I would rather add a fan, so a normal 80mm Cooler Master (it's really silent for an 80mm) was bolted onto the HSF with two screws, similar to the add-on fan Zalman released for this exact HSF... just much beefier...

I also added a nice 3Com Ethernet card; I had a generic Realtek card that worked wonders in Windows ME, but it sucked in Win2k.

[Attached photos of the build: 20150225_190309.jpg, 20150225_190321.jpg, 20150225_191839.jpg, 20150225_192719.jpg]

Reply 38 of 42, by obobskivich

Rank l33t

For a P4 with an R300-era card, an Audigy seems a much more appropriate companion.

Reply 39 of 42, by FaSMaN

Rank Member

While I do have a spare Audigy 2 zx, I just feel that it's overkill for the project; most of the time I will be using headphones on the computer, and I won't need to take advantage of its advanced processing as the CPU is plenty fast already 😀