Putas wrote on 2023-11-20, 09:14:
W.x. wrote on 2023-11-19, 16:20: Out of six Radeon 9600/9600 Pro and 9550 cards, I cannot get reference clocks (300 MHz memory) on any of them. I even tried one with 3.6 ns Samsung chips, but 280 MHz was the maximum.
You do know that, out of those, only the 9600 Pro has a 300 MHz reference memory clock?
Yes, I know. I was also speaking about the rated speed of the memory chips. I took a Gigabyte Radeon 9550 with 4 ns Hynix memory and thought I'd get at least a 250 MHz overclock. To my surprise, even with 4 ns chips, the maximum stable overclock was 230 MHz (so 20 MHz under the rated speed). My experience with ATI cards is much worse; this kind of thing always happened on the ATI side, not Nvidia's (I admit it exists there too, just in smaller quantity). That's my personal experience after hundreds of graphics cards. The reason, I think, is that on the ATI side the manufacturing was cheaped out much more. The same thing happened to me with the Radeon 7500, 9100, and 8500/8500LE: it's a struggle to get them even to reference speeds, or, when they ship underclocked (which is very often), to at least claw that back with an overclock. As I said, I'm not the only one with this problem on Radeon cards of the 2000-2005 era; after that, I think the situation gradually improved.

Of course, part of the problem is that the cheap manufacturers and their versions (PC Partner/Sapphire, PowerColor) were much more prominent. Asus cards are generally better in this regard, but their price was always a premium, sometimes even overpriced. Many people bought Sapphire cards for the good price, but they paid for it with this problem.
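For anyone following along: a DRAM chip's speed grade in nanoseconds is its minimum cycle time, so the rated clock is just 1000 divided by the grade. A quick sketch of the arithmetic behind the grades mentioned in this thread (plain math, no vendor data):

```python
# A DRAM chip's ns grade is its minimum cycle time,
# so its rated clock in MHz is roughly 1000 / ns.
def rated_mhz(ns):
    return 1000 / ns

for grade in (5.0, 4.0, 3.6, 3.3):
    print(f"{grade} ns chips -> rated ~{rated_mhz(grade):.0f} MHz")
# 5.0 ns -> ~200 MHz, 4.0 ns -> ~250 MHz,
# 3.6 ns -> ~278 MHz, 3.3 ns -> ~303 MHz
```

This is why 4 ns chips "should" do 250 MHz, and why even 3.6 ns parts are only rated for about 278 MHz — reaching 300 MHz with them is already an overclock beyond spec.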
For example, the Radeon 9600 series has a very powerful core compared to NV31/NV34 (FX 5600/5200). But if you choke it with slow memory (5 ns chips at 200 MHz), the fast core is completely bottlenecked by memory bandwidth, and you won't get much more out of it than a slower core (FX 5200) with the same memory (200 MHz, 128-bit, so the same bandwidth).
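The bandwidth argument is simple arithmetic: a 128-bit DDR bus moves 16 bytes twice per clock, so the memory clock sets the ceiling regardless of how fast the core is. A minimal sketch:

```python
# Peak bandwidth of a DDR memory bus:
# (bus_width / 8) bytes * 2 transfers per clock * clock rate
def ddr_bandwidth_gbs(clock_mhz, bus_width_bits=128):
    return bus_width_bits / 8 * 2 * clock_mhz / 1000  # GB/s (decimal)

print(ddr_bandwidth_gbs(200))  # 6.4 GB/s -- 5 ns memory at 200 MHz
print(ddr_bandwidth_gbs(300))  # 9.6 GB/s -- reference 9600 Pro memory clock
```

At 200 MHz a Radeon 9600 and an FX 5200 sit behind the exact same 6.4 GB/s pipe, which is the point being made above.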
Sorry, but this is my experience with ATI cards; it repeats again and again. For example, the Sapphire Radeon 9600 Pro Advantage, the most widespread 9600 Pro, has a stock memory clock of 222 MHz (instead of 300 MHz). And the memory chips on it are often terrible: you buy one with 4 ns or 3.6 ns chips hoping you'll FINALLY reach the reference 300 MHz, but my four attempts with 4 ns and 3.6 ns memory were all unsuccessful (the maximum was always around 270-285 MHz). I still don't own a Radeon 9600 Pro able to reach the reference 300 MHz, only a 9600 XT (the black one with 3.3 ns memory, which was much more expensive to get). It must have been the same back then: black Radeon 9600 Pro/XT boards with 3.3 ns memory had to cost more.

The problem here was inexperienced users. Not only did they not know how to tweak anything through drivers and utilities, or how to bypass the overclock protection on the Radeon 9200/9600, they also didn't know which card to buy. The Sapphire Radeon 9600 Pro Advantage looked good: the box looked good, the card looked good, so a normal casual user just bought it, hoping to get a big boost over the competing Nvidia card (an FX 5200/5600, for example). In reality, the majority of users got 222 MHz instead of 300 MHz, so maybe a 25-30% drop in performance from the memory alone, and then 6 to 12 months of random irritating problems before ATI finally fixed the drivers.
This is why I don't see the real historical FX 5200/5600 experience during 2003 as being so much worse.
I agree, though, that the FX 5200 "64-bit" problem was equally severe, and I suspect the "FX 5200, worst graphics card" reputation originates from it. When you read historical discussions, you always find most people bashing it or posting suspiciously low results, while a minority of users can't understand why everyone is bashing it so much, because their experience was better (they often post benchmark results with almost double the FPS on their machines). Only a minority in those discussions alerted the others that they probably had a 64-bit, underclocked version. Overclocking awareness among casual users was very low in 2003 (I know it from my own experience; back then I didn't overclock graphics cards, and the same 64-bit problem happened to me with a Gainward MX440-8x. Gainward was a terrible manufacturer in this respect, because they sold 64-bit, low-end versions under cool names like "PowerPack Pro", while only the "Golden Sample" out of all of them was a full 128-bit card).
After some years, awareness of "I've got a 64-bit, underclocked version" became more widespread, and more and more users started to understand the issue and avoid those cards.
But the 2003 experience of the FX 5200 stuck even after that. All the hate remained.
All I am saying is that in a good configuration, as Nvidia planned it (128-bit memory), and ideally with at least 4 ns Hynix, Samsung, or Infineon memory, it was a good budget card that doesn't lag behind Nvidia's other low-end solutions; on the contrary, it's the second best (after the GeForce 6200).
What is worse, even experts like PixelPipes on YouTube got caught: when he first tested the FX 5200 for a historical review, he got a 64-bit version and, not marking it as cut down, assumed it was a normal FX 5200. The results were terrible, of course, and many users in the comment section pulled out "FX 5200, worst graphics card in history" again. After he was alerted in the comments to get a 128-bit version, he made a corrected video, which of course got far fewer views. There he admitted that, with roughly a 30% boost in performance, it starts to be an acceptable entry-level card of its generation. And the FX 5200 has quite good overclocking potential, as it originates from the NV31 core, which was meant for the FX 5600 at a 325 MHz core clock.
So you can usually overclock it to 325 MHz, and even beyond; the only other thing you need for a really good boost on the FX 5200 is good memory.
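Assuming the common FX 5200 reference clocks of 250 MHz core / 200 MHz memory (an assumption on my part; exact stock clocks varied between boards), the gain from such an overclock is easy to quantify:

```python
# Relative gain of an overclock over stock: (oc - stock) / stock
stock_core, stock_mem = 250, 200  # assumed FX 5200 reference clocks (MHz)
oc_core, oc_mem = 325, 250        # the overclock described above

print(f"core: +{(oc_core - stock_core) / stock_core:.0%}")  # +30%
print(f"mem:  +{(oc_mem - stock_mem) / stock_mem:.0%}")     # +25%
```

That is roughly the same ~30% delta PixelPipes saw between the 64-bit and 128-bit versions, which is why the card's reputation swings so much on configuration.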
And here is the main thing: in my experience, if you take 10 random Nvidia cards, the memory chips are simply much more often good than if you blindly pick ATI Radeon cards. You also don't have the overclock-protection problem on the FX 5200. Overall, a good budget card for early 2003, if you know this stuff and overclock it.
A 325 MHz core with 240-250 MHz memory is almost always realistic on the FX 5200, and often 270-300 MHz memory. At those speeds, benchmark it with period-correct drivers from early to mid 2003, and you'll see that making the FX 5200 out to be the worst graphics card of all time is not justified. Who screwed it up were the cheap manufacturers. All in all, I don't see the FX 5200 as such a bad low-end card that it deserves notes like "it shouldn't even exist, that's how bad it is". I'm asking: why?
If the MX440 was never considered the worst card of all, one that shouldn't even have existed, then the FX 5200 cannot be either.