VOGONS


Athlon XP performance ratings


Reply 20 of 24, by appiah4

Rank: l33t++
bakemono wrote on 2024-05-06, 14:28:

The performance ratings became more inflated over time. The 1900+ (which I had back in the day) ran at 1600MHz. The 2500+ (which I had later) ran at 1833MHz. How does a mere 233MHz increase translate into 600 marketing points? Increasing cache and bus speeds mitigates memory latency but it doesn't make a whole new CPU.

It's very similar to what happened with Cyrix 6x86 processors. Early on, a 133MHz part was given a PR166 label, then later a 285MHz part was given a far more generous PR400 label.

The 2500+ Barton had almost twice the cache (128KB L1 + 512KB L2) of the Palomino XP 1900+ (128KB L1 + 256KB L2), a roughly 50% higher transistor count (about 54 million vs. 37.5 million), and ran a 166MHz FSB as opposed to 133MHz.

I really don't think your argument holds merit. I have no idea what happened to distort people's views of the Pentium 4 era AMD/Intel competition; maybe the Core generation completely brainwashed people, but Intel was resorting to stupid shit like the Pentium 4 EE just to compete with top-of-the-line Bartons well before the Athlon 64 arrived. And then the Athlon 64 completely destroyed Intel's Pentium 4 lineup.
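To put numbers on bakemono's complaint, here is a minimal sketch in Python (the clock speeds and ratings are the ones quoted above; the comparison is illustrative and is not AMD's actual rating formula):

    # Clock gain vs. model-number gain for the two parts discussed above.
    # These are just the ratios under debate, not AMD's rating methodology.
    old_clock, old_rating = 1600, 1900   # Athlon XP 1900+ (Palomino)
    new_clock, new_rating = 1833, 2500   # Athlon XP 2500+ (Barton)

    clock_gain = new_clock / old_clock     # ~1.146, i.e. ~15% more clock
    rating_gain = new_rating / old_rating  # ~1.316, i.e. ~32% higher rating

    print(f"clock gain:  x{clock_gain:.3f}")
    print(f"rating gain: x{rating_gain:.3f}")

    # The gap between the two ratios is what the doubled L2 cache and the
    # 166MHz (vs. 133MHz) FSB would have to make up in real workloads.

Whether they actually did make it up is what the benchmark charts below were posted to show.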

For reference:

[Attached charts: 312979_biz-winstone.gif (Business Winstone), 312977_ut-botmatch.gif (UT botmatch), 312976_3dmark.gif (3DMark); all CC-BY-4.0]

Retronautics: A digital gallery of my retro computers, hardware and projects.

Reply 21 of 24, by Fish3r

Rank: Newbie
appiah4 wrote on 2024-05-06, 14:41:
The 2500+ Barton had almost twice the cache (128KB L1 + 512KB L2) of the Palomino XP 1900+ (128KB L1 + 256KB L2), a roughly 50% […]

I know this is hardly scientific, but COD2 works as a pretty extreme example and is what prompted me to create the thread in the first place. I don't recall it running perfectly back when I was using these specs, so I was surprised to see such a huge jump in performance.

On the lowest settings possible with DX7, at 1024x768 with vsync on, this is the kind of FPS difference I'm seeing:

Attachments: p4.jpg, athlonxp.jpg (in-game FPS screenshots; both public domain)

Reply 22 of 24, by agent_x007

Rank: Oldbie

With that kind of difference, I'm wondering whether the GART driver is working right on the Athlon XP platform (or whether the chipset driver is OK)...
What platform (CPU + chipset + RAM settings + GPU) was this run on?

Reply 23 of 24, by momaka

Rank: Member

It also matters what software/applications the chips were tested in. Some stuff simply did better on the P4 CPUs, while other stuff did better on the AXP.

Generally speaking, the "shorter" pipeline of the AXP (and Pentium III / Tualatin) was better suited to gaming, because games tend to use "simpler" instructions that often don't require much complex manipulation. In contrast, the P4 / Netburst pipeline is much longer, meaning the same instruction that takes only a few cycles to complete on the AXP would often take a few more (up to twice as many) on the P4. Thus, despite P4 chips running at a higher clock speed, they also usually took more cycles to complete a (simple) instruction, and this is what often held back the P4. It wasn't all a loss for the P4, though: one area where the Netburst architecture always did better was video and audio encoding/decoding, since the instructions in that type of load usually require more complex manipulation.
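To make the pipeline argument concrete, here is a toy throughput model in Python. The CPI (cycles per instruction) values below are invented purely for illustration; real figures vary enormously by workload:

    # Toy model: effective throughput = clock / CPI.
    # All CPI numbers here are made up for illustration only.
    def throughput_mips(clock_mhz, cpi):
        """Effective million-instructions-per-second for a given CPI."""
        return clock_mhz / cpi

    # Branchy "game-like" code: the long Netburst pipeline pays a bigger
    # penalty per mispredicted branch, so assume a worse CPI for the P4.
    axp_gaming = throughput_mips(2200, cpi=1.2)   # e.g. a Barton 3200+
    p4_gaming  = throughput_mips(3200, cpi=2.0)   # e.g. a Northwood 3.2GHz

    # Streaming "media-like" code: few branches, the long pipeline stays
    # full, so assume a better CPI for the P4.
    axp_media = throughput_mips(2200, cpi=1.1)
    p4_media  = throughput_mips(3200, cpi=1.0)

    print(f"gaming-like load: AXP {axp_gaming:.0f} vs P4 {p4_gaming:.0f}")
    print(f"media-like load:  AXP {axp_media:.0f} vs P4 {p4_media:.0f}")

With these made-up numbers, the lower-clocked AXP comes out ahead on the branchy load (roughly 1833 vs. 1600) while the P4 pulls well ahead on the streaming one (3200 vs. 2000), which matches the pattern described above.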

The above is actually one of the reasons why I was still able to use all of my P4 PCs for online browsing much later than the AXP platform. A hyper-threaded P4 did even better, especially on YouTube. I was able to comfortably use a P4 Prescott for watching YT at 720p as late as 2016-2017 without any stutter. In contrast, I could not achieve the same with my AXP 2500+ OC'ed to 3200+... not even close. The CPU usage on the XP 3200+ was pretty much constantly pegged at 100% with any HD video, while my P4 Prescott HT was hardly going over 50% on either thread, or over 60-70% load for the entire CPU, and that's with a weaker IGP too.
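For anyone who wants to reproduce that kind of per-thread load comparison, here is a minimal sketch using the third-party psutil package (an assumption on my part; it needs a Python build that actually runs on the OS in question):

    # Log per-logical-CPU load once a second, e.g. while a video plays.
    # Requires the third-party psutil package (pip install psutil).
    import psutil

    for _ in range(30):  # sample for roughly 30 seconds
        per_cpu = psutil.cpu_percent(interval=1.0, percpu=True)
        total = sum(per_cpu) / len(per_cpu)
        cores = "  ".join(f"{p:5.1f}%" for p in per_cpu)
        print(f"total {total:5.1f}% | per logical CPU: {cores}")

On a hyper-threaded P4 this reports two logical CPUs, so you can see whether one thread is pegged while the other idles.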

So really, the two architectures were entirely different animals.

Reply 24 of 24, by Fish3r

Rank: Newbie
agent_x007 wrote on 2024-05-06, 17:24:

With that kind of difference, I'm wondering whether the GART driver is working right on the Athlon XP platform (or whether the chipset driver is OK)...
What platform (CPU + chipset + RAM settings + GPU) was this run on?

Setup is: Athlon XP 3000+, MSI K7N2, 1GB dual-channel RAM, GeForce FX 5900 XT.

After reinstalling the nForce chipset drivers, resetting the BIOS settings, and disabling EAX (this might have been the culprit actually; I didn't have it enabled on the Pentium machine, and toggling it on noticeably degrades performance 🤦), they're behaving pretty similarly. One of those changes, or a combination of them, fixed it, and the Athlon system is pulling close to 60 FPS in the same area I posted earlier. Now I'm wondering if the bad FPS I remember was due to me trying to run the game in DX9 mode on an FX.