VOGONS


Ageia PhysX dedicated card- worth getting?


Reply 60 of 63, by Joseph_Joestar

User metadata
Rank l33t++
RandomStranger wrote on 2026-01-21, 20:39:

By the time Ageia released their product, we were a year or two past Half-Life 2, a game that built its gameplay around interaction with its physics engine more than any PhysX title outside of tech demos, and it ran fine on a single-core CPU and a budget graphics card.

HL2 physics were pretty basic. A few years later, you had games like Ghost Recon: Advanced Warfighter, Mirror's Edge and the Batman Arkham series, all of which used PhysX to create some effects that are impressive even today.

Last year, there was a big uproar because Nvidia dropped 32-bit PhysX support from their RTX 5000 series, which made some of those games crawl even on a Ryzen 9800X3D when the CPU had to simulate the effects. They eventually released a fix or wrapper that allowed this to work on their latest GPUs. Before that fix, the difference was huge, as shown in this video.

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Core 2 Duo E8600 / Foxconn P35AX-S / X800 / Audigy2 ZS
PC#4: i5-3570K / MSI Z77A-G43 / GTX 980Ti / X-Fi Titanium

Reply 61 of 63, by RandomStranger

User metadata
Rank Oldbie
Joseph_Joestar wrote on 2026-01-21, 20:51:

A few years later, you had games like Ghost Recon: Advanced Warfighter, Mirror's Edge and the Batman Arkham series, all of which used PhysX to create some effects that are impressive even today.

Visual effects that had nothing to do with gameplay. They couldn't have, because PhysX being locked to Nvidia meant gameplay-critical effects would have been completely unavailable to a large chunk of the gaming market. In Half-Life 2, physics was a huge part of the gameplay. And there was Crysis too, with some nice, though not too complex, physics effects without PhysX.

Joseph_Joestar wrote on 2026-01-21, 20:51:

Last year, there was a big uproar because Nvidia dropped 32-bit PhysX support from their RTX 5000 series, which made some of those games crawl even on a Ryzen 9800X3D when the CPU had to simulate the effects. They eventually released a fix or wrapper that allowed this to work on their latest GPUs. Before that fix, the difference was huge, as shown in this video.

And that's why it failed, and why almost none of Nvidia's proprietary tech has caught on long term. They lock everyone else out of it even when the hardware would be perfectly capable of running the feature, so for anyone not on an Nvidia card, the feature is either unavailable or comes with a huge, unreasonable performance hit.


Reply 62 of 63, by Joseph_Joestar

User metadata
Rank l33t++
RandomStranger wrote on 2026-01-22, 05:03:

Visual effects that had nothing to do with gameplay. They couldn't have, because PhysX being locked to Nvidia meant gameplay-critical effects would have been completely unavailable to a large chunk of the gaming market.

Yup, it was just eye candy. Very pretty eye candy, but still nothing with any real impact on the gameplay. Aside from not working on AMD GPUs, PhysX also didn't work on consoles, and most games of that era were multi-platform. So it was mostly relegated to providing extra visuals on high-end PCs.

RandomStranger wrote on 2026-01-22, 05:03:

And that's why it failed, and why almost none of Nvidia's proprietary tech has caught on long term. They lock everyone else out of it even when the hardware would be perfectly capable of running the feature, so for anyone not on an Nvidia card, the feature is either unavailable or comes with a huge, unreasonable performance hit.

No argument from me there. If Nvidia had put some effort into making PhysX work outside of their ecosystem, we could have had some truly amazing games, with a hardware-accelerated physics system that players could interact with in a meaningful way. Instead, they half-assed the CPU-only implementation and made it pretty much unusable for anything but pretty visuals.

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Core 2 Duo E8600 / Foxconn P35AX-S / X800 / Audigy2 ZS
PC#4: i5-3570K / MSI Z77A-G43 / GTX 980Ti / X-Fi Titanium

Reply 63 of 63, by eddman

User metadata
Rank Oldbie

You should distinguish between the GPU PhysX modules and the main CPU PhysX engine. What failed was the former, because it was vendor-locked.

The latter is used in thousands of games as the main physics engine, and is very capable. It's built into Unity and UE4.

UE5 has moved away from PhysX now, but that's mainly because Nvidia refocused PhysX from games toward other applications. IINM, PhysX 5 mostly(?) runs on the GPU, so it's not really suited as a general game engine anyway.