VOGONS


Reply 40 of 50, by The Serpent Rider

Rank: l33t++

Gaming cards are basically just "charity" for Nvidia now. AMD made a huge deal with OpenAI and is moving in the same direction.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 41 of 50, by Joseph_Joestar

Rank: l33t++

Looks like AMD walked this back, per their latest blog post.

This is not the end of support for RDNA 1 and RDNA 2. Your Radeon RX 5000 and RX 6000 series GPUs will continue to receive:

  • Game support for new releases
  • Stability and game optimizations
  • Security and bug fixes

Also, Gamers Nexus summarized the entire situation in a hilarious video.

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Core 2 Duo E8600 / Foxconn P35AX-S / X800 / Audigy2 ZS
PC#4: i5-3570K / MSI Z77A-G43 / GTX 980Ti / X-Fi Titanium

Reply 42 of 50, by Trashbytes

Rank: Oldbie
sunkindly wrote on 2025-11-02, 08:15:

I can't disagree with you on those points but I do think it's a bit of a circle. Lazy devs are going to lean more into AI and Nvidia seems like they'll be very happy to give them the tools to do it.

Problem is, the tools simply cannot overcome a total lack of optimization. Just look at Monster Hunter Wilds, Borderlands 4, or Dragon's Dogma 2. DLSS/FG does nothing for these games because the underlying engine has not had any work done on it to run well without them. If the engine cannot hit 60 FPS at 1440p or 2160p, then DLSS cannot fix that, and combined with FG it will actually make things worse, because the AI then has to start making shit up to build the extra frames. If you have fewer than 60 real frames to work with, trying to get to 100 FPS is going to be a huge struggle, along with the huge amount of input latency involved.

Frame gen, simply put, is horrible beyond 1 or 2 extra frames due to the extra input latency and the rubbish fake frames, and if DLSS has to drop to 720p internally just to reach 1440p or 2160p @ 60 FPS, it's going to make the game look like rubbish too.
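To put rough numbers on the latency and resolution points above (a back-of-envelope sketch only; actual pipeline delay depends on the specific frame-gen implementation):

```python
# Back-of-envelope numbers for the frame-gen and upscaling argument.
# Assumption: 2x frame generation interpolates between two rendered
# frames, so it must hold the newest frame back by at least one
# rendered-frame time before it can display anything.

def frame_time_ms(fps: float) -> float:
    """Time to render one frame, in milliseconds."""
    return 1000.0 / fps

# At 45 rendered FPS, one frame takes ~22 ms, so interpolation alone
# adds at least that much input latency on top of the render pipeline.
base_fps = 45
added_latency = frame_time_ms(base_fps)  # ~22.2 ms

# Upscaling 720p -> 1440p means rendering only a quarter of the pixels.
pixels_720p = 1280 * 720
pixels_1440p = 2560 * 1440
render_fraction = pixels_720p / pixels_1440p  # 0.25

print(f"added latency at {base_fps} FPS: {added_latency:.1f} ms")
print(f"720p renders {render_fraction:.0%} of 1440p's pixels")
```

At 90+ rendered FPS the same held-back frame costs only ~11 ms, which is why frame gen feels so much worse when the base frame rate is already low.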

It's only going to get worse if Epic doesn't get off their asses and start teaching devs how to use their engine correctly. At least they realise there is a problem, so they can work on fixing it.

Reply 43 of 50, by Trashbytes

Rank: Oldbie
bitzu101 wrote on 2025-11-02, 18:55:

I think making changes to the silicon is probably the most expensive part. Every new design needs a process of testing and trialing, and you do lose quite a few bucks doing so, hence companies like TSMC will charge a lot for new designs. The lithography is very expensive, and the yields for new designs are quite poor as well.

I was reading somewhere that TSMC charges Nvidia around 30k per wafer. That is a lot. With new designs and lower-nm processes, the cost has skyrocketed. It's becoming very difficult to make new chips at that level.

Needless to say, in this case the hardware has nothing to do with it. It's a purely commercial decision from AMD, nothing more.

AMD is better off with RDNA 4 and UDNA yields due to using chiplets rather than a monolithic die like nVidia, who has terrible yields with their massive Blackwell dies.

Reply 44 of 50, by Hoping

Rank: Oldbie

And isn't the issue of having to optimize drivers for each game, of which there are millions, instead of developers optimizing games for each API, of which there are far fewer, one of the reasons why drivers take up more and more space?
For me, it's the world turned upside down; two or three GPU manufacturers have to optimize drivers for each game, and thousands of developers don't bother to optimize anything or offer finished games, so that everyone thinks it's normal for a game to have lots of updates.
Before the Steam era, this didn't happen. If a game came out with a bug that broke it, it got slammed in the reviews. Now, nothing. Now they expect it to be fixed with an update.
I haven't bought a newly released game in many years because of this, and even if the game has been out for a while, if it's poorly optimized, I'll only buy it if it's heavily discounted, sometimes just to see how badly it's made.
Because we all know that there are games that never performed well even on much newer hardware and needed modifications from the community, which is the most shameful thing.
I think it's fine if AMD decides to stop optimizing for individual games. I'd prefer they focus on optimizing the overall performance of their GPUs, and if a game is poorly made, then so be it; let them sell it to someone else.

Reply 45 of 50, by Dolenc

Rank: Member

Games are a lot more complex now. A lot. A lot lot.

You had a fixed rendering pipeline before.

Now you can program parts of it using shaders. That's where most of the "problems" come from.

They are heavily optimized! It's just so much stuff that it's kinda impossible to have everything spot on. And while there are only 2-3 different brands making GPUs, there is a bunch of different hardware and generational changes.

A vendor can also help optimize a game before release, but even then people start screaming "nVidia/AMD title"... You will never make everyone happy :p

Reply 46 of 50, by BEEN_Nath_58

Rank: l33t

Can't wait to see the RDNA1/2 vs RDNA3/4 vs GCN driver bug comparison.

previously known as Discrete_BOB_058

Reply 47 of 50, by bakemono

Rank: Oldbie
Dolenc wrote on 2025-11-05, 16:22:

Games are a lot more complex now. A lot. A lot lot.

You had a fixed rendering pipeline before.

Now you can program parts of it using shaders. That's where most of the "problems" come from.

Yeah. To boil it down a bit more, a "driver" is now a whole compiler suite, with multiple high-level shader languages as inputs and multiple hardware architectures as supported targets.

It's an interesting trade-off. If GPU vendors had standardized on a low-level shader instruction set back in 2003 or so, their drivers could be a lot simpler. But on the other hand, every GPU would need to have backwards compatibility for the same instruction set baked into the hardware.
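That trade-off is the classic "common intermediate representation" argument from compilers: with M shader languages and N GPU architectures, a shared IR turns M×N full compilers into M front-ends plus N back-ends. A toy illustration (the language and architecture names here are just examples, not a list of what any driver actually supports):

```python
# Toy count of translator components needed with and without a shared
# intermediate representation (IR). Names are illustrative only.
frontends = ["HLSL", "GLSL", "MSL"]                   # shader languages (M)
backends = ["arch_a", "arch_b", "arch_c", "arch_d"]   # GPU ISAs (N)

# Without a common IR: one full compiler per (language, architecture) pair.
without_ir = len(frontends) * len(backends)   # M * N = 12

# With a common IR (e.g. something SPIR-V-like): one front-end per
# language, one back-end per architecture.
with_ir = len(frontends) + len(backends)      # M + N = 7

print(f"without IR: {without_ir} compilers; with IR: {with_ir} components")
```

The gap grows fast as either list gets longer, which is part of why modern drivers that juggle many languages and many hardware generations have ballooned in size.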

GBAJAM 2024 submission on itch: https://90soft90.itch.io/wreckage

Reply 48 of 50, by marxveix

Rank: Oldbie

AMD is just checking the waters.

AMD already announced they are going to continue driver optimizations for another 10+ years. All they did was shift the older GPUs into a separate category so they can optimize each generation much better and more efficiently.

Best ATi Rage3 drivers for 3DCIF / Direct3D / OpenGL / DVD : ATi RagePro drivers and software
30+MiniGL / OpenGL Win 9x dll files for all ATi Rage3 cards : Re: ATi RagePro OpenGL files

Reply 49 of 50, by Hoping

Rank: Oldbie

I don't give a damn that games are more complex nowadays. If they're not willing to optimize the games and deliver products with the proper quality, then they should change jobs and do something less complex.
Look, maybe games are more complex nowadays, but I suppose there are also better tools available today.
I don't know much about programming, but I do know that honesty is a virtue, and if they use the excuse that games are more complex to justify doing a poor job, then I don't think that's honest.

Reply 50 of 50, by UCyborg

Rank: Oldbie

Where I work, things are constantly buggy and implemented half-baked. Perhaps it's the way of modern programmers. If they can even be called that.

Arthur Schopenhauer wrote:

A man can be himself only so long as he is alone; and if he does not love solitude, he will not love freedom; for it is only when he is alone that he is really free.