VOGONS


Reply 20 of 40, by Ozzuneoj

User metadata
Rank l33t
The Serpent Rider wrote on 2025-10-31, 18:47:
Ozzuneoj wrote on 2025-10-31, 18:29:

Compare that to RDNA 2...

Radeon RX 6950 XT = May 10, 2022 for $1,099 USD
Radeon RX 6750 XT = May 10, 2022 for $549 USD
Radeon RX 6750 GRE = Oct 18, 2023 for $269 - $289 USD
Radeon RX 6650 XT = May 10, 2022 for $399 USD

RDNA2 premiered in October 2020. Marketing refreshes with zero changes to the silicon don't really count. That's how it has always been for both Nvidia and AMD.

GCN 1-3 were supported for 8-11 years, and GCN 4 was getting normal driver support until 2023

GCN 1.0 through 3.0 support is peculiar, because they were made on the same lithography (28nm) with a lot of minor architecture tweaks. That's one of the reasons GCN 1.0 was kept alive for so long, and why they were all dropped simultaneously. As for the immortal Polaris, it held on for so long due to the mining GPU craze.

I don't think that really changes anything I said... and believe it or not, one person's opinion on a forum doesn't determine what "counts" for all of AMD's customers. Go look at what their former/potential customers and the tech media are saying about this. Also, check the Steam hardware survey:
https://store.steampowered.com/hwsurvey/Steam … elcome-to-Steam

Before the announcement, how many AMD GPUs out of the roughly 100 listed here were getting full driver support? Only 18 (out of 29 total from AMD... mostly older IGPs) ... And how many will be getting full driver support now? Err... 5. And one of those is an IGP. And none of them are RX 9000 series. Basically, everyone in the hardware survey with an AMD GPU is using one based on an architecture from 2022 (if that's what we should be going by) or older. AMD is telling all of these people that their days of getting performance improvements are numbered, despite the "AMD fine wine" meme that has helped them stay relevant for the past 15 years.

Even if we assume that people are okay with their 2020 GPUs being put on the back burner, for AMD to stop providing full driver support for a $400 GPU from May of 2022, let alone ones for $550 and $1100, is a ludicrously bad move for a company in their position. They are barely a blip on the radar in the professional/business areas where Nvidia is dominating, so giving home users more reasons to avoid them for their next upgrade is beyond bone-headed.

EDIT: I should add... if they really needed to focus on the current generation (which they should), they should simply do that without making some huge announcement about not wanting to continue improving the GPUs that most of their customers are still using. It sounds like in the follow-up posted earlier they have "re-worded" things a bit and are suggesting they might roll out performance improvements as the market requires. Right. 🤣. Okay, well, if they'd worded it that way from the beginning and actually intend to do that, then absolutely no one would care. These companies just do really dumb things sometimes.

Last edited by Ozzuneoj on 2025-10-31, 23:50. Edited 1 time in total.

Now for some blitting from the back buffer.

Reply 21 of 40, by Trashbytes

User metadata
Rank Oldbie

AMD reversed it ... RDNA 1 and 2 will keep getting driver support for both critical updates and game updates.

Took less than 24 hours, though technically they are still in a form of maintenance mode and game updates will be based on market demand.

Still better than a legacy mode I guess.

Reply 22 of 40, by leileilol

User metadata
Rank l33t++

the glsl compiler is still probably busted for opengl2 since 2022 though. Surprisingly not a lot of noise was ever made about that, as it did affect newer games down the line (e.g. Skindeep)

apsosig.png
long live PCem

Reply 23 of 40, by Ozzuneoj

User metadata
Rank l33t
Trashbytes wrote on 2025-10-31, 23:47:

AMD reversed it ... RDNA 1 and 2 will keep getting driver support for both critical updates and game updates.

Took less than 24 hours, though technically they are still in a form of maintenance mode and game updates will be based on market demand.

Still better than a legacy mode I guess.

I don't know if I'd say they reversed it, but they are definitely in damage control mode.

People will still complain about this and assume that AMD is not supporting them unless they keep seeing performance improvements listed in the driver release notes. Whether AMD were actually planning to have anyone dedicated to improving the experience on RDNA 2 or not, you can bet that people will be looking for it now.

Just more foot shooting from the PR department.

Last edited by Ozzuneoj on 2025-11-01, 00:09. Edited 1 time in total.

Now for some blitting from the back buffer.

Reply 24 of 40, by Trashbytes

User metadata
Rank Oldbie
leileilol wrote on Yesterday, 00:07:

the glsl compiler is still probably busted for opengl2 since 2022 though. Surprisingly not a lot of noise was ever made about that

IIRC AMD hates OpenGL and is still in the "Vulkan is its replacement" camp, so go use that path.

Reply 25 of 40, by Trashbytes

User metadata
Rank Oldbie
Ozzuneoj wrote on Yesterday, 00:08:
Trashbytes wrote on 2025-10-31, 23:47:

AMD reversed it ... RDNA 1 and 2 will keep getting driver support for both critical updates and game updates.

Took less than 24 hours, though technically they are still in a form of maintenance mode and game updates will be based on market demand.

Still better than a legacy mode I guess.

I don't know if I'd say they reversed it, but they are definitely in damage control mode.

People will still complain about this and assume that AMD is not supporting them unless they keep seeing performance improvements listed in the driver release notes. Whether they were actually planning to have anyone dedicated to improving the experience on RDNA 2 or not, you can bet that people will be looking for it now.

Just more foot shooting from the PR department.

True .. but I watched a video from one of the better tech guys who said even older Nvidia GPUs don't see much performance improvement from newer drivers after the first couple of years, once the tech has matured, and he demonstrated this with a ton of data to back it up. (There are exceptions, but the majority holds true here.)

I feel that RDNA 1 was fine to push to legacy; the tech doesn't have full hardware support for DX12 Ultimate features. But RDNA 2 ... that should never have been moved to legacy or maintenance mode; it's still being sold new in stores and has full support for all DX12 Ultimate features.

I feel that this has come down because UDNA is getting closer to full release and they don't want to be dedicating resources to the older RDNA tech.

Reply 26 of 40, by The Serpent Rider

User metadata
Rank l33t++
Ozzuneoj wrote on 2025-10-31, 23:42:

and believe it or not, one person's opinion on a forum doesn't determine what "counts" for all of AMD's customers

That's not my opinion. That's how both Nvidia and AMD have been putting their old stuff to EOL for ages: in big chunks, regardless of the actual release date. AMD just handles it more poorly.

Trashbytes wrote on Yesterday, 00:09:

IIRC AMD hates OpenGL and is still in the "Vulkan is its replacement" camp, so go use that path.

AMD has primarily treated OpenGL as a professional API, so everything related to it is usually reserved for FirePro drivers. "SMP support in OpenGL for plebeians? Not on my watch!" ATi did the same thing, but they had to "compromise" due to id Software's shadow looming over them.

Trashbytes wrote on Yesterday, 00:15:

I feel that this has come down because UDNA is getting closer to full release and they don't want to be dedicating resources to the older RDNA tech.

That's one of the reasons to skip the RX 9000 series: it's probably going to be another TeraScale-to-GCN transition period of driver negligence.

That being said, everything here really concerns only Windows. On the Linux side, AMD just gave up and is now fully invested in supporting the open-source drivers, which even have a software ray-tracing implementation for Vega.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 27 of 40, by Trashbytes

User metadata
Rank Oldbie

I will likely be moving the 9070XT I bought over to a CachyOS rig soon; the more I learn about the truly nasty security shit and AI enshittification MS is adding to recent Win11 releases, the less I want to run that OS outside of a VM where I can sandbox what it phones home with.

I'd use an Nvidia GPU for that, but their driver support on Linux is truly abysmal compared to AMD's, and the 9070XT on Linux already supports FSR4/FG via the latest Mesa versions.

I don't fully know if GPU passthrough to a VM is supported for the 9070XT, but some quick reading suggests it's possible on Arch, which CachyOS is based on.
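If it behaves like other recent Radeons, I'd expect the usual VFIO route on Arch to be the starting point. Just a rough sketch of the standard setup, not something I've tested on a 9070XT, and the PCI IDs below are placeholders for whatever lspci -nn actually reports for the card and its HDMI audio function:

# /etc/default/grub - enable IOMMU (AMD CPU assumed)
GRUB_CMDLINE_LINUX_DEFAULT="... amd_iommu=on iommu=pt"

# /etc/modprobe.d/vfio.conf - bind the GPU and its audio function to vfio-pci
# (placeholder IDs - replace with the vendor:device pairs from lspci -nn)
options vfio-pci ids=1002:7550,1002:ab40

# /etc/mkinitcpio.conf - load the vfio modules before amdgpu grabs the card
MODULES=(vfio_pci vfio vfio_iommu_type1)

Rebuild the initramfs (mkinitcpio -P), regenerate the GRUB config, reboot, then hand the card to the VM as a PCI host device in virt-manager. No idea yet whether RDNA4 has any reset quirks, so treat all of that as unverified.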

Reply 28 of 40, by Ozzuneoj

User metadata
Rank l33t
Trashbytes wrote on Yesterday, 00:15:
Ozzuneoj wrote on Yesterday, 00:08:
Trashbytes wrote on 2025-10-31, 23:47:

AMD reversed it ... RDNA 1 and 2 will keep getting driver support for both critical updates and game updates.

Took less than 24 hours, though technically they are still in a form of maintenance mode and game updates will be based on market demand.

Still better than a legacy mode I guess.

I don't know if I'd say they reversed it, but they are definitely in damage control mode.

People will still complain about this and assume that AMD is not supporting them unless they keep seeing performance improvements listed in the driver release notes. Whether they were actually planning to have anyone dedicated to improving the experience on RDNA 2 or not, you can bet that people will be looking for it now.

Just more foot shooting from the PR department.

True .. but I watched a video from one of the better tech guys who said even older Nvidia GPUs don't see much performance improvement from newer drivers after the first couple of years, once the tech has matured, and he demonstrated this with a ton of data to back it up. (There are exceptions, but the majority holds true here.)

Absolutely. People aren't generally benchmarking their games and nitpicking over performance improvements when they do updates, especially for cards that have been out for a couple years. We talk about this all the time even with retro hardware... if you want the best driver for a card, don't use the last one to support it. Use one from when the GPU was still near top of the line.

And this is exactly why AMD should have either not made this change internally or just said nothing at all about it. They have said the whole time that they would continue to do bug fixes, which is really the most important thing. With that, one can assume (maybe optimistically) that if a bug causes a game to run especially poorly, they would probably fix that too.

But, now that they have come right out and told people that their GPU (which has nearly the same feature set as the current generation, and which they may have bought 3 years ago for $1100) is no longer going to get the same attention as others that are hardly any different from the customers' perspective, it just looks really really bad and damages customer trust. Worst of all, it gives the impression (correct or not) that AMD can't support their products as long as Nvidia, which obviously impacts their perceived value.

This is less about customers actually losing something they would normally have, and more about AMD blowing another opportunity to build or maintain a good reputation with their customer base.

Trashbytes wrote on Yesterday, 00:15:

I feel that RDNA 1 was fine to push to legacy; the tech doesn't have full hardware support for DX12 Ultimate features. But RDNA 2 ... that should never have been moved to legacy or maintenance mode; it's still being sold new in stores and has full support for all DX12 Ultimate features.

I feel that this has come down because UDNA is getting closer to full release and they don't want to be dedicating resources to the older RDNA tech.

Yeah, fewer people care about RDNA 1, for sure, and the lack of ray tracing and other features gives it a clear distinction from the newer architectures. It just boggles the mind why they would drop support for RDNA 2 at the same time AND then go out and announce it to the world.

They could just... you know... Reassign the person/people who would be working on optimizations for these cards and have them focus on other things unless there is a major issue. There. Done. No one cares and there is no PR disaster.

Now for some blitting from the back buffer.

Reply 29 of 40, by Trashbytes

User metadata
Rank Oldbie

Yep .. they did what AMD is well known for doing ... snatching defeat from the jaws of victory.

Meanwhile nVidia shareholders are laughing all the way to the bank.

Reply 30 of 40, by The Serpent Rider

User metadata
Rank l33t++

Like I mentioned, nothing new here. That's typically how AMD operates its graphics division.
I just checked something. The Radeon X1950 Pro (a very popular ATi card at the time) was released in October 2006, and AMD pulled the plug in March 2009, so less than 3 years. And the whole X1000 series architecture was ditched in less than 4 years. That's some remarkable consistency!

The current rug pull on RDNA2 owners probably has something to do with AMD prioritizing the AI business now.

Last edited by The Serpent Rider on 2025-11-01, 14:41. Edited 2 times in total.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 31 of 40, by UCyborg

User metadata
Rank Oldbie

So both AMD and NVIDIA suck in their own ways.

Arthur Schopenhauer wrote:

A man can be himself only so long as he is alone; and if he does not love solitude, he will not love freedom; for it is only when he is alone that he is really free.

Reply 32 of 40, by leileilol

User metadata
Rank l33t++
Trashbytes wrote on Yesterday, 07:11:

Meanwhile nVidia shareholders are laughing all the way to the bank.

nah; fuck ai, fuck upscaled "4k" and fuck fake framerates. and on the cpu side fuck intel too 😀

apsosig.png
long live PCem

Reply 33 of 40, by marxveix

User metadata
Rank Oldbie
Trashbytes wrote on Yesterday, 00:15:

I feel that this has come down because UDNA is getting closer to full release and they don't want to be dedicating resources to the older RDNA tech.

RDNA4 is nothing like the previous RDNAs; it's more of an early UDNA build. They are abandoning the other RDNAs sooner, and RDNA4 will live quite a long life.

Best ATi Rage3 drivers for 3DCIF / Direct3D / OpenGL / DVD : ATi RagePro drivers and software
30+MiniGL / OpenGL Win 9x dll files for all ATi Rage3 cards : Re: ATi RagePro OpenGL files

Reply 34 of 40, by sunkindly

User metadata
Rank Member
leileilol wrote on Yesterday, 16:33:
Trashbytes wrote on Yesterday, 07:11:

Meanwhile nVidia shareholders are laughing all the way to the bank.

nah; fuck ai, fuck upscaled "4k" and fuck fake framerates. and on the cpu side fuck intel too 😀

Hehe, I feel the same way.

BF6 is kind of a turning point, and I feel like we're gonna see more of this kind of behavior: requiring things like TPM / secure boot, pulling driver support, forcing upgrades when there's no real need to upgrade (I don't care about tensor cores, DLSS, or ray tracing, and considering the Monster Hunter Wilds fiasco it doesn't seem to matter even if you have a top-tier system). Not to mention the list of a hundred other things; it's all a huge turn-off from modern PC gaming for me. If I didn't need it for work I don't think I'd have my current config, and even with all the boasting, basic functionality in Adobe products still runs like ass. So...

SUN85-87: NEC PC-8801mkIIMR
SUN88-92: Northgate Elegance | 386DX-25 | Orchid Fahrenheit 1280 | SB 1.0
SUN94-96: BEK-P407 | Cyrix 5x86 120MHz | Tseng Labs ET6000 | SB 16
SUN98-01: ABIT BF6 | Pentium III 1.1GHz | 3dfx Voodoo3 3000 | AU8830

Reply 35 of 40, by Joseph_Joestar

User metadata
Rank l33t++
leileilol wrote on Yesterday, 16:33:

nah; fuck ai, fuck upscaled "4k" and fuck fake framerates. and on the cpu side fuck intel too 😀

Of all the AI crap, I only found DLAA to be useful, and only when it replaces the game's inferior TAA implementation. You can force DLAA in any game that supports DLSS via Nvidia Profile Inspector, and run it at the latest version too.

Of the games that I play, Rise of the Tomb Raider and FF7 Rebirth benefited the most from this.

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Core 2 Duo E8600 / Foxconn P35AX-S / X800 / Audigy2 ZS
PC#4: i5-3570K / MSI Z77A-G43 / GTX 980Ti / X-Fi Titanium

Reply 37 of 40, by Trashbytes

User metadata
Rank Oldbie
sunkindly wrote on Yesterday, 18:22:
leileilol wrote on Yesterday, 16:33:
Trashbytes wrote on Yesterday, 07:11:

Meanwhile nVidia shareholders are laughing all the way to the bank.

nah; fuck ai, fuck upscaled "4k" and fuck fake framerates. and on the cpu side fuck intel too 😀

Hehe, I feel the same way.

BF6 is kind of a turning point, and I feel like we're gonna see more of this kind of behavior: requiring things like TPM / secure boot, pulling driver support, forcing upgrades when there's no real need to upgrade (I don't care about tensor cores, DLSS, or ray tracing, and considering the Monster Hunter Wilds fiasco it doesn't seem to matter even if you have a top-tier system). Not to mention the list of a hundred other things; it's all a huge turn-off from modern PC gaming for me. If I didn't need it for work I don't think I'd have my current config, and even with all the boasting, basic functionality in Adobe products still runs like ass. So...

I'll give you a hint .. none of the problems with modern game performance have a damn thing to do with the GPU, but rather lazy fucking devs just grabbing UE5 and thinking they don't have to do any kind of optimisation or backend work to get it performing well. One caveat .. DLSS, FSR and FG have not helped the situation; they were never meant to be used as a performance crutch, but here we are. I have seen them used correctly though, CP2077 comes to mind.

They use UE5 like it's UE4 when the two are entirely different engines internally and require very different workflows. Also, Lumen and Nanite are treated like wall paint and just slapped into every damn thing with little thought about how performance-damaging both are when you have zero clue how they actually work.

Funny thing is people will point at Fortnite and say "But that's UE5 and this is UE5, so why does this run like shit" .. no, it's not the same thing; it's a highly modified version of UE5 built by the creators of UE5 for one specific game, and that engine only has the features the game is going to use. And every single feature it's using has been optimised into the ground to run fast on a huge range of hardware. Your game however uses the off-the-shelf Wheaties version of UE5 with zero modifications and zero optimisations .. so yeah, it runs like shit.

I hate modern lazy game devs and I hate UE5 even more because it's the reason they are lazy. EPIC even agrees here; they know they need to do more to train devs on how to work with UE5, and that everything the devs learned from UE4 needs to be unlearned.

Battlefield 6 though is a breath of fresh air; it's fast, lightweight and has clearly had a ton of work done to keep it that way .. it doesn't even use RT, it's all pure raster and I love it. (It's also not UE5.)

Reply 38 of 40, by sunkindly

User metadata
Rank Member

I can't disagree with you on those points but I do think it's a bit of a circle. Lazy devs are going to lean more into AI and Nvidia seems like they'll be very happy to give them the tools to do it.

SUN85-87: NEC PC-8801mkIIMR
SUN88-92: Northgate Elegance | 386DX-25 | Orchid Fahrenheit 1280 | SB 1.0
SUN94-96: BEK-P407 | Cyrix 5x86 120MHz | Tseng Labs ET6000 | SB 16
SUN98-01: ABIT BF6 | Pentium III 1.1GHz | 3dfx Voodoo3 3000 | AU8830

Reply 39 of 40, by bitzu101

User metadata
Rank Newbie

I think making changes to the silicon is probably the most expensive part. Every new design needs a process of testing and trialing, and you do lose quite a few bucks doing so, hence companies like TSMC will charge a lot for new designs. The lithography is very expensive, and the yields for new designs are quite poor as well.

I was reading somewhere that TSMC charges Nvidia around $30k per wafer. That is a lot. With new designs and lower-nm processes, the cost has skyrocketed. It's becoming very difficult to make new chips at that level.

Needless to say, in this case the hardware has nothing to do with it. It's a pure commercial decision from AMD, nothing more.