VOGONS


Reply 40 of 43, by The Serpent Rider

User metadata
Rank l33t++

Gaming cards are basically just "charity" for Nvidia now. AMD made a huge deal with OpenAI and is moving in the same direction.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 41 of 43, by Joseph_Joestar

User metadata
Rank l33t++

Looks like AMD walked this back, per their latest blog post.


This is not the end of support for RDNA 1 and RDNA 2. Your Radeon RX 5000 and RX 6000 series GPUs will continue to receive:

  • Game support for new releases
  • Stability and game optimizations
  • Security and bug fixes

Also, Gamers Nexus summarized the entire situation in a hilarious video.

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Core 2 Duo E8600 / Foxconn P35AX-S / X800 / Audigy2 ZS
PC#4: i5-3570K / MSI Z77A-G43 / GTX 980Ti / X-Fi Titanium

Reply 42 of 43, by Trashbytes

User metadata
Rank Oldbie
sunkindly wrote on Yesterday, 08:15:

I can't disagree with you on those points, but I do think it's a bit of a circle. Lazy devs are going to lean more into AI, and Nvidia seems like they'll be very happy to give them the tools to do it.

Problem is, the tools simply cannot overcome a total lack of optimization. Just look at Monster Hunter Wilds, Borderlands 4 or Dragon's Dogma 2: DLSS/FG does nothing for these games because the underlying engine has not had any work done on it to run well without them. If the engine cannot hit 60 FPS at 1440p or 2160p, DLSS cannot fix that, and combining it with FG actually makes it worse, because the AI then has to start making shit up to build the extra frames. If you have fewer than 60 real frames to work with, trying to get to 100 FPS is going to be a huge struggle, with a huge amount of input latency on top.

Frame gen, simply put, is horrible beyond 1 or 2 extra frames due to the extra input latency and the rubbish fake frames, and if DLSS has to drop to 720p just to get to 1440p or 2160p at 60 FPS, it is going to make the game look like rubbish too.
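To make the latency point concrete, here is a rough back-of-envelope sketch (illustrative numbers only, not measurements from any specific game or vendor): interpolation-style frame generation has to hold the newest real frame back before display, so input latency stays tied to the native frame rate no matter how high the output FPS counter reads.

```python
def frame_time_ms(fps: float) -> float:
    """Time budget for one frame at a given frame rate."""
    return 1000.0 / fps

# Native 45 FPS: each real frame takes ~22 ms to produce.
native_45 = frame_time_ms(45)

# Interpolating frame gen needs the *next* real frame before it can emit the
# in-between ones, so the newest frame is held back roughly one full native
# frame.  Rough model: latency doubles relative to plain native rendering.
fg_latency = 2 * frame_time_ms(45)   # ~44 ms, even if the counter says 90+ FPS

# Compare a genuine 60 FPS: ~17 ms per frame and no held-back frame.
real_60 = frame_time_ms(60)
```

So a 45 FPS base "boosted" past 90 FPS can still feel slower than a real 60 FPS, which matches the complaint above.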

It's only going to get worse if Epic doesn't get off their asses and start teaching devs how to use their engine correctly. At least they realise there is a problem, so they can work on fixing it.

Reply 43 of 43, by Trashbytes

User metadata
Rank Oldbie
bitzu101 wrote on Yesterday, 18:55:

I think making changes to the silicon is probably the most expensive part. Every new design needs a process of testing and trialing, and you do lose quite a few bucks doing so, hence companies like TSMC will charge a lot for new designs. The lithography is very expensive, and the yields for new designs are quite poor as well.

I was reading somewhere that TSMC charges Nvidia around $30k per wafer. That is a lot. With new designs and smaller process nodes, the cost has skyrocketed. It's becoming very difficult to make new chips at that level.
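The $30k-per-wafer figure quoted above makes for an easy back-of-envelope check. A minimal sketch, assuming illustrative die sizes and yields (none of these are official TSMC or Nvidia numbers):

```python
import math

WAFER_COST_USD = 30_000       # figure quoted in the post above (assumption)
WAFER_DIAMETER_MM = 300       # standard wafer size

def gross_dies(die_area_mm2: float) -> int:
    """Crude dies-per-wafer estimate from raw area (ignores edge losses)."""
    radius = WAFER_DIAMETER_MM / 2
    return int(math.pi * radius ** 2 // die_area_mm2)

def cost_per_good_die(die_area_mm2: float, yield_frac: float) -> float:
    """Wafer cost spread over only the dies that actually work."""
    return WAFER_COST_USD / (gross_dies(die_area_mm2) * yield_frac)

# Assumed sizes for illustration: a huge ~750 mm^2 die vs a ~350 mm^2 die.
big_die = cost_per_good_die(750, 0.60)   # roughly $530 per working die
mid_die = cost_per_good_die(350, 0.85)   # roughly $175 per working die
```

Bigger dies lose twice: fewer candidates per wafer and a lower fraction of them working, so per-chip silicon cost rises much faster than die area.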

Needless to say, in this case the hardware has nothing to do with it. It's a purely commercial decision from AMD, nothing more.

AMD is better off with RDNA 4 and UDNA yields thanks to using chiplets, unlike Nvidia, whose massive monolithic Blackwell dies have terrible yields.