Reply 200 of 261, by 386SX
I don't have a 'smart TV', but considering the phone-like hardware inside them that we were talking about, I was expecting the same lifetime concerns to apply. Computers, even if much more expandable, aren't immune to this either. Think of VGA cards: some low-end cards still being sold only have H.264 acceleration, while as I understand it modern content has already moved on to VP9/AV1, which would need much more powerful hardware. So I wouldn't be surprised if before long there's yet another switch to a newer codec on TVs, and the cycle repeats..
On the other side, I understand why the bandwidth-saving logic might be important for TV broadcast frequencies or mobile data plans too, but I ask myself whether the whole GPU-centric generic-accelerator idea couldn't actually accelerate a newer codec on the existing GPU shader units, or whatever. It's not like we still have DirectX 6-era fixed-function video chips; modern GPUs seem flexible enough that drivers could adapt them to newer codecs, instead of being stuck with the fixed functions of the older video chips. I don't think modern codecs are "that heavy" that a still-modern VGA card couldn't decode them, when some much older CPUs manage it with software decoding.
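A back-of-the-envelope sketch of what "that heavy" means: the raw pixel rate a decoder has to sustain at common resolutions. These are purely illustrative numbers for comparison, not benchmarks of any particular codec or chip:

```python
# Rough, illustrative arithmetic: how many pixels per second a video
# decoder must output at common resolutions. Assumption: these are
# generic figures for comparison only, not measurements of VP9/AV1
# decode cost on any specific hardware.

def pixels_per_second(width: int, height: int, fps: int) -> int:
    """Raw output pixel rate the decoder has to sustain."""
    return width * height * fps

rates = {
    "1080p30": pixels_per_second(1920, 1080, 30),
    "1080p60": pixels_per_second(1920, 1080, 60),
    "4K60":    pixels_per_second(3840, 2160, 60),
}

for name, rate in rates.items():
    print(f"{name}: ~{rate / 1e6:.0f} Mpixel/s")
# 1080p30: ~62 Mpixel/s
# 1080p60: ~124 Mpixel/s
# 4K60:    ~498 Mpixel/s
```

So 1080p is only around a tenth of the 4K workload, which is why older CPUs could already handle it in software and why a shader-based decoder doesn't sound so far-fetched for the lower resolutions.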