I'm gonna go out on a limb and say that the hardware inside Smart TVs is far too piecemeal and random for anyone to be able to develop for more than one type of television set.
I mean, just look at the trouble AOSP offshoots like LineageOS have. They support a large number of devices, but that number is literally a drop in the fucking bucket compared to how many phones actually exist out there.
Each of those phones has different hardware and needs different drivers, which is why each phone generally has its own set of maintainers. Smart TVs would have the same problem.
It's feasible, but it feels like a waste of resources when you can just not connect your Smart TV to the internet and use a Roku/FireStick/Raspberry Pi running Kodi, or go out of your way to buy one of those commercial displays that don't have any Smart TV bullshit in them.
At this point my next TV is just going to be a big monitor plugged into a fanless PC, or the functional equivalent (all smart features disabled, HDMI input only) if I can't find something I like at a good price. A monitor will have better picture quality, refresh rate, and response time as well.
Fuck smart TVs. I only use TVs by plugging in my laptop via HDMI. It's only a matter of time before smart TVs start reporting people for things like watching pirated content or violating the ToS with adblockers.
The only janky solution I know of is to use a Raspberry Pi (or another computer) to run Kodi or Android TV. You can then point your TV's DNS at a Pi-hole or another filtering DNS server to stop the tracking/ads.
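For anyone curious what that looks like in practice, here's a minimal sketch using dnsmasq syntax (which Pi-hole is built on). The MAC address, IP addresses, and the blocked hostname are all placeholders, not a vetted blocklist:

```
# /etc/dnsmasq.d/tv-filter.conf — hypothetical example

# Null-route a tracking domain so the TV's requests go nowhere
# (placeholder hostname, not a real vetted blocklist entry)
address=/telemetry.example-tv-vendor.com/0.0.0.0

# If the Pi-hole is also your DHCP server, hand out itself as the
# only DNS server (assumed Pi-hole address: 192.168.1.2)
dhcp-option=option:dns-server,192.168.1.2

# Pin the TV (placeholder MAC) to a known address so you can
# watch its queries in the Pi-hole logs
dhcp-host=AA:BB:CC:DD:EE:FF,192.168.1.50
```

One caveat: some TVs hardcode 8.8.8.8 as a fallback and ignore the DHCP-assigned DNS, so a firewall rule redirecting or dropping outbound port 53 from the TV is often needed too.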
I wouldn't call that a janky solution at all; the jank is the "smart" TV itself. I use an Nvidia Shield to get the most out of my 4K OLED TV, but otherwise do pretty much the same thing. I also put my TV on a VLAN that doesn't have internet access.
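If your router speaks nftables, the "VLAN with no internet" part can be sketched roughly like this. The subnet numbers are assumptions (TV VLAN on 192.168.20.0/24, the rest of the LAN in 192.168.0.0/16), not a drop-in config:

```
# Hypothetical nftables forward rules: let the TV VLAN reach the
# local LAN (for casting/remote apps) but never the internet.
table inet filter {
  chain forward {
    type filter hook forward priority 0; policy accept;

    # TV VLAN -> rest of the LAN: allowed
    ip saddr 192.168.20.0/24 ip daddr 192.168.0.0/16 accept

    # TV VLAN -> anywhere else (i.e. the WAN): dropped
    ip saddr 192.168.20.0/24 drop
  }
}
```

The same idea works with iptables or whatever firewall UI your router exposes; the point is just "allow local, drop everything else" for that one VLAN.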
I use my Steam Deck docked as a home theater PC (HTPC), with a remote that has a gyroscopic mouse and a full keyboard on the back, and I love it. YouTube with adblock, a Jellyfin media server, and video games.