Linux does have better codecs and drivers than Windows for some things (Bluetooth, for example), but it has worse support for some important proprietary hardware (Nvidia, for example).
Self-updating without user interaction by default.
I think this is a terrible idea until a clear boundary is set between applications that can break the system and those that can't. Updating Flatpaks automatically might be fine, but updating everything is simply a recipe for disaster.
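For what it's worth, that boundary is already expressible today: Flatpak updates can be automated in user scope without ever touching system packages. Here is a minimal sketch using a systemd user timer; the unit names and comments are just illustrative, not an existing distro feature:

    # ~/.config/systemd/user/flatpak-update.service
    [Unit]
    Description=Automatically update Flatpak apps only

    [Service]
    Type=oneshot
    # --user limits this to user-installed apps; system packages are never touched
    ExecStart=/usr/bin/flatpak update --user --noninteractive

    # ~/.config/systemd/user/flatpak-update.timer
    [Unit]
    Description=Daily automatic Flatpak app updates

    [Timer]
    OnCalendar=daily
    Persistent=true

    [Install]
    WantedBy=timers.target

Enable it with systemctl --user enable --now flatpak-update.timer. The point is exactly the boundary argued for above: the timer can only update sandboxed apps, never the packages that could break the system.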
I'm old. I can use stone age computers. The real barrier is the language. If you can't explain things to me (someone who knows how to do shit) in terms I've heard before, then basically nobody outside of power users, niche users, or software engineers will ever try an OS that you have to learn a new language for just to ask a question. Thank you, that is my longest run-on sentence, and I'm a scientist.