Social media giants would be forced to ban children under the age of 14 from their platforms or face hefty penalties, under proposed laws in South Australia that could be replicated in other states.
These rules really worry me. I think it's good in theory, and we should be protecting kids more, especially from the big personalised algorithms of Facebook, TikTok, and the like. Curated feeds like those of Lemmy and Reddit concern me less.
But the issue is… how do you prove age while still preserving people's right to anonymity? I don't want to give my ID to Facebook.
Ah, another well-thought-out tech law from OzGov in the vein of such hits as 'The laws of mathematics are very commendable, but the only law that applies in Australia is the law of Australia' and the AssAct. I swear we're the designated test ground for dodgy laws for the five (7,13,etc) eyes...
Congratulations, you just stumbled onto the plan to enforce online IDs through myGov! Inspired by South Korea's authoritarian government's requirement for a citizen ID to access online services.
Forcing IDs is the only way. Hopefully that means fewer adults use them too. What they do with your actual ID is less nefarious than what they do with your data or feed.
Platforms aren't the guardians of our kids. That being said, if there are laws or rules those companies are supposed to follow, they shouldn't be subverting them for a new "customer base."
Blocking children from online communities blocks them from seeing views outside the bubbles their parents indoctrinate them into; it blocks them from finding the information that could help them realise they're in an abusive situation and seek help; and it marginalises LGBT+ youth who, through no fault of their own, happen to be born to ultra-religious or LGBT+-phobic parents.
These are adult online communities. They are not communities for children. My Facebook feed is not something I would like a child to see or interact with, and I would consider it pretty tame. Algorithmic feeds that amplify minor or random views into a torrent of reinforcement are not what kids - or adults, actually - need.
People should be allowed to decide for themselves what they want to see. If they agree with you and don't want to see certain things, then great: they can enable the kids filter, which is usually an easy toggle in settings. If they don't agree with the app's makers about what is suitable for children, they should also have the option to see the rest of the content.
It's blocking kids under fourteen. That's a reasonable age: most kids don't start to think beyond their parents' views until puberty, and it gives them some time to settle before being thrown to the net.
My concerns are chiefly practical. How will this be identified and enforced?