Rationalization for silencing people that OP disagrees with. Just call them “terrorists” and now it’s moral.
I'm sorry, is there another title you would like to use for mass murderers who engage in mass murder for the purpose of causing terror? Is there somebody here who agrees with ISIS, with their position that people who don't love ISIS shouldn't have heads?
I don't think I need to rationalize my position that scammers and killers are bad and should not be given a free platform upon which to reach hundreds of millions of people with unlimited video uploads. I think that's a perfectly rational position as is.
I don't think opposing neo-Nazi conspiracies makes me a Nazi. I do think you said that in a weak attempt to shut down all rational discussion here.
If terrorists are forced to pay a sysadmin to host a slow, makeshift Matrix server in their mom's basement, rather than having free access and unlimited uploads to a global network of 550 million rubes stupid enough to fall for crypto scams, I consider that a win for the world.
If crypto scammers move to some platform nobody's ever heard of, and nobody uses, because it's nothing but crypto scams, I consider that a win for the world.
It's not incoherent to want to make the lives of extremists less convenient. I'm not saying we can or should bother trying to eradicate their access to messaging altogether. I'm saying we should recognize it as a problem and try to address it instead of saying "oh, wow, terrorists use our platform? Cool. Fun. Neat."
But that is kind of the issue here, isn't it? Bad elements aren't inconvenienced at all by any of that; it's normal people who suffer from getting censored.
I personally am not willing to give up any online freedoms I have, just on the off chance that it might be inconvenient for criminals.
Why can't we just better educate people on how to avoid online scams? Oh wait, that wouldn't give the government an excuse to legislate another part of our lives into oblivion.
And it's funny how suddenly we're having all these terrorist problems. It's like something else is causing it, but once again, solving it probably doesn't benefit the government.
I'm all for better education, but there will always be people who don't understand the technology, and scammers and extremists will always look for new ways to trick people and radicalize people.
Most of these regulators are just asking the platforms what they're doing to combat extremism, not actively regulating the platforms. Regulators are, by and large, afraid of technology, and afraid that they'll regulate it incorrectly. But by questioning the companies, they can apply pressure to make sure the companies take moderation seriously. The fear of hypothetical regulation and strong negative PR is usually enough to get the companies to at least try to do better. That's a good thing.
And it’s funny how suddenly we’re having all these terrorist problems. It’s like something else is causing it, but once again, solving it probably doesn’t benefit the government.
I have no fucking idea what you're trying to say here.
That the government is making up the terrorist problems, so that while people are scared it can come to our rescue, legislate, and thus gain more control.
Well, presumably the people who control all of the servers and encryption keys and directly profit from the app's users. But whatever, as long as we see fewer scams and fewer terrorists, I'm not picky about who is shutting them up.
According to Wikipedia: servers worldwide, HQ in Dubai.
Honestly I doubt even delisting from the play/app stores will stop people from using it, at least on Android.
It would make it much more difficult for scammers to reach victims, and dramatically stem its growth, but that's not really what I'm calling for. I'm mostly just hoping that the world includes these messaging services when thinking about how to address and regulate social media and extremism, rather than excluding them by misclassifying them as not being social networks.
No, my private messages are well-encrypted. But people voluntarily send their private messages through Telegram without e2ee, subject to whatever moderation Telegram does, and they know it. They just know that their crypto scams are profitable enough, given how little of a fuck Telegram gives, that they're willing to put up with it. They know that sharing unlimited videos on private servers would cost money, and that money would mean less money to buy weapons with, so why not enjoy unlimited uploads and share them with half a billion people who think "yeah, that's fine."
Facebook is a CP paradise because of the way private groups are set up. Been a problem for years. Even with all the moderation. What do you expect the moderation will do in this instance?
Facebook should be better at moderation. I'm not familiar with the CP problem myself, but I think this is an issue with moderation that relies on reporting versus somewhat more proactive moderation, such as automatically scanning content (with human review).
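For what it's worth, the simplest form of that proactive scanning is matching uploads against a list of hashes of known-bad files. This is just a sketch with made-up data: real systems use perceptual hashes (e.g. PhotoDNA) so that re-encoded near-duplicates still match, whereas the exact SHA-256 matching shown here is trivially evaded.

```python
import hashlib

# Hypothetical database of hashes of known-bad files. In a real
# system this would come from an industry hash-sharing program,
# and the hashes would be perceptual, not cryptographic.
KNOWN_BAD_HASHES = {
    hashlib.sha256(b"known-bad-file-bytes").hexdigest(),
}

def scan_upload(data: bytes) -> bool:
    """Return True if the upload matches the known-bad list and
    should be queued for human review rather than published."""
    return hashlib.sha256(data).hexdigest() in KNOWN_BAD_HASHES

# A matching upload gets flagged for a human reviewer; everything
# else passes through untouched, so ordinary users never notice.
flagged = scan_upload(b"known-bad-file-bytes")
clean = scan_upload(b"ordinary-cat-picture")
```

The "with human review" part matters: the automated scan only triages, and a person makes the final call, which is also why the labor cost mentioned below is unavoidable.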
I expect the moderation to frustrate the efforts of scammers, extremists, and terrorists so that scams are no longer profitable and so that they can no longer spread hate and terror as effectively.
Scams will always be profitable. Scamming someone in real life and scamming them via the internet aren't all that different.
Scammers use phone calls to scam people too. Are you suggesting we tap and monitor everyone's phones for keywords?
The thing about privacy is that you seem willing to let people or organisations (that we can't prove have our best interests at heart) violate people's privacy in order to get the result you want. And there's no proof that you will get that result.
Meanwhile, a human has to make the determination that something is criminal or is CP, which means we have to pay people to comb through all that data. That's very taxing on the individuals involved. It does harm to them.
Now I'm sure you'll say something about the expectation of privacy when submitting anything to the web. But people do have an expectation of privacy online. Look at the people who are deliberately de-googling, or up in arms about websites collecting their data to target them with ads.
Totally agree that huge social media systems need to be understood as disproportionately amplifying misinformation. I don't know anything about Telegram, though.
Are the pushback people fReEzE PeAChErs or something? Is Telegram just lovely? Dunno.
I'm part of a few Telegram channels full of highly progressive IRL friends and colleagues. I also know Telegram is full of channels dedicated to crypto shilling, liveleak-esque video and imagery, piracy chats, privacy chats, QAnon forums, etc etc. I used it to communicate with family when I was out of the country and didn't want to pay for roaming charges.
Telegram itself is just a piece of software. Telegram's community is wide and varied. Does it need moderation? Yeah probably. Who should be doing the moderating, not just of individual channels but of all the channels? Eh, I don't have a good answer to that.
As long as you agree it should be happening, I appreciate that. I think Telegram should probably worry about it, and keep looking for solutions, but also that people should report the problematic groups and channels they come across, and be aware of the issue just to put a little more PR pressure on them to come up with a solution.