
YouTube shorts disproportionately promotes alt-right content according to this experiment

TLDR if you don't wanna watch the whole thing: Benaminute (the YouTuber here) creates a fresh YouTube account and watches all recommended shorts without skipping. They repeat this 5 times, changing their location to a random US city each time.

Below is the number of shorts after which alt-right content was recommended. Left wing/liberal content was never recommended first.

  1. Houston: 88 shorts
  2. Chicago: 98 shorts
  3. Atlanta: 109 shorts
  4. NYC: 247 shorts
  5. San Francisco: never (Benaminute stopped after 250 shorts)

There was, however, a certain pattern to this. First, non-political shorts were recommended. After that, AI Jesus shorts started to be recommended (with either AI Jesus talking to you, or an AI narrator reading verses from the Bible). Then came non-political shorts by alt-right personalities (Jordan Peterson, Joe Rogan, Ben Shapiro, etc.). Finally, explicitly alt-right shorts started to be recommended.

What I personally found both disturbing and kinda hilarious was the case of Chicago. The non-political content in the beginning was a lot of Gen Alpha brainrot. Benaminute said this seemed to be the norm for Chicago, as they had observed the same thing in another similar experiment (which dealt with long-form content instead of shorts). After some shorts, one appeared where an AI Gru (the main character from Despicable Me) was telling you to vote for Trump. He went on about how voting for "Kamilia" would lose you "10000 rizz", and how voting for Trump would get you "1 million rizz".

In the end, Benaminute, along with Miniminuteman, proposes a hypothesis to explain this phenomenon: alt-right content may incite more emotion, and therefore ranks highly in the algorithm. They argue the algorithm isn't necessarily left wing or right wing, but that alt-right creators have better understood how to capture and grow an audience within it.
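The hypothesized mechanism (an ideology-blind ranker that ends up favoring provocative content because it optimizes engagement) can be sketched as a toy model. Everything here is invented for illustration; the field names and numbers are assumptions, not anything YouTube has disclosed:

```python
# Toy model of the hypothesis: the ranker never looks at ideology.
# It scores each short purely by predicted engagement, e.g. fraction
# watched boosted by interaction rate (likes/comments/shares).
# All titles and numbers below are made up for illustration.
videos = [
    {"title": "calm nature short", "watch_frac": 0.55, "interact": 0.01},
    {"title": "AI Jesus short", "watch_frac": 0.70, "interact": 0.04},
    {"title": "outrage-bait political short", "watch_frac": 0.85, "interact": 0.12},
]

def engagement_score(v):
    # Expected engagement: watch fraction, amplified by interactions.
    # Emotionally charged content tends to score high on both.
    return v["watch_frac"] * (1 + 10 * v["interact"])

ranked = sorted(videos, key=engagement_score, reverse=True)
for v in ranked:
    print(f"{v['title']}: {engagement_score(v):.2f}")
```

Under these (made-up) numbers, the outrage-bait short ranks first even though nothing in the scoring function mentions politics, which is the crux of the hypothesis: the bias is an emergent property of optimizing for engagement, not an editorial choice.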

188 comments
  • I realized a while back that social media is trying to radicalize everyone, and it might not even be entirely the fault of the oligarchs who control it.

    The algorithm was written with one thing in mind: maximizing engagement time. The longer you stay on the page, the more ads you watch, the more money they make.

    This is pervasive, and even if educated adults tune it out, there are always children, who get Mr. Beast and thousands of others trying to trick them into liking, subscribing, and following.

    This is something governments should be looking into controlling. Propaganda created for the sole purpose of making money is still propaganda. I think at this point that sites feeding content that use an algorithm to personalize feeds for each user are all compromised.

    • The problem is education. It's a fool's game to try to control human nature; the commodification of everything means you will always have commercials and propaganda.

      What is within our means is to strengthen education on how to think critically and understand your environment. This is where we have failed, and I'll argue there are people actively destroying this for their own gain.

      Educated people are dangerous people.

      It's not 1984. It's Brave New World. Aldous Huxley was right.

      • I think we need to do better than just say "get an education."

        There are educated people who still vote for Trump. Making it sound like liberalism is simply the result of going to college is part of why so many colleges are under attack.

        From their perspective I get it: many Trump voters didn't go to college, so when they hear that, they just assume brainwashing.

        We need to find a way to teach people to sort out information, to put their immediate emotions on pause and search for information, and so on; not just the kind of "education" where you regurgitate talking points from teachers, the TV, or the radio as if they're matters of fact. The whole education system is pretty tuned around regurgitation, even at the college level. The culture of exploration surrounding college (outside of the classroom) is likely more where the liberal viewpoints come from, and we'd be ill advised to assume the right can't destroy that.

    • This discussion existed before computers. Before that it was TV and before that it was radio. The core problem is ads. They ruined the internet, TV, radio, the press. Probably stone tablets somehow. Fuck ads.

    • > sites feeding content that use an algorithm to personalize feeds for each user are all compromised.

      Not arguing against this at all because you’re completely correct, but this feels like a key example of governments being too slow (and perhaps too out of touch?) to properly regulate tech. People clearly like having an algorithm, but algorithms in their current form are a great excuse for tech companies to throw their hands up and claim no foul play, because of how opaque they are. “It only shows you what you tell it you want to see!” is easy for them to say, but until consumers are given the right to know exactly how each one works, almost like nutrition facts on food packaging, we’ll never know whether they’re telling the truth. The ability of a tech company to have near-unlimited control, and no oversight, over what millions of people are looking at day after day is clearly a major factor in what got us here in the first place.

      Not that there’s any hope for new consumer protections under this US administration or anything, but it's something I had been thinking about for a while.

  • Do these companies put their fingers on the scale? Almost certainly.

    But it’s exactly what he said that brought us here. They have never particularly given a shit about politics (aside from "no taxes" and "let me do whatever I want all the time"). However, the algorithms consistently reward engagement. Engagement doesn’t care about “good” or “bad”; it just cares about eyes on it, clicks, comments. And who wins that? Controversial bullshit. Joe Rogan getting Elon to smoke weed. Someone talking about trans people playing sports. Etc.

    This is a natural extension of human behavior. Behavior occurs because it serves a function: I do X to achieve reinforcement, whether that's attention, access to something, escape, or automatic reinforcement.

    Attention-maintained behaviors are tricky because people are shitty at removing attention, and attention is a powerful reinforcer. You tell everyone involved, “this person feeds off of your attention, ignore them.” Everyone agrees. The problematic person pulls their bullshit, and then someone goes “stop it”. People call that negative reinforcement (it's not; it’s probably positive reinforcement, or arguably positive punishment, because it’s questionable how aversive it is).

    You get people to finally shut up, and they still make eye contact, nonverbal gestures, or whatever. Attention is attention is attention. The problematic person continues to be reinforced and the behavior stays. You finally get everyone to truly ignore it, and then someone new enters the mix who doesn’t get what’s going on.

    This is the complexity behind all of this. This is the complexity behind “don’t feed the trolls”. You can teach every single person on Lemmy or Reddit or wherever to simply block a malicious user, but tomorrow a dozen or more new and naive people will register and fuck it all up.

    The complexity behind the algorithms is similar. The algorithms aren’t people, but they work in a similar way: if bad behavior is given attention, the content is weighted and given more importance. The more we, as a society, can’t resist commenting on, clicking, and sharing Trump, Rogan, Peterson, transphobic, misogynist, racist, homophobic, etc. content, the more the algorithms will weight it as “meaningful”.

    This of course doesn’t mean these companies are without fault. This is where content moderation comes into play, and where the many studies finding that social media leads to higher irritability, more passive-aggressive behavior, and lower empathy could have pushed us to regulate these monsters into doing something to protect their users against the negative effects of their products.

    If we survive and move forward, in 100 years social media will likely be seen the way we look at tobacco now: an absolutely dangerous thing that it was absurd to allow to exist in a completely unregulated state with zero transparency as to its inner workings.

  • Saying it disproportionately promotes any type of content is hard to prove without first establishing what share of the whole that type makes up.

    The existence of proportionately more "right" leaning content than "left" leaning content could adequately explain the outcomes.
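This base-rate point can be made concrete with a toy calculation. The proportions below are invented purely to show the shape of the argument, not estimates of YouTube's actual content mix:

```python
# Toy base-rate illustration (proportions invented): if right-leaning
# political shorts simply outnumber left-leaning ones in the pool, even
# an ideology-blind uniform sampler will, on average, surface a
# right-leaning short first.
p_right = 0.02   # assumed share of shorts that are right-leaning political
p_left = 0.005   # assumed share that are left-leaning political

# Expected number of shorts until the first hit of each type, under
# independent uniform sampling: the geometric distribution gives 1/p.
expected_until_right = 1 / p_right   # ~50 shorts
expected_until_left = 1 / p_left     # ~200 shorts
print(expected_until_right, expected_until_left)
```

Under these assumed shares, a neutral sampler would hit right-leaning content roughly four times sooner, so an experiment like Benaminute's can't distinguish "the algorithm boosts it" from "there's simply more of it" without measuring the underlying pool.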

  • Real talk: I've been using YouTube without an account and with some ad blocking stuff installed. Based on what I'm seeing, I'm pretty sure the algorithm's datapoint for me is "He was born with a penis and is ok with that."

    When I lose my better judgement and start scrolling shorts like an idiot, it's fight videos (IRL, movie scenes, UFC and boxing), auditing, Charlie Kirk and right-wing influencers, and the occasional clip from Shoresy, presumably on the basis of "he might be Canadian too, idk".

    It is noticeably weird, and I have brought it up with my kid, who uses an account, is nothing like what YouTube believes me to be, and whose shorts feed is very different.

    We do both get that guy who opens Pokemon cards with a catchy jingle, though.

  • I use YouTube and don't get much far-right content. My guess is that's because I don't watch much political content there; I use a podcatcher and websites for that. If I watched political content, it might show me some lurid videos promoting politics I disagree with, because that tends to keep viewers engaged with the site/app longer than only showing videos consistent with the ideology I seek out. That gives people the feeling it's trying to push an ideology.

    I made that up without any evidence. It's just my guess. I'm a moderate libertarian who leans Democratic because Republicans have not even been pretending to care about liberty, and for whatever reason it doesn't recommend the far-right crap to me.
