Algorithms are breaking how we think - Technology Connections

A slightly unusual video from the fantastic Technology Connections channel. It articulates a lot of my own thoughts on social media, "algorithms" and AI.

What surprised me the most was the statistic that only 3% of the author's views come from the subscriptions feed. This is wild to me because subscriptions are pretty much the only way I have ever used YouTube.

53 comments
  • Let me summarize this video.

    1. Introduction

    This video essay critiques the modern internet landscape, focusing on the phenomenon of "algorithmic complacency." The speaker argues that an increasing number of people rely heavily on algorithmic recommendations for their online experiences, losing agency and critical thinking skills. The video uses the identification of a vintage radio as a starting point to demonstrate the power of self-directed research and contrasts this with the passive consumption fostered by algorithmic feeds.

    2. Key Insights

    • Algorithmic Complacency: The core concept is the tendency for individuals to passively accept algorithmic recommendations, relinquishing control over their online experiences. This is seen as a growing problem as individuals become accustomed to curated content feeds.
    • Loss of Agency: Reliance on algorithms diminishes personal agency, as individuals may no longer actively seek information, curate their experiences, or critically evaluate sources.
    • The Power of Self-Directed Research: The video begins by illustrating how to find information about a vintage radio using observation and search engines. This is presented as a "human superpower" that is being lost due to over-reliance on curated content.
    • Historical Perspective: The video contrasts the current internet experience with earlier eras when users actively navigated and curated their online journeys. This historical comparison highlights the shift towards automated and in-your-face content delivery.
    • Problems with Algorithmic Feeds:
      • Context Collapse: Algorithmic feeds can mix different contexts, leading to misunderstandings and conflicts as people from different social circles encounter each other's content without the benefit of shared context.
      • Reinforcement of Extremes: Algorithmic feeds can reinforce binary understandings of complex issues and exacerbate polarization by prioritizing content that generates high engagement (positive or negative).
      • Learned Helplessness: People may become accustomed to waiting for algorithms to suggest solutions rather than introspectively exploring their own problems.
    • YouTube Example: The speaker uses YouTube's subscription feed as an example of an overlooked, manually-curated alternative. He shows that the subscription feed is far less used than the recommended feed.
    • Automation vs. Curation: The video argues that automation itself isn't inherently bad. However, the curation of information, through algorithms, is the key problem, as algorithms are making decisions for users that are not necessarily best for them.
    • The Dangers of AI Thinking: The speaker is critical of the current AI hype cycle, especially when it comes to tools that can offload thinking processes. The speaker expresses concern about handing over our decision-making to a computer that cannot be held responsible for the decisions it makes.
    • Examples of Problems with Algorithms: The video uses Google Maps and news aggregation apps to illustrate the issues of blind trust in algorithms that may optimize for speed, ad revenue, or other metrics at the expense of other variables.

    3. Conclusion

    The video concludes with a plea for individuals to reclaim their agency and resist algorithmic complacency. The speaker emphasizes the importance of building trusted networks, prioritizing human connection, and engaging in self-directed research to counter the potential harms of a highly automated and curated online world. The central takeaway is that users must actively curate their own online experiences, critically assess information, and resist the temptation to passively accept what algorithms present to them. The speaker's warning is that the internet has become a tool that can exploit human connections and weaponize users' viewpoints and worldviews, and that this will continue until we start questioning how we operate in this world.

  • I can say that while I nearly exclusively use the subscriptions feed to start browsing, and will add interesting videos from it to the watch later list, once I'm nearing the end of a video I'll often choose from the recommended videos on that video rather than going back to the subscriptions page.

    • My subscription system now is a Docker image that downloads interesting channels I specify or videos I add to a playlist. I do wonder how those metrics show up in analytics.
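      The commenter doesn't name the tool, but this kind of self-hosted subscription setup is commonly built around yt-dlp. A minimal sketch (the `channels.txt` file and `/data` paths are hypothetical, not from the comment):

      ```shell
      #!/bin/sh
      # Sketch of a self-hosted "subscriptions feed": fetch new videos from a
      # hand-curated list of channel/playlist URLs, one per line in channels.txt.
      # --download-archive records completed video IDs so reruns (e.g. via cron
      # inside the container) only grab what's new.
      while read -r url; do
        yt-dlp \
          --download-archive /data/archive.txt \
          -o "/data/%(channel)s/%(upload_date)s - %(title)s.%(ext)s" \
          "$url"
      done < channels.txt
      ```

      Because downloads like this bypass the player entirely, they likely never register as watch time in YouTube's analytics, which may be what the commenter is wondering about.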

      I stopped using recommendations when I accidentally clicked on AI slop and that crap started taking over. It's useless to me now. If I use the legitimate YouTube interface, I spend half the time hiding shorts, slop, reruns, and jumpcuts-make-me-interesting influencers talking at me. Ugh.

      YouTube used to be people who wanted to do or share things without kickbacks. Those are the channels I miss.

  • There is this interesting push and pull with algorithms: they need to show content users will engage with, but their main value to the companies is that they make it easy to manipulate what is seen.

    If they push people too hard, people stop using the algorithm, but if they just let the algorithm act purely on what people engage with, then they can't monetize it.

    There is a third axis of preventing people from going down self-destructive rabbit holes, but they don't care about that until people start talking about regulating them or start moving away.

    • That push and pull is exactly why they've been intentionally using them to rot people's brains. The dumber and more apathetic you can make your users, the more you can monetize them: you first minimize the push so you can maximize the pull. This is not an accidental "quirk" of modern algorithms, it's part of the design. Money must be maximized at all costs, including the mental health of the users and the stability of society. Money über alles. The techbros will drive our society into the ground without a second thought if it makes them a few bucks richer. They're not planning to stay here anyway. We are just a resource to them, and they will exploit us to the fullest to pursue their unachievable techno-utopia fantasies.

  • Great video, as always. I would suggest PocketTube for Firefox for controlling the chaos of YouTube subscriptions. I don't see shorts at all, and if I'm not looking for, say, music or Star Trek content, I can just turn those categories off.
