Pentagon-Funded Study Uses AI to Detect 'Violations of Social Norms'

21 comments
  • Locking social norms at some predetermined stage is a great way to curb all progress. Like, slavery was a social norm at some point.

  • Minority Report anyone?

    • But there was nothing wrong with the basic idea of the tech in Minority Report. It worked. They saved many lives by preventing imminent murders with it. The main problem in the movie was that they leapt straight from "your name came out of this machine" to "ten years dungeon. No trial."

      Movies are designed to sell as many tickets as possible by presenting scenarios that provoke endorphins. They're not serious scenarios you should be basing real-world decisions on.

  • DAE feel like they woke up one day recently and “AI” suddenly has the answer to EVERY SINGLE PROBLEM EVER? Yet, nothing is getting noticeably better?

    “AI” doesn’t have to work a dead-end job to feed its family, or turn to alcohol because it’s lonely and scared of being forgotten. Its training data is a curated version of the human experience based on the Internet!

    It’s playing human instead of being human and ALL of its solutions will assume that’s “normal.”

    Imagine a five-star general googling “should I attack this country?” That’s silly, right? Well, that’s what’s happening. It’s just being wrapped in a way that makes it look novel.

    These are algorithms designed to mimic humans. When faced with any actual controversy they must be persuaded to answer in an “acceptable” and predetermined manner.

    The golden rule.

  • We should violate anything the Pentagon considers to be a study. Especially when it wants to control social norms.