The US needs a war on its soil

I think part of the reason many Americans have the mindset they do is that the US thrived after WW2 without suffering any large-scale infrastructure damage. War-torn areas force people to become more caring and to work together. Being isolated and never having directly experienced war reinforced exceptionalism and individualism. I think a war on its soil would do a great deal to change people's mindsets.

Edit: just a reminder unpopular opinions should be upvoted.

14 comments
  • Please tell us more about how war is a good thing, oh master of armchair politics.

    Wishing war on anyone is just fucking vile. Go on, look yourself in the mirror and tell yourself you wish that thousands of people should die needlessly.

  • Is this a general prescription for all countries? We all need a war once in a while to remember to be decent humans?
