The US needs a war on its soil
I think part of the reason many Americans have the mindset they do is that the US thrived after WW2 without suffering any large-scale infrastructure damage. War-torn areas force people to become more caring and to work together. Being isolated and never directly experiencing war reinforced exceptionalism and individualism. I think a war on its soil would do a lot to change people's mindsets.
Edit: just a reminder that unpopular opinions should be upvoted.