Most U.S. adults believe America’s founders intended the country to be a Christian nation, and many say they think it should be a Christian nation today, according to a new Pew Research Center survey designed to explore Americans’ views on the topic.
The kids in my family are very aware that America is not a Christian nation; it's taught in California schools. The lack of memory among adults is the real issue, in my opinion.