What are some fiction books that changed the way you viewed the world?
Something that increased your empathy or shifted your perspective, even a little.
On a lighthearted note, 1984 taught me that most people who quote it in political discussions have probably never actually read it.