What are some fiction books that changed the way you viewed the world?
Like, increased empathy or altered your perspective a bit.
On a lighthearted note, 1984 taught me that most people who quote it in political discussions have probably never actually read it.
The Expanse shaped my idea of what the future might look like if current trends continue. Oppression, othering, and massive megacorps don't magically go away just because we're in space. We need to address these things systematically.