Why does everyone think they're so oppressed?
Men, women, black people, white people, gay people, straight people, cis people, trans people, and so on. They all seem to think that society, the media, Hollywood, the government, etc., just absolutely hates them and wants to get rid of them. Why is the world so paranoid and delusional?