I thought the culture wars were mostly about racism. I don't understand the young white women on TV celebrating that they don't have the right to an abortion. Why are they happy about it? Are they thinking it won't affect them because they can afford an out-of-state abortion, and hoping that black women won't have the money?
I thought the culture wars were mostly about racism.
Feminism and gender issues have been part of the culture wars for a very long time. More so now than ever: misogyny is a core feature of the alt-right.