My friend's just written a (brilliant if massively harrowing) book on women's rights which has this heartbreaking interview with some teenage girls talking about how they have to spend ages on their hair and their make-up before school every day because "it's important to look good. People don't like you otherwise."
meh
What's the name of this book, and is it out yet? I'm quite interested in reading it and seeing how it reflects today's society.