Caitlin Moran, whose book was covered briefly on the Report
here, posits that
every woman in this century is a feminist by default:
We go to school with boys, we expect to be treated equally to boys, it's illegal to rape us, you can't own us, our paychecks go into our bank accounts, we are feminists, and unless you have actually handed in your vote at the White House and said "I have no need for this anymore," then you are a feminist. So then it becomes a semantic argument: why have we lost that word? I think young women just haven't heard the word feminism in the last 15 years or so anywhere in popular culture. I think one of the reasons people got scared of it is that you don't have people going out there and saying, "I'm a feminist!"