I have two little anecdotes to share:
Last week, I asked my women's studies class how many of them consider themselves feminists; only one student (out of 60) raised her hand. When I asked whether they believe women and men should be equal in society, everyone raised their hands. This was a perfect set-up for talking about the antifeminist backlash, but also extremely depressing.
Earlier today, I asked my students to discuss what needs to be changed in terms of gender equity/inequity in society today. A guy in the back of the room who has barely made a peep so far this semester responded (this is an approximation of what he said), "I think it's really important that women have the right to abortion because from my perspective, as a guy, it's my job to support her decision no matter what because it's her body and not mine. And I don't think it's right that society makes being pro-choice look so bad and that some people, like the religious right, think it's okay to make decisions about women's bodies for them."
I now love this guy...although he still didn't raise his hand when I asked if anyone considered themselves a feminist. Peer pressure? Cultural denigration of a term that should be far more neutral than it's often presumed to be? Any other theories?