In my religion, gender, and society class, we're always talking about feminism, but to be honest, I don't really get American feminism. I think the Europeans got it right. American feminism seems overly concerned with masculinity and has a sort of penis envy that, to me, seems completely counterproductive to the ideals of true feminism. In class, we always talk about how men have historically held all the power, but I think that depends on how you define power. I don't know if it's because I've grown up in a post-feminist world, or because of my own bias as a woman, or something else entirely, but I've always felt that women inherently have a kind of power that men struggle to tap into, and a grace and wisdom that some men strive for but that comes naturally to us.

I hate how the idea of wanting to raise your own children is seen as some sort of defeat. I'd rather think of having children less as a self-sacrifice and more as an honor. If my husband ends up being the stay-at-home dad because I'm too busy with work, I think I'll be a bit jealous of his "defeat," to say the least.

The "feminine" body and persona are beautiful, diverse aspects of femininity that the feminist movement seems almost ashamed of, as if it's striving for some masculine ideal of power. Burn your bra and put on a suit? What is that? We're still defining ourselves against the masculine. Why not take the masculine out of the equation completely? Maybe I haven't studied this subject enough, but as of now, I love my feminism and the power it entails. Hell, I even enjoy the lipstick and the skirts and (God forbid) the door-opening chivalry. Personally, I wouldn't have it any other way.