Feminism has always been a word with a negative connotation. When most people hear the term, their minds automatically go to women who don't shave their armpits and protest the existence of men. Recently, President Obama released an essay on feminism exclusive to Glamour magazine, and whatever you think of him as a person or politician, he brings up some really great points. He points out how far women have come - earning the right to vote, being able to work the same jobs as men, and even earning the right to manage their own finances without a man. I highly suggest you read the entire essay, as it is what sparked my thought process for this article.
In an attempt to better understand the general view on feminism, I conducted two polls on my Twitter account: "Do you believe that men's and women's rights (social, political, economic, or otherwise) should be equal?" and "Do you consider yourself a feminist?" While 90 percent of those who voted marked "Yes" on the first poll, over half marked "No" on the second. I was sneaky on purpose, because I put the literal definition of feminism in the first poll: "the doctrine advocating social, political, and all other rights of women equal to those of men." So I was very confused when people said they believed in equal rights for men and women but did not consider themselves feminists. I hate to break it to you, but those are the same thing! *cue shocked gasps*
I bring this up because I think the perception of feminism needs to change. If you believe that rights for men and women should be equal, you are a feminist. Being a feminist does not mean you hate men, does not mean you don't shave your armpits, does not mean you protest in trees, and does not mean you think women are superior to men. As with any movement, there are going to be extremists who fit all of the stereotypes I just listed, just like there are extremist Republicans and extremist Democrats. When people are passionate about something, they will fight for it.
It is no secret that men and women are treated differently. Women get paid less than men even when they have the same job description. If a woman has an active sex life she is labeled a slut, whereas a man with an active sex life gets high fives and praise. The same goes for men, who face stereotypes of their own. If a woman is crying, she is in tune with her emotional side or "just being a girl," but if a man is crying he's weak and vulnerable. If a man wears tighter-fitting clothes he "might be a little gay," and if he says no to doing something with his friends, he gets "what are you, a girl?" Women are not allowed to like sports without being quizzed on who was the third-round draft pick in 1988 or what the coach's son's middle name is, and if a man doesn't like sports he isn't a real man. What I'm getting at here is that we all kind of treat each other like shit regardless.
It does sound preachy, I get it. It's hard for us to get out of this mindset because it has been ingrained in us for so long, whether we realize it or not. For the longest time, women weren't even supposed to have jobs; they were supposed to stay home with the children, cook the meals, and clean the house. Now it is normal for men and women to be in the classroom together and to hold the same positions in the workplace, so why not reward them the same?
Getting rid of the negative connotation around feminism falls on all of us. I think if we take a step back and realize what it is we actually believe in, we could see a really big change in the way we treat each other. Just a thought.