I used to say that I wasn't a feminist. Over the course of my relatively short life of nineteen years, I had heard many women say that they weren't feminists, so I said I wasn't either. I thought that was the status quo. There is such a negative connotation surrounding the words "feminist" and "feminism." The reason behind that is the stereotypical feminist everyone loves to hate: the woman who doesn't shave anything, burns her bras, and hates men. You know who I'm talking about.
Eventually, I decided to look up the definition of "feminism," and what I found shocked me. The definition is simply this: "the belief that men and women should have equal rights and opportunities." Seriously? That's it? That's what people are fighting against instead of for? I couldn't believe that I had repeatedly said I didn't believe in feminism when, by that definition, I absolutely do. Everyone should be a feminist, people of all ages, genders, races, and orientations included.
Looking back and realizing that I heard so many women say that they weren't feminists shocks me. I believe that women who say that don't know what feminism means. How could a woman not want equal rights in everything she does? Are you okay with a man making more than you in the same position? Are you okay with double standards in terms of clothing and social behavior? Are you okay with people saying, "You throw like a girl" in a derogatory way? Are you seriously okay with that?
I'm a feminist and I believe that everyone should be able to wear what they want, do what they want, and say what they want.
I'm a feminist and I believe that women should be able to do anything men can do.
I'm a feminist and I believe in equality.
I'm a feminist and you should be too.