Feminism.
It is a word that simply means women want the same rights as men.
If it has such a simple definition, then why do so many people, men and women alike, roll their eyes when they hear someone mention it?
Some people might say it is because some women 'give it a bad name with extreme tactics,' or simply because they don't support it.
Are there some people who are extreme about the issue of equality between the genders? Maybe, but what does it matter to you? People are extreme with their opinions about everything. That does not mean those people represent the group as a whole.
I have never gone to a march or rally, but I still consider myself a feminist because I believe women deserve the same rights and the same pay as men.
That proves not every feminist is 'hardcore' or whatever you fear. They are simply people who want equality. And there is nothing wrong with marching for what you believe in.
Some people think that we are equal already, but that is not true.
Our rights are being threatened and there is still not equal pay everywhere.
For example, Trump wanted to give businesses the right to deny employees health care coverage for birth control.
How is that equality, if they will still pay for men's medical needs?
It's not.
Then there are also some women who don't support the movement, which astounds me.
If they don't want equality, then that is their opinion, but it is a slap in the face to all of the women who fought hard before us just to get the right to vote.
I am sick of hearing people say, "Oh, feminists are the worst."
Oh? It's the worst to want the same things men already have? I didn't realize that was such a horrible thing to want.
It honestly astounded me to hear things like that in college, but I understand that people have other opinions. That does not mean it is OK to be hateful toward someone for wanting equality, though. Say nothing and move on.
We have moved forward, though. We are more open-minded than ever before in this day and age, but there is still so much more that needs to be done when it comes to treating everyone the same.
The bottom line is that feminism is not a dirty word.
It is a word that embodies equality and people coming together for a cause.
Feminists are not asking men to quit their jobs and move over so women can take them, and they aren't asking people to give them more than men have.
They simply want to be equal and to have the same rights, so think about that the next time you roll your eyes when someone says they are a feminist.