Many people associate the word "feminist" with a negative meaning, assuming it means that women should have more power than men, but they could not be more wrong. Feminism is, by definition, the political, economic, and social equality of the sexes.
Feminists want to be heard, they want to be taken seriously, and they want the same opportunities and treatment as men in the workplace, in the home, and in society.
When women and men are treated equally, everyone thrives. It is important for women and men to work together as productive members of society. Barack Obama said, “That’s what twenty-first century feminism is about: the idea that when everybody is equal, we are all more free.”
When women and men work together as equals, the system is more productive and society normalizes the idea of women, not just men, holding positions of power. Our society needs to get used to seeing women as CEOs and as presidents, whether of companies or of the United States of America. We have fallen into a bad habit of expecting women to be stay-at-home moms and men to be the breadwinners. When women and men are both working, our society is more balanced; it is more equal.
Everyone should be a feminist, because being a feminist simply means believing in equality. And equality in the workplace is one of the most pressing issues: women and men need to feel safe from sexual predators and patronizing coworkers. Lena Waithe said, “Black is important because it gives us the opportunity to show that we are all one, it's not just about sexual harassment. It's also about ending homophobia, racism. No one should have to suffer in the workplace for pursuing her dream.”
The most important part of being a feminist and standing with women is listening to them when they open up. It is one thing to say “Time’s Up,” but you have to actually listen to and believe their stories, or it will not be so easy for them to open up again. Reese Witherspoon said, “We're here to stand up for all women and men who have been silenced by abuse and harassment and discrimination within their industries, not just Hollywood. All industries.” When women are silenced, nothing good comes of it; women and men need to feel they can rely on others to validate their feelings and believe their survivor stories. This is what being a feminist means: being there for your fellow Americans, lending a hand, and listening to victims rather than blaming them.
It is important to be a feminist so that girls and women realize that when men are bossy they are seen as the boss, and that when women are bossy they are the boss too, not a "bitch." Men and women equally have the power to run companies, work in STEM, be politicians, be scholars, be Olympic athletes, and conquer the world, no matter their gender, race, ethnicity, or income level.