Don't get me wrong. Being a woman is great. We are some of the strongest creatures ever to have set foot on the planet. Since the beginning of our existence, we have borne and raised children while also taking on other roles, from gathering and providing in early hunter-gatherer societies all the way up to the present-day "whatever-you-want-to-be" role. And yes, women have, in fact, had to deal with discrimination throughout our history, including the lack of rights, everyday disrespect, and inequality. But thanks to decades of legislation and court rulings, the present-day woman possesses the same legal rights as her male counterparts.
Although feminism did a great deal to help women gain their rights in America and to change their role in society, is it really still relevant in the 21st century? Feminism has visibly evolved from a "sisterhood" of clear-minded thinkers who wanted equal rights into a radical movement with a tendency to be aggressive and divisive. Modern feminists have ironically defeated their own purpose by degrading men, labeling them as evil and violent, and by bringing down women who embrace their femininity and choose to be stay-at-home moms.
Now, I know that in some ways women are not treated as equal to men in the everyday social spectrum. But should we be? Studies have shown that women are physiologically different from men and do, in fact, function very differently. For example, because men are, on average, physically stronger than women, events such as the Olympics and high school and collegiate sports are separated by sex so that the playing field stays fair.
Feminists have also made their cause more about judgment and hatred than about support and equality for their own sex. They have been known to be one-track-minded and hypocritical, creating societal double standards: a man who lays hands on a woman is viewed as an evil, aggressive monster, yet if the roles were reversed, the woman would be praised and commended for her bravery and strength because the man probably "deserved it."
Although many feminists do empower and uplift women to be great leaders and role models, or simply promote self-love, feminism as a whole has lost sight of its purpose: equal rights for men and women. Instead, feminists have evolved into advocates of male hate and enemies of traditional female roles. I openly support the side of feminism that encourages women to empower themselves and thrive in society, but I do not stand for the side that spits out hatred and degrades people. That side violates the spirit of the original feminist movement, whose primary goal was to promote equality, support, and encouragement; it has developed into something that is almost the complete opposite.