Google defines feminism as "the advocacy of women's rights on the basis of the equality of the sexes."
Right, so why is there such a negative undertone when we talk about feminism? Why do people say, "Oh, you're a feminist?" like you just told them you're a serial killer?
To put it simply: feminism is about making men and women equal, and that includes the LGBTQ+ community, people of color, and everyone else.
Of course, there are "feminists" who hate men or only stand up for white, hetero women, but that's not a feminist, that's a bitch! Crazy that feminism is actually for everyone, right?
A counterargument thrown at many American feminists is, "You don't even have it that bad, America is a very well-developed country, what do you have to worry about?" WELL, I know it's shocking, but men and women in America still face plenty of problems to this day. Obviously, it's worse in developing countries, but I can't speak from that perspective.
In America, men and women are expected to look a certain way, act a certain way, and so on. There's this crazy notion that if a man is interested in something deemed "feminine," such as painting his nails or coloring his hair, he must be gay. WHY? Or if a woman is outspoken and proud, if she has a plan and knows how to achieve it, she's called a raging bitch or a control freak. We need to break these stupid, close-minded gender roles and stop making dumb assumptions like that. Why can't people act or express themselves however they want without being harassed for it?
It's 2018, so let's clear up what feminism really is. Feminism isn't a curse word, you can ABSOLUTELY be a male feminist, and intersectional feminism is the new wave. No more ~strictly hetero white female~ feminism, that's whack.
Let's go out there and break the stereotypes, make feminism a good word, and, like... smash the patriarchy or whatever.