The first image that often comes to mind when people hear the word ‘feminism’ is a bunch of crazy women who act as though they should be treated better than men. However, the actual definition of feminism is simply this: the theory of the political, economic, and social equality of the sexes. In other words, true feminists simply want the same opportunities men have.
Feminism is not telling women what they should/shouldn’t do.
All women are beautifully different.
Some decide to marry.
Some decide to not marry.
Some decide to be a stay-at-home mom.
Some decide to have a full-time career.
Whatever they decide, it’s their choice, and they should not be degraded for it.
Feminism is not judging people based on their political views.
It’s obvious that Donald Trump does not have the ‘most respect for women’. However, regardless of how one feels toward him, he is the President of the United States, and continuing to refuse to accept this is detrimental to America. Madonna stating that she has “thought about blowing up the White House” does not send a unifying message, but a dividing one.
Women every day experience discrimination based on their gender, and sexism is not yet a thing of the past. Many women in male-dominated industries, such as Megyn Kelly, have experienced this. Although there is nothing wrong with practicing one's First Amendment right by protesting for equality, I agree with Megyn when she says that actions speak louder than words on this subject (regardless of your political views, you should read her book, "Settle for More"). She has proven this to be true by surpassing all expectations in her career. If women want to demonstrate that they are just as capable as men, the most effective way to do so is to show it.
P.S. If you are a feminist who doesn't want men to sexualize you and doesn't want to scar children, please don't protest nearly naked. That is all.