Recently, I started reading a book called Men Explain Things to Me, by Rebecca Solnit. I have always considered myself a feminist, and I grew up surrounded by powerful, ambitious women, but this book really changed the way I view things. Feminism is the idea that men and women are equal; it is not the claim that women are better than men. Nowadays feminism is often equated with a radical movement of "man-haters," but that's just not correct.
We need feminism to stand up for our fellow girls, not just in the United States but around the world. There are still a handful of states where rapists retain parental rights and can try to gain custody of the child. There are people who work, and have worked, in the United States government who don't understand how women's bodies work and try to deny women basic rights.
Women in Saudi Arabia were only recently allowed to drive cars. There are still child brides in developing countries, and girls are still denied the same education as boys. Reproductive rights, gender-based violence, and female genital mutilation are just a few of the other issues many women have to deal with.
Intersectionality is extremely important in feminism: it is the idea that feminism must take into account the overlapping experiences of marginalized groups and minorities, not just those of the most privileged women. If your feminism is only for your friends and people who look like you, you should take a step back and check your privilege.
A quote often attributed to Benjamin Franklin says, "Justice will not be served until those who are unaffected are as outraged as those who are." I think it fits this topic perfectly because it shows that feminism matters to everyone, of every gender. Men should be equally outraged, not just because they have a sister or a mother, but because inequality affects them as well.