I want to preface this by stating a few key facts. I am a Christian, and I believe what's in the Bible, no question about it. I do believe modesty is important, and I also believe that God intends men and women to fill certain roles and not others. So now I pose the question: why aren't more Christians identifying as feminists?
It's not all about women. Though certainly some women distort the idea of feminism, it has nothing to do with bringing men down; it's about lifting women up. We acknowledge that men are raped. Men have higher suicide rates than women. These are problems, too. Any decent person calling themselves a feminist (or a Christian) will tell you that. For this article, though, I'm going to focus on the female side of things.
Women are raped and then get asked questions like "How much did you have to drink? What were you wearing? How many men have you been with in the last month?" All of which are irrelevant, in case you didn't know. You know what causes rape? Rapists. So the next time you tell your daughter to watch her drink at a bar, turn to your son and ask him if he knows what consent means. The Bible tells women to adorn themselves modestly, yes. But Jesus also said to tear out your right eye if it causes you to stumble.
We live in a world where women go places in packs for safety and get harassed by strange men on the street. We elected a president who was accused of sexual assault and was caught on video making degrading comments about women, for crying out loud. Google "sexism on national news" and you'll find a plethora of insulting, inappropriate comments made by ignorant men on air. Brock Turner served only three months of his six-month jail sentence after ruining a young woman's life. And we still have a wage gap.
Men and women both contribute to this skewed culture we live in. If you don't consider yourself a feminist, take a close look at what feminism really is. Stand up for women who are being harassed, and help stomp out rape culture.