Odds are, when you think of a feminist, you picture what society has told you to picture: a man-hating, self-loving, bra-burning, radical woman. And truthfully, that's what I've always pictured too. I think, especially here in the US, a lot of women get the wrong idea of what feminism is truly about. They see a group of incredibly liberal women who are out to tear down the "old white man" worldview by "liberating" women. This perception pushes young adult women like myself to pick a side based on that behavior. They either see these women and run as far as they can in the opposite direction, or they feel they can relate and proclaim themselves feminists.
You see, I was very much the latter, and I think that's a real problem, considering why feminism is still needed in the world today. Notice that I said "the world" and not necessarily the US. Sure, there are still sexist people here, and women may still be a step behind men in our country, but the places where feminism is truly needed lie outside America: in developing countries where women aren't allowed to attend school or hold high positions and are treated as second-class citizens. That's a reality many American feminists can't even fathom, because it isn't a reality here. The true purpose of feminism stems from conditions like those, so since we don't experience them, I think feminism in the US has gone searching for new roads.
In my opinion, here in the United States, women are now essentially asking to be treated as above men, which was never the idea of feminism in the least. Women ask for men to be held accountable for their actions, but when women are held accountable, it is called sexist. Why is that okay? Understand that this is coming from a young, middle-class, heterosexual woman, not from the middle-aged white man demographic that everyone associates with anti-feminism. But I truly believe feminism has neared the end of its course in the United States. It has taken a path the early feminists never intended. Early feminists, I believe, wanted women to be paid as equals, hired as equals, treated as equals in society, and even punished as equals. The goal of feminism was never to put men below women, or to hate men. The goal was for women to be treated as valued members of society, and here in the US, I believe we have done that. So instead of focusing our feminist energy on Freeing The Nipple, or insisting that men who plead not guilty to accusations of rape shouldn't even be heard, how about we focus on extending the rights we are blessed with to women in other countries who can't speak unless spoken to, can't attend school, or can't hold any role other than housewife? That is where feminism is truly needed.