Ignorance is bliss. Or is it? As a society, we tend to ignore and deny things that make us uncomfortable. We sweep things under the rug because we don't want to deal with them or handle the situation. Or, worse, we excuse certain actions and behaviors altogether. How have we come to this? Why are some of these things still socially acceptable? How can we tackle them?
The idea for this article came to me through things I have witnessed throughout my life. I grew up knowing I was a girl, did girl things, and so on. It never occurred to me that as I grew older, I would be treated differently by society and the people around me. It never occurred to me that boys could easily get away with cat-calling, or that I would have to police my wardrobe for fear that I might "distract" somebody. Girls were supposed to be perfect and pretty; they had to fit this mold or society disapproved of them.
As a society, we have constructed gender norms. Colors, advertising, even different foods seem to be marketed based on gender. We all do gender somehow; we fit the mold in some ways and break it in others. If this separates us as a society, why can't we just eliminate it? We have normalized this blue-and-pink world where men will always be dominant over women. It has gone way too far; we have a significant rape culture and a persistent gender gap in many areas.
To start off, why have we not achieved equal pay? Women have proven themselves to be just as capable as men in the workplace. No, their emotions and biological "hindrances" do not stunt their performance at work. Women have to assume masculine qualities just to be considered for a job. Some workplaces would sooner hire a man over a woman even if the two had the exact same résumé. Why is that? Why have women not been able to escape the stereotype that they are weak and incapable of doing a "man's work"?
I ask again, why do we normalize all of this?
Most importantly, I'd like to touch on rape culture. Women are stuck behind so many walls of oppression when it comes to sexuality and sexual expression. You enjoy sex? You're a slut. You're waiting for the right time? You're a prude. There seems to be no way around any of it. We will sooner defend a man for heinous sexual crimes than acknowledge the victim. We ask the victim what she was wearing, where she was going, why she was there, as if it were her fault that she was raped or sexually harassed. Do we honestly think women are asking to be raped because of the length of their skirt? I can speak for all women in saying that no, none of us EVER want negative sexual attention.
Why do we have to teach our girls to be safe? Why can we not teach men not to rape, to understand consent, and to respect women? How hard is that? If we implemented that from a young age, we would slowly dismantle this horrid rape-culture nightmare we are living in. We need to empower women to know that their bodies are their own and that it is their choice to do whatever they please with them. Why hasn't society come to this point yet? Why are we incapable of that?