I think it's fair to say we all have our guilty pleasures in life.
Those romantic comedies, steamy novels, or dramatic TV shows.
We love seeing the super attractive guy with the super attractive girl.
But I think what society has picked up on, and what continues to happen, is that women are being overly sexualized.
Whether it be a commercial, a show, a movie, a picture, or an advertisement, sex will sell just about anything.
And I know this happens to men as well. Sure, we don't mind looking, and we might not even notice any real problem, but it is a problem.
We live in a society where men and women want to be heard. They want their voices out on platforms for the world to hear; they want change and action.
We want to end major problems like human trafficking, sexual predators, and rape culture.
But let's put up a billboard of a half-dressed woman for children to see.
Let's make this simple.
No, a woman, or a man for that matter, wearing very little clothing does not mean they are open or willing to engage with anyone sexually, and it never excuses rape, catcalling, or other sexual comments.
But listen, I am a woman, and if there is a man on the beach with a six-pack, it might catch my eye.
Just as a woman with a very revealing top may get a couple of glances. But this doesn't have to be made sexual or overly dramatic. We can notice and still control our thoughts.
But here's the thing: if we continue to push sex, it really doesn't help our case.
As a woman, I know that if I walk into a job interview, I'm going to look my best, because my goal is to show I am a sophisticated individual worth hiring. What I wear sends a message, just as walking in wearing sweatpants would.
I know I can speak for all men and women when I say we all desire respect, as we should.
We don't want unwanted attention.
But there are a lot of other things we don't want either: as I mentioned before, predators, sex traffickers, and rapists.
I believe clothing, or the lack thereof, does not lead to such things, but rather that things like pornography, graphic movies, shows, and magazines can "encourage" them.
NOT intentionally.
But think about it, really.
They create a fantasy, and a fantasy, by definition, isn't real. But when we continue to promote these things, the fantasy becomes real for some people.
We as women want to be heard, we want respect, and we want equality, but I'm telling you, we are not going to get that in a society that banks on sex, or by sexually exploiting ourselves.
Because:
1. WE SHOULD NEVER HAVE TO SELL OURSELVES LIKE THAT TO ANYONE.
2. I'm pretty sure people will still buy the product without the half-dressed individuals if it's marketed well.
I think if we want change, then we need to fix the issues staring right at us.