There's a disturbing trend going on in pop culture today. Anywhere you look, there's some new TV show, movie, or book about a person with cancer. I bet you're familiar with "The Fault in Our Stars," "Me and Earl and the Dying Girl," or "The Big C."
All about cancer. All romanticizing cancer.
Someone told me once that they liked "The Fault in Our Stars" because it's more than a love story; it's a book about what it's like to have cancer.
Is it? Is it, really?
I'm not saying I didn't enjoy the movie. I'm not saying it's a bad thing to write about cancer or make a movie about cancer. It's real and it happens.
But don't make the disease a gimmick.
It freaks me out that Hollywood is using a tragedy that is all too real for many families as a way to make money. That seems pretty wrong to me.
Cancer isn't a fad.
It's not a trend.
When "Twilight" first came out, everything was about vampires for a while.
Then zombies became the next big topic.
Then dystopian novels where the main character was a girl who had to singlehandedly save the world.
Now, cancer?
It just frustrates me that Hollywood is using it as a canvas for another angsty, hormone-fueled teen romance. I don't think we should make entertainment out of this. It would be one thing if any of these books, movies, or TV shows were at all realistic, but they are not. They don't show the ugly, uncomfortable parts of the disease. They gloss over them like afterthoughts, as if the whole point of the character even having the disease is to add to the 'Romeo and Juliet' effect, not to show how the disease, in all its facets, impacts families and relationships.
I think it's disrespectful to cancer patients and their families.
I'm just saying, don't simplify cancer by using it as a way to make preteen girls cry.