A great majority of my life has been spent at church. I went to the same church from the time I was born until I graduated and moved to a new town. I also went to the Christian school from K-4 through 12th grade, so I have watched the Christian community up close for many years. Lately, however, I feel the need to distance myself from the community I loved for so long. I am not ashamed of being a Christian, but I am ashamed of those who call themselves Christians.
Since when did Christianity become hatred? I watched kids get ridiculed for being gay, and had older members of the church explain to me that it was a "choice." Well, last time I checked, Jesus ridiculed no one. Who would choose to be gay if it meant being bullied and disowned by their friends and family? The answer is no one.
Since when did Christianity start supporting racism? Recently many players in the NFL took a knee during the National Anthem, and many of the Christians I know are up in arms about it. Why? It was a peaceful protest. The people who knelt are not against the flag or against this country. They are just bringing attention to the fact that inequality still exists. I have also heard Christians mocking groups like Black Lives Matter, or saying things like "All Lives Matter." This, too, comes from a lack of understanding. The group is called Black Lives Matter not because its members think theirs are the only lives that matter, but because they are the ones receiving unjust treatment. It's the same reason the fight for women's rights is called feminism and not equalism.
Since when did being a Christian mean ignoring science? Why do so many Christians believe that Christianity and science don't go hand in hand? I did not learn that natural selection and evolution were real until I was a sophomore in high school. No one taught us that faith and science could exist in perfect harmony.
Why don't Christians believe in climate change? The earth's climate fluctuates naturally, but study after study shows that we are adding to that fluctuation. Even if you don't believe in climate change, why would you want to continue to destroy the world God made? Why are Christians among the least supportive of greener movements and of doing what they can to live in a better, cleaner world?
Using religion as an excuse to be ignorant about these issues is not okay; it never has been. You should be loving those around you regardless of how they identify. You should be educating yourselves on these issues and on what we can do to solve them. You should be taking care of the earth and all the organisms in it. These are the things you should be doing as Christians, because if you don't, you're going to lose more and more people. As an intelligent human being, I am finding it harder and harder to identify as a Christian, because I don't want to be associated with the ignorance that seems to come with it.