The healthcare debate has dominated the media for the last several months. Since Trump took office, there has been no clear answer on what will replace Obamacare, but what we know so far suggests it's going to be bad.
The thing is, we shouldn't have to fight for healthcare; it should be a right, just as it is in every other developed country. We shouldn't have to beg for basic health coverage.
The newest Senate healthcare bill strips away the rules that make it mandatory for states to provide maternity services, emergency services, and mental health services, and it makes deep cuts to Medicaid.
The United States preaches freedom and the American Dream, but how are we supposed to reach the American Dream when we aren't even guaranteed healthcare? We trap our citizens in an endless cycle of debt, with no clear light at the end of the tunnel. To have good health insurance you need a high-paying job; to get a high-paying job you typically need more than one degree; and to earn those degrees you need to take out thousands of dollars in loans that you spend the rest of your life paying back. And what kind of healthcare are we supposed to have while we're in school for four to twelve years?
The United States is one of the only fully developed countries without a universal healthcare system. Why?
Because we believe that everyone should be self-sufficient, which is great and all, but then we throw up roadblocks at every step of the way. It's ignorant to think that anyone is going to succeed under the system we live in today.
It honestly shouldn't be up for debate: yes, we need to encourage people to be self-sufficient, but we also need to care for our citizens. I urge you to look at the new healthcare policies and not stay silent as people take away our basic right to healthcare.