Back in the day, journalists were well respected and had a tremendous amount of dignity. Then the 1960s arrived and began to revolutionize the media. Events such as the Vietnam War, the JFK assassination, and the Bay of Pigs made it so journalists had to “cover up” what was going on with the government.
Before this time, the government was supposedly telling the truth to the American people, though we do not know for sure. Once the 1960s hit, skepticism toward the government suddenly took hold. I have noticed that journalists are either told not to tell the truth or told not to tell the whole story.
Yet Americans wonder why some people are not educated enough to make their own decisions. Maybe if we weren’t constantly fed lies, we would know what’s going on. Journalists should never have to “cover up” the government’s mistakes; they should do their job, which is to inform people. Journalists also have to stop putting their personal opinions in their writing. It is not professional.
I now understand why 75 percent of Americans feel the government is corrupt. How can we put trust in something that constantly lies to us and never tells the whole story? I want to be a journalist so I can single-handedly bring some integrity and dignity back to writing. People need to know the truth, not be fed lies. I promise to bring only the facts and let people decide what they want to believe.
It sickens me to see our media tell people the wrong things. Maybe we would all get along better if we knew what was actually happening. The media and the government need a complete restructuring, and I hope that happens soon.