There is so much to be afraid of in this world; it is a scary place, especially with what America has come to. What happened to the American Dream? So much has changed over the years that it makes me think many Americans have truthfully forgotten what the American Dream is and what it is supposed to be.
There is not much freedom left in this world, and not much safety either. Things have reached the point where no one can go to a concert, a movie, or a school event without feeling fear about what may happen. So many people want the American Dream back: the ideals of freedom, equality, and opportunity held by every American. Yet the only thing many Americans seem to care about is their possessions, which is one of many reasons I think the American Dream might be dead. Or has it just shifted?
Not many people even hold those values anymore. America has shifted.
Originally, the American Dream meant that every American had the chance to succeed through hard work. The Dream was once alive, with Americans going after what they believed in; everyone had a chance at achieving success.
Nowadays, the American Dream has a completely different meaning. It is about focusing on what matters: perhaps creating a better life for yourself and spending time with family and friends.
With that being said, is there even a way to get the American Dream back? Or is it dead?