The current cultural trend pushes hard for college. At this point, college isn't considered an addition to your education but a necessary next step. Is it really that important? Should society keep placing those four extra years on such a high pedestal?
I would say no, for several reasons.
Consider what four years of college give you. The diploma is obviously the big goal, but you also gain a significant amount of knowledge. College is also an environment where people are exposed to the "fun" parts of the world: drinking and drug use are often encouraged, even expected. Many students handle this fine, but others come out of college with addictions that shadow them for the rest of their lives. An environment that treats getting wasted several days in a row as amusing or normal can breed addiction. An overly dramatic claim, maybe, but something to consider.
College, of course, costs a large sum of money. Students often have to take out crushing loans to afford this "necessary" education, and those loans hang over their heads long after college has ended.
Many will argue that a college diploma is necessary to get a good job. I would respond that, yes, college is necessary for some fields, but not for all. In many cases, it is better to spend four years gaining job experience and working your way up the corporate ladder than to spend four years at college only to start at the same or a lower position. By working those four years, you also sidestep the student debt problem.
I will admit that the friendships and connections college gives you are incredibly helpful. Yet these do not become inaccessible once college ends. Most of our lives are lived outside a classroom, and we keep making friends and connections far beyond it. Learning to build connections without the helping hand of classes is an essential skill after college, and a person who spends four years working will understand it much better than one with four years of college under their belt.
Lastly, I'd like to make a case for high school. The majority of knowledge acquired in college is useless in an actual job; jobs require a specific set of skills that must be taught before the employee can be productive. It would be far better, then, for schools to teach practical skills directly. The better you are at acquiring new skills, the better you will do. In that case, the best scenario would be for high schools to give their students practical preparation for jobs, instead of basic knowledge so they can do well in college.
There's my rant against college. Personally, I'm glad I went, and I believe I gained a valuable set of skills from it. I just can't say the same for most people I know.