Doesn't it feel like older generations have banded together just to let you know how terrible life is going to be after college? You hear it from the dads in striped polos and khakis who stroll campus during homecoming, telling tales of their collegiate glory days to anyone who'll listen. You hear it from the doom-and-gloom newscasters who run features twice a month on the bleak job market for recent grads. You hear it from professors who snidely remark that there are no bonus points for participation in the "real world."

It seems that everyone wants a chance to let you know how good you have it, and just how terrible life is going to be as soon as you exchange your cap and gown for business casual. The team at City Guide may not know everything, but we do know what it's like to be in our 20s, because, well, we are. Here are 10 Lies the World Tells You About Life After College.
