10 Lies the World Tells You About Life After College

9. "Your Major Matters."

Most of the workforce won't end up in careers that have anything to do with their major. In fact, a recent study found that half of recent grads take jobs that don't even require a degree. Your undergraduate major likely won't be the key to a lucrative professional life, and even fields with tangible job opportunities often require graduate work for the best positions. You can let this news depress you, or you can let it liberate you.

Not to sound like a filthy hippie, but perhaps you should choose your college major based on how you want to view the world, not how you'd like your bank account to look. When you're tending bar with a business major on one side and a pre-law grad on the other, at least you'll be able to quote Shakespeare. While they're drinking too much of the good stuff on the job, you'll remember that the Bard once said, "[Alcohol] provokes the desire but takes away the performance." Now that's worth something.
