
03-20-2011, 05:14 PM
No, going to college in this day and age absolutely does NOT guarantee you a job; it can even disqualify you from some jobs because you're "overqualified."
I think work experience is more important than just going to college -- meaning, go to college, but try to find internships (preferably paid) or a job in the field you're going for. It looks delicious on a resume, and the company you're working for might even hire you full-time after you graduate.
If you get a degree in anthropology, for example, but have no field work to go with it, you can't find a job at all; you straight up have to go to grad school, and grad school on top of the student loans you've already racked up gets REALLY expensive.
So ideally, I think you want a combination of both for most fields. In some fields you can't get work experience before graduation (or residency), like the medical professions, and others aren't really about work experience at all but about knowing the right people (like most art professions). But for many people, work experience matters more than college. Yes, college looks good, but more often than not employers hire people who already know how to do the job and have experience doing it.
----------
As for "paying the bills" or "enjoying my career" I would rather pay my bills.
If I can pay my bills and save some money every month, I can buy things that I'll enjoy. But never having any money is very limiting...=___=" And generally makes me miserable when I have to choose bills or food...