Isaac Cohen

Why are you in college? According to a Gallup poll, most students report that their primary reason for attending college is to improve their job prospects. But what makes college degrees lucrative? Why would employers pay more to hire people with fancy sheets of paper? The conventional explanation is that in school, people learn skills they later apply on the job. Economists call this the “human capital” model of education. Employers pay more to hire people with degrees because those people perform better on the job, thanks to the knowledge and skills they acquired in school. In this model, the true purpose of college is to learn economically valuable skills.

The human capital model is the implicit justification for all the pain inflicted on students at school. Throughout our time in college, we’re told to write essays, study for exams and complete assignments, often on material we’re not interested in. Students spend sleepless nights sitting in front of their MacBooks, scrolling through notes. The explicit goal in their minds? Pass the exam. We’re told we need to learn the material for our own good. Just as a child cries while receiving a vaccine, a student may cry over calculus homework. But in the end, it’s better for them both. At its finest, the model is invoked to justify the insidious Orwellian idea that helping your classmates is “unauthorized collaboration.” Helping people doesn’t actually help them, professors say, because then they’ll never learn to do the work on their own. But is the model true? Are we really in college to learn?

Although the human capital model has many fans, particularly among faculty who are fascinated with their subjects and eager to push policies that give them a captive audience upon whom to thrust their nerdiest interests, it struggles to explain the everyday realities of college. As you’ve probably noticed, much of the curriculum is esoteric and unrelated to actual jobs. People complain about general education requirements, but the problem persists in in-major courses as well. Even computer science, supposedly one of the most applied majors, features requirements like automata theory, where students spend a semester proving abstract theorems that few of them care about and none of them will ever use. The most applied part of computer science, at least for the majority of students, is “programming for the web,” a random elective students can take in their final year. If employers pay more to hire college graduates because we know more, why is our curriculum so ill-adapted to the job market?

The human capital model also cannot explain why so few students actually want to learn the material, despite being told it’s good for them. As students, we almost never talk about learning. Instead, we talk about credits, requirements, easy A’s, how many points each exam is worth and what DegreeWorks says we still need before we can graduate. Even professors spend a considerable portion of class time on this stuff. If college is about learning, who needs exams, grades or credits? And have you ever cheated? Most people have, at least once. But what are cheaters trying to accomplish? And why does the University try to stop them? If cheaters only hurt themselves by depriving themselves of learning, why is cheating a social problem?

Finally, the human capital model’s contention that knowledge learned in school gets applied on the job requires that people remember what they learned in school. But even this simple prediction doesn’t hold. Most people take biology and history in high school. Do you recall what DNA polymerase does or when Jamestown was founded? And you’re an intelligent student at Binghamton University. Surveys of the general American population find that most Americans don’t know electrons are smaller than atoms. Regardless of why students forget most of what they learn, this forgotten knowledge can’t be what’s driving their increased salaries.

So really, now, why are we in college? You knew the answer all along: we’re here to get a degree. We all want degrees because they make it easier to get high-paying jobs. Employers hire people with degrees because having a degree sends a signal about you. It says you are smart, normal, competent and conformist: you go to college like everyone else, you can master complex topics for brief periods of time, you can shut up and listen even to a boring lecture and you have the patience and time-management skills to follow through on long-term projects. Now do you understand why most of the curriculum is irrelevant, why people cheat, why cheating hurts the University, why exams matter so much, why you supposedly shouldn’t help your classmates and why retention of information is a nonissue?

This explanation of the power of a college degree, which economists call the “signaling” model, has an important implication: it’s not the degree itself that is valuable, but your level of education relative to others in the job market. No one needed an associate’s degree until everyone started getting one. Now you need a master’s degree to stand out from the crowd of bachelor’s degrees. In other words, we’re all engaged in a nasty zero-sum competition with no winners. We spend time and money going to college, forgo fun activities to do useless assignments, torture ourselves with all-nighters to cram before finals and inflict punishments on cheaters or people who can’t keep up, all in the service of out-signaling our fellow students. If only we put an end to degree mills, we’d live in a happy, cooperative world where we could all learn out of curiosity and have fun together purely for the joy of it.

Isaac Cohen is a senior majoring in computer science.