“Life Will Take Care of the Rest”

A few months ago, Nathan Heller wrote a fantastic review of William Deresiewicz’s book Excellent Sheep, which argues that elite colleges are bad for the soul.

The close of Nathan’s piece makes an important point with a light touch:

Beneath [Deresiewicz’s] fury at the failings of higher education is an almost religious belief in its potential. The stakes are, in truth, lower than he thinks. A college education, even a poor one, isn’t the final straightaway of self-realization, after all. It is the starting gate. College seniors leave with plans for law careers and then, a J.D. later, find their bliss as graphic artists. Financiers emerge as novelists. Avowed actors thrive in corporate life. And some alumni, maybe more than some, never get there; they work, marry, bear kids, buy homes, and feel that their true lives have somehow passed them by.

Would better college years have made those people more fulfilled? Even in the era of fast tracks and credentialism, the psychic mechanisms of an education are mysterious. Let teachers like Deresiewicz believe. For a couple of hours every week, students are theirs in the classroom to challenge and entrance. Then the clock strikes, and the kids flock back into the madness of their lives. Did the new material reach them? Will the lesson be washed from their minds? Who knows. They heard it. Life will take care of the rest.

The College Premium

The everyone-should-attend-college camp often cites the “college premium” — people with college degrees make a lot more money in life than those who do not.

In his recent EconTalk interview, Bryan Caplan adds interesting nuance to this claim. The most important takeaways, in my own words:

  • An average college grad makes 83% more money in an average year than an average high school grad.
  • Folks with some college (who don’t graduate) make on average 10% more than high school grads.
  • Why is there such a premium? The usual story points to the value of the college education itself. Bryan sooner points to the kind of people who attend college.
  • There’s a big difference at the starting point of college. Those who sign up for college already have, on average, higher IQs and an accumulation of other initial advantages compared with those who choose not to sign up. It stands to reason that those inclined to sign up were likely to succeed either way; the premium has less to do with what they actually learn in school.
  • The 5-year graduation rate for a 4-year college degree (i.e., giving someone five years to graduate) is roughly 55%. In other words, almost half of people who start college do not finish.
  • It’s pretty predictable who will drop out: those with weak academic ability in high school will probably not graduate. Caplan: “For students in the bottom quartile of academic ability [in high school], paying a year’s college tuition is almost as foolish as buying 10,000 lottery tickets.” I previously blogged about this phenomenon in my post Who Should and Should Not Be Going to College.

Bottom Line: The earnings premium college grads enjoy is complicated and may not have much to do with college per se. And fewer people should be attending college — especially those who struggle in high school.
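To make the cited percentages concrete, here is a back-of-the-envelope sketch in Python. The $40,000 baseline salary is a made-up round number for illustration, not a figure from the interview:

```python
# Back-of-the-envelope illustration of the earnings premiums cited above.
# The $40,000 baseline is a hypothetical round number, not from the interview.
hs_grad = 40_000                      # assumed avg. high-school-grad salary

college_grad = hs_grad * 1.83         # degree holders: 83% premium
some_college = hs_grad * 1.10         # attended but didn't finish: 10% premium

print(f"College grad: ${college_grad:,.0f}")   # ~$73,200
print(f"Some college: ${some_college:,.0f}")   # ~$44,000
```

Note the gap: most of the premium accrues only to those who finish, which is why the dropout statistics above matter so much.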

Teamwork Wins in Real World, But School Teaches the Opposite

Imagine if your high school diploma listed not just your name in big italic font but also the names of the specific classmates you allied with to achieve academic success.

Unfortunately, it’s a far-out scenario, because we have an education system that rewards individualistic achievement and teacher-pleasing, not teambuilding and broad collaboration—essential skills in almost every professional field.

In school, there’s only one relationship that matters: the one between you and the almighty teacher.

Accordingly, it pays to be a teacher’s pet. Raise your hand constantly in class. Ask for extra work. Tell the other students to quiet down when the teacher enters the classroom. These things might cause classmates to sneer behind your back, but your classmates don’t decide whether you get an A+. You might have no one to sit with at lunch, but you’ll be laughing all the way to the valedictorian’s seat at graduation.

In the real world, however, if your officemates sneer at you behind your back, you’ll be falling all the way to the bottom of the company org chart. At work, you only get ahead by completing projects that require more than one person; the most important projects are always group efforts. Teamwork rules. You produce impressive accomplishments by collaborating with others, by forming alliances, by mastering the politics of the office.

In Ender’s Game (one of my favorite books growing up), there’s a bracing example of the military commanders testing Ender’s ability to be a leader. When Ender arrives at Battle School, the head commander praises him relentlessly in front of his peers. By exalting Ender as a true genius out in the open, the commander intentionally makes Ender’s ultra-competitive peers resent the special attention, thereby making it more difficult for Ender to form alliances. The commanders test Ender in a way that school never does: Can he negotiate rivalries and partner with his peers to build a team and accomplish something great?

Now, while it’s true that at work you usually have a single manager who determines your bonus or promotion, that manager’s perception of you is shaped by many sources.

This is not the case at school; there’s little opportunity for a fellow student to sabotage your reputation with your teacher. Did you ever sit around with your teacher in high school and BS about how your classmates are doing academically?

At work, though, this happens all the time. Those you work with whisper quietly to your boss.

Boss: “By the way, how’s it going on that project for the big client?”

Your colleague: “Oh it’s going fine. Yeah, you know, [Your Name]’s working hard, though I’m not sure he’s really a natural at this kind of work. The client has told me it can be hard to work with him at times. But it’s not a big deal, and overall, things are going well, thanks for asking.”

When I meet with really successful professionals, they frequently reflect on this disconnect: in school they thought it was an individual game, in life they realize it’s a team game, and team games require skills they never developed in school.

For example, I had dinner the other week with an accomplished doctor in his 60s. He told me that in the first half of his career he thought what mattered for standing out in his field was possessing superior knowledge. If he memorized more than the next guy, he thought, he’d get ahead. Today, he realizes what matters is his ability to persuade others: to convince other researchers to partner with him on projects, to convince hospitals to adopt his ideas, to convince students in residency to follow his leadership, and so on.

And it turns out, memorizing organic chemistry formulas was a whole lot easier than learning to read a room, interpret human motivations, and build teams that will follow you.

When reflecting on how the education system does or does not prepare students, we should pay special attention not just to areas where school under-prepares students for the real world (more statistics! more engineering!), but to areas where school actively misprepares them: where an entire framework of “how to be successful” has to be unlearned and replaced by something else. These are the most consequential breakage points in formal schooling.

[This post originally appeared on LinkedIn.]

Disrupting the Diploma

I worked with Reid Hoffman (and Greg Beato) on a long essay titled Disrupting the Diploma: How updating the communication device known as a “diploma” will help students acquire the right skills and help companies hire the right talent. We take on the under-discussed topic of credentialing and how credentialing as a platform will improve higher education.

Excerpt:

In the same way that trailblazers like Coursera and Udacity are making instruction faster, cheaper, and more effective, we should make certification faster, cheaper, and more effective too.

To do this, we need to apply new technologies to the primary tool of traditional certification, the diploma. We need to take what now exists as a dumb, static document and turn it into a richer, updateable, more connected record of a person’s skills, expertise, and experience. And then we need to take that record and make it part of a fully networked certification platform.

Once we make this leap, certification can play a more active role in helping the higher education system clearly convey to students what skills and competencies they should pursue if their primary objective is to optimize their economic futures.

And:

Imagine an online document that’s iterative like a LinkedIn profile (and might even be part of the LinkedIn profile), but is administered by some master service that verifies the authenticity of its components. While you’d be the creator and primary keeper of this profile, you wouldn’t actually be able to add certifications yourself. Instead, this master service would do so, verifying information with the certification issuers, at your request, after you successfully completed a given curriculum.

Over time, this dynamic, networked diploma will contain an increasing number of icons or badges symbolizing specific certifications. It could also link to transcripts, test scores, and work examples from these curricula, and even evaluations from instructors, classmates, internship supervisors, and others who have interacted with you in your educational pursuits.

Ultimately the various certificates you earn could be bundled into higher-value certifications. If you earn five certificates in the realm of computer science, you might receive an icon or badge that symbolizes this higher level of experience and expertise. In this way, you could eventually assemble a portfolio reflecting a breadth of experience similar to what you get when you pursue a traditional four-year degree.

For students, the more modularized approach to instruction embodied in such diplomas would have immediate benefits. Traditional four-year degrees maximize tuition costs, because they only award certification for lengthy courses of study that require substantial capital investments. A more modularized system would move beyond this all-or-nothing approach. Instead of taking general education classes for two years and then dropping out and ending up with little to show for their efforts except two years of debt, students could make smaller investments — in money and time — to acquire specific credentials.
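The mechanics described in this excerpt can be sketched as a toy data model. Everything here — the class names (`MasterService`, `Credential`, `DiplomaProfile`), the issuer trust check, and the five-certificate threshold for a badge — is an illustrative assumption; the essay does not specify an implementation:

```python
# Toy sketch of the "networked diploma": the profile owner cannot add
# certifications directly; only a master service that verifies issuers can.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Credential:
    issuer: str   # e.g. a university or a MOOC provider
    name: str
    topic: str

@dataclass
class DiplomaProfile:
    owner: str
    credentials: list = field(default_factory=list)  # appended only by the service

class MasterService:
    """Verifies a credential's issuer before adding it to a profile."""

    def __init__(self, trusted_issuers):
        self.trusted_issuers = set(trusted_issuers)

    def add_credential(self, profile, cred):
        # Reject anything not vouched for by a known certification issuer.
        if cred.issuer not in self.trusted_issuers:
            raise ValueError(f"unverified issuer: {cred.issuer}")
        profile.credentials.append(cred)

    def badges(self, profile, threshold=5):
        # Bundle credentials by topic into higher-level badges, mirroring
        # the "five certificates -> one badge" example in the excerpt.
        counts = {}
        for c in profile.credentials:
            counts[c.topic] = counts.get(c.topic, 0) + 1
        return [topic for topic, n in counts.items() if n >= threshold]
```

For instance, after the service verifies and adds five computer-science credentials to a profile, `badges()` would report a "computer science" badge, while a credential from an unrecognized issuer would be rejected outright.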

You Never Truly Leave High School

Jennifer Senior wrote a great piece in New York magazine a couple of months ago titled “Why You Truly Never Leave High School.” It’s about the formative and lasting nature of the American high school experience. Excerpts below.

Our brain is primed to remember what happens during adolescence:

But for most of us adults, the adolescent years occupy a privileged place in our memories, which to some degree is even quantifiable: Give a grown adult a series of random prompts and cues, and odds are he or she will recall a disproportionate number of memories from adolescence. This phenomenon even has a name—the “reminiscence bump”—and it’s been found over and over in large population samples, with most studies suggesting that memories from the ages of 15 to 25 are most vividly retained.

On the adhesiveness of our self-image from those days:

Our self-image from those years, in other words, is especially adhesive. So, too, are our preferences. “There’s no reason why, at the age of 60, I should still be listening to the Allman Brothers,” Steinberg says. “Yet no matter how old you are, the music you listen to for the rest of your life is probably what you listened to when you were an adolescent.” Only extremely recent advances in neuroscience have begun to help explain why.

It turns out that just before adolescence, the prefrontal cortex—the part of the brain that governs our ability to reason, grasp abstractions, control impulses, and self-reflect—undergoes a huge flurry of activity, giving young adults the intellectual capacity to form an identity, to develop the notion of a self. Any cultural stimuli we are exposed to during puberty can, therefore, make more of an impression, because we’re now perceiving them discerningly and metacognitively as things to sweep into our self-concepts or reject (I am the kind of person who likes the Allman Brothers). “During times when your identity is in transition,” says Steinberg, “it’s possible you store memories better than you do in times of stability.”

An adolescent subculture is a new phenomenon; teens don’t spend much time with adults anymore:

Until the Great Depression, the majority of American adolescents didn’t even graduate from high school. Once kids hit their teen years, they did a variety of things: farmed, helped run the home, earned a regular wage. Before the banning of child labor, they worked in factories and textile mills and mines. All were different roads to adulthood; many were undesirable, if not outright Dickensian. But these disparate paths did arguably have one virtue in common: They placed adolescent children alongside adults. They were not sequestered as they matured. Now teens live in a biosphere of their own. In their recent book Escaping the Endless Adolescence, psychologists Joseph and Claudia Worrell Allen note that teenagers today spend just 16 hours per week interacting with adults and 60 with their cohort. One century ago, it was almost exactly the reverse.

Something happens when children spend so much time apart from adult company. They start to generate a culture with independent values and priorities.

Guilt can be useful, whereas shame is not:

The academic interest in shame and other emotions of self-consciousness (guilt, embarrassment) is relatively recent. It’s part of a broader effort on the part of psychologists to think systematically about resilience—which emotions serve us well in the long run, which ones hobble and shrink us. Those who’ve spent a lot of time thinking about guilt, for example, have come to the surprising conclusion that it’s pretty useful and adaptive, because it tends to center on a specific event (I cannot believe I did that) and is therefore narrowly focused enough to be constructive (I will apologize, and I will not do that again).

Shame, on the other hand, is a much more global, crippling sensation. Those who feel it aren’t energized by it but isolated. They feel unworthy of acceptance and fellowship; they labor under the impression that their awfulness is something to hide. “And this incredibly painful feeling that you’re not lovable or worthy of belonging?” asks Brown. “You’re navigating that feeling every day in high school.”

We’re all in high school, all the time:

Today, we also live in an age when our reputation is at the mercy of people we barely know, just as it was back in high school, for the simple reason that we lead much more public, interconnected lives. The prospect of sudden humiliation once again trails us, now in the form of unflattering photographs of ourselves or unwanted gossip, virally reproduced. The whole world has become a box of interacting strangers.