Teamwork Wins in the Real World, But School Teaches the Opposite

Imagine if your high school diploma listed not just your name in big italic font but also the names of the specific classmates you allied with to achieve academic success.

Unfortunately, it’s a far-out scenario, because we have an education system that rewards individualistic achievement and teacher-pleasing, not team-building and broad collaboration—essential skills in almost every professional field.

In school, there’s only one relationship that matters: the one between you and the almighty teacher.

Accordingly, it pays to be a teacher’s pet. Raise your hand constantly in class. Ask for extra work. Tell the other students to quiet down when the teacher enters the classroom. These things might cause classmates to sneer behind your back, but your classmates don’t decide whether you get an A+. You might have no one to sit with at lunch, but you’ll be laughing all the way to the valedictorian’s seat at graduation.

In the real world, however, if your officemates sneer at you behind your back, you’ll be falling all the way to the bottom of the company org chart. At work, you only get ahead by completing projects that require more than one person; the most important projects are always group efforts. Teamwork rules. You produce impressive accomplishments by collaborating with others, by forming alliances, by mastering the politics of the office.

In Ender’s Game (one of my favorite books growing up), there’s a bracing example of the military commanders testing Ender’s ability to be a leader. When Ender arrives at Battle School, the head commander praises him relentlessly in front of his peers. By exalting Ender as a true genius out in the open, the commander intentionally makes Ender’s ultra-competitive peers resent the special attention, thereby making it more difficult for Ender to form alliances. The commanders test Ender in a way that school never does: Can he negotiate rivalries and partner with his peers to build a team and accomplish something great?

Now, while it’s true that at work you usually have a single manager who determines your bonus or promotion, that manager’s perception of you is shaped by many sources.

This is not the case at school; there’s little opportunity for a fellow student to sabotage your reputation with your teacher. Did you ever sit around with your teacher in high school and BS about how your classmates were doing academically?

At work, though, this happens all the time. Those you work with whisper quietly to your boss.

Boss: “By the way, how’s it going on that project for the big client?”

Your colleague: “Oh it’s going fine. Yeah, you know, [Your Name]’s working hard, though I’m not sure he’s really a natural at this kind of work. The client has told me it can be hard to work with him at times. But it’s not a big deal, and overall, things are going well, thanks for asking.”

When I meet with really successful professionals, they frequently reflect on this disconnect: in school they thought it was an individual game, in life they realize it’s a team game, and team games require skills they never developed in school.

For example, I had dinner the other week with an accomplished doctor in his 60s. He told me that in the first half of his career he thought what mattered for standing out in his field was possessing superior knowledge. If he memorized more than the next guy, he thought, he’d get ahead. Today, he realizes what matters is his ability to persuade others—to convince other researchers to partner with him on projects, to convince hospitals to adopt his ideas, to convince students in residency to follow his leadership, etc.

And it turns out, memorizing organic chemistry formulas was a whole lot easier than learning to read a room, interpret human motivations, and build teams that will follow you.

When reflecting on how the education system does or does not prepare students, we should pay special attention not just to areas where school under-prepares students for the real world (more statistics! more engineering!), but where school actively misprepares. Where an entire framework of “how to be successful” has to be unlearned and replaced by something else. These are the most consequential breakage points in formal schooling.

[This post originally appeared on LinkedIn.]

Disrupting the Diploma

I worked with Reid Hoffman (and Greg Beato) on a long essay, “Disrupting the Diploma: How updating the communication device known as a ‘diploma’ will help students acquire the right skills and help companies hire the right talent.” We take on the under-discussed topic of credentialing, and how credentialing as a platform will improve higher education.

Excerpt:

In the same way that trailblazers like Coursera and Udacity are making instruction faster, cheaper, and more effective, we should also make certification faster, cheaper, and more effective too.

To do this, we need to apply new technologies to the primary tool of traditional certification, the diploma. We need to take what now exists as a dumb, static document and turn it into a richer, updateable, more connected record of a person’s skills, expertise, and experience. And then we need to take that record and make it part of a fully networked certification platform.

Once we make this leap, certification can play a more active role in helping the higher education system clearly convey to students what skills and competencies they should pursue if their primary objective is to optimize their economic futures.

And:

Imagine an online document that’s iterative like a LinkedIn profile (and might even be part of the LinkedIn profile), but is administered by some master service that verifies the authenticity of its components. While you’d be the creator and primary keeper of this profile, you wouldn’t actually be able to add certifications yourself. Instead, this master service would do so, verifying information with the certification issuers, at your request, after you successfully completed a given curriculum.

Over time, this dynamic, networked diploma will contain an increasing number of icons or badges symbolizing specific certifications. It could also link to transcripts, test scores, and work examples from these curricula, and even evaluations from instructors, classmates, internship supervisors, and others who have interacted with you in your educational pursuits.

Ultimately the various certificates you earn could be bundled into higher-value certifications. If you earn five certificates in the realm of computer science, you might receive an icon or badge that symbolizes this higher level of experience and expertise. In this way, you could eventually assemble portfolios that reflect a similar breadth of experiences that you get when you pursue a traditional four-year degree.

For students, the more modularized approach to instruction embodied in such diplomas would have immediate benefits. Traditional four-year degrees maximize tuition costs, because they only award certification for lengthy courses of study that require substantial capital investments. A more modularized system would move beyond this all-or-nothing approach. Instead of taking general education classes for two years and then dropping out and ending up with little to show for their efforts except two years of debt, students could make smaller investments — in money and time — to acquire specific credentials.
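The record-keeping rules in the excerpt above — only a verifying master service may append credentials, and five certificates in one field roll up into a higher-value badge — can be sketched as a toy data model. This is purely illustrative: the class names, fields, and methods below are my own assumptions, not any real platform’s API.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Credential:
    field_name: str         # e.g. "computer science"
    title: str              # e.g. "Intro to Algorithms"
    issuer: str             # the certification issuer that verified it
    verified: bool = False  # set by the master service, not the student

@dataclass
class DiplomaProfile:
    owner: str
    credentials: list = field(default_factory=list)

    def add_verified(self, cred: Credential):
        """Model the rule that the master service, not the student,
        appends certifications: unverified credentials are rejected."""
        if not cred.verified:
            raise ValueError("credential must be verified by the issuer")
        self.credentials.append(cred)

    def badges(self, threshold: int = 5):
        """Bundle credentials into higher-value badges: one badge per
        field with at least `threshold` verified certificates (the
        excerpt's five-certificate example)."""
        counts = {}
        for c in self.credentials:
            counts[c.field_name] = counts.get(c.field_name, 0) + 1
        return {f for f, n in counts.items() if n >= threshold}
```

A profile with five verified computer science certificates would then report a `{"computer science"}` badge set, while an attempt to self-add an unverified credential raises an error — the two properties the excerpt emphasizes.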

You Never Truly Leave High School

Jennifer Senior wrote a great piece in New York magazine a couple of months ago titled “Why You Never Truly Leave High School.” It’s about the formative and lasting nature of the American high school experience. Excerpts below.

Our brain is primed to remember what happens during adolescence:

But for most of us adults, the adolescent years occupy a privileged place in our memories, which to some degree is even quantifiable: Give a grown adult a series of random prompts and cues, and odds are he or she will recall a disproportionate number of memories from adolescence. This phenomenon even has a name—the “reminiscence bump”—and it’s been found over and over in large population samples, with most studies suggesting that memories from the ages of 15 to 25 are most vividly retained.

On the adhesiveness of our self-image from those days:

Our self-image from those years, in other words, is especially adhesive. So, too, are our preferences. “There’s no reason why, at the age of 60, I should still be listening to the Allman Brothers,” Steinberg says. “Yet no matter how old you are, the music you listen to for the rest of your life is probably what you listened to when you were an adolescent.” Only extremely recent advances in neuroscience have begun to help explain why.

It turns out that just before adolescence, the prefrontal cortex—the part of the brain that governs our ability to reason, grasp abstractions, control impulses, and self-reflect—undergoes a huge flurry of activity, giving young adults the intellectual capacity to form an identity, to develop the notion of a self. Any cultural stimuli we are exposed to during puberty can, therefore, make more of an impression, because we’re now perceiving them discerningly and metacognitively as things to sweep into our self-concepts or reject (I am the kind of person who likes the Allman Brothers). “During times when your identity is in transition,” says Steinberg, “it’s possible you store memories better than you do in times of stability.”

An adolescent subculture is a new phenomenon; teens don’t spend much time with adults anymore:

Until the Great Depression, the majority of American adolescents didn’t even graduate from high school. Once kids hit their teen years, they did a variety of things: farmed, helped run the home, earned a regular wage. Before the banning of child labor, they worked in factories and textile mills and mines. All were different roads to adulthood; many were undesirable, if not outright Dickensian. But these disparate paths did arguably have one virtue in common: They placed adolescent children alongside adults. They were not sequestered as they matured. Now teens live in a biosphere of their own. In their recent book Escaping the Endless Adolescence, psychologists Joseph and Claudia Worrell Allen note that teenagers today spend just 16 hours per week interacting with adults and 60 with their cohort. One century ago, it was almost exactly the reverse.

Something happens when children spend so much time apart from adult company. They start to generate a culture with independent values and priorities.

Guilt can be useful, whereas shame is not:

The academic interest in shame and other emotions of self-consciousness (guilt, embarrassment) is relatively recent. It’s part of a broader effort on the part of psychologists to think systematically about resilience—which emotions serve us well in the long run, which ones hobble and shrink us. Those who’ve spent a lot of time thinking about guilt, for example, have come to the surprising conclusion that it’s pretty useful and adaptive, because it tends to center on a specific event (I cannot believe I did that) and is therefore narrowly focused enough to be constructive (I will apologize, and I will not do that again).

Shame, on the other hand, is a much more global, crippling sensation. Those who feel it aren’t energized by it but isolated. They feel unworthy of acceptance and fellowship; they labor under the impression that their awfulness is something to hide. “And this incredibly painful feeling that you’re not lovable or worthy of belonging?” asks Brown. “You’re navigating that feeling every day in high school.”

We’re all in high school, all the time:

Today, we also live in an age when our reputation is at the mercy of people we barely know, just as it was back in high school, for the simple reason that we lead much more public, interconnected lives. The prospect of sudden humiliation once again trails us, now in the form of unflattering photographs of ourselves or unwanted gossip, virally reproduced. The whole world has become a box of interacting strangers.

When People Remind You of Your Younger Self…

…and when you have some issues with how your younger self developed, stuff happens.

Consider David Foster Wallace as a professor of creative writing at Illinois State. From the recent bio:

In his undergraduate class, Wallace was kind to the clueless but cruel to anyone with pretensions. When a student claimed that her sentences were “pretty,” he scribbled lines from her manuscript on the blackboard and challenged, “Which of you thinks this is pretty? Is this pretty? And this?” He continued to battle any young man who reminded him of his younger self. When one student wowed his classmates with a voicy, ironic short story, he took him outside the classroom and told him he had “never witnessed a collective dick-sucking like that before.” Wallace promised to prevent the “erection of an ego-machine” and strafed the student with criticism for the rest of the semester.

Bootcamp Model of Learning

The “bootcamp” model of learning is on the rise–a focused, intensive period of time dedicated to one thing.

I did a 10-day intensive meditation bootcamp. All meditation, all the time.

A friend recently completed a four-day rationality bootcamp — where you learn and think about the meaning of rationality and how to become more rational yourself.

Another friend recently completed a 10-week Ruby on Rails bootcamp — where you intensively study Ruby and the Rails framework and by the end are employable as a web developer.

Another friend recently completed the 10-week Singularity University program at the NASA Ames campus — where you think deeply about how to change the world and network with the like-minded.

In all cases, you stop what you’re doing, travel to a place, surround yourself with teachers and students, and go deep on the topic. The upside to learning this way is obvious. It takes hours to get into creative flow. Deliberate practice — which is a structured way to learn something — requires sustained attention. In an always-on and distractible culture, the rare act of deep immersion can produce differentiated insights. At my meditation retreat, the deep, sustained focus mattered because it was only after 80 hours of continuous meditating that I was able to achieve some of the more profound insights. Had we done two hours a day over many weeks, I don’t think I would have ever reached the heights I did.

The downsides to the bootcamp approach are perhaps less obvious. One downside for me is the loss of what you might call “social marination.” I rely on my network to teach me things via ongoing conversation about an idea bouncing around in my head. I might read a book about something, blog about it, then talk to someone in my network, get emails from readers on the topic, then read another book, then perhaps listen to a speaker at a conference, etc. Over a multi-month period of time, consciously and unconsciously, I begin to crystallize lessons or insights. (Another possible downside: cramming everything into one intensive burst forgoes the retention benefits of spaced repetition.)

Formal schooling is the anti-bootcamp model. You study many different topics at once–it’s a constant balancing act. As David Brooks once noted, to be an excellent student you have to train yourself to not let yourself become too interested or immersed in any one thing. I should note that the liberal arts school Colorado College is an exception. There, under the Block Plan, you take one class at a time in intensive three-and-a-half-week blocks. It’s interesting more schools haven’t tried that model.

Finally, the bootcamp model of learning doesn’t have to be a formal class at a campus. Ryan Holiday suggests a bootcamp model for reading books. Interested in the Civil War? Read 10 books on the topic in a row. Then pick a new topic. One topic at a time.

My closing questions: What skills lend themselves particularly well to learning via bootcamp? Should a model for investing in yourself include attending bootcamps of this sort?

Rahm Emanuel’s Ideas for Improving Higher Ed

From this profile of Rahm Emanuel in the Atlantic, there’s this excellent nugget on how he’s reforming Chicago’s community colleges:

In his 2006 book, The Plan, Rahm proposed that all Americans go to school for at least 14 years. Like Presidents Clinton and Obama, he has long seen community colleges as crucial to preparing the American workforce for global competition and to saving young people who would otherwise be condemned to poverty. But Chicago’s city colleges have become dysfunctional, with graduation rates a pathetic 7 percent. (Nationally, only 15 out of 35 community-college systems graduate more than 50 percent.) “We have 9.4 percent unemployment, 100,000 job openings, and I’m spending a couple hundred million dollars on job training,” Rahm tells me. He pauses to let the absurdity of this sink in. “So we are going to reorganize it.”

Rahm fired almost all the college presidents, hired replacements after a national search, and decreed that six of the seven city-run colleges would have a special concentration. Corporations pledging to hire graduates will have a big hand in designing and implementing curricula. “You’re not going for four years, and you’re not going for a Nobel Prize or a research breakthrough,” he says. “This is about dealing with the nursing shortage, the lab-tech shortage. Hotels and restaurants will take over the curriculum for culinary and hospitality training.” Already AAR, a company that has 600 job openings for welders and mechanics, is partnering with Olive-Harvey College; Northwestern Memorial Hospital is designing job training in health care for Malcolm X College. Equally important, the city colleges are overhauling their inadequate guidance services and contacting the 15,000 students most likely to drop out. As of March, all 120,000 students are being tracked, and those in danger of slipping through the cracks will be counseled. Thinking big, Rahm wants Chicago to be the national model for rescuing the middle class.

Makes a ton of sense. If a kid is in a community college trying to get trained for work in a restaurant or a hotel, why the heck wouldn’t the potential employers of those students have their hands all over the curricula? Hopefully Rahm’s model inspires imitators.

No Such Thing as Different Learning Styles?

A couple years ago I interviewed a few neuro-psychologists and learning experts to see if they could help me understand how I learn and process information. My thinking was this: kids with learning disabilities submit to a battery of cognitive tests that supposedly reveal useful information about the way they learn. Could I do the same and find out more conclusively whether I'm a visual learner or an auditory learner? The experts told me that it was unlikely the tests would help someone who is fine and high functioning. So I passed.

According to recent research, though, the very idea of personal "learning styles"–an idea at the center of many education philosophies–may be false. In fact, we may all learn pretty much the same way. Here's more:

Nearly all of the studies that purport to provide evidence for learning styles fail to satisfy key criteria for scientific validity. Any experiment designed to test the learning-styles hypothesis would need to classify learners into categories and then randomly assign the learners to use one of several different learning methods, and the participants would need to take the same test at the end of the experiment. If there is truth to the idea that learning styles and teaching styles should mesh, then learners with a given style, say visual-spatial, should learn better with instruction that meshes with that style. The authors found that of the very large number of studies claiming to support the learning-styles hypothesis, very few used this type of research design.  Of those that did, some provided evidence flatly contradictory to this meshing hypothesis, and the few findings in line with the meshing idea did not assess popular learning-style schemes.

No less than 71 different models of learning styles have been proposed over the years. Most have no doubt been created with students’ best interests in mind, and to create more suitable environments for learning. But psychological research has not found that people learn differently, at least not in the ways learning-styles proponents claim. Given the lack of scientific evidence, the authors argue that the currently widespread use of learning-style tests and teaching tools is a wasteful use of limited educational resources.

(hat tip: Josh Kaufman)
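The research design described in the excerpt boils down to testing a style × method interaction: if “meshing” were real, students matched to their preferred style should outscore mismatched students. A toy sketch of that crossover calculation — all numbers invented for illustration, not from any actual study:

```python
def meshing_interaction(cell_means):
    """Interaction effect in a 2x2 (learner style x teaching method) design.

    cell_means: dict mapping (style, method) -> mean test score.
    Returns the crossover term: clearly positive when style-matched
    cells outscore mismatched ones, near zero when there is no
    meshing effect.
    """
    matched = cell_means[("visual", "visual")] + cell_means[("verbal", "verbal")]
    mismatched = cell_means[("visual", "verbal")] + cell_means[("verbal", "visual")]
    return (matched - mismatched) / 2.0

# Invented cell means where everyone learns about equally well either
# way -- the pattern the reviewed studies tended to find.
no_meshing = {
    ("visual", "visual"): 74.0, ("visual", "verbal"): 73.5,
    ("verbal", "visual"): 72.8, ("verbal", "verbal"): 73.2,
}
print(round(meshing_interaction(no_meshing), 2))
```

In a real experiment you would also need the random assignment and a significance test the authors describe; the point of the sketch is just that the meshing hypothesis makes a specific, checkable prediction about this one number.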

Unbundling Education: Separate Out the Grading Process

The theory behind outsourcing is that it enables specialization: you do one thing really well and let others do the rest.

Some of the more frustrating customer service experiences happen with entities where there's limited outsourcing and specialization. As I've written before, airlines do way too much. They market their brand and flight routes, they handle reservations and bookings, they maintain aircraft, they deal with luggage. More airlines should follow the model of their regional-jet business: focus on a few things and outsource the rest. In the regional-jet example, the big airlines handle reservations and ticketing and outsource the actual flying of the planes.

City governments are another example. They try to manage parks, sewers, potholes, utilities, and more. Yes, a government entity, as the sole provider of police, fire, roads, and a few other things, will always be more diverse in scope than any sane for-profit corporation. But many governments still do too much beyond the core essentials, and are not able to do any one of these things very well.

One way to think about improving complex, ill-performing products, services, or experiences is to see whether there's a way to unbundle them and allow greater specialization. Arnold Kling applies this approach to improving higher education. Specifically, he thinks schools should separate the task of evaluating students' work from the task of teaching the concepts. Here's the background:

In the legacy education model, teachers combine coaching, feedback, and content delivery. By coaching I mean advice, guidance, and encouragement. Feedback includes formal grading as well as informal praise and criticism. Content delivery includes lectures and reading assignments.

Perhaps the key to radically changing education is to break up those functions.

1. The coach should be someone who knows the student well, who can relate to and motivate the student, who can recommend a good educational path, who takes account of the student's strengths and weaknesses, and who stays on top of how well the student is doing relative to the student's ability.

2. The formal feedback can come from strangers. Students can solve problems or write essays and have these graded by a separate service.

3. The content delivery should be "pulled" by the student rather than pushed by a teacher. For example, a student and a coach could agree that the student should learn statistics. The student then selects a statistics curriculum and works through it. The Khan Academy lectures on statistics are particularly good, in my opinion. But Carnegie-Mellon has a good on-line stats course, also. My guess is that, overall, there is enough content on line to obtain a world class education.

Then, Arnold writes:

A few months ago, Ben Casnocha wrote,

"Maybe 5-10% of high school high achievers should pursue higher education without attending a four year traditional college. This is the "Real Life University" option for entrepreneurial spirits. This is for folks who can learn a lot on their own, can assemble mentors and advisors to guide the process, and most of all find their creativity smothered by drudgery of school — or otherwise are on a trajectory higher than what college can offer — and therefore need an alternative path." 

His estimate of the percentage may be high, particularly in the near term. But that is the group that I wanted to aim at in my post on schools without classrooms.

Anyway, one important issue with alternative education models is interfacing with the legacy credential system. If you take a course from an alternative college, how can you get the credits to transfer to a traditional college or translate into a credible degree?

Arnold's proposed solution: A Means A.

A Means A solves the problem of credibility and comparability of grades in courses taught at different institutions of higher education. The innovation is to separate the grading process from other aspects of higher education. For any college-level course, A Means A will devise an appropriate exam and use independent professionals to grade the exam, according to transparent, standard criteria.

A Means A will extend the reliable, independent grading model of the AP exam to a broad spectrum of college-level courses. However, while the AP program compels instructors to "teach to the test," A Means A will "test to what you teach." That is, A Means A will take course objectives as given by instructors. It will design and grade tests that align with the objectives of the course.

It's a great thought. And it looks like one university is actually implementing part of it.

As a business opportunity, Arnold identifies the risks with A Means A, Inc. A company that promises to accommodate the idiosyncrasies and variance of different schools' curricula will have a hard time scaling the grading process in a cost-effective way. And making the credential have currency in the marketplace in the early days will be tough. So while I am not so sure of the business opportunity, I think the high-level prescription of unbundling is spot on. There are probably good business opportunities along these lines for education entrepreneurs–just need to brainstorm and iterate a bit more.

What Arnold has done with his A Means A post is bring to the table very specific ideas for improving the education system–not vague griping. And he aims his provocations directly at entrepreneurs–not policy wonks or politicians. A refreshing and useful approach.

What 17 Million Americans Got from a College Degree

Over 317,000 waiters and waitresses have college degrees (over 8,000 of them have doctoral or professional degrees), along with over 80,000 bartenders, and over 18,000 parking lot attendants. All told, some 17,000,000 Americans with college degrees are doing jobs that the BLS says require less than the skill levels associated with a bachelor’s degree.

That's from this piece in the Chronicle of Higher Education, via Jon Bischke on Twitter. More:

Putting issues of student abilities aside, the growing disconnect between labor market realities and the propaganda of higher-education apologists is causing more and more people to graduate and take menial jobs or no job at all. This is even true at the doctoral and professional level—there are 5,057 janitors in the U.S. with Ph.D.’s, other doctorates, or professional degrees.

For hundreds of thousands of Americans, spending four years and untold amounts of money (and debt?) gets you a job as a waiter, parking lot attendant, or janitor. Yet everyone from Barack Obama to Bill Gates keeps pushing a college education as the way to secure one's economic future. That is a view that should be heavily qualified.

Here's the complete chart: 

[Chart: number of college-degree holders working in occupations that require less than a bachelor's degree]

Lectures at Home, Homework at School

More wisdom from Sal Khan (of the Khan Academy):

…it makes more sense to have students watch lectures at home and do homework at school as opposed to vice versa.

So true! And revealing of larger structural problems of school.

#

Robin Hanson's theory of school is that it isn’t about learning material but rather "learning to accept workplace domination and ranking, and tolerating long hours of doing boring stuff exactly when and how you are told." He links to three other possible functions of school:

  • Legitimization: Repeated contacts with the educational system, which seems impersonal and based on reliable criteria, convinces students (and their parents) that they are ending up in an appropriate place in society based on their skills and abilities. Thus, people accept their position in life: they become resigned to it, maybe even considering it appropriate or fair.
  • Acclimatization: The social relationships in the schools encourage certain traits, appropriate to one’s expected economic position, while discouraging others. Thus, certain relationships are considered normal and appropriate. Subordination to authority is a dominant trait enforced for most students.
  • Stratification: Students from different class backgrounds, races, ethnicities, and genders are overwhelmingly exposed to different environments and social relationships and thus are tracked and prepared for different positions in the hierarchy. The different experiences and successes lead each student to see her place as appropriate.