Monthly Archives: August 2008

Should We Consider Preparation Time When Evaluating Someone’s Performance?

Who’s the better competitor — the person who won the championship after spending hours each day practicing or the person who won the championship with very little practice leading up to it? Or are their achievements equal?

How about completion time? Should a student who spends only 20 minutes on a test and gets an A+ be thought of more highly than a student who spends 60 minutes on the same test and gets an A+? Students with learning disabilities sometimes get significantly more time than others, and yet if they earn the same final score we treat the outcomes identically.

Would you think more of this blog post if I told you I spent only a minute writing it, versus two hours? Unlike other performance arenas — such as the NBA, where, exceptions notwithstanding, players spend roughly equal amounts of time preparing for games — the blogosphere has a great deal of variance on this front. Some bloggers spend hours on posts; others hardly any time at all. This is one reason why I think it’s difficult to infer too much about someone’s intelligence from a blog: you just don’t know how much time they’re spending. Then again, maybe this doesn’t matter.

Bottom Line: Almost two years ago I advocated for “certainty scales” to be put next to answer spaces on school tests, forcing students to indicate their level of certainty about each answer. I think a similar type of additional label, perhaps around the input time required to obtain the output, would give a more holistic perspective on a person’s performance, and not just in school settings.

Pixar on Creativity (Find Good People, Ideas Will Come)

I just got off a videoconference with some executives in New Zealand, and among other things we discussed creativity. I argued that too many people buy into the “analytical vs. creative” dichotomy of cognitive strengths. When I ask rooms of people, “Raise your hand if you consider yourself creative,” usually only about 60% raise their hand. I’m always astonished at how few people self-identify as “creative.” I think it’s because of the stereotype that a creative person must be the ripped-jean starving-artist type. If you’re a human being, I think you’re well within your rights to call yourself creative, as humans are the most creative creatures on this earth. In any event, if you don’t consider yourself a creative person, you’re not likely to realize creative bursts.

It’s a fascinating topic. Ed Catmull, President of Pixar, has a great essay up today on how Pixar — one of the most successful movie studios in the business — fosters creativity. It’s especially pertinent for larger organizations. He starts with this:

A few years ago, I had lunch with the head of a major motion picture studio, who declared that his central problem was not finding good people—it was finding good ideas. Since then, when giving talks, I’ve asked audiences whether they agree with him. Almost always there’s a 50/50 split, which has astounded me because I couldn’t disagree more with the studio executive. His belief is rooted in a misguided view of creativity that exaggerates the importance of the initial idea in creating an original product. And it reflects a profound misunderstanding of how to manage the large risks inherent in producing breakthroughs.

Right on. Anyone who says finding good ideas is harder than finding good people is delusional. Everyone I talk to, in any size organization, for profit or non, says that finding top-flight talent is one of, if not the, hardest challenges they face. Ideas, on the other hand, are everywhere.

I also couldn’t agree more on what he calls “the exaggerated importance of the initial idea.” Here are my original and follow-up posts on the myth of the eureka moment.

Other nuggets. Here’s how they’ve designed their buildings:

Most buildings are designed for some functional purpose, but ours is structured to maximize inadvertent encounters. At its center is a large atrium, which contains the cafeteria, meeting rooms, bathrooms, and mailboxes. As a result, everyone has strong reasons to go there repeatedly during the course of the workday. It’s hard to describe just how valuable the resulting chance encounters are.

On the challenge of being a successful company:

Systematically fighting complacency and uncovering problems when your company is successful have got to be two of the toughest management challenges there are. Clear values, constant communication, routine postmortems, and the regular injection of outsiders who will challenge the status quo aren’t enough. Strong leadership is also essential—to make sure people don’t pay lip service to the values, tune out the communications, game the processes, and automatically discount newcomers’ observations and suggestions.

(thanks to Ramit’s delicious tag of this piece)

The Best Paragraph I Read Today

From Clay Shirky, one of the more eloquent commentators on the increasingly participatory nature of the internet:

Digital and networked production vastly increase three kinds of freedom: freedom of speech, of the press, and of assembly. This perforce increases the freedom of anyone to say anything at any time. This freedom has led to an explosion in novel content, much of it mediocre, but freedom is like that. Critically, this expansion of freedom has not undermined any of the absolute advantages of expertise; the virtues of mastery remain as they were. What has happened is that the relative advantages of expertise are in precipitous decline. Experts the world over have been shocked to discover that they were consulted not as a direct result of their expertise, but often as a secondary effect — the apparatus of credentialing made finding experts easier than finding amateurs, even when the amateurs knew the same things as the experts.

Also try his speech on Gin, Television, and Social Surplus — this is where he compared time spent watching TV with time spent building Wikipedia.

Try More Stuff Than the Other Guy

Here’s Tom Peters on one of my favorite topics (randomness) from the new book The Drunkard’s Walk: How Randomness Rules our Lives:

"If I had said ‘yes’ to all the projects I turned down and ‘no’ to all the ones I took, it would have worked out about the same."—David Picker, movie studio exec

"Mathematical analysis of firings in all major sports has shown that those firings had, on average, no effect on team performance."

And his sum-up:

NB2: If Randomness Rules then your only defense is the so-called "law of large numbers"—that is, success follows from tryin’ enough stuff so that the odds of doin’ something right tilt your way; in my speeches I declare that the only thing I’ve truly learned "for sure" in the last 40 years is "Try more stuff than the other guy"—there is no poetic license here, I mean it.

Visualizing the Book Review Before Writing Your Book

Kind of a neat approach — visualize the book review you want to read before writing your book:

Many years ago, David Allen shared with me that one of the first things he did when planning his first book, the best-selling Getting Things Done, was to write the Wall Street Journal review of his book, first. He wrote the book review as he would like it to appear in print, even before writing the first chapters of his book.

Derek Scruggs adds:

An acquaintance of mine, a direct marketing guru, once told me that he writes the sales letter before he ever creates the product. Only after he’s explained exactly what you’ll get and why you need it does he set about creating the product. (And sometimes, if the sales letter isn’t compelling enough, he just abandons the product altogether, saving him a lot of time and effort.)

The GTD blog also has the helpful reminder that if you don’t know what "done" looks like before starting a task, you won’t know when you are done.

Identity is That Which is Given

Kenan Malik has a truly excellent essay up about identity, culture, and race that’s worth reading slowly and carefully. Identity fascinates me. Growing up, I often explained friends’ behavior (and my own) as attempts to construct and project a coherent identity to the world. Childhood doesn’t usually afford self-understanding, so we lunge toward institutions like religion (I’m a Christian) or ethnic heritage (I’m an Irish-American) to help understand who we are and what we stand for. While there are potentially many different identity pillars upon which we could draw, convention limits us:

According to the modern idea of identity…each person’s sense of who they truly are is intimately linked to only a few special categories – collectives defined by people’s gender, sexuality, religion, race and, in particular, culture.

[W]hat [these] collectives…have in common is that each is defined by a set of attributes that, whether rooted in biology, faith or history, is fixed in a certain sense and compels people to act in particular ways. Identity is that which is given, whether by nature, God or one’s ancestors.

He discusses culture in depth. Whereas race is undoubtedly fixed, it’s not clear culture should be in the same boat.

An individual’s cultural background frames their identity and helps define who they are. If we want to treat individuals with dignity and respect, many multiculturalists argue, we must also treat with dignity and respect the groups that furnish them with their sense of personal being….

Multiculturalists, on the other hand, exhibit a self-conscious desire to preserve cultures…In the modern view, traditions are to be preserved not for pragmatic reasons but because such preservation is a social, political and moral good. Maintaining the integrity of a culture binds societies together, lessens social dislocation and allows the individuals who belong to that culture to flourish. Such individuals can thrive only if they stay true to their culture – in other words, only if both the individual and the culture remain authentic. Modern multiculturalism seeks self-consciously to yoke people to their identity for their own good, the good of that culture and the good of society.

If you’re born Québécois, you are supposed to enact the elements of that culture, a culture which multiculturalists celebrate for its differences and indeed work to ensure those differences persist. A problem, then:

An identity is supposed to be an expression of an individual’s authentic self. But it can too often seem like the denial of individual agency in the name of cultural authenticity.

Just because we’re born into a certain culture doesn’t mean we must bear the weight of that culture over our lifetime, particularly when the obligation is to prevent “cultural decay,” which can only happen if we are not doing what our ancestors did:

Cultures certainly change and develop. But what does it mean for a culture to decay? Or for an identity to be lost? Will Kymlicka draws a distinction between the ‘existence of a culture’ and ‘its “character” at any given moment’. The character of culture can change but such changes are only acceptable if the existence of that culture is not threatened. But how can a culture exist if that existence is not embodied in its character? By ‘character’ Kymlicka seems to mean the actuality of a culture: what people do, how they live their lives, the rules and regulations and institutions that frame their existence. So, in making the distinction between character and existence, Kymlicka seems to be suggesting that Jewish, Navajo or French culture is not defined by what Jewish, Navajo or French people are actually doing. For if Jewish culture is simply that which Jewish people do or French culture is simply that which French people do, then cultures could never decay or perish – they would always exist in the activities of people.

So, if a culture is not defined by what its members are doing, what does define it? The only answer can be that it is defined by what its members should be doing.

If we should be doing what our ancestors did, then culture, according to Malik, has become defined by “biological descent.” Biological descent is race. As the cultural critic Walter Benn Michaels puts it, “In order for a culture to be lost… it must be separable from one’s actual behaviour, and in order for it to be separable from one’s actual behaviour it must be anchorable in race.” To wit, the close:

The logic of the preservationist argument is that every culture has a pristine form, its original state. It decays when it is no longer in that form. Like racial scientists with their idea of racial type, some modern multiculturalists appear to hold a belief in cultural type. For racial scientists, a ‘type’ was a group of human beings linked by a set of fundamental characteristics which were unique to it. Each type was separated from others by a sharp discontinuity; there was rarely any doubt as to which type an individual belonged. Each type remained constant through time. There were severe limits to how much any member of a type could drift away from the fundamental ground plan by which the type was constituted. These, of course, are the very characteristics that constitute a culture in much of today’s multiculturalism talk. Many multiculturalists, like racial scientists, have come to think of human types as fixed, unchanging entities, each defined by its special essence.

Which should be alarming to anyone who believes in an individual’s right to construct his own identity separate and apart from ancestry or the expectations of being part of a cultural group.

To bring this back to a more personal level, it reminds me of Paul Graham’s essay Lies We Tell Kids, in which he said:

Telling a child they have a particular ethnic or religious identity is one of the stickiest things you can tell them. Almost anything else you tell a kid, they can change their mind about later when they start to think for themselves. But if you tell a kid they’re a member of a certain group, that seems nearly impossible to shake.

If you’re a parent and want to play it safe, you tell your kid that he’s an X, and that Xes do things a certain way. You tell him that he was born into this group and that there’s no way around it.

Or, you tell your child nothing of the sort, and let him wander about and start building up his identity piece by piece by choice.

This is more or less what happened to me. My parents / genetics pressed upon me virtually no religious (“You are a Christian, go to Church”), ancestral (“Cherish your Slovak roots”), national (“You’re American god dammit!”), racial (“Celebrate your whiteness”), or gender (“Stand up and be proud to be a man”) claims on my identity. This doesn’t mean I was/am immune to these influences, but none of these institutional categories dominated how I conceived of myself growing up.

I suppose if I were born a woman, or black, or devoutly religious, or exceedingly aware of my roots, or ungodly rich or dirt poor, or had traveled extensively overseas as a child (something which often reinforces national identity upon return home), any of these might have moved to the fore. Especially if I hadn’t been a white male in multiculturalism-dominated schools. Multicultural exercises inevitably lead those Chosen Representatives of the Black People to overemphasize their blackness, making it more fundamental to their identity than it would otherwise be. If anything, the school system’s total lack of interest in the “white man’s experience” (compared to “what it’s like being Hispanic”) led to a stripping down of (and occasionally even shame about) those un-chosen aspects of my identity — my race and gender — which meant I never turned to them as a conscious source of identity.

So, while my identity may be less coherent at this point than someone else’s, I’d like to think it is more true to my own values and my own beliefs about how the world works: philosophical beliefs like free will, value beliefs like a woman’s right to choose or the importance of humor, health, and happiness, and everyday beliefs like crunchy peanut butter’s superiority to smooth. I am what I believe. And to a large extent, I have chosen what I believe, or at least I hope I have.

(thanks to Will Wilkinson for pointing out the essay)

What I’ve Been Reading

1. Positioning: How to be Seen and Heard in the Overcrowded Marketplace by Al Ries and Jack Trout. Considered a classic for marketing professionals who position products in the minds of customers. Recommended. Thanks to Kevin Gentry for sending this.

2. Religious Literacy: What Every American Needs to Know — And Doesn’t by Stephen Prothero. This is a marvelous book that will interest the religious and non-religious alike. Prothero starts by showing just how dismal religious literacy is in America. Interestingly, literacy among believers — about their own religion or that of others — is little better than among atheists. As the historian R. Laurence Moore has written, "Americans are stupefyingly dumb about what they are supposed to believe." In addition to a useful religious glossary in the back and a humbling quiz at the beginning, the middle of the book offers an edifying general background on religious views in America. I’m taking a "Bible as Literature" class in the fall.

3. Size Matters: How Height Affects the Health, Happiness and Success of Boys — and the Men They Become by Stephen Hall. My expectations for this book were, pardon the pun, too high. I’ve been taller than normal my whole life (I’m currently 6′ 4"). Growing up, this meant I was good at sports, more often the bully than the one being bullied, and often mistaken for / treated as older than I really was (this helped during my foray into the business world). In this book Hall suggests that the benefits I reap from my height today derive primarily from the social success that came from being a tall kid: the self-esteem benefits you get as a tall young person carry into adulthood. It’s not so much that people discriminate in favor of tall adults (which would otherwise explain why the CEO ranks are dominated by the tall). This is a compelling point. Unfortunately, the author tries too hard to seem scientifically legit — rambling on and on about growth charts or diving deep into the psychology of bullying. To me, his too-careful seriousness detracted from the overall enjoyment of the book.

4. Forty Ways to Look at Winston Churchill by Gretchen Rubin. With dozens of biographies already written about Churchill, Rubin takes a unique approach: she writes forty short chapters examining different angles of his life and how those angles have been portrayed in different biographies. Not surprisingly, she reveals contradictory accounts and highlights the impact of a historian’s bias. I knew very little about Churchill going in, so I found this thin volume an excellent introduction. Btw, this is the Gretchen who writes The Happiness Project.

“Of Course” and “Obviously” In Writing

Even when I think it’s unimportant… even when I know I shouldn’t judge… I always end up focusing on a person’s choice of words and use of language in general.

Sometimes I read an essay where a person uses an interesting and pitch-perfect word, and then I see it show up again a few paragraphs later and it’s a letdown. Or I have two oral conversations with someone and very quickly pick up on a favorite word or construction. Examples: one friend loves to talk about "policy cleavages" in politics, another uses the word "dynamic" as an all-purpose noun to describe almost anything, and another uses "amusing" as his humor superlative of choice. Nothing wrong with this — I just neurotically focus on it. I have my own go-to phrases and words, such as the construction "so as to…".

Bryan Caplan today challenges conventional writing advice which says avoid the over-used phrases "of course" and "obviously." If it’s obvious, the advice goes, why do you need to say it?!

Why is it so hard to surrender these words?  The main reasons: When you say "obviously," or "of course"…

1. …listeners know not to waste time looking for a complicated rationale behind your statement. What they see is what they get.

2. …listeners can identify your starting points. It may be obvious that X is true, and obvious that X–>Y, but if you just start with Y, people will be confused.

3. …listeners find out what you take for granted.  If it’s different from what they take for granted, that’s news.

This is something I’ve thought about and notice very consciously when reading. In general, I dislike the phrases. I believe reason #3 above can be used condescendingly: people couch a point that is not obvious in the phrase "obviously" to signal higher intelligence. Or people drop "of course" left and right out of intellectual insecurity — they don’t know what’s obvious or not, so they insist that all is obvious to them. Of course, hard and fast rules about writing should never be followed, and obviously there’s a time and place for everything.


I feel less passionately than some about the semicolon. I only know one proper way to use it (there are others): a semicolon followed by "however" or some other transitional phrase. E.g.: "I wasn’t sure if I was going to like the book; however, I read it anyway because Joe recommended it." Otherwise, I never use semicolons. Here’s Michael Kinsley from a long piece on semicolons:

“The most common abuse of the semicolon, at least in journalism,” explains Kinsley, “is to imply a relationship between two statements without having to make clear what that relationship is. I suppose there are worse crimes in the world. (I don’t know if Osama bin Laden uses semicolons or not.) But Fred did have it right.”


All this talk about phrases and semicolons distracts from a more pressing point: the world would be a much better place if a majority of people correctly used "it’s" and "its."


I’m far from a perfect writer or grammar-follower. Part of why I blog about this stuff is that being public about it forces me to a higher standard, and it also invites readers to point out corrections in my prose.

Disrespecting Credentialism

Why are people who hold degrees from very selective schools more likely to advise me to stay in college and get my degree (from a very selective school), whereas people who hold degrees from unknown schools, or no degree at all, are more likely to support a decision to drop out?

Because if I drop out I am disrespecting credentialism — which according to Arnold Kling is “the belief that only people with proper credentials should be hired. If you go to college, you implicitly support credentialism–or at least you do not reject it. If you refuse to go to college, then you show disrespect for credentialism. That disrespect may represent a threat to hiring managers who are credentialist.”

Recently, I met a man in Portland who is going through tough career times. He holds an MBA from a top school and, even late in his career, still cites it prominently in his portfolio of work. At this stage of life he clings to the credential. He advises me to obtain a similar credential. If I and (many) others do not, and nevertheless go on to be successful, the value of his credential decreases. Thus, I value his advice on the matter but recognize his self-interested bias.


Here’s a related rule of thumb I just developed:

If the importance of your credential and the prominence with which you advertise it do not decrease with age, you are not achieving or succeeding that much in the real world. Would a successful lawyer begin a letter to a prospective client, “Dear Joe, I graduated from Columbia Law School in 1990”? Of course not. He’d hang his hat on real experiences. Al Gore’s bio on this page doesn’t even mention Vanderbilt or Harvard, two brand names most people would be eager to display. He doesn’t need to. His work speaks for itself.

The exception to this rule of thumb would be academia, where credentials seem to remain at the fore regardless of professional success. But this makes sense: the very idea of academia is rooted in credentialism.

Email Dialogue on Business Books, Self-Education, and Mental Models

The very interesting Josh Kaufman and I had an email-style dialogue on business books, self-education, mental models, and more on this wiki page, copied in this blog post (continued below the fold).

Josh runs the Personal MBA Recommended Reading List — a list of the best business books one would need to read for a comprehensive business education. It’s a terrific resource that’s well worth reviewing. In our exchange, we talk about the list of books and whether recently published ones should be excluded, and then meander into the difference between books offering systems / models and practical advice, and conclude on how prominent a role books should play in the self-education process.

From: Ben Casnocha
To: Josh Kaufman
Re: Personal MBA 2008 – The Recency Effect
July 27, 2008   11:54 PM

Dear Josh,

Congrats on launching the 2008 recommended list for the Personal MBA. As a fellow bookslut, I love perusing "best of" lists of books. I admire your chutzpah in compiling such a list in the business genre, as we would probably both agree that crap dominates this category of books. As someone who wrote a book on entrepreneurship last year, and in the process immersed myself in the industry, I saw up close how many stupid "leadership" books (to pick just one sub-category) come out each year. The scary thing is that people must be buying these management-guru-authored books or else they wouldn’t be published!

So kudos to you for sifting through all the crap and finding the gems and providing a fabulous guide for the self-improvement crowd. And double kudos for adopting an appropriately broad definition of the genre — indeed your list contains some items (such as Deep Survival) that would be shelved far from a traditional business shelf in a bookstore. As Jim Collins has said, the majority of a business person’s reading should be books not traditionally thought of as "business" texts.

But let me probe one aspect of the list that I find troubling: its inclusion of recently published books. One of your picks, Ethics for the Real World, was published just a few weeks ago. Another, The Four-Hour Workweek, was published last year. It’s truly bizarre to see these juxtaposed with enduring classics like The Effective Executive by Peter Drucker on a "best business books of all time" list. The ultimate sign of a book’s effectiveness is its lasting power, no? — whether it is still being talked about at least a few years after its pub date. The inclusion of hot new books underscores the list’s vulnerability to fads and clever launch-date marketing buzz. Several friends of mine, minutes after reading it, said The Four-Hour Workweek changed their lives — but you can only evaluate such a statement a year or two after its utterance, to see if the book actually changed their lives in the way they predicted. Until then, we hold our breath.

The fact that you dropped Seth Godin’s The Dip (published last year) from last year’s list would indicate to me that it seemed big and important at the time but, in hindsight, not so much. Right? Even the most self-aware humans can fall prey to the recency effect, which says we assign disproportionate salience to recent stimuli and observations. So Josh, why not make it a "system rule" that no book published in the last three years is eligible? To me, this would improve the authority and respectability of your list considerably.

Later, I want to ask what role books play in the self-education process.

Until then,

