Book Review: The Intel Trinity

Steve Jobs, in a 1994 interview, said that once you discover that everything around you that we call “life” — rules, expectations, institutions, buildings, companies, theories, and so on — was made by people no smarter than you, everything changes. Because when you realize that most of what seems permanent and “the way things have always been” was, at one point, the proactive creation of a fallible human being, then you learn that if you poke at life you can actually change it. From then on, you take a much broader view of life’s possibilities.

It’s a powerful point that I agree with, except for the notion that the institutions and companies and norms and countries around us were built by people “no smarter than you.” In fact, the Founding Fathers of America were probably smarter than you or me. Same with Steve Jobs. Not all of us are smart enough or persistent enough to leave an enduring impact. But it’s true that most of us are smarter than we know.

In any case, if you apply Jobs’ comment to Silicon Valley, it resonates. It’s uncommon to step back and ponder who created the norms and culture of modern tech entrepreneurship that we take for granted today. I locate the answer in (at least) two companies. HP, where Dave and Bill pioneered flexible work hours, employee equity ownership, casual attire, non-hierarchical decision making, and so much of the “west coast” aesthetic that is central to modern Silicon Valley identity. When HP introduced these policies, they were considered bold and groundbreaking. And then Intel, which, by growing from an idea to the world’s most important company, set a standard for execution that became the high-water mark for other startups that aspired to global scale.

Intel was also one of the first companies to raise modern venture capital. How often do we stop and think about the original investors who decided to put real money into a high-risk, low-liquidity tech company, and the entrepreneur who thought to sell equity in his company in exchange for enough risk capital to shoot for the stars?

I recently read Mike Malone’s The Intel Trinity, a wonderful guide to the history of Intel and the famous troika of Bob Noyce, Gordon Moore, and Andy Grove. These guys created Silicon Valley. The Intel Trinity tells the story of Intel well, along with the tremendously intense and sometimes volatile relationship among the three. Those of us too young to have lived through the rise of Intel are an especially relevant audience for this book, as is anyone who does not understand the historical meaning or importance of Moore’s Law. While there are a couple of chapters in the book about Andy Grove’s personal history, for more color on that — his unbelievable personal life story as an immigrant from Hungary — I’d recommend Grove’s memoir Swimming Across.

Book Short: The Complacent Class by Tyler Cowen

Tyler Cowen’s latest book — The Complacent Class: The Self-Defeating Quest for the American Dream — follows up The Great Stagnation and Average Is Over as the third in a trilogy about what’s gone wrong in America that has caused, in Tyler’s summation, wages to stagnate, infrastructure to decay, entrepreneurship to slow, and in general a large swath of Americans to fall behind in the modern economy.

Lengthier reviews have already been published elsewhere, so I’ll offer just three impressions of this provocative book written by a good friend.

First, reading it was reminder number 6,238 that I lead an exceptionally privileged life. I’m doing fine and almost everyone I know is doing just fine. The economy around me is booming. I travel almost exclusively to parts of the world where everyone is doing fine. That I find myself in this position is due almost entirely to luck and good fortune; what responsibilities I and my fellow lottery winners have to those handed a harder set of cards is one of the most important moral questions I grapple with.

Second, the habits of mind and action that Tyler says contribute to the complacency of so many Americans are the same habits we write about in The Start-up of You — except we extoll the positive version of them, of course! Tyler talks about risk aversion; we talk about how to take intelligent risk. If you want solutions, at the individual level, to some of the diagnoses in The Complacent Class…then read The Start-up of You!

Third, there’s an ambiguity in language on the topic of entrepreneurship that pops up in this book and in other books and articles that study economic data. Tyler cites data — and points to this 538 piece summarizing the data — showing that fewer people are starting companies. Even the tech sector has fewer startups today! Entrepreneurship is slowing down, it seems? Well, maybe. Venture capital is pouring into startups. If fewer and fewer companies are being started, why is more and more venture capital being invested in startups? I think the issue here is the definition of “startup” and “high tech.” High-level economic data tend to look at “new business formation” to draw conclusions about “entrepreneurship,” and they define “startup” as any sort of new venture. Even “high tech” is broader than the specific niche that Silicon Valley is famous for and that venture capital chases: software and hardware startups financed and run in such a way as to one day achieve massive scale. This sort of entrepreneurship — the sort parodied on HBO’s Silicon Valley, glorified on Shark Tank, and written about in the popular press — is thriving, even if, in general, fewer Americans are starting “new businesses.” Of course, this doesn’t detract from the broader point that a lot of Americans are facing stagnant careers and a lot of once-stable industries are no longer reliable sources of prosperity.

Anyway, Marginal Revolution, Tyler’s blog, has long been a must-read. And don’t miss his podcast, which is still fairly new but really hitting its stride…

What I’ve Been Reading

Two very long books.

1. 1Q84 by Haruki Murakami. I really enjoyed working my way through this novel. If you’re new to Murakami, I wouldn’t start with this one. It’s dauntingly long (almost 1,000 pages), and I could see some readers getting lost — if you aren’t ready for it — amid the strangeness and sadness that permeate many scenes in the book. But if you’re in the right headspace, the hyper-detailed descriptions, the plot, and the strange sci-fi “weather” cast over Tokyo make for a memorable reading experience.

Here’s just one quote that gives a sense of the vibe: “Once you pass a certain age, life is just a continuous process of losing one thing after another. One after another, things you value slip out of your hands the way a comb loses teeth. People you love fade away one after another. That sort of thing.”

2. Far From the Tree: Parents, Children, and the Search for Identity by Andrew Solomon

This is another brick of a book but of a very different sort: non-fiction, organized into focused chapters, each on some element of a non-ordinary life experience and how parents and families adapt. There are chapters on deafness, dwarfism, transgender identity, homosexuality, prodigies, autism, and others. If you’re a parent of a child who falls into any of these categories, it’s a must-read. I’m not, but I still found myself learning a ton about the life experience of those with certain types of disabilities. In many chapters, a spotlight is given to the push and pull among advocacy groups, politicians, educators, and others who try to standardize a point of view on, for example, whether parents should buy cochlear implants for their deaf child, or how to respond when a 5-year-old boy tells his parents he wants to become a girl.

Solomon inclines toward anecdotes over statistics, because “numbers imply trends and anecdotes imply chaos.” There’s a lot of messiness in the real family experiences profiled here. Internal debate. Changes of heart. I’m in awe of how Solomon shares stories about the families he spoke to: genuine compassion and yet steely-eyed honesty. He manages to assert his own opinion on topics where there’s true debate, but without simplifying the matter or selling short the diversity of views.

Book Short: The Checklist Manifesto

Atul Gawande’s book The Checklist Manifesto is a wonderfully engaging summation of how the world has become so complex, and of how to use checklists — yes, simple to-do checklists — to manage the complexity that underlies modern professions.

The operating room is the primary setting for the book’s examples (surgery is Gawande’s own vocation, of course), but there are also useful stories from the worlds of building construction, aviation, and Wall Street trading.

Here’s Derek Sivers’ detailed summary of the book.


Book Review: But What If We’re Wrong?

Chuck Klosterman wrote one of the most stimulating books that I read in 2016: But What If We’re Wrong? Thinking About the Present As If It Were the Past. There are countless interesting observations on science and pop culture and sports and history. By contemplating which assumptions of today might be disproven in the future, or which authors of note today might be forgotten in the future and which no-name writers might become famous after their death, he unearths novel theories about familiar topics. Why did Herman Melville become renowned by future generations but not his own? Which theories of science are we totally convinced are true today but may well be proven false by future generations of physicists? Which TV show made in 2016 will be referenced by historians in 2080 when they try to explain what life was like in 2016?

My favorite paragraphs are pasted below, with the bold font mine.

Thanks to Russ Roberts for recommending this via his EconTalk conversation with Chuck.


When you ask smart people if they believe there are major ideas currently accepted by the culture at large that will eventually be proven false, they will say, “Well, of course. There must be. That phenomenon has been experienced by every generation who’s ever lived.”

Aristotle had argued more than a thousand years prior: He believed all objects craved their “natural place,” and that this place was the geocentric center of the universe, and that the geocentric center of the universe was Earth. In other words, Aristotle believed that a dropped rock fell to the earth because rocks belonged on earth and wanted to be there.

For the next thirty years, nothing about the reception of [Moby Dick] changes. But then World War I happens, and—somehow, and for reasons that can’t be totally explained—modernists living in postwar America start to view literature through a different lens. There is a Melville revival. The concept of what a novel is supposed to accomplish shifts in his direction and amplifies with each passing generation…

I suspect most conventionally intelligent people are naïve realists, and I think it might be the defining intellectual quality of this era. The straightforward definition of naïve realism doesn’t seem that outlandish: It’s a theory that suggests the world is exactly as it appears.

Any time you talk to police (or lawyers, or journalists) about any kind of inherently unsolvable mystery, you will inevitably find yourself confronted with the concept of Occam’s Razor: the philosophical argument that the best hypothesis is the one involving the lowest number of assumptions.

The reason something becomes retrospectively significant in a far-flung future is detached from the reason it was significant at the time of its creation—and that’s almost always due to a recalibration of social ideologies that future generations will accept as normative.

The arc of Lethem’s larger contention boils down to two points. The first is that no one is really remembered over the long haul, beyond a few totemic figures—Joyce, Shakespeare, Homer—and that these figures serve as placeholders for the muddled generalization of greatness (“Time is a motherfucker and it’s coming for all of us,” Lethem notes).

The reason shadow histories remained in the shadows lay in the centralization of information: If an idea wasn’t discussed on one of three major networks or on the pages of a major daily newspaper or national magazine, it was almost impossible for that idea to gain traction with anyone who wasn’t consciously searching for alternative perspectives. That era is now over. There is no centralized information, so every idea has the same potential for distribution and acceptance.

Competing modes of discourse no longer “compete.” They coexist.

Take, for example, the plight of Native Americans. What American subculture has suffered more irrevocably? Prior to Columbus’s landing in the New World, the Native American population approached one hundred million. Now it’s a little over three million, two-thirds of whom are relegated to fifty delineated reservations on mostly undesirable land. Still, that equates to roughly 1 percent of the total US population. Yet Native Americans are essentially voiceless, even in conversations that specifically decry the lack of minority representation. Who is the most prominent Native American media figure or politician? Sherman Alexie? Louise Erdrich? Tom Cole or Markwayne Mullin, both of whom are from the same state? Who, for that matter, is the most famous Native American athlete, or rapper, or reality star? Maybe Sam Bradford? Maybe Sacheen Littlefeather, who’s been virtually invisible since the seventies? When the Academy Awards committee next announces the nominations for Best Picture, how many complaints will focus on the lack of films reflecting the Native American experience? Outside the anguish expressed over the use of the term “Redskin” by the Washington football franchise, it’s hard to find conversation about the biases facing Native Americans; outside the TV show Fargo, you almost never see it reflected in the popular culture. Everyone concedes it exists, but it’s not a popular prejudice (at least not among the mostly white liberals who drive these conversations). Their marginalization is ignored, thus creating a fertile factory for the kind of brilliant outsider who won’t be recognized until that artist is dead and gone. So this is one possibility—a Navajo Kafka.

 Kurt Vonnegut’s A Man Without a Country: “I think that novels that leave out technology misrepresent life as badly as Victorians misrepresented life by leaving out sex.”

…the myth of universal timeliness. There is a misguided belief—often promoted by creative writing programs—that producing fiction excessively tied to technology or popular culture cheapens the work and detracts from its value over time. If, for example, you create a plot twist that hinges on the use of an iPad, that story will (allegedly) become irrelevant once iPads are replaced by a new form of technology. If a character in your story is obsessed with watching Cheers reruns, the meaning of that obsession will (supposedly) evaporate once Cheers disappears from syndication. If your late-nineties novel is consumed with Monica Lewinsky, the rest of the story (purportedly) devolves into period piece. The goal, according to advocates of this philosophy, is to build a narrative that has no irretraceable connection to the temporary world. But that’s idiotic, for at least two reasons. The first is that it’s impossible to generate deep verisimilitude without specificity. The second is that if you hide the temporary world and the work somehow does beat the odds and become timeless, the temporary world you hid will become the only thing anyone cares about

But I’ve been a paid critic for enough years to know my profession regularly overrates many, many things by automatically classifying them as potentially underrated. The two terms have become nonsensically interchangeable.

The nonfiction wing of this level houses elemental tacticians like Robert Caro; someone like William T. Vollmann straddles both lines, fortified by his sublime recklessness. Even the lesser books from these writers are historically important, because—once you’re defined as great—failures become biographically instructive. 

The third tier houses commercial writers who dependably publish major or minor bestsellers and whose success or failure is generally viewed as a reflection of how much (or how little) those books sell. These individuals are occasionally viewed as “great at writing,” but rarely as great writers. They are envied and discounted at the same time. They are what I call “vocally unrated”: A large amount of critical thought is directed toward explaining how these types of novels are not worth thinking about.

Now, if the world were logical, certain predictions could be made about what bricks from that pyramid will have the greatest likelihood of remaining intact after centuries of erosion. Devoid of all other information, a betting man would have to select a level-one writer like Roth, just as any betting man would take the Yankees if forced to wager on who will win the World Series one hundred seasons from now. If you don’t know what the weather will be like tomorrow, assume it will be pretty much the same as today. But this would require an astonishing cultural stasis. It would not simply mean that the way we presently consume and consider Roth will be the way Roth is consumed and considered forevermore; it would mean that the manner in which we value and assess all novels will remain unchanged. It also means Roth must survive his inevitable post-life reevaluation by the first generation of academics who weren’t born until he was already gone, a scenario where there will be no room for advancement and plenty of room for diminishing perceptions (no future contrarian can provocatively claim, “Roth is actually better than everyone thought at the time,” because—at the time—everyone accepted that he was viewed as remarkable). He is the safest bet, but still not a safe bet. Which is why I find myself fixated on the third and sixth tiers of my imaginary triangle: “the unrated.” As specific examples, they all face immeasurable odds. But as a class, they share certain perverse advantages.

Normal consumers declare rock to be dead whenever they personally stop listening to it (or at least to new iterations of it), which typically happens about two years after they graduate from college.

The Beatles were the first major band to write their own songs, thus making songwriting a prerequisite for credibility; they also released tracks that unintentionally spawned entire subgenres of rock, such as heavy metal (“Helter Skelter”), psychedelia (“Tomorrow Never Knows”), and country rock (“I’ll Cry Instead”).

Do I think the Beatles will be remembered in three hundred years? Yes. I believe the Beatles will be the Sousa of Rock (alongside Michael Jackson, the Sousa of Pop). If this were a book of predictions, that’s the prediction I’d make. But this is not a book about being right. This is a book about being wrong, and my faith in wrongness is greater than my faith in the Beatles’ unassailability. What I think will happen is probably not what’s going to happen. So I will consider what might happen instead.

Since rock, pop, and rap are so closely tied to youth culture, there’s an undying belief that young people are the only ones who can really know what’s good. It’s the only major art form where the opinion of a random fourteen-year-old is considered more relevant than the analysis of a sixty-four-year-old scholar. (This is why it’s so common to see aging music writers championing new acts that will later seem comically overrated—once they hit a certain age, pop critics feel an obligation to question their own taste.)

Take architecture: Here we have a creative process of immense functional consequence. It’s the backbone of the urban world we inhabit, and it’s an art form most people vaguely understand—an architect is a person who designs a structure on paper, and that design emerges as the structure itself. Architects fuse aesthetics with physics and sociology. And there is a deep consensus over who did this best, at least among non-architects: If we walked down the street of any American city and asked people to name the greatest architect of the twentieth century, most would say Frank Lloyd Wright. In fact, if someone provided a different answer, we’d have to assume we’ve stumbled across an actual working architect, an architectural historian, or a personal friend of Frank Gehry. Of course, most individuals in those subsets would cite Wright, too. But in order for someone to argue in favor of any architect except Wright (or even to be in a position to name three other plausible candidates), that person would almost need to be an expert in architecture. Normal humans don’t possess enough information to nominate alternative possibilities. And what emerges from that social condition is an insane kind of logic: Frank Lloyd Wright is indisputably the greatest architect of the twentieth century, and the only people who’d potentially disagree with that assertion are those who legitimately understand the question. History is defined by people who don’t really understand what they are defining.

I don’t believe all art is the same. I wouldn’t be a critic if I did. Subjective distinctions can be made, and those distinctions are worth quibbling about. The juice of life is derived from arguments that don’t seem obvious. But I don’t believe subjective distinctions about quality transcend to anything close to objective truth—and every time somebody tries to prove otherwise, the results are inevitably galvanized by whatever it is they get wrong.

To matter forever, you need to matter to those who don’t care. And if that strikes you as sad, be sad.

But maybe it takes an idiot to pose this non-idiotic question: How do we know we’re not currently living in our own version of the year 1599? According to Tyson, we have not reinvented our understanding of scientific reality since the seventeenth century. Our beliefs have been relatively secure for roughly four hundred years. That’s a long time—except in the context of science. In science, four hundred years is a grain in the hourglass.

One of Greene’s high-profile signatures is his support for the concept of “the multiverse.” Now, what follows will be an oversimplification—but here’s what that connotes: Generally, we work from the assumption that there is one universe, and that our galaxy is a component of this one singular universe that emerged from the Big Bang. But the multiverse notion suggests there are infinite (or at least numerous) universes beyond our own, existing as alternative realities. Imagine an endless roll of bubble wrap; our universe (and everything in it) would be one tiny bubble, and all the other bubbles would be other universes that are equally vast. In his book The Hidden Reality, Greene maps out nine types of parallel universes within this hypothetical system.

“In physics, when we say we know something, it’s very simple,” Tyson reiterates. “Can we predict the outcome? If we can predict the outcome, we’re good to go, and we’re on to the next problem. There are philosophers who care about the understanding of why that was the outcome. Isaac Newton [essentially] said, ‘I have an equation that says why the moon is in orbit. I have no fucking idea how the Earth talks to the moon. It’s empty space—there’s no hand reaching out.’

Galileo famously refused to chill and published his Dialogue Concerning the Two Chief World Systems as soon as he possibly could, mocking all those who believed (or claimed to believe) that the Earth was the center of the universe. The pope, predictably, was not stoked to hear this. But the Vatican still didn’t execute Galileo; he merely spent the rest of his life under house arrest (where he was still allowed to write books about physics) and lived to be seventy-seven.

What Bostrom is asserting is that there are three possibilities about the future, one of which must be true. The first possibility is that the human race becomes extinct before reaching the stage where such a high-level simulation could be built. The second possibility is that humans do reach that stage, but for whatever reason—legality, ethics, or simple disinterest—no one ever tries to simulate the complete experience of civilization. The third possibility is that we are living in a simulation right now. Why? Because if it’s possible to create this level of computer simulation (and if it’s legally and socially acceptable to do so), there won’t just be one simulation. There will be an almost limitless number of competing simulations, all of which would be disconnected from each other. A computer program could be created that does nothing except generate new simulations, all day long, for a thousand consecutive years. And once those various simulated societies reach technological maturity, they would (assumedly) start creating simulations of their own—simulations inside of simulations.

The term “conspiracy theory” has an irrevocable public relations problem. Technically, it’s just an expository description for a certain class of unproven scenario. But the problem is that it can’t be self-applied without immediately obliterating whatever it’s allegedly describing. You can say, “I suspect a conspiracy,” and you can say, “I have a theory.” But you can’t say, “I have a conspiracy theory.” Because if you do, it will be assumed that even you don’t entirely believe the conspiracy you’re theorizing about.

But it still must be asked: Discounting those events that occurred within your own lifetime, what do you know about human history that was not communicated to you by someone else? This is a question with only one possible answer.

This, it seems, has become the standard way to compartmentalize a collective, fantastical phenomenon: Dreaming is just something semi-interesting that happens when our mind is at rest—and when it happens in someone else’s mind (and that person insists on describing it to us at breakfast), it isn’t interesting at all.

[On the Buzzfeed blue vs. gold dress viral phenom.] The next day, countless pundits tried to explain why this had transpired. None of their explanations were particularly convincing. Most were rooted in the idea that this happened because we were all looking at a photo of a dress, as opposed to the dress itself. But that only shifts the debate, without really changing it—why, exactly, would two people see the same photograph in two completely different ways?

Adams is the author of On the Genealogy of Color. He believes the topic of color is the most concrete way to consider the question of how much—or how little—our experience with reality is shared with the experience of other people. It’s an unwieldy subject that straddles both philosophy and science. On one hand, it’s a physics argument about the essential role light plays in our perception of color; at the same time, it’s a semantic argument over how color is linguistically described differently by different people. There’s also a historical component: Up until the discovery of color blindness in the seventeenth century, it was assumed that everyone saw everything the same way (and it took another two hundred years before we realized how much person-to-person variation there is). What really changed four hundred years ago was due (once again) to the work of Newton and Descartes, this time in the field of optics. Instead of things appearing “red” simply because of their intrinsic “redness” (which is what Aristotle believed), Newton and Descartes realized it has to do with an object’s relationship to light.

On the same day I spoke with Linklater about dreams, there was a story in The New York Times about a violent incident that had occurred a few days prior in Manhattan. A man had attacked a female police officer with a hammer and was shot by the policewoman’s partner. This shooting occurred at ten a.m., on the street, in the vicinity of Penn Station. Now, one assumes seeing a maniac swinging a hammer at a cop’s skull before being shot in broad daylight would be the kind of moment that sticks in a person’s mind. Yet the Times story explained how at least two of the eyewitness accounts of this event ended up being wrong. Linklater was fascinated by this: “False memories, received memories, how we fill in the blanks of conjecture, the way the brain fills in those spaces with something that is technically incorrect—all of these errors allow us to make sense of the world, and are somehow accepted enough to be admissible in a court of law. They are accepted enough to put someone in prison.” And this, remember, was a violent incident that had happened only hours before. The witnesses were describing something that had happened that same day, and they had no incentive to lie.

How much of history is classified as true simply because it can’t be sufficiently proven false?

All of which demands a predictable question: What significant historical event is most likely wrong? And not because of things we know that contradict it, but because of the way wrongness works.

When D. T. Max published his posthumous biography of David Foster Wallace, it was depressing to discover that many of the most memorable, electrifying anecdotes from Wallace’s nonfiction were total fabrications.

In Ken Burns’s documentary series The Civil War, the most fascinating glimpses of the conflict come from personal letters written by soldiers and mailed to their families. When these letters are read aloud, they almost make me cry. I robotically consume those epistles as personal distillations of historical fact. There is not one moment of The Civil War that feels false. But why is that? Why do I assume the things Confederate soldiers wrote to their wives might not be wildly exaggerated, or inaccurate, or straight-up untruths?

I doubt the current structure of television will exist in two hundred fifty years, or even in twenty-five. People will still want cheap escapism, and something will certainly satisfy that desire (in the same way television does now). But whatever that something is won’t be anything like the television of today. It might be immersive and virtual (like a Star Trekian holodeck) or it might be mobile and open-sourced (like a universal YouTube, lodged inside our retinas). But it absolutely won’t be small groups of people, sitting together in the living room, staring at a two-dimensional thirty-one-inch rectangle for thirty consecutive minutes, consuming linear content packaged by a cable company.

[To understand a given era through a TV show.] We’d want a TV show that provided the most realistic portrait of the society that created it, without the self-aware baggage embedded in any overt attempt at doing so. In this hypothetical scenario, the most accurate depiction of ancient Egypt would come from a fictional product that achieved this goal accidentally, without even trying. Because that’s the way it always is, with everything. True naturalism can only be a product of the unconscious. So apply this philosophy to ourselves, and

To attack True Detective or Lost or Twin Peaks as “unrealistic” is a willful misinterpretation of the intent. We don’t need television to accurately depict literal life, because life can literally be found by stepping outside.

If anyone on a TV show employed the stilted, posh, mid-Atlantic accent of stage actors, it would instantly seem preposterous; outside a few notable exceptions, the goal of televised conversation is fashionable naturalism. But vocal delivery is only a fraction of this equation. There’s also the issue of word choice: It took decades for screenwriters to realize that no adults have ever walked into a tavern and said, “I’ll have a beer,” without noting what specific brand of beer they wanted 

But when a show’s internal rules are good, the viewer is convinced that they’re seeing something close to life. When the rom-com series Catastrophe debuted on Amazon, a close friend tried to explain why the program seemed unusually true to him. “This is the first show I can ever remember,” he said, “where the characters laugh at each other’s jokes in a non-obnoxious way.” This seemingly simple idea was, in fact, pretty novel—prior to Catastrophe, individuals on sitcoms constantly made hilarious remarks that no one seemed to notice were hilarious. For decades, this was an unspoken, internal rule: No one laughs at anything. So seeing characters laugh naturally at things that were plainly funny was a new level of realness. The way a TV show is photographed and staged (this is point number three) are industrial attributes that take advantage of viewers’ preexisting familiarity with the medium: When a fictional drama is filmed like a news documentary, audiences unconsciously absorb the action as extra-authentic (a scene shot from a single mobile perspective, like most of Friday Night Lights, always feels closer to reality than scenes captured with three stationary cameras, like most of How I Met Your Mother).

What is the realest fake thing we’ve ever made on purpose? 

Nothing on TV looks faker than failed attempts at realism. A show like The Bachelor is instantly recognized (by pretty much everyone, including its intended audience) as a prefab version of how such events might theoretically play out in a distant actuality. No television show has ever had a more paradoxical title than MTV’s The Real World, which proved to be the paradoxical foundation of its success.

Roseanne was the most accidentally realistic TV show there ever was…By the standards of TV, both of these people were wildly overweight. Yet what made Roseanne atypical was how rarely those weight issues were discussed. Roseanne was the first American TV show comfortable with the statistical reality that most Americans are fat. And it placed these fat people in a messy house, with most of the key interpersonal conversations happening in the kitchen or the garage or the laundry room. These fat people had three non-gorgeous kids, and the kids complained constantly, and two of them were weird and one never smiled.

The less incendiary take on football’s future suggests that it will continue, but in a different shape. It becomes a regional sport, primarily confined to places where football is ingrained in the day-to-day culture (Florida, Texas, etc.). Its fanbase resembles that of contemporary boxing—rich people watching poor people play a game they would never play themselves.

A few months after being hired as head football coach at the University of Michigan, Jim Harbaugh was profiled on the HBO magazine show Real Sports. It was a wildly entertaining segment, heavily slanted toward the intellection that Harbaugh is a lunatic. One of the last things Harbaugh said in the interview was this: “I love football. Love it. Love it. I think it’s the last bastion of hope for toughness in America in men, in males.”

“But look what happened to boxing,” people will say (and these people sometimes include me). “Boxing was the biggest sport in America during the 1920s, and now it exists on the fringes of society. It was just too brutal.” Yet when Floyd Mayweather fought Manny Pacquiao in May of 2015, the fight grossed $400 million, and the main complaint from spectators was that the fight was not brutal enough. Because it operates on a much smaller scale, boxing is—inside its own crooked version of reality—flourishing. It doesn’t seem like it, because the average person doesn’t care. But boxing doesn’t need average people. It’s not really a sport anymore. It’s a mildly perverse masculine novelty, and that’s enough to keep it relevant.

Midway through the episode, the show’s producers try to mathematically verify if youth participation in football is decreasing as much as we suspect. It is. But the specificity of that stat is deceiving: It turns out youth participation is down for all major sports—football, basketball, baseball, and even soccer (the so-called sport of the future). Around the same time, The Wall Street Journal ran a similar story with similar statistics: For all kids between six and eighteen (boys and girls alike), overall participation in team sports was down 4 percent.

But sometimes the reactionaries are right. It’s wholly possible that the nature of electronic gaming has instilled an expectation of success in young people that makes physical sports less desirable. There’s also the possibility that video games are more inclusive, that they give the child more control, and that they’re simply easier for kids who lack natural physical gifts. All of which point to an incontestable conclusion: Compared to traditional athletics, video game culture is much closer to the (allegedly) enlightened world we (supposedly) want to inhabit.

The gap for the Famous Idaho Potato Bowl was even greater—the human attendance was under 18,000 while the TV audience approached 1.5 million. This prompted USA Today to examine the bizarre possibility of future bowl games being played inside gigantic television studios, devoid of crowds.

What makes the United States so interesting and (arguably) “exceptional” is that it’s a superpower that did not happen accidentally. It did not evolve out of a preexisting system that had been the only system its founders could ever remember; it was planned and strategized from scratch, and it was built to last. Just about everyone agrees the founding fathers did a remarkably good job, considering the impossibility of the goal.

This logic leads to a strange question: If and when the United States does ultimately collapse, will that breakdown be a consequence of the Constitution itself? If it can be reasonably argued that it’s impossible to create a document that can withstand the evolution of any society for five hundred or a thousand or five thousand years, doesn’t that mean present-day America’s pathological adherence to the document we happened to inherit will eventually wreck everything?

Wexler notes a few constitutional weaknesses, some hypothetical and dramatic (e.g., what if the obstacles created to make it difficult for a president to declare war allow an enemy to annihilate us with nuclear weapons while we debate the danger) and some that may have outlived their logical practicality without any significant downside (e.g., California and Rhode Island having equal representation in the Senate, regardless of population).

But I would traditionally counter that Washington’s One Big Thing mattered more, and it actually involved something he didn’t do: He declined the opportunity to become king, thus making the office of president more important than any person who would ever hold it. This, as it turns out, never really happened. There is no evidence that Washington was ever given the chance to become king, and—considering how much he and his peers despised the mere possibility of tyranny—it’s hard to imagine this offer was ever on the table.

Washington’s kingship denial falls into the category of a “utility myth”—a story that supports whatever political position the storyteller happens to hold, since no one disagrees with the myth’s core message (i.e., that there are no problems with the design of our government, even if that design allows certain people to miss the point).

…Every strength is a weakness, if given enough time.

Back in the landlocked eighties, Dave Barry offhandedly wrote something pretty insightful about the nature of revisionism. He noted how—as a fifth-grader—he was told that the cause of the Civil War was slavery. Upon entering high school, he was told that the cause was not slavery, but economic factors. At college, he learned that it was not economic factors but acculturalized regionalism. But if Barry had gone to graduate school, the answer to what caused the Civil War would (once again) be slavery.

Much of the staid lionization of Citizen Kane revolves around structural techniques that had never been done before 1941. It is, somewhat famously, the first major movie where the ceilings of rooms are visible to the audience. This might seem like an insignificant detail, but—because no one prior to Kane cinematographer Gregg Toland had figured out a reasonable way to get ceilings into the frame—there’s an intangible, organic realism to Citizen Kane that advances it beyond its time period. Those visible ceilings are a meaningful modernization that twenty-first-century audiences barely notice.

There’s growing evidence that the octopus is far more intelligent than most people ever imagined, partially because most people always assumed they were gross, delicious morons.

What I’ve Been Reading

Books and more books.

1. Digital Gold by Nathaniel Popper. An incredibly engaging journalistic introduction to Bitcoin and blockchain, with cinematic storytelling about the people who pioneered the technology over the past 15 years. The book is about a year old and since then, Bitcoin has struggled, though I suspect many of the characters in this book — and the experts in real life — remain bullish on cryptocurrencies in the long run. Excellent for those learning the fundamentals of bitcoin and bitcoin history.

2. Dark Matter by Blake Crouch. I made the mistake of starting this sci-fi thriller at 11pm one night in bed. I was up till 1am. It’s a classic page-turner set against the backdrop of the Many-Worlds interpretation of quantum mechanics, which is a real thing and which is super interesting to contemplate. (There is a universe right now where Donald Trump is not the current president-elect, for example.)

3. Lightning Rods by Helen Dewitt. I definitely would not admit to being entertained by this entertaining novel. Definitely not.

4. Upheavals of Thought: The Intelligence of Emotions by Martha Nussbaum. I’m a Nussbaum fan, and the early chapters here contained provocative reflections on the power of emotions: how an emotion like the fear of death, for example, manifests in so many aspects of our thought stream. Ultimately this was too dense for me to make it all the way through, but I’m glad I read as far as I did.

Book Review: Sapiens: A Brief History of Humankind

One of my favorite books of 2016: Sapiens: A Brief History of Humankind by Yuval Noah Harari. I feel a little sheepish joining the parade of praise — everyone in Silicon Valley seems to be reading the book. I’ve wandered into more than one cocktail party conversation where someone is going on about how myths underpin modern society (which is one of the arguments of the book).

It’s extraordinary in scope, engagingly written, and full of provocative factoids that make you stop and think. Through it all, there is an overarching argument about why Homo sapiens overtook other human species on earth (punch line: our ability to cooperate and trade with strangers). He also presents a history of the world driven by what he calls three revolutions: cognitive, agricultural, and scientific.

But even if you don’t follow the overarching argument, there’s plenty to think about when it comes to ancient history, animal rights, religion, happiness, and technological revolution. Not surprisingly, with so much ground covered, subject-matter experts have quibbled (or more than quibbled) with some of Harari’s claims, though nothing triggered me to lose trust in him as a guide.

I highlighted 170 sentences or paragraphs in the book. I’ve pasted a bunch of them below. As always, each paragraph is a new thought, and all are direct quotes from the book.


The truth is that from about 2 million years ago until around 10,000 years ago, the world was home, at one and the same time, to several human species. And why not? Today there are many species of foxes, bears and pigs. The earth of a hundred millennia ago was walked by at least six different species of man. It’s our current exclusivity, not that multi-species past, that is peculiar – and perhaps incriminating. As we will shortly see, we Sapiens have good reasons to repress the memory of our siblings.

The fact is that a jumbo brain is a jumbo drain on the body. It’s not easy to carry around, especially when encased inside a massive skull. It’s even harder to fuel. In Homo sapiens, the brain accounts for about 2–3 per cent of total body weight, but it consumes 25 per cent of the body’s energy when the body is at rest. By comparison, the brains of other apes require only 8 per cent of rest-time energy. Archaic humans paid for their large brains in two ways. Firstly, they spent more time in search of food. Secondly, their muscles atrophied. 

Death in childbirth became a major hazard for human females. Women who gave birth earlier, when the infant’s brain and head were still relatively small and supple, fared better and lived to have more children. Natural selection consequently favoured earlier births. And, indeed, compared to other animals, humans are born prematurely, when many of their vital systems are still under-developed. A colt can trot shortly after birth; a kitten leaves its mother to forage on its own when it is just a few weeks old. Human babies are helpless, dependent for many years on their elders for sustenance, protection and education.

Raising children required constant help from other family members and neighbours. It takes a tribe to raise a human. Evolution thus favoured those capable of forming strong social ties. In addition, since humans are born underdeveloped, they can be educated and socialised to a far greater extent than any other animal. This is a key to understanding our history and psychology. Genus Homo’s position in the food chain was, until quite recently, solidly in the middle. For millions of years, humans hunted smaller creatures and gathered what they could, all the while being hunted by larger predators. It was only 400,000 years ago that several species of man began to hunt large game on a regular basis, and only in the last 100,000 years – with the rise of Homo sapiens – that man jumped to the top of the food chain.

Millions of years of dominion have filled them with self-confidence. Sapiens by contrast is more like a banana republic dictator. Having so recently been one of the underdogs of the savannah, we are full of fears and anxieties over our position, which makes us doubly cruel and dangerous.

According to this theory Homo sapiens is primarily a social animal. Social cooperation is our key for survival and reproduction. It is not enough for individual men and women to know the whereabouts of lions and bison. It’s much more important for them to know who in their band hates whom, who is sleeping with whom, who is honest, and who is a cheat.

How did Homo sapiens manage to cross this critical threshold, eventually founding cities comprising tens of thousands of inhabitants and empires ruling hundreds of millions? The secret was probably the appearance of fiction. Large numbers of strangers can cooperate successfully by believing in common myths.

In contrast, ever since the Cognitive Revolution, Sapiens have been able to change their behaviour quickly, transmitting new behaviours to future generations without any need of genetic or environmental change. As a prime example, consider the repeated appearance of childless elites, such as the Catholic priesthood, Buddhist monastic orders and Chinese eunuch bureaucracies. The existence of such elites goes against the most fundamental principles of natural selection, since these dominant members of society willingly give up procreation.

Trade may seem a very pragmatic activity, one that needs no fictive basis. Yet the fact is that no animal other than Sapiens engages in trade, and all the Sapiens trade networks about which we have detailed evidence were based on fictions. Trade cannot exist without trust, and it is very difficult to trust strangers. The global trade network of today is based on our trust in such fictional entities as the dollar, the Federal Reserve Bank, and the totemic trademarks of corporations.

Significant differences begin to appear only when we cross the threshold of 150 individuals, and when we reach 1,000–2,000 individuals, the differences are astounding. If you tried to bunch together thousands of chimpanzees into Tiananmen Square, Wall Street, the Vatican or the headquarters of the United Nations, the result would be pandemonium. By contrast, Sapiens regularly gather by the thousands in such places. Together, they create orderly patterns – such as trade networks, mass celebrations and political institutions – that they could never have created in isolation.

The pursuit of an easier life resulted in much hardship, and not for the last time. It happens to us today. How many young college graduates have taken demanding jobs in high-powered firms, vowing that they will work hard to earn money that will enable them to retire and pursue their real interests when they are thirty-five? But by the time they reach that age, they have large mortgages, children to school, houses in the suburbs that necessitate at least two cars per family, and a sense that life is not worth living without really good wine and expensive holidays abroad. What are they supposed to do, go back to digging up roots? No, they double their efforts and keep slaving away. One of history’s few iron laws is that luxuries tend to become necessities and to spawn new obligations.

As humans spread around the world, so did their domesticated animals. Ten thousand years ago, not more than a few million sheep, cattle, goats, boars and chickens lived in restricted Afro-Asian niches. Today the world contains about a billion sheep, a billion pigs, more than a billion cattle, and more than 25 billion chickens. And they are all over the globe. The domesticated chicken is the most widespread fowl ever. Following Homo sapiens, domesticated cattle, pigs and sheep are the second, third and fourth most widespread large mammals in the world. From a narrow evolutionary perspective, which measures success by the number of DNA copies, the Agricultural Revolution was a wonderful boon for chickens, cattle, pigs and sheep.

To ensure that the pigs can’t run away, farmers in northern New Guinea slice off a chunk of each pig’s nose. This causes severe pain whenever the pig tries to sniff. Since the pigs cannot find food or even find their way around without sniffing, this mutilation makes them completely dependent on their human owners. In another area of New Guinea, it has been customary to gouge out pigs’ eyes, so that they cannot even see.

Immediately after birth the calf is separated from its mother and locked inside a tiny cage not much bigger than the calf’s own body. There the calf spends its entire life – about four months on average. It never leaves its cage, nor is it allowed to play with other calves or even walk – all so that its muscles will not grow strong. Soft muscles mean a soft and juicy steak. The first time the calf has a chance to walk, stretch its muscles and touch other calves is on its way to the slaughterhouse. In evolutionary terms, cattle represent one of the most successful animal species ever to exist. At the same time, they are some of the most miserable animals on the planet.

Until the late modern era, more than 90 per cent of humans were peasants who rose each morning to till the land by the sweat of their brows.

Yet the idea that all humans are equal is also a myth. In what sense do all humans equal one another? Is there any objective reality, outside the human imagination, in which we are truly equal? Are all humans equal to one another biologically?

So here is that line from the American Declaration of Independence translated into biological terms: We hold these truths to be self-evident, that all men evolved differently, that they are born with certain mutable characteristics, and that among these are life and the pursuit of pleasure.

When, in 1860, a majority of American citizens concluded that African slaves are human beings and must therefore enjoy the right of liberty, it took a bloody civil war to make the southern states acquiesce.

How do you cause people to believe in an imagined order such as Christianity, democracy or capitalism? First, you never admit that the order is imagined. You always insist that the order sustaining society is an objective reality created by the great gods or by the laws of nature. People are unequal, not because Hammurabi said so, but because Enlil and Marduk decreed it. People are equal, not because Thomas Jefferson said so, but because God created them that way. Free markets are the best economic system, not because Adam Smith said so, but because these are the immutable laws of nature. You also educate people thoroughly.

An objective phenomenon exists independently of human consciousness and human beliefs. Radioactivity, for example, is not a myth. Radioactive emissions occurred long before people discovered them, and they are dangerous even when people do not believe in them.

The subjective is something that exists depending on the consciousness and beliefs of a single individual. It disappears or changes if that particular individual changes his or her beliefs. Many a child believes in the existence of an imaginary friend who is invisible and inaudible to the rest of the world. The imaginary friend exists solely in the child’s subjective consciousness, and when the child grows up and ceases to believe in it, the imaginary friend fades away. The inter-subjective is something that exists within the communication network linking the subjective consciousness of many individuals. If a single individual changes his or her beliefs, or even dies, it is of little importance. However, if most individuals in the network die or change their beliefs, the inter-subjective phenomenon will mutate or disappear. Inter-subjective phenomena are neither malevolent frauds nor insignificant charades. They exist in a different way from physical phenomena such as radioactivity, but their impact on the world may still be enormous. Many of history’s most important drivers are inter-subjective: law, money, gods, nations.

If I alone were to stop believing in the dollar, in human rights, or in the United States, it wouldn’t much matter. These imagined orders are inter-subjective, so in order to change them we must simultaneously change the consciousness of billions of people, which is not easy.

The Sumerians thereby released their social order from the limitations of the human brain, opening the way for the appearance of cities, kingdoms and empires. The data-processing system invented by the Sumerians is called ‘writing’.

Book Notes: Tribe by Sebastian Junger

Tribe: On Homecoming and Belonging by Sebastian Junger is a short book about how humans relate to each other and how modern society is pulling us away from our “tribal” roots. It touches on many topics related to community, war, how our very old brain is ill-equipped for modern society, the community norms of Native Americans, and more. Somehow it manages to hold together as a stimulating and coherent read from start to finish. I think it does accurately describe some of the dynamics that lead to modern unhappiness. Recommended. Highlighted sentences from the Kindle below, not in order.



“Thousands of Europeans are Indians, and we have no examples of even one of those Aborigines having from choice become European,” a French émigré named Hector de Crèvecoeur lamented in 1782. “There must be in their social bond something singularly captivating and far superior to anything to be boasted of among us.”

It’s easy for people in modern society to romanticize Indian life, and it might well have been easy for men like George as well. That impulse should be guarded against. Virtually all of the Indian tribes waged war against their neighbors and practiced deeply sickening forms of torture. Prisoners who weren’t tomahawked on the spot could expect to be disemboweled and tied to a tree with their own intestines or blistered to death over a slow fire or simply hacked to pieces and fed alive to the dogs.

A person living in a modern city or a suburb can, for the first time in history, go through an entire day—or an entire life—mostly encountering complete strangers. They can be surrounded by others and yet feel deeply, dangerously alone.

The psychological effect of placing such importance on affluence can be seen in microcosm in the legal profession. In 2015, the George Washington Law Review surveyed more than 6,000 lawyers and found that conventional success in the legal profession—such as high billable hours or making partner at a law firm—had zero correlation with levels of happiness and well-being reported by the lawyers themselves. In fact, public defenders, who have far lower status than corporate lawyers, seem to lead significantly happier lives.

Bluntly put, modern society seems to emphasize extrinsic values over intrinsic ones, and as a result, mental health issues refuse to decline with growing wealth. The more assimilated a person is into American society, the more likely they are to develop depression during the course of their lifetime, regardless of what ethnicity they are. Mexicans born in the United States are wealthier than Mexicans born in Mexico but far more likely to suffer from depression.

“The economic and marketing forces of modern society have engineered an environment… that maximize[s] consumption at the long-term cost of well-being,” a study in the Journal of Affective Disorders concluded in 2012. “In effect, humans have dragged a body with a long hominid history into an overfed, malnourished, sedentary, sunlight-deficient, sleep-deprived, competitive, inequitable, and socially-isolating environment with dire consequences.”

Baby rhesus monkeys were separated from their mothers and presented with the choice of two kinds of surrogates: a cuddly mother made out of terry cloth or an uninviting mother made out of wire mesh. The wire mesh mother, however, had a nipple that dispensed warm milk. The babies took their nourishment as quickly as possible and then rushed back to cling to the terry cloth mother, which had enough softness to provide the illusion of affection. Clearly, touch and closeness are vital to the health of baby primates—including humans.

Also unthinkable would be the modern practice of making young children sleep by themselves. In two American studies of middle-class families during the 1980s, 85 percent of young children slept alone in their own room—a figure that rose to 95 percent among families considered “well educated.” Northern European societies, including America, are the only ones in history to make very young children sleep alone in such numbers. The isolation is thought to make many children bond intensely with stuffed animals for reassurance. Only in Northern European societies do children go through the well-known developmental stage of bonding with stuffed animals; elsewhere, children get their sense of safety from the adults sleeping near them.

Boehm points out that among current-day foraging groups, group execution is one of the most common ways of punishing males who try to claim a disproportionate amount of the group’s resources.

One year into the siege, just before I got to the city, a teenage couple walked into no-man’s-land along the Miljacka River, trying to cross into a Serb-held area. They were quickly gunned down, the young man falling first and the woman crawling over to him as she died. He was a Serb and she was a Muslim, and they had been in love all through high school. They lay there for days because the area was too dangerous for anyone to retrieve their bodies.

American analysts based in England monitored the effects of the bombing to see if any cracks began to appear in the German resolve, and to their surprise found exactly the opposite: the more the Allies bombed, the more defiant the German population became. Industrial production actually rose in Germany during the war. And the cities with the highest morale were the ones—like Dresden—that were bombed the hardest.

He was unable to find a single instance where communities that had been hit by catastrophic events lapsed into sustained panic, much less anything approaching anarchy. If anything, he found that social bonds were reinforced during disasters, and that people overwhelmingly devoted their energies toward the good of the community rather than just themselves.

According to a study based on a century of records at the Carnegie Hero Fund Commission, male bystanders performed more than 90 percent of spontaneous rescues of strangers, and around one in five were killed in the attempt. (“Hero” is generally defined as risking your life to save non-kin from mortal danger. The resulting mortality rate is higher than for most US combat units.) Researchers theorize that greater upper-body strength and a predominantly male personality trait known as “impulsive sensation seeking” lead men to overwhelmingly dominate this form of extreme caretaking.

The greater empathic concern women demonstrate for others may lead them to take positions on moral or social issues that men are less likely to concern themselves with.

In late 2015, a bus in eastern Kenya was stopped by gunmen from an extremist group named Al-Shabaab that made a practice of massacring Christians as part of a terrorism campaign against the Western-aligned Kenyan government. The gunmen demanded that Muslim and Christian passengers separate themselves into two groups so that the Christians could be killed, but the Muslims—most of whom were women—refused to do it. They told the gunmen that they would all die together if necessary, but that the Christians would not be singled out for execution. The Shabaab eventually let everyone go.

What would you risk dying for—and for whom—is perhaps the most profound question a person can ask themselves.

“The miners’ code of rescue meant that each trapped miner had the knowledge that he would never be buried alive if it were humanly possible for his friends to reach him,” a 1960 study called Individual and Group Behavior in a Coal Mine Disaster explained. “At the same time, the code was not rigid enough to ostracize those who could not face the rescue role.”

If women aren’t present to provide the empathic leadership that every group needs, certain men will do it. If men aren’t present to take immediate action in an emergency, women will step in.

Twenty years after the end of the siege of Sarajevo, I returned to find people talking a little sheepishly about how much they longed for those days. More precisely, they longed for who they’d been back then. Even my taxi driver on the ride from the airport told me that during the war, he’d been in a special unit that slipped through the enemy lines to help other besieged enclaves. “And now look at me,” he said, dismissing the dashboard with a wave of his hand.

“We didn’t learn the lesson of the war, which is how important it is to share everything you have with human beings close to you. The best way to explain it is that the war makes you an animal. We were animals. It’s insane—but that’s the basic human instinct, to help another human being who is sitting or standing or lying close to you.” I asked Ahmetašević if people had ultimately been happier during the war. “We were the happiest,” Ahmetašević said. Then she added: “And we laughed more.”

Given the profound alienation of modern society, when combat vets say that they miss the war, they might be having an entirely healthy response to life back home. Iroquois warriors did not have to struggle with that sort of alienation because warfare and society existed in such close proximity that there was effectively no transition from one to the other.

But the very worst experience, by far, was having a friend die. In war after war, army after army, losing a buddy is considered the most devastating thing that can possibly happen. It is far more disturbing than experiencing mortal danger oneself and often serves as a trigger for psychological breakdown on the battlefield or later in life.

Horrific experiences are unfortunately a human universal, but long-term impairment from them is not, and despite billions of dollars spent on treatment, roughly half of Iraq and Afghanistan veterans have applied for permanent PTSD disability. Since only 10 percent of our armed forces experience actual combat, the majority of vets claiming to suffer from PTSD seem to have been affected by something other than direct exposure to danger.

A Veterans Administration counselor I spoke with, who asked to remain anonymous, described having to physically protect someone in a PTSD support group because other vets wanted to beat him up for seeming to fake his trauma. This counselor said that many combat veterans actively avoid the VA because they worry about losing their temper around patients who they think are milking the system. “It’s the real deals—the guys who have seen the most—that this tends to bother,” he told me.

It’s common knowledge in the Peace Corps that as stressful as life in a developing country can be, returning to a modern country can be far harder. One study found that one in four Peace Corps volunteers reported experiencing significant depression after their return home, and that figure more than doubled for people who had been evacuated from their host country during wartime or some other kind of emergency.

One of the most noticeable things about life in the military, even in support units, is that you are almost never alone. Day after day, month after month, you are close enough to speak to, if not touch, a dozen or more people. When I was with American soldiers at a remote outpost in Afghanistan, we slept ten to a hut in bunks that were only a few feet apart. I could touch three other men with my outstretched hand from where I lay. They snored, they talked, they got up in the middle of the night to use the piss tubes, but we always felt safe because we were in a group. The outpost was attacked dozens of times, yet I slept better surrounded by those noisy, snoring men than I ever did camping alone in the woods of New England.

According to Shalev, the closer the public is to the actual combat, the better the war will be understood and the less difficulty soldiers will have when they come home.

Secondly, ex-combatants shouldn’t be seen—or be encouraged to see themselves—as victims. One can be deeply traumatized, as firemen are by the deaths of both colleagues and civilians, without being viewed through the lens of victimhood.

Rachel Yehuda pointed to littering as the perfect example of an everyday symbol of disunity in society. “It’s a horrible thing to see because it sort of encapsulates this idea that you’re in it alone, that there isn’t a shared ethos of trying to protect something shared,” she told me. “It’s the embodiment of every man for himself. It’s the opposite of the military.”

American Indians, proportionally, provide more soldiers to America’s wars than any other demographic group in the country. They are also the product of an ancient culture of warfare that takes great pains to protect the warrior from society, and vice versa.

Unlike criticism, contempt is particularly toxic because it assumes a moral superiority in the speaker.

Book Notes: The Road to Character

I’m a longtime lover of David Brooks’ columns and books. His most recent book, The Road to Character, resonated. It connects well with the themes I touch on in my essay Happy Ambition: Striving for Success, Avoiding Status Cocaine, and Prioritizing Happiness.

Brooks shares stories about exemplars of moral virtue from various historical periods. He makes a special point to underscore how our modern culture may be distorting the most fulfilling version of the good life. He challenges those who take moral shortcuts en route to professional achievement. Most of all, he recommends prioritizing “eulogy virtues” (the sorts of things people talk about at your funeral) over “resume virtues” (the sorts of accomplishments you list on your resume).

Highly recommended. Below are some of my favorite paragraphs and sentences.


The self-effacing person is soothing and gracious, while the self-promoting person is fragile and jarring. Humility is freedom from the need to prove you are superior all the time, but egotism is a ravenous hunger in a small space—self-concerned, competitive, and distinction-hungry. Humility is infused with lovely emotions like admiration, companionship, and gratitude. “Thankfulness,” the Archbishop of Canterbury, Michael Ramsey, said, “is a soil in which pride does not easily grow.”

This is the way humility leads to wisdom. Montaigne once wrote, “We can be knowledgeable with other men’s knowledge, but we can’t be wise with other men’s wisdom.” That’s because wisdom isn’t a body of information. It’s the moral quality of knowing what you don’t know and figuring out a way to handle your ignorance, uncertainty, and limitation.

But we often put our loves out of order. If someone tells you something in confidence and then you blab it as good gossip at a dinner party, you are putting your love of popularity above your love of friendship. If you talk more at a meeting than you listen, you may be putting your ardor to outshine above learning and companionship. We do this all the time.

One could, Frankl wrote, still participate in a rapturous passion for one’s beloved and thus understand the full meaning of the words “The angels are lost in perpetual contemplation of an infinite glory.”

A dozen voices from across the institution told students that while those who lead flat and unremarkable lives may avoid struggle, a well-lived life involves throwing oneself into struggle, that large parts of the most worthy lives are spent upon the rack, testing moral courage and facing opposition and ridicule, and that those who pursue struggle end up being happier than those who pursue pleasure.

Eisenhower himself would later lose his own firstborn son, Doud Dwight, known in the family as “Icky,” an experience that darkened his world ever after. “This was the greatest disappointment and disaster in my life,” he would write decades later, “the one I have never been able to forget completely. Today, when I think of it, even now as I write about it, the keenness of our loss comes back to me as fresh and terrible as it was in that long dark day soon after Christmas, 1920.” The fragility and remorselessness of this life demanded a certain level of discipline.

When modern culture tries to replace sin with ideas like error or insensitivity, or tries to banish words like “virtue,” “character,” “evil,” and “vice” altogether, that doesn’t make life any less moral; it just means we have obscured the inescapable moral core of life with shallow language. It just means we think and talk about these choices less clearly, and thus become increasingly blind to the moral stakes of everyday life.

We really do have dappled souls. The same ambition that drives us to build a new company also drives us to be materialistic and to exploit. The same lust that leads to children leads to adultery. The same confidence that can lead to daring and creativity can lead to self-worship and arrogance.

The danger of sin, in other words, is that it feeds on itself. Small moral compromises on Monday make you more likely to commit other, bigger moral compromises on Tuesday. A person lies to himself and soon can no longer distinguish when he is lying to himself and when he isn’t. Another person is consumed by the sin of self-pity, a passion to be a righteous victim that devours everything around it as surely as anger or greed.

Since self-control is a muscle that tires easily, it is much better to avoid temptation in the first place rather than try to resist it once it arises.

He was willing to appear tongue-tied if it would help him conceal his true designs. Just as he learned to suppress his anger as a boy, he learned to suppress his ambitions and abilities as an adult. He was reasonably learned in ancient history, admiring especially the crafty Athenian leader Themistocles, but he never let that on. He did not want to appear smarter than other people, or somehow superior to the average American. Instead he cultivated the image of simple, unlearned charm.

Eisenhower, for example, was fueled by passion and policed by self-control. Neither impulse was entirely useless and neither was entirely benign. Eisenhower’s righteous rage could occasionally propel him toward justice, but it could occasionally blind him. His self-control enabled him to serve and do his duty, but it could make him callous. 

He distrusts passionate intensity and bold simplicity because he knows that in politics the lows are lower than the highs are high—the damage leaders do when they get things wrong is greater than the benefits they create when they get things right. Therefore caution is the proper attitude, an awareness of the limits the foundation of wisdom.

The whole object of VMI training was to teach Marshall how to exercise controlled power. The idea was that power exaggerates the dispositions—making a rude person ruder and a controlling person more controlling. The higher you go in life, the fewer people there are to offer honest feedback or restrain your unpleasant traits. So it is best to learn those habits of self-restraint, including emotional self-restraint, at an early age. “What I learned at VMI was self-control, discipline, so that it was ground in,” he would recall later.

And indeed, if we look at love in its most passionate phase, we see that love often does several key things to reorient the soul. The first thing it does is humble us. It reminds us that we are not even in control of ourselves.

Love is a surrender. You expose your deepest vulnerabilities and give up your illusions of self-mastery. This vulnerability and the desire for support can manifest itself in small ways. Eliot once wrote, “There is something strangely winning to most women in that offer of the firm arm; the help is not wanted physically at the moment, but the sense of help, the presence of strength that is outside them and yet theirs, meets a continual want of imagination.”

Next, love decenters the self. Love leads you out of your natural state of self-love. Love makes other people more vivid to you than you are to yourself.

If the shallow person lives in the smallness of his own ego, a person in love finds that the ultimate riches are not inside, they are out there, in the beloved and in the sharing of a destiny with the beloved. A successful marriage is a fifty-year conversation getting ever closer to that melding of mind and heart. Love expresses itself in shared smiles and shared tears and ends with the statement, “Love you? I am you.”

Self-control is like a muscle. If you are called upon to exercise self-control often in the course of a day, you get tired and you don’t have enough strength to exercise as much self-control in the evening. But love is the opposite. The more you love, the more you can love. A person who has one child does not love that child less when the second and third child come along. A person who loves his town does not love his country less. Love expands with use.

Augustine’s feeling of fragmentation has its modern corollary in the way many contemporary young people are plagued by a frantic fear of missing out. The world has provided them with a superabundance of neat things to do. Naturally, they hunger to seize every opportunity and taste every experience. They want to grab all the goodies in front of them. They want to say yes to every product in the grocery store. They are terrified of missing out on anything that looks exciting. But by not renouncing any of them they spread themselves thin. What’s worse, they turn themselves into goodie seekers, greedy for every experience and exclusively focused on self. If you live in this way, you turn into a shrewd tactician, making a series of cautious semicommitments without really surrendering to some larger purpose. You lose the ability to say a hundred noes for the sake of one overwhelming and fulfilling yes.

Furthermore, the world is so complex, and fate so uncertain, that you can never really control other people or the environment effectively enough to be master of your own destiny. Reason is not powerful enough to build intellectual systems or models to allow you to accurately understand the world around you or anticipate what is to come. Your willpower is not strong enough to successfully police your desires. If you really did have that kind of power, then New Year’s resolutions would work. Diets would work. The bookstores wouldn’t be full of self-help books. You’d need just one and that would do the trick. You’d follow its advice, solve the problems of living, and the rest of the genre would become obsolete. The existence of more and more self-help books is proof that they rarely work.

One key paradox of pride is that it often combines extreme self-confidence with extreme anxiety. The proud person often appears self-sufficient and egotistical but is really touchy and unstable. The proud person tries to establish self-worth by winning a great reputation, but of course this makes him utterly dependent on the gossipy and unstable crowd for his own identity. The proud person is competitive. But there are always other people who might do better. The most ruthlessly competitive person in the contest sets the standard that all else must meet or get left behind. Everybody else has to be just as monomaniacally driven to success. One can never be secure.

One of the things you have to do in order to receive grace is to renounce the idea that you can earn it. You have to renounce the meritocratic impulse that you can win a victory for God and get rewarded for your effort. Then you have to open up to it. You do not know when grace will come to you. But people who are open and sensitive to it testify that they have felt grace at the oddest and at the most needed times.

As the theologian Lisa Fullam has put it, “Humility is a virtue of self-understanding in context, acquired by the practice of other-centeredness.”

Johnson was living a life, familiar to us now but more unusual in his own day, in which he was thrown continually back on himself. Without a settled trade, like farming or teaching, separated from the rootedness of extended family life, he was compelled to live as a sort of freelancer according to his wits. His entire destiny—his financial security, his standing in his community, his friendships, his opinions and meaning as a person—were determined by the ideas that flashed through his mind. The Germans have a word for this condition: Zerrissenheit—loosely, “falling-to-pieces-ness.” This is the loss of internal coherence that can come from living a multitasking, pulled-in-a-hundred-directions existence. This is what Kierkegaard called “the dizziness of freedom.” When the external constraints are loosened, when a person can do what he wants, when there are a thousand choices and distractions, then life can lose coherence and direction if there isn’t a strong internal structure.

Man’s chief merit consists in resisting the impulses of his nature.

He devised a strategy to defeat the envy in his heart. He said that in general he did not believe that one vice should be cured by another. But envy is such a malignant state of mind that the dominance of almost any other quality is to be preferred. So he chose pride. He told himself that to envy another is to admit one’s inferiority, and that it is better to insist on one’s superior merit than to succumb to envy. When tempted to envy another, he persuaded himself of his own superior position. Then, turning in a more biblical direction, he preached charity and mercy. The world is so bursting with sin and sorrow that “there are none to be envied.” Everyone has some deep trouble in their lives. Almost no one truly enjoys their own achievements, since their desires are always leaping forward and torturing them with visions of goods unpossessed.

Montaigne came to realize how hard it was to control one’s own mind, or even one’s body. He despaired over even his own penis, “which intrudes so tiresomely when we do not require it and fails us so annoyingly when we need it most.” But the penis is not alone in its rebellion.

A philosopher can cultivate the greatest mind in history, but one bite from a rabid dog could turn him into a raving idiot. Montaigne is the author of the take-you-down-a-peg saying that “on the loftiest throne in the world we are still only sitting on our own rump.” He argues that “if others examined themselves attentively, as I do, they would find themselves, as I do, full of inanity and nonsense. Get rid of it I cannot without getting rid of myself. We are all steeped in it, one as much as another; but those who are aware of it are a little better off—though I don’t know.” As Sarah Bakewell observes in her superb book on the man, How to Live, that final coda “though I don’t know” is pure Montaigne.

He’s a little lazy, so he learns to relax. (Johnson gave himself fervent self-improvement sermons, but Montaigne would not. Johnson was filled with moral sternness; Montaigne was not.) Montaigne’s mind naturally wanders, so he takes advantage and learns to see things from multiple perspectives. Every flaw comes with its own compensation.

The ardent and the self-demanding have never admired Montaigne. They find his emotional register too narrow, his aspirations too modest, his settledness too bland.

Most important, Johnson understood that it takes some hard pressure to sculpt a character. The material is resistant. There has to be some pushing, some sharp cutting, and hacking. It has to be done in confrontation with the intense events of the real world, not in retreat from them.

This moral realism then found expression in humanists like Samuel Johnson, Michel de Montaigne, and George Eliot, who emphasized how little we can know, how hard it is to know ourselves, and how hard we have to work on the long road to virtue. “We are all of us born in moral stupidity, taking the world as an udder to feed our supreme selves,” Eliot wrote. It was also embodied, in different ways and at different times, in the thought of Dante, Hume, Burke, Reinhold Niebuhr, and Isaiah Berlin. All of these thinkers take a limited view of our individual powers of reason. They are suspicious of abstract thinking and pride. They emphasize the limitations in our individual natures.

Some of these limitations are epistemological: reason is weak and the world is complex. We cannot really grasp the complexity of the world or the full truth about ourselves. Some of these limitations are moral: There are bugs in our souls that lead us toward selfishness and pride, that tempt us to put lower loves over higher loves. Some of the limitations are psychological: We are divided within ourselves, and many of the most urgent motions of our minds are unconscious and only dimly recognized by ourselves. Some of them are social: We are not self-completing creatures. To thrive we have to throw ourselves into a state of dependence—on others, on institutions, on the divine. The place that limitation occupies in the “crooked timber” school is immense.

Then came humanistic psychology led by people like Carl Rogers, the most influential psychologist of the twentieth century. The humanistic psychologists shifted away from Freud’s darker conception of the unconscious and promoted a sky-high estimation of human nature. The primary psychological problem, he argued, is that people don’t love themselves enough, and so therapists unleashed a great wave of self-loving.

If you were born at any time over the last sixty years, you were probably born into what the philosopher Charles Taylor has called “the culture of authenticity.” This mindset is based on the romantic idea that each of us has a Golden Figure in the core of our self. There is an innately good True Self, which can be trusted, consulted, and gotten in touch with. Your personal feelings are the best guide for what is right and wrong.

Pride is the central vice. Pride is a problem in the sensory apparatus. Pride blinds us to the reality of our divided nature. Pride blinds us to our own weaknesses and misleads us into thinking we are better than we are. Pride makes us more certain and closed-minded than we should be. Pride makes it hard for us to be vulnerable before those whose love we need. Pride makes coldheartedness and cruelty possible.

Once the necessities for survival are satisfied, the struggle against sin and for virtue is the central drama of life. No external conflict is as consequential or as dramatic as the inner campaign against our own deficiencies. This struggle against, say, selfishness or prejudice or insecurity gives meaning and shape to life. It is more important than the external journey up the ladder of success.

You become more disciplined, considerate, and loving through a thousand small acts of self-control, sharing, service, friendship, and refined enjoyment. If you make disciplined, caring choices, you are slowly engraving certain tendencies into your mind.

Defeating weakness often means quieting the self. Only by quieting the self, by muting the sound of your own ego, can you see the world clearly. Only by quieting the self can you be open to the external sources of strengths you will need. Only by stilling the sensitive ego can you react with equipoise to the ups and downs of the campaign. The struggle against weakness thus requires the habits of self-effacement—reticence, modesty, obedience to some larger thing—and a capacity for reverence and admiration.

Sin and limitation are woven through our lives. We are all stumblers, and the beauty and meaning of life are in the stumbling—in recognizing the stumbling and trying to become more graceful as the years go by.

Book Notes: Happiness from n+1

I’ve been reading n+1 magazine for a long time. It gives me a flavor for the Brooklyn intellectual hipster scene. Some of the topics are too highfalutin for me, but the writing usually makes me think in a new way. Their greatest hits — essays, that is — have been compiled into a book titled Happiness: 10 Years of n+1. It’s excellent. So many well-written essays on literature, sex, torture, and writing itself. Below are my highlighted paragraphs/sentences. Note they come from different essays, so they do not necessarily make sense in order…


The moral responsibility is not to be intelligent. It’s to think. An attribute, self-satisfied and fixed, gets confused with an action, thinking, which revalues old ideas as well as defends them. Thought adds something new to the world; simple intelligence wields hardened truth like a bludgeon.

Far more effective, the torture theorists say, is to shatter the routines that make us adults without physical violence. Sleep deprivation becomes the favored tactic, accompanied by such macabre tricks as putting clocks forward or back randomly, making sure that the prisoner cannot tell day from night, irregular feeding, temporary starvation, random extremes of temperature, loud music constantly, and, of course, random responses from the captors, either rage or bizarre affection, always absurd and unmotivated. (It’s allowed, for instance, to reward uncooperative behavior, the better to induce false hopes in the subject.)…Deroutinization… At least since the Scottish Enlightenment, people have recognized that habits and chains of association make up a strong part of individual identity, but it would be a twisted view of humans that made our habits both the necessary and sufficient condition of our individual life.

Did they notice that the scene begins when Ivan Karamazov asks his brother if he would torture a child if it meant ensuring happiness for the rest of the world? We know our president’s answer would be an enthusiastic thumbs-up, as long as it’s someone else’s thumb.

Adults project the sex of children in lust, or examine children sexually with magnifying glasses to make sure they don’t appeal to us. But these lenses became burning glasses.

Now children from junior high to high school to college live in the most perfect sex environment devised by contemporary society—or so adults believe. Now they are inmates in great sex colonies where they wheel in circles holding hands with their pants down.

The lesson each time is that sleeping with strangers or being photographed naked lets the authors know themselves better. Many of these institutions are driven by women. Perhaps they, even more than young men, feel an urgency to know themselves while they can—since America curses them with a premonition of disappointment: when flesh sags, freedom will wane.

Though the young person has never been old, the old person once was young. When you look up the age ladder, you look at strangers; when you look down the age ladder, you are always looking at versions of yourself. As an adult, it depends entirely on your conception of yourself whether those fantastic younger incarnations will seem long left behind or all-too-continuous with who you are now. And this conception of yourself depends, in turn, on the culture’s attitudes to adulthood and childhood, age and youth. This is where the trouble arises. For in a culture to which sex furnishes the first true experiences, it makes a kind of sense to return to the ages at which sex was first used to pursue experience and one was supposedly in a privileged position to find it. Now we begin to talk, not about our sex per se, but about a fundamental change in our notion of freedom, and what our lives are a competition for.

At times I wonder if we are witnessing a sexualization of the life process itself, in which all pleasure is canalized into the sexual, and the function of warm, living flesh in any form is to allow us access to autoerotism through the circuit of an other. This is echoed at the intellectual level in the discourse of “self-discovery.” The real underlying question of sexual encounter today may not be “What is he like in bed?” (heard often enough, and said without shame) but “What am I like in bed?” (never spoken). That is to say, at the deepest level, one says: “Whom do I discover myself to be in sex?”—so that sex becomes the special province of self-discovery.