Author Archives: Ben Casnocha

Impressions of Morocco

Locals lined up to see the King drive by in Marrakech

I spent 10 days in Morocco (Marrakech and Fes) over Christmas and New Year’s. Some of my impressions:

  • Quite a beautiful country, physically. The old towns of Marrakech and Fes — the endless alleyways and crevices, and of course the souks — are quite something to take in. You feel transported back in time as you walk the streets and interact with shopkeepers whose families have owned the little candy store or butcher or scarf outlet for more decades than you can count. Outside the cities, the red desert landscape is punctuated by the Atlas mountain range.
  • Food tours continue to be a highlight of my trips. The Marrakech city food tour brought us to hole-in-the-wall couscous restaurants that would have been impossible to find otherwise, and brought us face to face with the (whole) cooked head of a sheep. The eyeballs are a delicacy. I can only speak to the taste of the cheek meat…
  • Rug negotiations. Morocco is known for its carpets. Rug makers spend years in the mountains hand-weaving gorgeous rugs that eventually find their way into the dozens (hundreds?) of rug shops in the cities. Then the rugs are hawked aggressively to the large numbers of French and Spanish tourists wandering the alleyways. “You want rug? Low price. Only the best price.” That sort of thing. Once inside the shop, the game of salesmanship and negotiation is fun to watch. They’ve been selling rugs for so long that you can bet every word and expression offered during the display of rugs and the discussion of colors is a refined technique aimed at moving product.
  • Morocco is a Muslim country, so there’s no alcohol served in most restaurants and you don’t see people drinking. (It makes New Year’s Eve a rather sober affair in Morocco.) That said, Moroccan wine is produced and consumed in huge quantities in the country…
  • Men rule. During the day, the cafes are occupied almost exclusively by men. The stores are manned almost exclusively by men. “The women are at home,” our driver told us.
  • Beautiful chandeliers and wall designs. Even in the most podunk, non-hip restaurant, the light fixture will be some crazy ornate work of art. The Moroccan aesthetic is very popular in the U.S. — as a luxury good. In Morocco, the Moroccan aesthetic is…everywhere.
  • All hail the king. The King of Morocco, who’s usually stationed in Rabat, came through Marrakech when we were there. News spread and soon thousands of people had lined up along the street to see him. Hours and hours and hours passed. Shops closed. Streets were blocked off. We were stranded — unable to cross a street back to our riad given the road closure. Finally the king’s motorcade whizzed by, the people on the street waved, and two seconds later it was all over. When we asked a guide about the scene later, he said shutting down the economic heartbeat of a city for a full day just so people can catch a glimpse of the king is nuts. “It’s why the government likes its people illiterate and uneducated. They’ll blindly be entranced by the king,” he said.
  • Morocco’s riads (hotels inside the medina) all offer hamams; getting your body scrubbed down feels great. It’s crazy how much dead skin comes off. Hamams and regular Swedish massages are dirt cheap in Morocco, so you can help yourself to multiple servings of each.

Marrakech is a three-hour flight from London. It’s well worth a visit. Read the memoir “Dreams of Trespass” on the flight over.

Happy 2017. I hope it’s a fun and peaceful year for you, wherever you are…

Heleo Interview Part 1

Here’s a six-minute video excerpt (part 1) of a recent conversation I had with the business site Heleo. We cover how adaptation is a business and life skill; how to get feedback on how you should adapt; what one might need to unlearn from school; and why there IS such a thing as a dumb question in a meeting.

Book Review: But What If We’re Wrong?

Chuck Klosterman wrote one of the most stimulating books that I read in 2016: But What If We’re Wrong? Thinking About the Present As If It Were the Past. There are countless interesting observations on science and pop culture and sports and history. By contemplating which assumptions of today might be disproven in the future, or which authors of note today might be forgotten in the future and which no-name writers might become famous after their death, he unearths novel theories about familiar topics. Why did Herman Melville become renowned by future generations but not his own? Which theories of science are we totally convinced are true today but may well be proven false by future generations of physicists? Which TV show made in 2016 will be referenced by historians in 2080 when they try to explain what life was like in 2016?

My favorite paragraphs are pasted below, with the bold font mine.

Thanks to Russ Roberts for recommending this via his EconTalk conversation with Chuck.


When you ask smart people if they believe there are major ideas currently accepted by the culture at large that will eventually be proven false, they will say, “Well, of course. There must be. That phenomenon has been experienced by every generation who’s ever lived.”

Aristotle had argued more than a thousand years prior: He believed all objects craved their “natural place,” and that this place was the geocentric center of the universe, and that the geocentric center of the universe was Earth. In other words, Aristotle believed that a dropped rock fell to the earth because rocks belonged on earth and wanted to be there.

For the next thirty years, nothing about the reception of [Moby Dick] changes. But then World War I happens, and—somehow, and for reasons that can’t be totally explained—modernists living in postwar America start to view literature through a different lens. There is a Melville revival. The concept of what a novel is supposed to accomplish shifts in his direction and amplifies with each passing generation…

I suspect most conventionally intelligent people are naïve realists, and I think it might be the defining intellectual quality of this era. The straightforward definition of naïve realism doesn’t seem that outlandish: It’s a theory that suggests the world is exactly as it appears.

Any time you talk to police (or lawyers, or journalists) about any kind of inherently unsolvable mystery, you will inevitably find yourself confronted with the concept of Occam’s Razor: the philosophical argument that the best hypothesis is the one involving the lowest number of assumptions.

The reason something becomes retrospectively significant in a far-flung future is detached from the reason it was significant at the time of its creation—and that’s almost always due to a recalibration of social ideologies that future generations will accept as normative.

The arc of Lethem’s larger contention boils down to two points. The first is that no one is really remembered over the long haul, beyond a few totemic figures—Joyce, Shakespeare, Homer—and that these figures serve as placeholders for the muddled generalization of greatness (“Time is a motherfucker and it’s coming for all of us,” Lethem notes).

The reason shadow histories remained in the shadows lay in the centralization of information: If an idea wasn’t discussed on one of three major networks or on the pages of a major daily newspaper or national magazine, it was almost impossible for that idea to gain traction with anyone who wasn’t consciously searching for alternative perspectives. That era is now over. There is no centralized information, so every idea has the same potential for distribution and acceptance.

Competing modes of discourse no longer “compete.” They coexist.

Take, for example, the plight of Native Americans. What American subculture has suffered more irrevocably? Prior to Columbus’s landing in the New World, the Native American population approached one hundred million. Now it’s a little over three million, two-thirds of whom are relegated to fifty delineated reservations on mostly undesirable land. Still, that equates to roughly 1 percent of the total US population. Yet Native Americans are essentially voiceless, even in conversations that specifically decry the lack of minority representation. Who is the most prominent Native American media figure or politician? Sherman Alexie? Louise Erdrich? Tom Cole or Markwayne Mullin, both of whom are from the same state? Who, for that matter, is the most famous Native American athlete, or rapper, or reality star? Maybe Sam Bradford? Maybe Sacheen Littlefeather, who’s been virtually invisible since the seventies? When the Academy Awards committee next announces the nominations for Best Picture, how many complaints will focus on the lack of films reflecting the Native American experience? Outside the anguish expressed over the use of the term “Redskin” by the Washington football franchise, it’s hard to find conversation about the biases facing Native Americans; outside the TV show Fargo, you almost never see it reflected in the popular culture. Everyone concedes it exists, but it’s not a popular prejudice (at least not among the mostly white liberals who drive these conversations). Their marginalization is ignored, thus creating a fertile factory for the kind of brilliant outsider who won’t be recognized until that artist is dead and gone. So this is one possibility—a Navajo Kafka.

Kurt Vonnegut’s A Man Without a Country: “I think that novels that leave out technology misrepresent life as badly as Victorians misrepresented life by leaving out sex.”

…the myth of universal timeliness. There is a misguided belief—often promoted by creative writing programs—that producing fiction excessively tied to technology or popular culture cheapens the work and detracts from its value over time. If, for example, you create a plot twist that hinges on the use of an iPad, that story will (allegedly) become irrelevant once iPads are replaced by a new form of technology. If a character in your story is obsessed with watching Cheers reruns, the meaning of that obsession will (supposedly) evaporate once Cheers disappears from syndication. If your late-nineties novel is consumed with Monica Lewinsky, the rest of the story (purportedly) devolves into period piece. The goal, according to advocates of this philosophy, is to build a narrative that has no irretraceable connection to the temporary world. But that’s idiotic, for at least two reasons. The first is that it’s impossible to generate deep verisimilitude without specificity. The second is that if you hide the temporary world and the work somehow does beat the odds and become timeless, the temporary world you hid will become the only thing anyone cares about.

But I’ve been a paid critic for enough years to know my profession regularly overrates many, many things by automatically classifying them as potentially underrated. The two terms have become nonsensically interchangeable.

The nonfiction wing of this level houses elemental tacticians like Robert Caro; someone like William T. Vollmann straddles both lines, fortified by his sublime recklessness. Even the lesser books from these writers are historically important, because—once you’re defined as great—failures become biographically instructive. 

The third tier houses commercial writers who dependably publish major or minor bestsellers and whose success or failure is generally viewed as a reflection of how much (or how little) those books sell. These individuals are occasionally viewed as “great at writing,” but rarely as great writers. They are envied and discounted at the same time. They are what I call “vocally unrated”: A large amount of critical thought is directed toward explaining how these types of novels are not worth thinking about.

Now, if the world were logical, certain predictions could be made about what bricks from that pyramid will have the greatest likelihood of remaining intact after centuries of erosion. Devoid of all other information, a betting man would have to select a level-one writer like Roth, just as any betting man would take the Yankees if forced to wager on who will win the World Series one hundred seasons from now. If you don’t know what the weather will be like tomorrow, assume it will be pretty much the same as today. But this would require an astonishing cultural stasis. It would not simply mean that the way we presently consume and consider Roth will be the way Roth is consumed and considered forevermore; it would mean that the manner in which we value and assess all novels will remain unchanged. It also means Roth must survive his inevitable post-life reevaluation by the first generation of academics who weren’t born until he was already gone, a scenario where there will be no room for advancement and plenty of room for diminishing perceptions (no future contrarian can provocatively claim, “Roth is actually better than everyone thought at the time,” because—at the time—everyone accepted that he was viewed as remarkable). He is the safest bet, but still not a safe bet. Which is why I find myself fixated on the third and sixth tiers of my imaginary triangle: “the unrated.” As specific examples, they all face immeasurable odds. But as a class, they share certain perverse advantages.

Normal consumers declare rock to be dead whenever they personally stop listening to it (or at least to new iterations of it), which typically happens about two years after they graduate from college.

The Beatles were the first major band to write their own songs, thus making songwriting a prerequisite for credibility; they also released tracks that unintentionally spawned entire subgenres of rock, such as heavy metal (“Helter Skelter”), psychedelia (“Tomorrow Never Knows”), and country rock (“I’ll Cry Instead”).

Do I think the Beatles will be remembered in three hundred years? Yes. I believe the Beatles will be the Sousa of Rock (alongside Michael Jackson, the Sousa of Pop). If this were a book of predictions, that’s the prediction I’d make. But this is not a book about being right. This is a book about being wrong, and my faith in wrongness is greater than my faith in the Beatles’ unassailability. What I think will happen is probably not what’s going to happen. So I will consider what might happen instead.

Since rock, pop, and rap are so closely tied to youth culture, there’s an undying belief that young people are the only ones who can really know what’s good. It’s the only major art form where the opinion of a random fourteen-year-old is considered more relevant than the analysis of a sixty-four-year-old scholar. (This is why it’s so common to see aging music writers championing new acts that will later seem comically overrated—once they hit a certain age, pop critics feel an obligation to question their own taste.)

Take architecture: Here we have a creative process of immense functional consequence. It’s the backbone of the urban world we inhabit, and it’s an art form most people vaguely understand—an architect is a person who designs a structure on paper, and that design emerges as the structure itself. Architects fuse aesthetics with physics and sociology. And there is a deep consensus over who did this best, at least among non-architects: If we walked down the street of any American city and asked people to name the greatest architect of the twentieth century, most would say Frank Lloyd Wright. In fact, if someone provided a different answer, we’d have to assume we’ve stumbled across an actual working architect, an architectural historian, or a personal friend of Frank Gehry. Of course, most individuals in those subsets would cite Wright, too. But in order for someone to argue in favor of any architect except Wright (or even to be in a position to name three other plausible candidates), that person would almost need to be an expert in architecture. Normal humans don’t possess enough information to nominate alternative possibilities. And what emerges from that social condition is an insane kind of logic: Frank Lloyd Wright is indisputably the greatest architect of the twentieth century, and the only people who’d potentially disagree with that assertion are those who legitimately understand the question. History is defined by people who don’t really understand what they are defining.

I don’t believe all art is the same. I wouldn’t be a critic if I did. Subjective distinctions can be made, and those distinctions are worth quibbling about. The juice of life is derived from arguments that don’t seem obvious. But I don’t believe subjective distinctions about quality transcend to anything close to objective truth—and every time somebody tries to prove otherwise, the results are inevitably galvanized by whatever it is they get wrong.

To matter forever, you need to matter to those who don’t care. And if that strikes you as sad, be sad.

But maybe it takes an idiot to pose this non-idiotic question: How do we know we’re not currently living in our own version of the year 1599? According to Tyson, we have not reinvented our understanding of scientific reality since the seventeenth century. Our beliefs have been relatively secure for roughly four hundred years. That’s a long time—except in the context of science. In science, four hundred years is a grain in the hourglass.

One of Greene’s high-profile signatures is his support for the concept of “the multiverse.” Now, what follows will be an oversimplification—but here’s what that connotes: Generally, we work from the assumption that there is one universe, and that our galaxy is a component of this one singular universe that emerged from the Big Bang. But the multiverse notion suggests there are infinite (or at least numerous) universes beyond our own, existing as alternative realities. Imagine an endless roll of bubble wrap; our universe (and everything in it) would be one tiny bubble, and all the other bubbles would be other universes that are equally vast. In his book The Hidden Reality, Greene maps out nine types of parallel universes within this hypothetical system.

“In physics, when we say we know something, it’s very simple,” Tyson reiterates. “Can we predict the outcome? If we can predict the outcome, we’re good to go, and we’re on to the next problem. There are philosophers who care about the understanding of why that was the outcome. Isaac Newton [essentially] said, ‘I have an equation that says why the moon is in orbit. I have no fucking idea how the Earth talks to the moon. It’s empty space—there’s no hand reaching out.’”

Galileo famously refused to chill and published his Dialogue Concerning the Two Chief World Systems as soon as he possibly could, mocking all those who believed (or claimed to believe) that the Earth was the center of the universe. The pope, predictably, was not stoked to hear this. But the Vatican still didn’t execute Galileo; he merely spent the rest of his life under house arrest (where he was still allowed to write books about physics) and lived to be seventy-seven.

What Bostrom is asserting is that there are three possibilities about the future, one of which must be true. The first possibility is that the human race becomes extinct before reaching the stage where such a high-level simulation could be built. The second possibility is that humans do reach that stage, but for whatever reason—legality, ethics, or simple disinterest—no one ever tries to simulate the complete experience of civilization. The third possibility is that we are living in a simulation right now. Why? Because if it’s possible to create this level of computer simulation (and if it’s legally and socially acceptable to do so), there won’t just be one simulation. There will be an almost limitless number of competing simulations, all of which would be disconnected from each other. A computer program could be created that does nothing except generate new simulations, all day long, for a thousand consecutive years. And once those various simulated societies reach technological maturity, they would (assumedly) start creating simulations of their own—simulations inside of simulations.

The term “conspiracy theory” has an irrevocable public relations problem. Technically, it’s just an expository description for a certain class of unproven scenario. But the problem is that it can’t be self-applied without immediately obliterating whatever it’s allegedly describing. You can say, “I suspect a conspiracy,” and you can say, “I have a theory.” But you can’t say, “I have a conspiracy theory.” Because if you do, it will be assumed that even you don’t entirely believe the conspiracy you’re theorizing about.

But it still must be asked: Discounting those events that occurred within your own lifetime, what do you know about human history that was not communicated to you by someone else? This is a question with only one possible answer.

This, it seems, has become the standard way to compartmentalize a collective, fantastical phenomenon: Dreaming is just something semi-interesting that happens when our mind is at rest—and when it happens in someone else’s mind (and that person insists on describing it to us at breakfast), it isn’t interesting at all.

[On the Buzzfeed blue vs. gold dress viral phenom.] The next day, countless pundits tried to explain why this had transpired. None of their explanations were particularly convincing. Most were rooted in the idea that this happened because we were all looking at a photo of a dress, as opposed to the dress itself. But that only shifts the debate, without really changing it—why, exactly, would two people see the same photograph in two completely different ways?

Adams is the author of On the Genealogy of Color. He believes the topic of color is the most concrete way to consider the question of how much—or how little—our experience with reality is shared with the experience of other people. It’s an unwieldy subject that straddles both philosophy and science. On one hand, it’s a physics argument about the essential role light plays in our perception of color; at the same time, it’s a semantic argument over how color is linguistically described differently by different people. There’s also a historical component: Up until the discovery of color blindness in the seventeenth century, it was assumed that everyone saw everything the same way (and it took another two hundred years before we realized how much person-to-person variation there is). What really changed four hundred years ago was due (once again) to the work of Newton and Descartes, this time in the field of optics. Instead of things appearing “red” simply because of their intrinsic “redness” (which is what Aristotle believed), Newton and Descartes realized it has to do with an object’s relationship to light.

On the same day I spoke with Linklater about dreams, there was a story in The New York Times about a violent incident that had occurred a few days prior in Manhattan. A man had attacked a female police officer with a hammer and was shot by the policewoman’s partner. This shooting occurred at ten a.m., on the street, in the vicinity of Penn Station. Now, one assumes seeing a maniac swinging a hammer at a cop’s skull before being shot in broad daylight would be the kind of moment that sticks in a person’s mind. Yet the Times story explained how at least two of the eyewitness accounts of this event ended up being wrong. Linklater was fascinated by this: “False memories, received memories, how we fill in the blanks of conjecture, the way the brain fills in those spaces with something that is technically incorrect—all of these errors allow us to make sense of the world, and are somehow accepted enough to be admissible in a court of law. They are accepted enough to put someone in prison.” And this, remember, was a violent incident that had happened only hours before. The witnesses were describing something that had happened that same day, and they had no incentive to lie.

How much of history is classified as true simply because it can’t be sufficiently proven false?

All of which demands a predictable question: What significant historical event is most likely wrong? And not because of things we know that contradict it, but because of the way wrongness works.

When D. T. Max published his posthumous biography of David Foster Wallace, it was depressing to discover that many of the most memorable, electrifying anecdotes from Wallace’s nonfiction were total fabrications.

In Ken Burns’s documentary series The Civil War, the most fascinating glimpses of the conflict come from personal letters written by soldiers and mailed to their families. When these letters are read aloud, they almost make me cry. I robotically consume those epistles as personal distillations of historical fact. There is not one moment of The Civil War that feels false. But why is that? Why do I assume the things Confederate soldiers wrote to their wives might not be wildly exaggerated, or inaccurate, or straight-up untruths?

I doubt the current structure of television will exist in two hundred fifty years, or even in twenty-five. People will still want cheap escapism, and something will certainly satisfy that desire (in the same way television does now). But whatever that something is won’t be anything like the television of today. It might be immersive and virtual (like a Star Trekian holodeck) or it might be mobile and open-sourced (like a universal YouTube, lodged inside our retinas). But it absolutely won’t be small groups of people, sitting together in the living room, staring at a two-dimensional thirty-one-inch rectangle for thirty consecutive minutes, consuming linear content packaged by a cable company.

[To understand a given era through a TV show.] We’d want a TV show that provided the most realistic portrait of the society that created it, without the self-aware baggage embedded in any overt attempt at doing so. In this hypothetical scenario, the most accurate depiction of ancient Egypt would come from a fictional product that achieved this goal accidentally, without even trying. Because that’s the way it always is, with everything. True naturalism can only be a product of the unconscious. So apply this philosophy to ourselves, and…

To attack True Detective or Lost or Twin Peaks as “unrealistic” is a willful misinterpretation of the intent. We don’t need television to accurately depict literal life, because life can literally be found by stepping outside.

If anyone on a TV show employed the stilted, posh, mid-Atlantic accent of stage actors, it would instantly seem preposterous; outside a few notable exceptions, the goal of televised conversation is fashionable naturalism. But vocal delivery is only a fraction of this equation. There’s also the issue of word choice: It took decades for screenwriters to realize that no adults have ever walked into a tavern and said, “I’ll have a beer,” without noting what specific brand of beer they wanted.

But when a show’s internal rules are good, the viewer is convinced that they’re seeing something close to life. When the rom-com series Catastrophe debuted on Amazon, a close friend tried to explain why the program seemed unusually true to him. “This is the first show I can ever remember,” he said, “where the characters laugh at each other’s jokes in a non-obnoxious way.” This seemingly simple idea was, in fact, pretty novel—prior to Catastrophe, individuals on sitcoms constantly made hilarious remarks that no one seemed to notice were hilarious. For decades, this was an unspoken, internal rule: No one laughs at anything. So seeing characters laugh naturally at things that were plainly funny was a new level of realness. The way a TV show is photographed and staged (this is point number three) are industrial attributes that take advantage of viewers’ preexisting familiarity with the medium: When a fictional drama is filmed like a news documentary, audiences unconsciously absorb the action as extra-authentic (a scene shot from a single mobile perspective, like most of Friday Night Lights, always feels closer to reality than scenes captured with three stationary cameras, like most of How I Met Your Mother).

What is the realest fake thing we’ve ever made on purpose? 

Nothing on TV looks faker than failed attempts at realism. A show like The Bachelor is instantly recognized (by pretty much everyone, including its intended audience) as a prefab version of how such events might theoretically play out in a distant actuality. No television show has ever had a more paradoxical title than MTV’s The Real World, which proved to be the paradoxical foundation of its success.

Roseanne was the most accidentally realistic TV show there ever was…By the standards of TV, both of these people were wildly overweight. Yet what made Roseanne atypical was how rarely those weight issues were discussed. Roseanne was the first American TV show comfortable with the statistical reality that most Americans are fat. And it placed these fat people in a messy house, with most of the key interpersonal conversations happening in the kitchen or the garage or the laundry room. These fat people had three non-gorgeous kids, and the kids complained constantly, and two of them were weird and one never smiled.

The less incendiary take on football’s future suggests that it will continue, but in a different shape. It becomes a regional sport, primarily confined to places where football is ingrained in the day-to-day culture (Florida, Texas, etc.). Its fanbase resembles that of contemporary boxing—rich people watching poor people play a game they would never play themselves.

A few months after being hired as head football coach at the University of Michigan, Jim Harbaugh was profiled on the HBO magazine show Real Sports. It was a wildly entertaining segment, heavily slanted toward the intellection that Harbaugh is a lunatic. One of the last things Harbaugh said in the interview was this: “I love football. Love it. Love it. I think it’s the last bastion of hope for toughness in America in men, in males.”

“But look what happened to boxing,” people will say (and these people sometimes include me). “Boxing was the biggest sport in America during the 1920s, and now it exists on the fringes of society. It was just too brutal.” Yet when Floyd Mayweather fought Manny Pacquiao in May of 2015, the fight grossed $400 million, and the main complaint from spectators was that the fight was not brutal enough. Because it operates on a much smaller scale, boxing is—inside its own crooked version of reality—flourishing. It doesn’t seem like it, because the average person doesn’t care. But boxing doesn’t need average people. It’s not really a sport anymore. It’s a mildly perverse masculine novelty, and that’s enough to keep it relevant.

Midway through the episode, the show’s producers try to mathematically verify if youth participation in football is decreasing as much as we suspect. It is. But the specificity of that stat is deceiving: It turns out youth participation is down for all major sports—football, basketball, baseball, and even soccer (the so-called sport of the future). Around the same time, The Wall Street Journal ran a similar story with similar statistics: For all kids between six and eighteen (boys and girls alike), overall participation in team sports was down 4 percent.

But sometimes the reactionaries are right. It’s wholly possible that the nature of electronic gaming has instilled an expectation of success in young people that makes physical sports less desirable. There’s also the possibility that video games are more inclusive, that they give the child more control, and that they’re simply easier for kids who lack natural physical gifts. All of which point to an incontestable conclusion: Compared to traditional athletics, video game culture is much closer to the (allegedly) enlightened world we (supposedly) want to inhabit.

The gap for the Famous Idaho Potato Bowl was even greater—the human attendance was under 18,000 while the TV audience approached 1.5 million. This prompted USA Today to examine the bizarre possibility of future bowl games being played inside gigantic television studios, devoid of crowds.

What makes the United States so interesting and (arguably) “exceptional” is that it’s a superpower that did not happen accidentally. It did not evolve out of a preexisting system that had been the only system its founders could ever remember; it was planned and strategized from scratch, and it was built to last. Just about everyone agrees the founding fathers did a remarkably good job, considering the impossibility of the goal.

This logic leads to a strange question: If and when the United States does ultimately collapse, will that breakdown be a consequence of the Constitution itself? If it can be reasonably argued that it’s impossible to create a document that can withstand the evolution of any society for five hundred or a thousand or five thousand years, doesn’t that mean present-day America’s pathological adherence to the document we happened to inherit will eventually wreck everything?

Wexler notes a few constitutional weaknesses, some hypothetical and dramatic (e.g., what if the obstacles created to make it difficult for a president to declare war allow an enemy to annihilate us with nuclear weapons while we debate the danger) and some that may have outlived their logical practicality without any significant downside (e.g., California and Rhode Island having equal representation in the Senate, regardless of population).

But I would traditionally counter that Washington’s One Big Thing mattered more, and it actually involved something he didn’t do: He declined the opportunity to become king, thus making the office of president more important than any person who would ever hold it. This, as it turns out, never really happened. There is no evidence that Washington was ever given the chance to become king, and—considering how much he and his peers despised the mere possibility of tyranny—it’s hard to imagine this offer was ever on the table.

Washington’s kingship denial falls into the category of a “utility myth”—a story that supports whatever political position the storyteller happens to hold, since no one disagrees with the myth’s core message (i.e., that there are no problems with the design of our government, even if that design allows certain people to miss the point).

…Every strength is a weakness, if given enough time.

Back in the landlocked eighties, Dave Barry offhandedly wrote something pretty insightful about the nature of revisionism. He noted how—as a fifth-grader—he was told that the cause of the Civil War was slavery. Upon entering high school, he was told that the cause was not slavery, but economic factors. At college, he learned that it was not economic factors but acculturalized regionalism. But if Barry had gone to graduate school, the answer to what caused the Civil War would (once again) be slavery.

Much of the staid lionization of Citizen Kane revolves around structural techniques that had never been done before 1941. It is, somewhat famously, the first major movie where the ceilings of rooms are visible to the audience. This might seem like an insignificant detail, but—because no one prior to Kane cinematographer Gregg Toland had figured out a reasonable way to get ceilings into the frame—there’s an intangible, organic realism to Citizen Kane that advances it beyond its time period. Those visible ceilings are a meaningful modernization that twenty-first-century audiences barely notice.

There’s growing evidence that the octopus is far more intelligent than most people ever imagined, partially because most people always assumed they were gross, delicious morons.

Impressions of Malaysia

A stop on the KL food tour

Malaysia is a country of 30 million people that co-anchors the SE Asian economy with Indonesia. It’s also a place known for great food and friendly people. So I was delighted to have the opportunity to visit for the first time recently to give a couple of talks.

I was only in Kuala Lumpur, so it’s probably more accurate to say I visited Kuala Lumpur rather than Malaysia, as the smaller towns and countryside are quite different from the capital city. (The most striking example of this distinction in Asia is the grand canyon of a difference between “Beijing/Shanghai” and “the rest of China.”)

Kuala Lumpur was more relaxed than I was expecting. A few years ago I spent several days in neighboring Jakarta, a city that overwhelms you with traffic and chaos. KL felt positively tranquil by Jakarta standards.

There isn’t a must-see attraction in KL. There are plenty of striking skyscrapers to gawk at; all the luxury hotel brands with posh buildings; some nice-looking mosques; various museums, an aquarium, and so on. The malls are fun and huge and contain everything: movies, nice restaurants, casual restaurants, salons, coffee shops, all sorts of retail, banks, and more. Just wandering around a massive Malaysian mall gives you plenty to look at and think about.

The Kuala Lumpur food tour is very much worth doing. We tasted Malay, Chinese, and Indian food in places where there wasn’t a tourist in sight. A food tour remains a favorite way for me to see a city, learn about its culture and economy, and, of course, taste some of its food. I’ve done them in Istanbul, Copenhagen, Kyoto, and now Kuala Lumpur. Food tours are great for non-foodies too: they tend to emphasize cheap local eats.

Malaysia is predominantly Muslim and seemingly a bit stricter about religious rules than its neighbor Indonesia. In Malaysia, if you’re not Muslim and marry a Muslim, you are legally required to convert — a requirement that’s uncommon in other Muslim-majority countries. Also, the local scandal of the moment in KL, as it was relayed to me by a few secular locals, was a boycott of Auntie Anne’s. Yes – the Western chain that sells those delicious hot dogs and pretzels. The reason for the boycott? The phrase “hot dog” on the menu. Dogs apparently are considered unclean by many Muslims. Thus, the phrase “hot dog” offends. Strange world. Strange times…

The Factor That Colors Happiness and Unhappiness the Most

Is she enjoying the city she’s living in? Is he enjoying his job, his co-workers, his boss? How’s he feeling about life?

In my view, the factor with the most explanatory power on questions of happiness and satisfaction in one’s personal and professional life is this: Is this person satisfied with their romantic relationship status?

Note I am not saying “in a happy relationship.” There are plenty of people who are single and very happy with that status. But those who are single and yet would like to be in a relationship tend to be unhappy, and project that unhappiness across all aspects of their lives. Those in unhappy relationships act similarly.

Romance rules all. To deeply understand a person and their probable happiness is to understand their romantic happenings. Of course, in almost all professional contexts, and in a great many personal ones too, it is inappropriate to probe into such topics. Which is one reason why most friendships are not very deep.

What I’ve Been Reading

Books and more books.

1. Digital Gold by Nathaniel Popper. An incredibly engaging journalistic introduction to Bitcoin and blockchain, with cinematic storytelling about the people who pioneered the technology over the past 15 years. The book is about a year old, and since then Bitcoin has struggled, though I suspect many of the characters in this book — and the experts in real life — remain bullish on cryptocurrencies in the long run. Excellent for those learning the fundamentals of Bitcoin and its history.

2. Dark Matter by Blake Crouch. I made the mistake of starting this sci-fi thriller at 11pm one night in bed. I was up till 1am. It’s a classic page-turner set against the backdrop of the Many-Worlds interpretation of quantum mechanics, which is a real thing and which is super interesting to contemplate. (There is a universe right now where Donald Trump is not the current president-elect, for example.)

3. Lightning Rods by Helen Dewitt. I definitely would not admit to being entertained by this entertaining novel. Definitely not.

4. Upheavals of Thought: The Intelligence of Emotions by Martha Nussbaum. I’m a Nussbaum fan, and the early chapters here contained provocative reflections on the power of emotions, such as how an emotion like fear of death manifests in so many aspects of our thought stream. Ultimately this was too dense for me to make it all the way through, but I’m glad I read as far as I did.

Visiting Prison Again — With Defy Ventures

A few weeks ago, I continued my education on an issue I’m passionate about — criminal justice reform and the prison system — when I participated in a Defy Ventures event at a California maximum security prison. It was an incredibly powerful day.

About 50 VCs and entrepreneurs, mostly from LA, trekked to the prison north of Los Angeles, under the organizing leadership of Mark Suster and Brad Feld and the non-profit Defy Ventures. Defy Ventures, in their own words, “transforms the lives of business leaders and people with criminal histories through their collaboration along the entrepreneurial journey.” Catherine Hoke, founder/CEO of Defy, is one of the most passionate entrepreneurs I’ve ever encountered — and I’ve encountered many passionate entrepreneurs.

From 9:15am to 8pm we gathered with the inmates in an indoor gymnasium. The gym looked and felt like any other — except for the multiple signs on the wall that said “No Warning Shots Will Be Fired in the Gym” and the gunner who paced back and forth along a ledge near the roof of the building, holding an automatic rifle.

The schedule was non-stop: shaking hands, talking 1:1, listening to their business pitches. Many of the inmates were nervous: some told us that the opportunity to meet us was the biggest opportunity they’d ever had in their lives. We were also nervous: we were in a maximum security prison. A day full of personal interactions with “criminals” forces you to abandon stereotypes and understand these men for who they are: human beings, flawed like all of us, but human.

The most powerful hour of the day occurred just after lunch. Inmates and volunteers lined up in two lines facing each other, about 7-8 yards apart, with a line of tape on the floor in the middle. We stared at each other across the line. Catherine posed a series of questions and asked us to step to the middle line if the statement was true for us. Example: “If you grew up in poverty, step to the line.” Almost every inmate stepped to the line; almost no volunteers did. “If your parents have been incarcerated, step to the line.” Almost every inmate stepped to the line; almost no volunteers did. And so on. It became abundantly clear very early on — and clear in the most visible way, as people physically stepped forward and back during the questions — that most of the inmates were dealt a set of cards in life that made “failure” a likelihood.

We were asked to maintain eye contact with the inmate standing directly across from us in the lineup. I didn’t know the backstory of the person I happened to line up across from. Then the question: “If you were arrested under the age of 17, step to the line.” He did. Then: “If you have been incarcerated for more than 20 years, step to the line.” He did. Those two questions stopped me cold. He made a mistake as a teenager, and has spent 20 years in the slammer. Are you kidding me? I choked up. Meanwhile, he maintained a steady, compassionate gaze, reaching out to shake volunteers’ hands whenever they stepped to the line.

Afterwards, I went up to him and learned his story. He was in for attempted murder — he had been out with a group of guys one night, one person had a gun, and the prosecution proved there was intent to kill. I asked about his time behind bars. He said he spent several years at Pelican Bay, the notorious supermax prison in California, where the entire prison was on complete lockdown for three straight years — meaning everyone was confined to their cell 23-24 hours a day. I can’t imagine the effect such isolation has on the human mind. By the end of our conversation, I couldn’t shake the feeling that this guy had been completely screwed by “the system” — the length and conditions of incarceration seemed utterly unjust.

#

A year ago, I visited San Quentin State Prison and spoke to a group of 20 inmates about The Startup of You. They had all read the book, and our 90-minute discussion was a mind blow — it opened my eyes to a population of people and a political issue to which I was totally blind. I wrote about my experience in this blog post.

I’ve been thinking about the criminal justice system ever since: how unjust the system is to so many, how prisons mete out punishment and yet often fail to cultivate rehabilitation, the philosophical basis for believing in redemption and second chances, among other topics.

My interest has led me down several paths:

  • A friend and I spent a bunch of time brainstorming business ideas that would serve prisoners and their families. There are more than two million Americans locked up, and many of them are gouged by predatory companies that supposedly “serve” their families. Specifically, a small number of telecom companies monopolize the phone systems inside prisons and charge ridiculous fees for inmates to call their families. Is there a way to disrupt them?
  • I watched the Netflix documentary 13th, which is about the mass incarceration problem in America and how it’s connected to race. The film’s title is a reference to the 13th amendment which banned slavery. The film argues that mass incarceration is a modern version of racially-charged enslavement. I highly recommend the documentary.
  • I’ve been reading Shaka Senghor’s moving memoir Writing My Wrongs, about his 19-year incarceration for murder and his ultimate redemption. Shaka did a tour of duty as an MIT Media Lab Fellow and delivered a popular TED talk. I’ve gotten to know Shaka a little bit and find him inspiring.
  • I donated money to The Last Mile and am headed back to San Quentin in a few weeks to speak to another group of men who are reading The Startup of You.

Speaking of The Last Mile, they have pivoted to a model that may, from my perspective, make it one of the most interesting programs in the prison reform marketplace of ideas: they’re teaching inmates how to code! They have set up computers in prison that let inmates code even though the machines aren’t connected to the internet. Coding instructors on the outside pipe in via live video chat to teach the inmates. Imagine if inmates could get paid for coding while on the inside (prisons as the new low-cost outsourcing option for companies!). And when they get out, the men possess one of the highest-paid skills in the modern economy. Coding skills may be the most reliable path to economic self-sufficiency today.

Anyway, thank you to Defy Ventures for all the work you are doing to tackle one of America’s most morally pressing issues. And thank you to Brad and Mark for inviting me on the unforgettable trip.

Book Review: Sapiens: A Brief History of Humankind

One of my favorite books of 2016: Sapiens: A Brief History of Humankind by Yuval Noah Harari. I feel a little sheepish joining the parade of praise — everyone in Silicon Valley seems to be reading the book. I’ve wandered into more than one cocktail party conversation where someone is going on about how myths underpin modern society (which is one of the arguments of the book).

It’s extraordinary in scope, engagingly written, and full of provocative factoids that make you stop and think. Through it all, there is an overarching argument about why Homo sapiens overtook other human species on earth (punch line: our ability to cooperate and trade with strangers). He also presents a history of the world driven by what he calls three revolutions: cognitive, agricultural, and scientific.

But even if you don’t follow the overarching argument, there’s plenty to think about when it comes to ancient history, animal rights, religion, happiness, and technological revolution. Not surprisingly, with so much ground covered, subject matter experts have quibbled (or more than quibbled) with some of Harari’s claims, though nothing made me lose trust in him as a guide.

I highlighted 170 sentences or paragraphs in the book. I’ve pasted a bunch of them below. As always, each paragraph is a new thought, and all are direct quotes from the book.


The truth is that from about 2 million years ago until around 10,000 years ago, the world was home, at one and the same time, to several human species. And why not? Today there are many species of foxes, bears and pigs. The earth of a hundred millennia ago was walked by at least six different species of man. It’s our current exclusivity, not that multi-species past, that is peculiar – and perhaps incriminating. As we will shortly see, we Sapiens have good reasons to repress the memory of our siblings.

The fact is that a jumbo brain is a jumbo drain on the body. It’s not easy to carry around, especially when encased inside a massive skull. It’s even harder to fuel. In Homo sapiens, the brain accounts for about 2–3 per cent of total body weight, but it consumes 25 per cent of the body’s energy when the body is at rest. By comparison, the brains of other apes require only 8 per cent of rest-time energy. Archaic humans paid for their large brains in two ways. Firstly, they spent more time in search of food. Secondly, their muscles atrophied. 

Death in childbirth became a major hazard for human females. Women who gave birth earlier, when the infant’s brain and head were still relatively small and supple, fared better and lived to have more children. Natural selection consequently favoured earlier births. And, indeed, compared to other animals, humans are born prematurely, when many of their vital systems are still under-developed. A colt can trot shortly after birth; a kitten leaves its mother to forage on its own when it is just a few weeks old. Human babies are helpless, dependent for many years on their elders for sustenance, protection and education.

Raising children required constant help from other family members and neighbours. It takes a tribe to raise a human. Evolution thus favoured those capable of forming strong social ties. In addition, since humans are born underdeveloped, they can be educated and socialised to a far greater extent than any other animal. This is a key to understanding our history and psychology. Genus Homo’s position in the food chain was, until quite recently, solidly in the middle. For millions of years, humans hunted smaller creatures and gathered what they could, all the while being hunted by larger predators. It was only 400,000 years ago that several species of man began to hunt large game on a regular basis, and only in the last 100,000 years – with the rise of Homo sapiens – that man jumped to the top of the food chain.

Millions of years of dominion have filled them with self-confidence. Sapiens by contrast is more like a banana republic dictator. Having so recently been one of the underdogs of the savannah, we are full of fears and anxieties over our position, which makes us doubly cruel and dangerous.

According to this theory Homo sapiens is primarily a social animal. Social cooperation is our key for survival and reproduction. It is not enough for individual men and women to know the whereabouts of lions and bison. It’s much more important for them to know who in their band hates whom, who is sleeping with whom, who is honest, and who is a cheat.

How did Homo sapiens manage to cross this critical threshold, eventually founding cities comprising tens of thousands of inhabitants and empires ruling hundreds of millions? The secret was probably the appearance of fiction. Large numbers of strangers can cooperate successfully by believing in common myths.

In contrast, ever since the Cognitive Revolution, Sapiens have been able to change their behaviour quickly, transmitting new behaviours to future generations without any need of genetic or environmental change. As a prime example, consider the repeated appearance of childless elites, such as the Catholic priesthood, Buddhist monastic orders and Chinese eunuch bureaucracies. The existence of such elites goes against the most fundamental principles of natural selection, since these dominant members of society willingly give up procreation.

Trade may seem a very pragmatic activity, one that needs no fictive basis. Yet the fact is that no animal other than Sapiens engages in trade, and all the Sapiens trade networks about which we have detailed evidence were based on fictions. Trade cannot exist without trust, and it is very difficult to trust strangers. The global trade network of today is based on our trust in such fictional entities as the dollar, the Federal Reserve Bank, and the totemic trademarks of corporations.

Significant differences begin to appear only when we cross the threshold of 150 individuals, and when we reach 1,000–2,000 individuals, the differences are astounding. If you tried to bunch together thousands of chimpanzees into Tiananmen Square, Wall Street, the Vatican or the headquarters of the United Nations, the result would be pandemonium. By contrast, Sapiens regularly gather by the thousands in such places. Together, they create orderly patterns – such as trade networks, mass celebrations and political institutions – that they could never have created in isolation.

The pursuit of an easier life resulted in much hardship, and not for the last time. It happens to us today. How many young college graduates have taken demanding jobs in high-powered firms, vowing that they will work hard to earn money that will enable them to retire and pursue their real interests when they are thirty-five? But by the time they reach that age, they have large mortgages, children to school, houses in the suburbs that necessitate at least two cars per family, and a sense that life is not worth living without really good wine and expensive holidays abroad. What are they supposed to do, go back to digging up roots? No, they double their efforts and keep slaving away. One of history’s few iron laws is that luxuries tend to become necessities and to spawn new obligations.

As humans spread around the world, so did their domesticated animals. Ten thousand years ago, not more than a few million sheep, cattle, goats, boars and chickens lived in restricted Afro-Asian niches. Today the world contains about a billion sheep, a billion pigs, more than a billion cattle, and more than 25 billion chickens. And they are all over the globe. The domesticated chicken is the most widespread fowl ever. Following Homo sapiens, domesticated cattle, pigs and sheep are the second, third and fourth most widespread large mammals in the world. From a narrow evolutionary perspective, which measures success by the number of DNA copies, the Agricultural Revolution was a wonderful boon for chickens, cattle, pigs and sheep.

To ensure that the pigs can’t run away, farmers in northern New Guinea slice off a chunk of each pig’s nose. This causes severe pain whenever the pig tries to sniff. Since the pigs cannot find food or even find their way around without sniffing, this mutilation makes them completely dependent on their human owners. In another area of New Guinea, it has been customary to gouge out pigs’ eyes, so that they cannot even see.

Immediately after birth the calf is separated from its mother and locked inside a tiny cage not much bigger than the calf’s own body. There the calf spends its entire life – about four months on average. It never leaves its cage, nor is it allowed to play with other calves or even walk – all so that its muscles will not grow strong. Soft muscles mean a soft and juicy steak. The first time the calf has a chance to walk, stretch its muscles and touch other calves is on its way to the slaughterhouse. In evolutionary terms, cattle represent one of the most successful animal species ever to exist. At the same time, they are some of the most miserable animals on the planet.

Until the late modern era, more than 90 per cent of humans were peasants who rose each morning to till the land by the sweat of their brows.

Yet the idea that all humans are equal is also a myth. In what sense do all humans equal one another? Is there any objective reality, outside the human imagination, in which we are truly equal? Are all humans equal to one another biologically?

So here is that line from the American Declaration of Independence translated into biological terms: We hold these truths to be self-evident, that all men evolved differently, that they are born with certain mutable characteristics, and that among these are life and the pursuit of pleasure.

When, in 1860, a majority of American citizens concluded that African slaves are human beings and must therefore enjoy the right of liberty, it took a bloody civil war to make the southern states acquiesce.

How do you cause people to believe in an imagined order such as Christianity, democracy or capitalism? First, you never admit that the order is imagined. You always insist that the order sustaining society is an objective reality created by the great gods or by the laws of nature. People are unequal, not because Hammurabi said so, but because Enlil and Marduk decreed it. People are equal, not because Thomas Jefferson said so, but because God created them that way. Free markets are the best economic system, not because Adam Smith said so, but because these are the immutable laws of nature. You also educate people thoroughly.

An objective phenomenon exists independently of human consciousness and human beliefs. Radioactivity, for example, is not a myth. Radioactive emissions occurred long before people discovered them, and they are dangerous even when people do not believe in them.

The subjective is something that exists depending on the consciousness and beliefs of a single individual. It disappears or changes if that particular individual changes his or her beliefs. Many a child believes in the existence of an imaginary friend who is invisible and inaudible to the rest of the world. The imaginary friend exists solely in the child’s subjective consciousness, and when the child grows up and ceases to believe in it, the imaginary friend fades away. The inter-subjective is something that exists within the communication network linking the subjective consciousness of many individuals. If a single individual changes his or her beliefs, or even dies, it is of little importance. However, if most individuals in the network die or change their beliefs, the inter-subjective phenomenon will mutate or disappear. Inter-subjective phenomena are neither malevolent frauds nor insignificant charades. They exist in a different way from physical phenomena such as radioactivity, but their impact on the world may still be enormous. Many of history’s most important drivers are inter-subjective: law, money, gods, nations.

If I alone were to stop believing in the dollar, in human rights, or in the United States, it wouldn’t much matter. These imagined orders are inter-subjective, so in order to change them we must simultaneously change the consciousness of billions of people, which is not easy.

The Sumerians thereby released their social order from the limitations of the human brain, opening the way for the appearance of cities, kingdoms and empires. The data-processing system invented by the Sumerians is called ‘writing’.

Book Notes: Tribe by Sebastian Junger

Tribe: On Homecoming and Belonging by Sebastian Junger is a short book about how humans relate to each other and how modern society is pulling us away from our “tribal” roots. It touches on many topics: community, war, how our very old brains are ill-equipped for modern society, the community norms of Native Americans, and more. Somehow it manages to hold together as a stimulating and coherent read from start to finish. I think it accurately describes some of the dynamics that lead to modern unhappiness. Recommended. Highlighted sentences from the Kindle below, not in order.

“Thousands of Europeans are Indians, and we have no examples of even one of those Aborigines having from choice become European,” a French émigré named Hector de Crèvecoeur lamented in 1782. “There must be in their social bond something singularly captivating and far superior to anything to be boasted of among us.”

It’s easy for people in modern society to romanticize Indian life, and it might well have been easy for men like George as well. That impulse should be guarded against. Virtually all of the Indian tribes waged war against their neighbors and practiced deeply sickening forms of torture. Prisoners who weren’t tomahawked on the spot could expect to be disemboweled and tied to a tree with their own intestines or blistered to death over a slow fire or simply hacked to pieces and fed alive to the dogs.

A person living in a modern city or a suburb can, for the first time in history, go through an entire day—or an entire life—mostly encountering complete strangers. They can be surrounded by others and yet feel deeply, dangerously alone.

The psychological effect of placing such importance on affluence can be seen in microcosm in the legal profession. In 2015, the George Washington Law Review surveyed more than 6,000 lawyers and found that conventional success in the legal profession—such as high billable hours or making partner at a law firm—had zero correlation with levels of happiness and well-being reported by the lawyers themselves. In fact, public defenders, who have far lower status than corporate lawyers, seem to lead significantly happier lives.

Bluntly put, modern society seems to emphasize extrinsic values over intrinsic ones, and as a result, mental health issues refuse to decline with growing wealth. The more assimilated a person is into American society, the more likely they are to develop depression during the course of their lifetime, regardless of what ethnicity they are. Mexicans born in the United States are wealthier than Mexicans born in Mexico but far more likely to suffer from depression.

“The economic and marketing forces of modern society have engineered an environment… that maximize[s] consumption at the long-term cost of well-being,” a study in the Journal of Affective Disorders concluded in 2012. “In effect, humans have dragged a body with a long hominid history into an overfed, malnourished, sedentary, sunlight-deficient, sleep-deprived, competitive, inequitable, and socially-isolating environment with dire consequences.”

Baby rhesus monkeys were separated from their mothers and presented with the choice of two kinds of surrogates: a cuddly mother made out of terry cloth or an uninviting mother made out of wire mesh. The wire mesh mother, however, had a nipple that dispensed warm milk. The babies took their nourishment as quickly as possible and then rushed back to cling to the terry cloth mother, which had enough softness to provide the illusion of affection. Clearly, touch and closeness are vital to the health of baby primates—including humans.

Also unthinkable would be the modern practice of making young children sleep by themselves. In two American studies of middle-class families during the 1980s, 85 percent of young children slept alone in their own room—a figure that rose to 95 percent among families considered “well educated.” Northern European societies, including America, are the only ones in history to make very young children sleep alone in such numbers. The isolation is thought to make many children bond intensely with stuffed animals for reassurance. Only in Northern European societies do children go through the well-known developmental stage of bonding with stuffed animals; elsewhere, children get their sense of safety from the adults sleeping near them.

Boehm points out that among current-day foraging groups, group execution is one of the most common ways of punishing males who try to claim a disproportionate amount of the group’s resources.

One year into the siege, just before I got to the city, a teenage couple walked into no-man’s-land along the Miljacka River, trying to cross into a Serb-held area. They were quickly gunned down, the young man falling first and the woman crawling over to him as she died. He was a Serb and she was a Muslim, and they had been in love all through high school. They lay there for days because the area was too dangerous for anyone to retrieve their bodies.

American analysts based in England monitored the effects of the bombing to see if any cracks began to appear in the German resolve, and to their surprise found exactly the opposite: the more the Allies bombed, the more defiant the German population became. Industrial production actually rose in Germany during the war. And the cities with the highest morale were the ones—like Dresden—that were bombed the hardest.

He was unable to find a single instance where communities that had been hit by catastrophic events lapsed into sustained panic, much less anything approaching anarchy. If anything, he found that social bonds were reinforced during disasters, and that people overwhelmingly devoted their energies toward the good of the community rather than just themselves.

According to a study based on a century of records at the Carnegie Hero Fund Commission, male bystanders performed more than 90 percent of spontaneous rescues of strangers, and around one in five were killed in the attempt. (“Hero” is generally defined as risking your life to save non-kin from mortal danger. The resulting mortality rate is higher than for most US combat units.) Researchers theorize that greater upper-body strength and a predominantly male personality trait known as “impulsive sensation seeking” lead men to overwhelmingly dominate this form of extreme caretaking.

The greater empathic concern women demonstrate for others may lead them to take positions on moral or social issues that men are less likely to concern themselves with.

In late 2015, a bus in eastern Kenya was stopped by gunmen from an extremist group named Al-Shabaab that made a practice of massacring Christians as part of a terrorism campaign against the Western-aligned Kenyan government. The gunmen demanded that Muslim and Christian passengers separate themselves into two groups so that the Christians could be killed, but the Muslims—most of whom were women—refused to do it. They told the gunmen that they would all die together if necessary, but that the Christians would not be singled out for execution. The Shabaab eventually let everyone go.

What would you risk dying for—and for whom—is perhaps the most profound question a person can ask themselves.

“The miners’ code of rescue meant that each trapped miner had the knowledge that he would never be buried alive if it were humanly possible for his friends to reach him,” a 1960 study called Individual and Group Behavior in a Coal Mine Disaster explained. “At the same time, the code was not rigid enough to ostracize those who could not face the rescue role.”

If women aren’t present to provide the empathic leadership that every group needs, certain men will do it. If men aren’t present to take immediate action in an emergency, women will step in.

Twenty years after the end of the siege of Sarajevo, I returned to find people talking a little sheepishly about how much they longed for those days. More precisely, they longed for who they’d been back then. Even my taxi driver on the ride from the airport told me that during the war, he’d been in a special unit that slipped through the enemy lines to help other besieged enclaves. “And now look at me,” he said, dismissing the dashboard with a wave of his hand.

“We didn’t learn the lesson of the war, which is how important it is to share everything you have with human beings close to you. The best way to explain it is that the war makes you an animal. We were animals. It’s insane—but that’s the basic human instinct, to help another human being who is sitting or standing or lying close to you.” I asked Ahmetašević if people had ultimately been happier during the war. “We were the happiest,” Ahmetašević said. Then she added: “And we laughed more.”

Given the profound alienation of modern society, when combat vets say that they miss the war, they might be having an entirely healthy response to life back home. Iroquois warriors did not have to struggle with that sort of alienation because warfare and society existed in such close proximity that there was effectively no transition from one to the other.

But the very worst experience, by far, was having a friend die. In war after war, army after army, losing a buddy is considered the most devastating thing that can possibly happen. It is far more disturbing than experiencing mortal danger oneself and often serves as a trigger for psychological breakdown on the battlefield or later in life.

Horrific experiences are unfortunately a human universal, but long-term impairment from them is not, and despite billions of dollars spent on treatment, roughly half of Iraq and Afghanistan veterans have applied for permanent PTSD disability. Since only 10 percent of our armed forces experience actual combat, the majority of vets claiming to suffer from PTSD seem to have been affected by something other than direct exposure to danger.

A Veterans Administration counselor I spoke with, who asked to remain anonymous, described having to physically protect someone in a PTSD support group because other vets wanted to beat him up for seeming to fake his trauma. This counselor said that many combat veterans actively avoid the VA because they worry about losing their temper around patients who they think are milking the system. “It’s the real deals—the guys who have seen the most—that this tends to bother,” he told me.

It’s common knowledge in the Peace Corps that as stressful as life in a developing country can be, returning to a modern country can be far harder. One study found that one in four Peace Corps volunteers reported experiencing significant depression after their return home, and that figure more than doubled for people who had been evacuated from their host country during wartime or some other kind of emergency.

One of the most noticeable things about life in the military, even in support units, is that you are almost never alone. Day after day, month after month, you are close enough to speak to, if not touch, a dozen or more people. When I was with American soldiers at a remote outpost in Afghanistan, we slept ten to a hut in bunks that were only a few feet apart. I could touch three other men with my outstretched hand from where I lay. They snored, they talked, they got up in the middle of the night to use the piss tubes, but we always felt safe because we were in a group. The outpost was attacked dozens of times, yet I slept better surrounded by those noisy, snoring men than I ever did camping alone in the woods of New England.

According to Shalev, the closer the public is to the actual combat, the better the war will be understood and the less difficulty soldiers will have when they come home.

Secondly, ex-combatants shouldn’t be seen—or be encouraged to see themselves—as victims. One can be deeply traumatized, as firemen are by the deaths of both colleagues and civilians, without being viewed through the lens of victimhood.

Rachel Yehuda pointed to littering as the perfect example of an everyday symbol of disunity in society. “It’s a horrible thing to see because it sort of encapsulates this idea that you’re in it alone, that there isn’t a shared ethos of trying to protect something shared,” she told me. “It’s the embodiment of every man for himself. It’s the opposite of the military.”

American Indians, proportionally, provide more soldiers to America’s wars than any other demographic group in the country. They are also the product of an ancient culture of warfare that takes great pains to protect the warrior from society, and vice versa.

Unlike criticism, contempt is particularly toxic because it assumes a moral superiority in the speaker.

“There Are No Shortcuts”

A few months ago, President Obama gave a moving eulogy in honor of Beau Biden, the late son of Vice President Biden. Minutes 13–15 are emotional, as Obama’s voice cracks. And the words ring true. In the social media age, it’s not hard to get some attention or generate some controversy. But to make your name mean something, and to have it stand for dignity and integrity — that’s rare. It’s not something you can buy. There are no shortcuts. Video below (start at minute 13).