Asking Acquaintances About Mutual Friends

All business is people business ultimately, and so improving your ability to size someone up should be a relentless priority — it is for me, anyway. By “size a person up” I mean figuring out how much you trust a person, how you can best collaborate with him, whether you’d hire her, whether you should fire him.

One of the simple ways I size a person up is by understanding how they understand and judge other people. In this way, I start to build a model of the person I’m getting to know. I get to know their likes and dislikes, their biases, their underlying motivations, and of course their meta-ability to evaluate people — all by hearing them talk about friends I know well.

Practically speaking, when I meet someone new, I like to ask them about someone we know in common. “So how do you know Jane?” Sure, it’s a trite question. But it can lead to a substantive exchange. It doesn’t have to be gossip. How has this person partnered with Jane? What’s frustrated him about Jane? What have been the delights?

When you ask someone to talk about their relationship with someone else, they often inadvertently reveal a lot about who they are.

At a breakfast meeting, I once asked an acquaintance — who I was also evaluating as a prospective business partner — to describe how he knew a mutual friend. As I probed, I realized this acquaintance spoke in condescending, patriarchal terms about a person who I very much considered his peer. It was revealing. I may not have gotten a glimpse of this element of his oversized ego if we had not gone down this path.

In another case, by talking about mutual friends I realized the person I was speaking to grasped subtleties about a friend’s personality that I had missed, and it made me all the more excited about partnering with him because of his extraordinary ability to make sense of at least one complicated person — and likely many others.

Bottom Line: Get to know someone new by asking him or her about someone you already know well.

Knowledge Accumulates Over Generations

One of the central takeaways from Chuck Klosterman’s book is that throughout history many well-verified “truths” about how the world works have, in time, been proven wrong. He provocatively asks: Which assumptions about the world do we hold dear today that subsequent generations, benefitting from greater scientific discovery, will laugh at?

You can learn this lesson vividly in the arena of building engineering and home repair, as I have.

Consider a building that was originally constructed 100 years ago but has been updated over time. An engineer will inspect the building and say, “Oh, that foundation work used a technique that was common in 1980.” Or: “That way of supporting a second-story addition was popular in the ’70s.” A specific building technique is easily timestamped by the prevailing knowledge of its era. The punch line: there’s a different best practice today. “In 2017, we do it differently.” And usually (but not always) it’s a better technique.

It’s inspiring to see how building engineers continue to iterate their approach. And it occurred to me that it’d be amusing if management consultants similarly couched their advice in before-and-after timestamped language. “That way of doing performance management was popular in the ’80s, but we know better now.” “Structuring your decision making that way was popular in the ’90s, but we know better now.”


Related, and somewhat of a counterpoint: the always provocative Robin Hanson says one of the big neglected problems in the world is that each generation has to re-learn lessons within its own lifetime.

Neglected Big Problems

Book Short: The Checklist Manifesto

Atul Gawande’s book The Checklist Manifesto is a wonderfully engaging summation of how the world has become so complex, and how to use checklists — yes, a simple to-do checklist — to manage the complexity that underlies modern professions.

The operating room, Gawande’s own vocation, is the primary setting for the book’s examples, but there are also useful stories from the worlds of building construction, aviation, and Wall Street trading.

Here’s Derek Sivers’ detailed summary of the book.


Which Health Advice Is Actually True?

Spencer Greenberg, an extremely rational person and ultra synthesizer, posted the below as a public entry on Facebook. I found it interesting. What follows are Spencer’s words…

A query for you about human health: what are dietary/nutrition/health recommendations that are (essentially) universally agreed on by nutrition and health experts of all stripes and schools of thought? Given the incredibly high levels of disagreement in this area, and the poor quality of a lot of the studies, this depressingly short list (below) is all I can come up with. I’m hoping you can help me expand it!

Also, this list probably has some mistakes, so let me know what I’m getting wrong!

-Preliminary List of Universally Recommended Health Interventions-
(1) Don’t consume a lot of sugar (at best, it’s empty calories and probably causes tooth decay, but some claim it’s much worse than that).
(2) Exercise regularly (it’s best to rotate which type of exercise you do – be very careful to avoid injury, especially when you are getting into new forms of exercise – it’s also unclear what forms of exercise are best, e.g. strength training vs. cardio, and how much exercise you should get – also, extremely high levels of exercise are believed to be associated with increases in some health risks).
(3) If you are going to eat a lot of carbohydrates, generally you should choose complex carbs over simple carbs (usually whole grains are also recommended over refined grains, but some argue that whole grains should be sprouted/soaked to remove parts of the seed that are designed to protect it from digestion [HT: Gary Basin]).
(4) Brush your teeth with fluoride toothpaste at least once per day (though perhaps it is not actually a good idea immediately after eating, especially if you’ve been eating acidic foods; the suggestion is to brush before breakfast, or wait an hour after eating – and beware of brushing too often or too vigorously – brushing twice per day may be better than brushing once – and also note that there is a very small segment of people in the health field who are against fluoride).
(5) Hydrate regularly throughout each day, especially as soon as you feel thirsty, but even if you don’t (doing so with water is the safest bet, though it’s not clear how much liquid you need in total, and it’s also not clear whether it’s important to do this with water or if other drinks like non-sugary tea are fine replacements. Also, the 8 cups of water a day thing seems to be bullshit).
(6) Eat plenty of vegetables (preferably not deep fried ones though – note also that there do exist a very small number of people in the health field who advocate an essentially zero-carb or meat only diet).
(7) Don’t eat a lot of deep fried foods in general.
(8) Take Vitamin D3 supplements if you are >60 years old and don’t get a lot of outdoor time, and for the general population, take it if you get very little sunlight.
(9) Avoid frequently drinking large quantities of alcohol.
(10) Avoid frequently consuming tobacco products (but since many of them are addictive, that means it’s safest to avoid them altogether).
(11) If you have the ability to make yourself lose weight and keep it off, prioritize weight and fat loss if you have a very high body fat percentage or a lot of body fat around the gut area [HT: Julia June Bossmann, Ben Hoffman] (the extent to which mild to moderate obesity is bad per se is somewhat debated, as in some studies mild levels of obesity were sometimes even correlated with better outcomes – avoiding metabolic syndrome and blood sugar dysregulation may be more on point than avoiding a very high body fat percentage, though the two are significantly correlated [HT: Kara Loewentheil] – however, lots of data suggests that three years after a diet most people have regained the weight they lost, and some say that regularly cycling your weight by losing then gaining then losing again could be unhealthy).
(12) If you are going to eat something sweet, fruit is a better bet than candy or sugary baked goods.
(13) Avoid consuming trans fats.
(14) Don’t consume excessive amounts of mercury (which is found in many fish – some say that tuna, king mackerel, marlin, orange roughy, shark, and swordfish are particularly worrisome).
(15) Don’t eat a lot of foods that are burned to the point of being blackened.
(16) Spend some time outdoors in the sun each week.
(17) Avoid getting frequently sunburned.
(18) If you are unusually low in any vitamin then you should consume more of it (but if you have normalish levels, there is not a consensus on whether you should have more of any vitamins as far as I can tell, except perhaps Vitamin D for the elderly which seems to be basically agreed upon – there is also disagreement about whether vitamin pills are as effective as vitamins from whole foods).
(19) If you are a strict vegan, take vitamin B12 supplements.
(20) Don’t get addicted to any drugs (prescription or non-prescription) other than possibly caffeine.
(21) Don’t run a constant sleep deficit (though the amount of sleep each person requires to not have a deficit seems to vary pretty considerably).
(22) Wash your hands with soap regularly [HT: Amy Willey] (though some claim that special anti-bacterial hand soap is not a good idea).
(23) Don’t regularly have non-negligible amounts of caffeine within a few hours of going to bed (though people’s sensitivity to this seems to vary a lot).
(24) Don’t spend your day in very long stretches of sitting without moving (i.e. take breaks where you move around) [HT: Eva Vivalt].
(25) When typing, avoid having your wrists bent at a significant angle for long periods, and avoid having to bend your neck substantially downward or upward to see your computer screen.
(26) Each week spend at least a bit of social time with people you get along well with.
(27) If you have very high levels of anxiety, depression or hopelessness you should seek treatment as soon as possible (e.g. you could try Cognitive Behavioral Therapy with a psychologist or go to a psychiatrist).
(28) Eating a diverse range of healthy foods is usually better than eating a narrow range of foods (of course a diverse range of unhealthy foods is still unhealthy [HT: Bryan Hobart]).
(29) Avoid very high doses of certain vitamin and mineral supplements (e.g. Iron supplements, vitamin A and vitamin B-6, where overdosing is known to happen – beware of mega-dose vitamins unless you know what you’re doing, as they are unlikely to be helpful and could be harmful).
(30) If you have a broken bone, or a reasonably sized cut or scrape that appears as if it could be infected, go to a doctor immediately (some broken bones require splinting to heal properly, and infected wounds may require treatment).
(31) If you have a mole that violates enough of the ABCDE rule, get it checked out by a dermatologist, which means: Asymmetry (if one side of the mole doesn’t match the other), Border irregularity, Color is not uniform, Diameter more than 6 mm (which is about the size of a pencil eraser), and Evolving size, shape or color.
(32) Highly processed meats (e.g. hot dog or bologna) are worse than less processed ones [HT: Claire Zabel].
(33) Do things to keep your brain active, such as learning something new each week or doing something that is mentally taxing [HT: Chad Gracia].
(34) If you have high levels of stress, try to reduce them using whatever techniques you find effective [HT: Jujubee Kang] (high stress has been linked to various negative indicators in the body – techniques that some people find effective to reduce stress include Cognitive Behavioral Therapy (CBT), meditation, progressive muscle relaxation, yoga, walking in nature, and exercise that keeps your heart rate elevated for a reasonably long period).
(35) Keep your sleep cycles at least roughly in sync with the dark/light cycle of the planet (i.e. do most of your sleeping at night, and most of your waking hours during the day).
(36) Don’t regularly drink alcohol before going to bed.
(37) If you think you may be suicidal, or you have made plans for suicide, you should call a suicide hotline immediately, and afterward make an appointment with a therapist or psychiatrist as soon as possible.
(38) Keep your living environment at a comfortable temperature, generally in the 65-75°F (18-24°C) range.
(39) Go to a dentist for teeth cleaning and a checkup at least twice per year.

What else should be on this list that I missed? What am I mistaken about that I should remove from this list (because there is actually some disagreement among experts)?

Also, here are some other health and nutrition questions that didn’t make the list because, while many advocate strongly for one side, there still seems to be a reasonable amount of disagreement (rightly or wrongly):

-List of Important Health Questions Experts Don’t Agree On-
(a) How bad is saturated fat, if at all?
(b) How useful is omega 3 supplementation, if at all?
(c) How bad are carbs versus other macronutrients?
(d) Is there any benefit (or harm) to getting more than the RDA of protein (0.36 grams per pound of body weight)?
(e) Is flossing effective? (If you like flossing, or at least don’t mind it, it may well be worth it, but the benefits are not as clearly established as one would ideally like, and there have been some claims, possibly false, that it can cause bacteria to escape from your mouth into your body in a way that could be bad)
(f) Does dietary cholesterol lead to high blood cholesterol? (apparently the FDA just released new guidelines on this that say “no” [HT: Romeo Stevens])
(g) Is blood cholesterol correlated enough with bad outcomes that we should care about it, per se?
(h) What types of preventative screening / testing should everyone routinely get?
(i) What dietary supplements (if any) should a healthy person take?
(j) Is there any harm from Aspartame or other artificial sweeteners? (all of the many randomized controlled trials of Aspartame in humans I’ve looked at found no negative effects except headaches in a small subset of people, but other studies in rats show weird effects that are hard to interpret, and a lot of people are anti-Aspartame without providing clear reasons)
(k) Is polyunsaturated fat good for you, bad, or neutral?
(l) How bad is meat for you as a broad category, or is it too broad a category to generalize?
(m) How much should you limit your salt intake?
(n) Which type of cooking oil (e.g. olive oil, coconut oil, avocado oil, etc.) should you use or avoid?
(o) How much exercise is ideal, and of what forms?
(p) Is going into ketosis (by lowering your carb intake dramatically) a good or bad idea?
(q) Is intermittent fasting a good idea?
(r) Is it important to go to bed/wake up at the same time every day?
(s) Is there any real benefit to eating organic foods?
(t) Are “grass fed” animal products healthier than non-grass fed ones?
(u) Is there any real difference (in your body) between sugar and high fructose corn syrup? (common sense about chemical composition and some studies suggest the answer is that there is no difference, but many people think high fructose corn syrup is worse)
(v) What’s the optimal mix of macronutrients?
(w) Does metformin increase lifespan for (basically) healthy individuals? [HT: Amy Willey]
(x) Are GMO foods actually risky, or are they fine?
(y) Is the heuristic of eating “natural” or “whole” foods actually accurate, or does it exclude too much?
(z) Are probiotics (like acidophilus) useful to take for a generally healthy person?
(aa) How important is stretching, what type of stretching (static vs. active) is best, and when should you do it (just before you exercise, just after, or at other times)?
(ab) How bad are pesticides on our foods (which types are bad, and how much of them do we have to consume before problems begin)?
(ac) Should you take a multi-vitamin pill? (the tide has been turning against them as repeated studies fail to find a benefit in healthy people, but some experts still recommend them)
(ad) Are there vitamin/mineral deficiencies that a significant proportion of people in developed countries have? (e.g. possibly magnesium, potassium, choline, D3, K2 [HT: Romeo Stevens])
(ae) Is it helpful to wake up when the sun rises each morning?
(af) Do heavily calorie restricted diets improve longevity in humans (like they do in mice)?

Impressions of Morocco

Locals lined up to see the King drive by in Marrakech

I spent 10 days in Morocco (Marrakech and Fes) over Christmas and New Year’s. Some of my impressions:

  • Quite a beautiful country, physically. The old towns of Marrakech and Fes — the endless alleyways and crevices, and of course the souks — are quite something to take in. You feel transported back in time as you walk the streets and interact with shopkeepers whose families have owned the little candy store or butcher or scarf outlet for more decades than you can count. Outside the cities, the red desert landscape is punctuated by the Atlas mountain range.
  • Food tours continue to be a highlight of my trips. The Marrakech city food tour brought us to hole-in-the-wall couscous restaurants that would have been impossible to find otherwise, and brought us face to face with the (whole) cooked head of a sheep. The eyeballs are a delicacy. I can only speak to the taste of the cheek meat…
  • Rug negotiations. Morocco is known for its carpets. Rug makers spend years in the mountains hand weaving gorgeous rugs that eventually find their way into the dozens (hundreds?) of rug shops in the cities. Then, the rugs are hawked aggressively to the large numbers of French and Spanish tourists wandering the alleyways. “You want rug? Low price. Only the best price.” That sort of thing. Once inside the shop, the game of salesmanship and negotiation is fun to watch. They’ve been selling rugs for so long that you can bet that every word and expression offered during the display of rugs and discussion of colors, etc. is a refined technique with the aim of moving product.
  • Morocco is a Muslim country, so there’s no alcohol served in most restaurants and you don’t see people drinking. (It makes New Year’s Eve a rather sober affair in Morocco.) That said, Moroccan wine is produced and consumed in huge quantities in the country…
  • Men rule. During the day, the cafes are occupied almost exclusively by men. The stores are manned almost exclusively by men. “The women are at home,” our driver told us.
  • Beautiful chandeliers and wall designs. Even in the most podunk restaurant in a non-hip neighborhood, the light fixture will be some crazy ornate work of art. The Moroccan aesthetic is very popular in the U.S. — as a luxury good. In Morocco, the Moroccan aesthetic is…everywhere.
  • All hail the king. The King of Morocco, who’s usually stationed in Rabat, came through Marrakech when we were there. News spread and soon thousands of people had lined up along the street to see him. Hours and hours and hours passed. Shops closed. Streets were blocked off. We were stranded — unable to cross a street back to our riad given the road closure. Finally the king’s motorcade whizzed by, the people on the street waved, and two seconds later it was all over. When we asked a guide about the scene later, he said shutting down the economic heartbeat of a city for a full day just so people can catch a glimpse of the king is nuts. “It’s why the government likes its people illiterate and uneducated. They’ll blindly be entranced by the king,” he said.
  • Morocco’s riads (hotels inside the medina) all offer hammams; getting your body scrubbed down feels great. It’s crazy how much dead skin comes off. Hammams and regular Swedish massage are dirt cheap in Morocco, so you can help yourself to multiple servings of each.

Marrakech is a three-hour flight from London. It’s well worth a visit. Read the memoir “Dreams of Trespass” on the flight over.

Happy 2017. I hope it’s a fun and peaceful year for you, wherever you are…

Heleo Interview Part 1

Here’s a six minute video excerpt (part 1) of a recent conversation I did with business site Heleo. We cover how adaptation is a business and life skill; how to get feedback on how you should adapt; what one might need to unlearn from school; and why there IS such a thing as a dumb question in a meeting.

Book Review: But What If We’re Wrong?

Chuck Klosterman wrote one of the most stimulating books that I read in 2016: But What If We’re Wrong? Thinking About the Present As If It Were the Past. There are countless interesting observations on science and pop culture and sports and history. By contemplating which assumptions of today might be disproven in the future, or which authors of note today might be forgotten in the future and which no-name writers might become famous after their death, he unearths novel theories about familiar topics. Why did Herman Melville become renowned by future generations but not his own? Which theories of science are we totally convinced are true today but may well be proven false by future generations of physicists? Which TV show made in 2016 will be referenced by historians in 2080 when they try to explain what life was like in 2016?

My favorite paragraphs are pasted below, with the bold font mine.

Thanks to Russ Roberts for recommending this via his Econtalk conversation with Chuck.

When you ask smart people if they believe there are major ideas currently accepted by the culture at large that will eventually be proven false, they will say, “Well, of course. There must be. That phenomenon has been experienced by every generation who’s ever lived.”

Aristotle had argued more than a thousand years prior: He believed all objects craved their “natural place,” and that this place was the geocentric center of the universe, and that the geocentric center of the universe was Earth. In other words, Aristotle believed that a dropped rock fell to the earth because rocks belonged on earth and wanted to be there.

For the next thirty years, nothing about the reception of [Moby Dick] changes. But then World War I happens, and—somehow, and for reasons that can’t be totally explained—modernists living in postwar America start to view literature through a different lens. There is a Melville revival. The concept of what a novel is supposed to accomplish shifts in his direction and amplifies with each passing generation…

I suspect most conventionally intelligent people are naïve realists, and I think it might be the defining intellectual quality of this era. The straightforward definition of naïve realism doesn’t seem that outlandish: It’s a theory that suggests the world is exactly as it appears.

Any time you talk to police (or lawyers, or journalists) about any kind of inherently unsolvable mystery, you will inevitably find yourself confronted with the concept of Occam’s Razor: the philosophical argument that the best hypothesis is the one involving the lowest number of assumptions.

The reason something becomes retrospectively significant in a far-flung future is detached from the reason it was significant at the time of its creation—and that’s almost always due to a recalibration of social ideologies that future generations will accept as normative.

The arc of Lethem’s larger contention boils down to two points. The first is that no one is really remembered over the long haul, beyond a few totemic figures—Joyce, Shakespeare, Homer—and that these figures serve as placeholders for the muddled generalization of greatness (“Time is a motherfucker and it’s coming for all of us,” Lethem notes).

The reason shadow histories remained in the shadows lay in the centralization of information: If an idea wasn’t discussed on one of three major networks or on the pages of a major daily newspaper or national magazine, it was almost impossible for that idea to gain traction with anyone who wasn’t consciously searching for alternative perspectives. That era is now over. There is no centralized information, so every idea has the same potential for distribution and acceptance.

Competing modes of discourse no longer “compete.” They coexist.

Take, for example, the plight of Native Americans. What American subculture has suffered more irrevocably? Prior to Columbus’s landing in the New World, the Native American population approached one hundred million. Now it’s a little over three million, two-thirds of whom are relegated to fifty delineated reservations on mostly undesirable land. Still, that equates to roughly 1 percent of the total US population. Yet Native Americans are essentially voiceless, even in conversations that specifically decry the lack of minority representation. Who is the most prominent Native American media figure or politician? Sherman Alexie? Louise Erdrich? Tom Cole or Markwayne Mullin, both of whom are from the same state? Who, for that matter, is the most famous Native American athlete, or rapper, or reality star? Maybe Sam Bradford? Maybe Sacheen Littlefeather, who’s been virtually invisible since the seventies? When the Academy Awards committee next announces the nominations for Best Picture, how many complaints will focus on the lack of films reflecting the Native American experience? Outside the anguish expressed over the use of the term “Redskin” by the Washington football franchise, it’s hard to find conversation about the biases facing Native Americans; outside the TV show Fargo, you almost never see it reflected in the popular culture. Everyone concedes it exists, but it’s not a popular prejudice (at least not among the mostly white liberals who drive these conversations). Their marginalization is ignored, thus creating a fertile factory for the kind of brilliant outsider who won’t be recognized until that artist is dead and gone. So this is one possibility—a Navajo Kafka.

Kurt Vonnegut’s A Man Without a Country: “I think that novels that leave out technology misrepresent life as badly as Victorians misrepresented life by leaving out sex.”

…the myth of universal timeliness. There is a misguided belief—often promoted by creative writing programs—that producing fiction excessively tied to technology or popular culture cheapens the work and detracts from its value over time. If, for example, you create a plot twist that hinges on the use of an iPad, that story will (allegedly) become irrelevant once iPads are replaced by a new form of technology. If a character in your story is obsessed with watching Cheers reruns, the meaning of that obsession will (supposedly) evaporate once Cheers disappears from syndication. If your late-nineties novel is consumed with Monica Lewinsky, the rest of the story (purportedly) devolves into period piece. The goal, according to advocates of this philosophy, is to build a narrative that has no irretraceable connection to the temporary world. But that’s idiotic, for at least two reasons. The first is that it’s impossible to generate deep verisimilitude without specificity. The second is that if you hide the temporary world and the work somehow does beat the odds and become timeless, the temporary world you hid will become the only thing anyone cares about

But I’ve been a paid critic for enough years to know my profession regularly overrates many, many things by automatically classifying them as potentially underrated. The two terms have become nonsensically interchangeable.

The nonfiction wing of this level houses elemental tacticians like Robert Caro; someone like William T. Vollmann straddles both lines, fortified by his sublime recklessness. Even the lesser books from these writers are historically important, because—once you’re defined as great—failures become biographically instructive. 

The third tier houses commercial writers who dependably publish major or minor bestsellers and whose success or failure is generally viewed as a reflection of how much (or how little) those books sell. These individuals are occasionally viewed as “great at writing,” but rarely as great writers. They are envied and discounted at the same time. They are what I call “vocally unrated”: A large amount of critical thought is directed toward explaining how these types of novels are not worth thinking about.

Now, if the world were logical, certain predictions could be made about what bricks from that pyramid will have the greatest likelihood of remaining intact after centuries of erosion. Devoid of all other information, a betting man would have to select a level-one writer like Roth, just as any betting man would take the Yankees if forced to wager on who will win the World Series one hundred seasons from now. If you don’t know what the weather will be like tomorrow, assume it will be pretty much the same as today. But this would require an astonishing cultural stasis. It would not simply mean that the way we presently consume and consider Roth will be the way Roth is consumed and considered forevermore; it would mean that the manner in which we value and assess all novels will remain unchanged. It also means Roth must survive his inevitable post-life reevaluation by the first generation of academics who weren’t born until he was already gone, a scenario where there will be no room for advancement and plenty of room for diminishing perceptions (no future contrarian can provocatively claim, “Roth is actually better than everyone thought at the time,” because—at the time—everyone accepted that he was viewed as remarkable). He is the safest bet, but still not a safe bet. Which is why I find myself fixated on the third and sixth tiers of my imaginary triangle: “the unrated.” As specific examples, they all face immeasurable odds. But as a class, they share certain perverse advantages.

Normal consumers declare rock to be dead whenever they personally stop listening to it (or at least to new iterations of it), which typically happens about two years after they graduate from college.

The Beatles were the first major band to write their own songs, thus making songwriting a prerequisite for credibility; they also released tracks that unintentionally spawned entire subgenres of rock, such as heavy metal (“Helter Skelter”), psychedelia (“Tomorrow Never Knows”), and country rock (“I’ll Cry Instead”).

Do I think the Beatles will be remembered in three hundred years? Yes. I believe the Beatles will be the Sousa of Rock (alongside Michael Jackson, the Sousa of Pop). If this were a book of predictions, that’s the prediction I’d make. But this is not a book about being right. This is a book about being wrong, and my faith in wrongness is greater than my faith in the Beatles’ unassailability. What I think will happen is probably not what’s going to happen. So I will consider what might happen instead.

Since rock, pop, and rap are so closely tied to youth culture, there’s an undying belief that young people are the only ones who can really know what’s good. It’s the only major art form where the opinion of a random fourteen-year-old is considered more relevant than the analysis of a sixty-four-year-old scholar. (This is why it’s so common to see aging music writers championing new acts that will later seem comically overrated—once they hit a certain age, pop critics feel an obligation to question their own taste.)

Take architecture: Here we have a creative process of immense functional consequence. It’s the backbone of the urban world we inhabit, and it’s an art form most people vaguely understand—an architect is a person who designs a structure on paper, and that design emerges as the structure itself. Architects fuse aesthetics with physics and sociology. And there is a deep consensus over who did this best, at least among non-architects: If we walked down the street of any American city and asked people to name the greatest architect of the twentieth century, most would say Frank Lloyd Wright. In fact, if someone provided a different answer, we’d have to assume we’ve stumbled across an actual working architect, an architectural historian, or a personal friend of Frank Gehry. Of course, most individuals in those subsets would cite Wright, too. But in order for someone to argue in favor of any architect except Wright (or even to be in a position to name three other plausible candidates), that person would almost need to be an expert in architecture. Normal humans don’t possess enough information to nominate alternative possibilities. And what emerges from that social condition is an insane kind of logic: Frank Lloyd Wright is indisputably the greatest architect of the twentieth century, and the only people who’d potentially disagree with that assertion are those who legitimately understand the question. History is defined by people who don’t really understand what they are defining.

I don’t believe all art is the same. I wouldn’t be a critic if I did. Subjective distinctions can be made, and those distinctions are worth quibbling about. The juice of life is derived from arguments that don’t seem obvious. But I don’t believe subjective distinctions about quality transcend to anything close to objective truth—and every time somebody tries to prove otherwise, the results are inevitably galvanized by whatever it is they get wrong.

To matter forever, you need to matter to those who don’t care. And if that strikes you as sad, be sad.

But maybe it takes an idiot to pose this non-idiotic question: How do we know we’re not currently living in our own version of the year 1599? According to Tyson, we have not reinvented our understanding of scientific reality since the seventeenth century. Our beliefs have been relatively secure for roughly four hundred years. That’s a long time—except in the context of science. In science, four hundred years is a grain in the hourglass.

One of Greene’s high-profile signatures is his support for the concept of “the multiverse.” Now, what follows will be an oversimplification—but here’s what that connotes: Generally, we work from the assumption that there is one universe, and that our galaxy is a component of this one singular universe that emerged from the Big Bang. But the multiverse notion suggests there are infinite (or at least numerous) universes beyond our own, existing as alternative realities. Imagine an endless roll of bubble wrap; our universe (and everything in it) would be one tiny bubble, and all the other bubbles would be other universes that are equally vast. In his book The Hidden Reality, Greene maps out nine types of parallel universes within this hypothetical system.

“In physics, when we say we know something, it’s very simple,” Tyson reiterates. “Can we predict the outcome? If we can predict the outcome, we’re good to go, and we’re on to the next problem. There are philosophers who care about the understanding of why that was the outcome. Isaac Newton [essentially] said, ‘I have an equation that says why the moon is in orbit. I have no fucking idea how the Earth talks to the moon. It’s empty space—there’s no hand reaching out.’

Galileo famously refused to chill and published his Dialogue Concerning the Two Chief World Systems as soon as he possibly could, mocking all those who believed (or claimed to believe) that the Earth was the center of the universe. The pope, predictably, was not stoked to hear this. But the Vatican still didn’t execute Galileo; he merely spent the rest of his life under house arrest (where he was still allowed to write books about physics) and lived to be seventy-seven.

What Bostrom is asserting is that there are three possibilities about the future, one of which must be true. The first possibility is that the human race becomes extinct before reaching the stage where such a high-level simulation could be built. The second possibility is that humans do reach that stage, but for whatever reason—legality, ethics, or simple disinterest—no one ever tries to simulate the complete experience of civilization. The third possibility is that we are living in a simulation right now. Why? Because if it’s possible to create this level of computer simulation (and if it’s legally and socially acceptable to do so), there won’t just be one simulation. There will be an almost limitless number of competing simulations, all of which would be disconnected from each other. A computer program could be created that does nothing except generate new simulations, all day long, for a thousand consecutive years. And once those various simulated societies reach technological maturity, they would (assumedly) start creating simulations of their own—simulations inside of simulations.

The term “conspiracy theory” has an irrevocable public relations problem. Technically, it’s just an expository description for a certain class of unproven scenario. But the problem is that it can’t be self-applied without immediately obliterating whatever it’s allegedly describing. You can say, “I suspect a conspiracy,” and you can say, “I have a theory.” But you can’t say, “I have a conspiracy theory.” Because if you do, it will be assumed that even you don’t entirely believe the conspiracy you’re theorizing about.

But it still must be asked: Discounting those events that occurred within your own lifetime, what do you know about human history that was not communicated to you by someone else? This is a question with only one possible answer.

This, it seems, has become the standard way to compartmentalize a collective, fantastical phenomenon: Dreaming is just something semi-interesting that happens when our mind is at rest—and when it happens in someone else’s mind (and that person insists on describing it to us at breakfast), it isn’t interesting at all.

[On the Buzzfeed blue vs. gold dress viral phenom.] The next day, countless pundits tried to explain why this had transpired. None of their explanations were particularly convincing. Most were rooted in the idea that this happened because we were all looking at a photo of a dress, as opposed to the dress itself. But that only shifts the debate, without really changing it—why, exactly, would two people see the same photograph in two completely different ways?

Adams is the author of On the Genealogy of Color. He believes the topic of color is the most concrete way to consider the question of how much—or how little—our experience with reality is shared with the experience of other people. It’s an unwieldy subject that straddles both philosophy and science. On one hand, it’s a physics argument about the essential role light plays in our perception of color; at the same time, it’s a semantic argument over how color is linguistically described differently by different people. There’s also a historical component: Up until the discovery of color blindness in the seventeenth century, it was assumed that everyone saw everything the same way (and it took another two hundred years before we realized how much person-to-person variation there is). What really changed four hundred years ago was due (once again) to the work of Newton and Descartes, this time in the field of optics. Instead of things appearing “red” simply because of their intrinsic “redness” (which is what Aristotle believed), Newton and Descartes realized it has to do with an object’s relationship to light.

On the same day I spoke with Linklater about dreams, there was a story in The New York Times about a violent incident that had occurred a few days prior in Manhattan. A man had attacked a female police officer with a hammer and was shot by the policewoman’s partner. This shooting occurred at ten a.m., on the street, in the vicinity of Penn Station. Now, one assumes seeing a maniac swinging a hammer at a cop’s skull before being shot in broad daylight would be the kind of moment that sticks in a person’s mind. Yet the Times story explained how at least two of the eyewitness accounts of this event ended up being wrong. Linklater was fascinated by this: “False memories, received memories, how we fill in the blanks of conjecture, the way the brain fills in those spaces with something that is technically incorrect—all of these errors allow us to make sense of the world, and are somehow accepted enough to be admissible in a court of law. They are accepted enough to put someone in prison.” And this, remember, was a violent incident that had happened only hours before. The witnesses were describing something that had happened that same day, and they had no incentive to lie.

How much of history is classified as true simply because it can’t be sufficiently proven false?

All of which demands a predictable question: What significant historical event is most likely wrong? And not because of things we know that contradict it, but because of the way wrongness works.

When D. T. Max published his posthumous biography of David Foster Wallace, it was depressing to discover that many of the most memorable, electrifying anecdotes from Wallace’s nonfiction were total fabrications.

In Ken Burns’s documentary series The Civil War, the most fascinating glimpses of the conflict come from personal letters written by soldiers and mailed to their families. When these letters are read aloud, they almost make me cry. I robotically consume those epistles as personal distillations of historical fact. There is not one moment of The Civil War that feels false. But why is that? Why do I assume the things Confederate soldiers wrote to their wives might not be wildly exaggerated, or inaccurate, or straight-up untruths?

I doubt the current structure of television will exist in two hundred fifty years, or even in twenty-five. People will still want cheap escapism, and something will certainly satisfy that desire (in the same way television does now). But whatever that something is won’t be anything like the television of today. It might be immersive and virtual (like a Star Trekian holodeck) or it might be mobile and open-sourced (like a universal YouTube, lodged inside our retinas). But it absolutely won’t be small groups of people, sitting together in the living room, staring at a two-dimensional thirty-one-inch rectangle for thirty consecutive minutes, consuming linear content packaged by a cable company.

[To understand a given era through a TV show.] We’d want a TV show that provided the most realistic portrait of the society that created it, without the self-aware baggage embedded in any overt attempt at doing so. In this hypothetical scenario, the most accurate depiction of ancient Egypt would come from a fictional product that achieved this goal accidentally, without even trying. Because that’s the way it always is, with everything. True naturalism can only be a product of the unconscious. So apply this philosophy to ourselves, and

To attack True Detective or Lost or Twin Peaks as “unrealistic” is a willful misinterpretation of the intent. We don’t need television to accurately depict literal life, because life can literally be found by stepping outside.

If anyone on a TV show employed the stilted, posh, mid-Atlantic accent of stage actors, it would instantly seem preposterous; outside a few notable exceptions, the goal of televised conversation is fashionable naturalism. But vocal delivery is only a fraction of this equation. There's also the issue of word choice: It took decades for screenwriters to realize that no adults have ever walked into a tavern and said, "I'll have a beer," without noting what specific brand of beer they wanted.

But when a show's internal rules are good, the viewer is convinced that they're seeing something close to life. When the rom-com series Catastrophe debuted on Amazon, a close friend tried to explain why the program seemed unusually true to him. "This is the first show I can ever remember," he said, "where the characters laugh at each other's jokes in a non-obnoxious way." This seemingly simple idea was, in fact, pretty novel—prior to Catastrophe, individuals on sitcoms constantly made hilarious remarks that no one seemed to notice were hilarious. For decades, this was an unspoken, internal rule: No one laughs at anything. So seeing characters laugh naturally at things that were plainly funny was a new level of realness. The ways a TV show is photographed and staged (this is point number three) are industrial attributes that take advantage of viewers' preexisting familiarity with the medium: When a fictional drama is filmed like a news documentary, audiences unconsciously absorb the action as extra-authentic (a scene shot from a single mobile perspective, like most of Friday Night Lights, always feels closer to reality than scenes captured with three stationary cameras, like most of How I Met Your Mother).

What is the realest fake thing we’ve ever made on purpose? 

Nothing on TV looks faker than failed attempts at realism. A show like The Bachelor is instantly recognized (by pretty much everyone, including its intended audience) as a prefab version of how such events might theoretically play out in a distant actuality. No television show has ever had a more paradoxical title than MTV’s The Real World, which proved to be the paradoxical foundation of its success.

Roseanne was the most accidentally realistic TV show there ever was…By the standards of TV, both of these people were wildly overweight. Yet what made Roseanne atypical was how rarely those weight issues were discussed. Roseanne was the first American TV show comfortable with the statistical reality that most Americans are fat. And it placed these fat people in a messy house, with most of the key interpersonal conversations happening in the kitchen or the garage or the laundry room. These fat people had three non-gorgeous kids, and the kids complained constantly, and two of them were weird and one never smiled.

The less incendiary take on football’s future suggests that it will continue, but in a different shape. It becomes a regional sport, primarily confined to places where football is ingrained in the day-to-day culture (Florida, Texas, etc.). Its fanbase resembles that of contemporary boxing—rich people watching poor people play a game they would never play themselves.

A few months after being hired as head football coach at the University of Michigan, Jim Harbaugh was profiled on the HBO magazine show Real Sports. It was a wildly entertaining segment, heavily slanted toward the notion that Harbaugh is a lunatic. One of the last things Harbaugh said in the interview was this: "I love football. Love it. Love it. I think it's the last bastion of hope for toughness in America in men, in males."

“But look what happened to boxing,” people will say (and these people sometimes include me). “Boxing was the biggest sport in America during the 1920s, and now it exists on the fringes of society. It was just too brutal.” Yet when Floyd Mayweather fought Manny Pacquiao in May of 2015, the fight grossed $400 million, and the main complaint from spectators was that the fight was not brutal enough. Because it operates on a much smaller scale, boxing is—inside its own crooked version of reality—flourishing. It doesn’t seem like it, because the average person doesn’t care. But boxing doesn’t need average people. It’s not really a sport anymore. It’s a mildly perverse masculine novelty, and that’s enough to keep it relevant.

Midway through the episode, the show’s producers try to mathematically verify if youth participation in football is decreasing as much as we suspect. It is. But the specificity of that stat is deceiving: It turns out youth participation is down for all major sports—football, basketball, baseball, and even soccer (the so-called sport of the future). Around the same time, The Wall Street Journal ran a similar story with similar statistics: For all kids between six and eighteen (boys and girls alike), overall participation in team sports was down 4 percent.

But sometimes the reactionaries are right. It’s wholly possible that the nature of electronic gaming has instilled an expectation of success in young people that makes physical sports less desirable. There’s also the possibility that video games are more inclusive, that they give the child more control, and that they’re simply easier for kids who lack natural physical gifts. All of which point to an incontestable conclusion: Compared to traditional athletics, video game culture is much closer to the (allegedly) enlightened world we (supposedly) want to inhabit.

The gap for the Famous Idaho Potato Bowl was even greater—the human attendance was under 18,000 while the TV audience approached 1.5 million. This prompted USA Today to examine the bizarre possibility of future bowl games being played inside gigantic television studios, devoid of crowds.

What makes the United States so interesting and (arguably) “exceptional” is that it’s a superpower that did not happen accidentally. It did not evolve out of a preexisting system that had been the only system its founders could ever remember; it was planned and strategized from scratch, and it was built to last. Just about everyone agrees the founding fathers did a remarkably good job, considering the impossibility of the goal.

This logic leads to a strange question: If and when the United States does ultimately collapse, will that breakdown be a consequence of the Constitution itself? If it can be reasonably argued that it’s impossible to create a document that can withstand the evolution of any society for five hundred or a thousand or five thousand years, doesn’t that mean present-day America’s pathological adherence to the document we happened to inherit will eventually wreck everything?

Wexler notes a few constitutional weaknesses, some hypothetical and dramatic (e.g., what if the obstacles created to make it difficult for a president to declare war allow an enemy to annihilate us with nuclear weapons while we debate the danger) and some that may have outlived their logical practicality without any significant downside (e.g., California and Rhode Island having equal representation in the Senate, regardless of population).

But I would traditionally counter that Washington’s One Big Thing mattered more, and it actually involved something he didn’t do: He declined the opportunity to become king, thus making the office of president more important than any person who would ever hold it. This, as it turns out, never really happened. There is no evidence that Washington was ever given the chance to become king, and—considering how much he and his peers despised the mere possibility of tyranny—it’s hard to imagine this offer was ever on the table.

Washington’s kingship denial falls into the category of a “utility myth”—a story that supports whatever political position the storyteller happens to hold, since no one disagrees with the myth’s core message (i.e., that there are no problems with the design of our government, even if that design allows certain people to miss the point).

…Every strength is a weakness, if given enough time.

Back in the landlocked eighties, Dave Barry offhandedly wrote something pretty insightful about the nature of revisionism. He noted how—as a fifth-grader—he was told that the cause of the Civil War was slavery. Upon entering high school, he was told that the cause was not slavery, but economic factors. At college, he learned that it was not economic factors but acculturalized regionalism. But if Barry had gone to graduate school, the answer to what caused the Civil War would (once again) be slavery.

Much of the staid lionization of Citizen Kane revolves around structural techniques that had never been done before 1941. It is, somewhat famously, the first major movie where the ceilings of rooms are visible to the audience. This might seem like an insignificant detail, but—because no one prior to Kane cinematographer Gregg Toland had figured out a reasonable way to get ceilings into the frame—there’s an intangible, organic realism to Citizen Kane that advances it beyond its time period. Those visible ceilings are a meaningful modernization that twenty-first-century audiences barely notice.

There’s growing evidence that the octopus is far more intelligent than most people ever imagined, partially because most people always assumed they were gross, delicious morons.

Impressions of Malaysia

A stop on the KL food tour

Malaysia is a country of 30 million people that co-anchors the SE Asian economy with Indonesia. It's also a place known for great food and friendly people. So I was delighted to have the opportunity to visit recently, for the first time, to give a couple of talks.

I was only in Kuala Lumpur, so it's probably more accurate to say I visited Kuala Lumpur rather than Malaysia, as the smaller towns and countryside are quite different from the capital city. (The most striking example of this distinction in Asia is the grand canyon of a difference between "Beijing/Shanghai" and "the rest of China.")

Kuala Lumpur was more relaxed than I was expecting. A few years ago I spent several days in neighboring Jakarta, a city that overwhelms you with traffic and chaos. KL felt positively tranquil by Jakarta standards.

There isn't a must-see attraction in KL. There are plenty of striking skyscrapers to gawk at; all the luxury hotel brands with posh buildings; some nice-looking mosques; various museums, an aquarium, and so on. The malls are fun and huge and contain everything: movies, nice restaurants, casual restaurants, salons, coffee shops, all sorts of retail, banks, and more. Just wandering around a massive Malaysian mall gives you plenty to look at and think about.

The Kuala Lumpur food tour is very much worth doing. We tasted Malay, Chinese, and Indian food in places where there wasn't a tourist in sight. A food tour remains a favorite way for me to see a city, learn about its culture and economy, and, of course, taste some of its food. I've done them in Istanbul, Copenhagen, Kyoto, and now Kuala Lumpur. It's great for non-foodies too: these tours tend to emphasize cheap local eats.

Malaysia is predominantly Muslim and seemingly a bit stricter about religious rules than its neighbor Indonesia. In Malaysia, if you're not Muslim and you marry a Muslim, you are legally required to convert — a requirement that's uncommon in other Muslim-majority countries. Also, the local scandal of the moment in KL, as it was relayed to me by a few secular locals, was a boycott of Auntie Anne's. Yes – the Western chain that sells those delicious pretzels and hot dogs. The reason for the boycott? The phrase "hot dog" on the menu. Dogs are apparently considered unclean by many Muslims. Thus, the phrase "hot dog" offends. Strange world. Strange times…

The Factor That Colors Happiness and Unhappiness the Most

Is she enjoying the city she’s living in? Is he enjoying his job, his co-workers, his boss? How’s he feeling about life?

In my view, the factor with the most explanatory power on questions of happiness and satisfaction in one's personal and professional life is this: Is this person satisfied with their romantic relationship status?

Note I am not saying “in a happy relationship.” There are plenty of people who are single and very happy with that status. But those who are single and yet would like to be in a relationship tend to be unhappy, and project that unhappiness across all aspects of their lives. Those in unhappy relationships act similarly.

Romance rules all. To deeply understand a person and their probable happiness is to understand their romantic happenings. Of course, in almost all professional contexts, and a great deal of personal ones too, it is inappropriate to probe on such topics. Which is one reason why most friendships are not very deep.

What I’ve Been Reading

Books and more books.

1. Digital Gold by Nathaniel Popper. An incredibly engaging journalistic introduction to Bitcoin and blockchain, with cinematic storytelling about the people who pioneered the technology over the past 15 years. The book is about a year old and since then, Bitcoin has struggled, though I suspect many of the characters in this book — and the experts in real life — remain bullish on cryptocurrencies in the long run. Excellent for those learning the fundamentals of Bitcoin and Bitcoin history.

2. Dark Matter by Blake Crouch. I made the mistake of starting this sci-fi thriller at 11pm one night in bed. I was up till 1am. It’s a classic page turner set against the backdrop of the Many-Worlds interpretation of quantum mechanics, which is a real thing and which is super interesting to contemplate. (There is a universe right now where Donald Trump is not the current president-elect, for example.)

3. Lightning Rods by Helen Dewitt. I definitely would not admit to being entertained by this entertaining novel. Definitely not.

4. Upheavals of Thought: The Intelligence of Emotions by Martha Nussbaum. I'm a Nussbaum fan, and the early chapters here contain provocative reflections on the power of emotions: how an emotion like fear of death, for instance, manifests in so many aspects of our thought stream. Ultimately this was too dense for me to make it all the way through, but I'm glad I read as far as I did.