Book Review: Tyler Cowen’s “The Age of the Infovore”

An essay originally published by the American Enterprise Institute in 2009.

I liken my information consumption patterns to eating at a churrascaria, the Brazilian-style steakhouse where passadors (meat waiters) circle your table with knives and skewers and offer fresh cuttings of filet mignon, lamb, duck, sausage, and more in unlimited quantities. The waiters know me well and recommend new cuts of meat. A hand-selected group of friends from all over the world joins me at my table each night. They are outstanding conversationalists. At the churrascaria I eat, I drink, I debate, and I leave feeling not just full but deeply satisfied with the whole experience.

The restaurant is my RSS reader (“Really Simple Syndication”) and the meat is the content flowing from my 120 subscriptions to websites and blogs. An RSS reader downloads the latest content from each site whenever it is updated and displays it all in one application. The restaurant ambience, conversation, menu recommendations, and emotional satisfaction that settles in the stomach after a long meal with good friends: this is the social and interactive aspect of the Web.
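To make the mechanics concrete: at its core, an RSS reader is just a program that fetches an XML file from each subscribed site and extracts the entries. Below is a minimal sketch in Python of that core loop; the feed URL is a hypothetical placeholder, and a real reader adds polling on a schedule, de-duplication of items already seen, and a reading interface.

```python
# Minimal sketch of the core of an RSS reader (standard library only).
# The feed URL below is a hypothetical placeholder, not an actual
# subscription; a real reader would loop over dozens of these on a timer.
import urllib.request
import xml.etree.ElementTree as ET

def fetch_items(feed_url):
    """Download an RSS feed and yield (title, link) pairs for its items."""
    with urllib.request.urlopen(feed_url) as response:
        xml_data = response.read()
    root = ET.fromstring(xml_data)
    # In RSS 2.0, entries appear as <item> elements under <channel>.
    for item in root.iter("item"):
        title = item.findtext("title", default="(untitled)")
        link = item.findtext("link", default="")
        yield title, link

if __name__ == "__main__":
    subscriptions = ["https://example.com/feed.xml"]  # hypothetical URL
    # Aggregating many feeds into one stream is the reader's whole trick.
    for url in subscriptions:
        for title, link in fetch_items(url):
            print(f"{title}\n  {link}")
```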

In my RSS reader there are subscriptions of all sorts: from personal blogs by individual thinkers I respect (Will Wilkinson, Paul Kedrosky, James Fallows, many others), to topical blogs that cover various interests (entrepreneurship, airline travel, cognitive science, psychology, education, the publishing industry), to the bookmarks and Twitter feeds of friends. I pull in the latest from my favorite columnists at the Washington Post, Financial Times, and Los Angeles Times. I even subscribe to a haiku website where frustrated patrons of my local San Francisco public transit line sound off in 5-7-5 style.

On any given day I might read an in-depth analysis of a new economic policy, followed by tips on meditation, followed by a reflection from a personal friend whose mother was recently diagnosed with cancer, rounded off by a couple of YouTube clips.

In other words, my online information diet is diverse, highly personal, and composed mostly of short bits that I consume one after another in rapid-fire fashion. And it all happens in about the same time it would take me to go through a daily newspaper.

In his new book The Age of the Infovore: Succeeding in the Information Economy, Tyler Cowen, a respected academic and top economics blogger, offers two explanations for my style of info consumption.

The first is economic: when culture is free and a click away, as it is on blogs and Twitter and the broader Internet, we sample broadly and consume it in smaller chunks: “When access is easy, we tend to favor the short, the sweet, and the bitty. When access is difficult, we tend to look for large-scale productions, extravaganzas, and masterpieces,” writes Cowen. “The current trend—as it has been running for decades—is that a lot of our culture is coming in shorter and smaller bits.” Think 30-second YouTube clips instead of a full movie, iTunes singles instead of complete albums, two-paragraph blog posts instead of an entire essay. And now the 140-character limit on Twitter instead of a blog-style free-form text box.

The second reason is the intellectual and emotional stimulation we experience by assembling a custom stream of bits. Cowen refers to this process as the “daily self-assembly of synthetic experiences.” My inputs may look like a chaotic jumble of scattered information, but to me they touch all my interest points. When I consume them as a blend, I see all-important connections between the different intellectual narratives I follow—a business idea (entrepreneurship) in the airplane space (travel), for example. Because building the blend is a social exercise—real communities and friendships form around certain topics—my social life and intellectual life intersect more intensely than before. And I engage in ongoing self-discovery by reflecting upon my interests, finding new bits to add to my stream, and thinking about how it all fits together.

Cowen maintains that these benefits enhance your internal mental existence; how you order information in your head and how you use this information to conceive of your identity and life aspirations affect your internal well-being. Because a personal blend reflects a diverse set of media (think hyper-specific niche news outlets in lieu of a nightly news broadcast that everyone watches on one of three networks), and because each person constructs their own stories to link their inputs together, the benefits are unique to the individual. They are also invisible. It is impossible to see what stories someone is crafting internally to make sense of their stream; it is impossible to appreciate the personal coherence of it.

The way the benefits of info consumption habits accrue privately while the habits themselves are judged publicly approximates romance, Cowen adds. Compare a long-distance relationship to a proximate one. In a long-distance relationship, you have infrequent but very high peaks when you see each other. Friends see you run off for fancy getaway weekends when the sweetheart comes to town. Yet day-to-day it is not very satisfying. In a marriage, by contrast, you have frequent, bite-size, mundane interactions that rarely hit peaks or valleys of intensity. The happiness research asserting that married couples are happier than non-married ones, and especially happier than couples dating long-distance, is not always self-evident from the outside. Outsiders see the inevitable frustrations and flare-ups that mark even stable marriages. What they cannot see is the interior satisfaction the couple derives from weaving these mundane moments into a relationship rich in meaning and depth, and from writing a shared life narrative that is all their own. Cowen puts the analogy this way:

Many critics of contemporary life want our culture to remain like a long-distance relationship, with thrilling peaks, when most of us are growing into something more mature. We are treating culture like a self-assembly of small bits, and we are creating and committing ourselves to a fascinating brocade, much as we can make a marriage into a rich and satisfying life.

________

Learning from Bits

Beyond internal order and pleasure, what can you learn about the external world from bit-based information consumption habits? Do bits cohere into knowledge?

Consider my casual interest in libertarianism. Several years ago, like many young wage earners writing larger and larger checks to the IRS, I decided to learn more about the subject. I had not read any books or taken any classes on it. Names like Hayek and Nozick were foreign to me. Since I did not know whether the topic would hold my interest, I did not want to commit a lot of time upfront to studying it. But I was interested enough to do the low-cost, easy thing: I subscribed to a few blogs, adding them to my daily mix. By following links (the currency of the Web) I found more blogs, added a few libertarian thinkers on Twitter, and listened to podcasts. Because of the social nature of the Web, I became friends with the authors of the content and connected with them on Facebook. On my blog, meanwhile, I tested out some of my own ideas on the political theory.

Over time the bits have cohered into something meaningful. My bottom-up, interactive, iterative, in-the-background accumulation of bits about libertarianism has made me an informed amateur. Unlike the subjects I studied in school, where I read a few books for a semester and moved on, libertarianism populates my daily blend in bits. I am sticking with it on a regular basis in part because it is just more fun than alternative modes of learning.

Within my online information diet, it is exhilarating to follow narratives, read the latest controversy (seasteading, anyone?), add my own two cents to the debate, and stitch together all that I have learned. Self-education has gone from sitting alone in a bar sparsely populated with hazily attractive women to standing in the center of a packed, rocking nightclub where the women wear miniskirts and the guys’ shirts are unbuttoned several buttons down. As Cowen puts it, “The emotional power of our blends is potent, and they make work, and learning, a lot more fun.” When a topic gets filtered through a two-way, fast-moving, personal bit stream, it commands my attention in a way the static, one-way, black-and-white version of the topic never could.

It is true that bits can only take you so far. I cannot become an expert on political science by reading snippets alone in disjointed fashion, no matter how much they may cohere over time and in my head. Expert understanding of an issue requires a type of foundational knowledge that comes from intense, monogamous engagement with the topic. The most popular way to do this is to read detailed books about something and then write about it at length.

Some of the most successful consumers and producers of intellectual bits on the Internet spent 30-plus years pre-Internet reading long books and establishing the foundation of knowledge upon which their bits sit. Consider Cowen, who blogs prolifically at marginalrevolution.com, and the political journalist Andrew Sullivan, who riffs on current affairs at The Atlantic dozens of times a day. Both hold Ph.D.s.

But here’s the catch: I’m not interested enough in libertarianism to read the books. Most people have not broached the great books and never will. Even in blissful yesteryear—that sepia-colored past that no one can quite place but everyone can nostalgically agree was the age of true intellectualism—even then most passed on the canon. Today, reading and editing Wikipedia entries displaces TV surfing, not time spent reading The Iliad.

For many people on many topics, it’s the bits or nothing.

________

Side Effects of Bits Culture: An Attention Crisis?

Just as the club-goer might pay the price for late-night revelry with a next-morning hangover, the ravenous consumer of bits—skimming along the surface of ideas, darting from site to site—might pay the price in the form of a short attention span, shallow reflection time, and an inability to dive deep into something even when it is desired. With the care and lack of sensationalism you have come to expect from the media, this point of view (which I will summarize but do not endorse without qualifications) is variously headlined “The Attention Crisis,” “The Dumbest Generation: How the Digital Age Stupefies Young Americans and Jeopardizes Our Future,” and “Is Google Making Us Stupid?”

A year ago, Nicholas Carr wrote an essay in The Atlantic in which he pondered:

Immersing myself in a book or a lengthy article used to be easy. My mind would get caught up in the narrative or the turns of the argument, and I’d spend hours strolling through long stretches of prose. That’s rarely the case anymore. Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I’m always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle.

Carr explains that a decade of information consumption on the Internet has made him less able to focus on long books or articles even when he wants to.

His article had broad resonance. It struck a chord with devotees of life-hacking gurus such as Tim Ferriss and Merlin Mann, who promote a “low information diet” and infrequent email checking as keys to productivity and success. It shone a contemporary light on the philosophy of famed computer scientist Donald Knuth, who has not used email since 1990 because of how it interferes with deep thinking: “Email is a wonderful thing for people whose role in life is to be on top of things. But not for me; my role is to be on the bottom of things.”

In April, Laura Miller of Salon cited Carr’s article in a review of Winifred Gallagher’s new book Rapt: Attention and the Focused Life. Miller’s review frames Carr’s argument in evolutionary terms: we are wired to pay attention to loud noises, movement, shiny objects, and other signs of threats or meals. To survive, our ancestors had to deploy quick, reactive attention when a famished tiger approached their camp. Today’s new, urgent threats are far milder—unread email (there is something almost irresistible about a bold unread message), blog posts, and pop-up windows—but we treat them as equally pressing. Meanwhile, the reflective attention we allocate to less exciting, non-urgent, creative or intellectual endeavors with long-term payoffs gets drowned out. Unless, of course, we make a conscious choice to resist our reactive impulses and cultivate “the conditions conducive to deep thought,” which Miller says “have become increasingly rare in our highly mediated lives.”

When we binge on ice cream, we know we’re filling ourselves up but not in the right way. When we binge on 250-word articles, videos, and “10 Ways to Improve Your Body Today!” blog posts, we are better at deluding ourselves into thinking that what we are doing matters and that how we are doing it comes with no trade-offs. In other words, your attention or concentration might be diminishing to that of a one-eyed fly, and you do not even know it. “People aren’t aware of what’s happening to their mental processes in the same way that people years ago couldn’t look into their lungs and see the residual deposits,” says David Meyer of the Cognition and Perception Program at the University of Michigan.

The draconian bottom line for these critics is as follows. The human brain is a famously plastic organ: how we use it shapes what it can do and what it becomes. If we spend all our mental cycles getting quick hits from blogs and our BlackBerries, our brains will optimize around this deployment of attention. Reading complicated books will become a hell of a chore, and enduring long stretches of reflective solitude will become nearly unbearable. The bastions of intellectual culture are preparing to weep.

________

Age-Old Distractions and Un-focus

Arguments surrounding the attention crisis are grounded in the faulty assumption that frantic multi-tasking, unreflective bit-consuming, and witheringly short attention spans are new phenomena—that we are more distracted and reactive now than we were in times past—and thus the culprit must also be something new to this moment, namely the Internet.

Vaughan Bell of the University of London coined “the myth of the concentration oasis” to refer to the idea that pre-Internet people led lives free of distraction and invested hours on end in single activities. “If you think Twitter is an attention magnet,” Bell says, “try living with an infant.” For centuries mothers have had to prepare dinner, watch the door for street vendors, process errant visual and aural stimuli, and look after their children—all at once.

We have always had distractions. We have never had long attention spans. We have never had a golden age in which our minds could freely concentrate on one thing and spawn a million complex and nuanced thoughts. Cowen reminds us that the same charges have been leveled at the birth of every new cultural medium throughout history.

Moreover, the technologies that are supposedly turning our brains to mush are very much within our control. The difference between the new distractions (a flickering TV in the kitchen) and the age-old ones (a crying infant) is that the TV can be turned off, whereas the crying infant cannot. Willpower decides whether technology takes over your life (and your brain). To write this article, I turned off my email and phone to minimize distractions. Millions of hyperactive teens chose to stop texting and IM-ing and curled up with Harry Potter.

The glorification of “focus” is the second problem with the criticisms of bit-consumption and technology use in general. While some amount of focus is necessary, it is not the case that sitting alone in a quiet, white-walled room with no beeps or buzzes is the ultimate day-to-day environment for deep, creative thinking. Sam Anderson in New York Magazine summarized research showing that un-focus is actually an important part of creativity—random meanderings and conversations can trigger important creative insights. Excessive conscious attention on one particular point can come at the cost of the free-associative brainstorms that just might lead to the next big thing. A University of Amsterdam study showed that participants who were distracted from making a decision, and forced to consciously focus on something else, devoted valuable unconscious thought to the issue and ultimately made a better decision when they returned to the task.

David Allen, the well-known productivity expert, says young people today use chaos and un-focus to generate creative insights. He notes a 25-year-old rock guitarist who says the best ideas for new songs come when he is working on another one. Allen wonders:

Maybe we allowed ourselves to be “taught” to focus, to constrain our impulsive thinking, to “pay attention”—and in the process ossified our ability to multi-perceive? Maybe there was a prior state of wide-eyed wonder that we lost.

We ought to consider the possibility that attention is not only reflective or reactive, that thinking is not only deep or shallow, and that focus is not only deployed on task or off it. There might be a synthesis that amounts to what Anderson calls “mindful distraction.” Gen Y, the so-called ADD generation raised on the Internet, might be the ambassadors for mindful distraction. Young people are still accomplishing big things, their IQs are as high as ever, and innovation continues apace… even as they are tethered to their iPhones. Somehow, they’ve made it work.

________

Neurodiversity and Individuality

Not everyone is a 25-year-old rocker searching for new songs. Some people’s creative aspirations are modest, and their immediate concerns are how to complete three assigned tasks by five o’clock without succumbing to Twitter or email temptations. Nor is everyone a tenured professor, like Tyler Cowen, whose public intellectual function demands up-to-the-minute engagement with news and culture. An entrepreneur starting a company needs to “get stuff done” in a more real sense—focus head-down on a single product launch, say. There is no time for the extensive assembling of information bits.

The optimal amount of focus or un-focus in your life, and the intricacy of your own personal blend of bits, will depend on who you are, what you are trying to do, and especially how your brain works.

This is why Cowen starts and ends his book with an emphasis on neurodiversity (the range of cognitive styles and preferences) and individuality. He thinks we need to celebrate diversity of minds in general and the “autistic cognitive style” in particular.

Autistics are extreme information lovers. They also prefer to consume small bits of information over which they impose local coherence and meaning. (Preferences aside, high-IQ autistics have no problem seeing the bigger picture, too.) Someone who is not autistic can use the new social technologies to “replicate or mimic some of the information-absorbing, information-processing, and mental-ordering abilities of autistics.” This is what Cowen means by an autistic cognitive style, which he takes pains to distinguish from an autism diagnosis.

The RSS reader experience I described at the outset is heaven to someone with an autistic cognitive style keen on collecting and ordering information to an intense degree.

When skeptics make sweeping negative claims about how the Web affects cognition, they are forgetting the people whose natural tendencies and strengths blossom in an information-rich environment. Cowen’s overriding point, delivered in a “can’t we all just get along” spirit, is that everyone processes the stimuli of the world differently. Everyone deploys attention in their own way. We should embrace the new tools—even if we do not personally benefit—that allow the infovores among us to perform tasks effectively and acquire knowledge rapidly.

________

There Is No Turning Back the Clock

The debates about the effect of technology on society are predictable. When something new comes on the scene, Silicon Valley geeks cheer. Then when it hits the mainstream (as Twitter just has) the usual contingent of worried mothers and aging academics wring their collective hands about how gizmo X is destroying all that is great about human civilization. The geeks hurl back the Luddite caricature, and the mothers fire back with the wild-eyed techno-optimist-determinist charge.

If we must place The Age of the Infovore on this spectrum, it sits solidly on the techno-optimist side, but with an eye toward taming passions, not inflaming them. Defending the often-skewered with gentle wit and wisdom (and real academic research) is something Cowen does well. Over his career he has defended globalization from the charge that it homogenizes and corrodes culture; he has defended legalized prostitution and chain bookstores; and he has defended commercialism and consumerism on aesthetic grounds.

But the book is more than a general defense of technological progress against the latest round of timid Luddites. It is about autism; it is about the economics of stories and the importance of narrative to self-understanding; it is about creating and organizing internal personal worlds as refuge against any amount of external ugliness. It is about too many things, perhaps. It is sometimes hard to find the thread connecting the wide coverage—from riffs on higher education to a character study of Sherlock Holmes to a jargon-filled take on Immanuel Kant—and this weakens the ultimate impact of any single thesis. Cowen has many different theories and finds room for them all.

If these theories, both the disparate and the connected ones, were not so interesting and original, the book would disappoint. But this is one of the most stimulating defenses of Internet information culture yet written, and the perspectives on autism, storytelling, and interiority do not have to be convincing to be fascinating. It is also fun: the playful audacity Cowen displays, for instance, when pondering why intelligent life on other planets has not called us to chat (they have embraced the life of the mind, that’s why!) keeps you engaged all the way through.

The factor most in Cowen’s favor is the wind at the backs of all techno-optimists like his brethren Clay Shirky and Don Tapscott: the forward momentum of technological development. You cannot turn back the clock. It is impossible to envision a future where there is less information and fewer people on social networks. It is very possible to envision increasing abundance along with better filters to manage it. The most constructive contributions to the debate, then, heed Moore’s Law in the broadest sense and offer specific suggestions for how to harness the change for the better.

The Internet makes smart people smarter and dumb people dumber. This book shows the work of a smart person who has converted the Web from a generic all-you-can-eat buffet into an exquisite churrascaria of his own creation. After reading Cowen, you cannot help but want to similarly elevate your own intellectual life by embracing your cognitive style, assembling your own blends of information, and joining a social feast that is stimulating, fun, educational, and even a bit distracting.