Zadie Smith on Technology and Philosophy

I love Zadie Smith, but her lengthy review of The Social Network movie is disappointing. She tries to do a macro cultural critique of the online social network phenomenon but gets lost pretty quickly. A sample paragraph:

When a human being becomes a set of data on a website like Facebook, he or she is reduced. Everything shrinks. Individual character. Friendships. Language. Sensibility. In a way it’s a transcendent experience: we lose our bodies, our messy feelings, our desires, our fears. It reminds me that those of us who turn in disgust from what we consider an overinflated liberal-bourgeois sense of self should be careful what we wish for: our denuded networked selves don’t look more free, they just look more owned.

Say what?

Most negative pieces like Smith's are premised on the idea that Facebook and the web are changing our lives in a massive way. Most positive pieces share the same premise, except they argue that everything is sweetness and light.

Someone should write an article that argues the total impact (good or bad) of social networking technologies on an individual's identity, philosophies, behavior, and relationships may actually be overstated by the legion of recent essayists and filmmakers. And that it may be especially overstated even by those who claim it's been life changing — i.e., the piece skeptically assesses first-person testimonies. I'm not saying I hold this view, but it would be a refreshingly different way to frame the conversation.

###

Here's William Gibson on related topics in a recent interview. One line on globalization:

I’ve become convinced that nostalgia is a fundamentally unhealthy modality. When you see it, it’s usually attached to something else that’s really, seriously bad. I don’t traffic in nostalgia. We’re becoming a global culture.

The Wisdom of Your Former Self

Andy McKenzie quotes the great Portuguese writer Fernando Pessoa:

I often find texts of mine that I wrote when I was very young–when I was seventeen or twenty. And some have a power of expression that I do not remember having then. Certain sentences and passages I wrote when I had just taken a few steps away from adolescence seem produced by the self I am today, educated by years and things. I recognize I am the same as I was.

Usually people shudder with embarrassment at the prospect of coming upon writings of their youth. It's the same logic techno-skeptics use when advising youth not to blog: Your adolescent riffs will come off as naive and immature to your wise, adult self.

Pessoa in adulthood found the opposite. The texts he wrote at 17 or 20 display insight of a caliber close to what he could produce as an older man.

Isn't this prospect — that there's little significant difference between your youthful thoughts and adult ones — even more terrifying?

Since few adults today have records of their youth, it's easier for them to maintain self-serving narratives about how far they've come in adulthood. This will change as more young people publish on Twitter and blogs. Twenty-somethings today will one day look back at those permanent, forever-Googleable writings and either shudder in embarrassment, as commonly presumed, or, like Pessoa, find them remarkably similar to their adult expressions. In the cases where there's no significant difference, the lesson that one must work to become wiser will be unavoidable.

The Origins of “Think Different”

My sixth grade technology teacher changed my life.

He taught an early-morning elective class on computer repair in which we learned how to fix Macintosh computers. The curriculum covered how to take apart hard drives, how to re-install system software ("C is for CD and that's good enough for me" was the jingle to remember to hold the C key when starting a computer from system software), re-build desktops, run Disk Utility the right way, partition hard drives, and much more. In exchange for the free 7:00 AM class I had to periodically do maintenance and repair work on the school's computers. It was a hell of a deal: the skills I picked up continue to serve me well, and that class facilitated my burgeoning interest in software and the internet (with which I would soon become obsessed).

But the biggest gift from that class and teacher had nothing to do with the nitty gritty of computer repair. Rather, it was the introduction of a certain kind of life philosophy. He forced all of us to memorize the text of the Apple "Think Different" television campaign. We had to recite the ad back to him word-for-word in order to pass the class. It was a profoundly inspiring message.

On our last day, he wrote each of us a personal letter, continuing the theme of the advertisement. Mine read: "If you continue to work hard and do well, you can acquire the skills needed to change the world. With education one can make great scientific or technological breakthroughs, curb world hunger and child labor, prevent the spread of nuclear weapons, promote peace, and have the power to bring about great change in the world. With education, you have the power to do nearly anything. If you don't change the world, who will?" Then, as a postscript, he added: "Be sure to back up your hard drive."

In the 10 years since, I have not forgotten a word from the advertisement and have recited it hundreds of times (no exaggeration!) to whoever will listen, in various venues. (And the short video series I did last year was called Think Different TV.)

For all these reasons, I was extremely intrigued to watch this six-minute clip of a young Steve Jobs discussing the origins of the advertisement in the context of marketing, branding, and values. Highly recommended for everyone in business.

(via TechCrunch)

The Feel-Bad Effect from Not-So-Close Facebook Friends

My good friend Stan James writes about how social networks amplify the feel-bad-in-comparison effect when you see people raving about how glorious their lives are:

In my trips back to Colorado, I have been struck each time by the discord between people’s Facebook lives and what they say in private. On Facebook they have been on an amazing vacation to exotic beaches. In person they confess that the vacation was a desperate attempt to save a marriage. On Facebook they have been to glitterati tech conferences. In person they confess they haven’t been able to sleep for months, and are on anti-anxiety medication from the stress of financial pressures on their company.

What’s interesting is that this feel-bad Facebook effect seems to come from a distinct source: not-so-close Facebook friends.

In the case of true close friends, you know about all the crap that is going on in their lives. From deep interaction, you know the specific pains and doubt that lies behind the smiling profile picture…

Since TV was invented, critics have pointed out the dangers of watching the perfect people who seem to inhabit the screen. They are almost universally beautiful, live in interesting places, do interesting work (if they work at all), are unfailingly witty, and never have to do any cleaning. They never even need to use the toilet. It cannot be psychologically healthy to compare yourself to these phantasms.

So it’s interesting that social networks have inadvertently created the same effect, but using an even more powerful source. Instead of actors in Hollywood, the characters are people that you know to be real and have actually met. The editing is done not by film school graduates, but by the people themselves.

In the end, my friend’s strategy seems to be the right one: don’t spend too much time perusing the lives of people who aren’t in your life. And spend more time learning about the uncut, unedited, off-line lives that your friends are actually living.

Very true when reading other people's public content. People tend not to share their warts in public forums. Keep that in mind if you feel shitty in comparison when reading about the apparent charmed life of a blogger you don't know well in real life.

Four years ago I wrote a somewhat similar post but from the perspective of a person who writes generally upbeat tweets and blog posts. When you know you are going to blog about an experience before you have the experience, you want it to be good so that you can write a positive post that's fun to write and read. It changes the actual experience to be more positive. After writing about the (positive) experience, it's in the historical record. When you read old posts to remember your past, you feel happy about all the positive experiences you accumulated and recorded. It's not just about whitewashing the past or selective memory (though this is part of it); there's an anticipatory effect of sharing the experience in a public forum that changes the actual experience for the better.

#

I just told a friend I was writing this post. She said, "This is a litmus test I use for how close I am with a friend. If s/he doesn't tell me anything bad about their life, I assume we're not very good friends."

###

Note to readers: Blogging will be light for the next month due to extensive travel, and probably more sporadic than usual for the rest of 2010. If you don't already use an RSS reader I encourage you to do so, and subscribe to this feed. You can also get my posts via email.

The Effect of the “Like” Button

The "Like" button on Facebook (and now all over the web) allows a user to indicate positivity about a piece of content without actually writing a comment. Millions of people have "Liked" status updates, notes, photos, bios, comments, and now external web pages. Similar one-click sentiment links ("Endorse," "Helpful," "Not Helpful") now exist on LinkedIn, Quora, and other social networking sites. And of course "Upvote" and "Downvote" arrows have driven sites like Reddit for a long time.

What is the effect (as I see it) of these types of one-click sentiment buttons? In a sentence: Overall engagement goes up, substantive comments / contributions go down.

Take a blog post. Historically, the only option for a user to engage was to leave a written comment. Suppose I write a post and five people leave comments. With the addition of a "Like" button I would estimate three people would leave comments, but four people would click "Like." Before sentiment buttons: 5 written comments. After sentiment buttons: 7 engaged users, 3 commenters.
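The back-of-the-envelope numbers in that example can be made explicit. A minimal sketch (the figures are my own illustrative estimates from the paragraph above, not measured data):

```python
# Hypothetical before/after engagement on a single blog post.
comments_before = 5   # pre-"Like": writing a comment was the only option

comments_after = 3    # post-"Like": some former commenters now just click
likes_after = 4       # passive positive sentiment, one click each
engaged_after = comments_after + likes_after

print(f"Before: {comments_before} engaged users, all commenters")
print(f"After:  {engaged_after} engaged users, {comments_after} commenters")
```

The point the numbers make: total engagement rises (5 to 7) even as substantive written contributions fall (5 to 3).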

My friend Dario Abramskiehn asked me a while ago why certain of my posts receive more comments than others. I follow Chris Yeh's theory on comments: the less serious / difficult / lengthy the blog post, the more comments it will have, assuming an average level of interestingness. I chalk this up to the law of reciprocity: if you take the time to crank out something really thoughtful and original, readers feel they must reciprocate the effort in a comment. So many, naturally, abstain. By contrast, if you post something provocative and short, it's easy to leave a quick comment and feel square with the effort of the blogger.

With a "Like" button, readers who would have previously abstained can now indicate passive positive sentiment. Some readers who previously left a comment but did so half-assedly would now click "Like."

The generic, safe nature of "Like" also increases total engagement when difficult topics would otherwise deter readers from chiming in. A friend recently posted a status update on Facebook about a relative's fight against cancer. It was a positive update — i.e., one that warranted congratulations or encouragement — but, given the sensitivity of illness and the fear of saying the wrong thing, the status received quite a few "Likes" and almost no comments. By contrast, a more straightforward message — such as my tweet about how many libertarians are religious — received several comments but no "Likes."

Bottom Line: The "Like" feature and other passive sentiment links next to content on the web show that the way users engage with content will continue to change, and that the way to measure the vitality of an online community continues to be more complicated than raw numbers such as unique visitors or numbers of comments.

(thanks to Steve Dodson for helping brainstorm this. P.S. "Like" buttons coming to this blog soon.)

Age Matters to Multitasking and Information Diets

Tyler Cowen continues the discussion about internet information culture and multitasking:

Do law partners and top investment bankers multitask?

Yes.

I won't quite write "end of story" but…

Of course top CEOs don't multitask all the time, they multitask selectively, combined with periods of extreme focus.  Still, I would say that multitasking is passing the market test… It's one thing to think that a seventeen-year-old teenager will multitask too much; it's another thing to make the same claim about an extremely valuable executive, surrounded by assistants, time management specialists, and so on. [BC: Emphasis mine.]

It's true that virtually every high-performing CEO I know multitasks. To the extent they are as effective as ever, and I think they are, who's to tell them to stop multitasking?

The harder question is about the 17-year-old. Arguably, the over-30 CEO has developed a "foundation" of focus — by that I mean he has experience from pre-internet, pre-iPhone life of not multitasking as much, and thus he knows when to turn off the background noise and do the extreme focus Tyler mentions. The 17-year-old, by contrast, has no such experiences. Multitasking is all he's ever known and all he's ever doing, even as he is "immersed in a developmental stage where impulse control is dangerously weak and the brain is at a peak of malleability." Will he be able to do extreme focus when he must?

A different but related issue is about information consumption in the age of the web. Some of the most successful consumers and producers of intellectual bits on the Internet — guys like Tyler and Andrew Sullivan — spent 30-plus years pre-Internet reading long books and establishing the foundation of knowledge upon which their bits sit. Me? I've grown up on the web. I haven't read all the Great Books. My model is more a mix of books and bits. I do believe the bits will cohere in the long-run into a kind of foundational knowledge of the sort Tyler got from books, but perhaps the books/bits ratio for me should be different than his at this stage.

What the Net is Doing to Our Brains

The conversation is back with the release of Nick Carr’s book The Shallows: What the Internet is Doing to Our Brains. I haven’t read it yet but you can get the gist by perusing the reviews, following Carr’s blog, or reading RSSted Development where I briefly contrast Carr and Cowen. To see some back-and-forth on what the studies actually say about technology and distractibility, read the comments section of Jonah Lehrer’s post.

I continue to try to figure out how I can improve my ability to concentrate, and I worry about how the internet is adversely affecting that mission. In the end I fall into the pro-internet camp, if such a crude distinction can even be made, but I do not think this is mutually exclusive with whole-hearted support of the broader conversation Carr has ignited or this Alain de Botton quip:

One of the more embarrassing and self-indulgent challenges of our time is how we can relearn to concentrate. The past decade has seen an unparalleled assault on our capacity to fix our minds steadily on anything.

Recent steps I’ve taken to improve my ability to concentrate: a) track my time more rigorously; b) use self-control to block access to Twitter, Facebook, and other time-sink sites; c) turn off email for hours at a time; d) don’t use mobile email; e) wear Bose headsets to block out noise and to remind myself I’m supposed to be working.

###

  • Speaking of books I haven’t read yet, The Authenticity Hoax sounds interesting.
  • Here’s a clip showing what not to do if you’re a PR person faced with an inquisitive reporter via SF’s Laguna Honda hospital.
  • AEI crunches the numbers on how much money U.S. airline consumers would save if Open Skies were global.
  • Colin Marshall asks how much human energy is wasted on personal relationship re-engineering (aka therapy).

Slowing Rate of Change and Tech Innovation

"We flatter ourselves by imagining that we live in an age of endless invention and innovation," says Paul Kedrosky. A classic approach is applying Moore's Law to…everything, and then leaping to claims about unprecedented change in society more generally. I'm reading a number of commentators call bullshit. They are arguing that the number of new important innovations has been steadily declining and that the pace of change is slowing.

Here's Philip Longman in U.S. News & World Report:

There is a distinction to be made between inventions that are merely sophisticated–such as, say, personal digital assistants–and those that fundamentally alter the human condition. The invention of the light bulb created more useful hours in each day for virtually every human being. The electric motor directly raised the productivity in every sphere of life, from speeding up assembly lines to creating so many labor-saving devices in the home that millions of housewives were able to join the paid work force. The internal combustion engine allowed for mass, high-speed transportation of both people and freight while also opening up vast regions of cheap land to suburban development. The materials revolution that brought us petroleum refining, synthetic chemicals, and pharmaceuticals involved learning to rearrange molecules in ways that made raw materials fundamentally more valuable. Without the genetically improved seeds that brought us the "Green Revolution" of the late 1960s and '70s, there would be mass starvation.

Can we make any parallel claim about the single greatest technology of our own time? It remains possible that networked computers and other new information technologies will one day create similar, societywide bursts in productivity, health, and wealth. Yet to date, the marginal gains computers have brought to communications are modest even compared with the improvements made by the telegraph. The first trans-Atlantic telegraph cable in 1866 reduced the time required to send a message from New York to London from about a week to a few minutes. Notes economist Alan Blinder: "No modern IT innovation has, or I dare say will, come close to such a gain!"

Here's Scott Sumner with a personal observation in a post about economic growth rates:

My grandmother died at age 79 on the very week they landed on the moon. I believe that when she was young she lived in a small town or farm in Wisconsin. There was probably no indoor plumbing, car, home appliances, TV, radio, electric lights, telephone, etc. Her life saw more change than any other generation in world history, before or since. I’m already almost 55, and by comparison have seen only trivial changes during my life. That’s not to say I haven’t seen significant changes, but relative to my grandma, my life has been fairly static. Even when I was a small boy we had a car, indoor plumbing, appliances, telephone, TV, modern medicine, and occasional trips in airplanes.

Michael Lind makes similar points in his Time magazine piece called "The Boring Age."

Here's Peter Thiel:

The question about what sorts of innovations we are likely to see in the next 10 or 20 years depends a great deal on what people do. The pessimistic view is that we are living in a society that depends on innovation and science and technology, but that is actually not focusing on these things nearly enough and that as a result, we are headed towards an extended period of stagnation and very slow growth throughout all the nations of the developed world.

The more optimistic view is that we somehow figure out a way to restart the innovative engine that's probably gotten stalled. And my version of this would be that we go back to where the '50's and '60's ended and look back at the great technologies people were pursuing at the time; space, robots, artificial intelligence, the next generation of biotechnology and sort of look at where people thought the future of the world was going to be in 1968 and we try to take off from where things got detoured at that time.

Here's a different 10-minute video of Peter Thiel in which he talks about the lack of innovation in the context of financial markets. There is an attitude that "someone else is doing it," but in fact no one is doing it. "There is a lot less going on than people think," he says.

So: Why is this happening?

Tyler Cowen once said, "If we had to build today's energy infrastructure working under the current regulatory and NIMBY burden, it probably could not be done." Can we extract from this a larger claim that a bloated government and burdensome regulatory environment are significantly dampening innovation? An ever-powerful bureaucratic class strangling creativity? Or is it that the government is not doing enough to fund basic research toward big innovation (as it did with DARPA and the space program of the past)? Are there cultural norms around conformity that cause too many people to be too deferential to the status quo? Are too many smart young people going to school? (In the past, boy-geniuses had more unconventional educations, which perhaps helped lead to extreme innovation.) What are other reasons?

One counter-argument to all of the above is that there is indeed accelerating change and new innovation, it's just that we don't yet see it. As David Dalrymple said to me in a tweet, "The exponential trend only applies directly to enabling technologies, not to technologically-enabled milestones like flight."

(thanks to Michael Vassar for helping brainstorm some of these ideas.)

The Age of Early Self-Conception

On Facebook the other day I viewed a profile of a "friend" who's in college and she typed this as her bio:

me? hmm. well, i'm a fighter. i'm a little crazy, but i'm passionate and i love hard when i do let myself love. when i'm upset, i need ice cream and to have my back rubbed. i'm restless by nature, and am happiest when i'm moving. i'm athletic but not a jock, musical but not a musician, and neither side of my brain seems to be dominant. sometimes i find comfort in words, and sometimes in numbers, but always in the smell of spring and my best friends. i'm bad with change, but get tired of staying the same. i'm contradicting and i think too much, but i'm told that it's cute. i believe in things that happen for a reason, and i hope that Vassar is one of them. i am extreme, i am loved. i am hopeful.

When reading her rather self-conscious, careful bio (though the lowercase letters and opening phrase "me?" attempt to signal the opposite), I was struck: When else have so many millions of people under age 22 been asked to write their "biography" for public consumption? When else have hundreds of millions of people been asked to (essentially) publicly list their interests, favorite quotations, religious views, and political views?

Imagine the tens of millions of 15-year-olds who go to set up their profile and see a big white text box that says "Bio." As the cursor blinks, they ask themselves, "What is my biography? What are my interests? What are my religious views? What is my relationship status? Am I sexually interested in men or women?"

Social network people say that the profile we look at the most is our own. We are very interested in how we present ourselves to the world. But perhaps more important, we are interested in trying to figure out ourselves. As younger and younger people set up profiles, they end up confronting some of the central angst-inducing identity questions early in life.

Insofar as this all prompts reflection on issues, I say 'tis a good thing. But there's also a risk of people too quickly pouring cement on their identity. A 15-year-old selects from a drop down menu "Liberal" and views his page a few times a day. What does that do to his willingness to evolve his mindset?

There should be a checkbox at the top of your profile labeled "Keep Your Identity Small" and it would keep the "bio" box open but disable the other drop-downs. There should be a drop-down option for "Uncertain" in each category.

(A hat tip is owed to somebody for talking to me about this, but I cannot remember who.)

Self-Centric vs. Reader-Centric Uses of Social Media

There are self-centric and reader-centric ways to use social media. “Self-centric” = an approach that serves you the author best, “Reader-centric” = an approach that serves your readers best. Some examples:

The frequency of blog posts: Self-centric bloggers blog whenever they feel the inspiration; the reader-centric blogger posts at traffic-maximizing levels (usually once a day).

Method for sharing links: Almost daily, Tyler Cowen posts “Assorted Links,” a series of interesting links in numbered form. Steve Silberman does the same on Twitter — he posts tons of interesting links. Tyler and Steve are being reader-centric — the links are published in an easily viewable, common format that readers enjoy. But it does almost nothing for Tyler and Steve: it is very hard to search through and access these links in the future. By contrast, I rarely do link dumps on my blog, and instead have categorized over 6,000 web pages on delicious. I am self-centric — I am storing the links in a bookmarking system that sorts by date and category and can be easily backed up and searched.

Content of blog posts and tweets: The self-centric writer posts whatever is on his mind, including the proverbial “what I had for breakfast” dispatch. The reader-centric writer thinks hard about what will be interesting to an external audience, and shapes it as a product for a customer. Self-centric blogs are more personal; reader-centric blogs tend to be about a specific topic.

Replying to tweets: Hundreds if not thousands of people have replied to me (@bencasnocha) on Twitter, but I rarely post replies of my own because I don’t find it an efficient conversational medium. (I do read all replies.) Also, I don’t want my main Twitter page to be polluted with random replies to random people. Compare my Twitter page to this popular twitterer’s. I’m being self-centric instead of reader-centric.

If you replace “self-centric” with “selfish” and “reader-centric” with “selfless” you can see how the old adage “it’s selfish to be selfless” applies in this case. Many times reader-centric uses of social media, by increasing total readership, become long-run self-centric.