youmightfindyourself:

Humans have a perplexing tendency to fear rare threats such as shark attacks while blithely ignoring far greater risks like unsafe sex and an unhealthy diet. Those illusions are not just silly—they make the world a more dangerous place.

We like to think that humans are supremely logical, making decisions on the basis of hard data and not on whim. For a good part of the 19th and 20th centuries, economists and social scientists assumed this was true too. The public, they believed, would make rational decisions if only it had the right pie chart or statistical table. But in the late 1960s and early 1970s, that vision of homo economicus—a person who acts in his or her best interest when given accurate information—was kneecapped by researchers investigating the emerging field of risk perception. What they found, and what they have continued teasing out since the early 1970s, is that humans have a hell of a time accurately gauging risk. Not only do we have two different systems—logic and instinct, or the head and the gut—that sometimes give us conflicting advice, but we are also at the mercy of deep-seated emotional associations and mental shortcuts.

Even if a risk has an objectively measurable probability—like the chances of dying in a fire, which are 1 in 1,177—people will assess the risk subjectively, mentally calibrating the risk based on dozens of subconscious calculations. If you have been watching news coverage of wildfires in Texas nonstop, chances are you will assess the risk of dying in a fire higher than will someone who has been floating in a pool all day. If the day is cold and snowy, you are less likely to think global warming is a threat.

Our hardwired gut reactions developed in a world full of hungry beasts and warring clans, where they served important functions. Letting the amygdala (part of the brain’s emotional core) take over at the first sign of danger, milliseconds before the neocortex (the thinking part of the brain) was aware a spear was headed for our chest, was probably a very useful adaptation. Even today those nano-pauses and gut responses save us from getting flattened by buses or dropping a brick on our toes. But in a world where risks are presented in parts-per-billion statistics or as clicks on a Geiger counter, our amygdala is out of its depth.

A risk-perception apparatus permanently tuned for avoiding mountain lions makes it unlikely that we will ever run screaming from a plate of fatty mac ’n’ cheese. “People are likely to react with little fear to certain types of objectively dangerous risk that evolution has not prepared them for, such as guns, hamburgers, automobiles, smoking, and unsafe sex, even when they recognize the threat at a cognitive level,” says Carnegie Mellon University researcher George Loewenstein, whose seminal 2001 paper, “Risk as Feelings,” debunked theories that decision making in the face of risk or uncertainty relies largely on reason. “Types of stimuli that people are evolutionarily prepared to fear, such as caged spiders, snakes, or heights, evoke a visceral response even when, at a cognitive level, they are recognized to be harmless,” he says. Even Charles Darwin failed to break the amygdala’s iron grip on risk perception. As an experiment, he placed his face up against the puff adder enclosure at the London Zoo and tried to keep himself from flinching when the snake struck the plate glass. He failed.

Read on.

10/10/11

youmightfindyourself:

Why do we feel nostalgia? And are infinite entertainment choices changing the way we look back?

Certain problems never disappear. Sometimes that’s because there’s no solution to whatever the problem is. But just as often, it’s because the problem isn’t problematic; the so-called “problem” is just an inexact, unresolved phenomenon two reasonable people can consistently disagree over. The “nostalgia problem” fits in this class: Every so often (like right now), people interested in culture become semifixated on a soft debate over the merits or dangers of nostalgia (as it applies to art, and particularly to pop music). The dispute resurfaces every time a new generation attains a social position that’s both dominant and insecure; I suppose if this ever stopped, we’d be nostalgic for the time when it still periodically mattered to people.

The highest-profile current example is the book Retromania: Pop Culture’s Addiction to Its Own Past, written by the British writer Simon Reynolds (almost certainly the smartest guy to ever earnestly think about Death in Vegas for more than 4½ minutes). Promoting his book on Slate, Reynolds casually mentioned two oral histories he saw as connected to the phenomenon (the grunge overview Everybody Loves Our Town and the ’80s-heavy I Want My MTV). Those passing mentions prompted writers from both books to politely reject the idea that these works were somehow reliant on the experience of nostalgia (nostalgia has a mostly negative literary connotation). But this is not the only example: The music writer for New York magazine wrote about this subject apolitically for Pitchfork, essentially noting the same thing I just reiterated — for whatever reason, this (semi-real) “nostalgia problem” suddenly appears to be something writers are collectively worried about at this (semi-random) moment. The net result is a bunch of people defending and bemoaning the impact of nostalgia in unpredictable ways; I suppose a few of these arguments intrigue me, but just barely. I’m much more interested in why people feel nostalgia, particularly when that feeling derives from things that don’t actually intersect with any personal experience they supposedly had. I don’t care if nostalgia is good or bad, because I don’t believe either of those words really applies.

But still — before a problem can be discarded, one needs to identify what that problem is. In my view, this dispute has three principal elements. None of them are new. The central reason most smart people (and certainly most critics) tend to disparage nostalgia is obvious: It’s an uncritical form of artistic appreciation. If you unconditionally love something from your own past, it might just mean you love that period of your own life. In other words, you’re not really hearing “Baby Got Back.” What you’re hearing is a song that reminds you of a time when you were happy, and you’ve unconsciously conflated that positive memory with any music connected to the recollection. You can’t separate the merit of a song from the time when you originally experienced it. [The counter to this argument would be that this seamless integration is arguably the most transcendent thing any piece of art can accomplish.] A secondary criticism boils down to self-serving insecurity; when we appreciate things from our past, we’re latently arguing that those things are still important — and if those things are important, we can pretend our own life is equally important, because those are the things that comprise our past. [The counterargument would be that personal history does matter, and that the size of one’s reality is the size of one’s memory.] A third criticism is that nostalgia is lazy, lifeless, and detrimental to creativity. [The counter to this would be that even those who hate nostalgia inevitably concede it feels good, and feeling good is probably the point.] There are other arguments that can be made here, but these are the main three; if you’re “pro-” or “anti-” nostalgia, a version of your central thesis inevitably falls somewhere in this paragraph. And in all three cases, both sides of the debate are built around that magical bridge between art and the experience of being alive. It’s always based on the premise that we are nostalgic for things that transport us back to an earlier draft of ourselves, and that this process of mental time travel is either wonderful or pathetic (because that’s certainly how it feels).

But what if this is just how we explain it? What if nostalgia has less to do with our own lives than we superficially assume?

What if the feeling we like to call “nostalgia” is simply the byproduct of accidental repetition?

Stare at a photograph of someone you dated long, long ago. The emotional reaction you’ll have (unless you’re weird or depressed or kind of terrible) is positive; even if this person broke your heart, you will effortlessly remember all the feelings you had that allowed your heart to be broken. This is real nostalgia: You are looking at something that actively reminds you of your past (and which exists solely for that purpose), and you’re reimagining the conditions and circumstances surrounding that image. But you’re probably not judging the quality of the photo. You probably don’t think, “You know, it’s impossible for me to tell if the composition and framing of this picture is professional, because I remember too much about the day it was taken.” You probably aren’t concerned with overrating the true inventive prowess of whoever snapped the photo. The picture is just a delivery device for the memory. This is why thinking about old music (or old films, or old books) is so much more complicated and unclear: It’s not just that we like the feeling that comes along with the song. We like the song itself. The song itself sounds good, even if we don’t spend a second thinking about our personal relationship to when we originally heard it. Yet we still place this sonic experience into the category of “nostalgic appreciation,” because that seems to make the most sense.

Except that it doesn’t.

It doesn’t make sense to assume any art we remember from the past is going to automatically improve when we experience it again, simply because it has a relationship to whatever our life used to be like. We may not even remember that particular period with any clarity or import. These things might be connected, but they might also be unrelated. Obviously, some songs do remind us of specific people and specific places (and if someone were to directly ask you “What songs make you nostalgic?,” these are the tracks you’d immediately list). But so many other old songs only replicate that sensation. The song connects you with nothing tangible, yet still seems warm and positive and extra-meaningful. It’s nostalgia without memory. And what this usually means is that you listened to that particular song a lot, during a stage in your life when you listened to a smaller number of songs with a much higher frequency. It might have nothing to do with whatever was happening alongside that listening experience; it might just be that you accidentally invested the amount of time necessary to appreciate the song to its fullest possible extent. What seems like “nostalgia” might be a form of low-grade expertise that amplifies the value of the listening event.

Here’s what I mean: For at least one year of my life, I had only six cassettes. One of these was Ozzy Osbourne’s Bark at the Moon, which (as an adult) I consider to be the third or fourth-best Ozzy solo album. But it’s definitely the Ozzy release I’ve listened to the most, purely because I only had five other tapes. It’s entirely possible I’ve listened to Bark at the Moon more than all the other Osbourne solo albums combined.

Now, the first song on side two of Bark at the Moon is titled “Centre of Eternity.” It’s a bit ponderous and a little too Ozzy-by-the-numbers. It means absolutely nothing to me personally and doesn’t make me long for the days of yore; until I started writing this essay, I hadn’t listened to it in at least 10 years. But as soon as I replayed it, it sounded great. Moreover, it was a weirdly complete listening experience — not only did I like the song as a whole, but I also noticed and remembered all the individual parts (the overwrought organ intro, how Jake E. Lee’s guitar was tuned, when the drums come in, the goofy sci-fi lyrics, etc.). There may be a finite amount one can “get” from this particular song, but — whatever that amount is — I got it all. And this is not because of any relationship I’ve created between “Centre of Eternity” and my life from the middle 1980s, most of which I don’t remember or even care about. It’s because the middle ’80s were a time when I might lie on my bed and listen to a random Ozzy song 365 times over the course of 12 months. It’s not an emotional experience. It’s a mechanical experience. I’m not altering the value of “Centre of Eternity” by making it signify something specific to me or my past; I’ve simply listened to it enough to have multiple auditory experiences simultaneously (and without even trying). The song sounds better than logic dictates because I (once) put in enough time to “get” everything it potentially offers. Maybe it’s not that we’re overrating our memories; maybe it’s that we’re underrating the import of prolonged exposure. Maybe things don’t become meaningful unless we’re willing to repeat our interaction with whatever that “thing” truly is.

And this, I think, is what makes our current “nostalgia problem” more multifaceted than the one we had 10 years ago. This process I just described? The idea of accidentally creating a false sense of nostalgia through inadvertent-yet-dogged repetition? That’s ending, and it’s not coming back.

In the year 2011, I don’t know why anyone would listen to any song every day for a year. Even if it was your favorite song, it would be difficult to justify. It would be like going to the New York Public Library every morning and only reading Jonathan Livingston Seagull. Music is now essentially free, so no one who loves music is limited by an inability to afford cassettes. Radio is less important than it used to be (which means songs can’t be regularly inflicted on audiences), MTV only shows videos when no one is watching, and Spotify is a game-changer. Equally important is the way modern pop music is recorded and produced: It’s consciously designed for digital immediacy. Listen to the first 90 seconds of Rihanna’s album Loud — if you don’t love it right away, you’re not going to love it a month from now. There’s also been a shift in how long a critic (professional or otherwise) can be expected to hear a product before judging its value. This is especially true for albums that are supposed to be important; most meaningful responses to Radiohead’s The King of Limbs and Kanye and Jay-Z’s Watch the Throne happened within 24 hours of their embargoed release. When someone now complains that a song is being “played to death,” it usually just means it’s been licensed to too many commercials and movie trailers.

Now, no one can irrefutably declare that this evolution is bad, good, or merely different; it seems like it will (probably) be negative for artists, positive for casual consumers, and neutral for serious music fans. But it’s absolutely going to change what we classify (rightly or wrongly) as “nostalgia.” It won’t eliminate it, but it will turn it into something totally unlike the way things are now.

Of course, if you hate nostalgia, this seems like good news. “Excellent,” you probably think. “Now I won’t have to listen to people trying to convince me that Pearl Jam’s No Code is awesome, based on the argument that they used to listen to Pearl Jam in high school.” From a practical standpoint, there’s no historical loss to the genocide of self-made nostalgia; the Internet will warehouse what people’s minds do not. (Since the Internet is a curator-based medium, it’s also a naturally backward-looking medium.) People won’t need to “remember” Pearl Jam in order for Pearl Jam to survive forever; in 500 years, we will still have a more complete, more accurate portrait of Eddie Vedder than of Mozart or John Philip Sousa or Chuck Berry, even if no one in America is still aware that a song titled “Jeremy” once existed. It’s uncomfortable to admit this, but technology has made the ability to remember things borderline irrelevant. Having a deep memory used to be a real competitive advantage, but now it’s like having the ability to multiply four-digit numbers in your head — impressive, but not essential.

Yet people will still want to remember stuff.

People enjoy remembering things, and particularly things that happened within their own lifetime. Remembering creates meaning. There are really only two stages in any existence — what we’re doing now, and what we were doing then. That’s why random songs played repeatedly take on a weight that outsizes their ostensible worth: We can unconsciously hear the time and thought we invested long ago. But no one really does this anymore. No one endlessly plays the same song out of necessity. So when this process stops happening — when there are no more weirdos listening to “Centre of Eternity” every day for a year, without even particularly liking it — what will replace that experience?

I suspect it will be replaced by the actions of other people.

Connectivity will replace repetition. Instead of generating false nostalgia by having the same experience over and over, we will aggregate false nostalgia from those fleeting moments when everyone seemed to be doing the same thing at once. It won’t be a kid playing the same song 1,000 times in a row; it will be that kid remembering when he and 999 other people all played the same song once (and immediately discussed it on Twitter, or on whatever replaces Twitter). It will be a short, shared experience that seems vast enough to be justifiably memorable. And I don’t know what that will feel like, and I don’t know if it will be better or worse. But I’m sure it will make some people miss the way things used to be.

10/04/11

youmightfindyourself:

White Walls, Consumerism: When my friend Joanne McNeil asked me about the origins of whiteness as a signifier of high-tech, I ignored any associations with traditional cultural meanings (death, life, good, bad, purity, what-have-you) and biology (teeth, rice, ivory) and instead challenged myself to look at technology as an historical artifact of modern science. It did not occur to me to look at whiteness as an artifact of consumerism until after the English riots were disparaged as “shopping riots,” just as I was experiencing the heady consumer high of my first new cell phone purchase in over five years - a white iPhone 4.

What Joanne was asking me about was the exterior whiteness of consumer electronics. This is very different from the interior whiteness of control rooms and linen undershirts; it is more akin to the exterior whiteness of the earliest experiments in Modernist architecture and the white dresses worn by nurses. The whiteness of my new iPhone is not an expression of my personal aspirations - there are millions of other people with the exact same phone. The whiteness is an expression of a public aspiration. To borrow a term from Jean-François Lyotard, the “narrative knowledge” that girds consumerism is not too different from the metanarratives that gird “scientific knowledge.”

Republican candidates may be more than happy to pander to a crowd that cheers the death of someone who can’t afford health insurance, but most Americans do not want to believe that their gain comes at someone else’s expense. The American dream, which is the “master narrative” of consumerism, was never a simple promise of personal progress, of “I get MINE, and to hell with you.” What fun would a jetpack be if you had the only one? It would be almost as worthless as owning the world’s only fax machine. If you are king, it might be cool to own the only one of something, but in a consumer society you would look like a bigger jerk than a Segway owner. I didn’t buy an iPhone because no one has one. I bought one because everyone I know has one. That means I have lots of people to tell me what the best apps are and to share the phone’s features with.

Americans might dream of having a good job that enables them to own a nice car and a charming house in a pleasant place, but an element of that dream is that they are not alone in that pleasure: that, along with millions of others, they are enjoying luxuries that once only a few kings might have hoped for. The dream was never to exclude the rest of the world from that wealth; it was explicit that billions more people would someday soon enjoy the justice, stability, mobility, and material pleasures of modern life. Millions of Americans fought and died for that dream in the twentieth century. In the 21st, they were convinced to fight an enemy that “hated their freedom.” It is not clear to me that they would have fought to keep their big cars and cheap fuel if that had been the justification given.

It was clear this summer that the super-rich consumer elite have abandoned the “we” aspect of the American Dream. It is not at all clear that consumer culture can survive without it. No matter what wonders scientific progress is able to deliver, they are meaningless if attached to a dream that no one buys. Whatever else high-tech whiteness is, it would be a truly tragic thing to be nostalgic for.

09/19/11

youmightfindyourself:

In 1996, the US computer entrepreneur Brewster Kahle set up the Internet Archive, its mission being to provide “universal access to all knowledge”. This admirable project strives to store copies of every single web page ever posted: a ghostly archive of the virtual. So what are we to make of the fact that, a decade and a half later, this digital pioneer is turning from bytes to books? In what seems, on the face of it, an act of splendid perversity, Kahle has set up a series of converted shipping containers in California where he hopes to create another archive – one that contains a copy of every book ever published.

His action touches on an anxiety. Are books, like defunct internet pages, heading towards the point where they will be archived as an academic curiosity? Some think so. You won’t find any shortage of people willing to pronounce the printed book doomed, arguing that the convenience and searchability of digital text and the emergence of a Kindle-first generation will render them obsolete.

Certainly, electronic books have overcome their technological obstacles. Page turns are fast enough, battery life is long enough, and screens are legible in sunlight. Digital sales now account for 14% of Penguin’s business. But there are reasons to reject the idea that the extinction of the printed book is just around the corner, just as there were reasons to reject the notion that e-books would never catch on because you couldn’t read them in the bath and, y’know, books are such lovely objects.

Personally, I’m still in the habit of paperbacks. Much of my professional life is spent reviewing, and I like to scribble on my books and bend the pages back. Plus I can’t be arsed figuring out how to get publishers to send e-versions. But habit is all it is. I’ve no hostility to digital. I’ve spent a good deal of time with the Kindle, and it does the trick.

In some ways, though, the question of whether we do our reading off paper or plastic is the least interesting one. More interesting is what we’re reading, and the manner in which we do so. A large number of literate westerners spend most of their waking hours at computers, and those computers are connected to the web. The characteristic activity on such a computer has been given the pleasing name “wilfing”, adapted from the acronym WWILF, or “What was I looking for?” You work a bit. You check if it’s your move in Facebook Scrabble. You get an email. You answer it. You get a text. You answer it. Since your phone’s in your hand, you play Angry Birds for five minutes. You work a bit. You go online to check something, get distracted by a link, forget what you were looking for, stumble on a picture of a duck that looks like Hitler, share it on Twitter, rinse and repeat.

Sci-fi author Cory Doctorow has called the internet “an ecosystem of interruption technologies”. TS Eliot’s line “distracted from distraction by distraction” seems apt. Zadie Smith, among other writers, has said that the key to the sustained attention required to create a novel is to work on a computer that isn’t online. You could call wilfing multitasking, or parallelistic cognitive layering – or you could call it cocking around on the web. Whatever, it’s fair to wonder what, if anything, it is doing to our heads.

There are two main schools of thought. One is that modern culture is making us cleverer. In Everything Bad Is Good for You, Steven Johnson observes that IQ scores in the west are rising, and argues that pop culture – from soaps to video games to the web – is responsible. In the other corner is Nicholas Carr, author of The Shallows: How the Internet Is Changing the Way We Think, Read and Remember. He thinks the web is making us more stupid. We surf the shallows in a state of permanent distraction, and concentrate on no single thing for long enough to engage properly with it. Since much of our mental energy is spent processing the medium, little is left for the message. Carr, then, is a descendant of Plato, who mistrusted writing because he thought people would stop bothering to know anything if it was all there in books.

What both seem to agree on, however, is that a defining characteristic of digital culture is that it divides the attention. That has become a fact about the texture of our lives. We experience anxiety, fragmentation, semiotic overload. It seems logical to conclude, in that case, that books – particularly fiction – will not just be read in such an environment; they will also seek to reflect it.

Already, there’s evidence of this. If it really were the case that our attention spans are shortening, you might expect to see a wholesale revival of interest in short stories, or even lyric poems, and a tendency for full-length books to shrink. But we’re not seeing that. Instead we’re seeing Wolf Hall, Fingersmith, The Crimson Petal and the White, The Corrections, Underworld, Infinite Jest, Tree of Smoke, and fat Stephen King after fat Stephen King.

These books may resemble 19th-century novels in size, but the pace of even the most traditional of them is faster. And in many – especially in Infinite Jest, a novel about addiction, entertainment, radioactive rodents and tennis – you can see a conscious attempt to engage with the phenomenon of information overload. You don’t need to write a novel in tweets to write a novel about the experience of living in the age of Twitter.

An app for The Waste Land

But we can expect other changes. Most commercial writing is shaped by the market, and the market is shaped by the formats that have become standard; and those have been shaped by issues of portability, how wide you can make the spine without it breaking, the sizes of printing presses, and so forth.

Most novels, for example, have 300-350 B-format (ie standard paperback) pages. Deviate from this format drastically and your novel won’t make it to the front tables of the bookshop. This means relatively few publishers deviate; and, in turn, the literary culture is shaped by that.

So, even if they now seem natural, the lengths and formats of books are but cultural accidents. If this all goes, there will be consequences for the shape, size and format of prose narrative. How far is up for debate. Fiction by mobile phone is still essentially at the gimmick stage in Europe. But in Japan, keitai shosetsu, or “cellphone novels”, are composed on mobile phones and posted to media-sharing sites before being published in hardback. As long ago as 2007, keitai shosetsu accounted for four out of the top five literary bestsellers in Japan.

That’s a bit ahead of us. But blogs-to-books is already a well-established pathway in the UK. Online serialisation, interactive narratives (AKA reinventing the Choose Your Own Adventure), even books written to order for subscribers (see unbound.co.uk) are starting to emerge. Experimentalists such as Kate Pullinger – who alongside her straight literary novels has masterminded various experiments in narrative, including the collaborative online novel Flight Paths – are likely to remain marginal, but those margins are getting bigger. The boundaries of the book – as the success of apps for The Waste Land and On the Road, giving access to edits, revisions and encyclopedic paraphernalia, demonstrates – are becoming much more plastic, much less fixed.

There’s nothing really all that new going on in kind, of course. Footnotes have always allowed you to nest one text in another; bibliographies have always embedded texts in a network of other texts; and the concordance (an index to every word in a work) looks very much like a search engine to me. But the difference in degree that hypertext (those endlessly distracting blue links) and electronic searches bring to those things can scarcely be overstated.

I remember hearing the story of the person who walked into a room and saw another man there, apparently transfixed by an object on a table. He appeared to the onlooker to be in some important way absent – there and not there, as if his soul had left his body. The onlooker, who had never seen someone reading a book before, concluded that he had been possessed.

That present-but-absent quality, the essential solitude of the experience of “escaping into a book”, may no longer be the over-riding experience. The mass availability of electronic texts – infinitely reproducible and available almost everywhere – makes communal reading possible way past the book-group level. The same copy of a text can be read, annotated, and potentially even edited by any number of people at the same time.

So what we can expect from books is what the internet has always given us. More. More of everything. But what of taking in continuous prose, in the form conventionally known as “reading”? One way or another, that’s here to stay. Now, if Brewster Kahle could only find me a comfy chair and some room in one of his shipping containers, I could get back on the job.

08/18/11