The New Evolution
Is technology providing the ladder to the next stage of human consciousness?
By Josh Rosenblatt, Fri., March 4, 2011
If you happen to be roaming the halls of the Austin Convention Center during South by Southwest, you'll be forgiven for wondering if all the constant exposure to technology on display doesn't indicate some kind of serious pathology. If needing to be engaged with a Blackberry or an iPhone or an iPad or an iPod or a Kindle or a laptop or an MP3 player or a digital camera (or all eight) at all times isn't just as unhealthy as needing a cigarette, a beer, a needle, or an orgasm (or all four). Doesn't any compulsive, pathological behavior (you might ask yourself) imply addiction?
There are some in the world of medicine and social science who would say "absolutely." For them, the argument is simple: How can you spend 10 hours straight playing online video games or 12 hours straight surfing the Web or all day updating your Facebook page or all night checking your e-mail and not be considered sick? How can it be healthy to need constant contact with anything? These advocates call the condition Internet addiction disorder.
There are others, though, who say there's no proof that all this seemingly compulsive behavior is actually doing us any real physical, psychological, or emotional harm, that Internet addiction is merely an anecdotal condition. University of Texas associate professor S. Craig Watkins, for one, told me that obsessive Internet use, though problematic, simply may not meet the criteria for an American Psychiatric Association definition of addiction. "Addiction is behavior that causes some type of injury – physical injury, physiological injury, economic injury," Watkins says. "The extent to which the Internet leads to those types of outcomes in a widespread and recognizable way, that I'm not convinced of yet."
While researching for his 2009 book, The Young and the Digital: What the Migration to Social Network Sites, Games, and Anytime, Anywhere Media Means for Our Future, Watkins says he looked closely at what he calls some young people's "problematic" relationships with the Internet and other media but determined that much of the talk about Internet addiction, and its sinister byproducts, may, in the end, be nothing but talk.
"What I see in terms of young people's use of the Internet and social media is that they're not – like some people worry they are – using social media as a substitute for interacting face-to-face with peers and acquaintances," Watkins says. "We don't see people using Facebook to the point where they have to drop out of school or where they're fired or put on probation at work because they can't function."
For Watkins, who has spent the last five years researching young people and their relationships to technology, the concern isn't really about addiction. He says the real challenge facing social scientists curious about the effects of technology on the brain – not to mention society at large, which is in a constant state of panic about what its youth are getting mixed up in and messed up by – is attention. Or rather, the lack of it.
"By constantly engaging with media, and often more than one piece of media at a time, are we cultivating behaviors and norms that make it difficult to maintain a certain amount of focus and attention on one central or primary task, like lectures or homework or our workplaces?" Watkins asks. "Are we cutting down on productivity and quality of work because we're losing the ability to pay attention?"
Here you have a generation of people, Watkins says, who have portioned out their attention to several different things simultaneously for most of their lives: writing an e-mail while watching TV while reading a tweet while driving a car. You'd have to be naive not to wonder if that kind of sensory-overload approach to information intake could lead to an inability to focus.
But isn't there a flip side to that coin? Aren't human beings who are raised from birth in an environment that demands handling multiple sources of information and interaction at once simply capable of different things – ways of learning or even ways of being – than those who aren't or weren't? Isn't there a trade-off between not being able to concentrate on one thing and being able to concentrate on several things at once, especially in a world that has rapidly become a swirling eddy of excess stimulation, multitasking, and rapid-fire interaction?
Could it be that all this seemingly obsessive, potentially addictive immersion in constant stimuli is actually making us smarter?
These questions make me think of an April 2005 article I read in The New York Times Magazine (one that, not for nothing, my editor was able to find for me with just a few simple, practiced clicks of the mouse) called "Watching TV Makes You Smarter" by Steven Johnson. In it, Johnson argues that today's television shows require far more cognitive engagement and effort from viewers than those made 20 years ago, that we have been trained and have trained ourselves to be capable of understanding shows with longer narrative threads, larger casts, and more involved dialogue than anyone would have thought possible when television first appeared.
"For decades," Johnson writes, "we've worked under the assumption that mass culture follows a path declining steadily toward lowest-common-denominator standards, presumably because the 'masses' want dumb, simple pleasures and big media companies try to give the masses what they want." In fact, the opposite is true: To appreciate shows like The Wire and Lost, audiences had to keep dozens of interwoven story threads in their heads. They had to be comfortable with a script that doesn't concern itself with easy answers, immediate resolution, or even clarity. They had to resign themselves to story arcs that could take entire seasons to unfold.
"Think of the cognitive benefits conventionally ascribed to reading: attention, patience, retention, the parsing of narrative threads," Johnson writes. "Over the last half-century, programming on TV has increased the demands it places on precisely these mental faculties."
And the human brain has responded to those demands by becoming different, and better, than it was before. More able to deal with layered storylines and less willing to suffer the glacial tedium of the single-thread narrative. The human brain has responded by evolving.
Johnson's claims are in stark contrast to those of Nicholas Carr, whose 2010 book The Shallows: What the Internet Is Doing to Our Brains has become the hub around which the pro-Internet/anti-Internet debate seems to turn, at least in intellectual circles. Carr worries that the Internet, along with its promise of immediate and overwhelming access to information, is stripping us of the ability to concentrate and think with any real depth. He fears the kind of rapid-fire sensory collecting we do on the Web is turning us into "chronic scatterbrains," unable to even read a book for more than a few minutes at a time.
In a recent online interview for the website Big Think, Carr said: "[W]hat the web seems to be doing, and a lot of the proponents of the web seem to be completely comfortable with, it's pushing us all in the direction of skimming and scanning and multitasking, and it's not encouraging us or even giving us an opportunity to engage in more attentive ways of thinking. And so, I think losing those abilities may – we may end up finding that those are actually the most valuable ways of thinking that are available to us as human beings."
But where Carr sees intellectual and mental erosion, others see the ability to build and understand entirely new landscapes. In Johnson's television world, this new thinking is necessary to comprehend what he calls "multi-threaded dramas." Watkins speaks about "multitasking" and "multiplatforms" and "multimedia." What they can all agree on is that this brave new world is a hydra. What we choose to do with it and how we are able to adapt to it will make the difference between sinking and swimming, between addiction and evolution.
So, is it possible that words like "pathology" and "addiction" are just pejoratives an older generation uses to describe an evolution the younger generation is going through? Is it possible that 50 years from now our success as individuals will depend on our ability to sit for 10 hours a day engaging with multiplatform social media and video games? Is today's addiction disorder tomorrow's road to success?
To put it in evolutionary terms, there might be an argument that we have reached a point where technology outpaces evolution – that the things a young person deals with daily are physically changing the brain and forcing it to do things brains 50 years ago would never have been able to do.
It's a common idea in media studies. Neurobiologists and social scientists and new-media theorists are always stumbling upon new evidence they say proves that the Internet and video games and cell phones are changing the actual physical structure of the brain, creating new neural pathways that make us capable of handling faster, more all-consuming virtual interaction. Carr sees this and wonders what all this neural sleight of hand is doing to our ability to contemplate the world with anything approaching nuance.
"You may grow new neurons that are then recruited into these circuits, or your existing neurons may grow new synaptical terminals. And again, that also serves to strengthen the activity in those, in those particular pathways that are being used – new pathways," Carr said. "On the other hand, you know, the brain likes to be efficient, and so even as it's strengthening the pathways you're exercising ... it's weakening the connections in other ways between the cells that supported old ways of thinking or working or behaving, or whatever that you're not exercising so much. ... As we adapt to that information environment, so to speak, we gain certain skills, but we lose other ones."
These "other ones" – "contemplativeness, reflection, introspection ... solitary ways of thinking" – are the means by which Carr believes we build "a rich intellectual life."
Watkins, meanwhile, takes a slightly less dire approach to the discussion, choosing to see the physiological effects the Internet has on the brain not as degradation but as an evolutionary step forward. "We are steadily evolving into a unique and distinct type of species, based in part on the ways in which we are using technology," he says. "And our excessive, very intense, very robust engagement with technology will inevitably lead to a different brain, different circuitry system, different functions."
Somewhere in the middle of this debate sits Daniel Hope, an Austin-based counselor and life coach who focuses on the effects social media have on our relationships. He is hosting an Interactive panel this year, Why Everything Is Amazing but Nobody Is Happy, to try to decipher whether social media is throwing off the balance between technological curiosity and spiritual meaning.
"Social networking has such a powerful emotional effect on people that I believe there are ways that can be used in a positive way just as easily as it is abused or could misdirect us away from important things like relationships," he says. "The rules of life are simple: They come down to gratitude, economy, commitment, and if we're not pursuing them and instead we're pursuing the next iPad or the next touch screen or the next video game – if that's where we're getting our sense of meaning from – then we're going to be miserable, by definition."
So where does Hope fall on this question of technology and the development of the species? Is it providing the ladder to the next stage of human consciousness or the quicksand for us to sink into emotionally, spiritually, even morally? To quote Hope, is social media "ushering us into the next stage of human evolution or is it just making monkeys out of us?"
Here, the life coach has no firm answer. "The picture that comes to mind is someone hunched over a computer where their eyesight is failing, their muscles have atrophied because they're not going out and doing things. That does not seem like the model of an evolutionary upward trajectory," he says. "But at the same time, I think that we find ourselves right now in the most overwhelming and most stimulating environment in the history of human existence, and I think that we're learning right now how to live with all of these stimulants in our lives, all of this constant stimulation."
Maybe there is such a thing as Internet addiction disorder, but if there is, it might actually be the necessary condition for our next great leap forward. Maybe to become that "unique and distinct type of species" Watkins described, we all need to become media addicts. There's a good chance we are all already there. Surely all 20,000 of this year's Interactive attendees could be diagnosed with Internet addiction. As could you, I'm guessing. As could anyone who spends more than a few minutes a day staring at a computer screen or talking on a smart phone. God knows I could.
But diagnoses lose meaning when they're handed out to everybody. And at some point, doesn't pathological behavior become simply behavior? And doesn't addiction become simply ... the way we are now?
Related Panels
Why Everything Is Amazing but Nobody Is Happy
Friday, March 11, 3:30pm, ACC 5ABC
Can the Internet Make Us Happy?
Sunday, March 13, 9:30am, Hyatt, TX Ballroom 5-7
I'm So Productive, I Never Get Anything Done
Monday, March 14, 11am, ACC 10AB
Felicia Day: Monday Keynote
Monday, March 14, 2pm, ACC Ballroom D