Mind Change
1
A Global Phenomenon
Let’s enter a world unimaginable even a few decades ago, one like no other in human history. It’s a two-dimensional world of only sight and sound, offering instant information, connected identity, and the opportunity for here-and-now experiences so vivid and mesmerizing that they can outcompete the dreary reality around us. It’s a world teeming with so many facts and opinions that there will never be enough time to evaluate and understand even the smallest fraction of them. For an increasing number of its inhabitants, this virtual world can seem more immediate and significant than the smelly, tasty, touchy 3-D counterpart: it’s a place of nagging anxiety or triumphant exhilaration as you are swept along in a social networking swirl of collective consciousness. It’s a parallel world where you can be on the move in the real world, yet always hooked into an alternative time and place.
The resulting transformation of how we might all be living very soon is a vitally important issue, perhaps even the most important issue of our time.1 Why? Because it may be that a daily existence revolving around smartphone, iPad, laptop, and Xbox is radically changing not just our everyday lifestyles but also our identities and even our inner thoughts in unprecedented ways.2 As a neuroscientist, I’m fascinated by the potential effects of a screen-oriented daily existence on how we think and what we feel, and I want to explore how that exquisitely adaptable organ, the brain, may now be reacting to this novel environment, recently dubbed the “digital wildfire.”3
In the developed world, there is now a one in three chance that children will live to 100 years of age.4 Thanks to the advances of biomedicine, we can anticipate longer and healthier lives; thanks to technology, we can foresee an existence increasingly freed from the daily domestic grind that characterized the lives of previous generations. Unlike so much of humanity in the past and still in many nightmare scenarios around the world, we take it as the norm and as our entitlement not to be hungry, cold, in pain, or in constant fear for our lives. Unsurprisingly, therefore, there are many in our society who are convinced that we’re doing just fine, that these digital technologies are not so much a raging wildfire as a welcoming hearth at the heart of our current lifestyles. Accordingly, various reassuring arguments are ready at hand to counter reservations and concerns that might otherwise be viewed as exaggerated, even hysterical.
One starting premise is that surely everyone has enough common sense to ensure that we don’t let the new cyberculture hijack daily life wholesale. Surely we are sensible and responsible enough to self-regulate how much time we spend online and to ensure that our children don’t become completely obsessed by the screen. But the argument that we are automatically rational beings does not stand the test of history: when has common sense ever automatically prevailed over easy, profitable, or enjoyable possibilities? Just look at the persistence of hundreds of millions worldwide who still spend money on a habit that caused a hundred million fatalities in the twentieth century and which, if present trends continue, promises up to one billion deaths in this century: smoking.5 Not much common sense at work there.
Then again, the reliability of human nature might work in our favor if only we could assume that our innate genetic makeup leads most of us to do the right thing, regardless of any corrupting external influences. Yet in itself, this idea immediately runs counter to the superlative adaptability of the human brain, which allows us to occupy more ecological niches than any other species on the planet. The Internet was initially created as a way for scientists to contact each other, and this invention spawned phenomena such as 4chan, a collection of message boards where people post images and short text comments, mostly anonymously and with no holds barred.6 This form of self-expression is a new niche to which we may adapt, with consequences as extreme as the medium itself. If it is the hallmark of our species to thrive wherever we find ourselves, then the digital technologies could bring out the worst in human nature rather than being rendered harmless by it.
Another way of dismissing out of hand concerns about the effects of digital technology is a kind of solipsistic stance in which the screen enthusiast proudly points to his or her own perfectly balanced existence, which combines the pleasures and advantages of cyberculture with life in three dimensions. Yet psychologists have been telling us for many years that such subjective introspection is an unreliable barometer of mental state.7 In any case, it should be obvious enough that just because a single individual may be able to achieve an ideal mix between the virtual and the real, it does not automatically mean that others are capable of exercising similar restraint and sound judgment. And even those individuals who think they’ve got everything just right will often admit in an unguarded moment that “It’s easy to waste a lot of time on Facebook,” that they are “addicted” to Twitter, or that, yes, they do find it hard to concentrate long enough to read a whole newspaper article. In the United Kingdom, the advent of i, an abbreviated version of the national quality paper The Independent, and the introduction on the BBC of the 90 Second News Update stand as testimony to the demands of an ever larger constituency of readers and viewers—not just the younger generation—who have a reduced attention span and are demanding print and broadcast media to match.
Another consolation is the conviction that the next generation will work out just fine, thanks to parents who take control and intervene where necessary. Sadly, this idea has already proved to be a nonstarter. For reasons we shall explore shortly, parents often complain that they cannot control what their offspring do online, and many already despair at their inability to pry their children away from the screen and back into a world of three dimensions.
Marc Prensky, an American technologist, coined the term “Digital Native” for someone defined by his or her perceived outlook and abilities, based on an automatic facility and familiarity with digital technologies.8 By contrast, “Digital Immigrants” are those of us who, according to Prensky, “have adopted many aspects of the technology, but just like those who learn another language later in life, retain an ‘accent’ because we still have one foot in the past.” It is unlikely that anyone reading these words will not have strong views as to which side of the divide he or she belongs on and whether the distinction is cause for unalloyed celebration or deep anxiety. Generally speaking, it corresponds to age, although Prensky himself did not pinpoint a specific line of demarcation. The date of birth of the Digital Native seems therefore to be uncertain: we could start as far back as the 1960s, when the term “computer” entered into common parlance, or as late as 1990, for by the time a young Digital Native born then could read and write, email (which entered mainstream use around 1993) would have become an inescapable part of life.
The important distinction is that Digital Natives know no way of life other than the culture of Internet, laptop, and mobile. They can be freed from the constraints of local mores and hierarchical authority and, as autonomous citizens of the world, will personalize screen-based activities and services while collaborating with, and contributing to, global social networks and information sources.
But a much gloomier portrait of the Digital Native is being painted by pundits such as the British-American author Andrew Keen:
MySpace and Facebook are creating a youth culture of digital narcissism; open-source knowledge sharing sites like Wikipedia are undermining the authority of teachers in the classroom; the YouTube generation are more interested in self-expression than in learning about the world; the cacophony of anonymous blogs and user-generated content is deafening today’s youth to the voices of informed experts.9
Then again, perhaps the Digital Native doesn’t actually exist after all. Neil Selwyn, of the Institute of Education in London, argues that the current generation is actually no different from preceding ones: young people are not hardwired to have unprecedented brains.10 Rather, many young people are using technology in a far more sporadic, passive, solitary, and, above all, unspectacular way than the hype of the blogosphere and zealous proponents of cyberculture might have us believe.
Irrespective of whether the digital age has spawned a new type of superbeing or just ordinary humans better adapted to screen life, suffice it to say that, for the moment, parents are most likely to be Digital Immigrants and their children Digital Natives. The former are still learning the enormous potential of these technologies in adulthood, while the latter have known nothing else. This cultural divide often makes it hard for parents to know how best to approach situations that they intuitively perceive to be a problem, such as seemingly excessive time spent on computer-based activities; meanwhile, children may feel misunderstood and impatient with views they regard as inappropriate and outdated for present-day life.
Although reports and surveys have focused largely on the next generation, the concerns I want to flag are not limited to the Digital Native alone. Far from it. But a generational divide has undoubtedly arisen from the vertiginous increase in the pace of ever smarter digital devices and applications. What will be the effects on each generation, and on the relationship between them?
In a 2011 report, Virtual Lives, researchers for the U.K. children’s charity Kidscape assessed the online activities of more than two thousand children between the ages of eleven and eighteen. Just under half of the children questioned said they behaved differently online compared to their normal lives, with many claiming it made them feel more powerful and confident. One explained: “It’s easier to be who you want to be, because nobody knows you and if you don’t like the situation you can just exit and it is over.” Another echoed this sentiment, noting: “You can say anything online. You can talk to people that you don’t normally speak to and you can edit your pictures so you look better. It is as if you are a completely different person.” These findings, the report argues, “suggest that children see cyberspace as detachable from the real world and as a place where they can explore parts of their behavior and personality that they possibly would not show in real life. They seem unable to understand that actions online can have repercussions in the real world.”11
The easy opportunity of alternative identity and the notion that actions don’t have consequences have never previously featured in a child’s development, and they are posing unprecedented questions as to what might be for the best. While the brain is indeed not hardwired to interface effectively with screen technologies, it has evolved to respond with exquisite sensitivity to external influences—to the environment it inhabits. And the digital environment is getting ever more pervasive at an ever younger age. Recently Fisher-Price introduced a potty-training seat complete with an iPad holder,12 presumably to complement an infant lifestyle where the recliner in which the baby may spend many hours is also dominated by a screen.13
This is why the question of the impact of digital technologies is so very important. Hardened captains of industry or slick entrepreneurs will often sidle up to me during the coffee break at corporate events and let their professional mask slip as they recount in despair the obsessional fixation of their teenage son or daughter with the computer. But these anxieties remain unchanneled and unfocused. Where can these troubled parents share their experiences with others on a wider platform and articulate them in a formal and cogent way? At the moment, nowhere. In the following pages, we’ll be looking at many studies on preteens as well as teenagers; unfortunately, there are far fewer studies on adults, perhaps because they are less cohesive and identifiable as a group than a volunteer student body or a captive classroom. But, in any event, it’s important to view the data not as a self-help guide for bringing up kids but rather as a pivotal factor in the bigger picture of society as a whole.