
I know no one's asking, but...
If you ask me, there are two problems with Americans today. One is that we don't read anymore. And this could pose a serious problem in the Digital Age, according to psychologist Maryanne Wolf, who fears that if we don't first learn to read, "we may not form a brain...accustomed to automatic inference and critical thinking before the immersion in the digital world." Consequently, Wolf believes we need to deliberately form a reading brain that can transfer all its inferential skills to the digital environment, where it can synthesize knowledge within an online context. Wolf goes on to quote Joseph Epstein, who said, "We are what we read," and argues that what our children read online is different from what was part of our own formation as readers. "Our brains have been trained to be critical minds that bring to bear everything that we know and everything that we've read.... Part of our formation was to probe and think deeply through deep reading. Our children may become great decoders of information, but they may not be set to think deeply and go below the surface, in part, because of the way that the presentation exists online."
All this talk of delving below the surface brings me to Americans' second problem: we're becoming tragically shallow. And that's a problem, if
this study is any indication, because our collective well-being may suffer for it. What gives me the impression that we're becoming shallow? Well, the numbers from the 1988 and 1998 Myers-Briggs Type Indicator (MBTI) surveys suggest that Americans increasingly prefer action over reflection; desire breadth over depth; collect copious acquaintances rather than cultivate fewer, more substantive relationships; are less able to recognize patterns; take facts at face value rather than integrate them into a wider context of abstract constructs; and make decisions based not on logic and reason but on what achieves the greatest balance among all stakeholders.
A comparison of the 1998 MBTI survey with its 1988 and 1991 predecessors shows Americans becoming less introverted (-3.0%) and more extroverted (+3.0%), less intuitive (-5.2%) and more sensing (+5.2%), less thinking (-12.7%) and more feeling (+12.7%), and less judging (-4.0%) and more perceiving (+4.0%). Specifically, we're increasingly acting before thinking; we're oriented more toward people and objects than toward concepts and ideas; we desire breadth over depth, even in our social interactions; we distrust intuition and abstraction and prefer concrete information; and we find meaning in the data itself rather than in how the data relates to patterns and theories. We're also increasingly likely to leave decisions open, and we make decisions less by following a given set of rules and more by empathizing with the situation and weighing the needs of everyone involved in order to achieve the greatest harmony among those affected.
Looking at the personality types associated with famous people, one sees that with this increase in extroverted, sensing, feeling, and perceiving (ESFP) types, we're gaining more entertainers like Michael Jackson, Elvis Presley, Bill Cosby, and John Travolta. We're also seeing more charismatic politicians, like Bill Clinton and Ronald Reagan, who energize their bases but repel their political opponents. We're fortunately seeing more selfless humanitarians like Mother Teresa, but we're also seeing increasing superficiality, with a surge in iconic beauties like Jacqueline Kennedy Onassis, Marilyn Monroe, and Brooke Shields.
With less introversion, intuition, thinking, and judging (INTJ), we're losing influential historical leaders like George Washington, Thomas Jefferson, Abraham Lincoln, Alexander the Great, and Julius Caesar. We're seeing fewer non-violent human rights activists like Gandhi, Martin Luther King Jr., and Nelson Mandela. We're witnessing fewer scientists and other great thinkers like Charles Darwin, Albert Einstein, Isaac Newton, and Plato, and fewer inventors and innovators like Thomas Edison and Walt Disney.
In short, we're seeing more entertainment, charisma, physical beauty, and political polarization, and less foundational leadership, scientific discovery, and innovation. Interestingly, we're seeing fewer inspirational leaders who espouse ideas that compel us to improve the world, which may matter less given the increase in humanitarians. But perhaps all this charity work would be unnecessary if the nation weren't becoming a place where problems repeat themselves because patterns go unrecognized; where solutions are reached in the most shallow and least analytical manner possible; and where societal ills go largely unresolved as political polarization worsens, innovation declines, and mindless entertainment placates superficial social butterflies who act before they think. Our only saving grace is that more decisions will be made through consensus-building--that is, if they're made at all, which is less likely with the more indecisive ESFP types.
So what's causing this shift in personalities? Some believe the preference for breadth over depth and for factoids over integrated knowledge is the hallmark of a postmodern society. But if so, what's driving this postmodern trend? Child psychologist Alison Gopnik believes it may be the increasing time spent in the education system, which arrests people's psychological development and keeps them in a state of expert fact-gathering and skillful memorization at the expense of integrating knowledge and making decisions.
Another perspective comes from Ray Bradbury, author of Fahrenheit 451 (published in 1953), who names television as the culprit, believing it causes Americans to lose interest in literature and to favor factoids over integrated information (e.g., knowing Napoleon's birthday without knowing his significance in history). If this line of reasoning holds, then the problem can only worsen with today's increase in both the variety and the frequency of media consumption. We're already seeing more extroverts directing their energies outward toward people (idolizing and modeling themselves after celebrities; social networking; etc.) and objects (iPhones, iPads, clothes, and other material items). Today's technology may further drive extroverts to be more action-oriented (participating in reality shows; posting their own YouTube videos; tweeting; and, yes, blogging); to desire breadth (Googling for factoids and multitasking); and to prefer more frequent social interactions (texting, IM chatting, and "friending" hundreds of casual acquaintances).
I'd love to see what the MBTI stats look like today. It would be interesting to see if the explosion of Web 2.0 technologies has yielded more shallow personality types. A recent
Frontline documentary called "
Digital Nation" may provide a glimpse into the Web's influence. One notable trend was the infantilization of adults and their detachment from reality as they increasingly retreat into virtual worlds (gaming, Second Life, etc.). The other--perhaps less surprising--trend was the alarming increase in multitasking among children and adolescents. As with most debates on multitasking, this documentary featured the usual baby-boomer technological cheerleaders glibly accusing their naysaying counterparts of being "old fogies" and technophobic Luddites.
As a Gen-X'er observing from the sidelines, I must say I share the concerns of those "Luddites." I've always had my qualms about multitasking because it encourages cursory attention to information, making it virtually impossible to go deeper and find meaning in the disparate sources of information. Sharing these concerns, author Nicholas Carr
fears that the Internet with all its distractions and interruptions may turn us into "scattered and superficial thinkers." He noted the studies of developmental psychologist Patricia Greenfield who concluded that while certain computer tasks may enhance visual-spatial intelligence, this may be at the expense of losing "higher-order cognitive processes," such as "abstract vocabulary, mindfulness, reflection, inductive problem solving, critical thinking, and imagination."
Carr argues that the multitasking Internet environment deprives us of the chance to devote the requisite amount of attention necessary for associating new pieces of information with previously acquired and established knowledge so that we can master complex concepts. According to Carr, with all the Internet's distractions, "our brains are unable to forge the strong and expansive neural connections that give depth and distinctiveness to our thinking. We become mere signal-processing units, quickly shepherding disjointed bits of information into and then out of short-term memory."
Carr goes on to explain how book reading protects against this, saying that "reading a long sequence of pages helps us develop a rare kind of mental discipline" that gives us the "capacity to engage in the quieter, attentive modes of thought that underpin contemplation, reflection and introspection." Technophile Clay Shirky
disagrees, countering that reading is "an unnatural act; [for] we are no more evolved to read books than we are to use computers." Indeed, Carr himself argues that our "innate bias is to be distracted. Our [evolutionary] predisposition is to be aware of as much of what's going on around us as possible" in order to reduce the likelihood of being attacked by a predator or overlooking an important food source. Deep reading trains us to sustain our attention on one task, which allows time for the integration of ideas and big-picture thinking.
But unlike Carr who urges that we continue teaching this discipline of deep, critical reading, Shirky argues that "literate societies become literate by investing extraordinary resources, every year, training children to read. Now it's our turn to figure out what response we need to shape our use of digital tools.... We are now witnessing the rapid stress of older institutions accompanied by the slow and fitful development of cultural alternatives. Just as required education was a response to print, using the Internet well will require new cultural institutions as well, not just new technologies." (Typical sanguine libertarian drivel BTW. Just look at the source of the article: WSJ, where blind faith in markets, technology, and being a lemming are par for the course.)
So is that why people like Shirky (gotta love that name) harbor such an aversion to funding education? Because we need alternatives that educate children in a way that caters to their technological sensibilities? Well, one thing's for sure: Shirky certainly believes we shouldn't fear the future but embrace it. For, as he says, "we are living through [an] explosion of publishing capability today, where digital media link over a billion people into the same network. This linking together in turn lets us tap our cognitive surplus, the trillion hours a year of free time the educated population of the planet has to spend doing things they care about. In the 20th century, the bulk of that time was spent watching television, but our cognitive surplus is so enormous that diverting even a tiny fraction of time from consumption to participation can create enormous positive effects."
Hmm... That "cognitive surplus" idea sounds suspiciously like the Singularity, which a recent NYT article defines as "a time, possibly just a couple decades from now, when a superior intelligence will dominate and life will take on an altered form" where "human beings and machines will so effortlessly and elegantly merge that poor health, the ravage of old age and even death itself will all be things of the past." So basically, it's a merging of humans and machines to create a superintelligence, or in Shirky's words, a "cognitive surplus," in which people throughout the world link their minds together through technology in a way that will improve the human condition. This is not as far-fetched as one might think if Carr's warning is correct and our brains are already being reduced to automatic "signal-processing units" that find meaning only in decoding the data itself rather than in how the data relates to patterns and theories. Heck, we're already meeting the computers halfway if that's the case!
Such automated thinking, however, may be our undoing if Daniel Pink's assertions are true. Pink, the author of A Whole New Mind, argued in a recent lecture that rules-based, routinized work that can be automated (tax accounting, processing uncontested divorce papers, etc.) is already being made obsolete by programs like TurboTax and 123DivorceMe.com, and it will only get worse with time. One way to protect our viability is to tap into our right-brain capabilities and develop qualities like inventiveness, empathy, and meaning in order to produce work that is difficult to automate. Left-brain thinking is analytical, sequential, and logical. The right brain, by contrast, processes everything at once, works with the context of the data (rather than the data in and of itself), and synthesizes (rather than merely analyzes) the data. Pink argued that machines replaced our muscles by taking over our manufacturing jobs during the last century. This century, machines will replace our brains--not our empathetic, creative, big-picture-thinking right side, but our rules-based, logical left side.
If those MBTI stats are correct, then our one ray of hope lies in our increasing ability to empathize. But when it comes to addressing the "creative" and "big-picture thinking" parts of the puzzle, we have a lot of work to do.
Carr would argue that deep reading could help develop many of the qualities Pink champions, including creativity. But as I said at the outset, Americans aren't reading anymore, so I guess a lot of us will be out of work if Pink's forecast is correct.
If children are to meet the needs of the 21st century workforce by developing creative minds that can integrate knowledge and engage in big-picture thinking, how are we to educate them? Should reading for deep, critical thinking still be stressed? I believe it should. But should we also transfer this deep-thinking approach from the print environment to the online realm? Most definitely! Maybe this is what Shirky meant when he said "using the Internet well will require new cultural institutions." If so, then what institutions, and how should they address the problem?
As yet, I've heard no real answers to this conundrum, and this lack of real solutions is where I feel the agenda truly lies. My fear is that there won't be any attempt to address this problem because they WANT us to be shallow, noncritical thinkers. (And this, by the way, is the true reason behind the declining support for education, libraries, and literacy programs.) They want our brains reduced to shallow information-processing centers, not only so that we won't question authority or possess the mental acumen for independence and self-determination, but also so that we'll--voluntarily or otherwise--meld our brainpower with that of computer technology. Why? Who knows? To fulfill some "Matrix" or
Brave New World type of prophecy? Population surveillance? Subjugation?
Universal mind control? You name it. But the glamorization of robots and cyborgs is undeniably omnipresent in pop culture from
Beyonce and
Christina Aguilera to autotune and the
Transformers franchise. We're clearly being instructed to embrace this inevitability, and it just might work if the claims in
this article and
this news segment are any indication.
Interestingly, both WSJ articles (Carr's and Shirky's) contain at least one occurrence of the word "Net" rather than "Internet," which piqued my interest. This abbreviated form is less common than the full form, so my increasingly paranoid mind couldn't help but read it as a deliberate choice of words. Nets entrap their intended victims, and I can't help but feel that the deliberate neglect of education and literacy needs, coupled with the concerted universalization of technology in the name of attaining the Singularity, is a trap. So I'll just wait it out on the sidelines with my fellow "cave-dwellers" and proceed with caution before adopting "utopian" technologies. And all those technophiles pointing their fingers and sneering at us Luddites can go ahead and continue their dismissive derision. I'll happily let them be the guinea pigs and join in much later, when the coast is indisputably clear, because my fear is that nothing but bad will ultimately come of all of this. But believe me when I say I hope more than anything that I'm wrong. Unfortunately, I don't think I am....