The Dumbing Of America
Call Me a Snob, but Really, We’re a Nation of Dunces
By Susan Jacoby
Sunday, February 17, 2008; Page B01
The Washington Post
“The mind of this country, taught to aim at low objects, eats upon itself.” Ralph Waldo Emerson offered that observation in 1837, but his words echo with painful prescience in today’s very different United States. Americans are in serious intellectual trouble — in danger of losing our hard-won cultural capital to a virulent mixture of anti-intellectualism, anti-rationalism and low expectations.
This is the last subject that any candidate would dare raise on the long and winding road to the White House. It is almost impossible to talk about the manner in which public ignorance contributes to grave national problems without being labeled an “elitist,” one of the most powerful pejoratives that can be applied to anyone aspiring to high office. Instead, our politicians repeatedly assure Americans that they are just “folks,” a patronizing term that you will search for in vain in important presidential speeches before 1980. (Just imagine: “We here highly resolve that these dead shall not have died in vain . . . and that government of the folks, by the folks, for the folks, shall not perish from the earth.”) Such exaltations of ordinariness are among the distinguishing traits of anti-intellectualism in any era.
The classic work on this subject by Columbia University historian Richard Hofstadter, “Anti-Intellectualism in American Life,” was published in early 1963, between the anti-communist crusades of the McCarthy era and the social convulsions of the late 1960s. Hofstadter saw American anti-intellectualism as a basically cyclical phenomenon that often manifested itself as the dark side of the country’s democratic impulses in religion and education. But today’s brand of anti-intellectualism is less a cycle than a flood. If Hofstadter (who died of leukemia in 1970 at age 54) had lived long enough to write a modern-day sequel, he would have found that our era of 24/7 infotainment has outstripped his most apocalyptic predictions about the future of American culture.
Dumbness, to paraphrase the late senator Daniel Patrick Moynihan, has been steadily defined downward for several decades, by a combination of heretofore irresistible forces. These include the triumph of video culture over print culture (and by video, I mean every form of digital media, as well as older electronic ones); a disjunction between Americans’ rising level of formal education and their shaky grasp of basic geography, science and history; and the fusion of anti-rationalism with anti-intellectualism.
First and foremost among the vectors of the new anti-intellectualism is video. The decline of book, newspaper and magazine reading is by now an old story. The drop-off is most pronounced among the young, but it continues to accelerate and afflict Americans of all ages and education levels.
Reading has declined not only among the poorly educated, according to a report last year by the National Endowment for the Arts. In 1982, 82 percent of college graduates read novels or poems for pleasure; two decades later, only 67 percent did. And more than 40 percent of Americans under 44 did not read a single book — fiction or nonfiction — over the course of a year. The proportion of 17-year-olds who read nothing (unless required to do so for school) more than doubled between 1984 and 2004. This time period, of course, encompasses the rise of personal computers, Web surfing and video games.
Does all this matter? Technophiles pooh-pooh jeremiads about the end of print culture as the navel-gazing of (what else?) elitists. In his book “Everything Bad Is Good for You: How Today’s Popular Culture Is Actually Making Us Smarter,” the science writer Steven Johnson assures us that we have nothing to worry about. Sure, parents may see their “vibrant and active children gazing silently, mouths agape, at the screen.” But these zombie-like characteristics “are not signs of mental atrophy. They’re signs of focus.” Balderdash. The real question is what toddlers are screening out, not what they are focusing on, while they sit mesmerized by videos they have seen dozens of times.
Despite an aggressive marketing campaign aimed at encouraging babies as young as 6 months to watch videos, there is no evidence that focusing on a screen is anything but bad for infants and toddlers. In a study released last August, University of Washington researchers found that babies between 8 and 16 months recognized an average of six to eight fewer words for every hour spent watching videos.
I cannot prove that reading for hours in a treehouse (which is what I was doing when I was 13) creates more informed citizens than hammering away at a Microsoft Xbox or obsessing about Facebook profiles. But the inability to concentrate for long periods of time — as distinct from brief reading hits for information on the Web — seems to me intimately related to the inability of the public to remember even recent news events. It is not surprising, for example, that less has been heard from the presidential candidates about the Iraq war in the later stages of the primary campaign than in the earlier ones, simply because there have been fewer video reports of violence in Iraq. Candidates, like voters, emphasize the latest news, not necessarily the most important news.
No wonder negative political ads work. “With text, it is even easy to keep track of differing levels of authority behind different pieces of information,” the cultural critic Caleb Crain noted recently in the New Yorker. “A comparison of two video reports, on the other hand, is cumbersome. Forced to choose between conflicting stories on television, the viewer falls back on hunches, or on what he believed before he started watching.”
As video consumers become progressively more impatient with the process of acquiring information through written language, all politicians find themselves under great pressure to deliver their messages as quickly as possible — and quickness today is much quicker than it used to be. Harvard University’s Kiku Adatto found that between 1968 and 1988, the average sound bite on the news for a presidential candidate — featuring the candidate’s own voice — dropped from 42.3 seconds to 9.8 seconds. By 2000, according to another Harvard study, the daily candidate bite was down to just 7.8 seconds.
The shrinking public attention span fostered by video is closely tied to the second important anti-intellectual force in American culture: the erosion of general knowledge.
People accustomed to hearing their president explain complicated policy choices by snapping “I’m the decider” may find it almost impossible to imagine the pains that Franklin D. Roosevelt took, in the grim months after Pearl Harbor, to explain why U.S. armed forces were suffering one defeat after another in the Pacific. In February 1942, Roosevelt urged Americans to spread out a map during his radio “fireside chat” so that they might better understand the geography of battle. In stores throughout the country, maps sold out; about 80 percent of American adults tuned in to hear the president. FDR had told his speechwriters that he was certain that if Americans understood the immensity of the distances over which supplies had to travel to the armed forces, “they can take any kind of bad news right on the chin.”
This is a portrait not only of a different presidency and president but also of a different country and citizenry, one that lacked access to satellite-enhanced Google maps but was far more receptive to learning and complexity than today’s public. According to a 2006 survey by National Geographic-Roper, nearly half of Americans between ages 18 and 24 do not think it necessary to know the location of other countries in which important news is being made. More than a third consider it “not at all important” to know a foreign language, and only 14 percent consider it “very important.”
That leads us to the third and final factor behind the new American dumbness: not lack of knowledge per se but arrogance about that lack of knowledge. The problem is not just the things we do not know (consider the one in five American adults who, according to the National Science Foundation, thinks the sun revolves around the Earth); it’s the alarming number of Americans who have smugly concluded that they do not need to know such things in the first place. Call this anti-rationalism — a syndrome that is particularly dangerous to our public institutions and discourse. Not knowing a foreign language or the location of an important country is a manifestation of ignorance; denying that such knowledge matters is pure anti-rationalism. The toxic brew of anti-rationalism and ignorance hurts discussions of U.S. public policy on topics from health care to taxation.
There is no quick cure for this epidemic of arrogant anti-rationalism and anti-intellectualism; rote efforts to raise standardized test scores by stuffing students with specific answers to specific questions on specific tests will not do the job. Moreover, the people who exemplify the problem are usually oblivious to it. (“Hardly anyone believes himself to be against thought and culture,” Hofstadter noted.) It is past time for a serious national discussion about whether, as a nation, we truly value intellect and rationality. If this indeed turns out to be a “change election,” the low level of discourse in a country with a mind taught to aim at low objects ought to be the first item on the change agenda.
Susan Jacoby’s latest book is “The Age of American Unreason.”
Source: http://www.washingtonpost.com/wp-dyn/content/article/2008/02/15/AR2008021502901.html