In the midst of a recent conversation with novelist Marilynne Robinson for The New York Review of Books, President Obama offered his thoughts on the factors driving American political polarization:
Part of the challenge is — and I see this in our politics — a common conversation. It’s not so much, I think, that people don’t read at all; it’s that everybody is reading [in] their niche, and so often, at least in the media, they’re reading stuff that reinforces their existing point of view. And so you don’t have that phenomenon of here’s a set of great books that everybody is familiar with and everybody is talking about.[i]
The President was not necessarily talking about Melville or Plato — in the interview, he saved his highest praise for Robinson’s own Gilead, published in 2004. But the notion that the knowledge base of American citizens has a direct effect on the health of the nation’s political discourse has been around since the founding. In 1987, University of Virginia Professor E.D. Hirsch, Jr. sought to align himself with this tradition in publishing Cultural Literacy: What Every American Needs to Know, in which he argued that this “common conversation” requires a core body of cultural knowledge — a “literacy” in the essential terms, ideas, and topics from American cultural history. With this provocative work, Hirsch offered a vision for a new American knowledge base, ignited an impassioned educational debate, and provided the germ of an idea for the “What Every American Should Know” project undertaken by the Citizenship and American Identity Program at the Aspen Institute.[ii]
What follows is a historical overview of efforts to define the shape of American language, knowledge, and culture leading up to Hirsch’s 1987 work. It focuses on three historical moments: the drive to delineate American language and ensure an informed citizenry in the early republic through the 19th century; the construction of American identity inherent in Americanization programs at the start of the 20th century; and the late-20th-century culture and canon wars and conservative education reform efforts that surrounded Cultural Literacy’s publication. Throughout, two central points take shape: first, as President Obama suggested, the question of what Americans should know has remained crucial to notions of what it means to be an effective American citizen; and second, that the reverse has also been true: the definition and transmission of cultural knowledge has historically been an important facet of efforts to define who may and may not be called truly American.
As lore has it, a prescient exchange occurred between Benjamin Franklin and an unnamed woman upon Franklin’s departure from Independence Hall on the final day of the 1787 Constitutional Convention. The concerned onlooker asked, “Well, Doctor, what have we got — a republic or a monarchy?” “A republic,” Franklin replied, “if you can keep it.”[iii] While it may be apocryphal, American authors have frequently and liberally recounted this vignette as proof of the special responsibility the founding generation placed upon the American body politic for the maintenance of our democracy. The story reveals a question that has been central to public discourse for more than two centuries: what constitutes an effective citizenry? Put individually, what must a person know and do to be an effective American citizen? As this brief historical sketch will show, while the particular theories and motivations informing the conversation around civic and cultural literacy in the United States have shifted since the end of the 18th century, this essentially functional consideration has remained paramount.
In the first years of the republic, American thinkers including Thomas Jefferson, Benjamin Rush, and Noah Webster posited that the survival of a robust and functional democracy would require a certain degree of civic literacy among its citizens. Jefferson, for his part, argued that the state should take a role in ensuring the liberal education of its citizens as a bulwark against tyranny: “Every government degenerates when trusted to the rulers of the people alone. The people themselves therefore are its only safe depositories. And to render even them safe their minds must be improved to a certain degree.”[iv] State overreach was not the only perceived danger. The factionalism feared by many framers of the new American republic and explored most famously by James Madison in The Federalist Papers could be countered by state education creating “republican machines,” as Benjamin Rush put it. George Washington similarly expressed hope that a national system of education might counter state and regional prejudices.[v]
Author and educator Noah Webster understood early on that education in a common language (broadly defined) could help to ensure a sense of cultural unity among inhabitants of the new nation. In the preface to the 1793 edition of his popular “blue back” spelling book, Webster wrote: “To diffuse an uniformity and purity of language in America — to destroy the provincial prejudices that originate in the trifling differences of dialect, and produce reciprocal ridicule — to promote the interest of literature and the harmony of the United States — is the most ardent wish of the Author.”[vi] His work was immediately and enduringly popular, selling millions of copies over the following decades.[vii] Webster soon supplemented his spelling primer with several editions of his textbook Elements of Useful Knowledge (1802-1812), covering topics from major political debates within the American Revolution to “Vegetable Productions,” all in an effort to ensure the general literacy of American schoolchildren. Taken together with other widely popular schoolbooks such as William McGuffey’s famous Eclectic Readers (1836-37, 1879), Webster’s ideals of a unified American literacy had a formative influence on the education of Americans through the 1800s and even into the 20th century.[viii]
Nonetheless, Webster faced considerable criticism in the popular press for his efforts to codify and promote an authentically American cultural language — objections that likewise have their echoes in contemporary debates. The criticism came on two fronts: first, that Webster was wrong to insist that American language should be distinct from that of other nations (England, in this case); and second, that he showed insufficient sensitivity to regional and cultural differences within American English. The perceived stakes were high. Webster anticipated the thoroughly modern notion that nations are created linguistically (for which he has been credited by historians including Christopher Looby and Jill Lepore).[ix] This was closely related to the idea that motivated Webster’s most vehement detractors: that his alterations of the language could actually be damaging to their image of the ideal American. As historian Tim Cassedy explains, “Webster’s uniquely incendiary significance lay not in his partisan [conservative, Federalist] or class positions but in the fact that he powerfully asserted the possibility and even inevitability of defining American personhood by defining the language that Americans spoke.”[x]
It’s worth pausing here to reiterate that while the objections leveled at Webster’s work may sound trivial to our ears (e.g. “Pray, sir, do not the Americans speak the English language? Why, then, should we suffer such a barbarism as wharves, when we are sure that the proper English word to be used instead of it is wharfs?”), the argument had less to do with linguistic constructions per se and more to do with the foundations of American culture and — here lies the anxiety — its respectability and perceived civility on the world stage.[xi] This introduced, in other words, a further consideration beyond the original problem of how to go about creating effective citizens. Eighteenth-century commentators realized very quickly that they were likewise at a point of deciding how an American might look and sound, especially vis-à-vis dominant Western European (especially English) cultural norms.
In the meantime, throughout the 19th century, American students at every level interacted with a remarkably unified body of sources. For school-aged children, Presbyterian minister and educational lecturer William McGuffey’s Eclectic Readers series was the most common schoolbook, selling as many as 120 million copies during its many print runs (and enjoying roughly the same numbers as the Bible during that time period). McGuffey chose stories and literary selections emphasizing the moral virtues he thought most necessary to a nation in a state of constant growth and discord: “industry, thrift, sobriety, perseverance…honesty and respect for the property of others,” as well as patriotism, reverence for the Founding Fathers and national symbols, and an abiding faith in the virtue of national unity. Rhymes and stories that McGuffey invented or popularized remain to this day a part of our national culture: “Mary Had a Little Lamb,” “Twinkle, Twinkle, Little Star,” and “The Boy Who Cried Wolf.”[xii] The few Americans who continued on to college could expect a canonical education in the oldest sense, based upon a regimented set of classical texts and drawing on a tradition which had changed slowly and little since the 17th century. At Harvard, the study of ancient Greek remained compulsory until the curricular reforms of President Charles W. Eliot, whose tenure began in 1869.[xiii]
E.D. Hirsch, in his argument for a content-based reappraisal of American school systems, pointed to this sense of alignment between public education and national purpose in the early United States as an intrinsically nobler pursuit than mere skills-based teaching. “The reason that our eighteenth-century founders and their nineteenth-century successors believed schools were crucial to the American future was not only that the schools would make students technically competent,” wrote Hirsch. “That aim was important, but their main worry was whether the Republic would survive at all.” In Hirsch’s view, 19th-century proponents of the common school (who worked to institute a national system of public schools) were the true heirs of the Founding Fathers, arguing the importance of a unified, purposeful curriculum — in effect, a basic cultural literacy — in preparing new citizens for the work of maintaining a stable republic.[xiv]
However, as we have seen, the cultural and linguistic vision of the United States as expressed by Webster and McGuffey was distinguished by a strict adherence to Anglo-Saxon, Protestant norms — to which, it was assumed, other groups would assimilate thanks to a unified and robust system of public schools as envisioned by Horace Mann and other educational reformers. We should note that a wide plurality of the United States’ population was summarily excluded from this cultural ideal at the point of its inception; indeed, even at the time of the 1790 census, reportedly only 49% of Americans were of English heritage. From the beginning, the English-speaking, Anglo-Saxon, Protestant “default” which has held a position of prominence in American culture has in fact referred to a subset of the population rather than the majority — a subset which has historically deployed its rhetorical power in the service of maintaining a position of cultural and political power.[xv] This rather basic point will shortly become more salient in light of 20th century attempts to hearken back to a singular, unified, and purely “American” cultural vocabulary.
This essential question — what must a person know to be an effective American citizen — tempered by the additional concern of how this person should ideally look, sound, and act, arose with new force when the United States experienced an unprecedented boom in immigration around the turn of the 20th century. State and federal governments, along with a long slate of Progressive associations and businesses, established programs for “Americanization,” with goals ranging from the smooth incorporation of immigrant groups into a polyglot American culture at the most liberal end to the complete assimilation of immigrants to an assumed Anglo-American and Protestant cultural center at the most interventionist end. The issue gained additional urgency as domestic and international events helped to stir nativist rhetoric in the American public sphere. A host of developments contributed to a distinct sense that challenges to the American republic would be external as well as internal: the growth of urban slums; American military action in Central America and the Pacific; violent labor disputes such as the Haymarket Riot of 1886; the assassination of President McKinley by a professed anarchist in 1901; the rise of Bolshevism and the Russian Revolution in 1917; and (most severely) World War I.[xvi] American cultural identity gained a negative definition: what an American should not be; and the shaping of new American citizens (both children and immigrants) toward a politically determined cultural ideal took on a marked air of defensiveness.[xvii]
As historians have come to emphasize in recent decades, the ideal of immigrant assimilation to an Anglo-derived, American cultural center obscured a fundamental myth: there was no such unity in American culture. “The concept of an unchanging, monolithic, Anglo-American cultural core is dead,” wrote Russell Kazal in 1995.[xviii] This, in itself, is unsurprising given how that cultural vision came about: native-born Americans of German or Scandinavian heritage, or those who were Catholic, for example, might well see less of themselves than others did in a national narrative that held them beholden to a small group of Yankee forebears. But it does cast a particular light over efforts to define what new immigrants should learn about the culture and history of the United States. Americanizing reformers were shaping a new vision of themselves in the process of trying to reshape American immigrants.
In the first decades of the 20th century, a wealth of publications by and for private associations, educators, and public officials sought to address the “problem” of how to Americanize the immigrant. Many offered tips on what, after the English language, immigrants should be taught about the history, literature, and customs of their host country. John C. Almack wrote Americanization: A Suggested Outline for the Teachers of Aliens with a special eye toward teachers in his native Oregon in 1919. Instilling a sense of nationalism, wrote Almack, should be a paramount concern in teaching immigrants, as it should implicitly be the concern of all Americans just coming out of wartime:
Any course in history and civics would be incomplete without a direct effort to instill a love of our country into the hearts of the coming Americans. Patriotism is one of the noblest virtues, and the pupils should thoroughly understand its meaning. Our national songs should be learned and sung as opening exercises. There is a place for these, also, for patriotic poems, and for speeches, stories, and dramatization from history, in special programs on our national holidays…In an informal way the teacher can show how it is a patriotic duty to obey the laws, to buy government bonds, and war savings stamps, to be industrious and thrifty.[xix]
Alongside geography and local, state, and national civics, Almack emphasized that in teaching history, “Many vivid instances of individual heroism and of group loyalty and patriotism should be given.”[xx] Again, Almack seemed to be emphasizing the patriotic duties of Oregon’s teachers as much as he was outlining a course for its immigrants.
President Woodrow Wilson was explicit in this theme of Americanization as the reshaping (or renewal) of American identity when he addressed the first national Citizenship Convention on July 13, 1916:
So my interest in this movement is as much an interest in ourselves as in those whom we are trying to Americanize, because if we are genuine Americans they cannot avoid the infection; whereas, if we are not genuine Americans, there will be nothing to infect them with, and no amount of teaching, no amount of exposition of the Constitution,—which I find very few persons understand,—no amount of dwelling upon the idea of liberty and of justice will accomplish the object we have in view, unless we ourselves illustrate the idea of justice and of liberty…This process of Americanization is going to be a process of self-examination, a process of purification, a process of rededication to the things which America represents and is proud to represent.[xxi]
As has been emphasized in recent public debates, Wilson was both a careful speaker and often unabashed in his racism, so it’s safe to assume that the racial overtones above were intended.[xxii] That makes it all the more interesting that he chose to close this rhetorical passage on what America represents. It’s not an unusual formulation, but representation is outward-facing (here, to incoming immigrants and to the world at large). Just as we saw with the controversy over Noah Webster’s formulation of American English, there is once again an important sense that the way American language and culture is taught to new citizens reflects back onto that same culture. The president’s vision of purified, “genuine Americans” was a goal rather than a reflection of the present — an effort to shore up a racially and ideologically homogeneous American identity in the face of internal and external strife.
A few days earlier at the same conference, Naturalization Commissioner Richard K. Campbell suggested that internal strife in the United States stemmed from foreign influence: “Every person within the reach of my voice knows that so far as we have had trouble in this country — industrial trouble, social trouble, class trouble — it is because those who have risen to the top are determined to follow the Old World example and selfishly utilize the ones that are at the bottom.” Campbell continued: “So the thing to teach these foreigners, and to teach our American children…is a devotion to the American principles.”[xxiii] The essential point is that while Campbell and Wilson presented “American principles” and “the things which America represents” as tacitly neutral — a sort of trans-historical American default — these ideas carried political, national, religious, and racial implications regarding who would be included in this vision, and who was to be marked as somehow outside of or apart from it.
The exclusion and alienation implicit in this version of “American” identity affected large populations of Americans as well, including African Americans. By the time of Wilson’s pronouncements on Americanization, black writers and orators had long been drawing attention to the hypocrisy of limiting the benefits of “American” ideals and identity to those who were white. W.E.B. Du Bois famously wrote in The Souls of Black Folk (1903) of “double-consciousness, this sense of always looking at one’s self through the eyes of others…One ever feels his two-ness, — an American, a Negro.”[xxiv] Half a century earlier, in a forceful address before an audience of northern abolitionists, Frederick Douglass had asked, “What, to the American slave, is your 4th of July? I answer: a day that reveals to him, more than all other days in the year, the gross injustice and cruelty to which he is the constant victim.”[xxv] It was no mystery, in other words, that one’s experience of Americanness and “American values” differed sharply along lines of color and class even for those whose families had been in the United States for generations (Du Bois’s great-great-grandfather, an ex-slave, fought for the Continental Army in the Revolutionary War). But this early twentieth century formulation of Americanism, so shot through with nativism and fear of the other, paid no heed to these distinctions. Wilson’s “genuine Americans” presumably looked and sounded much like Woodrow Wilson. Those Americans who did not found themselves in a position of arguing, as Langston Hughes wrote in 1926, that “I, too, am America.”[xxvi]
The question of who and what could rightly be viewed as part of the American tradition again became a topic of forceful public debate when “what every American should know” opened as a new front in the 1980s culture wars. The American Right, spurred on in reaction to liberal interventions of the 1960s, had reshaped and rebuilt itself through the ‘70s into a broad conservative coalition promoting ideals of Protestant evangelism, anticommunism, “family values,” traditionalism, and free-market capitalism; simultaneously, it harbored a deep mistrust of multiculturalism, the expansion of the welfare state, and efforts at sociopolitical equality undertaken by women and non-whites. The movement reached a new maturity with the election of Ronald Reagan in 1980, but presidential rhetoric linking this New Right to the reinstatement of a “lost” America had already been around since Nixon’s second inaugural address in 1973:
Above all else, the time has come for us to renew our faith in ourselves and in America. In recent years, that faith has been challenged. Our children have been taught to be ashamed of their country, ashamed of their parents, ashamed of America’s record at home and its role in the world.[xxvii]
A key political appeal of the Right in the ‘80s was that it was possible — to borrow from a contemporary seeker of the Republican mantle — to “make America great again.” And the schools and universities of the United States were a key battleground in this moral and political crusade.[xxviii]
Seeking to build on the successes of African American and women activists in the 1960s, progressive scholars and educators working in the nation’s schools and universities sought to provide students with a more diverse — and thus more representative — curriculum. The late 1960s and ‘70s saw the rise of women’s and ethnic studies departments on campuses across the nation, often thanks to the work and agitation of students, and to direct action such as the Third World Liberation Front strike at UC Berkeley in 1969, which led to the creation of Berkeley’s Department of Ethnic Studies.[xxix] At the primary and secondary level, similarly minded teachers and education leaders took up the mantle of multicultural education, sparking a broad effort to bring previously marginalized voices and viewpoints into American classrooms. Multiculturalism was intended as a counter to older models of assimilation that led students from minority backgrounds to experience “a process of self-alienation,” as leading scholar James Banks wrote. Banks clarified: “the multiculturalists view e pluribus unum as an appropriate national goal, but they believe that the unum must be negotiated, discussed, and restructured to reflect the nation’s ethnic and cultural diversity.”[xxx] Multicultural education remains a vibrant and expanding field in American education, a testament to the staying power of its underlying assertions. The slow gains made by multiculturalists in the 1970s, however, came under fire during 1980s debates over education reform, and what students were learning was as much a matter for debate as how measurably well they were learning it.
The National Commission on Excellence in Education, created by Secretary of Education Terrel Bell at the behest of President Reagan, published a report of its findings in 1983. Its conclusions were couched as alarmingly as its title: A Nation at Risk: The Imperative for Educational Reform. It outlined a distressing array of measures pointing to declining student achievement, especially in math and science fields, relative to other developed nations. The authors wrote that the state of education in the U.S. was nothing less than a threat to America’s position in the Cold War world: “We have, in effect, been committing an act of unthinking, unilateral educational disarmament.”[xxxi] The report had its intended effect, setting off a wave of efforts at school reform around the country.
In the meantime, many of America’s elite colleges and universities remained sites of conflict over their curricular adherence to a fixed and — as student activists argued — unrepresentatively white, male, and European-oriented Western canon. At Stanford, site of the most famous battle of the “canon wars,” students chanted, “Hey hey, ho ho, Western culture’s got to go” in the midst of a 1987 rally with Jesse Jackson.[xxxii] The effort, of course, also drew fervent opposition. Harvard Professor Harry Levin gave the opening address for the National Council of Teachers of English in 1980, in which he recounted Harvard’s slow fall from adherence to the classical canon to a new program in general education. Under the new system, students would be able to fulfill their requirement of a “shared base of knowledge” by choosing from more than one hundred courses. For Levin, who had attended Harvard as an undergraduate in the 1920s, the absence of a true common core was deeply distressing: “where, amid the embarrassment of riches, is the ‘shared base of knowledge’ as it has been described? Given the variety of choices, how can the core be common?” Levin closed his declension narrative with considerable flourish: “Without a recallable past,” he warned, “we should live our lives groping through uncharted territory, amid the nebulous skies and lunar landscapes and unrecognizable landmarks and faceless androids of science fiction — a world which gets made up as it goes along.”[xxxiii] Allan Bloom later took the argument further in his widely read and deeply polemical defense of the Western canon against the onslaught of “relativism,” The Closing of the American Mind, in which he argued that the dismantling of the traditional canon spelled the intellectual death of the American university.[xxxiv]
In his own defense of the canon, Secretary of Education William Bennett produced To Reclaim a Legacy: A Report on the Humanities in Higher Education, exploring “whether today’s colleges and universities are offering to America’s youth an education worthy of our heritage.” Bennett’s conclusion, and that of the Study Group on the State of Learning in the Humanities in Higher Education, was that they were not. “A student can obtain a bachelor’s degree from 75 percent of all American colleges and universities without having studied European history, from 72 percent without having studied American literature or history, and from 86 percent without having studied the civilizations of classical Greece and Rome.” Bennett argued for a realignment of American higher education toward a firm grounding in the Western cultural tradition along the twin principles of “good teaching” and “good curriculum,” the latter on the basis that “Some things are more important to know than others.” Citing the influence of E.D. Hirsch, Bennett solicited lists of necessary texts from “several hundred educational and cultural leaders.” Bennett reported, “they listed hundreds of different texts and authors, yet four — Shakespeare’s plays, American historical documents (the Constitution, Declaration of Independence, and Federalist Papers), The Adventures of Huckleberry Finn, and the Bible — were cited at least 50 percent of the time.” Bennett allowed that individual schools would have some differences in their lists of central works, and even that their study should be bolstered by the inclusion of “at least one non-Western culture or civilization.” But the report on the whole was a renouncement of the principle of “intellectual relativism” in the curricular direction of American universities.[xxxv]
In the realm of public education policy, concerns over the perceived abandonment of canonical knowledge took the form of a debate on the importance of teaching content instead of skills. As chairman of the National Endowment for the Humanities, Lynne Cheney presided over the production of American Memory: A Report on the Humanities in the Nation’s Public Schools. The central argument of the Cheney report, echoing Bennett’s To Reclaim a Legacy, was that public education had done students and the nation a disservice by focusing on skills development to the exclusion of core content — rich grounding in the latter being crucial to the maintenance of a democratic society and for the intellectual life of its citizens. Jefferson, in fact, served as one of Cheney’s models: “Thomas Jefferson consulted no books when he wrote the Declaration of Independence. He did not need to; Locke was as familiar to him as Monticello.” Taking this Revolutionary-era ideal forward, Cheney hearkened back to the McGuffey readers, where “children encountered Longfellow, Hawthorne, Alcott, Dickens, and Shakespeare.” And, crucially, the problem of education was given a Cold War-era hook: “In a recent survey done for the Hearst Corporation, 45 percent of those polled thought that Karl Marx’s phrase ‘from each according to his ability, to each according to his need’ is in the U.S. Constitution.”[xxxvi]
[Photo: E.D. Hirsch at the Aspen Institute, Fall 2015. Copyright Steve Johnson/The Aspen Institute]
E.D. Hirsch served on the advisory board for the NEH project that led to Cheney’s American Memory report. He had given the “cultural literacy” debate its name in his first expression of the idea for The American Scholar in 1983.[xxxvii] Given the politically contentious and reform-minded milieu into which Cultural Literacy entered in 1987, it seems less surprising now — though it appears to have surprised Hirsch then — that the work would have been lumped together with the likes of Allan Bloom and met with impassioned and incredulous resistance from within the academy. At the same time, though, it found wide readership among Americans of various political persuasions moved by a general sense that something was amiss in the national conversation. The idea of nailing down, once again, what “every American needs to know” was simply too politically resonant for things to be otherwise.
Nearly thirty years after its publication, academics have started to reevaluate Hirsch’s work and return some strength to its original argument, most extensively in a 2009 Pedagogy symposium on those uncomfortable bedfellows: Hirsch’s Cultural Literacy and Bloom’s The Closing of the American Mind.[xxxviii] Hirsch himself has continued on unabated, founding the Core Knowledge Foundation, which provides content-based curricula to schools — curricula which, he argues, continue to address the most current concerns in public education: “Have these schools narrowed the achievement gap in reading for disadvantaged students? Bluntly, yes, and to a greater extent than any other program.”[xxxix] This, after all, was Hirsch’s original intent. In the preface to his 1987 book, he wrote, “Cultural literacy constitutes the only sure avenue of opportunity for disadvantaged children, the only reliable way of combating the social determinism that now condemns them to remain in the same social and educational condition as their parents.”[xl] While one might quibble with the determinism of “the only sure avenue,” we should grant the essential progressivism of this aim. And his much-maligned list, he cautioned from the beginning, was never to be read as exclusive or set in stone; it can and will shift with time. One such necessary shift will be an idea of American cultural literacy that honors the traditions of the “disadvantaged” alongside the traditions of the powerful. Another — directly addressed herein — is a correction to Hirsch’s assertion that “By accident of history, American cultural literacy has a bias toward English literate traditions.”[xli] We know now that this was no “accident” but rather the result of more than two centuries of debate and political effort to shape and define the American tradition — a history of which we, engaged once again in this quest, must remain aware.
In the meantime, American history as a field, a discipline, and even as an idea has started to better reflect the diversity that has been a defining characteristic of the country since its inception. "Multiculturalism" endures less as a culture wars buzzword and more as a general principle for how historians — ideally — strive to illuminate and account for cultural diversity in their studies of the past. This effort has yielded a new breadth in historical surveys, such as the late Ronald Takaki's influential A Different Mirror: A History of Multicultural America (1993), which considered the experiences of Native Americans, Irish Americans, Asian Americans, African Americans, Jewish Americans, and Mexican Americans as a corrective to what he called the narrow and increasingly unrepresentative "master narrative of American history."[xlii] But even for scholars addressing otherwise narrowly constrained topics, a common question from reviewers, mentors, and colleagues is, "have you considered the experience of ________?" While detractors have been prone to call this interventionist, it really has been a corrective: there have always, in the United States, been women; immigrants; people who were not white; people who in one form or another lacked a sociopolitical position that otherwise would have secured their place in the "master narrative" identified by Takaki. And rather than teaching our children "to be ashamed of their country," as Nixon argued in 1973, this broadening of perspective can only enrich our understanding of the multivalence and diversity of American history. Kwame Anthony Appiah and Henry Louis Gates, Jr. addressed this point on a global level in the introduction to their 1997 Dictionary of Global Culture:
And we believe that in a world that is increasingly free of domination by ‘the West,’ we will be able both to acknowledge more frankly the evils that were done in the course of Europe’s expansion and to celebrate the very real achievements of those Western cultures—and at the same time to take pleasure in the benefits of the creation of a global culture under the steam of the economic, technological, religious, and cultural ideas of Europe and her heirs.[xliii]
It’s an alluring notion — to take pleasure in our traditions, warts and all. But we have to know the history.
Indeed, the intransigent, complicating factor in seeking to identify "what every American should know" is the deep history, outlined in this report, of American efforts to define a particular and often exclusionary American identity through the establishment of a set of linguistic, cultural, or educational criteria. "What every American needs to know," considered in this light, hinges less on the content of the what and more on the question of who is included in the American. The ideal of establishing a body of knowledge to ensure an effective, knowledgeable, and engaged citizenry has, in some form, been at the heart of American educational discourse for nearly 250 years. But all too often it has been used to draw boundaries around a desirable body politic. The challenge lies in making such a project truly open and inclusive. On that, this same history offers a few key lessons, presented in conclusion:
- First and foremost: this must be, at its core, a “big tent” project. Defining “what every American should know” is and always has been an essentially political act, and too often, those in the position to make such a pronouncement have used it to shore up an official version of “American” that looked essentially like themselves.[xliv] As such, a new effort to address this question must seek out a rich diversity of voices so as to reflect the right of every American to assert the meaningfulness and relevance of their experiences.
- As we have seen time and time again, many Americans have responded to times of change and upheaval with expressions of nativism, essentialism, and exclusion. If we, as Americans, aspire to cultural literacy in a world marked by continual change and constant globalization, we must resist the tendency to take a defensive and exclusionary stance in our definition of American heritage. This requires a shared trust that the values we hold most dear (life, liberty, democracy) will be strengthened and enriched rather than threatened by our efforts to build an inclusive and more fully representative American tradition. And we must be ready to change our "lists" as cultural shifts necessitate. As E.D. Hirsch wrote in 1983, "Such canonical knowledge [can] not be fixed once and for all…The canon changeth."[xlv]
- Better understanding what it means to be American necessitates a better understanding of our differences as well as our commonalities. This bears repeating: we must be willing to try to understand, and accept, that there are definitions of Americanness different from our own. It is on this basis that we can build what President Obama called, in the passage that opens this piece, "a common conversation." There will absolutely be commonalities — additions to a core on which we can all agree. But we must admit that the first step in creating a new American cultural literacy is recognizing that there are points at which we will all, no matter our background, find ourselves illiterate. This cannot be another instance of one group of Americans sitting back and demanding that others adapt to their model; we all, together, have work to do.