Cultural Racism


The emergence of cultural racism partially reflects the discrediting of old biological explanations for racial inequality. Arguments of cultural differences in the United States were originally employed as an alternative to biological explanations for racial inequality, often by liberals committed to racial justice. Since the 1960s, anthropologists and other scientists have amassed evidence showing that biological races do not exist, that racial categories are cultural inventions rather than scientifically valid partitions of the human species, and that race is not a useful, accurate, or meaningful description of human biological variation (Mukhopadhyay and Henze 2003; Mukhopadhyay, Henze, and Moses 2007). In short, they have argued that race as biology is fiction and that racial classifications are historical and culturally specific ideologies invented to justify slavery and other forms of systematic, institutionalized inequality.

In the absence of biological explanations for racial differences and racial inequality, researchers turned to culture—exploring, for example, the role of cultural or linguistic factors in the educational achievement of minority groups or the role of family structure in reproducing poverty across generations. For liberals and anthropologists, culture (unlike biology) was never a barrier to achieving racial equality. All humans have the same capacity for culture, and all cultures are learned. Moreover, cultures are dynamic, flexible, creative human adaptations, changing over time and in different circumstances. If, as some argued, the culture of African Americans or Puerto Rican migrants differed from the dominant U.S. culture, that “problem” could be solved. New cultural ways could be learned, either by abandoning old ways or by acquiring a second cultural repertoire, much like a second language. Cultural differences, while recognized, were not viewed as insurmountable obstacles to racial equality. Culture was instead the explanatory paradigm for racial inequality, and cultural assimilation was the solution.

Cultural-difference arguments have come under scrutiny, however, and many scholars have come to consider them examples of cultural racism. Critics have pointed out that, historically, cultural differences between Europeans (or Euro-Americans) and non-Europeans have always been framed in terms of superiority and inferiority. In the United States, Africans and other racial groups were deemed culturally inferior to “whites” (meaning those from northwestern Europe). Nineteenth-century evolutionary science attempted to rank racial groups from “primitive” to “advanced,” relying not only on biology but also on what would come to be called culture. For example, British marriage and kinship forms (monogamy and nuclear families) were considered more “advanced” than other cultural forms (e.g., polygamy or multigenerational, extended families).

During the twentieth century, arguments for the superiority of Anglo (Christian) culture grew more strident as U.S. anti-immigration legislation restricted the entry of “lower ranked” European subraces (such as “Semitic” or “Alpine”). Dominant groups feared cultural pollution from “inferior” cultures, and immigrants were expected to assimilate to the “superior” culture. The only question was whether all races and subraces, such as southern and eastern Europeans or the Irish, were capable of assimilating to the dominant Anglo (Protestant) culture.

With the rejection of race as biology in the post–World War II, post–civil rights era, cultural difference as cultural deficit, or what is now called “cultural racism,” became the reigning paradigm. During the 1960s, for example, African American schoolchildren were considered linguistically impoverished, possessing linguistic forms fundamentally inferior to the standard American English taught in schools. African American families, with a core matrifocal unit and extended kinship ties, were described as not only inferior but pathological (“dysfunctional”) relative to the European American nuclear family.

Oscar Lewis’s theory of a “culture of poverty,” initially based on fieldwork in Mexico and Puerto Rico, focused on cultural adaptations to the circumstances of poverty. Yet some interpreted his findings within what might be called a “poverty of culture” framework, seeing other cultures as clearly inferior and deficient compared to middle-class U.S. American or Western culture, and as the primary barrier to upward mobility. When applied to racial and ethnic minority groups in the United States, the culture of poverty approach, or more often, the poverty of culture approach, became the explanation for why families remained poor or children did poorly in school. Culture, in short, rather than any larger system of inequality, produced racialized poverty or educational underachievement. In the educational jargon of the late 1960s and early 1970s, minority children were “culturally deprived.” Implicitly, for those who wished to see it that way, poor people had only their culture (and hence themselves) to blame. Many scholars now characterize this literature as an example of cultural racism.

During the 1970s and 1980s, largely because of the activism of racial minorities, the U.S. and some European nations, including Britain, began to accept and even celebrate the cultural differences of racial groups. Racial minorities, including indigenous and immigrant groups, embraced their cultural roots, rejecting the prevailing philosophy that assimilation was essential for social advancement. Cultural relativism prevailed, at least in theory. All cultures became valued equally. In this sense, society had become “color-blind.” That is, “color” was irrelevant. Or rather, all colors were relevant.

From the perspective of many racial minorities, the goal was mutual respect and an institutionalized recognition of cultural diversity as legitimate. The era of “multiculturalism” took various forms. In Britain, it included having imams as chaplains in prisons and setting up separate public schools for Muslim children. In the United States, it ranged from recognizing alternative cultural celebrations such as Kwanzaa to creating “cultural” (ethnic) clubs on campuses, establishing ethnic studies departments, pursuing Afrocentric curricula, and broadening affirmative action goals to include cultural diversity.

Old “culturally deprived” terminology was replaced with cultural diversity, cultural competence, and other language that conveyed respect for multiple and equally valid cultural forms. In the educational context, teacher education programs emphasized diverse learning styles, expressive forms, and other educationally relevant cultural resources that children from varied racial backgrounds bring to school. Many well-intentioned educators committed themselves to teaching to the child, rather than forcing the child to assimilate to the culture of the school.

Nevertheless, despite success at institutionalizing multiculturalism, racial inequality persists. Educational underachievement remains a major problem for most indigenous and racial minorities. In a 2006 editorial in the New York Times, Orlando Patterson put forward the idea that “cultural” arguments have been totally rejected, and that only structural explanations (the “system”) are currently acceptable explanations for underachievement. Yet cultural “differences,” while now positively valued, continue as a predominant explanatory framework for variations in the educational achievement of racial groups. Researchers continue to explore more complex, but nevertheless cultural, processes that depress educational achievement, such as cultures of “opposition” among some U.S. racial groups. These school peer cultures consciously “oppose,” it is argued, the perceived emphasis of the racially dominant culture on academic excellence.

While culture has become the new explanation for racial inequality, cultural racism employs a concept of culture that is, from an anthropological perspective, enormously simplistic, static, rigid, overly homogeneous, deterministic, ahistoric, and without context. Culture is depicted as so deeply embedded, so tradition-bound, that it is nearly “intrinsic” or “natural” to a group. In short, culture is “naturalized” and “essentialized,” making it nearly as immutable as biology. The line between cultural essentialism and biological determinism is sometimes indistinguishable. Culture thus becomes an explanation for racial inequality that offers little hope for change. Cultural racism depicts culture as an insurmountable obstacle for racial minorities or an insurmountable advantage for dominant racial groups.

Minority groups, of course, can also employ essentialized, naturalized images of cultures and ignore underlying structural factors. Sometimes this is a conscious political strategy, such as when it is used by Native Americans (as culturally superior “stewards of the land”) to maintain control over their lands. Nevertheless, such examples would not be considered “cultural racism” because of the power relations involved. That is, they are not a dominant group’s characterization of a subordinate group.

Cultural-difference explanations for racial inequality are coming under increasing attack, partially for the reasons just cited. But critics go further. Focusing on culture, they argue, ignores the larger national, global, economic, and political forces that contribute to social inequality, whether racial or nonracial. Thus, complex, multifactorial, multileveled, and nuanced analysis is needed to understand the processes that contribute, on different levels, to persistent racial inequality.
