1936; see racist.
The belief that some races are inherently superior (physically, intellectually, or culturally) to others and therefore have a right to dominate them. In the United States, racism, particularly by whites against blacks, has created profound racial tension and conflict in virtually all aspects of American society. Until the breakthroughs achieved by the civil rights movement in the 1950s and 1960s, white domination over blacks was institutionalized and supported at all branches and levels of government, which denied blacks their civil rights and their opportunity to participate fully in political, economic, and social life.