White Colonialism


White colonialism refers to the dominance of white people within colonial empires, together with the accompanying belief that people of white European ancestry are superior to others.