Colonialism


Colonialism refers to the practice by which a more powerful country acquires and dominates foreign territories, peoples, or cultures.