Colonial imperialism


Colonialism is a form of imperialism in which a Western imperial power takes control of a territory in order to exploit its resources, labor, and land.