Colonialism in the Americas

Colonialism in the Americas refers to the period of European political, economic, and territorial domination over the Americas, beginning with the age of exploration and continuing through the establishment and administration of colonial settlements.