Definition of Political Imperialism

Political imperialism is the policy or practice of acquiring and maintaining colonies or dependencies in order to exert economic and political domination over other countries or regions.