Imperialism


Imperialism is a policy or practice, closely tied to colonialism, in which a more powerful country extends its control over the territory of a less powerful country or region, usually for economic gain.