Post-Colonialism

The aftermath of colonialism, in which former colonies undergo decolonization and establish independent self-governance.