Feminist colonialism


Feminist colonialism refers to the imposition of Western notions of femininity and feminism on colonized societies, often leading to the suppression of indigenous women's rights and knowledge systems.