French Colonialism


French colonialism refers to the establishment and administration of colonies by the French Empire in Africa, Asia, and the Americas. It was characterized by the imposition of French culture, language, and governance on the peoples of the colonized territories. The French Empire was among the major colonial powers of the 19th and early 20th centuries.