German Colonialism


German colonialism refers to the period during which the German Empire established and administered overseas colonies, principally in Africa and the Pacific. It was characterized by the imposition of German culture, language, and governance on the peoples of the colonized territories. Germany's colonial empire was short-lived, lasting from 1884, when the first protectorates were declared, until 1919, when the Treaty of Versailles formally stripped Germany of its overseas possessions.