American Colonialism


American colonialism refers to the period during which the United States governed overseas colonies, principally the Philippines, Puerto Rico, and Guam, acquired from Spain following the Spanish-American War of 1898. It was characterized by the imposition of American culture, language, and governance on the peoples of the colonized territories.