Manifest Destiny


The belief that the United States was destined to expand westward, and the impact this belief has had on American identity.