The West



West, The, western portion of the United States, formerly the region west of the Appalachian Mountains; presently, the territory west of the Mississippi River, particularly its northern part. In U.S. history, the West was the region lying at the edge of settled land. This unsettled frontier was a place where abundant land was available cheaply to anyone willing to lead a frontier life.
