- Wild West /ˌwʌɪl(d) ˈwɛst/
- 1. the western regions of the US in the 19th century, when they were lawless frontier districts. The Wild West was the last of a succession of frontiers formed as settlers moved gradually further west.
Source: Oxford Dictionaries