The American way is a term for the way of life in the United States.
American Way or The American Way may also refer to: