The history of nursing in the United States focuses on the professionalization of nursing since the Civil War.