List of nursing schools in the United States

This is a list of nursing schools in the United States of America, sorted by state. A nursing school is an educational institution that trains students to become nurses, medical professionals who care for individuals, families, or communities to attain or maintain health and quality of life.