Women in nursing

Photograph of a bandaging class at Tredegar House

Historically, women have made up a large majority of the profession and academic discipline of nursing.[1] Women's nursing roles include both caring for patients and maintaining clean wards and equipment. In the United States, women make up the majority of the field of nursing, comprising 86% of Registered Nurses (RNs) in 2021;[2] globally, women comprise 89% of the nursing workforce.[3]

  1. ^ "Women's Leadership in the Development of Nursing" (PDF). www.sagepub.com. Retrieved 21 March 2023.
  2. ^ "Employed persons by detailed occupation, sex, race, and Hispanic or Latino ethnicity". Bureau of Labor Statistics. Retrieved 14 February 2022.
  3. ^ "The WHO Global Strategic Directions for Nursing and Midwifery (2021–2025)". www.who.int. Retrieved 14 February 2022.