Historically, women have made up a large majority of the profession and academic discipline of nursing.[1] Women's roles in nursing have included both direct patient care and maintaining the cleanliness of wards and equipment. In the United States, women made up 86% of Registered Nurses (RNs) in 2021;[2] globally, women comprise 89% of the nursing workforce.[3]