A women's college is an institution of higher education whose enrollment is exclusively female. In the United States, nearly all women's colleges are private undergraduate institutions, and many offer coeducational graduate programs. In other countries, laws and traditions governing such institutions vary.