The feminization of the workplace is the shift in gender and sex roles that accompanies the incorporation of women into groups or professions once dominated by men. The term also refers to a set of social theories seeking to explain gender-related occupational discrepancies.