Feminization (sociology)

In sociology, feminization is the shift in gender roles and sex roles in a society, group, or organization toward a focus on the feminine. It can also mean the incorporation of women into a group or profession that was once dominated by men.[1]

  1. ^ Ann Douglas (1977). The Feminization of American Culture. Farrar, Straus and Giroux. ISBN 0-374-52558-7.