Gender Roles


Gender roles refer to the ways a society believes each gender should behave in a given situation or circumstance. For example, women have traditionally been expected to stay at home and take care of the children, while men are expected to be the breadwinners.