Gender roles refer to the social norms and expectations surrounding what is considered appropriate behavior for men and women in a given society.