Feminism refers to the belief in the social, political, and economic equality of the sexes, as well as efforts to end sexism and gender-based oppression.