Rape Culture


A culture in which sexual violence is normalized and even condoned, reflected in practices such as the objectification of women's bodies, victim-blaming, and slut-shaming.