Rape Culture


Rape culture refers to a social environment in which sexual violence is normalized and excused, with blame often placed on victims rather than perpetrators.