Definition of Rape Culture


Rape culture is a term used to describe a society in which rape and sexual violence are normalized through prevailing attitudes, beliefs, and behaviors.