The History Of Rape Culture


Rape and rape culture have been longstanding issues in American society. Looking at modern influences such as the media and our nation's history, consider the following: How has rape culture evolved throughout our history? What role does the media play in rape culture? And most importantly, how has rape become institutionalized in American society?
First things first, it is important to understand what the terms rape and rape culture truly mean. Many people see the two terms as interchangeable. According to the Federal Bureau of Investigation's Rape Addendum, rape is defined as "Penetration, no matter how slight, of the vagina or anus with any body part or object, or oral penetration by a sex organ of another person, without the consent of the victim" (Rape Addendum). This means that any sort of nonconsensual sex is, and should be, considered rape. Rape culture, on the other hand, is a term coined by feminists in the 1970s. According to Women Against Violence Against Women, rape culture is "the ways in which society blamed victims of sexual assault and normalized male sexual violence." In simpler terms, it is society's way of …

Imagine how much happier we would be, how much freer to be our individual selves, if we didn't have the weight of gender expectations" (Chimamanda Ngozi Adichie). Taking a step back, it is clear that rape culture is not a part of our society that will change anytime soon. Nonetheless, it is important to recognize the role that we play in institutionalizing rape. We have become blinded by the presence of sex in the media, and there is a lack of awareness about rape in our society. This, in turn, has led to a belief that most rape crimes are more or less victimless. As a society, it is time to enact change, promote true gender equality, and create an environment where victims are free from shame and
