Words Matter with Digital Abuse

By The Cyber Civil Rights Initiative

As a community, we strive to adopt terms that clearly, consistently, and respectfully describe the evolving landscape of technology-facilitated sexual abuse. Wherever possible, we try to avoid using terminology created by perpetrators of abuse and instead seek to use terms that accurately convey the nature and the harm of the abuse. Words matter with digital abuse, and we must use survivor-centered language.

Image-Based Sexual Abuse (IBSA)

UK scholars Clare McGlynn and Erika Rackley coined the term Image-Based Sexual Abuse (IBSA) to capture a variety of exploitative practices and the severity of the harm they cause. We have adopted this useful umbrella term, which includes the “non-consensual creation and/or distribution of private, sexual images,” including forms of voyeurism, sextortion, and recordings of sexual assaults. The term IBSA also enables us to broadly describe all material created or distributed in these practices as IBSAM (Image-Based Sexual Abuse Material).

From “Revenge Porn” to Nonconsensual Distribution of Intimate Images (NDII)

“Revenge porn” is a prime example of a popular term that was created by the perpetrators of the abuse in question. Referring to the unauthorized disclosure of private, sexually explicit images as “revenge porn” is inapt for several reasons. First, the use of the term “revenge” suggests that the victim is somehow responsible for and deserving of the perpetrator’s conduct. We firmly reject any implication that ending a relationship, sharing an image within a trusting relationship, or refusing someone’s advances provides a justification for abuse. Second, some victim-survivors object to the term “porn” because they feel it trivializes or mischaracterizes their intimate images.

To properly focus on the perpetrator’s abusive act of unauthorized distribution and to move away from the word “pornography,” we adopted the term Nonconsensual Distribution of Intimate Images (NDII).

Sextortion

Another form of image-based sexual abuse is the use of intimate images to coerce, threaten, or extort victims. We adopt the commonly used term “sextortion” to describe this abuse, which typically entails a threat to distribute intimate material in order to compel an individual to do something against their will. For instance, a predator who obtains some intimate material may demand that their target provide even more explicit material under the threat of distributing online the images they already have. Increasingly, we also see perpetrators engaging in sextortion to obtain money, Bitcoin, control of joint assets, or favorable divorce or custody settlements.

Sexually Explicit Digital Forgeries

As generative artificial intelligence (GAI) and other forms of machine learning continue to transform our legal and technological landscape, the nature of image “creation” is changing rapidly. The would-be creator or distributor of sexually explicit images no longer needs access to actual existing material or to use a hidden camera to obtain images; they can manufacture these images with nothing more than a photo of the target’s face and a few keystrokes. This technology, still in its early stages, can already produce astoundingly realistic videos depicting anyone doing anything, adding new and troubling dimensions to the traumas associated with image-based sexual abuse.

The popular term for this kind of digitally manipulated sexual material is “deepfakes,” but this is yet another term for abuse that was created by its perpetrators. The term originated on Reddit from an account with the username “deepfakes” that distributed these images.

To avoid adopting the language of perpetrators and to better convey the harmful nature of this material, our current preferred terminology is sexually explicit digital forgeries. A sexually explicit digital forgery is a manipulated photo or video, virtually indistinguishable from an authentic one, that falsely depicts an actual person nude or engaged in sexual conduct. We use the term “forgery” to highlight both the fraudulent nature of the material and the lack of consent of the individual to being depicted in this way.

Help Is Available

Whether we are advocating for legislative protections, speaking to the press, or communicating to the individuals whose rights we defend, we are educators as well as advocates. Clear, thoughtful terminology is essential for us to succeed in those roles. To learn more about the definitions explained in this blog, please view CCRI’s Coffee Chat on Terminology.

If you have experienced tech-facilitated sexual abuse or have questions, our advocates are available 24/7 for support. You are not alone.