
Policies needed to combat deepfake sexual harassment

Illegal deepfake images of students are now circulating in U.S. schools, and most districts are ill-equipped to respond, according to a September report from the Center for Democracy and Technology (CDT), a nonprofit organization focused on the impact of technology on civil rights.

Deepfakes are fabricated photos, videos, or audio recordings that appear authentic thanks to advances in artificial intelligence. Deepfake “non-consensual intimate images” (NCII), as defined in the report, depict the image or voice of a real person in a sexually explicit manner without their consent.

According to the CDT report, 15 percent of high school students said they had heard in the past year about deepfake NCII depicting someone from their school. That figure is based on online surveys completed by 1,316 high school students this summer. Of these students, only 10 percent said their school offers guidance to victims of such images.


CDT President and CEO Alexandra Reeve Givens said sexually explicit deepfakes contribute to the existing problem of authentic NCII in schools. Thirty percent of high school students surveyed said they had heard about authentic NCII that depicted someone from their school.

“Unfortunately, the rise of generative AI this past school year has collided with a long-standing problem in schools: the sharing of non-consensual intimate images,” Givens said in a public statement. “In the digital age, children urgently need support to cope with technology-enabled harassment, and schools have an important power to help curb this harm.”

Along with students, CDT also surveyed 1,006 teachers and 1,028 parents of both middle and high school students for the report. More than half of teachers said they had never heard from their school or district how to handle incidents involving authentic or deepfake NCII. Only 4 percent of parents surveyed said their child’s school had communicated with them about the risks and penalties for students who share sexually explicit deepfakes of other students without their consent.

A key recommendation from the report advises that “at a minimum, schools should update their Title IX policies to explicitly include online conduct that creates a hostile environment for students at school, including NCII, and meaningfully communicate these policies to teachers, students and parents.”

Title IX, which is enforced by the U.S. Department of Education’s Office for Civil Rights, requires schools that receive federal funding to protect their students and staff from discrimination, including sex-based harassment and sexual violence. An April 2024 update to the Title IX regulations expanded the definition of online sexual harassment to include “the non-consensual dissemination of intimate images (including authentic images and images altered or generated by artificial intelligence technologies).”

CDT also recommends training educators and Title IX coordinators on how to report NCII incidents, both deepfake and authentic; how to protect the privacy of the students involved; and how to best support the victims of such images.

“Schools should also take educational preventative measures around NCII by taking actions such as directly addressing the issue in the curriculum, or incorporating it into broader sexual harassment or digital citizenship efforts,” the report said. “Overall, more emphasis is needed on proactive efforts to curb this behavior to prevent worse outcomes for all involved.”

By Sheisoe
