Newsletter Subject

AI has created a new form of sexual abuse

From

vox.com

Email Address

newsletter@vox.com

Sent On

Thu, May 2, 2024 11:01 AM

Email Preheader Text

Plus: A better dating app, what parents want for their kids, and more. May 2, 2024 Good morning! Tod

Plus: A (potentially) better dating app, what parents want for their kids, and more. May 2, 2024

Good morning! Today, senior correspondent Anna North is here to talk about the rise of deepfake nudes — and what teenagers are doing to fight back. —Caroline Houck, senior editor of news

[Illustration of a woman sitting in a corner, looking at a smartphone. Getty Images/iStockphoto]

How do you stop deepfake nudes?

There’s a lot of debate about the role of technology in kids’ lives, but sometimes we come across something unequivocally bad. That’s the case with AI “nudification” apps, which teenagers are using to generate and share fake naked photos of their classmates.

At Issaquah High School in Washington state, boys used an app to “strip” photos of girls who attended last fall’s homecoming dance, according to the New York Times. At Westfield High School in New Jersey, 10th grade boys created fabricated explicit images of some of their female classmates and shared them around school. Students from California to Illinois have had deepfake nudes shared without their consent, in what experts call a form of “image-based sexual abuse.”

Now advocates — including some teens — are backing laws that impose penalties for creating and sharing deepfake nudes. Legislation has passed in Washington, South Dakota, and Louisiana, and is in the works in California and elsewhere. Meanwhile, Rep. Joseph Morelle (D-NY) has reintroduced a bill that would make sharing the images a federal crime.

Francesca Mani, a 15-year-old Westfield student whose deepfaked image was shared, started pushing for legislative and policy change after she saw her male classmates making fun of girls over the images. “I got super angry, and, like, enough was enough,” she told Vox in an email sent via her mother. “I stopped crying and decided to stand up for myself.”

Supporters say the laws are necessary to keep students safe. But some experts who study technology and sexual abuse argue that they’re likely to be insufficient, since the criminal justice system has been so inefficient at rooting out other sex crimes. “It just feels like it’s going to be a symbolic gesture,” said Amy Hasinoff, a communications professor at the University of Colorado Denver who has studied image-based sexual abuse.

She and others recommend tighter regulation of the apps themselves so the tools people use to make deepfake nudes are less accessible in the first place. “I am struggling to imagine a reason why these apps should exist” without some form of consent verification, Hasinoff said.

[Silhouette of a face in front of a computer screen. Arne Dedert/picture alliance via Getty Images]

Deepfake nudes are a new kind of sexual abuse

So-called revenge porn — nude photos or videos shared without consent — has been a problem for years. But with deepfake technology, “anybody can just put a face into this app and get an image of somebody — friends, classmates, coworkers, whomever — completely without clothes,” said Britt Paris, an assistant professor of library and information science at Rutgers who has studied deepfakes.

There’s no hard data on how many American high school students have experienced deepfake nude abuse, but one 2021 study conducted in the UK, New Zealand, and Australia found that 14 percent of respondents ages 16 to 64 had been victimized with deepfake imagery. Nude images shared without consent can be traumatic, whether they’re real or not.
When she first found out about the deepfakes at her school, “I was in the counselor’s office, emotional and crying,” Mani said. “I couldn’t believe I was one of the victims.”

When sexual images of students are shared around school, they can experience “shaming and blaming and stigmatization,” thanks to stereotypes that denigrate girls and women, especially, for being or appearing to be sexually active, Hasinoff said. That’s the case even if the images are fake, because other students may not be able to tell the difference.

Moreover, fake images can follow people throughout their lives, causing real harm. “These images put these young women at risk of being barred from future employment opportunities and also make them vulnerable to physical violence if they are recognized,” Yeshi Milner, founder of the nonprofit Data for Black Lives, told Vox in an email.

[A teenage girl sitting on the floor watching a smartphone. Getty Images]

Stopping deepfake abuse may require reckoning with AI

To combat the problem, at least nine states have passed or updated laws targeting deepfake nude images in some way, and many others are considering them. In Louisiana, for example, anyone who creates or distributes deepfakes of minors can be sentenced to five or more years in prison. Washington’s new law, which takes effect in June, treats a first offense as a misdemeanor. The federal bill, first introduced in 2023, would give victims or parents the ability to sue perpetrators for damages, in addition to imposing criminal penalties. It has not yet received a vote in Congress but has attracted bipartisan support.

However, some experts worry that the laws, while potentially helpful as a statement of values, won’t do much to fix the problem. “We don’t have a legal system that can handle sexual abuse,” Hasinoff said, noting that only a small percentage of people who commit sexual violence are ever charged. “There’s no reason to think that this image-based abuse stuff is any different.”

Some states have tried to address the problem by updating their existing laws on child sexual abuse images and videos to include deepfakes. While this might not eliminate the images, it would close some loopholes. (In one recent New Jersey lawsuit, lawyers for a male high school student argued he should not be barred from sharing deepfaked photos of a classmate because federal laws were not designed to apply “to computer-generated synthetic images.”)

Meanwhile, some lawyers and legal scholars say that the way to really stop deepfake abuse is to target the apps that make it possible. Lawmakers could regulate app stores to bar them from carrying nudification apps without clear consent provisions, Hasinoff said. Apple and Google have already removed several apps that offered deepfake nudes from the App Store and Google Play. However, users don’t need a specific app to make nonconsensual nude images; many AI image generators could potentially be used in this way.

Legislators could require developers to put guardrails in place to make it harder for users to generate nonconsensual nude images, Paris said. But that would require challenging the “unchecked ethos” of AI today, in which developers are allowed to release products to the public first and figure out the consequences later, she said.
“Until companies can be held accountable for the types of harms they produce,” Paris said, “I don’t see a whole lot changing.”

—Anna North, senior correspondent

Listen: One Flu Over the Cowcow’s Nest
Avian flu, which recently leaped from chickens to cows, has now been detected in milk. How worried should humans be about the outbreak?

POLITICS
- What values do you want your child to have? The answer is probably different depending on your party. [NPR]
- In case you somehow missed it: Kristi Noem for some reason seemingly thought that writing about killing a puppy would endear her to voters? [Vox]
- Arizona repeals its Civil War-era abortion ban: Why do these laws even exist in the first place? We’ve got the explainer for you. [Vox]

[Kristi Noem speaking at a lectern with a bright red and blue background behind her. Kent Nishimura/Bloomberg via Getty Images]

DESIRE
- Falling for AI: Things can apparently get “very steamy” when you hijack ChatGPT Plus and get it to respond like your boyfriend. [WSJ]
- On bodies and fitness: “These lifestyle gyms aren’t competing with Ozempic — they’re embracing it.” [Quartz]
- Dating apps suck: Will one made by academics instead of corporations be better? [Guardian]

Listen: Are baby bonds a good investment?
A unique policy program that could help close the racial wealth gap.

Are you enjoying the Today, Explained newsletter? Forward it to a friend; they can sign up for it right here. And as always, we want to know what you think. We recently changed the format of this newsletter. Any questions, comments, or ideas? We’re all ears. Specifically: If there is a topic you want us to explain or a story you’re curious to learn more about, let us know by filling out this form or just replying to this email.

Today’s edition was edited and produced by Caroline Houck. We’ll see you tomorrow!

Email Content Statistics

Subject Line Length

Data shows that subject lines with 6 to 10 words generate a 21 percent higher open rate.

Number of Words

The more words in the content, the more time the user will need to spend reading. Get straight to the point with catchy short phrases and interesting photos and graphics.

Number of Images

More images or large images might cause the email to load slower. Aim for a balance of words and images.

Time to Read

Longer reading time requires more attention and patience from users. Aim for short phrases and catchy keywords.

Predicted open rate

Spam Score

Spam score is determined by a large number of checks performed on the content of the email. For the best delivery results, keep your spam score as low as possible.

Flesch reading score

The Flesch reading ease score measures how complex a text is: the lower the score, the more difficult the text is to read. The formula uses the average length of your sentences (measured in words) and the average number of syllables per word to calculate reading ease. Text with a very high score (about 100) is straightforward and easy to read, with short sentences and no words of more than two syllables. A score of 60-70 is usually considered acceptable for web copy.
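For reference, the standard Flesch reading ease formula is 206.835 - 1.015 * (average words per sentence) - 84.6 * (average syllables per word). Below is a minimal Python sketch of that calculation; the syllable counter is a crude vowel-group heuristic assumed purely for illustration, not how any particular analytics tool counts syllables.

import re

def count_syllables(word):
    # Rough heuristic: count runs of consecutive vowels (illustration only).
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    # Standard Flesch reading ease formula.
    return 206.835 - 1.015 * (len(words) / sentences) - 84.6 * (syllables / len(words))

print(round(flesch_reading_ease("The cat sat on the mat. It was happy."), 1))

Long sentences packed with multi-syllable words push the score down; short, plain sentences push it up.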

Technologies

What powers this email? Every email we receive is parsed to determine the sending ESP and any additional email technologies used.

Email Size (not including images)

Font Used


Copyright © 2019–2024 SimilarMail.