High school senior Alex Bordeaux was scrolling through social media to check out Taylor Swift's latest outfits when she stumbled on disturbing pornographic images of the pop music icon.
Bordeaux, 17, has studied artificial intelligence in her journalism and yearbook class at Dawson County High School in Dawsonville, Ga., so she knew immediately that she was looking at "deepfakes": AI-manipulated video, audio, or photos created using someone's voice or likeness without their permission.
But she was furious, and baffled, that so many on social media at least pretended to treat the images as the real deal. "It's just so crazy how easily people can see something on the internet and immediately believe it," she said. "A lot of kids my own age do not double-check anything."
At the same time, it hurt to see someone she's admired so deeply, for so long, digitally degraded, added Bordeaux, who credits Swift's 2014 hit "Welcome to New York" as inspiring her dream of living in the Big Apple.
"Taylor Swift seems so untouchable because she's so rich. She's so famous. And she's so sweet. And she's basically been used" by online trolls, Bordeaux said.
Not all students have the background that Bordeaux does to understand the role AI, a relatively nascent technology, plays in creating deepfakes like the ones targeting Swift, as well as other fake images and video designed to spread misinformation, influence public opinion, or con people out of money, experts say.
Schools need to make teaching about this type of technology a priority.
Deepfakes are "a big concern" because they "pollute our information environment to a pretty astonishing degree," said Kate Ruane, director of the Free Expression Project at the Center for Democracy and Technology, a nonprofit that promotes digital rights.
If educators aren't already thinking about teaching students about deepfakes, "they really should be … because this is the water that their students are swimming in every day," she added.
'It's gonna continue to happen because AI is growing so massively'
Many of the deepfake images of Swift were taken down, but not before they'd attracted plenty of eyeballs. One posted on X (formerly Twitter) racked up more than 45 million views, 24,000 reposts, and hundreds of thousands of likes before the user who shared the images had their account suspended for violating X's policies.
By that point, the post had been accessible on the platform for about 17 hours, according to The Verge. The content became so problematic that X temporarily blocked searches for Swift's name.
Bordeaux knows Swift isn't the first celebrity to be the victim of a viral deepfake. Former presidents and other public figures have been recent targets, and male students at one high school reportedly used AI to create explicit images of female classmates. And a deepfake robocall that mimicked President Biden's voice circulated during the New Hampshire presidential primary.
"It's gonna continue to happen because AI is growing so massively," Bordeaux said. "More people will learn how to use it. And the more people use it, the more people will abuse it. That's just the way it works. … I think the ethical implications of AI are so important."
While there may seem to be a lot of obvious negatives to deepfakes, teachers need to steer their students toward critical questions about the technology, discussing how policymakers and developers can work to mitigate the downsides, said Leigh Ann DeLyser, the executive director and co-founder of CSforALL, a nonprofit organization that works to expand computer science education.
Teachers could ask students: "What are the benefits of deepfakes? What are the challenges of deepfakes? And if there are challenges, how can, or how should, we as a society create rules around them," such as labeling a deepfake or getting permission before using someone's image, she said.
Bordeaux's journalism teacher, Pam Amendola, who received training on how to teach AI from the International Society for Technology in Education, said many of her students, especially those who consider themselves "Swifties," the nickname for Swift's fans, were incensed on the pop star's behalf.
But they also considered what might have happened if the subject of the images had been somebody without Swift's fame: "How would they ever be able to combat it?" Amendola said.
That question can provide an opening for teachers to remind students of their digital footprint, explaining how information they've already put online can be twisted and used for nefarious purposes, Amendola added.
That lesson hit home with Bordeaux, who understands that AI is getting more sophisticated all the time.
"I'm definitely worried that my face is out there," she said. "There's so many ways you can manipulate with deepfakes to make someone look horrible."
'Photoshop on steroids'
There's an opening to discuss the technical aspects of deepfakes, too, in the context of media literacy.
Many people were able to figure out right away that the images of Swift online weren't authentic, said Ruane from the Center for Democracy and Technology.
Teachers could ask: "How did they do that? What are the things about the image, the person in the image, that led you to know those things? What are the instincts that you felt within yourself that led you to that conclusion?" Ruane suggested.
That's something that students have discussed in Elizabeth Thomas-Capello's computer science class at Newburgh Free Academy, a public school in Newburgh, N.Y.
Taylor Swift came up when the class was "talking about how there are things AI can't do very well yet" when it comes to creating images of people, Thomas-Capello said. "It can't really form ear lobes, or the face is a little too perfect, or the backgrounds are a little bit muddled. And [it] can't quite get teeth yet."
Keeping those flaws in mind, the class tried to identify which among a series of faces were AI-generated and which were real, Thomas-Capello said. "And we all still failed."
Safinah Arshad Ali, a research assistant at the Massachusetts Institute of Technology who works on teaching AI to middle school students, describes deepfakes as "Photoshop on steroids" and is quick to point out the technical weaknesses in the images the technology creates.
But she also asks students to "think critically about the source, think about who's posting it, why would they be posting it?"
Amendola, too, reminds her students that technology is getting to a place where it is difficult to believe what you see with your own eyes. That means they must consider the context behind everything they see online, whether it's pictures of a pop star or a message from a presidential candidate.
"I tell them, 'Question everything, because we're at a point in history where you need to be a bit of a skeptic; you're going to be taken advantage of otherwise,'" Amendola said.
The Taylor Swift deepfakes, and the many similar incidents that are sure to follow, provide an opportunity for would-be computer scientists to delve into the ethics behind the technologies they are learning to create, Thomas-Capello said.
"I really try to emphasize that this is what is occurring in our society. These are the implications for our society," Thomas-Capello said. "You as students are the ones who are going to write this. You are the ones who can create technology for good [and] make sure that these types of things are harder and harder to [produce]."
She added, "We don't really know where artificial intelligence is going. We're just at the very, very beginning. But if we can train our students to use technology for good, then I think [we'll get to a] really good place."