AI, Teens, and Fake Nudes

The father of a fourteen-year-old girl in Spain was recently quoted in a news report as saying this: "Today a smartphone can be considered as a weapon. . . a weapon with a real potential of destruction, and I don't want it to happen again." What was this father talking about? His daughter was one of more than thirty twelve- to fourteen-year-old girls in a town in Spain who were victimized by a group of male minors who used an artificial intelligence app to remove clothing from the girls in photos.

The app, called Nudify, is one of a growing number of phone apps that use AI to change an ordinary photo into a nude one. These fake nudes are then distributed through social media, leading to what's rightly being called the direct exploitation and abuse of women and girls. In some cases, the photos are used to extort money from the victims under the threat of posting or sharing the images online.

Parents, teach your kids that exploitation of this or any type is sinful and wrong.