March 14, 2024: Two Miami, Florida teenagers, ages 13 and 14, were arrested on December 22, 2023 for allegedly creating and sharing AI-generated nude images of their classmates without consent.

According to a police report quoted by WIRED, the teens used an unnamed “AI app” to generate explicit images of female and male classmates aged 12 and 13.

The incident, which occurred at Pinecrest Cove Academy in Miami, led to the students’ suspension on Dec. 6 and was subsequently reported to Miami-Dade Police.

The arrests and charges against the teenagers are believed to be the first of their kind in the United States in connection with the sharing of AI-generated nude photos.

Under a 2022 Florida law that criminalizes the distribution of fake, sexually explicit images without the victim’s consent, the teenagers face third-degree felony charges, the same level of offense as grand theft auto or false imprisonment.

So far, neither the parents of the accused boys nor the investigator and prosecutor handling the case have commented.

The problem of minors creating AI-generated nudes and explicit images of other children has become increasingly common in school districts across the country.

While the Florida case is the first known instance of criminal charges related to AI-generated nude images, similar cases have been reported in the US and Europe.

The impact of generative AI on the problems of child sexual abuse, non-consensual deepfakes, and revenge porn has led various states to tackle the issue independently, as there is currently no federal law that addresses non-consensual deepfake nudes.

President Joe Biden has issued an executive order on AI that asks authorities for a report on banning the use of generative AI to produce child sexual abuse material, and both the Senate and House of Representatives have introduced legislation known as the DEFIANCE Act of 2024 to address the issue.

Although the naked bodies depicted in AI-generated fake images are not real, they can appear authentic, potentially causing psychological distress and reputational damage to victims.

The White House has called such incidents “alarming” and stressed the need for new laws to address the problem.

The Internet Watch Foundation (IWF) has also reported that AI image generators are driving an increase in child sexual abuse material (CSAM), which complicates investigations and makes it harder to identify victims.
