Google is making it easier to access CSAM deepfakes online

Google, Bing, and DuckDuckGo are displaying, without any moderation, AI-generated images that explicitly depict child sexual abuse material (CSAM), making this type of content easier to access.

Read this story in Portuguese

⚠️
This article discusses a sensitive topic that may be disturbing for many people, but it does not contain explicit images.

An online AI image generator is producing explicit images of child sexual exploitation without any moderation. These images are being indexed by major search engines, making this type of content easy to find.

Nucleo conducted tests combining the site’s name—the same one used to create deepfakes of U.S. Vice President Kamala Harris—with keywords like “girl” and “child” in English on Google, Bing, and DuckDuckGo.

Pornographic deepfakes of Kamala Harris are all over Google Images
Major search engines have been indexing suggestive images of Harris, which were made by a generative AI site without any moderation

All three search engines are indexing AI-generated child sexual exploitation content from this site, despite the expectation that their systems, especially Google's (which holds over 90% of the online search market), would moderate such sensitive and globally prohibited material.

On July 18, 2024, Nucleo reported how Google indexes AI-generated images showing pregnant children.

📬
This article was prompted by a tip from a Nucleo reader. If you wish to contact the reporter about a story, please email [email protected].

Grotesque content

The images seen by Nucleo range from fetishized depictions of children to explicit scenes of child sexual exploitation, including children in underwear, fully nude, and in sexually suggestive poses. There are hundreds of such images on the platform.

💡
Due to the illegal and grotesque nature of this content, Nucleo decided not to use screenshots of this material, even with blurring. If organizations or authorities investigating child sexual exploitation wish to access the material for their inquiries, please contact us via the official email [email protected].

Many of the prompts, the instructions given to the AI generator, contain discriminatory or prejudiced words targeting marginalized populations. Some of the prompts seen by Nucleo include:

  • “black girl child with her little sister both with legs open no pants no shirt”
  • “child with skirt lifted by the wind”
  • “child lying down showing butt”
  • “rear view of a black girl and her little sister, both without thongs big butt”

Nucleo tested the site's name alongside English terms involving minors, such as girl, child, or teenager, and found that this content surfaces easily in the image tabs of Google, Bing, and DuckDuckGo, the world's leading search engines.

Some indexed images are inaccessible, with the site displaying the message: “Oops, access denied. It seems the page you are looking for no longer exists or is in private mode.”

Paid plans offer a private mode as one of their benefits, but this apparently does not prevent the images from being indexed by search engines.

What site is this?

Unlike in previous similar reports, Nucleo is omitting the site's name, as Brazilian law criminalizes any form of direct or indirect disclosure or dissemination of such material.

According to the site's own information, the generator uses models from Stability AI and Hugging Face, both startups that create and commercialize open-source AI models, which are free and modifiable by users.

  • Free users can access two models from Hugging Face and are allowed to commercialize the images they produce.
  • Subscribers, with plans ranging from $9.90 to $19.90 per month or $99 per year, get access to four different models.

The site’s terms and conditions prohibit users from creating “content that is abusive, offensive, pornographic, defamatory, misleading, obscene, violent, defamatory, hate speech or otherwise inappropriate.”

According to SemRush, a platform that analyzes keywords, domains, and traffic to support SEO strategies, some of the main search terms driving users to the site are “AI titties” and “giantess boobs.”

Nucleo sent two email inquiries to the company through the contact address provided in the site's footer, asking how content moderation works on the platform and which model or models the program uses.

At the time of publication, we had not received a response.

Reporting by Sofia Schurig
Editing by Sérgio Spagnuolo

Translated with the help of ChatGPT and revised by human editors
