Bluesky struggles to moderate child abuse material in Portuguese

Nucleo and Brazilian researchers investigated the platform and mapped over 125 Portuguese-language profiles that sell or share child sexual abuse material on Bluesky.
Illustration by Rodolfo Almeida/Nucleo
⚠️
This report addresses a sensitive topic that may be disturbing to some readers, but it does not contain explicit images.

Bluesky, an alternative to X that has surpassed 10 million users, is facing significant challenges in moderating Portuguese-language content related to the sexual exploitation of children and adolescents.

An investigation by Nucleo, in collaboration with independent researchers, identified 125 Portuguese-language profiles that share or sell illegal material, including explicit photographs of child sexual abuse, with the images left entirely unobscured.

Working alongside Brazilian researchers Tatiana Azevedo and Letícia Oliveira, Nucleo spent two weeks mapping profiles and posts that blatantly violate Bluesky's terms of service and Brazilian law.

The analysis found that Bluesky was not moderating phrases, terms, or emojis commonly associated with child sexual abuse material (CSAM) in Portuguese, even when they were explicit.
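For illustration, term-level screening of this kind is, at its simplest, a blocklist lookup over the text of each post, normalized so that accents and capitalization cannot be used to dodge the filter. The sketch below is ours, not Bluesky's: the term list and function names are hypothetical placeholders.

```python
import unicodedata

# Hypothetical blocklist; in practice curated by a trust-and-safety
# team and covering phrases, acronyms, and emojis in each language.
BLOCKED_TERMS = {"placeholder-term-a", "placeholder-term-b"}

def normalize(text: str) -> str:
    # Fold case and strip accents so variants of the same Portuguese
    # word all reduce to one canonical form before matching.
    decomposed = unicodedata.normalize("NFKD", text)
    no_accents = "".join(c for c in decomposed if not unicodedata.combining(c))
    return no_accents.lower()

def should_flag(post_text: str) -> bool:
    """Return True if the post contains any blocked term."""
    canonical = normalize(post_text)
    return any(term in canonical for term in BLOCKED_TERMS)
```

Lookups like this are crude, and real moderation pipelines layer classifiers and human review on top, but they illustrate what blocking known keywords and terms involves at a minimum.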

In a statement to Nucleo, Bluesky said that "our moderation team takes down accounts that engage in this content, and we purge this data from our infrastructure. In the last month, we've increased the size of our moderation team to handle the influx of new accounts."

The social network's moderation team has been spread thin since X was blocked in Brazil in late August, a move that drove more than 2.5 million new Brazilian users to Bluesky in under a week.

On September 5, 2024, just six days after the suspension of X in Brazil, Bluesky's head of Trust & Safety, Aaron Rodericks, reported a tenfold week-over-week increase in the number of reports of this criminal content on the network.

All the questions we asked Bluesky

  • Does Bluesky have different strategies for dealing with text posts discussing child sexual abuse or grooming and images of child exploitation?
  • If so, what are these strategies?
  • Does Bluesky have a dedicated team for moderating child exploitation, and if so, how many members are part of this team?
  • Does Bluesky use image hashing (fingerprinting) for this type of content?
  • Why doesn’t Bluesky block known keywords and terms like “child pornography” or “cp”?
  • Where can users track the status of reported content and see if it has been reviewed by Bluesky?

Bluesky's full response

At Bluesky, we take child sexual abuse material very seriously. We have automated systems that scan image and video content to preemptively block occurrences of CSAM, and we have a 24/7 moderation team that responds to reports. Our moderation team takes down accounts that engage in this content, and we purge this data from our infrastructure. In the last month, we've increased the size of our moderation team to handle the influx of new accounts.

And here's a note from Aaron Rodericks, Bluesky's Head of Trust and Safety, sharing some additional details: We use image hashing to identify known cases of CSAM that are shared on Bluesky. Additionally, we investigate reports and networks that engage with CSAM, even if the content isn't on Bluesky, and remove all involved users from the site.

How to report a post on Bluesky

To report a post, go to the content you wish to report, click on the three dots (…), and select “Illegal and urgent.” You can provide up to 300 characters of context in your report.

Investigation

Over the course of two weeks, the investigation mapped 125 Portuguese-language profiles on Bluesky that solicit, sell, or share child sexual abuse material (CSAM). Each profile or post included at least two of the following elements:

  • Requests for sexually explicit photographs of children or adolescents;
  • Hashtags or keywords with sexual or criminal content;
  • Acronyms or references to “child pornography” and similar terms.
📝
Letícia Oliveira and Tatiana Azevedo are Brazilian independent researchers specializing in online extremism. They have previously contributed to reports on school attacks for Brazil's Ministry of Education and have collaborated on various investigative pieces with Nucleo.

GROUPS. This gap in moderating basic terms also allows posts linking to child sexual exploitation groups on messaging apps like Telegram to remain active.

The posts reviewed by the investigation include explicit requests from users to “be added to pedophilia groups.” In the comments, users even share their own phone numbers.

In August, Telegram's CEO and founder, Pavel Durov, was detained in France on charges of complicity in crimes committed on the platform, including the sale of CSAM.

Self-generated CSAM

Last year, Stanford University mapped the measures tech companies are taking to combat “self-generated CSAM,” the term for sexual imagery that minors capture of themselves and that is then shared or sold online.

The study found that most platforms struggle to implement effective changes to their moderation algorithms to prevent the publication of such content. X and Discord were highlighted as major distributors of this type of material.

Nucleo has now identified similar self-generated content being shared on Bluesky. One example is a post featuring a sexually explicit photograph of an apparent minor, captioned “14y [years] here,” which garnered 90 likes in just a few hours.

📬
This story was based on a tip-off from a reader. If you would like to speak with the reporter about a story, please email [email protected] or contact her via Signal.

New methods

In addition to using well-known terms and keywords, Bluesky users are also adopting specific emojis associated with certain sports to index child sexual abuse material.

The practice of using symbols to represent communities, both online and offline, is not new. In 2007, WikiLeaks published an official document detailing how the Federal Bureau of Investigation tracked symbols and artifacts used by pedophiles worldwide.

Some symbols described in that 2007 FBI document have been identified by Nucleo in posts analyzed on Bluesky, while others, particularly those related to sports, are more recent trends.

Oliveira and Azevedo traced the use of these emojis to the arrest of a former American football player, who was detained in 2015 for sexually abusing a child. According to a criminal case file obtained by Nucleo and the researchers, the player also filmed the crime.

The former player's name, his old jersey number, and references to the alleged crime video are now appearing on Bluesky alongside CSAM.

How moderation works on Bluesky

In addition to violating Brazilian law, the posts and profiles detected during the investigation also violate Bluesky's terms, which explicitly forbid even the “normalization of pedophilia” by its users.

Nucleo asked Bluesky about the methods it uses to identify and remove CSAM from the platform. The company said it employs a combination of automated systems, human moderators working around the clock, and image hashing technology.
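As a rough illustration of how hash matching works: a fingerprint is computed for each uploaded image and compared against a database of fingerprints of already-identified abuse material, such as the hash lists that clearinghouses like NCMEC distribute to platforms. Bluesky has not disclosed its implementation; the sketch below uses a plain cryptographic hash for simplicity, whereas production systems typically rely on perceptual hashes (PhotoDNA, for example) that still match after resizing or re-encoding.

```python
import hashlib

# Hypothetical hash set; in practice populated from lists of known
# illegal images distributed to platforms by bodies such as NCMEC.
KNOWN_HASHES: set[str] = set()

def fingerprint(image_bytes: bytes) -> str:
    # A cryptographic hash only matches byte-identical files; real
    # systems use perceptual hashing so resized or re-encoded copies
    # of the same image still match.
    return hashlib.sha256(image_bytes).hexdigest()

def is_known_csam(image_bytes: bytes) -> bool:
    """Return True if the upload matches a previously identified image."""
    return fingerprint(image_bytes) in KNOWN_HASHES
```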

Rodericks informed Nucleo that Bluesky is also investigating reports related to networks involved in child exploitation outside of the platform in order to remove all involved users from the site.

In Brazil, local law mandates that service providers preserve user data for six months. In the U.S., the recently enacted REPORT Act requires data retention for at least a year in cases involving CSAM that are reported to the National Center for Missing and Exploited Children (NCMEC). On this point, Bluesky stated that it does not retain CSAM images and videos beyond what is necessary to report them to the authorities.

The failure to retain data related to online crimes has posed challenges for Brazilian authorities before. During a wave of attacks and shootings at Brazilian schools in April 2023, for instance, platforms did not retain data on users who were promoting violence.

How we did this

Researchers Letícia Oliveira and Tatiana Azevedo contacted Nucleo's reporter about child sexual exploitation content circulating on Bluesky. This led to a wider investigation and an analysis of the terms, emojis, and keywords the researchers had already mapped. We then checked which posts and comments were still available and reached out to Bluesky's press office.

💡
Due to the illegal and disturbing nature of the material, Nucleo decided not to use screenshots of it, even with redactions. If organizations or authorities investigating child sexual exploitation wish to access the material for their inquiries, they can contact us via email at [email protected].
Reporting by Sofia Schurig
Editing by Alexandre Orrico and Sérgio Spagnuolo

Translated with the help of ChatGPT and revised by human editors
