After much anticipation, the preliminary report on Bill 2630/2020, the so-called Fake News Bill, was presented to the Brazilian House of Representatives last Thursday night (April 27th) and is expected to be voted on next week.
The bill aims to regulate the functioning of digital platforms and messaging apps in Brazil and establishes a series of obligations for big tech companies.
Formally known as the Congress Bill on Responsibility and Transparency for Digital Platforms, it was written by the federal deputy Orlando Silva (PCdoB-SP), and was first presented in the Senate by Alessandro Vieira (PSDB-SE) in 2020.
If approved as they stand, the rules would apply to platforms (social media, messaging apps and search providers) with over 10 million users in Brazil.
Nucleo has prepared an overview of the main points of the text presented last Thursday:
- The text does not include the creation of an autonomous entity for the regulation and supervision of digital platforms, something that had been suggested by the government as well as in previous versions.
DETAILS - Regulatory entity off
The creation of a regulatory entity was included in the Executive's proposal, along with other suggestions for the bill. According to Orlando Silva, in an interview with Globonews, this point was facing resistance in Congress and could make approval of the text unfeasible.
In the current version, presented on Thursday, the text does not indicate who will be responsible for overseeing the platforms and ensuring that they comply with their obligations.
In the chapter "On the regulation of providers", the text assigns attributions to the Internet Steering Committee, but does not give that committee any inspection power.
By Tuesday (May 2nd, 2023), when the merits of the bill are to be voted on in plenary, deputies must reach a solution on this point.
ANATEL. Also on Thursday night, the president of the National Telecommunications Agency (ANATEL), Carlos Baigorri, argued that ANATEL should be the autonomous entity within the scope of the bill.
Experts see this choice as reckless, since ANATEL is an agency linked to the Ministry of Communications. The agency already has attributions related to telephony and, therefore, existing relationships with telecommunication companies.
The Rights in the Network Coalition, which brings together several civil society entities, published a note pointing out that ANATEL does not have "the necessary expertise in platform regulation issues, having also repeatedly failed to fulfill its attributions in the telecommunications sector". The idea of having ANATEL as the supervisor of the platforms should be "unequivocally rejected", the coalition wrote.
SELF-REGULATION. One of the Executive’s suggestions was to allow the platforms to set up a self-regulatory entity. Orlando Silva did not incorporate this suggestion into the most recent version of the text.
- Companies will have to be much more transparent with users, providing information about their content recommendation systems as well as semi-annual reports.
DETAILS - Transparency ON
One of the strengths of the text is the set of transparency obligations the platforms will have to meet.
Companies must disclose, in their terms of use, the parameters used in content recommendation systems, in order to make clear to users why they are shown certain content.
The text also requires semi-annual transparency reports, which must be accessible on the companies' websites in Portuguese and provide information on content moderation procedures.
The bill determines that companies must annually carry out and publish an external and independent audit to ensure the obligations are being fulfilled.
- Good news for researchers and academics: the text makes it clear that access to data (including via APIs) for research will have to be provided free of charge.
DETAILS - Data for research
This is extremely important, given that big companies such as Meta, TikTok, Twitter, Telegram and Kwai are quite conservative about who can access their official APIs, even for academics and researchers.
- Politicians will have their parliamentary immunity extended to social networks.
DETAILS - Parliamentary immunity
Article 33 of the text maintains the extension of parliamentary immunity to social networks.
This topic is considered one of the most problematic in the bill, since it can discourage social networks from moderating content published or broadcast by parliamentarians, even when it is harmful misinformation. In the Executive's proposal, this provision had been left out.
- Platforms may be held civilly liable for third-party content that they have promoted, or when they fail to comply with duty of care obligations.
DETAILS - The duty of care obligations
The duty of care applies to illicit or harmful content linked to the following crimes, specified in the bill:
- crimes against the democratic rule of law and coup d’état;
- acts of terrorism and acts preparatory to terrorism;
- incitement to suicide or self-mutilation;
- crimes against children and adolescents;
- crimes of discrimination or prejudice based on race or color;
- political violence against women;
- health infractions, such as hindering the execution of sanitary measures when a public health emergency has been declared.
The assessment of compliance or otherwise with the duty of care will be based on the “set of efforts and measures adopted by providers, with no assessment of isolated cases”.
- Companies will have to conduct recurring systemic risk analyses.
DETAILS - Systemic risk analysis
Platforms will have to "diligently identify, analyze and assess systemic risks arising from the operation or design of their services and their related systems, including algorithmic systems". According to the bill, this assessment must be published annually, as well as before the introduction of new features with potential impact.
- The bill demands greater responsibility from platforms for third-party content and changes the liability regime for intermediaries established by Article 19 of the current Internet Civil Rights Framework (Marco Civil da Internet).
DETAILS - Responsibility for content
The bill establishes, in chapter II, that providers can be held civilly liable "in a joint way" for:
- repairing damages caused by third-party content posted on the platform;
- damages resulting from third-party content when there is a breach of the duty of care obligations.
This way, platforms must adopt “reasonable, proportionate and effective mitigation measures”.
The proposal changes the understanding of Article 19 of the Internet Civil Rights Framework, which today establishes a liability regime for intermediaries under which a platform can only be held responsible when it fails to act on content that was the subject of a court decision.
- Fines for non-compliance with a court decision (beyond 24 hours) can reach R$1 million per hour of non-compliance.
DETAILS - Security protocol and sanctions
The bill introduces the notion of a "security protocol", which may be triggered when there is imminent risk, negligence or insufficient action by the companies. This protocol will be established by a reasoned decision for a period of 30 days, extendable for another 30 days.
Within the scope of this protocol and after notification, companies may be held civilly liable for damages arising from third-party content when prior knowledge is evident. A complaint by users already constitutes "prior knowledge".
In addition to the protocol, the text requires companies to comply within 24 hours with court decisions determining the removal of illicit content, under penalty of a fine ranging from R$50,000 to R$1 million per hour of non-compliance.
The sanctions were scaled as follows:
- warning, indicating the deadline for adopting corrective measures;
- daily fine;
- simple fine, of up to 10% of the economic group’s revenue in Brazil in the last financial year or, in the absence of revenue, a fine from R$10 up to R$1,000 per registered user of the sanctioned provider, limited in total to R$50,000,000 (fifty million reais) per infraction;
- publication of the decision by the offender;
- prohibition of processing certain databases;
- temporary suspension of activities.
- Remuneration for journalistic content appears in the text, but the matter is left open for separate regulation.
DETAILS - Remuneration for journalistic content
The bill establishes, in a very broad and generic way, that providers will have to remunerate media organizations for journalistic content in text, video, audio or image form, but leaves the details to be established by separate regulation.
Some criteria were established. To be eligible, media organizations will need to:
- be at least 2 years old;
- produce journalism on a regular, organized and professional basis;
- maintain physical address and responsible publisher in Brazil.
The text leaves it to the technology companies and media outlets to reach an agreement.
This article is expected to generate new regulations to clarify specific compensation criteria.