The Responsibility of Digital Platforms in Brazil Post-Supreme Court: The Partial Unconstitutionality of Article 19 of the Brazilian Civil Rights Framework for the Internet

2025 · 8 min read

The discussion on the civil liability regime for internet application providers in Brazil took on a new dimension following the Federal Supreme Court's recognition of the partial and progressive unconstitutionality of Article 19 of the Brazilian Civil Rights Framework for the Internet (Law No. 12,965/2014). The provision, originally conceived as a mechanism to protect freedom of expression and legal certainty, conditioned the civil liability of platforms on the existence of a specific court order determining the removal of illegal content. Although innovative at the time, this model became insufficient in the face of profound changes in the digital ecosystem, marked by the large-scale dissemination of illegal content, the coordinated action of artificial distribution networks, and the intensive use of platform infrastructure for the commission of serious crimes that affect both individual rights and the integrity of the democratic regime.

Context

The Brazilian Civil Rights Framework for the Internet (Law No. 12,965/2014) established a specific model of civil liability for application providers in Brazil. Its Article 19 enshrined the so-called "specific court order rule," according to which platforms can only be held civilly liable for damages resulting from content generated by third parties if, after being notified by court order, they fail to remove the content identified as illegal. This is a deliberate legislative choice: by ruling out private censorship mechanisms and automatic removal, the legislator sought to protect freedom of expression, preventing platforms from acting as "arbiters of legality" of speech. The provision, therefore, established a judicial safe harbor, distinct from the (broader) US model and the (more restrictive) European model at the time.

The US model, enshrined in Section 230 of the Communications Decency Act (CDA), enacted in 1996, is internationally recognized as the broadest platform immunity regime. In summary, the provision establishes that no provider or user of interactive computer services may be treated as a publisher or held responsible for content published by third parties. This means that providers, from social networks to discussion forums, generally enjoy broad civil immunity and cannot be held liable for maintaining illegal content published by users, even when it is reported extrajudicially. The logic behind Section 230 is to encourage innovation, reduce legal risks, and prevent platforms from excessively removing content to avoid litigation, a central concern of US lawmakers in the 1990s, when the digital environment was still in its infancy.

In contrast, the original European model, enshrined in Directive 2000/31/EC (Electronic Commerce Directive), adopted a more cautious and restrictive stance. The directive established that hosting providers would only be held liable if, having "actual knowledge," they did not act promptly to remove illegal content. Unlike the US rule, the European Union adopted a mechanism known as notice and takedown, according to which liability arises when the provider is notified and remains inactive. The directive did not require a court order to characterize this "actual knowledge" — it was sufficient for a sufficiently clear notification to be sent to the provider. The European model sought to balance protection of freedom of expression and protection of fundamental rights such as honor, intellectual property, privacy, and security.

Brazil, in drafting the Civil Rights Framework for the Internet, made a peculiar choice. Instead of replicating the broad US immunity, the legislature chose to reduce the risk of abusive private removals, rejecting the European idea that the platform could be held liable simply for having "actual knowledge." A hybrid model was then created: a conditional safe harbor, which requires a specific court order for liability. This option is reflected in the core of Article 19, whose heading establishes that the provider can only be held liable if, after a court order, it does not make the content identified as illegal unavailable. The central idea was to prevent platforms, for fear of litigation, from preventively removing legitimate content, thus reinforcing freedom of expression.

This design, however, has become insufficient in the current digital landscape, where massive damage occurs in seconds and the requirement for a court order makes timely responses unfeasible. The core of the rule is in the heading of Article 19: "The internet application provider may only be held civilly liable for damages resulting from content generated by third parties if, after a specific court order, it fails to take steps, within the scope and technical limits of its service, to make the content identified as infringing unavailable."

Article 21 of the Civil Rights Framework, on the other hand, establishes a relevant exception to the rule in Article 19 by providing for liability regardless of a court order in situations involving intimate content, such as non-consensual pornography, leaked nudes, and sexual material disclosed without the victim's authorization. In this case, the provider must remove the content after extrajudicial notification containing minimum elements that allow for the identification of the illegality. If this duty is not fulfilled, the provider becomes liable for damages. The objective is to protect extremely sensitive personality rights, the damage to which spreads irreversibly and instantly. Article 21, therefore, introduced the logic of liability for omission after notification of the victim, functioning as the embryo of a more diligent and active responsibility on the part of platforms.

The Supreme Court, in reinterpreting Article 19, recognized that the American model is excessively permissive and that the European model at the time—although more protective—also does not adequately address systemic risks, which is why the European Union approved the Digital Services Act in 2022, introducing duties of diligence, transparency, and risk mitigation. The Brazilian decision is closer to this new paradigm.

Unconstitutionality of Article 19 of the Brazilian Civil Rights Framework for the Internet

It was in light of these two provisions that the Federal Supreme Court was called upon to rule on the constitutionality of Article 19 in the set of lawsuits that comprised the so-called "Direct Claims of Unconstitutionality of the Brazilian Civil Rights Framework for the Internet." The central issue in the judgment was whether the specific court order rule would be compatible with the Constitution in the current information environment, marked by coordinated disinformation, digital political violence, hate speech, attacks on vulnerable groups, algorithmic manipulation, and massive practices of violation of fundamental rights.

The Court opted for a model of progressive unconstitutionality, ruling that Article 19 remains in force but can no longer be applied comprehensively and without restriction: it must receive an interpretation in conformity with the Constitution until the legislature enacts rules suited to the new technological realities. Under this reinterpretation, the liability regime for providers now includes significant nuances and exceptions. First, until a specific law is enacted, platforms may be held civilly liable under Article 21 of the Civil Rights Framework itself whenever they are faced with content that constitutes serious crimes or highly reprehensible illegal acts, without prejudice to their duty to remove such content.

The Supreme Court has established, in detail, an exhaustive list of conduct that requires an immediate and diligent response, such as anti-democratic crimes provided for in the Penal Code, terrorist acts or preparatory acts, inducement or incitement to suicide and self-mutilation, crimes of racial or gender identity discrimination, crimes against women, sexual crimes against vulnerable persons, child pornography, and human trafficking. In these cases, the failure of platforms to address the massive circulation of such content constitutes a systemic failure and gives rise to liability, provided that the preventive or corrective measures adopted are proven to be insufficient.

In addition, the decision clarifies that isolated incidents do not automatically give rise to civil liability; in such cases, the provisions of Article 21 apply. When there is massive circulation of serious illegal content, however, the provider must take immediate action commensurate with the state of the art, under a qualified duty of care. This duty does not imply strict liability, which remains excluded, but it does establish a presumption of liability whenever the content involves paid advertisements, boosted posts, or artificial distribution networks (such as bots or automated systems) used to amplify illegal material. In such cases, the platform is exempt only if it proves that it acted diligently and within a reasonable time to make the content unavailable.

The decision also refined the treatment of recurring practices on platforms. For crimes against honor, the original regime of Article 19 remains in place, requiring a court order, although providers may choose to remove content upon extrajudicial notification. In cases of successive replication of content already declared illegal by court decision, all platforms must automatically remove identical publications, without the need for new decisions. This is a "notice and stay down" mechanism, which seeks to prevent the continuous reintroduction of the same offensive content, a problem that, in practice, undermines the effectiveness of traditional court decisions.

Furthermore, the Supreme Court has defined the cases in which Article 19 continues to apply in full, safeguarding the differentiated regime for services whose functionalities justify greater protection of confidentiality and private communication. Thus, email providers, closed video or voice meeting platforms, and instant messaging services remain subject to the original rules, exclusively with regard to interpersonal communications protected by Article 5, XII, of the Constitution.

As part of the reinterpretation of the regime, the Court established structural duties for platforms, aligning the Brazilian system with international trends such as the European Digital Services Act. Among these duties, the following stand out: (i) adopting notification and due process mechanisms; (ii) publishing periodic transparency reports; (iii) maintaining accessible and widely publicized service channels; and (iv) keeping a legal representative in Brazil with full powers to respond administratively and judicially, comply with determinations, provide information, and bear penalties. Finally, the Supreme Court modulated the effects of the decision, establishing that the new guidelines apply only prospectively, in order to protect legal certainty and avoid retroactive impact on consolidated situations.

Conclusion

The decision represents a milestone in the redesign of platform liability in Brazil. By recognizing the inadequacy of the original model and establishing provisional guidelines for its correction, the Supreme Court has promoted a careful balance between the protection of fundamental rights, freedom of expression, and democratic integrity. The challenge now falls on the National Congress, which must draft comprehensive, technical legislation compatible with the systemic risks of digital communication, ensuring that the online environment is simultaneously safe, democratic, and conducive to the exercise of rights.
