CyberBRICS

Accountability and Transparency in practice: necessary improvements to the 2630/2020 Bill

By Yasmin Curzi de Mendonça, Clara Leitão e Almeida, Henrique Torres, Leandro Rabelo, and Raphael Santana


Following international trends in digital platform regulation – especially in online content moderation – legislative proposals have emerged in Brazil seeking to increase platforms’ accountability and guarantee more rights for users. Bill No. 2,630/2020, the Platform Accountability and Transparency Bill, more commonly known as the “Fake News Bill”, arises precisely with the intention of increasing transparency and defining duties for intermediaries, including the establishment of due process mechanisms for users to challenge platform decisions. The purpose of this second article in the 2630/2020 Bill series is to present the Bill’s main provisions on platform accountability and its convergences with the international literature and principles on the subject, as well as possible improvements towards achieving its intended objective: to ensure that users’ rights are safeguarded and that transparency measures are implemented.

Digital platforms are private entities that play the role of intermediaries in public communication. Although they cannot be held liable for content posted online – as Article 19 of the Brazilian Internet Civil Rights Framework (Marco Civil da Internet) has clearly established[1] –, it is through them that content is disseminated. As private actors, they have the right to set their own rules of conduct in their terms of service and community policies (a quasi-legislative power) and to apply those rules in their environments (quasi-judicial and quasi-executive powers). Platforms are thus self-regulated entities, and this self-regulation is performed through techniques of online content moderation[2]. There is a consensus in the literature that this activity is inherent to the existence of platforms[3] and is desired by users, who turn to these mechanisms for conflict resolution.[4]

This series of articles is an initiative organized by the Research Group on Online Content Moderation of the Center for Technology and Society at Getulio Vargas Foundation.