Balancing free expression with fake news on social media

UNESCO official outlines conclusions of a report on governance of digital platforms

Guilherme Canela speaking to Nepali members of parliament and government officials recently at a dialogue in Kathmandu on safeguarding international standards of free expression in the digital ecosystem. Photo: UNESCO

Guilherme Canela is Chief of the Freedom of Expression and Safety of Journalists Section of UNESCO in Paris. He recently visited Nepal to discuss freedom of expression in the digital media. Nepali Times spoke to him about new challenges to free media. Excerpts:

Nepali Times: Can you tell us about how the discussions with Nepal’s lawmakers went?

Guilherme Canela: My visit, in partnership with the Inter-Parliamentary Union, was to have an open discussion with them about international standards on freedom of expression. It was very enriching to learn the key concerns of Members of Parliament about the challenges Nepal faces with its existing legal framework regulating the media and digital ecosystem, vis-à-vis the immense changes brought to this environment by the technological revolution we are all experiencing, and how these challenges can be addressed in alignment with international human rights law.

There is a fine balance between freedom of expression and that same freedom being used to spread intolerance and hatred. What are the main conclusions of UNESCO’s own study on the governance of digital platforms? 

Digital platforms have become a new front in the pursuit of peace. They have had a transformative role in advancing human rights. They have democratised access to knowledge and culture, enabled new voices to be heard, and fostered global connections. But, as you say, at the same time they have become ecosystems of misinformation, disinformation, ideological polarisation, and incitement to violence, discrimination, and hate.

Such features have undermined democratic values and threatened human rights across the world. Our job, therefore, is to foster the huge positive side of this equation and to counter the negative externalities it generates.

Thus, while digital companies do not operate with transparency, accountability and due diligence, many countries have embarked on regulatory processes without keeping a human rights approach as a guiding principle, thereby risking a shrinking of civic space and leading to different kinds of restrictions not aligned with international human rights law.

UNESCO aims to ensure that everyone’s freedom of expression, access to information and diverse cultural content are guaranteed, while various stakeholders, including governments, deal with the problems of dis- and misinformation and hate speech online. These problems will be better addressed, in a way that aligns with international human rights law, through implementing the UNESCO Guidelines in a comprehensive and coherent manner. In a nutshell, UNESCO’s proposal, which was debated worldwide for almost two years and received 10,000 comments from actors in 134 countries, suggests a governance system that can improve the processes currently used by the tech companies, aiming to expand freedom of expression while dealing with the negative externalities.

But when can the risks be deemed so high that governments are justified, under international standards, in restricting freedom of expression on digital platforms?

The rapidly evolving digital era has seen the rise of restrictive measures on freedom of expression and the press, such as infringements on the right to privacy, censorship, and the prevention of general access to information and its dissemination, both online and offline, including internet shutdowns. However, the justification for governments to restrict such freedoms on digital platforms arises only when there are identifiable and/or substantial risks to public safety, national security, or individual rights that cannot be sufficiently mitigated through other means.

Read also: Nepal should be open for South Asia

In order for any restriction to be legitimate, it must fulfil the criteria set out in international standards on freedom of expression, such as Articles 19(3) and 20 of the International Covenant on Civil and Political Rights and the three-part test, which demands that any interference be provided for by law, pursue a legitimate aim, and be necessary and proportionate to securing that aim.

Guilherme Canela in Nepal. Photo: Sonia Awale

In essence, any statutory restriction on freedom of expression must be subject to oversight and review mechanisms in order to uphold the principles of transparency and accountability and to prevent abuse of power.

While governments may be justified in restricting freedom of expression on digital platforms under certain circumstances, such restrictions must be carefully circumscribed to protect the fundamental rights of individuals and uphold the principles of democracy, pluralism, and open discourse in the digital age.

Is there not a danger, then, that states can justify banning digital platforms using the excuse of maintaining “social harmony”?

Measures applied by both companies and states should always be proportional to the harm they are intended to prevent. On the companies’ side, removal and blocking of content and account suspension should only be used as a last resort in the most serious cases. On the governments’ side, banning a digital platform should be proportionate to the level of harm it causes and should be accompanied by transparent processes, due diligence, due process of law, and consideration of alternative measures that mitigate risks while preserving the benefits of digital platforms for legitimate users.

Additionally, regulatory actions should rest on evidence-based assessments and be subject to judicial review to ensure they are fair, just, and consistent with the principles of freedom of expression and the rule of law.

Banning digital platforms can be considered proportionate in certain circumstances where their activities pose significant harm that outweighs the benefits they provide. There are two scenarios in which a ban might be considered proportionate. The first is systemic violation of international human rights standards: when a digital platform systemically violates those standards despite warnings and sanctions. The second is exploitative practices: when platforms facilitate illegal activities, such as child pornography, or enable human trafficking.

However, it is important to outline that our Guidelines do not call on the government to establish a department or organisation to moderate content. The Guidelines consider that digital platforms should not be held liable when they act in good faith and with due diligence, carry out voluntary investigations, or take other measures aimed at detecting, identifying, and removing or disabling access to content that is prohibited under article 20 of the ICCPR or that has been restricted in terms of article 19(3) of the ICCPR.

How can journalists protect their freedoms when democracy itself is threatened?

Journalists can protect their freedom first by continuing to do their job and holding all powers accountable. But they cannot do it alone or in a vacuum, nor should they be left alone to deal with existential safety and viability challenges. As proclaimed by all UNESCO member states, journalism should be valued as a fundamental ‘public good’, with citizens putting their trust in it, also through the teaching and encouragement of media and information literacy. This demands that the State protect media viability, that rule-of-law institutions defend media freedom from all attacks, including by elected officials, and that politicians commit to respecting the fundamentals of democracy, including the role of the press as a watchdog. There is an urgent need to further demonstrate to our societies the value of journalism (and all fundamental freedoms) not only in guaranteeing political and civil rights, but in guaranteeing all other rights. When someone is protesting for better education or better health, ultimately what they want is to protect the rights to education and a healthy life, but they cannot do that if they do not have freedom of expression in the first place and if watchdogs, like journalists, are being silenced.

Your report also cites the critical need for digital platforms to protect electoral integrity. How can this be done? 

Firstly, in the UNESCO Guidelines for the Governance of Digital Platforms, we highlight the need for digital platforms to recognise their role in supporting democratic institutions by preserving electoral integrity. Their acceptance of this role is not trivial, as it implies a commitment to democracy. UNESCO also recommends establishing a comprehensive risk assessment process that focuses on the integrity of the electoral cycle leading up to and during major electoral events. These assessments should be conducted transparently and in consultation with all relevant stakeholders, and should include a gender perspective to address the escalating online violence against women in electoral contexts. Proactive measures based on the identified risks are essential to prevent potential threats to the integrity of the electoral process.

Read also: Don't act on the act, Editorial

Secondly, UNESCO urges digital platforms to ensure that users have access to a diverse range of information and ideas, in line with international human rights law. Attention should be given to automated tools so that they do not hinder access to election-related content and diverse viewpoints. Additionally, digital platforms should review their products, policies, and practices related to political advertising to avoid arbitrary limitations on candidates' or parties' ability to disseminate their messages during the electoral cycle. The promotion of independent fact-checking, advertisement archives, and public alerts is highlighted as a crucial measure to uphold electoral integrity in content dissemination.

UNESCO guidelines for the governance of digital platforms

Finally, transparency is emphasised across various aspects, including the use and impact of automated tools, engagement with stakeholders and governance systems, and the identification of political advertisements. Digital platforms are responsible for disclosing funding details, applying equal content moderation rules, and tracking the monetisation of political posts. The retention of advertisements in a publicly accessible online library, along with information on funding and targeted demographics, contributes to an accountable and transparent digital environment during electoral cycles. If put in place by the platforms, these measures will collectively reinforce the commitment to preserving electoral integrity in the digital age.

How can digital platforms be held more responsible when Big Tech is controlled from outside national jurisdictions, while the impact of their content is domestic?

The strategy for implementing the UNESCO Guidelines for the Governance of Digital Platforms aims to promote an enabling environment for freedom of expression and information and to enhance the responsibility of digital platforms worldwide. The first element is building a network of networks: fostering international collaboration and cooperation among regulatory authorities and civil society to develop common standards and regulations for digital platforms. This can help ensure that platforms adhere to responsible practices regardless of where they are owned, and it may also reduce lawfare.

Then there are public-private partnerships to address the regulatory challenges associated with digital platforms. Collaboration between governments, industry stakeholders, and civil society can facilitate the development of effective governance frameworks.

Read also: Media mêlée in the age of AI, Kunda Dixit

Another element is promoting the transparency and accountability both of the governance system (independent regulatory authorities) and of the digital platforms’ policies and practices. Human rights due diligence and content moderation and curation based on international human rights standards must be upheld. The governance system itself must have independent oversight mechanisms to monitor the activities of digital platforms within national jurisdictions.

We must also empower users by providing them with tools and resources to understand and manage their online experiences through media and information literacy. This includes features for content control, privacy protection, and mechanisms for reporting harmful content or behaviour.

Lastly, there must be ethical guidelines for digital platforms that prioritise transparency, fairness, and respect for human rights. These guidelines can serve as a foundation for responsible platform governance.