Face-to-face to counter Facebook
Sakar Pudasaini, founder at Karkhana, explores innovation, technology, education, and their social consequences in this new monthly column for Nepali Times.
The recent release of the ‘Facebook Papers’ and public testimony by whistleblowers have drawn attention to the use of social media, specifically Facebook, as a tool of civic and political action.
The ensuing firestorm has drawn much-needed scrutiny and triggered a debate on how to regulate the social media giant. It has also made clear that the global governance of social media will deprioritise peripheral countries such as Nepal.
Both technological and human solutions for regulating social media are unlikely to take local needs into account. Thus, it is incumbent on domestic actors to develop ‘herd immunity’ through social consensus on how they use Facebook and other social media to advance their agendas and engage in contentious debate.
Documents from Facebook whistleblowers show that the company does not treat all countries equally. Tier 0 (higher-priority) countries such as the United States and India get strong content moderation that includes algorithmic monitoring and ‘war rooms’. Facebook develops technology to proactively identify misinformation, falsehood and inflammatory content in these high-priority markets.
These technological tools, built to understand local languages and local patterns of social media use, are supplemented by local moderators with contextual knowledge.
However, in Tier 3 countries like Nepal, Facebook appears to make no proactive effort to identify and moderate polarising content. Instead, it relies on user reporting to identify problematic content. In a Tier 0/1 country, content inciting violence could be proactively removed by algorithms or moderators. In a Tier 3 country, significant challenges would have to be overcome before such action is taken.
To begin with, enough local users would have to flag the content as a problem. Then, a moderator who is unfamiliar with the local context would have to make a decision to remove the content. Even if such a scenario comes to pass, it is unlikely to happen fast enough to prevent significant harm in rapidly-evolving situations.
It is therefore important that countries like Nepal develop mechanisms to fend for themselves. The domestic actors whose buy-in matters most for self-regulation are the political parties.
There are three reasons:
First, disinformation, untruths and polarising propaganda are most prevalent around elections, when emotions are elevated and the stakes are high. With political power and its benefits at stake, political partisans are more willing during these times to cross the line from political posturing to problematic, polarising messaging. This is particularly relevant to Nepal, where federal and local elections are due in early 2023.
Second, political parties have the power to amplify content. Political leaders have massive followings on social media, but even more importantly, parties are nationwide networks of people that can reshare and reinforce messaging. Recent research suggests that political influence on social media works differently from traditional advertising.
In traditional advertising, messaging is carried to a large number of people by a celebrity they know and trust, which is why Nepal’s MaHa Jodi seem to be selling us everything from cement to water tanks. This is thought to drive sales by positive association with the trusted figures. Political influence on social media, however, appears to be more effective when information is (re)shared within small circles of trust, rather than by a widely followed influencer.
Finally, political parties have the means and skills to craft effective messages. While it is easy to blame technology for causing the problems of disinformation, we have to remember that a human hand guides it. Political parties have skilled communicators with a deep understanding of the public's moods, desires and prejudices.
At their best, these people help advance democratic discourse by effectively representing the different streams of public sentiment in the national dialogue. However, this same skill in capturing public sentiment also allows them to construct rumours, disinformation and conspiracies that capture the public imagination and go viral.
It may seem fanciful to imagine that political parties, especially in the heat of the battle for power, can be relied on for self-regulation. Yet, there are many examples of this in Nepal. The most obvious one is the peace process that some have poetically described as leading from the ‘bullet to the ballot box’.
More encouragingly, political violence such as ‘booth capture’ has steadily declined, to the point where individual incidents are rare enough to be widely noted.
This suggests that consensus can be cultivated by helping political forces understand their own self-interest. Political violence diminishes a party's legitimacy to rule and increases the risk to its leaders and cadre. Thus, the momentary advantage of violence in one location is worth trading away to secure advantages in the exercise of national power.
Social media manipulation and polarisation through disinformation have similar, perhaps even more destructive, consequences. Around the world we see what political scientists McCoy and Somer call pernicious polarisation: an ‘us vs them’ mentality in which the drive to identify with one's political tribe outweighs other sociopolitical considerations, as in the politicisation of masking and vaccination.
Pernicious polarisation fragments social and political structures, making nations nearly impossible to govern, and it also creates national security risks. Highly fragmented and polarised societies, primed by their own political leaders to believe disinformation and polarising propaganda, are easy targets for manipulation by international powers.
It is tempting to imagine there are technological solutions to technologically enabled problems. The Facebook Papers, however, make clear that countries like Nepal must not depend on faraway technological wizards to find us solutions because we are not their priority. Technological solutions, even if they can be found, will take a long time before they can be customised for the local context. Instead, we must face Facebook ourselves.
While it is not possible to prevent every attempt to infect the social and political system through social media manipulation, a ‘herd immunity’ that protects us from the worst of it can be achieved by building a self-regulation consensus among the key actors, chief among them political parties, that have the skills and resources to mount large-scale manipulation campaigns.
Fortunately, this is possible in the most low-tech way: through face-to-face conversations over countless cups of chia and many plates of samosa.
Sakar Pudasaini, founder at Karkhana, explores innovation, technology, education, and their social consequences in Makeshift, a new monthly column in the Nepali Times. The name Makeshift gestures to the tenuous, uncertain nature of new ideas and initiatives, as well as to their potential to drive profound change.