Defining Digital Literacy in the Age of Computational Propaganda and Hate Spin Politics

Like much of the rest of the world, Indonesia is facing a crisis of fake news and bot-network infiltration on social media, leading to rampant propaganda, widespread belief in disinformation, and effects on voters that are not yet fully understood but may run deep enough to alter election results. Salma (2019) describes this crisis and identifies the solution as critical digital literacy: essentially, educating people about the nature of fake news, the algorithmic gaming of social media platforms, and how to identify bot networks.

Salma consolidates the issue into two problems: computational propaganda and hate spin politics. She defines computational propaganda as “the use of algorithms, automation, and human curation to purposefully distribute misleading information over social media networks” (p. 328). This includes fake news created and spread on social media, bot networks driving attention to and changing conversation around particular issues, and the groups who organize these campaigns of disinformation. Her definition of computational propaganda encompasses much of the fake news crisis currently rattling the United States, as well as other countries.

The other primary issue she identifies is hate spin politics, which is less easily defined. She describes it as “exploit[ing] freedom in democracy by reinforcing group identities and attempt[ing] to manipulate the genuine emotional reactions of citizens as resources in collective actions whose goals are not pro-democracy” (p. 329). Hate spin politics seems to be the weaponization of identity politics and emotion in the digital political sphere, using religion, nationality, sexuality, and other identity markers to turn people against each other. It aims not only to segregate people based on their identities, but also to inspire people to self-select into identity groups, creating political warfare.

Computational propaganda and hate spin politics are carried out by several groups in Indonesia. Salma identifies Saracen and the Muslim Cyber Army as responsible for various fake news campaigns, and there have been notable suggestions of similar political interference from Russia (Sapiie & Anya, 2019). These tactics have proven successful on a large scale, with dire consequences in the case of Basuki Tjahaja Purnama, also known as Ahok, the politician who was imprisoned for blasphemy based largely on an edited video that went viral on social media.

Indonesian government officials are keenly aware of the problem computational propaganda presents and have taken significant steps to counter the spread of fake news. In 2018, they began weekly fake news briefings intended to address false stories that have gained traction (Handley, 2018). Salma suggests an increased focus on critical digital literacy, which extends beyond “evaluat[ing] online content or digital skills” to “understand[ing] the internet’s production, management and consumption processes, as well as its democratizing potential and its structural constraints” (p. 333). Essentially, critical digital literacy is to computer or technical literacy what reading comprehension is to basic literacy. It’s not enough for users to be able to use a computer and navigate the Internet; they need a solid understanding of what they’re seeing and why, including who might have produced the content and how it came to be presented to them.

Who could argue with that? Of course increased education about the creation and spread of fake news and algorithmic manipulation would be useful to nearly all Internet users, and it might help counter the spread and impact of computational propaganda. However, Salma offers no explanation of how digital literacy would counter hate spin, which seems to be a larger social issue that’s just as likely to occur offline as online. Hate spin politics also trades in emotional responses, meaning strictly logical literacy training might not be enough to equip people to grapple with emotional manipulation.

Paper:

Salma, A. N. (2019). Defining Digital Literacy in the Age of Computational Propaganda and Hate Spin Politics. KnE Social Sciences & Humanities, 2019, 323–338.

Additional Resources:

Sapiie, M. A., & Anya, A. (2019, February 4). Jokowi accuses Prabowo camp of enlisting foreign propaganda help. The Jakarta Post. Retrieved from https://www.thejakartapost.com/news/2019/02/04/jokowi-accuses-prabowo-camp-of-enlisting-foreign-propaganda-help.html

Handley, L. (2018, September 27). Indonesia’s government is to hold public fake news briefings every week. CNBC. Retrieved from https://www.cnbc.com/2018/09/27/indonesias-government-is-to-hold-public-fake-news-briefings-each-week.html

Aksi Bela Islam: Islamic Clicktivism and the New Authority of Religious Propaganda in the Millennial age in Indonesia

Ahyar and Alfitri (2019) examine the way social media has reshaped the landscape of propaganda, and how it’s being used to challenge dominant religious authorities. Propaganda used to be a tool wielded almost exclusively by government bodies or other massive organizations. Ahyar and Alfitri say, “In previous eras – especially in authoritarian regimes prior to the reformation era in Indonesia – the state was an authoritative source for social campaigning” (p. 14). The resources needed to create and effectively spread propaganda were simply too great for small groups or individuals to harness.

Social media has completely changed this; the Internet has effectively allowed nearly anyone to create and spread their own propaganda for their own purposes, with the potential for massive virality and impact. Governments no longer have a monopoly on spreading mass information (or disinformation). Ahyar and Alfitri explain that alternative groups have come to harness propaganda: “In the Reformation era in Indonesia, propaganda is also often done not only by the government, but also by social movements that echo multiple identities; be it a primordial identity of ethnicity, religion, political ideology and profession” (pp. 12–13).

They go on to explain how social media has also disrupted and revolutionized social movements and activism. Because movements can be planned and executed more easily, they need less hierarchical structure to form and continue. They say, “…Social movements appear massively outside the established political or institutional channels within a country. Social movement is closely related to a shared ideal and responds to a political power” (p. 9). Social movements need less planning, promotion, and organization to be successful. All they really need is a powerful motivating factor to spark mobilization. Propaganda can easily fill this role: “The pattern begins with an action of propaganda through the sails of technological devices, which is followed by supportive comments on the propaganda, and ends in mass mobilization for a real social movement for a purpose” (p. 4).

Although there is obvious good in breaking the government’s former monopoly on propaganda, and in tools like social media making organizing and protesting easier than ever, there’s also the possibility of increased disinformation, chaos, and abuse. Ahyar and Alfitri consider the example of Basuki Tjahaja Purnama (also called Ahok), the former Jakarta governor who was imprisoned for blasphemy after a misleadingly edited video of one of his speeches went viral, causing controversy among Islamic communities in Indonesia. The doctored video functioned as propaganda, perfectly matching Ahyar and Alfitri’s definition of propaganda as “attempts to shape influence, change, and control the attitudes and opinions of a person or group for a particular purpose or to be brought in a particular direction” (p. 11). That propaganda spread rapidly through social media, acting as the spark that mobilized thousands of people to take to the streets in protests that were easily and spontaneously planned with improved technology and communication. Ahok’s imprisonment serves as testimony to the power and changed nature of propaganda and social movements, and to the danger these powerful tools pose when they are used rapidly and with little opportunity for oversight, consideration, and fact-checking.

Paper:

Ahyar, M., & Alfitri. (2019). Aksi Bela Islam: Islamic clicktivism and the new authority of religious propaganda in the millennial age in Indonesia. Indonesian Journal of Islam and Muslim Societies, 9(1), 1–29.

Countering Terrorist Narratives: Winning the Hearts and Minds of Indonesian Millennials

Narratives are powerful because they’re easy to follow. Factual information and research might provide someone with all of the pieces, but a well-crafted narrative presents itself as an already completed puzzle. Anis (2018) discusses the narratives that terrorists and extremists use to recruit new members, and how those narratives can be shaped into convincing propaganda that is easily disseminated through social media, focusing primarily on the recruitment of young Indonesians and responses from the Indonesian government. Islamic extremist narratives give followers a consistent worldview, as well as a clearly defined role and purpose within that worldview. Once a follower has accepted extremist narratives, it’s difficult to counter them.

Islamic extremist groups build their narratives on social media the same way many users do: with consistent branding and plenty of selfies. Anis says, “Many of the selfie photos of young jihadists express their happiness. They smile and carry weapons. The jihadists use this strategy to give a picture that they are powerful and own many weapons” (p. 196). Again following fairly standard social media manipulation tactics, extremists can deceive followers. Anis continues, “They may only have a few weapons and ask the jihadists to take turns taking selfie photos carrying the gun” (p. 196). They also use catchphrases. Anis identifies the phrase “You Only Die Once,” or “YODO” (p. 193), a clear derivative of the popular hashtag #yolo.

Anis’s examples of jihadist recruiting, specifically her analysis of the film Jihad Selfie, reveal the targeted nature of these recruitment efforts. Extremists’ success doesn’t come from pouring money into Facebook advertisements; it comes from using social media to talk to vulnerable individuals. There seems to be more to gain from putting significant resources toward the small number of individuals who can be flipped than from mass recruitment tactics that will fall largely on deaf ears. Again, using social media for this kind of targeted persuasion isn’t exclusive to jihadist groups; Cambridge Analytica’s use of highly targeted advertising has caused outrage worldwide.

Indonesia has taken several steps to attempt to counter extremist propaganda online, largely in the form of websites offering counter-narratives and promoting peacefulness (p. 202). However, it’s unclear how effective this approach can be. Anis describes how jihadists’ use of social media makes them look “cool,” according to former recruits, because of their handling of weapons and the interactions their posts receive from Muslim women (p. 197). If the appeal of jihadist propaganda comes down to its cool factor, it’s difficult to imagine the government successfully creating something that will actually read as cool to young people.

The weakest point of Anis’s analysis is her failure to interrogate the term “lone wolf” terrorists. She points out, “Unlike in the past when a terrorist was defined as someone who completed a long process of training and indoctrination through a terrorist group, the lone wolf terrorists are not tied to any terrorist network and have gotten inspiration through the internet” (p. 195), yet fails to connect that this inspiration through the Internet often comes from interacting with content produced by terrorist networks.

Paper:

Anis, E. Z. (2018). Countering Terrorist Narratives: Winning the Hearts and Minds of Indonesian Millennials. KnE Social Sciences & Humanities, 2018, 189.

The Disinformation Order: Disruptive Communication and the Decline of Democratic Institutions

In this 2018 study published in the European Journal of Communication, W. Lance Bennett and Steven Livingston trace the roots of online political disinformation affecting democratic nations. They argue that declining confidence in democratic institutions makes citizens more inclined to believe false information and narratives, and to spread them more broadly on social media. They identify the radical right, enabled and supported by Russia, as the predominant source of disinformation, and cite examples in the UK Brexit campaign, disruptions affecting democracies in Europe, and the U.S. election of Donald Trump.

Many articles on social media and political communications provide differing definitions of disinformation, misinformation, and fake news. Bennett and Livingston offer their own provisional definition: “intentional falsehoods spread as news stories or simulated documentary formats to advance political goals” (p. 124). In addition to those who share disinformation originating on the Internet, they identify legacy media as an important factor in spreading it further. They say that when news organizations report on false claims and narratives, the effect is to amplify the disinformation. Even fact-checking can strengthen this amplifier effect, because the message is exposed and repeated to more people. As traditional news institutions are attacked as “fake news,” journalistic attempts to correct the record can be cited by propagandists and their supporters as proof of an elite conspiracy to hide the truth. The authors refer to this dynamic as the “disinformation-amplification-reverberation (DAR) cycle.”

It’s interesting that both the political left and right increasingly share contempt for neoliberal policies that benefit elites. But instead of coming together to address political and economic problems, they are being driven further apart by “strategic disinformation.” This hollowing out of the center produces a growing legitimacy crisis, and political processes that are increasingly superficial. The authors term this post-democracy: “(t)he breakdown of core processes of political representation, along with declining authority of institutions and public officials” (p. 127).

The authors identify Russia as the primary source of disinformation and disruptive hacking in an increasing number of Western democratic and semi-democratic nations: Germany, the UK, the Netherlands, Norway, Sweden, Austria, Hungary, Poland, Turkey, and most of the Balkans. They say Russia has learned to coordinate a variety of hybrid warfare tactics that reinforce one another’s impact, such as troll factories, hackers, bots, and the seeding of false information and narratives by state-owned media channels. As other researchers have argued, Bennett and Livingston say Russia’s disinformation activities are geostrategic, aimed at undermining NATO and the cohesiveness of democratic nations that oppose the expansion of Russia’s power.

In response to the scale of disinformation and disruptions in democratic institutions, Bennett and Livingston suggest comparative research on the characteristics of disinformation in different societies, so as to identify similarities and differences, and the identification of contextual factors that provide either fertile ground for or resistance to disinformation. They also recommend that the current operations of trolls, hackers, and bots should be more central to political communications studies.

Paper:

Bennett, W. L., & Livingston, S. (2018). The Disinformation Order: Disruptive Communication and the Decline of Democratic Institutions. European Journal of Communication, 33(2), 122–139. https://doi.org/10.1177/0267323118760317