Don’t believe that! – Cofacts as a democratic technology to engage with online misinformation


Created by the Taiwanese tech community g0v, the platform Cofacts promises "to help separate misinformation from fact". It aims to do so in a manner that is transparent and easily accessible to end users and fact-checkers alike: everyone is invited to join the fact-checking effort, while a simple bot for the LINE chat app serves those who want messages checked. Can it deliver on this promise, and is fact-checking even the right avenue towards effectively combating the spread of misinformation in societies that have been described as "post-truth"?


Barbara Engelhart 

Barbara Engelhart is a graduate student at the University of Vienna and teacher at a technical high-school in Vienna. Her areas of interest include ethics of technology, collective responsibility and anthropology of violence. She is currently finishing her Masters thesis in the field of Linguistics. 


Introduction


You open the Signal app – three forwarded articles by family members: “Toilet paper shortage: Covid masks cause a shortage in paper tissues”, “AIDS blood in Thai can food”, “Taiwanese man now fighting for Ukraine’s foreign legion”. Can you tell the truth value of each of these headlines right away?


Forms of intentional and unintentional spread of misinformation have become an increasingly pressing issue in online environments in recent years. Private chat apps like WhatsApp and LINE are especially prone to the unregulated spread of information and the reinforcement of confirmation bias and echo chambers. This has led to a range of problematic developments: remember the storming of the U.S. Capitol, the intake of questionable COVID medication, or the renewed persecution of Rohingya refugees in India? All of these were either rooted in or fueled by online communication apps, yet had dire, for some even fatal, real-life consequences. Often, debunking these rumors and explaining the scientific or historical background does not do the job, as I have discovered during numerous discussions about Telegram messages my dad received and vigorously believed. If not even close relatives who are knowledgeable in the matters touched on by fake news can successfully change misbeliefs, this might be a wake-up call for us as a society to rethink our misinformation management strategies.


Picking up on this pressing issue, fact-checking organizations are emerging all over the world: the state-run “Factually” website in Singapore, the European Commission-funded project Pheme, and the Truth Teller app developed by the Washington Post are just some of the initiatives countering fake news. What they have in common is that content is regulated and assessed by specific parts of the population: governmental agents, journalists, academics, etc. The reasoning behind delegating the task of fact-checking to these groups is that they have professional training in reliable research. On the other hand, this practice may breed distrust among those excluded from the truth-finding process, thereby reproducing the very problem it sought to overcome.


The rise of a collaborative fact-checking database 


The issue of non-inclusive approaches to problem solving and governance was recognized as early as 2012 by the g0v movement, which formed in Taiwan out of a bi-weekly hackathon. One central figure of the movement from its very beginnings has been Audrey Tang. Her way into politics began with the occupation of Taiwan’s parliament in 2014 and the protesters’ unwillingness to leave before parliament agreed to more transparency. Even back then, her intention to include the public and counter rumors showed. Tang remembers: “I personally brought 350 meters of ethernet cables to make sure that the truth spreads faster than rumors, that we livestream what's happening in the Occupy Parliament to a large projector on the street where they can see, in real-time, what's being debated and what people said in the occupied parliament.” Ever since, the movement has expanded to include citizens from all walks of life who share the goal of rethinking the role of government from zero and enhancing government transparency.


One particularly popular application created by g0v is Cofacts. It can be described as a chatbot in an end-to-end encrypted system that provides details about the truth value of messages forwarded by users who are skeptical about the integrity of media content. “The message is then added to a database, fact-checked by [volunteer] editors, before returned to the user” (see fig. 1). This way, you would find out that only the last headline is based on facts, that you still don’t need to worry about toilet paper shortages, and that you can keep eating canned Thai food. Are you about to install the app on your grandparents’ phone already? Before you do, let’s consider the premises for the functionality of this new technology in more depth.

Figure 1: Cofacts’ system flow 
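To make this flow concrete, here is a minimal sketch of the lookup-or-queue logic described above, written in Python. All names here (`Database`, `handle_forwarded_message`, `submit_editor_reply`) are hypothetical illustrations of the process shown in figure 1, not Cofacts’ actual code or API.

```python
from dataclasses import dataclass, field

@dataclass
class Database:
    replies: dict = field(default_factory=dict)   # message text -> editor reply
    pending: list = field(default_factory=list)   # messages awaiting review

def handle_forwarded_message(db, message):
    """Return an existing fact-check reply, or queue the message for editors."""
    if message in db.replies:
        return db.replies[message]
    if message not in db.pending:
        db.pending.append(message)
    return "No reply yet - your message was added to the database for volunteer editors."

def submit_editor_reply(db, message, verdict, sources):
    """A volunteer editor records a verdict together with supporting sources."""
    db.replies[message] = f"{verdict} (sources: {', '.join(sources)})"
    if message in db.pending:
        db.pending.remove(message)
```

The point of the sketch is the division of labor it encodes: the bot only mediates between a skeptical user and a pool of volunteer editors, and every reply carries its sources with it.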


Why the practice of debunking won’t suffice 


By this point, most of us have probably gotten into one or another dispute with family or friends about the nature of the COVID virus – possibly even about its existence – about the effectiveness of and intentions behind vaccinations, or the appropriateness of measures to counteract the spread of the virus. Did any of these conversations end with the conclusion “Oh, after all, I might have been wrong. Thanks for enlightening me!”? Even though I’m not familiar with your social surroundings, I dare say that has rarely, if ever, been the case.

Assume now that you’d use a fact-checking bot to convince them otherwise – it is questionable whether the result would be any more likely to change beliefs on either side. Debunking, then, does not seem to be a matter of simply pointing out “you’re wrong” – neither in spoken nor in written digital form. In fact, studies have shown that using language that goes against a group’s ideology, or corrections coming from opposing ideological groups, can exacerbate societal polarization and make people hold onto false information (even when they understand that it is wrong) in order to express their group identity.


But won’t this just be the case for Cofacts as well? If simply showing people the right facts does not suffice, what is the role of Cofacts in assisting truth-finding processes? Which values arise and are promoted in the interactions between volunteer fact-checkers, and skeptical, truth-seeking citizens mediated by the online tool? And can the outcome of the collaboration of people from all walks of life ever be facts closer to the truth? These are some of the questions we’ll grapple with in the following section.


Collective responsibility for knowledge construction 


A good starting point to approach these questions is a comment of Audrey Tang on the central value of the g0v movement – citizen engagement: “If you involve people and trust them, they develop their own initiative and instead of asking why the program is not available in English or in another version, someone simply translates it. This is the spirit of co-creation, of a collaborative process of creation”, she explains in an interview. Let’s untangle her position to get a clearer picture of the values embedded in Cofacts.


One crucial component is the transfer of responsibility to citizens in the process of knowledge construction. As she describes, people no longer ask why nobody is taking care of problematic online content, but instead have the possibility to act themselves. In this process, they regain control over their belief formation (e.g. by fact-checking) – they retain what in academic debates is referred to as “epistemic agency”. This somewhat abstract notion can be described as the capacity to willfully guide your beliefs and to take responsibility for the changes you make within your belief system. Aristotle already identified being in control and possessing relevant knowledge as central features of epistemic agency, both of which are facilitated by Cofacts.


Returning to the example of your (grand)parents, this could mean them taking responsibility for their beliefs by forwarding a contentious message to the bot; on the side of a volunteer, potentially someone just like you, it means acting responsibly by supporting others in their belief-formation process. What becomes visible here is a potential gap in agency: it will be easier for certain parts of society, e.g. young people, to navigate the web – a skill necessary to become part of the fact-checker community – while others, e.g. elderly people, depend on the research skills and reliability of the volunteers. This dependence makes them vulnerable to being misled. From this perspective, Cofacts needs to put measures in place that minimize the vulnerability of its users and promote individual responsibility – for instance transparency, simplicity, and online literacy initiatives.


However, there is more to it than just your grandparents’ or your own individual responsibility. A Taiwanese citizen would probably not feel they had changed the information culture as a whole just by volunteering to check a few facts a week, given the amount of misinformation produced by troll farms, marketing departments and uncritical users. So how can Cofacts be understood as an effective tool?


Firstly, the notion of collective responsibility helps us think about the process in a more holistic way. Collective responsibility as defined by Hannah Arendt means taking responsibility “for things we have not done” – in this case, for misleading information. It means joining political organizations that provide the infrastructure to respond to a problem jointly. Participating in the g0v movement as a Cofacts volunteer seems to be a good starting point in this light. It shifts epistemic authority away from individuals towards a collective of knowledge constructors. Secondly, the structure of Cofacts is reminiscent of practices applied in responsible innovation processes, which aim at anticipating the long-term effects of new technologies. In this discourse, you, your grandparents, and whoever else is affected by the consequences of fake news would be classified as stakeholders. Each stakeholder is involved in (1) steering the innovation in a direction that is aligned as well as possible with the interests and needs of all parties involved and (2) collectively dealing with grand challenges – in this case, harmful false information.


Situatedness of knowledge 


The question remains how Cofacts ensures that collective knowledge construction by people from all walks of life develops in the right direction. Often, “factchecking is not as straightforward as it seems, mostly because it involves interpretative work […]. [Real] lies and fabrications are easy to spot and correct, and it is important that this happens. Most conspiracy theories are, however, a different kind of beast”, as the sociologist Harambam points out. Wouldn’t it, in the end, still be a better idea to limit this responsible task to experts and scholars with specialized knowledge, to ensure maximum objectivity and quality of information?


We might need to shift our understanding of facts to shed light on this question. Like the technology Cofacts itself, information needs to be considered in its context. As Max Weber noted, social sciences derive their knowledge from meaning-making practices, which is why no one can speak of finding the objective and real truth. Harambam similarly argues that “the only thing we can know is how people construct and attach meaning to that world. […] The question is therefore not how to stay neutral, but how to give shape to our situated position.” This perspective is also taken on by Latour, who argues that “facts were ‘networked’; they stood or fell not on the strength of their inherent veracity but on the strength of the institutions and practices that produced them and made them intelligible.” 

Social scientist Brian Martin goes even further and argues that social constructivism is not only a feature of knowledge but also desirable and necessary to avoid abuses of power. He notes that “society will be better off if more people are able and willing to openly question standard views. This holds true even if critics, by later judgment, turn out to be wrong. What is important is the process of open debate. When debate is inhibited or squashed, the potential for abuse of power is magnified enormously”. This view is shared by Audrey Tang, who opposes censorship “because otherwise [the digital ministry] would be suspected of moving in a direction reminiscent of the ‘White Terror’ – again.” Rather, she suggests resorting to humor to counter misinformation – even in the case of serious issues and hateful comments.


Danger of tapping into post-truth attitude 


So, next time your grandmother tells you that there is a conspiracy keeping the dangers of the COVID vaccine hidden from the public, should you just accept it as her subjective truth that emerged from her network of social interactions? Clearly, such a position runs the risk of tapping into an anything-goes relativism that can be instrumentalized by political groups for their self-interest. What Latour rather wanted to draw attention to is that “facts remain robust only when they are supported by a common culture, by institutions that can be trusted, by a more or less decent public life, by more or less reliable media.” In an online environment of alternative facts, what matters is not so much the veracity of a piece of information as making transparent “the conditions of its ‘construction’ — that is, who is making it, to whom it’s being addressed and from which institutions it emerges and is made visible”, as Latour concludes. The crucial factor in establishing knowledgeable societies that strive and remain open for belief adaptation seems to be the negotiation of facts and their interpretation.


This requirement is strongly reminiscent of the infrastructure provided by Cofacts: it allows for a better understanding of the context in which misinformation emerges – certainly for the volunteers researching the facts, and also for users, provided they familiarize themselves with the structures of the app and develop a habit of questioning messages received on LINE. A positive feature of Cofacts, then, is that it makes traceable and transparent how facts are produced, thereby giving insight into the socio-material networks that enact and uphold (scientific) knowledge. Secondly, the app does not treat facts as a “mirror of nature”, but rather as products of human (inter)action embedded in a wide network of processes. This makes the facts no less true, but open to inspection.
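Because Cofacts also publishes its database as open data (the cofacts/opendata repository on GitHub, listed in the references), anyone can inspect how verdicts were produced. The sketch below illustrates the kind of scrutiny this enables; the sample rows and field names are made up for illustration and do not reflect the dataset’s actual schema.

```python
from collections import Counter

# Hypothetical rows mimicking the shape of a fact-checking dump:
# each reply links a checked message to a verdict and a cited source.
replies = [
    {"article": "Toilet paper shortage", "type": "RUMOR", "reference": "https://example.org/a"},
    {"article": "AIDS blood in Thai can food", "type": "RUMOR", "reference": "https://example.org/b"},
    {"article": "Taiwanese man now fighting for Ukraine's foreign legion",
     "type": "NOT_RUMOR", "reference": "https://example.org/c"},
]

def verdict_distribution(rows):
    """Count how volunteer editors classified the checked messages."""
    return Counter(row["type"] for row in rows)

def sourced_ratio(rows):
    """Share of replies citing at least one reference - a simple transparency check."""
    cited = sum(1 for row in rows if row["reference"])
    return cited / len(rows)
```

Such simple aggregates are exactly the “conditions of construction” Latour asks for: anyone, not just the platform, can examine who checked what and on what evidential basis.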


Conclusion 


In conclusion, Cofacts should not be understood as a mere tool. A focus limited to the act of sending an article to the chatbot and receiving information about the veracity of its content does not do justice to its potential. Rather, we need to consider it as embedded in the structures it arose from, the structure of its functioning – i.e. diverse and broad stakeholder involvement – and the culture it promotes. What needs to be seen critically, though, is that Cofacts’ aspiration to citizen engagement is only realized to a certain extent, since requirements such as access to computers, computer literacy, time resources, and programming skills keep parts of the population from participating.


Moreover, the problem that complex phenomena or events cannot be reduced to a single checkable fact recedes when we assume that knowledge, especially about societal phenomena, is not objective and fundamentally given, but rather emergent and constructed.

Cofacts then becomes a platform in which democratic knowledge construction takes place. The mere act of engaging meaningfully in discourse creation is likely to result in enhanced control over one’s beliefs and to promote democratic structures, thereby counteracting in-group belief fossilization and the resulting polarization.

In the end, Audrey Tang notes, "democracy is a form of technology. And so is nonviolent communication. There are many different techniques to make sure we humans listen well to each other." Cofacts, then, is one such technique that might have the potential of being one cornerstone in the way out of our information crisis.


References 

Arendt, H. 1987. Collective Responsibility. In: Bernauer, S.J.J.W. (eds) Amor Mundi. Boston College Studies in Philosophy, vol 26. Springer, Dordrecht. https://doi.org/10.1007/978-94-009-3565-5_3

Borger, J., 2021. American carnage: how Trump's mob ran riot in the Capitol. [online] The Guardian. Available at: <https://www.theguardian.com/us-news/2021/jan/06/trump-capitol-american-carnage-washington> [Accessed 21 September 2022].

Cha, M., Gao, W., & Li, C.-T. 2020. Detecting fake news in social media: An Asia-Pacific perspective. Communications of the ACM, 63(4), 68-71.

Chang, H., Haider, S., & Ferrara, E. 2021. Digital Civic Participation and Misinformation during the 2020 Taiwanese Presidential Election. Media and Communication, 9(1), 147. https://doi.org/10.17645/mac.v9i1.3405

Cofacts. 2022. Retrieved 21 September 2022, from https://cofacts.tw

Everington, K., 2022. Taiwanese man now fighting for Ukraine's foreign legion. [online] Taiwan News. Available at: <https://www.taiwannews.com.tw/en/news/4518011> [Accessed 21 September 2022].

G0v.tw. 2019. g0v 台灣零時政府. [online] Available at: <https://g0v.tw/intl/en/manifesto/en/> [Accessed 21 September 2022].

Gero, G. 2021. Was bringt uns zusammen?. Alles Ist Drin, 9 - 11. Retrieved from https://cms.gruene.de/uploads/documents/MdG_Magazin_2021_1_Optimismus_web.pdf

Gerth, H. H., & Mills, C. W. (eds.) 1946. From Max Weber: Essays in Sociology. Oxford University Press.

GitHub - cofacts/opendata: Open data of Cofacts collaborative fact-checking database. 2022. Retrieved 21 September 2022, from https://github.com/cofacts/opendata

Gunn, H., & Lynch, M. P. 2021. The internet and epistemic agency. Applied Epistemology, 389.

Hagen, L., 2021. Ivermectin-Vergiftungen: Konsequenzen für Kickl. [online] DER STANDARD. Available at: <https://www.derstandard.at/story/2000131258432/ivermectin-vergiftungen-konsequenzen-fuer-kickl> [Accessed 21 September 2022].

Harambam. 2021. Against modernist illusions: why we need more democratic and constructivist alternatives to debunking conspiracy theories. Journal for Cultural Research, 25(1), 104–122. https://doi.org/10.1080/14797585.2021.1886424

Harris, T. and Raskin, A., 2022. Digital Democracy is Within Reach with Audrey Tang. [podcast] Your Undivided Attention. Available at: <https://www.humanetech.com/podcast/digital-democracy-is-within-reach-rerun> [Accessed 21 September 2022].

KuanHung, K., 2020. Cofacts, A collaborative fact-checking system with LINE. [online] Youtube.com. Available at: <https://www.youtube.com/watch?v=DgcI0N2HTMA> [Accessed 21 September 2022].

Mangnale, A., 2017. When Government Resorts to ‘Fake News’, Rohingya’s Refugees are Just Another Victim of their Propaganda | SabrangIndia. [online] SabrangIndia. Available at: <https://sabrangindia.in/article/when-government-resorts-fake-news-rohingyas-refugees-are-just-another-victim-their> [Accessed 21 September 2022].

Martin, B. 1996. Confronting the Experts. State University of New York Press. p. 7.

Wilner, T., 2014. Meet the robots that factcheck. [online] Columbia Journalism Review. Available at: <https://archives.cjr.org/currents/robot_factchecking.php> [Accessed 21 September 2022].

