Against Post-Truth

SEBBM shares here an article by Carlos Sabín, first published in Spanish on the SEBBM website.

Statistical physics, mathematics and machine learning algorithms are essential both to describe the new dynamics of information on social networks and to help correct their problems, such as the massive propagation of misinformation and fake news.

These days, mass culture is channelled mainly through the Internet and social media. Initially, this phenomenon was greeted with almost unanimous joy for its potential to democratize knowledge, but it was not long before serious problems appeared.

The degradation of information quality is perhaps the most important of these problems, to the point that in recent years we have been talking about a "post-truth" era. The term alludes to an environment in which facts matter less than emotions and beliefs when opinions are formed.

In this context, and recalling Umberto Eco’s “Apocalypse Postponed” essays, it is not difficult to recognize the “apocalyptic”, who oppose technological advances and novel forms of communication, and the “integrated”, who celebrate them uncritically.

We must also remember the solutions Eco proposed: to try to understand the new features of culture, to take advantage of the new possibilities, and to point out aspects that could be improved. This requires tools not just from the social sciences, but also from mathematics, statistical physics, computer science and artificial intelligence.

In fact, the problem has been on the radar of many scientists for several years, in relation to the phenomenon of "echo chambers": communities of users with strong ideological and emotional polarization that barely come into contact with the "outside world". This isolation reinforces their polarization and increases their exposure to disinformation.

A highly interdisciplinary group of scientists (physicists, mathematicians, statisticians and computer scientists) led by Walter Quattrociocchi at the IMT School for Advanced Studies (Institutions, Markets, Technologies) in Lucca, Italy, has obtained very interesting results in recent years on the dynamics of social contagion in various social networks, such as Facebook, Twitter and YouTube.

In one example, the researchers used machine learning algorithms to systematically analyze hundreds of thousands of English-language media posts on Facebook related to Brexit in the months leading up to the referendum. They detected two clearly differentiated communities, comprising roughly 40% and 60% of the individuals, which barely interacted with each other or commented on what was happening in the other community. These communities constitute "parallel universes" in which the same concept provokes completely different feelings (1).
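The study's exact pipeline is not reproduced here, but the underlying idea, splitting an interaction graph into internally dense, weakly connected communities, can be illustrated with a short, purely hypothetical sketch. The toy graph and the choice of networkx's greedy modularity algorithm are assumptions for illustration only, not taken from the paper:

```python
# Toy sketch: community detection on a small synthetic interaction graph.
# This is NOT the Brexit dataset nor the paper's method -- just an illustration
# of how two weakly connected "echo chambers" show up as graph communities.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Hypothetical users; an edge means two users repeatedly interacted
# (e.g. commented on or liked the same posts).
G = nx.Graph()
G.add_edges_from([
    ("a1", "a2"), ("a2", "a3"), ("a1", "a3"),   # dense cluster A
    ("b1", "b2"), ("b2", "b3"), ("b1", "b3"),   # dense cluster B
    ("a3", "b1"),                               # a single bridging edge
])

# Greedy modularity maximization groups nodes so that links are dense inside
# each group and sparse between groups -- the signature of an echo chamber.
for i, community in enumerate(greedy_modularity_communities(G)):
    print(f"community {i}: {sorted(community)}")
```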

In another study, conducted by Quattrociocchi's group for the newspaper Corriere della Sera before the 2016 referendum in Italy, the researchers confirmed the emergence of two distinct communities on Facebook, with approximately 70-80% of individuals in one and 20-30% in the other, and hardly any contact between them. The study showed that 20% of the interactions with news about the referendum were interactions with fake news, most of which circulated within the smaller community. Considering that Facebook is the main source of information for an increasing number of people (35% of Italians, for example), these results are especially worrying, as they suggest that a significant share of the population draws its information almost exclusively from fake news.

Traditionally, mathematical models of opinion dynamics in society aimed to explain the emergence of consensus, that is, a state of the system in which everyone holds a similar opinion. Faced with the phenomenon of echo chambers, however, we must look for mathematical models that describe the new reality, and that is precisely what this same group of researchers has done (2). In these new models there is not only "confirmation bias", that is, attraction to opinions that are sufficiently similar to our own, but also "polarization", that is, repulsion from discordant opinions. Models based on these two ingredients are able to predict the appearance of echo chambers. This is particularly important because the way social media currently works is based precisely on confirmation bias and polarization, since it favours interaction between individuals with similar opinions and feelings.
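As an illustration only (a minimal toy sketch, not the model of reference (2)), the interplay between the two ingredients can be simulated with agents holding opinions in [-1, 1] that attract each other when their opinions are close and repel each other when they are far apart. All parameter values below are arbitrary assumptions:

```python
# Toy opinion-dynamics sketch: confirmation bias (attraction) plus
# polarization (repulsion). Not the published model -- an illustration only.
import numpy as np

rng = np.random.default_rng(0)
N, STEPS = 200, 20000
EPSILON = 0.4   # tolerance: opinions closer than this attract each other
MU = 0.1        # strength of the attraction/repulsion update

opinions = rng.uniform(-1.0, 1.0, size=N)

for _ in range(STEPS):
    i, j = rng.integers(0, N, size=2)      # pick a random pair of agents
    d = opinions[j] - opinions[i]
    if abs(d) < EPSILON:                   # confirmation bias: move closer
        opinions[i] += MU * d
        opinions[j] -= MU * d
    else:                                  # polarization: move further apart
        opinions[i] -= MU * d
        opinions[j] += MU * d
    np.clip(opinions, -1.0, 1.0, out=opinions)

# With the repulsion branch active, opinions typically pile up near the two
# extremes instead of converging to a single consensus value.
print(np.histogram(opinions, bins=5, range=(-1, 1))[0])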

In addition to characterizing the phenomenon, scientific tools can also teach us how to combat it. For example, using machine learning algorithms that extract and analyze the semantic and structural content of a Twitter post, the success rate in detecting fake news reaches 91% (3). Until such tools are fully developed, individual responsibility is certainly the most effective remedy: not sharing information whose veracity we cannot guarantee. However, this is perhaps the hardest thing of all: lies spread further and faster than the truth, as a study published in Science in 2018 demonstrated.
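As a purely illustrative sketch (not the classifier of reference (3), which also exploits structural features of how posts spread), a basic text-only detector could be prototyped as follows; the training examples are made up for the demonstration:

```python
# Illustrative sketch: a minimal text classifier with scikit-learn.
# Toy, hand-made data -- nothing here reproduces the 91% result of ref. (3).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training set: 1 = questionable post, 0 = mainstream post.
texts = [
    "SHOCKING miracle cure they don't want you to know",
    "Scientists reveal secret plot behind the referendum",
    "Parliament approves new budget after lengthy debate",
    "Central bank keeps interest rates unchanged this quarter",
]
labels = [1, 1, 0, 0]

# TF-IDF turns each post into word/bigram frequencies; the logistic
# regression then learns which terms are associated with each label.
clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(texts, labels)

print(clf.predict(["Secret cure hidden from the public, share before deleted"]))
```

A real system would train on a large labelled corpus and, as in the cited work, combine such textual signals with structural ones (who shares the post, and in which community).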

Carlos Sabín
Institute of Fundamental Physics (IFF) of the Spanish National Research Council
(Consejo Superior de Investigaciones Científicas, CSIC)
[email protected]

References

  1. M. Del Vicario, F. Zollo, G. Caldarelli, A. Scala, W. Quattrociocchi. Mapping social dynamics on Facebook: the Brexit debate. Social Networks 50, 6 (2017). Open-access version: arXiv:1610.06809.
  2. M. Del Vicario, A. Scala, G. Caldarelli, H. Eugene Stanley, W. Quattrociocchi. Modeling confirmation bias and polarization. Scientific Reports 7, 40391 (2017).
  3. M. Del Vicario, W. Quattrociocchi, A. Scala, F. Zollo. Polarization and Fake News: Early Warning of Potential Misinformation Targets. ACM Transactions on the Web (TWEB) 13, 10 (2019). Open-access version: arXiv:1802.01400.

This is a translated version of an article first published on the website of the Spanish Society for Biochemistry and Molecular Biology (SEBBM): "Contra la posverdad", Carlos Sabín, March 2020, DOI: http://dx.doi.org/10.18567/sebbmdiv_RPC.2020.03.1


Top image of post: by Gerd Altmann from Pixabay  


Comment from Antonio Scala, over 3 years ago:

Regarding "lies spread further and faster than the truth", this does not seem to be the case in several studies: in our PNAS paper "The spreading of misinformation online" we observed that the maximum "reach" of a post is simply related to the echo chamber it belongs to, while in our latest Scientific Reports paper "The COVID-19 social media infodemic" we observe that the "infodemic speed" is the same for questionable and mainstream news.