Western democracies are experiencing a growth of populism and authoritarianism, accompanied by a loss of confidence in traditional political institutions and by the increased importance of online social media, where citizens debate political issues. The use of social media does not necessarily lead to a democratic discourse among different but equal citizens. Instead, it appears to be associated with the emergence of filter bubbles in which people receive one-sided information, with an increase in hate speech, and with orchestrated disinformation campaigns designed to manipulate public opinion. The project addresses two related sources of these negative trends:
(1) the unintentional amplification of bias through automated systems, and (2) malicious social bots that are intentionally designed to introduce or amplify biases and to promote polarization. While social bots are usually considered a problem for public discourse, automated systems such as bots can also be an opportunity for the political discourse of a future society. U3B intends to explore the possibilities of building, testing, and evaluating technology that (1) helps to identify and expose malicious bot accounts, (2) supports public deliberation in online social networks (OSNs), and (3) does so in a fashion that is sensitive to machine-learned biases.
Observing the emergence and dissolution of filter bubbles requires an understanding of network structures, of the interaction patterns of users and bots, and of individual dispositions. This challenge can only be met through an interdisciplinary approach.
Funding / Grants
- Volkswagen Foundation (2019 - 2020)