Many people use social media as a primary information source, but the questionable reliability of these platforms has pushed them to contain misinformation via crowdsourced flagging systems. Such systems, however, assume that users are impartial arbiters of truth. This assumption might be unwarranted: users might be influenced by their own political biases and by their tolerance for opposing points of view, in addition to the truth value of a news item. In this paper we simulate a scenario in which users on one side of the polarity spectrum have different tolerance levels for the opinions of the other side. We create a model based on established assumptions about online news consumption, including echo chambers, selective exposure, and confirmation bias. A consequence of this model is that news sources on the side opposite the intolerant users attract more flags. We extend the base model in two ways: (i) by allowing news sources to find the path of least resistance that minimizes backlash, and (ii) by allowing users to change their tolerance level in response to a perceived lower tolerance from users on the other side of the spectrum. With these extensions, the model shows that intolerance is attractive: news sources are nudged to shift their polarity toward the side of the intolerant users. Such a model does not support high-tolerance regimes: these regimes are out of equilibrium and converge towards empirically supported low-tolerance states under the assumption of partisan but rational users.
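The flagging asymmetry of the base model can be illustrated with a minimal sketch. All parameter values below (population sizes, polarity values, tolerance thresholds, the flagging rule) are illustrative assumptions, not the paper's actual model: users hold a polarity and a tolerance, sources sit on a polarity spectrum, and a user flags an opposing source whose polarity exceeds the user's tolerance.

```python
# Minimal sketch of the flagging dynamic; all parameters are
# illustrative assumptions, not taken from the paper.
N_USERS = 1000
N_SOURCES = 20

# Half the users lean left (-1), half lean right (+1); here the
# right-leaning side is modelled as less tolerant of opposing content.
users = ([(-1, 0.8)] * (N_USERS // 2) +   # (polarity, tolerance)
         [(+1, 0.3)] * (N_USERS // 2))

# Sources spread evenly across the polarity spectrum [-1, 1].
sources = [-1 + 2 * i / (N_SOURCES - 1) for i in range(N_SOURCES)]

def flags_for(source_polarity):
    """Count users who flag a source: a user flags a source whose
    polarity is on the opposite side and more extreme than the
    user's tolerance allows."""
    count = 0
    for user_polarity, tolerance in users:
        opposed = user_polarity * source_polarity < 0
        if opposed and abs(source_polarity) > tolerance:
            count += 1
    return count

flag_counts = [flags_for(s) for s in sources]

# Sources on the side opposite the intolerant users attract more flags.
left_flags = sum(f for s, f in zip(sources, flag_counts) if s < 0)
right_flags = sum(f for s, f in zip(sources, flag_counts) if s > 0)
print(left_flags, right_flags)  # → 3500 1000
```

With these illustrative numbers, left-leaning sources (opposite the low-tolerance right-leaning users) collect 3.5 times as many flags as right-leaning sources, matching the qualitative asymmetry the abstract describes.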
Status: Published - 26 May 2022