Erik Olsson

Filter Bubbles and Ideological Segregation Online: Do We Need Regulation of Search Engines?

We search for information every day using Google and other search engines. The results we get influence what we believe about the world, including on sensitive topics such as climate change and immigration. A popular view is that search engines can lead to so-called filter bubbles (Pariser, 2011): since search engines base their results in part on the user's previous searches (“personalization”), there is a risk that the results confirm the user's prior beliefs. Filter bubbles are, moreover, thought to be detrimental to the individual and to society at large (e.g. Sunstein, 2001), and to threaten democracy to such an extent that state regulation is needed (Simpson, 2012). Google is already required by EU law to remove, upon request, search results that compromise an individual’s reputation (“Right to be Forgotten”, C-131/12, 2014). The policy issue at stake here is whether personalization and filter bubbles call for further regulation. Our specific research questions are: (a) Does personalization in search engines actually lead to filter bubbles, and how do we find out? (b) Given that filter bubbles exist, now or in the future, to what extent are they detrimental, individually or socially? Our first aim is to answer (a) using a novel empirical methodology developed in earlier work. Our second aim is to answer (b) through critical philosophical analysis of relevant arguments and debates in the literature, in relation to our previous studies.
Final report
The purpose of the project was (a) to investigate whether personalized search algorithms in, for example, Google lead to filter bubbles and what methods can be used to investigate this, and (b) to investigate whether such filter bubbles, to the extent that they exist, are negative for the individual and/or society. A filter bubble here refers to a situation where information is tailored to, and potentially reinforces, the user's pre-existing political views, which has been linked to societal polarization. The main practical question is whether regulation of search engines is needed to limit personalization based on search history, as advocated by the British researcher Thomas Simpson. (The numbers in parentheses below refer to the project’s publication list.)

The research area is expansive, and relevant empirical and theoretical studies were published at a rapid pace during the project. The early research in the project mainly concerned question (a), in particular whether personalization on the basis of search history leads to filter bubbles on political themes, a question that the project has answered in the negative. This mid-term result, together with developments in the research field, justified some modification of the research questions for the second half of the project, in order to make optimal use of the resources available.

The focus in the later stages of the project has been on other factors that can cause filter bubbles, in particular user-driven factors that make filter bubbles self-imposed. We have also investigated whether there is a political bias in Google itself. As meta-analyses ("systematic reviews") have become increasingly important in the field, one was completed within the project (8). In addition to these questions, we have addressed a number of other important issues related to our research questions, such as how visible Google makes scientific articles that have been retracted due to scientific misconduct (6). A manuscript addressing question (b) is in preparation. That question is also addressed in our meta-analysis, in which 230 scientific works on filter bubbles are coded with regard to whether they propose a remedy for filter bubbles. Apart from the research leader Erik J. Olsson, professor of theoretical philosophy at Lund University, Emmanuel Genot worked as a postdoctoral researcher in the project, a role partly taken over in the latter part of the project by Guy Madison, professor of psychology at Umeå University. Axel Ekström and Melina Tsapos worked in the project as research assistants.

Question (a) has been answered in the negative: in the case of political topics, the search results obtained by different users searching with the same terms do not depend significantly on their search histories. This conclusion is supported both by a study of our own (9) and by similar studies compiled in our meta-analysis (8). It follows that there is currently no need for legal regulation of search engines to limit the impact of search history on political issues. One reason may be the lack of commercial incentives to personalize such search results, together with the reputational risks to the brand involved.

The main findings in relation to the research frontier are:

(i) Users tend to pay attention to and select search results that are in line with their political views (2). This effect was slightly stronger in people who identified as politically conservative. The results were published in 2022 after peer review in the international journal Computers in Human Behavior Reports, with an impact factor of 4.1. The article has already received international attention and is cited in a review article in Nature and in the article "Filter bubble" in English Wikipedia (https://en.wikipedia.org/wiki/Filter_bubble).

(ii) Users tend to select specific search terms on political issues that are in line with their own views, which in turn produce search results that are also in line with those views (3). This potentially creates a political filter bubble. As this process is likely to be independent of the commercial search engine used (but see section 4), the effect, as in (i), is essentially a user-generated filter bubble. This study was published in 2022 after peer review in the international journal Information, Communication and Society, with an impact factor of 4.2.

(iii) There is a liberal bias in Swedish Google search results on topics related to migration (7). However, pro-migration as well as generally politicized search results tend to come further down the search result list. The study also found that the location of the search (e.g. Rosengård in Malmö versus Svedala) does not affect the search results. The study confirms results from similar studies conducted in the US.

Studies (i) and (ii) are the first of their kind and indicate two ways in which political filter bubbles are likely to arise as an effect of essentially cognitive mechanisms in the user. These two mechanisms may interact in ways that amplify the effect. Study (iii) indicates that search engines themselves may also contribute to filter bubbles for certain groups of users; however, this is not an effect of search history but of a political bias in the search results themselves. The article reporting study (iii) is under review at an international journal.

The research raises a number of interesting questions, which we plan to investigate in our future research. For example, the question arises whether the filter bubble effect indicated by (ii) is independent of the commercial search engine used. This question could be answered by replicating the study, or parts of it, with search engines other than Google. Regarding (iii), the question arises as to the cause of the liberal bias indicated by the study. For example, is it due to Google itself, or rather to a general pro-migration tendency online that is then reflected in Google's search results? It would therefore be interesting to replicate (iii) on other search engines as well. If the effect persists, the second explanation seems more plausible.

The research leader has disseminated the research as a speaker at a number of seminars and conferences, including: Jackman Humanities Institute, University of Toronto, Canada; Center for Information Technology, Princeton University, USA; Bled Epistemology Conference, Bled, Slovenia; Faculty Seminar, Nanyang Technological University, Singapore; and the workshop Knowledge, Reasoning, and Polarization, Copenhagen. Collaborators have disseminated the research in talks and proceedings at the national SweCog conference (1, 4). The research has also been disseminated at an international workshop in Lund organized by the principal investigator and funded by the project, and has been presented on several occasions at the Pufendorf Institute in Lund as part of a research group on the theme of political polarization on the Internet (5). A mid-term interview with the research leader was published on the Riksbankens jubileumsfond website (https://www.rj.se/Nyheter/2021/filterbubblor--vi-skapar-dem-sjalva/). Before the lecture at Princeton, a competitor of Google contacted the research leader for permission to record the event. For reasons of objectivity, the principal investigator chose not to pursue further contact with the company.
Grant administrator
Lunds universitet
Reference number
P18-0656:1
Amount
SEK 3,787,000.00
Funding
RJ Projects
Subject
Philosophy
Year
2018