According to a new study, it is you, not Google Search's algorithms, who constructs your own echo chamber. The researchers found that people actively sought out content matching their opinions rather than having it pushed on them by search engine algorithms, dispelling the myth that Google actively promotes a particular belief system.
Since the rapid rise of social media and of search engines that collect user data, there has been concern that algorithms are putting us in “echo chambers,” where technology only shows us content that confirms our political leanings. These echo chambers are said to downrank content that challenges our viewpoint and boost content that coincides with it, pushing each person toward the extreme ends of the spectrum.
But do we do this to ourselves, or are tech corporations really to blame? We have access to any information we want online, so are we simply looking for what we already find agreeable?
A team from Stanford University studied participants’ browsing patterns over the course of two US election cycles to determine whether echo chambers were driven by Google’s algorithm or by users’ own choices. The first wave, in 2018, comprised 262–333 people, while the second wave, in 2020, comprised 459–688 people. Each participant installed a custom browser extension that kept track of the URLs that Google displayed to them, the URLs they clicked on, and the URLs they engaged with elsewhere online.
Once the data was gathered, the researchers examined whether participants were being pushed partisan content or were actively seeking it out.
In both waves, the results showed that people engaged with more partisan URLs than Google showed them, leading the researchers to conclude that user choice, rather than Google Search, was driving the echo chambers. They also discovered that the material surfaced by Google searches was often of higher quality than the sources participants found independently, implying that the search engine may be doing more good than harm.
Even if the results may absolve Google of responsibility, not all echo chambers are necessarily user-generated. People discover sources from social media, news tabs, and many other places that could be politically biased, so the individual isn’t wholly at fault. And although it is certainly something we already suspected, it does seem likely that people look for content online that supports their existing opinions.