People, Not Google’s Algorithm, Create Their Own Partisan ‘Bubbles’ Online

From Thanksgiving dinner conversations to pop culture debates, it can often feel as if people with different political ideologies occupy entirely different worlds, especially online. People often blame algorithms, the invisible sets of rules that shape the online environment from social media to search engines, for walling us off into digital “filter bubbles” by feeding us content that reinforces our existing worldview.

Algorithms are undeniably biased: studies show that Facebook ads target particular racial and gender demographics, dating apps select matches based on users’ previous swipe histories, and search engines rank links according to what they deem most relevant. But not every algorithm fosters political polarization, according to new research.

In a study published today in Nature, researchers report that Google’s search engine does not return disproportionately partisan results. Instead, politically polarized Google users tend to wall themselves off by clicking links to partisan news sites. The findings suggest that, at least where Google searches are concerned, it may be easier for people to escape online echo chambers than previously thought, but only if they choose to.

Algorithms pervade nearly every aspect of our online lives and can shape the way we see the world around us. “They have an impact on how we consume information and, in turn, how we form opinions,” says Katherine Ognyanova, a communications researcher at Rutgers University and a co-author of the new study.

It can be difficult, however, to quantify the extent to which these programs promote political polarization. Algorithms can take into account “who you are, where you are, what device you’re searching from, your geography, your language,” Ognyanova says. “But we don’t know exactly how the algorithm works. It’s a black box.”

Most research analyzing algorithmic political polarization has focused on social media platforms such as Twitter and Facebook rather than on search engines. That is because, until recently, it was far easier for researchers to obtain usable data from social media sites through their public-facing software interfaces. “Search engines don’t have tools like that,” says Daniel Trielli, an incoming assistant professor of media and democracy at the University of Maryland, who wasn’t involved in the study.

Ognyanova and her co-authors found a way around this problem. Rather than relying on anonymized public data, they had volunteers install a browser extension that recorded all of their Google search results, and the links they followed from those results, over a period of several months. The extension acted like a backyard camera trap photographing passing animals; in this case, it captured snapshots of everything in each participant’s online search environment.
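Purely as an illustration of how that kind of instrumentation can work, here is a minimal, hypothetical sketch of a browser-extension content script in TypeScript. The CSS selector, message names, and hand-off to a background script are assumptions made for the example, not details of the study’s actual extension.

```typescript
// Hypothetical WebExtension content script that runs on a search results page.
// It records the query and the result links shown, then reports any click on
// one of those links. Selectors and message names are illustrative assumptions.

interface SearchSnapshot {
  query: string;
  timestamp: number;
  resultUrls: string[];
}

function captureResults(): SearchSnapshot {
  // Assumption: organic results are anchors inside a "#search" container.
  // A real extension would need to track the page's actual markup over time.
  const anchors = document.querySelectorAll<HTMLAnchorElement>("#search a[href^='http']");
  const query = new URLSearchParams(window.location.search).get("q") ?? "";
  return {
    query,
    timestamp: Date.now(),
    resultUrls: Array.from(anchors, (a) => a.href),
  };
}

// Send the snapshot of the results page to the extension's background script.
const snapshot = captureResults();
chrome.runtime.sendMessage({ type: "search-results", snapshot });

// Record follow-up clicks on links that appeared in the captured results.
document.addEventListener("click", (event) => {
  const target = event.target as Element | null;
  const link = target?.closest("a");
  if (link && snapshot.resultUrls.includes(link.href)) {
    chrome.runtime.sendMessage({ type: "result-click", url: link.href });
  }
});
```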

The researchers collected data from hundreds of Google users over the three months leading up to the 2018 US midterm elections and the nine months leading up to the 2020 US presidential election. They then analyzed the content they had collected in relation to participants’ age and self-reported political orientation, rated on a scale of one to seven ranging from strong Democrat to strong Republican. Yotam Shmargad, a computational social scientist at the University of Arizona who was not a member of the research team, calls the approach “groundbreaking” for blending real-world behavioral data on participants’ search activity with survey information about their political leanings.

This kind of field data is also extremely valuable from a policy-making perspective, says Homa Hosseinmardi, a computational social scientist at the University of Pennsylvania who was also not involved in the study. To hold a search giant like Google, which handles more than 8.5 billion queries per day, accountable for operating with people’s best interests in mind, it is not enough to know how its algorithms work. “We also need to see how people are using the algorithm,” Hosseinmardi says.

Many lawmakers are now calling on tech giants to release de-identified user data to the public, but some researchers fear this could incentivize platforms to publish misleading, distorted or incomplete information. In one notable example, Meta enlisted a team of scientists to investigate the platform’s relationship to democracy and political polarization and then delivered only half of the data it had promised to share. “I think it makes a lot more sense to contact users directly,” says Ronald Robertson, a network scientist at Stanford University and lead author of the new study.

Ultimately, the team found that a quick Google search did not serve users a set of news stories tailored to their political leanings. “Google generally doesn’t do a lot of personalization,” Robertson says. “And if personalization is low, the algorithm might not actually change the page that much.” Instead, strongly partisan users were more likely to click on partisan links that fit their existing worldview.

This does not mean that Google’s algorithms are flawless. The researchers found that unreliable or downright misleading news sources still appeared in results, regardless of whether users interacted with them. Robertson also notes other contexts in which Google has done things that are quite problematic, such as significantly underrepresenting women of color in its image search results.

Google did not immediately respond to a request for comment on the new research.

Shmargad notes, however, that the data are not entirely free of algorithmic bias when broken down at a more granular level. “There doesn’t seem to be much algorithmic bias across partisanship, but there may be algorithmic bias across age groups,” he says.

Google users aged 65 and older were exposed to more right-leaning links in their search results than other age groups, regardless of their political identity. The effect was relatively modest, though, and because the oldest group made up only about one fifth of all participants, its greater exposure had little influence on the study’s overall results.

Still, the findings reflect a growing body of research suggesting that the role of algorithms in shaping political bubbles may be overstated. “I’m not against blaming platforms,” Trielli says. “But it’s a little disconcerting to learn that it’s not just about making sure platforms behave properly. Our personal motivation to filter what we read to fit our political biases remains strong.”

“We also want to be divided,” Trielli adds.

The silver lining, Ognyanova says, is that “this study shows that it is not that difficult for people to escape their [ideological] bubble. But first they need to want to get out.”
