People, Not Google’s Algorithm, Create Their Own Partisan ‘Bubbles’ Online

It’s easy to blame algorithms—the invisible sets of rules that shape online landscapes, from social media to search engines—for cordoning us off into digital “filter bubbles” by feeding us content that reinforces our preexisting worldview.

Algorithms are always biased: Studies have shown that Facebook ads target particular racial and gender demographics. Dating apps select for matches based on a user’s previous swipe history. And search engines prioritize links based on what they deem most relevant. But according to new research, not every algorithm drives political polarization.

A study published today in Nature found that Google’s search engine does not return disproportionately partisan results. Instead politically polarized Google users tend to silo themselves by clicking on links to partisan news sites. These findings suggest that, at least when it comes to Google searches, it may be easier for people to escape online echo chambers than previously thought—but only if they choose to do so.

Algorithms pervade nearly every aspect of our online existence—and are capable of shaping the way we look at the world around us. “They do have some impact on how we consume information and therefore how we form opinions,” says Katherine Ognyanova, a communications researcher at Rutgers University and co-author of the new research.

But how much these programs drive political polarization can sometimes be difficult to quantify. An algorithm might look at “who you are, where you are, what kind of device you’re searching from, geography, language,” Ognyanova says. “But we don’t really know exactly how the algorithm works. It is a black box.”

Most studies analyzing algorithm-driven political polarization have focused on social media platforms such as Twitter and Facebook rather than search engines. That’s because, until recently, it’s been easier for researchers to obtain usable data from social media sites with their public-facing software interfaces. “For search engines, there is no such tool,” says Daniel Trielli, an incoming assistant professor of media and democracy at the University of Maryland, who was not involved with the study.

But Ognyanova and her co-authors found a way around this problem. Rather than relying on anonymized public data, they sent volunteers a browser extension that logged all of their Google search results—and the links they followed from those pages—over the course of several months. The extension acted like backyard camera traps that photograph animals—in this case, it provided snapshots of everything populating each participant’s online landscape.
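The paper does not reproduce the extension’s code, but the core logging task is easy to picture. Here is a minimal, hypothetical sketch in TypeScript of a WebExtension content script that records the links a results page serves and the link a participant clicks; the CSS selector, message format and background-script relay are illustrative assumptions, not the researchers’ implementation.

```ts
// content-script.ts: a hypothetical sketch, not the study's actual code.
// Assumes a WebExtension manifest that injects this script on Google
// search result pages; the selector below is an assumption and would
// break whenever Google changes its markup.

function collectResultLinks(): string[] {
  // Gather the outbound links currently shown in the results container.
  return Array.from(
    document.querySelectorAll<HTMLAnchorElement>("#search a[href^='http']")
  ).map((a) => a.href);
}

// One snapshot per results page: when it loaded, what was searched,
// and which links the engine served.
const snapshot = {
  timestamp: Date.now(),
  query: new URLSearchParams(window.location.search).get("q"),
  links: collectResultLinks(),
};

// Report the snapshot, then report any result link the participant follows.
// A background script (not shown) would batch and upload these records
// with the participant's consent.
chrome.runtime.sendMessage({ kind: "pageview", ...snapshot });

document.addEventListener("click", (event) => {
  const anchor = (event.target as HTMLElement).closest("a");
  if (anchor && anchor.href.startsWith("http")) {
    chrome.runtime.sendMessage({ kind: "click", url: anchor.href, ...snapshot });
  }
});
```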

The researchers collected data from hundreds of Google users over the three months leading up to the 2018 U.S. midterm election and the nine months before the 2020 U.S. presidential election. Then they analyzed what they had gathered in relation to participants’ age and self-reported political orientation, ranked on a scale of one to seven, from strong Democrat to strong Republican. Yotam Shmargad, a computational social scientist at the University of Arizona, who was not a member of the research team, calls the approach “groundbreaking” for melding real-world behavioral data on participants’ search activity with survey information about their political leanings.
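To make that melding concrete, here is a hypothetical sketch of the comparison at the heart of the study: correlating the one-to-seven survey score with the partisan slant of the links participants clicked versus the links Google served them. The data shape, slant scores and helper function are invented for illustration; the paper’s actual statistical analysis is more involved.

```ts
// analysis.ts: a hypothetical illustration, not the study's methodology.
// In miniature, it asks whether the 1-to-7 partisanship score tracks what
// users click more strongly than what Google shows them.

interface Participant {
  partisanship: number;    // survey response: 1 (strong Democrat) to 7 (strong Republican)
  meanClickSlant: number;  // average slant of links the user followed (e.g., -1 left to +1 right)
  meanResultSlant: number; // average slant of links Google returned to them
}

// Plain Pearson correlation coefficient.
function pearson(xs: number[], ys: number[]): number {
  const n = xs.length;
  const mx = xs.reduce((s, v) => s + v, 0) / n;
  const my = ys.reduce((s, v) => s + v, 0) / n;
  let num = 0;
  let dx = 0;
  let dy = 0;
  for (let i = 0; i < n; i++) {
    num += (xs[i] - mx) * (ys[i] - my);
    dx += (xs[i] - mx) ** 2;
    dy += (ys[i] - my) ** 2;
  }
  return num / Math.sqrt(dx * dy);
}

// The study's headline pattern would appear as a strong first number
// (partisans click partisan links) and a weak second one (the engine
// serves roughly the same pages to everyone).
function summarize(data: Participant[]): void {
  const p = data.map((d) => d.partisanship);
  console.log("partisanship vs. clicked links:", pearson(p, data.map((d) => d.meanClickSlant)));
  console.log("partisanship vs. served links :", pearson(p, data.map((d) => d.meanResultSlant)));
}
```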

Field data of this type are also extremely valuable from a policymaking perspective, says University of Pennsylvania cybersecurity researcher Homa Hosseinmardi, who also did not participate in the research. In order to ensure that search engine giants such as Google—which sees more than 8.5 billion queries each day—operate with people’s best interest in mind, it’s not enough to know how an algorithm works. “You need to see how people are using the algorithm,” Hosseinmardi says.

While many lawmakers are currently pushing for huge tech companies to release their anonymized user data publicly, some researchers worry that this will incentivize platforms to release misleading, skewed or incomplete information. One notable instance was when Meta hired a team of scientists to investigate the platform’s relationship to democracy and political polarization and then failed to provide half of the data it promised to share. “I think it makes a lot more sense to go straight to the user,” says Ronald Robertson, a network scientist at Stanford University and lead author of the new study.

Ultimately, the team found that a quick Google search did not serve users a selection of news stories based on their political leanings. “Google doesn’t do that much personalization in general,” Robertson says. “And if personalization is low, then maybe the algorithm isn’t really changing the page all that much.” Instead strongly partisan users were more likely to click on partisan links that fit with their preexisting worldview.

This doesn’t mean that Google’s algorithm is faultless. The researchers noticed that unreliable or downright misleading news sources still popped up in the results, regardless of whether or not users interacted with them. “There’s also other contexts where Google has done pretty problematic stuff,” Robertson says, including dramatically underrepresenting women of color in its image search results.

In an email statement, a spokesperson for Google said that the company “appreciate[s] the researchers’ work in the new study” and that it tries to keep its algorithm both “relevant and reliable.” The search function, the statement said, is not designed to infer sensitive information—race, religion or political affiliation—when ranking its results.

Shmargad points out that the study’s data aren’t entirely bias-free if you break them down to a more granular level. “It doesn’t appear like there’s much algorithmic bias happening across party lines,” he says, “but there might be some algorithmic bias happening across age groups.”

Users aged 65 and older were served more right-leaning links in their Google search results than other age groups were, regardless of their political identity. Because the effect was slight and the oldest group made up only about one fifth of the total participants, however, its impact on the study’s overall results disappeared in the macrolevel analysis.

Still, the findings reflect a growing body of research suggesting that the role of algorithms in creating political bubbles might be overstated. “I’m not against blaming platforms,” Trielli says. “But it’s kind of disconcerting to know that it’s not just about making sure that platforms behave well.” Our personal motivations to filter what we read to fit our political biases remain strong.

“We also want to be divided,” Trielli adds.

The silver lining, Ognyanova says, is that “this study shows that it is not that difficult for people to escape their [ideological] bubble.” That may be so. But first they have to want out.

ABOUT THE AUTHOR(S)

    Joanna Thompson is an insect enthusiast and former Scientific American intern. She is based in New York City. Follow Thompson on Twitter @jojofoshosho0
