Google’s personalized search results are isolating people in bubbles of their own political preferences, making it harder for voters to make informed decisions on contentious issues, according to a study by DuckDuckGo, the company behind the privacy-oriented search engine of the same name.
The company had 87 volunteers across the country conduct the same series of searches on Google within about one hour. As expected, nearly all were shown significantly different results.
Bubble Polarization
The volunteers searched for terms related to politically contentious issues, such as “gun control,” “immigration,” and “vaccination.”

“The filter bubble is particularly pernicious when searching for political topics,” the report stated, adding, “Undecided and inquisitive voters turn to search engines to conduct basic research on candidates and issues in the critical time when they are forming their opinions on them.”
The bubble thus worsens polarization in society, according to DuckDuckGo founder and CEO Gabriel Weinberg.
“You’re getting a viewpoint that you’re already more likely to agree with that’s pushing you toward your preexisting beliefs so you don’t really consider the other candidate or other side of the issue as much as you should in your research mode,” he said in a phone interview.
Google Power
The impact of filter bubbles on “political outcomes in aggregate” can be “significant,” the study stated.

Google’s search rankings can shift the voting preferences of undecided voters by 20 percent or more, and by up to 80 percent in some demographic groups, according to research by Robert Epstein, a senior research psychologist at the American Institute for Behavioral Research and Technology. Google representatives have said the company doesn’t agree with Epstein’s research methodology.
Excluding mobile searches, 76 study participants were shown 62 different sets of results when they searched for “gun control” while logged out and browsing in privacy mode, the study found.
The top two or three results were usually the same for all the DuckDuckGo study participants, meaning most people would end up clicking on the same links.
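The study’s core measurement is simple to state: for one query, count how many distinct ordered result lists the participants saw. A minimal sketch of that comparison, using made-up data rather than the study’s, might look like this:

```python
# Sketch of the study's measurement: how many distinct ordered result
# lists did a group of participants see for the same query?
# Participant names and result links below are illustrative only.
results = {
    "participant_1": ["site-a", "site-b", "site-c"],
    "participant_2": ["site-a", "site-b", "site-d"],
    "participant_3": ["site-a", "site-b", "site-c"],
}

# Order matters, so compare tuples rather than sets of links.
distinct_sets = {tuple(links) for links in results.values()}
print(len(distinct_sets))  # 2 distinct result lists among 3 participants
```

Note that even with two distinct lists, the top two links are identical for everyone, which mirrors the study’s observation that variation often appears below the first few positions.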
Area Bubble
Google denies that it personalizes search results based on a user’s Google account search history when the browser is in privacy mode.

From Google’s response to the study, Weinberg gathered “that location is largely driving” the differences.
That suggests the bubble may not be individual but shared by a group of people accessing the web from the same area.
“If that’s the case, then you would be consistently showing different types of links to different zip codes, to different locations, and that would create a persistent filter bubble effect,” Weinberg said.
Shadow Profiles
With all he’s seen, Weinberg suspects that Google goes beyond creating area-based bubbles. The location information used by Google and other web services stems from data such as the IP address and “browser fingerprints,” which include the browser type and version, the device’s operating system, time zone, language, and various browser and device settings.

Tracking users based on their IP address and fingerprints, even when they apparently don’t wish to be tracked, has been dubbed “shadow profiling.” It can be hard to detect because it doesn’t leave a trace on the user’s device, according to the EFF.
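To illustrate why this kind of tracking leaves nothing on the device, here is a minimal sketch (the field names are hypothetical; real fingerprinting systems draw on many more signals) of how a server could derive a stable identifier from passively observed request attributes alone:

```python
# Sketch: deriving a server-side "shadow profile" key from request
# data alone, with no cookie or account login involved.
# All attribute choices here are illustrative assumptions.
import hashlib

def fingerprint(ip: str, user_agent: str, timezone: str,
                language: str, screen: str) -> str:
    """Combine passively observable request attributes into a stable ID."""
    raw = "|".join([ip, user_agent, timezone, language, screen])
    return hashlib.sha256(raw.encode()).hexdigest()[:16]

# Two "private mode" visits from the same device and network yield the
# same key, because none of these inputs is cleared by privacy mode.
a = fingerprint("203.0.113.7", "Mozilla/5.0 (X11; Linux x86_64)",
                "America/New_York", "en-US", "1920x1080")
b = fingerprint("203.0.113.7", "Mozilla/5.0 (X11; Linux x86_64)",
                "America/New_York", "en-US", "1920x1080")
print(a == b)  # True: the same device maps to the same profile key
```

The point of the sketch is that everything hashed here is sent with, or inferable from, an ordinary web request, so the resulting profile lives entirely server-side.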
Shadow profiling may be a way for Google to maintain deniability over search personalization. If Google personalizes search results based on a profile built from the IP address and browser fingerprints, and keeps that profile separate from the user’s Google account, the company may try to claim the search results are not really “personalized” because they’re not tied to a specific person, Weinberg speculated.
“You could semantically call that ‘not personal,’ ” he said.