Big Tech companies such as Google and Facebook have enormous power to shift people’s opinions and voting preferences, without users knowing, by manipulating the content that appears on the home pages users see on their screens, said Dr. Robert Epstein, a senior research psychologist at the American Institute for Behavioral Research and Technology in California.
“I calculated months ago, that if all the Silicon Valley companies, the most powerful two being Google and Facebook, if they’re all pushing in the same direction, that could easily shift in this election 15 million votes, which means they, in effect, decide who the next president is going to be,” Epstein said in an interview on Epoch Times’ “American Thought Leaders” program.
Epstein has conducted a large monitoring project “to determine what the big tech companies were showing people in the days leading up to the 2016, 2018, and now the 2020 election.”
The project team recruited “a diverse group of 733 registered voters, Republicans, Democrats, and independents,” from “three very critical battleground states: Arizona, Florida, and North Carolina.”
Those voters, whom Epstein calls field agents, were equipped with special software that tracked their activity on the internet, “for example, doing searches on Google, Bing, and Yahoo,” Epstein explained.
The software gave the project team the ability to see, with the agents’ permission, all election-related activities the field agents performed on the internet, as if watching the agents’ screens, Epstein said. “We are looking over their shoulders, using software,” he explained.
The purpose of the monitoring was to capture ephemeral content such as search results, reminders on the Google or Facebook homepage, search suggestions, newsfeeds, and YouTube video sequences. This fleeting content can influence users when it appears, but it later disappears without leaving a trace, Epstein explained.
Epstein has collected and preserved more than half a million ephemeral messages from the Google and Facebook home pages of his field agents that otherwise would have been lost forever.
“And we have indeed found evidence of bias and we’ve also found what some people might want to call a smoking gun,” Epstein said.
“We found that during the week of Oct. 26, that’s quite close to the election, only our liberal field agents were getting vote reminders on Google’s home page,” Epstein said.
Among those who identified themselves as conservative, “not a single person saw that reminder on the homepage,” he added.
On Oct. 29, Epstein made his findings public.
That same night, “starting at midnight, on Oct. 29, just days before the election, all of our field agents began to receive that vote reminder on Google’s homepage. And that continued until the very end of Election Day on Nov. 3,” Epstein said.
Epstein explained the logic behind such tactics: “If you’re supporting one candidate, of course, you want to mobilize the base ... to get those voters off of their sofas if they haven’t yet voted by mail.” “Secondly, you want to discourage supporters of the candidate you oppose from voting, so you want to keep those people home,” he said.
To influence people who are still undecided, “you’re going to apply the most pressure ... to try to nudge those undecided voters in one direction or the other,” Epstein said, adding that “normally in a close election, those people decide who wins.”
Google’s “home page is seen in the United States 500 million times a day. If that kind of reminder was being used systematically over a period of time, it affected more than who voted on Election Day; it affected who sent in mail-in votes, it affected who registered to vote,” Epstein said.
YouTube, which is a part of Google, can also significantly impact people’s opinions by suggesting more videos to watch, Epstein said, adding that “of the videos that people watch on YouTube around the world 70 [percent] are suggested by YouTube’s up next algorithm.”
Google did not immediately respond to a request for comment.