from Sputnik News:
In this exclusive report, distinguished research psychologist Robert Epstein explains the new study and reviews evidence that Google’s search suggestions are biased in favor of Hillary Clinton. He estimates that biased search suggestions might be able to shift as many as 3 million votes in the upcoming presidential election in the US.
Biased search rankings can swing votes and alter opinions, and a new study shows that Google’s autocomplete can too.
A scientific study I published last year showed that search rankings favoring one candidate can quickly convince undecided voters to vote for that candidate — as many as 80 percent of voters in some demographic groups. My latest research shows that a search engine could also shift votes and change opinions with another powerful tool: autocomplete.
Because of recent claims that Google has been deliberately tinkering with search suggestions to make Hillary Clinton look good, this is probably a good time both to examine those claims and to look at my new research. As you will see, there is some cause for concern here.
In June of this year, SourceFed released a video claiming that Google’s search suggestions — often called “autocomplete” suggestions — were biased in favor of Mrs. Clinton. The video quickly went viral: the full 7-minute version has now been viewed more than a million times on YouTube, and an abridged 3-minute version has been viewed more than 25 million times on Facebook.
The video’s narrator, Matt Lieberman, showed screenshot after screenshot that appeared to demonstrate that searching for just about anything related to Mrs. Clinton generated positive suggestions only. This occurred even though Bing and Yahoo searches produced both positive and negative suggestions, and even though Google Trends data showed that searches on Google that characterize Mrs. Clinton negatively are quite common — far more common in some cases than the search terms Google was suggesting. Lieberman also showed that autocomplete did offer negative suggestions for Bernie Sanders and Donald Trump.
“The intention is clear,” said Lieberman. “Google is burying potential searches for terms that could have hurt Hillary Clinton in the primary elections over the past several months by manipulating recommendations on their site.”
Google responded to the SourceFed video in an email to the Washington Times, denying everything. According to the company’s spokesperson, “Google Autocomplete does not favor any candidate or cause.” The company explained away the apparently damning findings by saying that “Our Autocomplete algorithm will not show a predicted query that is offensive or disparaging when displayed in conjunction with a person’s name.”
Since then, my associates and I at the American Institute for Behavioral Research and Technology (AIBRT) — a nonprofit, nonpartisan organization based in the San Diego area — have been systematically investigating Lieberman’s claims. What we have learned has generally supported those claims, but we have also learned something new — something quite disturbing — about the power of Google’s search suggestions to alter what people search for.
Lieberman insisted that Google’s search suggestions were biased, but he never explained why Google would introduce such bias. Our new research suggests why — and also why Google’s lists of search suggestions are typically much shorter than the lists Bing and Yahoo show us.
Our investigation is ongoing, but here is what we have learned so far:
Bias in Clinton’s Favor
To test Lieberman’s claim that Google’s search suggestions are biased in Mrs. Clinton’s favor, my associates and I have been looking at the suggestions Google shows us in response to hundreds of different election-related search terms. To minimize the possibility that those suggestions were customized for us as individuals (based on the massive personal profiles Google has assembled for virtually all Americans), we have conducted our searches through proxy servers — even through the Tor network — thus making it difficult for Google to identify us. We also cleared the fingerprints Google leaves on computers (cache and cookies) fairly obsessively.
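The kind of query the study describes can be sketched in code. The snippet below is a minimal illustration only, not AIBRT’s actual tooling: it assumes Google’s public suggestion endpoint at `suggestqueries.google.com` (which, with `client=firefox`, is believed to return plain JSON of the form `["query", ["suggestion 1", "suggestion 2", …]]`), and the proxy address shown is a hypothetical placeholder standing in for the proxy-server or Tor routing the researchers mention.

```python
# Minimal sketch of fetching autocomplete suggestions through a proxy.
# The endpoint, parameters, and proxy address are assumptions for
# illustration -- this is not the study's actual instrumentation.
import json
import urllib.parse
import urllib.request

SUGGEST_URL = "https://suggestqueries.google.com/complete/search"

def build_url(query: str) -> str:
    """Build the suggestion-endpoint URL for one search term."""
    params = urllib.parse.urlencode({"client": "firefox", "q": query})
    return f"{SUGGEST_URL}?{params}"

def parse_suggestions(body: str) -> list:
    """Extract the suggestion list from the assumed JSON reply format:
    ["query", ["suggestion 1", "suggestion 2", ...]]"""
    payload = json.loads(body)
    return payload[1]

def fetch_suggestions(query: str, proxy: str = None) -> list:
    """Fetch suggestions, optionally through an HTTP proxy.

    Routing through Tor, as the researchers describe, would instead
    need a SOCKS-capable opener; the address below is a placeholder.
    """
    handlers = []
    if proxy:  # e.g. "http://127.0.0.1:8118" (hypothetical local proxy)
        handlers.append(urllib.request.ProxyHandler({"http": proxy,
                                                     "https": proxy}))
    opener = urllib.request.build_opener(*handlers)
    with opener.open(build_url(query), timeout=10) as resp:
        return parse_suggestions(resp.read().decode("utf-8"))

# Offline example of the reply format assumed above:
sample = '["hillary clinton", ["hillary clinton age", "hillary clinton news"]]'
print(parse_suggestions(sample))  # -> ['hillary clinton age', 'hillary clinton news']
```

A fresh opener per request, combined with clearing cookies and cache (here there are none to clear, since no cookie jar is attached), approximates the fingerprint hygiene the paragraph above describes.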
Google says its search bar is programmed to avoid suggesting searches that portray people in a negative light. As far as we can tell, this claim is false.
Generally speaking, we are finding that Lieberman was right: It is somewhat difficult to get the Google search bar to suggest negative searches related to Mrs. Clinton or to make any Clinton-related suggestions when one types a negative search term. Bing and Yahoo, on the other hand, often show a number of negative suggestions in response to the same search terms. Bing and Yahoo seem to be showing us what people are actually searching for; Google is showing us something else — but what, and for what purpose?
As for Google Trends, as Lieberman reported, Google indeed withholds negative search terms for Mrs. Clinton even when such terms show high popularity in Trends. We have also found that Google often suggests positive search terms for Mrs. Clinton even when such terms are nearly invisible in Trends. The widely held belief, reinforced by Google’s own documentation, that Google’s search suggestions are based on “what other people are searching for” seems to be untrue in many instances.