Search engine manipulation effect


The search engine manipulation effect (SEME) is the change in consumer preferences, including voting preferences, that results from the manipulation of search results by search engine providers. SEME is one of the largest behavioral effects ever discovered. A 2015 study indicated that such manipulations could shift the voting preferences of undecided voters by 20 percent or more, and by up to 80 percent in some demographic groups.[1]

The study estimated that this could change the outcome of upwards of 25 percent of national elections worldwide.

On the other hand, Google denies secretly re-ranking search results to manipulate user sentiment, or tweaking rankings specifically for elections or political candidates.[2]


Scenarios

At least three scenarios offer the potential to shape or decide elections. The management of a search engine could pick a candidate and adjust search rankings accordingly. Alternatively, a rogue employee with sufficient authority and/or hacking skills could surreptitiously adjust the rankings. Finally, since rankings influence preferences even in the absence of overt manipulation, a candidate's ability to raise his or her ranking through traditional search engine optimization would influence voter preferences; simple notoriety could substantially increase support for a candidate.[1]

Experiments

Five experiments were conducted with more than 4,500 participants in two countries. The experiments were randomized (subjects were randomly assigned to groups), controlled (including groups with and without interventions), counterbalanced (critical details, such as names, were presented to half the participants in one order and to half in the opposite order) and double-blind (neither the subjects nor anyone who interacted with them knew the hypotheses or group assignments). The results were replicated four times.[1]
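This design can be illustrated with a minimal, hypothetical Python sketch; the condition labels, candidate names and helper function below are illustrative assumptions, not details taken from the published protocol:

    import random

    # Minimal sketch of a randomized, counterbalanced assignment scheme.
    # Condition labels and the name-order rule are illustrative assumptions.
    def assign_participants(participants):
        random.shuffle(participants)                  # randomized assignment
        conditions = ["favor_A", "favor_B", "neutral"]  # controlled: includes a no-bias group
        assignments = []
        for i, subject in enumerate(participants):
            condition = conditions[i % 3]
            # Counterbalanced: half of each condition sees Candidate A's
            # materials first, the other half sees Candidate B's first.
            first_shown = "A" if (i // 3) % 2 == 0 else "B"
            assignments.append({"subject": subject,
                                "condition": condition,
                                "first_shown": first_shown})
        return assignments

    # Double-blind: staff interacting with subjects would see only an opaque
    # session ID, never the condition label or the hypotheses.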

US

In experiments conducted in the United States, the proportion of people who favored any candidate rose by between 37 and 63 percent after a single search session.[1]

Participants were randomly assigned to one of three groups in which search rankings favored either Candidate A, Candidate B or neither candidate. Participants were given brief descriptions of each candidate and then asked how much they liked and trusted each candidate and whom they would vote for. Then they were allowed up to 15 minutes to conduct online research on the candidates using a manipulated search engine. Each group had access to the same 30 search results—each linking to real web pages from a past election. Only the ordering of the results differed in the three groups. People could click freely on any result or shift between any of five different results pages.[1]
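A hypothetical sketch of this manipulation, assuming each result carries a 'favors' label and a baseline 'relevance' score (both fields are assumptions made for illustration), might look like the following; every condition sees the same result set, and only the ordering differs:

    # Same 30 results in every condition; only the ordering differs.
    def rank_results(results, condition):
        if condition == "neutral":
            return sorted(results, key=lambda r: -r["relevance"])
        favored = "A" if condition == "favor_A" else "B"
        # Results favoring the chosen candidate sort first (False sorts
        # before True in Python); ties broken by baseline relevance.
        return sorted(results, key=lambda r: (r["favors"] != favored,
                                              -r["relevance"]))

    def paginate(ranked, per_page=6):
        # 30 results split into five pages of six, as described above.
        return [ranked[i:i + per_page] for i in range(0, len(ranked), per_page)]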

After searching, opinions on all measures shifted in the direction of the candidate favored by the rankings. Trust, liking and voting preferences all shifted predictably. Thirty-six percent of those who were unaware of the ranking bias shifted toward the highest-ranked candidate, as did 45 percent of those who were aware of the bias.[1]

Divorcees, Republicans and those who reported low familiarity with the candidates were among the most subject to the effect, while participants who were better informed, married or reported annual household income between $40,000 and $50,000 were harder to sway. Moderate Republicans were the most susceptible, increasing support for the favored candidate by 80 percent.[3]

Slightly reducing the bias on the first page of search results, specifically by placing one search item that favored the other candidate in the third or fourth position, masked the manipulation so that few or no subjects noticed the bias, while still triggering the preference change.[4]
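A sketch of this masking step, under the same assumed data shape as in the earlier example, follows; the function name and the zero-based slot index are illustrative:

    def mask_bias(biased_ranking, opposing_candidate, slot=2):
        # Move the first result favoring the opposing candidate into the
        # third position (index 2) or fourth position (index 3) of the
        # first page, leaving the rest of the biased ordering intact.
        ranking = list(biased_ranking)
        idx = next(i for i, r in enumerate(ranking)
                   if r["favors"] == opposing_candidate)
        ranking.insert(slot, ranking.pop(idx))
        return ranking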

On election day in 2010, Facebook sent 'go out and vote' reminders to more than 60 million of its users. The reminders caused about 340,000 people to vote who otherwise would not have. In another Facebook experiment, in 2014, 689,000 users were shown news feeds for one week that contained either an excess of positive terms, an excess of negative terms, or neither. Those in the first group subsequently used slightly more positive terms in their communications, while those in the second group used slightly more negative terms. Both experiments were conducted without the knowledge or consent of the participants.[4]

Later research suggested that, around the world, search rankings affect virtually all issues on which people are initially undecided. Search results that favor one point of view tip the opinions of those who are undecided on the issue. In another experiment, biased search results shifted people's opinions about the value of fracking by 33.9 percent.[4]

India

A second experiment involved 2,000 eligible, undecided voters throughout India during the 2014 Lok Sabha election. The subjects were familiar with the candidates and were being bombarded with campaign rhetoric. Search rankings could boost the proportion of people favoring any candidate by more than 20 percent overall, and by more than 60 percent in some demographic groups.[1]

United Kingdom

A UK experiment, conducted with nearly 4,000 people just before the 2015 national election, examined ways to prevent manipulation. Randomizing the rankings and including alerts that identified bias each had some suppressive effect.[1]
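The two countermeasures can be sketched as follows; the function names and the alert wording are illustrative assumptions, not details from the study:

    import random

    def randomize_ranking(results):
        # Suppress the rank signal by shuffling the result order.
        shuffled = list(results)
        random.shuffle(shuffled)
        return shuffled

    def with_bias_alert(results, favored_candidate):
        # Attach a notice identifying the apparent bias in the ranking.
        alert = ("Notice: these results appear to favor Candidate "
                 f"{favored_candidate}.")
        return alert, results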

European antitrust lawsuit

European regulators accused Google of manipulating its search engine results to favor its own services, even though competitive services would otherwise have ranked higher. As of August 2015, the complaint had not reached resolution, leaving the company facing a possible fine of up to $6 billion and tighter regulation that could limit its ability to compete in Europe. In November 2014 the European Parliament voted 384 to 174 for a symbolic proposal to break up the search giant into two pieces—its monolithic search engine and everything else.[5]

The case began in 2009 when Foundem, a British online shopping service, filed the first antitrust complaint against Google in Brussels. In 2007, Google had introduced a feature called Universal Search. A search for a particular city address, a stock quote, or a product price returned an answer from one of its own services, such as Google Maps or Google Finance, saving the user work. Later tools such as OneBox supplied answers to specific queries in a box at the top of search results. Google also integrated profile pages, contact information and customer reviews from Google Plus; that information appeared above links to other websites that offered more comprehensive data, such as Yelp or TripAdvisor.[5]

Google executives Larry Page and Marissa Mayer, among others, privately advocated for favoring Google’s own services, even if its algorithms deemed that information less relevant or useful.[5]

Google acknowledges adjusting its algorithm 600 times a year, but does not disclose the substance of its changes.[1]

2016 United States presidential election

In April 2015, Hillary Clinton hired Stephanie Hannon from Google to be her chief technology officer. Also in 2015, Eric Schmidt, chairman of Google's holding company, started a company, The Groundwork, for the specific purpose of electing Clinton. Julian Assange, founder of WikiLeaks, called Google her 'secret weapon'. Researchers estimated that Google could help Clinton win the nomination and could deliver between 2.6 and 10.4 million general election votes to her via SEME. No evidence documents any such effort; since search results are ephemeral, evidence could only come via a Google whistleblower or an external hacker.[4]

References

  1. Epstein, Robert; Robertson, Ronald E. (2015). "The search engine manipulation effect (SEME) and its possible impact on the outcomes of elections". Proceedings of the National Academy of Sciences 112 (33): E4512–E4521.
  2. Citation unavailable.
  3. Citation unavailable.
  4. Epstein, Robert (2016). "The new mind control". Aeon.
  5. Citation unavailable.
