Google Suggest is hurting lawmakers

Search Engines Are a Thorn in Congress’ Side

By Kate Tummarello
Roll Call Staff
Jan. 9, 2012

Congressional staffers control the content on Members’ websites. They control Members’ Facebook and Twitter accounts. They can even manage Internet search results by buying ads and using search engine optimization techniques.

But Hill staffers can’t control what people wonder about their bosses.

The latest trend in helpful Web search technologies is quietly causing headaches for Members of Congress and those who manage their reputations. Search engines such as Google now offer suggested search terms that appear in a drop-down menu as users begin typing.

Those search terms, based partly on what other users are searching for, often serve up all kinds of negative associations about Members of Congress — from keeping gaffes alive to raising sexual questions — and there’s not much politicians can do about it.

Look no further than Rep. Alcee Hastings, who was impeached by the House in 1988 and removed from his job as a U.S. district judge in Florida after being charged with bribery and perjury. Though he was later cleared of charges and is now a respected voice on human rights issues, someone typing his name into Google today might think the Florida Democrat had been impeached again.

A search for “Alcee Hastings” brings up an autocomplete drop-down menu with a few search suggestions based on what Google thinks users might be looking for. Some of the terms that come up are “impeachment,” “trial” and “bribery.”

Autocomplete also won’t forget Rep. Jean Schmidt’s past. A Google search for the Ohio Republican is likely to bring up the terms “ethics,” “corrupt” and “Armenian genocide,” referring to an ethics investigation in July 2011. A voter looking only for her website or contact information might search instead for details about the investigation after being prompted by Google. A spokesman from Schmidt’s office said the staff does not monitor what Google autocomplete suggests.

Then there’s GOP Sen. Scott Brown (Mass.), who can’t escape his decades-old nude photo shoot for Cosmopolitan magazine. Speaker John Boehner (R-Ohio) has a penchant for crying, and Google reminds anyone looking for him that he has publicly shed many tears.

For Rep. Hank Johnson, autocomplete usually suggests “Guam” and “Guam capsize,” referring to the Georgia Democrat’s 2010 gaffe questioning whether a large population on Guam would cause the island to “tip over.” When users look for Rep. Emanuel Cleaver, Google will suggest “spit,” reminding them that the Missouri Democrat said he was spit on by a tea party protester at a rally in 2010. And anyone looking for Rep. Joe Walsh (R-Ill.) will be reminded that he made headlines when his ex-wife accused him of owing unpaid child support.

In the insta-world of politics, those suggestions matter.

Patrick Hynes, president of online communications agency Hynes Communications, said something as fleeting as autocomplete can leave a lasting image and a potentially negative one at that. “Split-second associations with just a small assortment of words is sometimes enough to brand you as something that you may not be or something that you try to get past,” he said.

“It’s certainly something that every politician and every organization that’s concerned about its reputation needs to focus on,” Hynes said, calling the search suggestions “yet another area where their reputation is at stake.”

Hynes explained that politicians tend to focus on search engine optimization, which lets individuals and organizations reshape their search results by presenting positive information on websites in ways that more easily draw the attention of search engine algorithms. The goal is to push favorable content higher on the list of results, where it is seen first, while negative content slides farther down the list.

Politicians also frequently purchase advertising through Google’s AdWords so that their preferred message appears when certain terms are searched. For instance, former presidential candidate Herman Cain famously bought ads linking to a website run by his campaign that would appear when people searched for terms including “Sharon Bialek,” the name of a woman who accused him of sexual harassment, and “Herman Cain scandal.” President Barack Obama’s campaign and the Democratic National Committee have used similar tactics to direct surfers to positive sites when they search for negative terms.

One of the difficulties in monitoring and potentially managing autocomplete is that Google provides different suggestions to different people. As a part of Google’s ongoing attempt to personalize its search tool, autocomplete provides suggested terms based on a number of factors, including the location of the user and the previous searches from that Google account, according to industry experts.

(To ensure the searches Roll Call did for this story would be a good indication of what most users would see while searching with the same term, the searches were conducted after clearing location information, Google account information and browsing history from the computer used.)

Rob Ousbey, vice president of operations at search engine optimization firm Distilled, explained that one important factor in autocomplete’s algorithm is the search terms being entered by others.

“The more people search for those things, the more likely they are to be suggested,” he said. Those suggestions are “a genuine reflection of what people have been interested in in the past,” he said.

“These searches are algorithmically determined based on a number of purely objective factors (including popularity of search terms) without human intervention. All of the queries shown in autocomplete have been typed previously by other Google users,” a Google spokeswoman said in an email. “The autocomplete dataset is updated frequently to offer fresh and rising search queries.”
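
Taken together, those descriptions amount to a popularity-ranked prefix index. The Python sketch below is only an illustration of that idea (the names and search counts are invented, and it ignores personalization signals such as location and account history), showing how a term like “impeachment” can rise to the top of the menu simply because more people have typed it:

```python
from collections import Counter

class SuggestionIndex:
    """Toy autocomplete index: completions for a prefix are ranked purely by
    how often other users have issued that full query (not Google's code)."""

    def __init__(self):
        self.query_counts = Counter()

    def record_search(self, query: str) -> None:
        """Log one user search; popularity is the only ranking signal here."""
        self.query_counts[query.lower()] += 1

    def suggest(self, prefix: str, limit: int = 4) -> list[str]:
        """Return the most-searched queries that begin with the prefix."""
        prefix = prefix.lower()
        matches = [(q, n) for q, n in self.query_counts.items() if q.startswith(prefix)]
        matches.sort(key=lambda item: item[1], reverse=True)
        return [q for q, _ in matches[:limit]]


if __name__ == "__main__":
    index = SuggestionIndex()
    # Simulated traffic: the negative query is simply searched more often.
    for _ in range(500):
        index.record_search("alcee hastings impeachment")
    for _ in range(200):
        index.record_search("alcee hastings contact")
    for _ in range(150):
        index.record_search("alcee hastings florida")
    print(index.suggest("alcee hastings"))
    # ['alcee hastings impeachment', 'alcee hastings contact', 'alcee hastings florida']
```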

In one highly sensitive example, people frequently search a Member’s name alongside the terms “gay,” “girlfriend” or “married,” presumably to determine whether the Congressman is gay.

Autocomplete suggests “gay” for at least nine men in Congress.

Although few people try to manipulate it, the system can be gamed. One industry insider known for doing just that is Brent D. Payne, a search engine optimization expert.

“There are a lot of problems with the Google suggest tool,” Payne said. Autocomplete can create a vicious cycle that keeps even misinformation from disappearing. If Google autocomplete suggests a certain search term — such as “Alcee Hastings impeachment” — people are more likely to search it; the more people search a certain term, the more likely Google autocomplete is to suggest it.
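
That cycle is easy to see in a toy model. The short simulation below uses invented query names and numbers (the real ranking involves many more signals): one query starts with a small lead, earns a fixed bonus each round simply for being the current top suggestion, and the gap only widens.

```python
import random

def simulate_feedback_loop(rounds: int = 5, seed: int = 0) -> None:
    """Toy model of the suggestion/search feedback loop (all numbers invented).

    Whichever query is currently most popular is treated as the one autocomplete
    shows, and being shown earns it extra searches the next round.
    """
    rng = random.Random(seed)
    counts = {"alcee hastings impeachment": 500, "alcee hastings contact": 480}
    for round_num in range(1, rounds + 1):
        suggested = max(counts, key=counts.get)   # query currently being suggested
        for query in counts:
            counts[query] += rng.randint(5, 15)   # baseline organic interest
            if query == suggested:
                counts[query] += 25               # extra searches from being shown
        print(round_num, suggested, counts)

simulate_feedback_loop()
```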

“The human being tends to click on the negative suggestion, even if it’s not the first thing on the list,” Payne said. “Humans are drawn to that negativity.”

Google policy prohibits autocomplete from suggesting search terms “related to pornography, violence, hate speech, and copyright infringement” (type in your favorite four-letter word and you’ll likely see no suggestions at all, beyond any obscene terms you’ve searched in the past). Still, Payne criticized Google for having no official channel to protest or remove autocomplete’s organic suggestions.

“There’s no system inside of Google to remove this type of negative information about a person or company,” he said.

So Payne found a way to fix the problem himself. With the help of Amazon Mechanical Turk, a website that connects people seeking quick, simple online tasks with those willing to pay to have such tasks completed, Payne altered autocomplete so that, at the time, users who typed “Brent P” would see “Brent Payne manipulated this” in autocomplete’s list of suggestions.

“You have to have enough searches to make up for the natural search suggestions,” he said. Payne paid small amounts to other Amazon Mechanical Turk users to continually search the term he wanted autocomplete to suggest.

“It’s not Google-approved, that’s for sure,” he said. “You need to be careful not to do it too fast, or Google sees it as spamming. ... They can’t catch you if you do it slowly.”
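
The arithmetic behind having “enough searches to make up for the natural search suggestions” is simple to sketch. The toy example below uses invented query names and counts; in Payne’s account, the extra searches came from many Mechanical Turk workers spread out over time rather than a single loop.

```python
from collections import Counter

# Toy illustration (names and numbers invented): keep injecting searches for a
# target phrase until it outranks the organic queries for the same prefix.
query_counts = Counter({
    "brent payne seo": 300,       # organic searches by real users
    "brent payne chicago": 250,   # organic searches by real users
})

def top_suggestion(prefix: str) -> str:
    """Most-searched query starting with the prefix (popularity only)."""
    matches = [(q, n) for q, n in query_counts.items() if q.startswith(prefix)]
    return max(matches, key=lambda item: item[1])[0]

target = "brent payne manipulated this"
paid_searches = 0
while top_suggestion("brent p") != target:
    # Each iteration stands in for one paid search; in practice these were
    # spread across workers and over time rather than fired off in a loop.
    query_counts[target] += 1
    paid_searches += 1

print(top_suggestion("brent p"))                 # brent payne manipulated this
print(f"Paid searches needed: {paid_searches}")  # 301
```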

Though he is chastised within the industry for the manipulation, Payne thinks addressing the underlying flaws in Google’s algorithm is worth it: “Real people are being harmed by this.”

KateTummarello@cqrollcall.com | @ktummarello

http://www.rollcall.com/issues/57_78....html?ref=corg