A manipulated image of Michelle Obama, which drew headlines and controversy after it became the first result users saw in Google Image searches for the First Lady, was pulled down Wednesday — but not by Google. While the picture may have disappeared, questions about Google’s response to so-called Googlebombs remain for the company.
The image, which was originally displayed on a blog, was doctored to make Obama’s face appear ape-like, drawing massive criticism and demands that Google cease displaying it in its Image Search engine. Yet Google stands firm in its answer: It won’t play Web censor when it comes to search results.
A blog called “Hot Girls,” hosted by the Google-owned Blogger service, removed the picture and replaced it with an apology in Chinese. An accompanying (and literal) English-language translation seems to indicate remorse that Google’s indexing technologies ended up ranking the picture high on its results list: “I am very sorry for this article, and that this is the program automatically issued a document from the article. Do not the subject of race and politics make the discussion too radical and sincere hope that the world is very peaceful.”
Before the picture was removed from the blog, Google posted a link near the offending image to its explanation for why it doesn’t remove all offensive material. “Sometimes Google search results from the Internet can include disturbing content, even from innocuous queries. We assure you that the views expressed by such sites are not in any way endorsed by Google. Search engines are a reflection of the content and information that is available on the Internet,” read the disclaimer.
“Google views the integrity of our search results as an extremely important priority,” it stated. “Accordingly, we do not remove a page from our search results simply because its content is unpopular or because we receive complaints concerning it.”
The Google Image Policy
The Obama controversy harkens back to at least two similar instances of “Googlebombing” — gaming Google’s indexing system for search results. In 2004, search queries for the word “Jew” sent users to hate group Web sites. In the waning years of the Bush administration, typing in “miserable failure” resulted in Wikipedia entries for George W. Bush.
Incidents like these serve as an opportunity for Google to tweak its search technologies, and to remind users how the search process works, said company spokesperson Scott Rubin.
“We don’t manually edit search results,” Rubin told TechNewsWorld. “It’s algorithms. We do get requests from individuals or organizations to alter search results, and of course we reserve the right to consider everything on a case-by-case basis, but in general the search is through algorithms. We are working all the time to improve the algorithms.”
Google did not contact Hot Girls and ask that the image be taken down, Rubin said. There is a difference, he also stressed, between user-generated content hosted on Google services, which must meet one set of policy and standards guidelines, and what comes up in general search results. The Blogger pic, as offensive as it was, still didn’t violate that service’s guidelines. Child porn or other illegal material would, and Rubin said his company is quick to remove that kind of content.
Content that is critical of a religion might be allowed in search results, but attacking a person or group based on their membership in a religion, race or ethnic group would be a violation, Rubin said.
“We believe that more information is better than less, so that you have the information you need to make a good decision. That includes stuff like this or other things that might be offensive,” he explained. “That doesn’t mean we don’t have policies and standards.”
The Impact on Business
Playing Internet police can put Google on a slippery slope, but advertisers and other business partners may need more clarification of the guidelines, said Frost & Sullivan digital media analyst Mukul Krishna.
“Ultimately, if you’re making money off the Internet, you have some amount of responsibility,” he said. “You don’t have to be the police, but some responsibility is involved, and that can be seen in terms of flagging offensive content. You already have safe search, advanced search, features like that.”
Google may not be able to rely on just software fixes before long, but possible solutions will have to walk a thin line. “People are going to always find a way of going around algorithms. There has to be a comprehensive policy as to what is really offensive,” Krishna told TechNewsWorld. “You’re making money off of advertisers in all sorts of ways. You might lose some money if you ban some content, but you might lose more money if people think you’re not doing enough.”
What kinds of conversations might Google be having with advertisers and business partners regarding the issue? “What I can tell you is, what’s important to us is to return relevant results for search queries, so we’re always tweaking the algorithms,” Google’s Rubin said. “There are many, many factors that go into algorithms, and we’re always looking for ways to improve that for more relevant results — to help our users find what they’re looking for. We think that’s a better user experience and that advertisers would want to have eyeballs on ads that are more relevant as well.”