Can an automated Google feature that ostensibly helps users with a search be a basis for libel? Courts in Germany, Italy and Hong Kong have had to field that question.
Google’s position is that there is no human intervention, and that its algorithm is based merely on what others have searched for, or strings of words in indexed pages.
Autocomplete “predictions are possible search terms, not statements by other people or Google about the terms, and not the answer to a search,” the company claims.
Google’s Autocomplete algorithm, which is a fundamental part of Google’s search engine, “automatically detects and excludes a small set of search terms for things like pornography, violence, hate speech, illegal and dangerous things, and terms that are frequently used to find content that violates copyrights,” according to Google.
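Google does not publish its implementation, but the behavior it describes — predictions drawn from what others have searched for, minus a small set of automatically excluded terms — can be sketched in a few lines. Everything below is illustrative: the blocklist, the query data and the ranking rule are assumptions, not Google's actual system.

```python
from collections import Counter

# Hypothetical policy list standing in for Google's excluded categories.
BLOCKED_TERMS = {"fraud", "con man"}

def autocomplete(prefix, past_queries, max_results=4):
    """Return the most frequent past queries starting with `prefix`,
    skipping any query that contains a blocked term."""
    counts = Counter(q.lower() for q in past_queries)
    matches = [
        (q, n) for q, n in counts.items()
        if q.startswith(prefix.lower())
        and not any(term in q for term in BLOCKED_TERMS)
    ]
    # Rank by how often prior users searched the query.
    matches.sort(key=lambda qn: -qn[1])
    return [q for q, _ in matches[:max_results]]

queries = ["john doe fraud", "john doe seminars",
           "john doe seminars", "john doe books"]
print(autocomplete("john doe", queries))
# -> ['john doe seminars', 'john doe books']
```

The sketch also shows why the libel cases arise: with no blocklist entry, "john doe fraud" would surface automatically whenever enough prior users had searched it, with no human deciding to publish it.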
Nonetheless, at least three courts apparently have been persuaded that libel can result from Autocomplete.
Case No. 1: Italy, 2011 – Google Required to Filter Search Suggestions
A court in Milan, Italy, on March 31, 2011, ordered Google to filter libelous Autocomplete search suggestions, ZDNet reported.
“The facts are simple and very well described in the order,” wrote Carlo Piana, the lawyer who represented the anonymous plaintiff, in a blog post.
“Basically, typing in the Google search field “Name Surname” of my client, the autocompletion and the “suggested searches” (now “related searches”) offered to complete it with “con man” (“truffatore”) and “fraud” (“truffa”), which caused a lot of trouble to the client, who has a public image both as an entrepeneur [sic] and provider of educational services in the field of personal finance,” Piana explained.
Following the Milan court ruling, Google said it was disappointed and issued this statement, quoted in the ZDNet report: “We believe that Google should not be held liable for terms that appear in autocomplete as these are predicted by computer algorithms based on searches from previous users, not by Google itself.”
This was not Google’s first court problem in Italy. In 2010, three Google executives were convicted in absentia in Italy for allowing the posting of a YouTube video that showed a disabled teenager being bullied. That situation captured many headlines. The conviction stood, even though Google took the video down within 24 hours of its posting after receiving two complaints.
Case No. 2: Germany, 2013 – Court Orders Google to Modify Autocomplete
Germany’s Federal Court of Justice, in Karlsruhe, upheld a 2010 complaint from the founder (identified as “R.S.”) of a company that sold nutritional supplements, AP reported. The search predictions from Google’s Autocomplete included “Scientology” and “fraud,” according to the AP report.
R.S. claimed his reputation was tarnished because he was not connected to Scientology, and the search results made it appear as if he were accused of fraud. The court ordered Google to respect R.S.’ request to remove Autocomplete entries that were “defamatory.”
Even though Google was not ordered to turn off the Autocomplete function, the ruling imposed a new duty on Google to somehow monitor what it was displaying in Germany via Autocomplete.
Case No. 3: Hong Kong, 2014 – Libel Suit Under Way
In a libel suit over Google Autocomplete, a Hong Kong judge “cited Europe’s recent ‘right to be forgotten’ ruling which requires Google to remove embarrassing or outdated search results upon request,” The Washington Post reported.
Deputy High Court Judge Marlene Ng on Aug. 5, 2014, ruled that a lawsuit against Google for libel based on Google Autocomplete could proceed.
In that case, when Hong Kong business tycoon Albert Yeung Sau-shing googled his name, the Autocomplete prediction suggested the word “triad.” In Asia, that term is associated with organized crime.
Hong Kong libel standards are similar to those of the U.S., The Washington Post noted, and even though Yeung had an impressive business background, he also had been convicted of some crimes.
Yeung is the founder and chairman of Emperor Group, a sprawling business empire that includes property development, entertainment and financial services.
He has been found guilty of crimes including illegal bookmaking and perverting the course of public justice, and he has been fined for insider trading.
The Future of Google Autocomplete
When Autocomplete predictions for a particular word or topic are not displayed, according to Google, it may be for the following reasons:
- “The search term is not popular enough. Queries that are not often searched for are less likely to be useful in Autocomplete.
- “The search term is too fresh. It can sometimes take a few days or weeks for newly popular search terms to appear consistently.
- “The search term was mistaken for a policy violation. Sometimes, we try not to show a search in one language that would be perfectly fine in another language. For example, we might accidentally not show a compound word because it includes a translation of a bad word from another language.”
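The three exclusion reasons Google lists amount to a sequence of threshold checks on a candidate prediction. The sketch below models them directly; the cutoff values and the policy flag are hypothetical stand-ins, since Google discloses the reasons but not the rules behind them.

```python
from datetime import datetime, timedelta

MIN_POPULARITY = 100          # hypothetical "popular enough" cutoff
MIN_AGE = timedelta(days=7)   # hypothetical "too fresh" waiting period

def prediction_shown(search_count, first_seen, violates_policy, now):
    """Return (shown, reason) for a candidate Autocomplete prediction,
    mirroring the three exclusion reasons listed above."""
    if violates_policy:
        return False, "mistaken for a policy violation"
    if search_count < MIN_POPULARITY:
        return False, "not popular enough"
    if now - first_seen < MIN_AGE:
        return False, "too fresh"
    return True, "shown"

now = datetime(2014, 8, 1)
print(prediction_shown(500, datetime(2014, 1, 1), False, now))
# -> (True, 'shown')
print(prediction_shown(500, datetime(2014, 7, 30), False, now))
# -> (False, 'too fresh')
```

Note that the policy check runs first: a term can be suppressed as a suspected violation no matter how popular it is, which is the “mistaken” case Google describes.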
Google asks users to report an “offensive Autocomplete prediction” in order to “improve predictions for everyone.”
Google may have to address this issue head-on in the future, if for no other reason than that it accounts for more than 77 percent of searches in the U.S., according to Comscore.
Will Google Autocomplete have to be filtered to alter predictions in different countries, and even for various keywords? Since Google’s Autocomplete predictions are based on relevant searches that users have previously conducted, as well as Google+ profile information about the individuals being searched, Google may have to find another way to predict results.