Google Autocomplete Blacklist Marks 'Gay Men' and 'Cocaine' as Offensive

Google's autocomplete feature has always provided a unique window into the Web-surfing habits of other Google users, but with Google Instant, a single typo can now result in a full page of results you probably never wanted to see. Launched earlier this month, Google Instant reveals search results as you type, no longer requiring you to press Enter to search. So, to protect users from accidentally stumbling across "offensive" things they'd rather not know about, the company has apparently cooked up a lengthy, if somewhat questionable, list of blacklisted autocompletes. To try it yourself, type 'lover' into the search bar and results appear as you type; search for 'lovers,' though, and the Instant results immediately clear, forcing you to press Enter to search for the term.
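As a rough illustration of the behavior described above (not Google's actual implementation), the effect amounts to checking each keystroke's query against a blocklist and suppressing the live results page on a match. The blocklist contents, function names, and matching rule below are assumptions made for the sketch:

```python
# Hypothetical sketch of how an instant-search UI might suppress live
# results for blacklisted queries. The blocklist contents, function names,
# and matching rule are illustrative assumptions, not Google's code.

BLOCKLIST = {"lovers"}  # example term from the article

def search(query):
    # Stand-in for a real search backend.
    return ["result for " + query]

def instant_results(query):
    """Return live suggestions, or None to mean 'clear the page, wait for Enter'."""
    if query.strip().lower() in BLOCKLIST:
        return None
    return search(query)

print(instant_results("lover"))   # ['result for lover'] -- results appear as you type
print(instant_results("lovers"))  # None -- page clears until the user presses Enter
```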

As you'd imagine, most of the words on Google's blacklist, as gathered by 2600: The Hacker Quarterly, are of an explicitly sexual nature, although the list also includes four-letter words, racial slurs, and potentially taboo phrases like 'cocaine,' 'white power,' 'lesbian' and 'gay men' (the last of these only when followed by a space).

A few quick tests reveal that the system errs on the side of being strict, flagging certain words regardless of context. So, a search for Canadian indie-pop supergroup The New Pornographers leaves you with blank results because of 'pornographers.' Similarly, the word 'teen' leaves Nirvana fans empty-handed when searching for 'Smells Like Teen Spirit.' A few more odd examples (see the sketch after this list):
  • "Pamela Anderson" (though "Pam Anderson" works)
  • "bisexual"
  • "[insert anything] is evil"
  • "women rapping"
  • "hairy"
  • "babes in toyland"
  • "white power"
  • "i hate"
  • "fuchsia"
When contacted by Mashable, a Google spokesperson responded:
There are a number of reasons you may not be seeing search queries for a particular topic. Among other things, we apply a narrow set of removal policies for pornography, violence, and hate speech. It's important to note that removing queries from Autocomplete is a hard problem, and not as simple as blacklisting particular terms and phrases.
And, as thorough as Google's filter is, BoingBoing discovered several questionable words that are noticeably absent from the blacklist. Ultimately, however, whether a query slips past the filter often comes down to how users space their words.

Tags: autocomplete, cocaine, filter, gay, GayMen, goat.se, google, GoogleAutocomplete, GoogleInstant, offensive, porn, search, security, sex, top, web
