Tuesday, September 28, 2010

Google Suggest Filtering

Google Suggest was supposed to help users type a query by providing useful suggestions. Unfortunately, some of the suggestions turned out to be offensive, so Google now filters suggestions related to pornography, violence, and hate speech.

Google's over-protective algorithms now filter all suggestions that include phrases like "is evil", "I hate", or "[ethnic group] are" (for example, "Chinese are"). Google Suggest also filters "Smells Like Teen Spirit", the title of a popular Nirvana song. A rough sketch of how this kind of blacklist filtering might work follows below.
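Here's a minimal sketch of how a substring blacklist could sit in front of an autocomplete backend. The pattern list, function names, and suppression behavior are illustrative assumptions, not Google's actual implementation or term list.

```python
# Illustrative sketch only: hypothetical blacklist patterns, not Google's real list.
BLACKLISTED_PATTERNS = [
    "is evil",
    "i hate",
    "chinese are",              # stands in for the "[ethnic group] are" rules
    "smells like teen spirit",
]

def is_blacklisted(text: str) -> bool:
    """Return True if the text contains any blacklisted pattern."""
    lowered = text.lower()
    return any(pattern in lowered for pattern in BLACKLISTED_PATTERNS)

def suggest(prefix: str, candidates: list[str]) -> list[str]:
    """Return candidate completions for the prefix, minus blacklisted ones.

    When every matching completion is blacklisted, the result is empty,
    leaving nothing to preview -- analogous to Google Instant's
    "press Enter to search" behavior described in this post.
    """
    matches = [s for s in candidates if s.lower().startswith(prefix.lower())]
    return [s for s in matches if not is_blacklisted(s)]

if __name__ == "__main__":
    candidates = ["google is evil", "google is down", "google is your friend"]
    print(suggest("google is e", candidates))  # [] -- only blacklisted completions match
    print(suggest("google is d", candidates))  # ['google is down']
```

In this sketch the filter is applied to the completions rather than the typed prefix, which is one way to explain why a partial query like [google is e] already shows no suggestions: every completion it matches is on the blacklist.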


"Queries in autocomplete are algorithmically determined based on a number of objective factors (including search term popularity) without manual intervention," explains Google. Google Suggest's filtering flaws are more obvious, now that Google Instant previews the results without having to press Enter. If you type [google is e], Google no longer previews the results and suggests to "press Enter to search".

Google Blacklist (not safe for work and potentially offensive) lists some of the rules used by Google to censor the list of suggestions. "Like everything these days, great care must be taken to ensure that as few people as possible are offended by anything. Google Instant is no exception. Somewhere within Google there exists a master list of "bad words" and evil concepts that Google Instant is programmed to not act upon, lest someone see something offensive in the instant results... even if that's exactly what they typed into the search bar."

{ via waxy.org }
