Autocomplete is one of those modern marvels of real-time search technology that almost feels like it’s reading your mind. By analyzing and mining what millions of other users have already searched for and clicked on, Google knows that when you start typing a query with a “d,” you’re most likely looking for a dictionary. So what happens when unsavory things, perhaps naughty or even illegal, creep into those suggestions? As a society, we probably don’t want to make it easier for pedophiles to find pictures of naked children, or to goad the violently predisposed with new ideas for abuse.

Many governments impose some censorship in their jurisdictions on content that is illegal under national laws. So it’s not entirely surprising that, in order to head off more direct government intervention, corporations like Google and Microsoft self-regulate, trying to scrub their autocomplete results clean of suggestions that lead to child pornography.

Both algorithms are pretty good about letting through more clinical terminology, such as “vaginas,” “nipples,” or “penises,” and, as shown in the next figure, both get much stricter when you add “child” before the search term. But while you might find it wry that Google and Bing suggest completions for “prostitute,” the fact that Google also offers to complete “child prostitute” with “images” or “movies” is far more alarming. Moreover, searching for “child genital” or “child lover” on Google or Bing, as well as “child lust” on Google, all lead to disturbing suggestions related to child pornography.
As Google writes in its autocomplete FAQ, “we exclude a narrow class of search queries related to pornography, violence, hate speech, and copyright infringement.” Bing, on the other hand, makes sure to “filter spam” as well as to “detect adult or offensive content,” according to a recent post on the Bing blog.
At first glance, Google appears stricter, blocking more sex-related words than Bing. Rather than blocking all suggestions for “dick” outright, as Google does, Bing simply scrubs the list so that you see only the clean suggestions, like “dick’s sporting goods.” Sometimes Bing instead rewrites the query, treating a dirty word as if it were a typo.
For instance, querying for “fingering” leads to wholesome dinner suggestions for “fingerling potato recipes,” and searching for “jizz” offers suggestions on “jazz,” for the musically minded searcher, of course.
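To make the contrast concrete, here is a minimal sketch, in Python, of the three filtering strategies described above: blocking a query’s suggestions entirely, scrubbing individual suggestions from the list, and rewriting the query as a presumed typo. All of the names and data here are hypothetical illustrations, not Google’s or Bing’s actual implementation.

```python
def filter_suggestions(prefix, candidates, blocked_prefixes,
                       suggestion_blocklist, typo_rewrites):
    """Return autocomplete suggestions for a query prefix after filtering.

    Hypothetical sketch of three moderation strategies:
      1. block the whole prefix (no suggestions at all),
      2. scrub individual offending suggestions from the list,
      3. rewrite the query as if the flagged word were a typo.
    """
    if prefix in blocked_prefixes:
        return []                                  # strategy 1: block everything
    prefix = typo_rewrites.get(prefix, prefix)     # strategy 3: rewrite the query
    return [s for s in candidates.get(prefix, [])
            if s not in suggestion_blocklist]      # strategy 2: scrub the list

# Toy suggestion index (illustrative only).
candidates = {
    "dick": ["dick's sporting goods", "dick pics"],
    "jazz": ["jazz music"],
}

# Google-style: suppress all suggestions for the prefix.
print(filter_suggestions("dick", candidates, {"dick"}, set(), {}))
# → []

# Bing-style: keep only the clean suggestions.
print(filter_suggestions("dick", candidates, set(), {"dick pics"}, {}))
# → ["dick's sporting goods"]

# Bing-style typo rewrite: "jizz" is treated as a misspelling of "jazz".
print(filter_suggestions("jizz", candidates, set(), set(), {"jizz": "jazz"}))
# → ["jazz music"]
```

The design choice matters: blocking the whole prefix is safest but hides legitimate completions, while scrubbing preserves them at the risk of letting borderline suggestions slip through.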
In response, a Microsoft spokesperson commented that “sometimes seemingly benign queries can lead to adult content,” and that such queries are consequently filtered from autosuggest.
By that logic, it would seem that “homosexual” merely leads to “too much” adult content, causing the algorithm to flag and filter it.
Querying “child lover,” for instance, offers suggestions for “child lover pics,” “child lover guide,” and “child lover chat.” Given Google and Microsoft’s available technology and resources, combined with their ostensible commitment to filtering such content, it’s hard to believe that these errors simply slipped through the cracks.