Google’s Offensively Bad Image Search

November 24, 2009

If you did a Google image search this morning on "Michelle Obama," you would've seen a horribly racist caricature of the First Lady depicted as part monkey. I was alerted to this by a Huffington Post article complaining about it. HuffPo also noted that Google had tried to respond to and preempt criticism by putting up ads on that page with a link for "Offensive Search Results"; if you clicked on it, you'd read a little apologia on how the nature of being a search engine is that some images might be offensive, and that's just how they roll. In their own words:

Search engines are a reflection of the content and information that is available on the Internet. A site's ranking in Google's search results relies heavily on computer algorithms using thousands of factors to calculate a page's relevance to a given query […]. The beliefs and preferences of those who work at Google, as well as the opinions of the general public, do not determine or impact our search results.

In short, they tried to protect themselves with a safety blanket of (1) this being the way that algorithms work, and (2) not wanting to mess with the system lest they seem biased. Both defenses are just as offensive as the image. Let me explain:

To take the "algorithm" issue first, I get that some people have smart ways of gaming the algorithm to make their page or image appear first, and I get that Google won't always catch them. But if they're placing an ad with their public explanation right above this search and this image, they clearly did catch this one. And now that they know some racist goofs messed with their system, why don't they fix it? If a librarian finds porn in the kids' section, wouldn't we expect them to move it, rather than leave a note explaining why it's there?

This leads to the second issue: they're missing the point by discussing the image as "offensive," and by making this seem like a freedom of speech thing (in passing, why is this a defense to all bad behavior in the US? It's the Randy Marsh baseball hooligan defense). Sure, it's offensive, but the problem lies in it being inaccurate. If I conduct a Google image search for "chair" and get a picture of a window, they haven't done their job. And likewise, if I search for Michelle Obama and get this, they've screwed up. A search engine that gives you the wrong info is a bad one. Forget about the offensiveness for a second – it's just incompetence. I totally reject their silly defense/explanation. If Michelle Obama had appeared in a real photo that was also offensive, fair enough if that appeared frequently. But just as I'd think it a sign of a bad search engine if a "Sarah Palin" image search led to the infamous Photoshopped pic of her in a bikini with a gun, the pic that they had up is simply incorrect.

Since this morning, though, I see they have somewhat changed their tone (or, perhaps, others have found ways to move better images up?), as it's no longer the lead image. Equally offensive, though, is that they now give a single recommendation for a "related search": "Michelle Obama monkey." Which is no improvement. If I wanted to look for a book on Jewish history, would Amazon tell me I might also enjoy Mein Kampf? Google can offer all the politicized, BS explanations that they want, but they're simply a bad search engine if they're giving such results and sticking by them, like a librarian leading you to 16th-century French poetry when you asked for Ginsberg. As sucky as Bing is, it's seeming a lot better today.
