Algorithmic bias

Seattle Times:

Search for a female contact on LinkedIn, and you may get a curious result. The professional networking website asks if you meant to search for a similar-looking man’s name.

A search for “Stephanie Williams,” for example, brings up a prompt asking if the searcher meant to type “Stephen Williams” instead.

It’s not that there aren’t any people by that name: about 2,500 LinkedIn profiles include the name Stephanie Williams.

But similar searches of popular female first names, paired with placeholder last names, bring up LinkedIn’s suggestion to change “Andrea Jones” to “Andrew Jones,” Danielle to Daniel, Michaela to Michael and Alexa to Alex.
The pattern repeats for at least a dozen of the most common female names in the U.S.
Searches for the 100 most common male names in the U.S., on the other hand, bring up no prompts asking if users meant predominantly female names.
LinkedIn says its suggested results are generated automatically by an analysis of the tendencies of past searchers. “It’s all based on how people are using the platform,” said spokeswoman Suzi Owens.
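
LinkedIn hasn’t published how its suggestions actually work beyond Owens’s description, but a minimal sketch shows how a purely behavior-driven “did you mean” feature can pick up bias from its users. Everything below is hypothetical: the reformulation log, the names, and the most-frequent-follow-up rule are assumptions for illustration, not LinkedIn’s pipeline.

```python
from collections import Counter, defaultdict

# Hypothetical reformulation log: pairs of (query, query the same user typed
# next). The skew toward male names is invented for illustration.
REFORMULATION_LOG = [
    ("stephanie williams", "stephen williams"),
    ("stephanie williams", "stephen williams"),
    ("stephanie williams", "stephanie williams ceo"),
    ("andrea jones", "andrew jones"),
    ("andrea jones", "andrew jones"),
    ("danielle smith", "daniel smith"),
]

def build_suggester(log):
    """Map each query to the follow-up query past users chose most often."""
    counts = defaultdict(Counter)
    for original, retyped in log:
        counts[original][retyped] += 1
    return {query: c.most_common(1)[0][0] for query, c in counts.items()}

def did_you_mean(query, suggester):
    """Return a 'did you mean' prompt if history favors a different query."""
    suggestion = suggester.get(query.lower())
    if suggestion and suggestion != query.lower():
        return f'Did you mean "{suggestion}"?'
    return None

if __name__ == "__main__":
    suggester = build_suggester(REFORMULATION_LOG)
    # No rule here mentions gender, yet the skewed log yields a gendered prompt:
    print(did_you_mean("Stephanie Williams", suggester))
```

The point is that nothing in the code mentions gender; the bias arrives entirely through the historical behavior the suggester is built on, which is exactly the dynamic the Seattle Times describes.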

One company that has risen to the bias challenge is the neighborhood social network Nextdoor. Wired explains how it cut racial profiling simply by prompting users for other information, e.g. what was the person wearing (sketched below). The Guardian explores why companies such as Airbnb don’t implement the same checks.
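
Wired doesn’t publish Nextdoor’s actual form logic, so the following is only a rough sketch of the prompting idea: a hypothetical report form that won’t accept a racial descriptor on its own and asks for concrete details instead. The term list, field names, and validation rules are all assumptions.

```python
from dataclasses import dataclass

# Simplified, hypothetical list of racial descriptors to screen for.
RACE_TERMS = {"black", "white", "hispanic", "latino", "asian"}

@dataclass
class SuspiciousPersonReport:
    description: str
    clothing: str = ""
    behavior: str = ""

def validation_prompts(report: SuspiciousPersonReport) -> list[str]:
    """If the description mentions race, demand details before accepting it."""
    prompts = []
    words = report.description.lower().split()
    if any(term in words for term in RACE_TERMS):
        if not report.clothing.strip():
            prompts.append("What was the person wearing?")
        if not report.behavior.strip():
            prompts.append("What specifically was the person doing?")
    return prompts

if __name__ == "__main__":
    report = SuspiciousPersonReport(description="Suspicious black male near the park")
    for prompt in validation_prompts(report):
        print(prompt)  # the form blocks submission until these are answered
```

The design choice is friction, not filtering: rather than banning descriptions outright, the form makes a race-only report harder to submit than a detailed one.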