Algorithmic bias
Search for a female contact on LinkedIn, and you may get a curious result. The professional networking website asks if you meant to search for a similar-looking man’s name.
A search for “Stephanie Williams,” for example, brings up a prompt asking if the searcher meant to type “Stephen Williams” instead.
It’s not that there aren’t any people by that name: about 2,500 LinkedIn profiles included the name Stephanie Williams.
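LinkedIn hasn’t said exactly how these suggestions are generated, but a plausible mechanism is a “did you mean” feature that proposes whichever similar-looking name is more common in its data. The Python sketch below, using invented frequency counts, shows how a purely frequency-driven suggester of that kind ends up steering searches for a common female name toward an even more common male one.

```python
from difflib import SequenceMatcher

# Toy frequency counts, invented for illustration (not real LinkedIn data).
NAME_COUNTS = {
    "stephen williams": 5800,
    "stephanie williams": 2500,
}

def suggest(query, threshold=0.85):
    """Return a more frequent, similar-looking name, or None."""
    query = query.lower()
    query_count = NAME_COUNTS.get(query, 0)
    best = None
    for name, count in NAME_COUNTS.items():
        if name == query or count <= query_count:
            continue
        # String similarity stands in for whatever matching LinkedIn actually uses.
        if SequenceMatcher(None, query, name).ratio() >= threshold:
            if best is None or count > NAME_COUNTS[best]:
                best = name
    return best

print(suggest("Stephanie Williams"))  # -> "stephen williams"
```

Nothing in this sketch sets out to favor men; the bias falls out of ranking suggestions purely by popularity in data where one name happens to be more common.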
One company that has risen to the bias challenge is the neighborhood social network Nextdoor. Wired explains how it cut racial profiling simply by prompting users for other information, e.g. what was the person wearing? The Guardian explores why companies such as Airbnb don’t implement the same checks.
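Nextdoor’s actual code isn’t public, but the intervention Wired describes amounts to adding friction: a “suspicious person” report that mentions race can’t be posted until the user supplies other identifying details. A minimal sketch of that kind of check, with hypothetical field names rather than Nextdoor’s real schema, might look like this:

```python
from dataclasses import dataclass

@dataclass
class SuspiciousPersonReport:
    description_of_behavior: str
    race: str = ""
    clothing: str = ""
    other_details: str = ""  # e.g. hair, height, vehicle

def validate(report: SuspiciousPersonReport) -> list[str]:
    """Return the prompts the user must answer before the post is accepted."""
    prompts = []
    if not report.description_of_behavior.strip():
        prompts.append("Describe the specific behavior, not just the person.")
    if report.race and not (report.clothing.strip() or report.other_details.strip()):
        prompts.append("If you mention race, add other details, such as what the person was wearing.")
    return prompts

report = SuspiciousPersonReport(description_of_behavior="", race="Black")
print(validate(report))  # two prompts; the post stays blocked until they are answered
```

The design choice is deliberate inconvenience: requiring a few extra fields slows the user down just enough to replace a race-only description with concrete, actionable detail.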