People of color have been dealing with racist cabbies for decades, and according to a new study, that discrimination is alive and well in the world of ride-hailing apps. Not only are black riders more likely to wait longer or have their rides cancelled, women in general are also getting taken for a ride, either to boost the fare or to flirt.
The National Bureau of Economic Research, a respected non-profit, non-partisan research organization, has released the findings of a two-year study that tracked discrimination against riders using Uber, Lyft, and Flywheel in Seattle and Boston. The study was conducted by researchers at MIT, Stanford, and the University of Washington.
The study involved nearly 1,500 rides across the two cities, with the Seattle work running from late last year to this March. Undergraduates from the University of Washington were given identical phones pre-loaded with the three ride-hailing apps and instructed to take a handful of prescribed routes, noting when the ride was requested, when it was accepted by the driver, when they were picked up, and finally when they reached their destination.
In the Seattle experiment, trip requests from black riders took 16 to 28 percent longer to be accepted across UberX and Lyft; for UberX alone, black riders waited 29 to 35 percent longer than their white counterparts.
ProPublica – Facebook Lets Advertisers Exclude Users by Race:
Imagine if, during the Jim Crow era, a newspaper offered advertisers the option of placing ads only in copies that went to white readers.
That’s basically what Facebook is doing nowadays.
The ubiquitous social network not only allows advertisers to target users by their interests or background, it also gives advertisers the ability to exclude specific groups Facebook calls “Ethnic Affinities.” Ads that exclude people based on race, gender, and other sensitive factors are prohibited by federal law in housing and employment.
I’m surprised that more hasn’t been made of this by newspapers, if not MPs.
Technology reflects what is around it. That’s especially true when teaching AI systems; what they’re taught is inevitably a reflection of society as is, not as we wish it to be.
The title is from The Who’s Won’t Get Fooled Again.