If you’re black, self-driving cars are more likely to drive into you, study suggests

Congresswoman Alexandria Ocasio-Cortez was widely criticized in January for suggesting that algorithms can be biased. “Algorithms are still made by human beings, and those algorithms are still pegged to basic human assumptions,” she said at the annual MLK Now event. “They’re just automated assumptions. And if you don’t fix the bias, then you are just automating the bias.”

Certainly, she is right. There are plenty of examples of this in consumer technology, from facial recognition systems that fail to recognize non-white skin tones, to cameras that tell Asian users to stop blinking, to racist soap dispensers that won’t give you soap if you’re black. Particularly worrying is when this technology moves beyond soap dispensers and mobile phones, which brings us to a new problem: it seems that self-driving cars can have a racism problem too.

In an automated car system, it turns out, having darker skin carries far worse consequences than it does at a soap dispenser. A new study from the Georgia Institute of Technology has found that self-driving cars are more likely to drive into you if you are black.

According to the team’s paper, which is available to read on arXiv, the researchers were “motivated by the many recent examples of machine learning and vision systems displaying higher error rates for certain demographic groups than others.” They point out that a few autonomous vehicle systems already on the road have shown an inability to entirely mitigate the risk of pedestrian deaths, and that recognizing pedestrians is a key part of avoiding those deaths.

They collected a large set of photographs showing pedestrians of different skin tones in different lighting conditions (using the Fitzpatrick scale to classify skin tones) and fed them to eight different image-recognition systems. The team then analyzed how often the machine-learning systems correctly identified the presence of a human across all skin tones.
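To make that kind of analysis concrete, here is a minimal sketch of computing per-group detection rates. The record format, the `detections` list, and the split into lighter (Fitzpatrick types 1–3) versus darker (types 4–6) skin are hypothetical stand-ins for illustration, not the paper’s actual pipeline.

```python
from collections import defaultdict

# Hypothetical records: each annotated pedestrian carries a Fitzpatrick
# skin-type label (1-6) and whether a given detector found them.
detections = [
    {"fitzpatrick": 2, "detected": True},
    {"fitzpatrick": 5, "detected": False},
    {"fitzpatrick": 3, "detected": True},
    {"fitzpatrick": 6, "detected": True},
    # ... thousands more annotations, repeated per image-recognition system
]

def recall_by_group(records):
    """Fraction of pedestrians in each skin-tone group that were detected."""
    hits = defaultdict(int)
    totals = defaultdict(int)
    for r in records:
        # The study compared lighter (types 1-3) vs darker (types 4-6) skin.
        group = "light (FP 1-3)" if r["fitzpatrick"] <= 3 else "dark (FP 4-6)"
        totals[group] += 1
        hits[group] += r["detected"]  # True counts as 1, False as 0
    return {g: hits[g] / totals[g] for g in totals}

print(recall_by_group(detections))
```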

They found a uniform bias across the systems: the darker someone’s skin, the less likely an automated car would be to detect them, and the more likely it would be to drive right into them. On average, the systems were 5 percent less accurate at identifying people with darker skin tones. This held true even when accounting for time of day and for pedestrians being partially obstructed from view.
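Controlling for those confounders essentially means recomputing the gap within each stratum of the data, roughly as sketched below. The `daytime` and `occluded` fields and the strata are illustrative assumptions layered onto the hypothetical records from the previous sketch, not the paper’s actual method.

```python
from collections import defaultdict

def gap_by_stratum(records):
    """Detection-rate gap (light minus dark) within each confounder stratum.

    Each record is assumed to carry 'daytime' and 'occluded' flags in
    addition to the fields used above; all field names are hypothetical.
    """
    hits = defaultdict(lambda: defaultdict(int))
    totals = defaultdict(lambda: defaultdict(int))
    for r in records:
        stratum = (r["daytime"], r["occluded"])
        group = "light" if r["fitzpatrick"] <= 3 else "dark"
        totals[stratum][group] += 1
        hits[stratum][group] += r["detected"]
    # A gap that stays positive in every stratum suggests the bias is not
    # explained away by lighting or occlusion alone.
    return {
        s: hits[s]["light"] / totals[s]["light"]
           - hits[s]["dark"] / totals[s]["dark"]
        for s in totals
        if totals[s]["light"] and totals[s]["dark"]
    }
```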

Of course, it’s not just skin tone that algorithms can be biased against. Voice recognition systems appear to struggle more to recognize women’s voices than men’s, and women are 47 percent more likely to be seriously injured in a car crash when wearing seat belts, because car safety features are designed mostly with men in mind.

The study had its limitations; it used detection models built by academics rather than those used by car manufacturers themselves. But it is still useful for technology companies to identify recurring problems like this, which could be fixed relatively easily by training on a wider, more representative variety of data while rigorously testing new products.
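In practice, one common way to widen the training data without collecting new images is to oversample the underrepresented group during training. The sketch below shows that idea using PyTorch’s `WeightedRandomSampler`; the tensors, group labels, and 900/100 split are invented placeholders, and this is one generic mitigation, not the study’s own fix.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset, WeightedRandomSampler

# Hypothetical setup: images with a skin-tone group label per example
# (0 = lighter skin, 1 = darker skin). Real pipelines would load annotated
# pedestrian crops; the random tensors here are placeholders.
images = torch.randn(1000, 3, 64, 64)
group = torch.cat([torch.zeros(900, dtype=torch.long),  # 900 lighter-skin examples
                   torch.ones(100, dtype=torch.long)])  # 100 darker-skin examples
dataset = TensorDataset(images, group)

# Weight each example inversely to its group's frequency so both groups
# are drawn at roughly equal rates during training.
counts = torch.bincount(group).float()
weights = (1.0 / counts)[group]
sampler = WeightedRandomSampler(weights, num_samples=len(dataset), replacement=True)

loader = DataLoader(dataset, batch_size=32, sampler=sampler)
for batch_images, batch_groups in loader:
    pass  # feed balanced batches into the detector's training step here
```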

The authors concluded in the paper, “We hope this study provides compelling evidence of the real problem that may arise if this source of capture bias is not considered before deploying these sorts of recognition models.”

Fingers crossed Tesla and Google are feeding their machine-learning algorithms data from people with more diverse skin tones than the academic models were trained on; otherwise, we may soon face a situation where an AI is physically able to kill you, and more likely to do so if you’re not white.