
Multiple testing approaches needed to reduce AI bias

Many object detection systems have trouble recognising certain demographics. Mitigating this bias requires a renewed focus on datasets, industry standards and transparency. By Betti Hunter

The artificial intelligence (AI) systems that self-driving vehicles rely on apparently have a problem with perception. A 2019 study by the Georgia Institute of Technology found that autonomous vehicles (AVs) may be better at detecting people with lighter skin tones, posing a potential danger to pedestrians with darker skin. The research, which aimed to discover how effectively object detection systems identify people from different demographics, analysed a large dataset of pedestrians who were categorised according to the Fitzpatrick scale, a classification of human skin tones. The findings revealed that detection systems were, on average, five percentage points less accurate for people in the darker-skinned group.
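The kind of comparison the study performed, i.e. disaggregating detection accuracy by skin-tone group and measuring the gap, can be illustrated with a minimal sketch. The data below is entirely hypothetical (the study's actual dataset and per-image labels are not public in this form); only the lighter/darker pooling of Fitzpatrick types mirrors the study's grouping.

```python
# Hypothetical illustration of per-group detection accuracy and the gap
# between groups. The records below are made-up, not the study's data.
from collections import defaultdict

def accuracy_by_group(records):
    """records: iterable of (group, detected) pairs.
    Returns a dict mapping each group to its detection accuracy."""
    hits = defaultdict(int)
    totals = defaultdict(int)
    for group, detected in records:
        totals[group] += 1
        hits[group] += int(detected)
    return {g: hits[g] / totals[g] for g in totals}

# Illustrative outcomes: 90/100 detections for the lighter-skinned group,
# 85/100 for the darker-skinned group.
records = ([("lighter", True)] * 90 + [("lighter", False)] * 10
           + [("darker", True)] * 85 + [("darker", False)] * 15)

acc = accuracy_by_group(records)
gap = acc["lighter"] - acc["darker"]  # 0.05, i.e. five percentage points
```

A real evaluation would use per-image detection results (e.g. true/false positives from a benchmark) rather than boolean outcomes, but the disaggregate-then-compare structure is the same.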

https://www.automotiveworld.com/articles/multiple-testing-approaches-needed-to-reduce-ai-bias/
