
Multiple testing approaches needed to reduce AI bias

Many object detection systems have trouble recognising certain demographics. To counter the bias, a renewed focus on datasets, industry standards and transparency is needed. By Betti Hunter

The artificial intelligence (AI) systems that self-driving vehicles rely on apparently have a problem with perception. A 2019 study by the Georgia Institute of Technology found that autonomous vehicles (AVs) may be better at detecting people with lighter skin tones, posing a potential danger to pedestrians with darker skin. The research, which aimed to discover how effectively object detection systems identify people from different demographics, analysed a large dataset of pedestrian images categorised according to the Fitzpatrick scale, a standard classification of human skin tones. The findings revealed that detection systems were, on average, consistently five percentage points less accurate for people in the darker-skinned group.
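The core of such an audit is simple to express: label each annotated pedestrian with a Fitzpatrick type, bin the labels into groups, and compare detection rates across the bins. The sketch below illustrates that disaggregated evaluation in Python; the sample data, field names and the types 1–3 versus 4–6 grouping are illustrative assumptions for this article, not the study's actual code or dataset.

```python
# Illustrative sketch of a disaggregated evaluation: pedestrian detection
# recall binned by Fitzpatrick skin-tone group. Data and grouping are
# hypothetical, not taken from the Georgia Tech study.
from collections import defaultdict

# Each record: (fitzpatrick_type 1-6, was_pedestrian_detected)
results = [
    (1, True), (2, True), (2, True), (3, False),
    (4, True), (5, False), (6, True), (6, False),
]

def recall_by_group(records, dark_types=frozenset({4, 5, 6})):
    """Return detection recall for the lighter- and darker-skinned bins."""
    hits = defaultdict(int)
    totals = defaultdict(int)
    for fitz_type, detected in records:
        group = "dark" if fitz_type in dark_types else "light"
        totals[group] += 1
        hits[group] += int(detected)
    return {group: hits[group] / totals[group] for group in totals}

rates = recall_by_group(results)
gap_pp = (rates["light"] - rates["dark"]) * 100  # gap in percentage points
print(rates, f"gap: {gap_pp:.1f} percentage points")
```

Reporting the gap in percentage points, as the study did, makes results comparable across systems; in practice each bin also needs enough samples for the per-group rates to be statistically meaningful.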
