After communal clashes in Delhi’s Jahangirpuri area last year, police said they used facial recognition technology to identify and arrest dozens of men, the second such instance after a more violent riot in the Indian capital in 2020.
In both cases, most of those charged were Muslim, leading human rights groups and tech experts to criticise India’s use of the AI-based technology to target poor, minority and marginalised groups in Delhi and elsewhere in the country.
As India rolls out AI tools that authorities say will increase efficiency and improve access, tech experts fear the lack of an official policy for the ethical use of AI will hurt people at the bottom, entrenching age-old bias, criminalising minorities and channeling most benefits to the rich.
“It is going to directly affect the people living on the fringes - the Dalits, the Muslims, the trans people. It will exacerbate bias and discrimination against them,” said Shivangi Narayan, a researcher who has studied predictive policing in Delhi.
With a population of 1.4 billion powering the world’s fifth-biggest economy, India is undergoing breakneck technological change, rolling out AI-based systems - in spheres from health to education, agriculture to criminal justice - but with scant debate on their ethical implications, experts say.
In a nation beset by old and deep divisions, be it of class, religion, gender or wealth, researchers like Narayan - a member of the Algorithmic Governance Research Network - fear that AI risks exacerbating all these schisms.
“We think technology works objectively. But the databases being used to train AI systems are biased against caste, gender, religion, even location of residence, so they will exacerbate bias and discrimination against them,” she said.
Facial recognition technology - which uses AI to match live images against a database of cached faces - is one of many AI applications that critics say risk increasing surveillance of Muslims, lower-caste Dalits, Indigenous Adivasis, transgender people and other marginalised groups, all while ignoring their needs.