'Digital revolution' excluding the most vulnerable
From tackling disease to improving transport, technologies such as data analysis and artificial intelligence have unleashed a wave of opportunities, but those opportunities still exclude society's most vulnerable citizens, according to a leading human rights researcher.
The “digitisation of information” impacts every sector of society but not everyone benefits equally, said Carly Kind, head of the Ada Lovelace Institute, a British-based research body named after the mathematician and computing pioneer.
“We see huge power imbalances in terms of who governs, hoards and uses data, and in what ways,” said Kind, who previously led a European Commission-funded project on data governance and privacy regulation.
Tech giants, once seen as engines of economic growth and a source of innovation, have come under fire on both sides of the Atlantic for allegedly misusing their power and for failing to protect their users' privacy.
Glen Weyl, a principal researcher at the research arm of U.S. tech giant Microsoft, said that “tech companies make up five of the six largest companies in the world and they have a business model driven effectively by surveillance.”

“We need a society that treats people as agents of their own privacy rather than passive subjects in a surveillance state,” he said at the Thomson Reuters Foundation's annual Trust Conference in London on Thursday.
Kind cited the criminal justice system as one area where marginalised communities have been discriminated against by the use of facial recognition and algorithms.
Computers have become adept at identifying people in recent years, unlocking a myriad of applications for facial recognition, but critics have voiced concerns that the technology is still prone to errors.
“Research shows that policing technologies predicting where crime might occur can be informed by biased datasets,” said Kind.
“That could lead them to wrongly identify black people and people of colour as more likely to offend, and create over-policing in certain areas.”