The embattled nation uses details resulting from the process to try to track down and notify the families of the dead, in an act Ukraine says is aimed at piercing Russia’s war information filter.
While this type of artificial intelligence could offer closure to families denied it by the fog of war or Kremlin secrecy, the potential for mistakes is considerable and consequential.
“If you’re a Russian parent being informed that your child has been killed when it’s not true, that gets into a complex ethical dilemma,” said Jim Hendler, director of the Institute for Data Exploration and Applications at Rensselaer Polytechnic Institute in New York state.
US-based Clearview AI, often criticized by privacy advocates, says it gave Ukrainian officials free access to its service that matches images from the internet to pictures uploaded by users trying to identify someone.
“The Ukrainian officials who have received access to Clearview AI have expressed their enthusiasm, and we await to hear more from them,” Hoan Ton-That, the firm’s co-founder and CEO, said in a statement.
Ukrainian Vice Prime Minister Mykhailo Fedorov wrote on Wednesday that his nation was using “artificial intelligence” to search social networks for Russian soldiers’ profiles using images of their bodies, then report their deaths to loved ones.
He added that one purpose was to “dispel the myth of a ‘special operation,’” referring to Moscow’s insistence that the invasion and war be designated as such.
Ukrainian authorities did not reply to AFP requests for further information on Fedorov’s statement.
The Kremlin’s last officially released toll was just under 500 soldiers killed, but it has not been updated for weeks; NATO officials reportedly estimated the number of Russian troops dead, wounded, missing or otherwise out of action at up to 40,000.
News of soldiers’ deaths and their funerals has appeared in local Russian media, with reports saying that officials have told families roughly where the deceased fell but little else.
Facial recognition arrives in the war as a technology that has met with significant and sustained doubts, ranging from its intrusion on people’s privacy to criticism that it misidentifies people of colour.
Experts noted that facial recognition could be particularly problematic when used on the dead, especially when battlefield wounds leave people looking very different from, say, a smiling, well-lit wedding picture.
“One of the most well-known problems with facial recognition technology is that it’s not perfect, and it will make errors and in some cases those misidentifications can be life changing,” said Eric Goldman, co-director of the High Tech Law Institute at Santa Clara University School of Law.
Yet he pointed out that long after wars end, many families are left not knowing what happened to loved ones who went into battle and were never seen again, and that technology could prove useful in those cases.
“We can imagine a circumstance where the ability to reduce the number of people missing in action actually would help,” he added, though he cautioned that facial recognition may not be the right solution.
In a letter offering its services to Ukrainian authorities, Clearview - whose database is built from images scraped from public websites and which the firm touts as a tool for law enforcement - argued that it would indeed be very useful.
The firm, which Italy fined 20 million euros (B736mn) earlier this month over its software, claimed its database includes some two billion images from VK, Russia’s equivalent to Facebook, and can help identify the dead without needing information like fingerprints.
As for accurately identifying the deceased, Clearview claimed its tool works “effectively regardless of facial damage that may have occurred,” though AFP could not independently verify that claim.