Law enforcement in India is starting to adopt facial recognition technology. According to documents obtained by the Internet Freedom Foundation through a public records request, Delhi police said they would treat a score of 80 percent accuracy or above as a positive match when identifying people involved in civil unrest in northern India.

The expanding use of facial recognition data by Indian law enforcement as evidence for potential prosecution is ringing alarm bells. Critics say the 80 percent accuracy threshold is too low and could have serious consequences for people wrongly marked as a match, particularly since India does not have a comprehensive data protection law.

If a match scores below 80 percent, it would be considered a false positive, but the person would still be subject to verification against other corroborating evidence.

In other words, even when the system returns a result below the threshold the police themselves have set, they will continue to investigate: the technology has flagged the person as resembling someone the police are looking for, which could lead to harassment. Critics warn the Delhi Police's approach could disproportionately affect communities that have historically been targeted by law enforcement.

Police said they run facial recognition against photographs of convicts and that the results could be used as evidence, but they would not give further details. In the case of a positive match, officials said they would conduct further investigation before taking any action. The Delhi Police did not reply to requests for comment.

An 80 percent match threshold is meaningless on its own, according to a researcher who has studied the legality of facial recognition. The conditions under which a facial recognition model is tested against benchmark data sets are critical in determining its accuracy numbers.

With facial recognition and other machine learning systems, a model developed on training and validation data is compared against a separate benchmark data set to determine its accuracy. The model must be evaluated against a third-party data set, or a slightly different data set once the model has changed. That benchmarking, he says, is what produces the accuracy percentage.
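To make the threshold-and-benchmark idea concrete, here is a minimal sketch of how such a system typically works: each face is reduced to an embedding vector, a probe image is compared to a gallery by similarity score, a cutoff (here 0.80, echoing the police's figure) decides "positive match", and accuracy is the fraction of benchmark probes labeled correctly. The embeddings, names, and cutoff are all illustrative assumptions, not the Delhi Police's actual system.

```python
import math

# Illustrative threshold; reinterprets the "80 percent" figure as a
# similarity cutoff. This is an assumption for the sketch, not the
# actual system's semantics.
THRESHOLD = 0.80

def cosine_similarity(a, b):
    # Similarity between two face embeddings (lists of floats).
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def best_match(probe, gallery):
    """Return (identity, score, is_positive) for the closest gallery entry."""
    identity, emb = max(gallery.items(),
                        key=lambda kv: cosine_similarity(probe, kv[1]))
    score = cosine_similarity(probe, emb)
    return identity, score, score >= THRESHOLD

def benchmark_accuracy(pairs, gallery):
    """Fraction of (probe, true_identity) benchmark pairs labeled correctly."""
    correct = 0
    for probe, truth in pairs:
        identity, _, positive = best_match(probe, gallery)
        correct += int(positive and identity == truth)
    return correct / len(pairs)

# Toy gallery of two enrolled embeddings and two benchmark probes.
gallery = {"A": [1.0, 0.0, 0.1], "B": [0.0, 1.0, 0.1]}
probes = [([0.9, 0.1, 0.1], "A"), ([0.1, 0.95, 0.05], "B")]
print(benchmark_accuracy(probes, gallery))
```

The point the researcher makes follows directly: the accuracy number depends entirely on what goes into `probes` and `gallery`, so an "80 percent" figure says nothing without knowing the benchmark conditions.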

There is well-documented evidence of racial bias in facial recognition models, which makes police use of a blanket 80 percent accuracy threshold unusual. According to a study by the US National Institute of Standards and Technology, systems used to match travelers' faces to a database of their photos had an accuracy rate of 99.5 percent. Other research, however, has found error rates as high as 34.7 percent when identifying women with dark complexions.