Facial recognition is increasingly being used in the name of the public interest. Australia has recently expanded its use of the technology to enforce COVID-19 safety measures: people in quarantine are subject to random check-ins in which they must send a selfie to prove they are following the rules. According to Reuters, location data is collected as well.
When it comes to basic necessities like housing and food assistance, Greer argues, the first priority should always be ensuring that everyone who needs help can get it. Fraud prevention is a legitimate objective, but the primary goal should be getting people the benefits they need.
Human rights and the needs of vulnerable people must be considered from the beginning, when systems are being built. These protections cannot be left to the last minute, Greer says; they cannot be patched in as bug fixes after something has gone wrong.
ID.me's Hall claims that his company's services are superior to older methods of identity verification and have helped states curb massive unemployment fraud since face verification checks were introduced. According to him, 91% of unemployment claimants pass verification either on their own or via a video call with an ID.me representative.
That, he says, was the goal from the beginning: if 91% of the process could be automated, then states outmatched in resources would be able to offer white-glove, concierge-style service to the remaining 9%.
Hall says ID.me emails users to follow up if they are unable to pass the face recognition check.
Everything the company does, he says, is about helping people access things they are eligible for.
Technology in the real world
For months, JB was unable to make ends meet. Financial worries caused stress, and other problems, such as a broken computer, added to it. Their former employer could not, or would not, help them cut through the red tape.
"It is very lonely to feel like this," JB said.
Experts on the government side agree that it was understandable for the pandemic to push technology to the forefront. But cases like JB's show that technology alone is not enough. Anne L. Washington, an assistant professor of data policy at New York University, says it is tempting to call a new government technology a success if it works well during the research phase but fails in real life 5% of the time. She likens the result to a game of musical chairs in which, in a room of 100 people, five will always be left without a seat.
Governments may deploy a technology, she says, and because it works 95% of the time, they consider the problem solved. But that is exactly when human intervention matters most: Washington says they need a system that can manage the five people left standing.
Involving private companies adds another layer of risk. The most important issue in the rollout of a new technology, Washington says, is where the data is stored: sensitive data could be lost or misused if it is not held by a trusted entity. How would you feel if the federal government handed your Social Security number to a private company?
The unchecked, widespread use of face recognition tools can also harm already marginalized groups. Transgender people, for example, have reported frequent issues with Google Photos, which may repeatedly ask whether pre- and post-transition photos show the same person, forcing them to contend with the software again and again.
Daly Barnett, a technologist at the Electronic Frontier Foundation, says technology is not able to accurately classify, compute, and reflect the diversity of beautiful edge cases found in the real world.
When success is worse than failure
Face recognition often comes up in conversations about how technology can fail or discriminate. But Barnett urges people to look beyond the effectiveness of biometric tools or the presence of bias in them; those questions, Barnett insists, are beside the point. Greer and other activists warn that the tools can be dangerous even when they work well. Face recognition has already been used to identify, punish, or stifle protesters, and people are responding: protesters in Hong Kong wore masks and glasses to conceal their faces from police surveillance, and federal prosecutors dropped charges against a US protester who had been identified by face recognition and accused of assaulting officers.