The UK needs new laws to govern the use of biometrics, and the government should come forward with primary legislation, according to an independent legal review.

The legal review recommends that public use of live facial recognition be suspended pending the creation of a legally binding code of practice governing its use, and the passing of wider, technology-neutral legislation creating a statutory framework for the use of biometrics.

The adoption of live facial recognition (LFR) by a number of UK police forces has drawn legal challenges on civil rights grounds and condemnation from human rights groups.

The UK's information commissioner went public with concerns about reckless and inappropriate use of LFR in public places.

The Information Commissioner's Office has also fined Clearview AI, the controversial US-based facial recognition company that scrapes selfies from the internet without consent to power an AI-based identity-matching service targeted at law enforcement, for breaching UK data protection rules.

To date, the government's digital policymaking has largely focused elsewhere, such as on online content regulation and post-Brexit data protection deregulation.

The forthcoming Data Reform Bill will support the development of policing-led guidance such as new codes of conduct, according to the government.

The legal review published today calls for a more comprehensive approach to regulating public sector use of biometrics.

The UK's current legal regime is "fragmented, confused and failing to keep pace" with developments in biometrics, according to the review.

There is an urgent need for a new legislative framework for the use of biometrics, the review argues; the use of biometric data must not be allowed to continue under inadequate laws and insufficient regulation.

The review's author wants the scope of the legislation to cover the use of biometric technology not only for identifying individuals but also for classifying them.

The review argues that the legal framework needs to provide appropriate safeguards against the rights-intrusive capacity of such systems.

A framework governing the use of biometrics should supplement existing duties under the Human Rights Act, Equality Act and Data Protection Act, with codes of practice setting out the specific and detailed duties that arise in particular use cases.

One recommendation is for a statutory advisory role in respect of public-sector biometrics use, with the resulting advice published and bodies that do not follow it required to explain why.

The review also calls for the regulation and oversight of biometrics to be clarified and properly resourced, arguing that the overlapping and fragmented nature of current oversight makes good governance difficult.

For oversight to be effective, the review states, it requires either a specific independent role or a specialist commissioner, adequately resourced financially, logistically and in expertise to perform the governance role this field requires.

The review's authors are also calling for more study of private-sector applications of biometrics, to consider how best to shape appropriate legislation given the porous relationship between private-sector organizations gathering and processing biometric data and the public sector.

There is a perception that law and regulation impede the use of biometrics, but this need not be the case. In practice, a clear regulatory framework enables those who work with biometric data to be confident of where their ethical and legal lines lie.

That confidence frees them from the burden of self-regulation that comes with unclear guidelines and encourages good working practices. Lawmakers and regulators do not always make things easy for those who want to act responsibly.

The Ada Lovelace Institute, the research institute that commissioned the review, is publishing an accompanying policy report in which it presses the government to act, and has also conducted a survey of UK public attitudes towards facial recognition.

According to the survey and the Institute's Citizens' Biometrics Council, the public supports stronger safeguards around biometrics.

Some of the Institute's recommendations echo those in the legal review, such as urging the government to pass primary legislation governing the use of biometrics and proposing that oversight and enforcement of the regime sit within a new regulatory function.

It is also calling for the proposed regulatory function to assess the accuracy, reliability and validity of biometric technologies and to ensure they are used in ways that are fair to everyone.

It suggests that this assessment should consider individual, collective and societal harms, and that the regulatory function should continue to monitor the technology for as long as a system is in use.

The Institute believes this regulatory monitoring should be able to trigger the creation of codes of practice. It is also calling for a moratorium on the use of biometrics for one-to-many identification in public spaces until governance legislation is passed.

The public supports stronger safeguards and the current legal landscape is not adequate, according to the Institute's director, who says this is an important issue for the government to address with new legislation.

The EU put forward a draft proposal last year for regulating applications of artificial intelligence, but civil society and human rights groups are concerned that its risk-based framework does not go far enough to protect fundamental rights.

Critics argue that the draft Act's restriction on remote biometric identification is not a meaningful limitation because it contains so many qualifications.

That leaves an opportunity for the UK to go further, but only if the government adopts the policy recommendations being made today.

The draft EU Act does not adequately grapple with the risks of emotion recognition systems, she argues, since it only requires users to be transparent that the technology is being deployed, for example through labelling or disclosure.

Such classification poses similar risks to identification. The Citizens' Biometrics Council was concerned about accuracy (both whether the tools work well and whether the categories they use are rooted in evidence or pseudoscience) and about privacy, since intimate data is used and could reveal or be used to infer sensitive information about a person, such as their sexuality or religion.

Under the recommendations, uses of biometrics in the public sector, by public services, in public places or with significant effects on individuals would have to undergo a proportionality test prior to deployment. The Institute also recommends that comprehensive, high standards of regulation be applied to both the private and public sectors.

The measures the UK government has put in place so far don't go far enough, according to the author.

The government's focus so far seems to be on efforts to simplify and reduce confusion, they say, whereas the review identifies a need to substantially strengthen oversight functions, finding that existing legislation and oversight mechanisms are fragmented, unclear, ineffective and failing to keep pace with the technologies being developed.

The approach to regulation needs to be strengthened, they argue, including scrutiny of whether the tools are built on stereotypical or pseudoscientific assumptions. Among their suggestions is a requirement for a proportionality test for any use of biometrics in the public sector, in public spaces, or where significant decisions are made about individuals, with that assessment considering the technology in its context of use before procurement or deployment.

Their research, they say, shows that regulation needs to be more ambitious than what has been seen so far, and they are waiting to see further details of the planned legislation.
