The U.K. government has announced a consultation on plans to overhaul the national data protection regime, as it looks at how to diverge from EU rules post-Brexit.
It's also been a year since Britain published its national data strategy, in which it signalled that pandemic-style data sharing should become the country's new normal.
Today, the Department for Digital, Culture, Media and Sport (DCMS) trailed an incoming reform of the Information Commissioner's Office (ICO). It said it wants to broaden the ICO's remit to champion businesses and sectors that use personal data in innovative and responsible ways to improve people's lives, and promised simplified rules to encourage beneficial data-driven research, such as in healthcare.
It also proposes a new governance structure for the ICO, including an independent board and a chief executive, mirroring the structures of other regulators such as the Competition and Markets Authority and the Financial Conduct Authority.
The department also said the consultation will examine how the new regime could help mitigate the risk of algorithmic bias, an area the EU is already moving to legislate on: in April it presented a risk-based proposal for regulating applications of AI.
The U.K. could be left behind if it focuses only on bias mitigation and doesn't consider the larger picture of AI's impact on its citizens.
DCMS announced the consultation by highlighting an artificial intelligence partnership, begun in 2016, between Moorfields Eye Hospital and the University College London Institute of Ophthalmology as an example of the kind of data sharing it wants to encourage. Last year the researchers reported that their AI could predict the onset of wet age-related macular degeneration more accurately than clinicians.
Although Google-owned DeepMind (whose health work has since been folded into Google Health) was also part of that partnership, the PR makes no mention of the tech giant's involvement. That's a curious omission, given that DeepMind's name is attached to a notorious U.K. data-sharing scandal: in 2017, another London-based NHS Trust, the Royal Free, was sanctioned by the ICO for improperly sharing patients' information with the company during the development phase of a clinician support app, an app Google is now in the process of discontinuing.
DCMS may be reluctant to mention that its stated goal for the data reforms, removing unnecessary barriers to the responsible use of data, could make it easier for Google and other commercial entities to get access to U.K. citizens' medical records.
The sizeable public backlash over the government's most recent attempt to requisition NHS users' medical records for vaguely defined research purposes, the General Practice Data for Planning and Research (GPDPR) scheme, suggests that a government-enabled health-data free-for-all might not prove popular with U.K. voters.
Instead, the government says its data reforms will clarify the rules on using personal data for research, laying the foundation for further scientific breakthroughs; the DCMS PR steers clear of sensitive specifics like health data sharing.
Elsewhere there is talk of enforc[ing] businesses' responsibility to protect personal information while empowering them to grow and innovate, which sounds like a yes to data security. But what about privacy and control over how your information gets used?
The government's implication seems to be that those will be contingent on other goals, principally economic ones attached to the U.K.'s ability to conduct data-driven analysis or to secure trade deals with countries that don't share its (currently high) data protection standards.
There are populist flourishes too, with DCMS expressing its desire for a data regime based on common sense, not box ticking, and flagging plans to beef up penalties for nuisance calls and spam text messages. A crackdown on spam will sound welcome to most ears.
Yet spam texts and nuisance calls are a rather quaint concern to zero in on in an era of data-driven, app-enabled, democracy-disrupting mass surveillance, something the outgoing information commissioner flagged as a major concern during her tenure at the ICO.
Ministers have deployed the same populist anti-spam framing to attack the requirement for internet users to consent to the dropping of tracking cookies, with digital minister Oliver Dowden recently saying he wants to do away with the practice in all but high-risk cases.
In the government's messaging, wrapping people's data in a system of rights that gives them control over how it's used is painted as almost irresponsible, even unpatriotic; DCMS pushes the idea that such rights stand in the way of more important economic and social goals.
It has offered no evidence to support that claim, nor did the U.K.'s current data protection regime prevent (very ample) data sharing during COVID-19. Meanwhile, negative uses of people's data get condensed, in DCMS' messaging, to the narrowest possible definition: the spam an individual can see, never mind how they came to be targeted with those spam texts or nuisance calls in the first place.
The government is applying its usual cake-and-eat-it spin to the reform plan, claiming it will protect citizens' data while simultaneously boasting that it will make that data easily accessible to anyone who can claim an innovative use for it, and rolling out canned quotes that call the plan bold and ambitious.
DCMS claims the U.K.'s world-leading data protection standards will be maintained, yet in the same breath it rows back, saying the new regime will merely build on some broad-brush elements of the existing rules; specifically, it commits to retaining principles around data processing, individuals' rights, and supervision and enforcement mechanisms.
The devil will be in the detail of the proposals, due to be published tomorrow morning. Expect plenty of analysis unpicking the spin soon after.
One specific change DCMS has trailed is a move away from a one-size-fits-all approach to data protection compliance, with organisations allowed to demonstrate compliance in ways more appropriate to their particular circumstances, while still protecting citizens' personal data to a high standard.
DCMS' PR uses the example of a hairdresser to illustrate the point, but plenty of startups have fewer employees than the average barbershop, so they too might expect a pass on meeting those high standards in future.
This suggests that the U.K.'s high standards could, under Dowden's watch, end up looking rather more like Swiss cheese.
Data protection is a do and not a don't.
John Edwards, New Zealand's privacy commissioner, is the frontrunner to become the U.K.'s next information commissioner; he appeared before MPs today as they considered whether to support his nomination.
If confirmed in the job, Edwards will be responsible for implementing whatever new data regime the government comes up with.
He rejected the idea that the U.K.'s current data protection regime is a barrier to data sharing, saying laws such as the GDPR should be seen as a how-to, a way to facilitate innovation, rather than a blocker.
He said he did not accept the dichotomy presented to him between privacy and data sharing: policymakers, businesses and governments don't have to choose between data protection and data sharing, he argued, because privacy and data protection laws (the UK DPA and UK GDPR) wouldn't be needed at all if information weren't shared; the two are sides of the same coin. If the COVID-19 crisis has taught the UK and other jurisdictions anything, he added, it is the importance of having high-quality information, minute by minute, and of being able to move it between organisations without friction. Privacy and data protection laws can at times introduce that friction, he conceded, but the UK has shown that things can move quickly when they need to.
He suggested the U.K. could secure much of the economic gain it's after through minor changes to existing rules rather than a wholesale overhaul, though setting the rules won't be his job; enforcing whatever new system emerges will be.
Improving the administration of the current law, which closely mirrors the UK GDPR, would itself open the way to trying different regulatory approaches, he told MPs, and all the more so should the government decide to start from scratch.
TechCrunch asked another Edwards (no relation), Newcastle University's Lilian Edwards, professor of law, innovation and society, for her thoughts on the government's direction of travel, as signalled by DCMS' pre-proposal spin. She expressed similar concerns about the logic driving the government to argue it needs to rip up the existing standards.
Data protection's entire purpose, she noted, is to strike a balance between fundamental rights and the free flow of data, and the current scheme, in place in essentially this form since 1998, is a workable compromise with economic concerns. The great things done with data during COVID-19, she explained, were legal and easy to do under the existing rules.
She also criticized the plan to restructure the ICO as a quango whose primary purpose is to drive economic growth, pointing out that the DCMS PR makes no mention of privacy or fundamental rights, and arguing that creating a new regulator is unlikely to improve public trust, which is declining in nearly every poll.
She further suggested the government is ignoring the economic harm the U.K. would suffer if the EU were to withdraw its data adequacy decision in response to diverging standards. That decision is due to be reviewed, and losing it would prejudice the roughly 43% of U.K. trade that is with the EU for the sake of a few low-value trade deals and some hoped-for sales of NHS data (which, after the GPDPR scandal, would likely take a further toll on public trust).
While she welcomed the stated goal of mitigating algorithmic bias, she also pointed to the risk of the U.K. falling behind other jurisdictions that are thinking more broadly about how to regulate artificial intelligence.
Judging by its press release, the government wants an existing advisory body, the Centre for Data Ethics and Innovation (CDEI), to play a key role in supporting its policymaking in this area, with a focus on enabling trustworthy use of data and AI in the real world. A permanent successor to Roger Taylor as CDEI chair has yet to be appointed, though the government did announce an interim chair and some new advisors today.
Edwards argued that the world has moved on since the CDEI began its work in this field: it's now understood that AI regulation must be considered alongside other regulatory tools, data protection among them. The proposed EU AI Regulation, whatever its flaws, goes beyond data protection to mandate things like better training sets and systems built to be transparent from the outset. If the U.K. is serious about regulating AI, she said, it must examine the various models being proposed around the world, yet its main concerns right now seem narrow-minded, populist and insular.
MedConfidential, a patient privacy advocacy group that has often clashed with the government over its approach to data, also questioned DCMS' continued attachment to the CDEI for policymaking in this crucial area, pointing to last year's biased exam-grading algorithm scandal, which unfolded under Taylor's watch.
(NB: Taylor was also chair of the exams regulator Ofqual; his resignation from that post in December cited a difficult summer, and his departure from the CDEI now leaves an awkward void.)
"CDEI's culture and leadership led to the A Levels algorithm. Why should any government official have any faith in what they will say next?" asked MedConfidential's Sam Smith.