Today in the U.K., the 12-month grace period to comply with a design code intended to protect children online has expired. This means app developers offering digital services in the market to children (defined in this context as users under 18 years of age) are expected to adhere to a set of standards designed to prevent those children from being profiled and tracked.
Although the age-appropriate design code came into force on September 2, 2020, the U.K.'s data protection watchdog, the ICO, gave organizations the maximum 12-month grace period to comply, allowing them time to adapt their services. It expects the code to be followed from today.
The code applies to connected toys and games, edtech and online retail, as well as to for-profit online services such as social media and video-sharing platforms.
Among its stipulations, the code requires that settings default to a high level of privacy if the user is (or is suspected to be) a child, including specific provisions that geolocation and profiling be disabled by default (unless there is a compelling reason otherwise).
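To illustrate how that default-off stance might look in practice, here is a minimal TypeScript sketch; the type and function names are hypothetical, not drawn from the code or any SDK.

```typescript
// Minimal illustrative sketch (hypothetical names): "high privacy" defaults
// for any user who is, or may be, under 18.
interface PrivacySettings {
  geolocation: boolean; // location tracking
  profiling: boolean;   // behavioural profiling / personalised recommendations
}

function defaultPrivacySettings(isOrMayBeChild: boolean): PrivacySettings {
  if (isOrMayBeChild) {
    // Per the code's default stance: geolocation and profiling off unless the
    // service can point to a compelling reason otherwise.
    return { geolocation: false, profiling: false };
  }
  // Defaults for adult users remain a product decision outside this sketch.
  return { geolocation: true, profiling: true };
}
```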
Apps that provide parental controls must also give the child age-appropriate information about those tools. The code warns against parental tracking tools that can be used to silently or invisibly monitor a child's movements without making them aware.
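As a rough sketch of that transparency point (again with hypothetical names), a service offering parental location tracking could surface an obvious, age-appropriate signal to the child whenever monitoring is active:

```typescript
// Illustrative only: never track silently; tell the child when monitoring is on.
interface ChildNotifier {
  notify(message: string): void; // e.g. a persistent banner or status icon
}

function setLocationMonitoring(enabled: boolean, child: ChildNotifier): void {
  if (enabled) {
    // The code warns against silent or invisible tracking, so the child is
    // shown a clear indicator for as long as monitoring remains active.
    child.notify("A parent or guardian can currently see your location.");
  } else {
    child.notify("Location sharing with your parent or guardian is off.");
  }
}
```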
Another standard takes aim at dark pattern design, warning app developers against using nudge tactics to push children into giving up unnecessary personal data.
The code runs to 15 standards in total, but it is not itself law; it is a set of design recommendations that the ICO wants app developers to follow.
App developers are nonetheless expected to comply with the watchdog's child privacy standards, because the ICO explicitly links conformance with the code to compliance with broader data protection requirements that are part of U.K. law.
Apps that disregard the standards run the risk of coming to the watchdog's attention, whether via a complaint or a proactive investigation, which could lead to an ICO audit examining their entire approach to data protection.
In guidance on its website, the ICO says it will monitor conformance with the code through proactive audits, consider complaints and take appropriate action to enforce the underlying data protection standards, subject to applicable law and in line with its Regulatory Action Policy. To ensure effective and proportionate regulation, it adds, it will focus its attention on organisations and individuals suspected of repeated or wilful misconduct, or of failing to comply with the law.
In other words, a failure to comply with the children's privacy code could be taken as a potential black mark against compliance with enforceable U.K. data protection law.
Stephen Bonner, the ICO's executive director of regulatory futures and innovation, warned app makers in a blog post last week that the watchdog will be asking social media platforms, music streaming sites and the gaming industry to explain how their services comply with the code, will help identify any areas where support is needed and, if necessary, can audit or investigate organisations.
He said the areas currently posing the greatest risk are video and music streaming sites and video gaming platforms, sectors where children's personal data is used to bombard them with personalized content and services. These can include inappropriate ads, unsolicited messages and friend requests, and privacy-eroding nudges urging children to stay online. The watchdog is concerned about the potential harms this data use could cause, including financial, psychological and physical harms.
Bonner said the ICO expects organisations to demonstrate that children's best interests are their primary concern and that children's rights are respected, adding that the code clarifies how organisations can use children's data in compliance with the law and that the watchdog expects them to commit to protecting children by designing services in accordance with the code.
At least on paper, the ICO's enforcement powers are quite extensive: the U.K. GDPR gives it the power to fine infringers up to £17.5 million or 4% of their annual worldwide turnover, whichever is greater.
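For a company with, say, £1 billion in annual worldwide turnover, 4% would come to £40 million, so that higher figure, rather than the fixed £17.5 million, would set the maximum.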
The watchdog can also issue orders prohibiting data processing and requiring that services be modified, so apps that flout the children's design code could face regulatory bumps, or worse.
There have already been signs that major platforms are paying attention to the ICO's compliance deadline: in recent months, and ahead of the September 2 cut-off, Instagram, YouTube and TikTok have all announced changes to how they handle minors' data and account settings.
In July, Instagram announced it would default teens to private accounts, doing so for users under 18 in certain countries, as the platform confirmed to us. In August, Google made similar changes to minors' accounts on its video-sharing platform, YouTube.
A few days later, TikTok announced it would add further privacy protections for teens, although it had already made changes to privacy defaults for under-18s earlier in the year.
Apple has also run into trouble with the digital rights community after announcing child safety-focused features, including a tool to detect child sexual abuse material (CSAM) that scans photos as they are uploaded to iCloud, as well as an opt-in parental safety feature that lets iCloud Family account holders turn on alerts when a minor views explicit images via its Messages app.
All these product updates on mainstream platforms share a common theme: child protection.
Online child safety has also been gaining attention in the U.S., amid growing concern about the ways some apps abuse children's data, and there are a number of open probes in Europe (such as the Commission's investigation into TikTok), but the U.K. may be having a disproportionate impact thanks to its efforts to promote age-focused design standards.
The code also dovetails with incoming U.K. legislation that will place a duty of care on platforms to take a broad, safety-first approach to users, with a strong focus on children (and it is broadly targeted, covering all users under 18 rather than just kids under 13, as with COPPA in the U.S.).
In the blog post ahead of the compliance deadline, the ICO's Bonner sought credit for the significant changes made by platforms such as Facebook, Google, Instagram and TikTok, writing that, as the first code of its type, it is also having an impact globally: members of the U.S. Senate and Congress have urged major U.S. gaming and tech companies to adopt the ICO's code for children in America.
He also mentioned that Ireland's Data Protection Commission is currently preparing to introduce its Children's Fundamentals to safeguard children online, which link closely with the code and follow the same core principles.
And there are other examples in the EU: France's data watchdog, the CNIL, looks to have been inspired by the ICO's approach, issuing its own set of child-protection-focused recommendations this June (which likewise encourage app makers to add parental controls, with the clear caveat that such tools must respect the child's privacy and best interests).
The U.K.'s emphasis on online child safety isn't just making waves abroad, but also generating growth in the domestic compliance services sector.
Last month, for instance, the ICO announced its first set of GDPR certification scheme criteria, including two schemes that focus on the age-appropriate design code. Expect plenty more to follow.
Bonner's blog post also noted that the watchdog will formally set out its position on age assurance this autumn, allowing it to provide further guidance to organizations within scope of the code; he suggested that could mean asking for ages to be verified or estimated. Whatever the recommendations turn out to be, expect age assurance services to be making compliance-focused sales pitches soon.
Children's online safety has been a major focus for U.K. policymakers in recent years, although the broader Online Safety (née Harms) Bill remains at the draft stage.
An earlier attempt by U.K. lawmakers, dating back to 2017, to establish mandatory age checks to stop kids accessing adult content websites was dropped in 2019 after widespread criticism that it would be unworkable and pose a huge privacy risk to adult consumers of porn.
That hasn't dampened the government's determination to regulate online services in the name of child safety, and online age verification checks look set to be introduced, if not as a blanket requirement for all digital services, then increasingly by the backdoor, through a kind of recommended-feature creep (as the Open Rights Group has warned).
The current age-appropriate design code, for instance, says app makers should take a risk-based approach to recognising the age of individual users and ensure the code's standards are effectively applied to child users. That means they must either establish age with a level of certainty appropriate to the risks to children's rights that arise from their data processing, or apply the code's standards to all of their users instead.
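Sketched out in the same illustrative TypeScript terms as above (the confidence thresholds and the age-assurance signal are assumptions, not figures from the code), that branching logic might look something like this:

```typescript
// Illustrative sketch of the risk-based approach: either establish age with
// enough certainty for the risk involved, or apply the code to the user anyway.
interface AgeSignal {
  likelyAdult: boolean; // output of whatever age assurance method is used
  confidence: number;   // 0..1 certainty in that assessment
}

function mustApplyChildStandards(
  signal: AgeSignal | null,
  risk: "low" | "high" // risk the data processing poses to children's rights
): boolean {
  // Assumed thresholds: higher-risk processing demands more certainty.
  const required = risk === "high" ? 0.95 : 0.8;
  if (signal !== null && signal.likelyAdult && signal.confidence >= required) {
    return false; // age established with appropriate certainty: adult experience
  }
  return true; // otherwise, treat this user as a child and apply the code
}
```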
At the same time, there is a risk that the government's wider push on online safety will conflict with some of the noble aims of the ICO's non-legally binding children's privacy design code.
Take, for instance, the code's (welcome) suggestion that digital services gather as little information about children as possible. Earlier this summer, by contrast, U.K. lawmakers released guidance for messaging and social media platforms ahead of the Online Safety legislation that suggests they prevent children from being able to use end-to-end encryption.
That's right: the government's advice to data-mining platforms, which it suggests will help prepare them for the requirements of the incoming legislation, is not to use E2E encryption for children.
So the U.K. government's official message to app developers is that, in future, the law will push commercial services to access more of children's information, not less, in the name of keeping them safe, in direct contradiction of the data minimization push in the design code.
The risk is that a growing focus on children's privacy could be undermined by poorly thought-out policies that push platforms to monitor children in order to demonstrate protection against a wide range of online harms, such as pro-suicide posts, cyberbullying and CSAM.
In the name of demonstrating compliance and showing they are reducing the risk of harms such as child abuse, platforms could be pushed to retain more children's data, possibly risk-profile children, and roll out age verification checks that might be applied to all users (a sledgehammer to crack a nut). In short: a privacy dystopia.
These mixed messages and inconsistent policymaking look set to pile confusion, and even conflicting requirements, onto digital services operating in the U.K., leaving tech businesses legally responsible for finding clarity amid the policy chaos, with the prospect of huge fines if they fail to do so.
Complying with the ICO's design standards may turn out to be the easy part.