Across 14 different countries, the three social media platforms examined do not offer the same level of privacy and safety protections for children.

According to the report, children's experience of the same platforms varies significantly from country to country.

Fairplay advocates for an end to marketing that targets children.

TikTok proved particularly problematic in this respect. Almost 40 child safety and digital rights advocacy groups have signed a letter calling on the company to adopt a Safety by Design and Children's Rights by Design approach.

The letter's 39 signatories are child protection and digital rights advocacy organizations from 11 countries, including the UK's 5Rights Foundation and the Africa Digital Rights Hub in Ghana.

In the UK and certain EU markets, researchers found that 17-year-olds signing up to TikTok were given accounts set to private by default.

The researchers also identified many non-European markets where TikTok fails to provide its terms of service in young people's first language. TikTok sometimes provides users with conflicting information, making it difficult for minors to know whether the service is appropriate for them.

TikTok's biggest markets are the US, Indonesia and Brazil. The report's authors argue that all children and young people, not just those in Europe, deserve an age-appropriate experience.

The methodology for Fairplay's research involved central researchers, based in London and Sydney, analyzing platforms' privacy policies and T&Cs, with support from a global network of local research organizations.


The researchers suggest that social media giants' claims to care about protecting children are questionable, since they do not provide the same safety and privacy standards to children all over the world.

Social media platforms seem to be using gaps in the global patchwork of legal protections for minors to prioritize commercial goals, like boosting engagement, at the expense of kids' safety and privacy.

In Europe, legal frameworks have already been enacted to protect children's online experience, such as the UK's Age Appropriate Design Code, which went into effect in September 2020.

Asked about this, a spokeswoman for Fairplay said that regulation works and that tech companies don't act without it. She believes a lack of regulation leaves users more vulnerable to the platforms' business model.

The authors make a direct appeal to lawmakers to require settings and policies that provide the greatest protection for young people's wellbeing and privacy.

The report is likely to add to calls for lawmakers outside Europe to pass legislation to protect children in the digital era, and to avoid the risk of platforms concentrating their most discriminatory and predatory behaviors on minors living in markets that lack legal checks on datafication.

Lawmakers in California have been trying to pass a UK-style age-appropriate design code. And while, earlier this year, a number of US senators proposed a Kids Online Safety Act as child online safety has gained more attention, passing federal privacy legislation of any stripe remains a major challenge in the US.

It's troubling to think that these companies are picking and choosing which young people to give the best safety and privacy protections to. It is reasonable to think that once a company has figured out how to make its products better for kids, it would roll that out to everyone. Social media companies are letting us down once more. Digital service providers need to be required to design their products in ways that work for young people.

She pointed out in remarks accompanying the report that many jurisdictions around the world are looking at this type of regulation. The Age Appropriate Design Code in front of California's state Assembly, for example, could help eliminate some of these risks for young people. Without regulation, you can't expect social media companies to care about privacy and safety.


The spokeswoman for Fairplay said that the researchers found TikTok to be the worst-performing platform, and that the letter's co-signatories felt its failings were the most urgent to address. The two Meta-owned platforms examined are also discussed in the report.

TikTok has over a billion active users, and various estimates suggest that between a quarter and a third of them are minors. The safety and privacy decisions your company makes have the ability to affect 250 million young people globally, and these decisions need to ensure that children and young people's best interests are realized, and realized equally.

We urge you to adopt a Safety by Design and Children's Rights by Design approach, and to immediately undertake a risk assessment of your products around the world to identify and remedy privacy and safety risks on your platform. Where a local practice is found to maximize children's safety or privacy, TikTok should adopt it globally. All of TikTok's younger users deserve the best privacy and protections, not just children from European countries.

"Best" is relative, though, even in Europe, which is considered the de facto global leader in data protection.

Child safety criticisms of TikTok in the region persist, especially related to its extensive profiling and targeting of users, and many of the legal actions and investigations remain unresolved.

The Italian data protection agency sounded the alarm about a planned change to TikTok's privacy policy which, it suggested, did not comply with EU privacy law. The regulator urged the platform not to proceed with the switch, warning it could have troubling ramifications for children on the service.

Italy also ordered the company to block users whose age it could not verify after child safety concerns were linked to a viral TikTok challenge. TikTok went on to remove over half a million accounts in the country that it could not confirm belonged to users aged at least 13.
