The UK government has completed a major revision to its controversial but populist online safety legislation, which has been in the works for years but has been paused since this summer amid turmoil in the governing Conservative Party.

The new secretary of state for digital said in September that the government would make changes to the bill before bringing it back to parliament.

The draft legislation will be back in the House of Commons next week.

The changes to the online safety bill were made in response to concerns that it could lead to platforms over-blocking content and chilling freedom of expression online.

This aspect of the bill has caused a lot of controversy.

The secretary of state for digital wrote in a press release that any incentives for social media firms to over-remove people's legal online content will be taken out. The bill won't define specific types of legal content that companies must address, but firms will still need to protect children and remove content that is illegal.

I promised I would make some common-sense tweaks and I have.

This is a stronger, better bill for it. It is focused where it needs to be: on protecting children and on stamping out illegality online.

Now it is time to pass it.

— Michelle Donelan MP (@michelledonelan) November 29, 2022

The government says the changes mean future governments won't be able to influence what private companies do about legal speech on their sites, and that companies won't be incentivised to take down legitimate posts in order to avoid sanctions. Other changes are intended to make social media platforms more transparent and accountable to their users.

The changes will force tech firms to publish more information about the risks their platforms pose so people can see what dangers sites really hold. Firms will be made to show how they enforce their user age limits to stop kids circumventing them, and they will have to publish details of when the regulator, Ofcom, has taken action against them.

Donelan wrote in the Telegraph that she had changed the bill to reflect the values of the British way of life, saying the online safety bill was created to protect children and that the changes she has made significantly strengthen its child protection elements.

She argued that the 'legal but harmful' clause in the bill infringed the rights of adults to choose what legal speech they say and see, and that it has been replaced with a new system based on choice and freedom. Under that system, tech giants should not remove anything unless it is forbidden in their terms and conditions; users will have a right of appeal for the first time; platforms will need to be more transparent; and Silicon Valley executives won't be able to treat some users differently from others.

An amendment to the legislation revealed over the weekend will make it a criminal offence to encourage self-harm, meaning platforms will have a legal duty to remove such content.

The government also recently announced it will criminalize the sharing of deepfake porn without consent.

‘Triple shield’

DCMS is pitching its new approach with the Online Safety Bill as a 'triple shield' of online protection that is most strongly focused on children but still offers measures intended to help general consumers shield themselves from a range of online harms.

Provisions in the revised bill would let adult users limit their exposure to content that may be unpleasant to them but does not meet the bill's higher bar of being strictly illegal.

The government has also retained a requirement for the biggest platforms to offer users tools to control whether they can be contacted by unverified social media users.

The triple shield will replace the duties relating to 'legal but harmful' content accessed by adults, which the government says will make sure the bill strikes the right balance. Adults will be given more control over the online posts they don't want to see.

Internet companies will have to offer adults tools to help them avoid certain types of content, such as the glorification of eating disorders, racism, antisemitism or misogyny. These tools could include human moderation, blocking content flagged by other users, or sensitivity and warning screens.

The government says it has strengthened provisions to protect children at the same time as adapting the bill to respond to concerns over its impact on freedom of expression for adults.

Donelan argued that nothing is being watered down or taken out when it comes to children; rather, extra protections are being added, so there is no change for kids.

She said platforms will still be required to prevent children from being exposed to 'legal but harmful' speech, noting that many platforms already ban such content in their terms and conditions but don't enforce them, and that the legislation would require platforms to live up to their promises.

Ian Russell, the father of Molly Russell, the 14-year-old British girl who killed herself five years ago after viewing social media content promoting self-harm and suicide, expressed concern that the bill was being watered down.

He noted that the 'legal but harmful' duties were included in the bill when it had been due to have its third reading in the Commons.

Russell gave the example of a pencil-style drawing, found during the inquest into his daughter's death, to illustrate the dangers of 'legal but harmful' content.

He argued that such content had to be regulated against when it is being sent to someone who is young and vulnerable, and said it's important to look into the algorithms that deliver it as well. That, he said, is where the worry lies.


Platforms haven't paid enough attention to age verification and age assurance, Russell said, arguing that they have turned a blind eye to the age of people on their platforms.

While not embracing the government's edits to the 'legal but harmful' duties in the bill, Russell did welcome DCMS's drive to dial up transparency obligations on platforms via revisions that will require them to publish risk assessments.

Donelan, for her part, maintained that content which is harmful to children but not illegal will still have to be removed under this version of the bill, saying the kind of content Molly Russell saw won't be allowed and that there should be no cases like hers in the future.

She believes the revised bill will force platforms to enforce their own age restrictions.

She said the bill had been strengthened so that companies can't claim to only allow children over 13 to join while letting 10-year-olds on and actively promoting content to them. The government is stopping that from happening, she said: firms will have to tell parents how they are enforcing their age limits, and will need to work with the Children's Commissioner when producing guidelines.

Donelan pointed to what she described as the "very punitive sanctions" still in the bill, including fines of up to 10% of global annual turnover if a company violates any aspect of the legislation, calling them a big incentive not to breach the rules.

She said the government has strengthened this aspect of the bill so that companies have to be assured of the age of their users.

"We are not saying you have to use X specific tech, because it will be out of date by next week, but you can use a range of age assurance technology or age verification technology," she said.

This component of the bill is likely to face fierce opposition from digital rights campaigners, who are already warning that biased artificial intelligence is likely to be the tech applied at scale to predict users' ages as platforms seek to meet compliance requirements.

The removal of a harmful communications offence is one of the changes the government has made to the bill.

The false and threatening communications offences have been retained.

To make sure the criminal law continues to protect people from harmful communications, including racist, sexist and misogynistic abuse, the government will no longer repeal elements of the Malicious Communications Act.

Major platforms will also be required not to remove content, or suspend or ban users, where there has been no violation of the law or of their terms of service.

The criminal offence of controlling or coercive behaviour will be added to the list of priority offences in the bill.

Platforms will have to take proactive steps, such as putting in place measures that allow users to manage who can interact with them and their content, rather than only responding when illegal content is flagged to them through complaints.

The Children's Commissioner will now be a statutory consultee on the regulator Ofcom's codes of practice, which platforms will be required to adhere to in order to shrink their legal risk.

The government tabled some of the latest amendments to the Bill in the Commons for Report Stage on December 5, but noted that further amendments will be made at later stages of the Bill's passage.

According to new polling from Ipsos, which DCMS claims shows "overwhelming public backing for action," 83% of people think social media companies should have a duty to protect children who use their platforms.

According to the survey, eight in ten people think the government should make sure social media companies protect children when they are online and seven in ten think social media companies should be punished if they don't.

Donelan commented in a statement:

“Unregulated social media has damaged our children for too long and it must end.

“I will bring a strengthened Online Safety Bill back to Parliament which will allow parents to see and act on the dangers sites pose to young people. It is also freed from any threat that tech firms or future governments could use the laws as a licence to censor legitimate views.

“Young people will be safeguarded, criminality stamped out and adults given control over what they see and engage with online. We now have a binary choice: to get these measures into law and improve things or squabble in the status quo and leave more young lives at risk.”

