Should under-16s be banned from social media?

Technology Secretary Peter Kyle said the measure is 'on the table' as he warned tech companies must take more action to protect children.


A ban on social media for under-16s is being considered in the UK as the government seeks to tighten laws around online safety.

Technology Secretary Peter Kyle said the measure is “on the table” and warned tech companies they must take more action to protect children.

The minister has set out his priorities for the online safety regulator, Ofcom, as it prepares to implement and enforce the laws outlined in the Online Safety Act next year.

Social media has been blamed for a rise in children taking their own lives and developing eating disorders, linked to online bullying and exposure to negative body images.


What does the Act entail?

The Act will see new safety duties placed on social media platforms for the first time, requiring them to enforce age limits and protect users, particularly children, from harmful content.

This will include a crackdown on under-13s having access to age-restricted content.

Companies will have three months from the guidance being finalised to carry out risk assessments and make changes to safeguard users.

Platforms could face fines of up to £18m from the watchdog if they do not comply with the Online Safety Act rules when they come into force.


Speaking to the Telegraph newspaper, Kyle suggested the UK would have to move to “another level of regulation” if tech companies do not get together to enforce the Act.

The Technology Secretary said he would not pursue further law changes until he understood how the Online Safety Act works.

When asked if the UK could raise its age limit to 16, Kyle told the Telegraph, “When it comes to keeping young people safe, everything is on the table.”

The proposal has drawn criticism from some groups, with many questioning how such a policy would be enforced and whether it would actually protect children.

Critics have argued that removing children from social media reduces incentives for platforms to provide safer online environments.

Which other countries are doing this?

Australia has unveiled world-first legislation banning children younger than 16 from platforms such as X, Instagram, Facebook, and TikTok.

As part of proposed changes to its own Online Safety Act, Australia plans to require social media platforms to act to prevent online harms such as bullying, predatory behaviour and algorithms pushing destructive content, the government said.

“The Digital Duty of Care will place the onus on digital platforms to proactively keep Australians safe and better prevent online harms,” communications minister Michelle Rowland said.

Prime Minister Anthony Albanese said the measures could become law late next year.

Australia is also trialling an age-verification system to help block children from accessing social media platforms, as part of a range of measures that include some of the toughest controls imposed by any country to date.

The Technology Secretary signalled he had already been speaking to politicians from Australia about the plans.

What’s next?

As the Government prepares to enforce the Act, Kyle has published, for the first time, a statement of strategic priorities for the watchdog Ofcom.

This says Ofcom should ensure platforms follow the principle of “safety by design” from the start, so more harm is caught before it occurs, and pushes for greater transparency from tech firms about the harms occurring on their platforms.

It also urges them to create digital worlds that are inclusive and resilient to harm, including disinformation.

Ofcom will also have to ensure it is “agile” in how it regulates the sector, monitoring and tackling emerging potential harms, such as those posed by AI, and embracing online safety technologies to help improve user safety.

The Government said Ofcom will have to consider each of the stated priorities as it enforces the Act, and report back on what action it has taken to ensure safer online spaces are being delivered.

“Keeping children safe online is a priority for this Government. That is why today I will be the first secretary of state to exercise the power to set out my strategic priorities,” Kyle said.

He also announced that ministers will launch a research project aimed at helping the Government understand the impact of smartphone and social media use on children.


What are charities saying?

Ian Russell, chairman of the Molly Rose Foundation, said the new priorities offered some “course correction” for the Online Safety Act and would allow Ofcom to be “bolder”, but warned more reform to the rules was still needed.

The Molly Rose Foundation (MRF) was set up by Russell and his family in memory of his daughter, Molly, who ended her life aged 14 in November 2017 after viewing harmful content on social media.

“This announcement outlines a much-needed course correction, vital for improved online safety and to prevent the new regulation from falling badly short of expectations,” he said.

“However, while this lays down an important marker for Ofcom to be bolder, it is also abundantly clear that we need a new Online Safety Act to strengthen current structural deficiencies and focus minds on the importance of harm reduction.”

Maria Neophytou, director of strategy and knowledge at the NSPCC, said the new priorities have “the potential to change the online world for children”.

She added: “Through Childline, we hear daily from young people about the range of harms they are experiencing online, including online bullying, access to content encouraging suicide and eating disorders and child sexual abuse and exploitation.

“Tech companies must be transparent about the harm happening on their platforms. They should be disrupting ‘safe havens’ for offenders by tackling the hidden abuse taking place through private messaging.

“It is right that the Government is focusing on driving innovation and new technology that can identify and disrupt abuse and prevent harm from happening in the first place.”

What did Ofcom say?

An Ofcom spokesperson said: “Our resolve to create a safer life online for children and adults in the UK has never been stronger.

“We welcome the Government’s draft statement of strategic priorities for online safety, which, once finalised, will help shape this important work.”

Help and support is available now if you need it. Details of services available can be found at stv.tv/advice 

The Samaritans can be contacted any time, from any phone, free on 116 123, by email at jo@samaritans.org, or by visiting samaritans.org to find your nearest branch. Details of other services and more information can be found on the NHS website.
