From Friday, websites and apps containing pornography or harmful content will require “highly effective” age verification.
The new rules, introduced under the Online Safety Act and enforced by the UK’s communications watchdog Ofcom, aim to protect children online.
Research has found that most teenagers have recently seen “potentially harmful content”.
The changes will place a duty of care on platforms toward young users.
What will the new laws change?
From July 25, when the law will be implemented, websites, apps, social media platforms and search engines will be required to do more to protect children.
Ofcom has said that it has been too easy for children to see pornography online.
According to an Ofcom survey, 8% of children aged eight to 14 had visited an online porn site or app in the previous month – including around 3% of eight- to nine-year-olds, the youngest children asked.
Platforms must now put in place a “highly effective” barrier stopping anyone who cannot prove they are over 18.
Ofcom said thousands of sites have already committed to age checks, including the UK’s biggest pornography site, Pornhub, as well as dating apps and social media sites such as Discord and Reddit.
How can you prove your age?
Websites and apps can use various methods to verify a user’s age, and they might carry out checks themselves or use another company to do it for them.
- AI age estimation – Technology will analyse a photo or video of your face to estimate your age
- Open banking – An age-check service will securely access information from your bank about whether you are over 18
- Digital ID – Services such as digital identity wallets can securely store and share information which proves your age
- Credit card – Because you must be over 18 to have a credit card, providing your card details allows a payment processor to check if the card is valid
- Email-based age estimation – Technology uses your email address to analyse other online services where it has been used, such as banking or utility providers
- Mobile network operator – The service checks whether your mobile phone number has age filters applied to it
- Photo-ID matching – You upload an image of a document that shows your face and age, and an image of yourself at the same time – these are compared to confirm if the document is yours
How will this affect social media?
Social media companies have been criticised for failing to protect children online.
Under the new laws, Ofcom says platforms such as Facebook and Instagram must “configure their algorithms to filter out harmful content from children’s feeds”.
This includes content relating to self-harm, suicide, pornography and eating disorders, as well as violent content.
It also covers misogynistic, hateful or abusive material, such as online bullying and dangerous viral challenges.
Platforms will also need to implement easier reporting and moderation of harmful content and must respond to complaints with appropriate action.
According to Ofcom, 31% of children who go online have seen something that they found worrying or nasty.
How will websites be held to account?
Ofcom has said these new codes demand platforms take a “safety-first” approach when operating in the UK.
Sites and apps will not be told how to regulate their own platforms, but they must carry out their own children’s risk assessments.
But Ofcom will be responsible for enforcing these new rules.
Platforms that fail to comply risk a fine of up to £18m or 10% of worldwide revenue, whichever is greater, and could be shut out of the UK entirely.
Dame Melanie Dawes, Ofcom chief executive, said: “These changes are a reset for children online.
“They will mean safer social media feeds with less harmful and dangerous content, protections from being contacted by strangers and effective age checks on adult content.”
“Ofcom has been tasked with bringing about a safer generation of children online, and if companies fail to act they will face enforcement.”