TikTok is to limit the direct messaging abilities of accounts belonging to 16 and 17-year-olds as part of a new wave of safety features designed to protect younger users.
The video-sharing app said a change to its privacy settings will see the ability to direct message others turned off by default for those teenagers, meaning users aged 16 and 17 will need to actively change the setting if they want to use the feature. TikTok also confirmed it will now ask all users under 16 to choose who they would like to see a post before they publish their first video.
As part of the update, they will be able to choose to publish content to all their followers, only friends or just themselves.
TikTok said the changes are being rolled out “over the coming months” and have been created to enhance protections for younger users of the platform, as well as to help teenagers better understand the sharing options available to them on the site.
The update follows the decision taken by the platform earlier this year to set accounts belonging to under-16s to private by default and limit features such as direct messaging to those aged 16 and older.
TikTok’s rules allow anyone aged 13 or over to open an account, but online safety campaigners have previously urged platforms to do more to protect younger users from the harms that can be found on social media.
“TikTok’s priority is to ensure our community has a safe and positive experience on the platform,” Alexandra Evans, TikTok’s head of child safety public policy, said.
“This announcement builds on our groundbreaking decision to make all under-16 accounts private by default, and adds to our growing list of features designed to safeguard our teenage users.
“Through our work with teenagers, parents, NGOs and academics, we’ll continue to develop new ways to allow teens to express their creativity and find joy on TikTok whilst ensuring they have a safe experience.”
The latest safety update also includes a new mindfulness feature: notifications will be disabled from 9pm each night for 13 to 15-year-olds, and from 10pm for those aged 16 and 17.
TikTok said it wanted its younger users to “develop positive digital habits early on”, and that managing screen time was a key aspect of this.
“These changes continue to build on our ongoing commitments as there’s no finish line when it comes to protecting the safety, privacy, and well-being of our community,” the social media giant said.
“We’re working with teens, community organisations, parents and creators to further innovate and we’re excited to share more over the coming months.”
Andy Burrows, head of child safety online policy at children’s charity the NSPCC, said he was encouraged by the new tools.
“These increased privacy measures will give children more control over who can contact them and view their content, reducing opportunities for offenders to groom them,” he said.
“TikTok continues to show industry leadership when it comes to protecting children and we urge those tech firms who have been slow to catch up to be similarly proactive.
“However, the raft of safety announcements we have seen in recent weeks has been driven by the Age Appropriate Design Code coming into force next month and shows the positive impact regulation has on children’s safety.”