Content that glorifies or celebrates self-harm and suicide is widely available via internet search engines, Ofcom has warned.
The regulator said research carried out on its behalf by the Network Contagion Research Institute found that one in every five links (22%) in the search results it analysed led to content which glorified, or offered instruction about, self-harm, suicide or eating disorders.
To generate their results, the researchers entered common search terms linked to self-injury, as well as more cryptic phrases used by online communities to conceal their true meaning, and analysed more than 37,000 result links on five major search engines – Google, Microsoft Bing, DuckDuckGo, Yahoo and AOL.
According to the research, image searches provided the highest proportion of harmful results, with 50% of results being considered extreme.
Ofcom noted that previous research has shown that images are harder for detection algorithms to filter out as it can be difficult to distinguish between visuals glorifying self-harm and those shared in a medical or recovery context.
The study also found that the cryptic search terms returned more harmful content, with users six times more likely to find dangerous content about self-harm when using deliberately obscure search terms.
However, the research did note that help, support and educational content was available and signposted – with one in five search results linking to content focused on getting people help.
Ofcom also acknowledged that some search engines offer safety measures, such as a safe search mode, which restricts inappropriate content, and these were not used by the researchers in the study.
The regulator warned that search engines must act to ensure they are ready to fulfil their requirements under the Online Safety Act, which legally requires internet companies to protect children from harmful content.
Ofcom’s online safety policy development director, Almudena Lara, said: “Search engines are often the starting point for people’s online experience, and we’re concerned they can act as one-click gateways to seriously harmful self-injury content.
“Search services need to understand their potential risks and the effectiveness of their protection measures – particularly for keeping children safe online – ahead of our wide-ranging consultation due in spring.”
Each of the search engines included in the research has been approached for comment.