Charities and advocacy groups have warned that TikTok isn't doing enough to deal with self-harm and eating disorder content on its platform.
More than two dozen charities and advocacy groups, including the NSPCC and the Molly Rose Foundation, have signed a letter to TikTok’s head of safety urging the social media platform to improve its moderation of self-harm and eating disorder content.
The coalition of charities and advocacy groups claims in its letter that TikTok has not acted quickly enough following research published by the Center for Countering Digital Hate (CCDH). The research suggested that TikTok's algorithm pushes self-harm and eating disorder content to teenagers just minutes after they express interest in these or related topics.
They warn that TikTok is acting too slowly to deal with self-harm and eating disorder content on its platform, accusing the social media company of denying the problem, deflecting responsibility and delaying appropriate action.
The coalition also claims that TikTok has removed very few of the 56 hashtags associated with eating disorders identified in the CCDH's research. The CCDH says content under these hashtags has been viewed a further 1.6 billion times since it released its findings in December 2022.
In the letter, the coalition puts forward potential measures that TikTok could implement to address the issue:
- "Strengthening your content moderation policies to better address harmful eating disorder and suicide content.
- Working with mental health experts and advocacy organisations to develop a comprehensive approach to identifying and removing harmful content.
- Providing resources and support to users who may be struggling with eating disorders or thoughts of suicide.
- Increasing transparency and accountability by regularly reporting on the steps you are taking to address these issues and the impact of those efforts."
During the same week the letter was published, TikTok announced a default one-hour daily screen-time limit for teenagers' accounts. However, this limit can be changed or removed in the user's app settings.
A spokesperson for the social media platform said in response to the letter: “Our community guidelines are clear that we do not allow the promotion, normalisation or glorification of eating disorders, and we have removed content mentioned in this report that violates these rules. We are open to feedback and scrutiny, and we seek to engage constructively with partners who have expertise on these complex issues, as we do with NGOs in the US and UK.”
However, concerns about content on the platform and TikTok's response remain, forming part of a wider discussion of the potentially negative impact that social media platforms and their algorithms can have on people's mental health.
If you are struggling with your mental health, find local sources of support on this website.