TikTok's Trending Mental Health Advice: Over 50% Misinformation, Study Reveals
A recent investigation highlights the prevalence of misleading information in popular mental health content on TikTok, raising concerns about public safety.
An investigation has revealed that over half of the top trending videos on TikTok offering mental health advice contain misinformation.
The findings come amid a growing trend of people seeking mental health support through social media, where many influencers disseminate incorrect information, including misused therapeutic language, 'quick fix' solutions, and unfounded claims.
Among the dubious advice circulating is the suggestion to eat an orange while showering to alleviate anxiety, encouragement of supplements like saffron and magnesium glycinate without robust evidence, and claims that trauma can be healed within an hour.
Other guidance has inaccurately presented normal emotional experiences as indicative of more severe disorders such as borderline personality disorder.
Experts, including psychologists and psychiatrists, have described the findings as troubling and emphasized the need for stronger regulatory measures to protect the public from harmful misinformation.
The investigation analyzed the top 100 videos under the #mentalhealthtips hashtag and determined that 52 of these videos, addressing various mental health issues such as trauma and anxiety, contained misinformation.
Many of the remaining videos were found to be vague or unhelpful.
David Okai, a neuropsychiatrist at King’s College London, pointed out that certain posts misuse therapeutic language, conflating terms like wellbeing, anxiety, and mental disorders, which could lead to misunderstandings about mental health issues.
He noted the risks of general advice based purely on personal anecdotes, suggesting that such content may not be broadly applicable.
Dan Poulter, a former health minister and NHS psychiatrist, cautioned that some videos pathologize everyday emotional experiences, which can mislead impressionable viewers and trivialize the realities faced by those with serious mental health conditions.
Amber Johnston, a psychologist accredited by the British Psychological Society, observed that while many videos contain elements of truth, they often over-simplify complex issues like post-traumatic stress disorder (PTSD).
She stressed that trauma and PTSD symptoms are unique to each individual and necessitate professional guidance for accurate understanding and support.
In response, TikTok stated that it removes content that dissuades individuals from seeking medical assistance or endorses dangerous treatments, and that it has measures in place to direct users searching for mental health information to NHS resources.
Chi Onwurah, a Labour MP, noted that her committee’s investigation into social media misinformation has underscored significant concerns regarding the effectiveness of the Online Safety Act in addressing harmful online content.
She pointed to algorithmic recommendations that amplify misleading information as a central issue that requires urgent attention.
MP Victoria Collins described the findings as “damning” and called on the government to take action against harmful misinformation.
Paulette Hamilton, chair of the health and social care select committee, echoed concerns that social media ‘tips’ should not replace professional mental health support.
Prof Bernadka Dubicka, online safety lead for the Royal College of Psychiatrists, highlighted the necessity for accessible, evidence-based health information.
She reaffirmed the importance of comprehensive assessments by qualified professionals for accurate mental health diagnoses.
Responding to the research, a TikTok spokesperson defended the platform as a space for authentic mental health journeys while pointing to limitations in the study's methodology.
The spokesperson added that the platform works with health experts to counter misinformation.
A government spokesperson mentioned ongoing actions to mitigate the effects of harmful misinformation online, particularly emphasizing provisions in the Online Safety Act requiring platforms to address illegal or harmful content.