Technology/Digital Health
Mental health misinformation on social media: Review and future directions
Clare Dierckman (she/her/hers)
Undergraduate student, lab manager
Indiana University
Bloomington, Indiana, United States
Isabella Starvaggi, B.S. (she/her/hers)
Ph.D. Student
Indiana University Bloomington
Bloomington, Indiana, United States
Lorenzo Lorenzo-Luaces, Ph.D.
Associate Professor
Indiana University
Bloomington, Indiana, United States
Background: Social media is a hallmark of life in the 21st century. In the United States, 72% of people report using at least one social media platform. While social media helps us connect with others, it has also facilitated the spread of misinformation and harmful content, particularly about health-related topics such as vaccines or COVID-19. In recent years, video-sharing platforms like TikTok have skyrocketed in popularity, especially among adolescents and young adults. By the end of 2023, TikTok reached 1.5 billion active users, a 16% increase from 2022 (Curry, 2024). Users may discuss health-related issues, including their mental health. In a previous study, we found that negative content about cognitive behavioral therapy (CBT) is common on TikTok and focuses on CBT’s efficacy and suitability for specific subgroups of patients (Lorenzo-Luaces, Dierckman, & Adams, 2023). Research on mental health misinformation is sparse compared to research on misinformation in other areas of health. The present study aimed to document the literature that exists in this area.
Method: To document the existing literature and to provide recommendations for future work, our team conducted a review of research on mental health misinformation on social media. To gather studies, we searched terms for misinformation, social media, and common mental disorders (e.g., depression, anxiety).
Results: We observed that mental health misinformation is a common occurrence, and its prevalence differs by diagnosis and treatment option. For example, many videos about neurodevelopmental disorders, a popular topic on TikTok, have been found to include “information lacking scientific evidence” (Yeung, Ng, & Abi-Jaoude, 2022). In addition to misinformation, we observed that content may be misleading or harmful without directly conveying misinformation. One study found that TikTok users who indicated that they had dissociative identity disorder (DID) closely identified with the plurality of their “alter” identities (Greene et al., 2023). While this is not necessarily misinformation, it may be harmful because it contrasts with the field’s understanding of treatment for the disorder, which aims to limit the perception of plurality. Lastly, we recognize that some content shared by minoritized or underserved groups that is deemed "incorrect" or "misleading" might result from experiences with inadequate care or prejudice in healthcare settings. For example, in our content analysis of CBT on TikTok, anecdotes of negative experiences in CBT by racial-ethnic minorities and neurodivergent individuals were common (Lorenzo-Luaces, Dierckman, & Adams, 2023).
Implications: Although research on mental health misinformation remains relatively limited, the studies that do exist consistently indicate that it is widespread. The literature thus points to mental health misinformation as a significant public health problem warranting further study.