It’s a relatable feeling. You’re feeling unwell, have multiple symptoms, and you decide to Google what might ail you. Sometimes doom takes hold and you self-diagnose with the worst-case scenario. But it’s not just physical ailments. CBS News found that as the mental health crisis among American teens deepens, they seek out alternative forms of support and information. Increasingly, young people turn to social media platforms like TikTok and diagnose themselves with serious mental health problems.
With qualified therapists expensive and increasingly hard to find, many young people search for answers on social media platforms, where the answers are abundant and free but not necessarily accurate.
According to a recent Pew Research survey, 1 in 6 Gen Zers use TikTok as a search engine — a place not only to watch dance and lip-syncing videos, but to find bite-sized chunks of informational content.
“If I’m trying to figure out how to do something, I feel like it’s easier to go on TikTok,” said Alexis Diomino, a third-year psychology student.
It’s not just a search engine. It’s a place to give advice, share feelings and experiences and talk about serious mental health disorders.
“Social media therapy”
At an early age, 19-year-old Samantha Fridley was diagnosed with anxiety and depression. By the time she was in fifth grade, she began having suicidal thoughts. Even with therapy, her struggles with anxiety and thoughts of self-harm made her feel alone.
“I felt like there was no one that could help me. And I felt because I had been through so much that there is just no treatment for me,” said Fridley. “I started looking for other people through social media. Then I looked through Instagram — any social media I could. And then TikTok.”
It was on TikTok that Fridley said she began searching for mental health advocacy and motivation. It’s not uncommon: on TikTok, videos tagged #mentalhealth have been viewed more than 67 billion times.
The phenomenon now gaining traction is referred to as social media therapy.
“What they are doing is they are going into the interactive media space to soothe themselves, to make themselves feel better, to make themselves the master of that environment when they don’t feel that they’ve mastered the environment of the outside world,” said Dr. Michael Rich, director of the Digital Wellness Lab at Boston Children’s Hospital.
“There is a demand that’s being filled by people now. The question really is: how well and how safely is it being filled?”
Experts like Rich say that open conversation about mental health can provide support and reduce stigma. But there are concerns that turning to social media influencers as de facto therapists is risky.
“I think that we need to understand why people are coming to these influencers for help. But we also have to have some kind of quality control,” said Rich. “Unfortunately, these people, usually young people, are untrained, and with the best of intentions they are trying to be there for their peers. They are, first of all, not able to necessarily detect how much distress someone is in or how close someone is to actually harming themselves. And so, I think there’s a real issue here of the demand far outstripping the supply.”
The algorithm feeding frenzy
Fridley says her search for mental health-related content led her down a dangerous rabbit hole. She was already seeing a therapist for her diagnosed anxiety and depression but started watching countless videos of influencers sharing thoughts on serious mental health conditions — and says TikTok flooded her feed with hundreds more.
“As you look through TikTok and as the algorithm strengthens, it turned into diagnosis and turned into other things like ADHD and borderline personality disorder and more depression and anxiety,” said Fridley.
The content appearing in her “For You” feed was the result of TikTok’s unique algorithm, which suggests videos based on what you’ve searched, shared or liked. Fridley says being bombarded with vague symptoms of various mental disorders led to her diagnosing herself.
“It just got to a point where I was losing sleep because of it. I would be up until like 3 a.m. on TikTok, just like researching,” she said. But Fridley was never professionally diagnosed with any of those disorders.
In an email to CBS News, TikTok said the “For You” feed “…reflects preferences unique to each user. The system recommends content by ranking videos based on a combination of factors, including videos you like or share, accounts you follow, comments you post, and content you create.”
TikTok told CBS News it began testing ways to avoid recommending a series of similar videos on the same topic to users and is checking whether its system inadvertently feeds a narrower range of content to its viewers.
But there are concerns about the unintended consequences of a steady stream of mental health content, especially when that content is inaccurate or misleading.
In one recent study by the Center for Countering Digital Hate, researchers posed as 13-year-old users and searched and “liked” mental health videos. They found that TikTok pushed potentially harmful content to these users on average every 39 seconds. Some users received recommendations for content about suicide within 2.6 minutes of joining the app.
“What’s online is a free-for-all. There really is no accountability for this and there is no responsibility taken,” said Rich.
According to one analysis published in The Canadian Journal of Psychiatry of popular TikTok videos about ADHD, 52% were deemed misleading.
TikTok wouldn’t agree to an interview, but in a statement to CBS News, a spokesperson wrote: “We will remove misinformation that causes significant harm to individuals, our community, or the larger public regardless of intent.”
The company also wrote: “We care deeply about the well-being of our community, which is why we continue to invest in digital literacy education aimed at helping people evaluate and understand content they engage with online. We strongly encourage individuals to seek professional medical advice if they are in need of support.” (Read the full TikTok statement here.)
“I’ve talked even to people at TikTok, and I kept saying, ‘you know, at some point, I know you only care about misinformation if it’s COVID, or politics. You don’t care about misinformation, about psychology, but you have to understand this is mental health,'” said Dr. Inna Kanevsky, a professor of psychology at San Diego Mesa College.
For the last few years, Kanevsky has been battling psychological misinformation on TikTok, debunking faulty mental health information one video at a time.
“You can give people advice based on your experience as long as you’re clear that that’s where you’re coming from,” said Kanevsky.
With a million followers and more than 36 million views, Kanevsky has become a TikTok star and reverse influencer herself. But her candid takes on mental health misinformation are not always well received — especially by users who find themselves on the receiving end of a debunking video.
“If I correct people… people get very mad at me because they [say] they are just talking about their personal experience. They’re not doing anybody any harm.”
But Kanevsky says there can be actual harm, which is why she steps in.
“People believe all kinds of things that are not actually true because somebody they find relatable said it, and they find this person more relatable than some medical doctor or some Ph.D. with science articles. And they want to value the personal experience.”
Social media experts say that is at the heart of the problem.
“There are content creators who are trained physicians and clinicians, and people working in this space trying to counter disinformation. But it’s almost like being a salmon and swimming upstream,” said Robin Stevens, an associate professor of communications at the University of Southern California in Los Angeles.
“To see real change, it really does have to come at the platform level and requires quite a bit of content moderation,” she said.
Stevens runs the Health Equity and Media Lab at USC. She typically works with Black and Latinx youth and studies how they are using social media to find solutions to the public health issues they face — including mental illness.
For most of her career, Stevens studied and critiqued social media platforms. But this past year she began working with Instagram’s Well-being Creator Collective — a pilot program that educates and trains influencers and content creators on how to produce responsible mental health content.
Meta, the parent company of Facebook and Instagram, recently held a two-day summit with these content creators in Los Angeles. Stevens is one of their expert advisers.
“As they created Reels, we content-analyzed them to see what the level of disinformation was. What was the level of how much the teens respond to it? What was the content that teens were viewing over and over?” she said. “And then we would feed that back to them to help them create better Reels.”
“I was a little skeptical to see what they would really be doing. Was this just PR? And I will say that working in the Wellness Collective, they actually had a brilliant approach of how to bring more supportive content that showed they understood how youth use media,” said Stevens.
But until more content moderation happens at the platform level, users must be aware of the ways in which their feeds are populated.
Samantha Fridley says it took a full detox for her to finally free herself from the grips of mental health influencers and self-diagnosis. She spent 56 days in residential rehab away from her phone and TikTok. And while she still uses the app, the way she uses it has changed dramatically. She stopped watching mental health videos and searched for content that would reset her feed, like K-pop and comedy.
“It’s a great resource for funny videos,” said Fridley. “But it’s not a good resource for diagnosing yourself. And if you start diagnosing yourself, you’re going to fall into a spiral that you will really have a hard time getting out of.”
Advice for teens and parents
Both Dr. Kanevsky and Dr. Rich say parents need to play an active part in how their children are engaging with mental health-related social media posts. Dr. Rich compares social media to a power tool: using it safely must be taught.
Teens whose feeds are flooded with negative posts have a couple of strategies: they can try to reset the algorithm by changing the types of videos they watch, like and comment on, and watching positive posts can help displace the negative content. They can even delete their accounts and start from scratch.
TikTok, Snapchat, YouTube and Meta are being sued by families around the country claiming the platforms’ algorithms have caused depression, eating disorders and suicide in young people. Statements from Snapchat, YouTube owner Google and Meta to CBS News can be found here.