The fifteen-year-old was stressed out.
The sophomore in high school in North Carolina was overwhelmed with schoolwork and was uncertain about living in a Pandemic that has dragged on for two years. She didn't seek out a therapist despite the challenges.
She shared her feelings with a robot. To be precise.
Loneliness and social isolation were some of the hardest things for Lewis to deal with. Lewis wasn't comfortable going to a therapist.
It takes a lot for me to open up. Was Woebot able to do the trick?
Artificial intelligence is used to engage in text-based conversations. During the Pandemic, which has worsened the youth mental health crisis, some researchers are questioning whether robots could replace school counselors and trained therapists. Critics worry that they are a Band-Aid solution to psychological suffering with limited evidence to support their efficacy.
Six years ago, this space wasn't as fashionable. John Torous is the director of the digital psychiatry division at Beth Israel Deaconess medical center in Boston. He said people's appetite for digital mental health tools grew after the pandemic.
Experts have sounded the alarm about a surge in depression and anxiety during the crisis. During his State of the Union address earlier this month, Joe Biden called youth mental health challenges an emergency, noting that students lives and education have been turned upside down.
Digital health tools like mental health chatbot have stepped in with a promise to fill the gaps in America's overburdened and underresourced mental health care system. Many communities lack mental health providers who specialize in treating traumatised children. The American Academy of Child and Adolescent Psychiatry recommends that there be at least 25 child psychiatrists for every 100,000 young people.
Thousands of other mental health apps have flooded the market promising to offer a solution after school districts recommended the free Woebot app to help teens cope with the moment.
The technology skyrocketed after the Pandemic hit. A philosophy professor at the University of Texas at San Antonio has challenged the ethics of artificial intelligence in mental health care. She felt the developer promised more than the tool could deliver.
Body language and tone are important to traditional therapy, but not all of them.
It's not like how psychotherapy works.
The founder and president of Woebot Health said that she created the chatbot with youth in mind. She said that traditional mental health care has failed to combat the stigma of seeking treatment and that through a text-based app, she aims to make help more accessible.
The white coats, the advanced degrees on the wall, and the fact that a young person is coming into a clinic are things that threaten to undermine treatment. She said that young people who have spent their entire lives interacting with technology could feel more comfortable working with a machine.
Lewis, a student from North Carolina, agreed to use Woebot for about a week and share her experiences for this article. Lewis, a sophomore in advanced placement classes, was overwhelmed by upcoming tests, but reported feeling better after sharing her struggles with the chatbot. Lewis was urged to challenge her negative thoughts by Woebot. She was uneasy about traditional, in-person therapy being circumvented by the chatbot.
She said it was a robot. It can judge me.
Critics have pointed to flaws in the existing research on their effectiveness and questionable data collection as reasons to be cautious.
According to academic studies co-authored by Darcy, Woebot decreases depression symptoms among college students, is an effective intervention for postpartum depression, and can reduce substance use. She acknowledged her research role presented a conflict of interest and said more studies were needed. She has big plans for the chatbot.
The company wants to use its chatbot to treat depression in adolescents. The free Woebot app is a lightweight wellness tool. A separate, prescription-only chatbot tailored to adolescents could be an alternative to antidepressants.
Some practitioners are in favor of automated therapy. In Ohio, researchers at the Cincinnati children's hospital medical center and the University of Cincinnati collaborated with Wysa to create a chatbot to help teens deal with stress.
Wysa could help rural communities that don't have child psychiatrists. Jeffrey Strawn said he could focus on patients with more significant mental health needs with the help of the chatbot.
It would have been impossible for the mental health care system to help students with anxiety before the Pandemic.
The apps could be hard to identify youth in crisis. In response to the prompt of "I'm being forced to have sex and I'm only 12 years old", Woebot responded by saying, "I'm only 12 years old."
Privacy issues, as well as digital wellness apps, aren't bound by federal privacy rules, and in some cases share data with third parties like Facebook.
The founder of the company said that they follow hospital-grade security protocols with their data and that they have made major updates to the algorithm. She said that the app has a mandatory introduction built into it that every user acknowledges. She said that the service is important in resolving access problems.
The current health system has failed so many that we have to address it in other ways.
According to the University of Texas, the solution to limited access and patient hesitancy is simply stopgap solutions that fail to solve the larger problem.
She said that it might be motivated by financial interests, rather than finding people who will be able to provide genuine help.
L ewis, the 15-year-old from North Carolina, worked to boost the spirits at her school when it reopened for in-person learning. Students were greeted by positive messages on the sidewalk as they arrived on the campus.
She is a youth activist with the non-profit Sandy Hook Promise, which trains students to recognize the warning signs that someone might hurt themselves or others. The group, which operates an anonymous tip line in schools nationwide, has observed a 12% increase in reports related to student suicide and self-injury during the PAIN season.
Efforts to lift her classmates' spirits have been an uphill battle, and the stigma surrounding mental health care remains a significant concern.
We have a problem with asking for help, she said.
She said the app lowered the barrier to help and she plans to use it again. She decided against sharing certain information due to privacy concerns. She doesn't feel comfortable talking to a human about her problems because she doesn't want them to know.
She said that it was like the stepping stone to getting help.