By Liora Engel-Smith
When Tessah O’Neill felt sad about her miscarriage, she turned to Facebook for comfort. It was her first Mother’s Day after the loss, the 22-year-old Jacksonville resident recalled.
“I ended up losing my eventual kid early in my pregnancy and I don’t feel like a mother … ,” O’Neill wrote in the Facebook mental health community, “A group where everyone is super fucking supportive for no reason.” The group bills itself as a destination “for anyone who is struggling with mental health, self harm, or anything that needs support.”
Without a child to “prove” her motherhood to the world, she wrote, the holiday made her feel like an intruder.
She had joined the super supportive group a few months before. It took answering a few questions and an administrator’s approval, but the more than 20,000-member group was at her fingertips that Mother’s Day. On any given day, members like O’Neill share all manner of hardships with strangers, from dealing with anxiety, depression or loss to questions about sexuality, self-esteem and even thoughts of suicide and self-harm. As per the group’s rules, whatever advice, support or validation they give each other has to be positive, or members risk being banned from the private group.
Facebook groups of all kinds have flourished in recent years as part of the social media giant’s effort to foster meaningful connections. As of last year, more than 400 million Facebook users had joined at least one group, a 75 percent increase from 2017.
There’s now a Facebook community for everything, it seems, from hobbies to sports to illness-specific support groups. Though Facebook did not respond to repeated inquiries seeking information about the mental health groups on its platform, a search on the platform yielded pages of results.
With the coronavirus pandemic prompting much of social life to move online, coupled with a rise in the incidence of anxiety and depression, the question of easy-to-access avenues for mental health support has gained a new urgency.
Like other online venues, Facebook communities hold tremendous power for good, but the groups can also influence people in crisis in negative ways, perhaps even heightening their risk for suicide. But exactly who bears the responsibility for members’ mental health and well-being when something goes wrong isn’t always clear.
‘We try our best’
Kelsi Chlovechok, a past moderator at the “Endometriosis Support Group” and a 30-year-old Goldsboro resident, knows this tension first hand. Though the group focuses on a physical illness, the chronic pain that’s often associated with the disorder has led several members of the group to post about suicidal thoughts. Chlovechok, who worked in the mental health field until she lost her job recently, divided her free time between running the endometriosis group and the super supportive group.
But last November, a member of the endometriosis group who called herself Mari Sol went through with her plans after posting about her distress in the group.
The following week, a Facebook friend who is also a member of the group notified the community that Mari Sol had killed herself. Along with prayers and rest-in-peace comments, one member asked the group’s admins if they could do safety checks on people who express suicidal thoughts in posts.
Francesca Pooley, a group administrator who did not respond to a request for an interview, wrote that such checks aren’t a responsibility that administrators and moderators can take on. While admins reach out to members who they suspect are suicidal via private message, she wrote, reaching out to every struggling member in a group with thousands of people would be too time-consuming.
“There is no way we could guarantee picking up on every concerning post or say with certainty that we are putting suicidal members in touch with the correct authorities to access the help they need,” she wrote. “We try our best and one of the mod team has pinned an announcement on the group with suicide prevention and mental health contacts. There’s not much more we can do.”
Chlovechok, who checked in with Mari Sol over private messages, said she can’t help but feel that she failed the woman.
“I know that it’s not my fault, but of course I feel guilty,” she said. “This woman reached out for help and then she fucking died.”
The super supportive group is the brainchild of 17-year-old Issy Abu-Joudeh, who lives in Michigan. It began with a simple premise: Abu-Joudeh, who herself struggles with anxiety and depression, said she wanted to create a positive space online for people with mental illness.
So in April 2019, she started her first mental health community, “A group where everyone is super fucking nice to each other for no reason.” As the group grew, Abu-Joudeh, who screens member posts before they go live, noticed that some of them involved heavier topics, including self-harm and suicide.
Facebook has strict rules surrounding discussions of suicide and other self-harm, according to its community standards. While the social media platform generally allows discussions on these topics, Facebook may remove content that it deems too graphic or that encourages or provides instructions for engaging in harmful behaviors.
Users can report such posts to Facebook or to group moderators for review, and groups that repeatedly get reported risk being deleted from the platform altogether.
Abu-Joudeh said she created the super supportive group in July 2019 to protect the super nice group from being deleted, while still allowing people to discuss heavier topics in a supportive environment. Abu-Joudeh and more than 70 volunteer moderators screen all of the posts in both groups before they go live, but members are also instructed to tag the group admins on a problematic post for review. They are told not to report questionable posts to Facebook, a strategy many groups follow.
“Reporting can get us zucced!,” the group rules say, using an online slang term that means Facebook will delete the group for not complying with its community standards. Admins mute members who report a post to Facebook, preventing those who reported from commenting or responding to posts for 24 hours, according to the group’s rules.
Despite that safeguard, Abu-Joudeh knows that Facebook can delete the group if it gets reported too many times.
“Our main goal is to help people, and if Facebook doesn’t agree with that, well, we’ll just make another one,” she said.
Even when admins vet every post, questionable posts do get through.
One woman in the super supportive group detailed a suicide plan in a January post. A February post by another woman describes admiring scars from self-harm. Another post from May details a cutting ritual.
Abu-Joudeh counters that though members discuss suicide and self-harm in the group, she and the admins who help run the group don’t know of any member who died by suicide. Approval of posts, she said, can sometimes vary from moderator to moderator, particularly with new volunteers, who sometimes approve some of the more graphic posts in the group.
“That’s the gray area,” said Chlovechok, who is also a past admin of the super nice and super supportive groups. “You can mention [suicide and self-harm], but once you mention it you can’t go into detail. You can’t say you’re thinking of doing it.”
Walking the line between allowing valuable discussion in the group and avoiding triggering others can be tricky, Chlovechok said. To help members decide what to read, every post with potentially problematic content must include a trigger warning for mentions of suicide or other sensitive topics at the top before it gets approved. Even when members include content warnings, moderators may reach out to users whose pending posts are too graphic, either asking them to amend the post before it is published or supporting them one-on-one through private messages.
As a past admin in the super supportive group, Chlovechok said she’s had “at least a dozen if not more” personal conversations per week with members whose posts about suicide and self-harm were deemed too graphic or inappropriate for the group. Dozens of other members also sent private messages about suicidal thoughts and self-harm to her personal Facebook page daily, she added.
Even with one-on-one attention, Chlovechok said, the help she could provide was limited.
The power of ‘a stranger’s support’
O’Neill, the woman whose miscarriage prompted her to ask for help in the super supportive group, said the responses her post garnered helped her deal with the pain she felt. One commenter wrote that mothers may lose their children but they never lose motherhood. It didn’t matter that O’Neill’s child wasn’t born yet, or that her pregnancy ended early on, the woman wrote. These words had a lasting impact on O’Neill.
“A stranger’s support can sometimes be the best support … ,” O’Neill said. “It’s people you have no relations to and they can choose to be mean to you … but they are actively choosing to open their heart and say ‘hey, I’m here, you can come to me. I am here for you’ which is astounding because strangers aren’t normally that nice.”
Mental health support communities on Facebook can be a salve to social isolation, cutting through the stigma that often comes with these conditions, and allowing members to interact with others in relative anonymity. Unlike appointment-only professional support, the internet is there 24/7 and offers the ability to interact with others almost immediately, an option that many adolescents and young adults with depression favor, research shows.
The National Suicide Prevention Lifeline operates a 24/7 hotline at 1-800-273-8255 and a live chat available at https://suicidepreventionlifeline.org/chat/
The Charlotte-based Promise Resource Network operates a 24/7 warm-line that’s open to all North Carolina residents and is available at 1-833-390-7728.
Strangers with shared experiences can be a source of hope and information in dark times, said 30-year-old Garner resident Chandler Picot. A trained peer support specialist, Picot said talking to others on Facebook helped him explore his options. Picot, who has treatment-resistant depression, used groups to learn about electroconvulsive therapy and transcranial stimulation, both options for people for whom antidepressants don’t work.
“It’s hard to believe someone if they haven’t been through it,” he said. “ … If you’ve been through what I’ve been through and you’re telling me that there’s a way that you’ve gotten better, then I’ll believe you.”
Not a substitute for professional help
It’s hard to say how many users post cries for help on Facebook because many mental health groups are private so their content is hidden from non-members. And posts that moderators decline to publish never become public. Facebook did not respond to an inquiry about the number of posts it flags because of suicide or self-harm mentions that violate its community rules.
But concerns over the live-streaming of suicide prompted the social media company to tighten its controls over such content in 2017. Facebook now uses an algorithm that scans posts for signs of crisis and passes along information about potential suicide risks to law enforcement agencies when needed. Facebook also sends users who are deemed at risk an automatic message with the suicide prevention hotline.
As Mari Sol’s experience in the endometriosis group shows, however, these safeguards don’t always work.
“Nothing replaces human connection, that ability to touch each other and to read each other’s nuanced expression,” said Cherene Caraco, CEO of the peer-run Promise Resource Network, an agency that supports uninsured people with mental health needs in Mecklenburg County.
Online interactions, she said, can draw in people who are at the beginning of their mental health journey and are not yet ready for professional help. Typing a relatively anonymous post on a private group can feel less threatening than calling a therapist, for example.
But when topics such as suicide and self-harm come up, she said, well-meaning people can make matters worse, either by offering platitudes or by saying the wrong thing, deterring a vulnerable person in crisis from reaching out in the future. An appropriate response, Caraco said, expresses sympathy for the person’s feelings, acknowledges their pain or invites them to talk about their feelings either in private, in the group or through a hotline.
O’Neill, the 22-year-old who got help on the super supportive group after her miscarriage, agrees that Facebook mental health communities aren’t a substitute for professional help, but knowing that a group of strangers is ready to offer positive words can be powerful.
“They go hand in hand,” she said. “A professional can be like ‘here are ways to fix this,’ but in the group, it’s more like ‘here are ways to connect, we’re here for you if you need support.’”