Military and Social Media Minds Meet About Suicide Prevention at Pentagon

Top minds in the U.S. military, mental health, suicide prevention and social media arenas converged Wednesday in a packed room at the Pentagon to find immediate, actionable ways to help people at risk of suicide. As his last official act before the transition to the new administration, Army Secretary Eric Fanning led the daylong summit. Here are some highlights:

[See: 9 Things to Do or Say When a Loved One Talks About Taking Their Life.]

Making mental health services more accessible while breaking down biases and stigma is an ongoing effort, Fanning said, one that’s not partisan or exclusive to the military. But with two wars in the last 15 years, he said, serving in the “crucible of combat” with multiple tours away from home puts added strain on service members. Reaching new audiences with mental health messaging and services means speaking to people where they are. “Social media is a tool that can help us save lives,” Fanning said.

“I have to tell you I’m not a big fan of social media,” said Gen. Paul Selva, vice chairman of the Joint Chiefs of Staff. Selva’s not on Facebook himself, but he acknowledged that social media is where most soldiers, sailors, airmen and Marines choose to interact today. He said he is eager to use technology for good, like facial recognition to identify and help service members in distress.

Leaders at the highest levels, including Surgeon General Vivek Murthy, shared personal experiences of family members, friends and fellow soldiers lost to suicide. Murthy and other speakers noted that emotional well-being is the core of readiness, whether for serving on the battlefield or getting through day-to-day life. “The foundation of connection is dialogue,” he said. Technology can build silos that keep people apart, he said, or it can help break those silos down.

Social media postings might predict which service members are at risk, according to research presented at the meeting. A study of 1,400 military deaths, half by suicide, found reliable signals in public social media postings of service members who took their own lives. The things people wrote about in the months leading up to their deaths offer potential clues, says AnnaBelle Bryan, director of education and outreach for the National Center for Veterans Studies at the University of Utah.

For instance, posts about stressful life events followed by posts about negative emotions a few days later were a signal for death by suicide — but the reverse order was not. Social media posts about behaviors such as alcohol use, social withdrawal and aggression, followed by posts on negative emotions in the next few days, did not necessarily indicate suicide. However, a danger sign was when service members continued posting about these behaviors but no longer wrote about negative emotions.
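The ordering effect is easier to see in code. Here is a minimal sketch, assuming hypothetical topic labels and a three-day window (neither detail comes from the study itself), of how a timeline of labeled posts could be scanned for the stressful-event-then-negative-emotion sequence the researchers flagged:

```python
from datetime import datetime, timedelta

# Hypothetical topic-labeled posts: (timestamp, topic). The labels and the
# three-day window are illustrative assumptions, not the study's coding scheme.
posts = [
    (datetime(2016, 5, 1), "stressful_event"),
    (datetime(2016, 5, 3), "negative_emotion"),
    (datetime(2016, 5, 10), "alcohol_use"),
]

def flags_risk_pattern(posts, window_days=3):
    """True if a stressful-event post is followed, within the window,
    by a negative-emotion post. The reverse ordering deliberately does
    not trip the flag, mirroring the asymmetry in the finding."""
    ordered = sorted(posts)
    for i, (t1, topic1) in enumerate(ordered):
        if topic1 != "stressful_event":
            continue
        for t2, topic2 in ordered[i + 1:]:
            if t2 - t1 > timedelta(days=window_days):
                break
            if topic2 == "negative_emotion":
                return True
    return False

print(flags_risk_pattern(posts))  # True for the sample timeline above
```

Note that only the stressful-event-first ordering returns True, matching the researchers’ observation that the reverse sequence was not a signal.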

[See: 11 Simple, Proven Ways to Optimize Your Mental Health.]

While good treatments are available for nearly all mental health disorders, the issue is getting treatments to those who need them, said Dr. Thomas Insel, a former director of the National Institute of Mental Health and leader of the mental health team at Verily Life Sciences. Suicide rates have barely budged despite advances in medicine, neuroscience and genomics.

Information technology offers the promise of near-term, actionable prevention methods, Insel said. For example, some adaptive interventions use mobile technologies, such as cellphones and wearable monitors, to gather data and deliver online therapy, coaching and peer support.

Privacy around sensitive personal data is an issue. “This whole opportunity is going to crash if we don’t get trust from people we want to help,” Insel said, stressing the need for transparency, agency and responsibility.

Matthew Nock, a professor of psychology and director of Harvard University’s Laboratory for Clinical and Developmental Research, discussed using smartphones and wearable technology to study imminent suicide risk and the potential for delivering prevention interventions through social media platforms like TalkLife.

Dr. Dan Reidenberg, executive director of SAVE — Suicide Awareness Voices of Education — and managing director of the National Council for Suicide Prevention, has conducted many psychological autopsies of people who’ve died by suicide. Technology can be used to do more around mental health and suicide prevention than ever thought possible, Reidenberg said, with important reservations.

Risks posed by social media include unmoderated chat rooms, pro-suicide groups that attract vulnerable individuals and, in infrequent cases, streaming live incidents of self-harm or death, Reidenberg said. On the other hand, social media facilitates connection for troubled people and increases awareness of prevention programs. Any new tools that emerge for these global platforms must be culturally sensitive and appropriate, he added.

Google searches provide a unique method for reaching out, says Karthik Raman, a research scientist at Google. Because a query about suicide goes to a search engine rather than another person, he says, there’s no stigma or fear of being judged. When people launch queries like “reasons not to commit suicide,” the search automatically provides resources with the message “Confidential help is available for free” and a chat link and number for the National Suicide Prevention Lifeline (800-273-8255).

To pinpoint suicide-related queries with slight wording variations, Raman says, Google’s machine-learning toolkit identifies general wording patterns rather than exact phrases. That approach captures users’ suicidal intent far more effectively, he says, and makes people more likely to click on the help information.
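As a toy illustration of that difference — with the phrase list, token set and threshold all invented for the example, and no relation to Google’s actual models — compare an exact-phrase lookup with a looser token-pattern matcher:

```python
# Toy contrast between exact-phrase lookup and looser pattern matching.
# The phrases, tokens and threshold are illustrative assumptions only.
EXACT_PHRASES = {"reasons not to commit suicide"}
PATTERN_TOKENS = {"suicide", "commit", "kill", "end", "die", "life"}

def exact_match(query: str) -> bool:
    return query.lower().strip() in EXACT_PHRASES

def pattern_match(query: str, min_hits: int = 2) -> bool:
    # Flag the query when enough risk-related tokens appear, regardless
    # of exact wording -- a crude stand-in for a learned pattern model.
    tokens = set(query.lower().split())
    return len(tokens & PATTERN_TOKENS) >= min_hits

for q in ["reasons not to commit suicide", "why should i not end my life"]:
    print(q, "->", exact_match(q), pattern_match(q))
# The second query misses the exact-phrase list but still trips the
# pattern matcher -- the point at which help resources would be shown.
```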

“We don’t track the emotive tone of your posts on LinkedIn,” notes Daniel Savage, head of the veterans program at the online professional network. Instead, he says, LinkedIn serves veterans in different ways, like helping those challenged by the transition to civilian life reach out to online peers rather than struggle in isolation.

Susan Booker, a designer who works on suicide prevention initiatives at Apple, is trying to get into people’s heads so Siri, the intelligent voice assistant, can best respond. On all of Apple’s platforms worldwide, Siri feels safe and private, so people get right to the point: “Where is the nearest hospital?” or “I want to end my life.”

Siri’s meant to have an empathetic voice and a personality. “We want Siri to be quirky and funny and glass half-full, but we don’t know where people are emotionally,” Booker says. Apple is working on more nuanced responses, Booker says, with the goal: “How can Siri keep people safe?”

Facebook focuses on suicide prevention because of the scale of the issue and the opportunity for support, says Vanessa Callison-Burch, a Facebook product manager. In collaboration with experts and drawing on input from users who’ve experienced suicidal ideation or made attempts — and keeping privacy issues in mind — the company introduced its most refined prevention tools in 2016.

[See: Apps to Mind Your Mental Health.]

Facebook users who report a concerning post receive options for what to say and do and how to reach out to the friend. Troubled users can opt to receive carefully crafted messages of caring and support. Instagram has a version of this tool as well, Callison-Burch pointed out.


Military and Social Media Minds Meet About Suicide Prevention at Pentagon originally appeared on usnews.com
