"I Think This Subreddit Should Seriously Consider Having Suicide Hotline Info Posted": How Can We Improve Proximal Suicide Prevention Efforts on Social Media Platforms?
Abstract
The recent volatility of cryptocurrencies has caused significant financial losses, leading many individuals to communicate their stress via social media platforms. On one such platform, Reddit, the increase in suicide-related chatter has prompted users to post suicide helplines within their forums (subreddits). While this may represent a positive first step in suicide prevention on social media platforms, helplines alone may not match the user's location and depend heavily on the user choosing to engage. Advances in machine learning algorithms may, therefore, offer a more effective frontier for suicide prevention. We discuss the potential for machine learning algorithms to identify communications of distress and to message people in-app with timely support. We further comment on the adaptability of this approach to broader health promotion efforts.
Keywords
Suicide prevention, Social media, Algorithms, Health promotion
Comments
Suicide continues to be a global health issue [1]. The complex and unique factors that lead an individual toward considering suicide make it difficult to develop proximal prevention strategies. To be effective, such strategies must respond in a time-sensitive manner to the escalation of a person's distress and provide highly accessible options for support. Partnership with social media platforms to develop appropriate responses to suicide-related content may, therefore, represent a key goal for improving global health outcomes around suicide.
The recent volatility of cryptocurrencies has wiped out an estimated one trillion US dollars in market value. This drop has led to an eruption of discussion on social media platforms such as Reddit, where users have reported experiencing significant stress about losing large sums of money. Since their emergence, social media platforms have become an integral part of how people communicate and receive news. Increasingly, they also host online communities where people discuss common interests (e.g., cryptocurrency). Yet despite their dominant role in people's lives, there is still no formal monitoring of user distress and risk on these platforms.
The increase in this type of chatter has prompted the moderators of some cryptocurrency forums on Reddit (e.g., r/terraluna, r/Bitcoin, r/CryptoCurrency) to pin links to suicide helplines at the top of discussions. Although easy to overlook, the pinning of helplines may represent a key first step for suicide prevention on social media forums. If developed further, the active inclusion of targeted helplines within a user's social media experience could provide a foundation for active online intervention, connecting users who may be experiencing distress to relevant support services. Importantly, such an approach would not be limited to suicide prevention but could be adapted for a range of other health conditions.
Previous research has identified that (a) people post about suicide online and (b) functions within social media platforms may be able to mitigate suicide risk [2,3]. For example, some 'subreddits' (topic threads) on Reddit are dedicated solely to supporting individuals who are experiencing suicidal thoughts or planning suicide (e.g., r/SuicideWatch). While such peer-support-focused forums may benefit individuals who engage with them, people who make suicide-related posts on other subreddits or other platforms may not receive similar support, highlighting the opportunity for more widespread monitoring of suicide-related content.
Posts following the recent cryptocurrency market crash show how some Reddit users are attempting to compensate for the lack of platform-led interventions by adding suicide helplines to posts. While this is an important first step, is there more we can do than simply pin links to suicide helplines?
Organisations like Live for Tomorrow (https://livefortomorrow.co/) may provide a blueprint for what successful outreach might look like on social media sites. Live for Tomorrow describes itself as a "help line that moves first": it identifies posts from people in crisis and provides evidence-based support within the application where the post was made (e.g., via the platform's private-message function). While some social media platforms have partnered with such organisations, we have yet to see real-time suicide prevention strategies actively implemented within many platforms themselves. Current strategies are limited to algorithmic detection paired with person-to-person responding, in which trained responders proactively reach out to individuals in crisis. While proactive human outreach may represent a gold standard of care to strive toward, it may lack the immediacy required in periods of acute suicidal distress.
Recent advances in machine learning offer potential for reaching at-risk individuals who communicate possible suicide risk on social media platforms [2,4-6]. By implementing evidence-based algorithmic strategies for suicide prevention, it may be possible to provide real-time support to users who are experiencing suicidal distress. For example, following the blueprint of Live for Tomorrow, algorithms could identify concerning posts and automatically message those users in-app, providing crucial resources in a timely fashion.
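To make this concrete, the sketch below (in Python) illustrates one way such a detect-then-respond pipeline could be structured. It is a minimal illustration under strong assumptions: the training examples are toy data we invented, the classifier is deliberately simple, and send_support_message is a hypothetical platform hook; a deployed system would require clinically validated training data, calibrated decision thresholds, and ethical and clinical oversight.

```python
# Minimal sketch of a detect-then-respond pipeline (illustrative only).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training examples (hypothetical; real systems need expert-labelled data).
posts = [
    "I lost everything in the crash and I can't see a way out",
    "Great discussion on staking rewards this week",
    "I don't want to be here anymore",
    "What wallet do you all recommend for long-term holding?",
]
labels = [1, 0, 1, 0]  # 1 = possible distress, 0 = routine content

# Simple text classifier: TF-IDF features + logistic regression.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(posts, labels)

def send_support_message(user_id: str, region: str) -> None:
    """Hypothetical platform hook: deliver an in-app message with
    region-specific helpline and support-service information."""
    print(f"[to {user_id}] If you're struggling, free and confidential support "
          f"is available in {region}. Would you like helpline details?")

def screen_post(user_id: str, region: str, text: str,
                threshold: float = 0.5) -> None:
    # Probability that the post communicates distress; in practice this
    # threshold would need careful calibration against false positives
    # and missed cases.
    p_distress = model.predict_proba([text])[0][1]
    if p_distress >= threshold:
        send_support_message(user_id, region)

screen_post("u/example", "New Zealand",
            "I lost everything and I can't see a way out")
```

The design separates detection (the classifier) from response (the platform hook), so either component could be replaced, for example by routing flagged posts to a partnered crisis-support service, without changing the other.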
By developing specific automated interventions, it may be possible to guide individuals toward support services during suicidal crises. Posts containing particular keywords or phrases could be identified and trigger a pop-up message advising the individual to seek help. A constant challenge would be identifying the novel terms people use to communicate suicidal distress, but the importance and benefit of reaching out to those we can identify should not be overlooked. Past research indicates that individuals often overlook the seriousness of their distress, even at the level of a suicidal crisis [7,8]. Thus, if automated pop-up messages could trigger greater self-awareness and lead to help-seeking behaviors, such an approach could catalyze significant change within suicide prevention strategies and, once adapted, within other health-related areas.
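As a simpler illustration of this keyword-triggered approach, the sketch below matches posts against a small watch-list of phrases and returns a pop-up message on a match. The phrase list is illustrative only; a real system would need clinically informed, regularly updated term lists, precisely because the language used to communicate distress evolves over time.

```python
# Minimal sketch of a keyword-triggered pop-up (illustrative only).
import re

# Hypothetical watch-list of phrases associated with suicidal distress.
DISTRESS_PHRASES = [
    r"\bkill myself\b",
    r"\bend it all\b",
    r"\bdon'?t want to (be here|live)\b",
    r"\bno (way|reason) (out|to go on)\b",
]
PATTERN = re.compile("|".join(DISTRESS_PHRASES), re.IGNORECASE)

def popup_if_distressed(post_text: str) -> str | None:
    """Return a pop-up message if the post matches a watch-list phrase,
    otherwise None."""
    if PATTERN.search(post_text):
        return ("It sounds like you might be going through a difficult time. "
                "Free, confidential support is available. Would you like to "
                "see helplines for your region?")
    return None

print(popup_if_distressed("I lost my savings and I don't want to be here anymore"))
```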
Overall, the proliferation of internet chatter following the cryptocurrency market crash is the latest example of how individuals communicate distress on social media platforms. Such posts are not systematically monitored, however, and while other users are attempting to pin suicide helplines as an offer of support, users in distress may not engage with them, and the helplines listed may not be country-specific (e.g., local helplines, emergency services). Advances in machine learning offer a critical frontier for social media platforms to identify communications of distress and message people in-app, providing support that is time-sensitive and links individuals to region-specific support services. Such action has the potential to increase the effectiveness of proximal suicide prevention efforts and, in turn, contribute to better mental health outcomes globally. As our world becomes increasingly digital, so too must our strategies for identifying and preventing suicide risk.
Declarations
Conflict of interest statement
The authors declare there is no conflict of interest.
Funding source
This research received no specific grant from any funding agency in the public, commercial, or not-for-profit sectors.
Authors' contributions
AM & BR conceived the idea. AM wrote the original draft. AM, BR, & DS were all involved in the review and editing of the manuscript.
Acknowledgements
The authors wish to acknowledge the support of Prof. Gareth Treharne for his contributions in providing general feedback on this work.
References
- World Health Organisation (2021) Suicide.
- Kavuluru R, Williams AG, Ramos-Morales M, et al. (2016) Classification of helpful comments on online suicide watch forums. Proceedings of the 7th ACM International Conference on Bioinformatics, Computational Biology, and Health Informatics, 32-40.
- Mason A, Jang K, Morley K, et al. (2021) A content analysis of reddit users' perspectives on reasons for not following through with a suicide attempt. Cyberpsychol Behav Soc Netw 24: 642-647.
- Alambo A, Lokala U, Kursuncu U, et al. (2019) Question answering for suicide risk assessment using Reddit. 13th IEEE International Conference on Semantic Computing, Newport Beach, California.
- Liu T, Zheng Z, Zhou Y, et al. (2022) Enriching an online suicidal dataset with active machine learning. Proceedings of the 2022 ACM Southeast Conference, 196-200.
- Monselise M, Yang CC (2022) "I'm always in so much pain and no one will understand" - Detecting patterns in suicidal ideation on Reddit. Companion Proceedings of the Web Conference 2022, 686-691.
- Czyz EK, Horwitz AG, Eisenberg D, et al. (2013) Self-reported barriers to professional help seeking among college students at elevated risk for suicide. J Am Coll Health 61: 398-406.
- Ennis E, McLafferty M, Murray E, et al. (2019) Readiness to change and barriers to treatment seeking in college students with a mental disorder. J Affect Disord 252: 428-434.
Corresponding Author
Andre Mason, BSc, Department of Psychology, PO Box 56, University of Otago, Dunedin 9054, New Zealand, Tel: +64-479-7636
Copyright
© 2022 Mason A, et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.