Suicide is a significant public health problem, with incidence rates and lethality increasing each year. Given the large human and financial cost of suicide worldwide, alongside the lack of progress in suicide prediction, more research is needed to inform suicide prevention and intervention efforts. This study approaches suicide through the lens of suicide note-leaving behavior, which can provide important information on predictors of suicide. Specifically, using data from the National Violent Death Reporting System (NVDRS, n = 98,515), this study adds to the existing literature on note-leaving by examining history of suicidality, mental health problems, and their interaction as predictors of leaving a suicide note, alongside the demographic predictors examined in previous research. We fit a logistic regression model predicting whether a decedent left a suicide note. The results indicated that those with mental health problems or a history of suicidality were more likely to leave a note than those without such histories, and that those with both mental health problems and a history of suicidality were the most likely to leave one. These findings reinforce the need to tailor suicide prevention efforts toward identifying and targeting higher-risk populations.
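The analytic approach described above, a logistic regression with two binary predictors and their interaction, can be sketched as follows. This is a minimal illustration on fabricated data, not the thesis's actual NVDRS analysis: the cell sizes, note-leaving counts, and variable names (`mh`, `hx`) are all assumptions made for the example.

```python
# Hypothetical sketch: logistic regression predicting note-leaving from
# mental health problems (mh), history of suicidality (hx), and their
# interaction (mh * hx). All counts below are fabricated for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

def make_cell(mh, hx, n, n_notes):
    """n decedents in one (mh, hx) cell, n_notes of whom left a note."""
    X = np.tile([mh, hx, mh * hx], (n, 1))   # columns: mh, hx, interaction
    y = np.array([1] * n_notes + [0] * (n - n_notes))
    return X, y

# Assumed note-leaving counts per 100 decedents in each (mh, hx) cell.
cells = [(0, 0, 100, 20), (1, 0, 100, 40), (0, 1, 100, 40), (1, 1, 100, 70)]
Xs, ys = zip(*(make_cell(*c) for c in cells))
X, y = np.vstack(Xs), np.concatenate(ys)

model = LogisticRegression().fit(X, y)

# Predicted note-leaving probability for each predictor profile:
# neither, mh only, hx only, both (interaction term set accordingly).
profiles = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [1, 1, 1]])
probs = model.predict_proba(profiles)[:, 1]
```

Because the interaction term makes the model saturated for a 2x2 design, the fitted probabilities track the cell rates, and the "both" profile comes out highest, mirroring the pattern the abstract reports.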
The recent popularity of ChatGPT has called into question the future of many lines of work, among them psychotherapy. This thesis aims to determine whether AI chatbots should be used by undergraduates with depression as a form of mental healthcare. Because of barriers to care such as understaffed campus counseling centers, stigma, and issues of accessibility, AI chatbots could perhaps bridge the gap between this demographic and receiving help. This research draws on findings from studies, meta-analyses, reports, and Reddit threads documenting people's experiences using ChatGPT as a therapist. Based on these findings, only AI chatbots built specifically for mental health can be considered appropriate for psychotherapeutic purposes. Chatbots designed purposefully to discuss mental health with users can provide support to undergraduates with mild to moderate symptoms of depression. AI chatbots that promise companionship should never be used as a form of mental healthcare. ChatGPT should generally be avoided as a form of mental healthcare, except perhaps to ask for referrals to resources. Chatbots not focused on mental health should be trained to respond with referrals to mental health resources and emergency services when they detect inputs related to mental health, and to suicidality especially. In the future, AI chatbots could be used to notify mental health professionals of reported symptom changes in their patients, and as pattern detectors to help individuals with depression understand fluctuations in their symptoms. AI more broadly could also be used to enhance therapist training.