Matching Items (4)
Description

Elizabeth Grumbach, the project manager of the Institute for Humanities Research's Digital Humanities Initiative, shares methodologies and best practices for designing a digital humanities project. The workshop offers participants an introduction to digital humanities fundamentals, specifically tools and methodologies. Participants explore technologies and platforms that allow scholars of all skill levels to engage with digital humanities methods, and are introduced to a variety of tools (including mapping, visualization, data analytics, and multimedia digital publication platforms), as well as how and why to choose specific applications, platforms, and tools based on project needs.
Contributors: Grumbach, Elizabeth (Author)
Created: 2018-09-26
Description

The recent popularity of ChatGPT has called into question the future of many lines of work, among them psychotherapy. This thesis aims to determine whether AI chatbots should be used by undergraduates with depression as a form of mental healthcare. Given barriers to care such as understaffed campus counseling centers, stigma, and issues of accessibility, AI chatbots could perhaps bridge the gap between this demographic and the help they need. This research draws on findings from studies, meta-analyses, reports, and Reddit threads documenting people’s experiences using ChatGPT as a therapist. Based on these findings, only AI chatbots built specifically for mental health can be considered appropriate for psychotherapeutic purposes. Chatbots designed purposefully to discuss mental health with users can provide support to undergraduates with mild to moderate symptoms of depression. AI chatbots that promise companionship should never be used as a form of mental healthcare. ChatGPT should generally be avoided as a form of mental healthcare, except perhaps to ask for referrals to resources. Chatbots not focused on mental health should be trained to respond with referrals to mental health resources and emergency services when they detect inputs related to mental health, especially suicidality. In the future, AI chatbots could be used to notify mental health professionals of reported symptom changes in their patients, and as pattern detectors to help individuals with depression understand fluctuations in their symptoms. AI more broadly could also be used to enhance therapist training.

Contributors: Simmons, Emily (Author) / Bronowitz, Jason (Thesis director) / Grumbach, Elizabeth (Committee member) / Barrett, The Honors College (Contributor) / Department of Psychology (Contributor)
Created: 2023-05
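
The abstract above recommends that chatbots not focused on mental health respond with referrals when they detect mental-health-related inputs. A minimal Python sketch of such a pre-reply screen is below; the keyword lists, the referral wording, and the screen_message helper are all hypothetical illustrations, not anything specified in the thesis, and real systems would use clinically validated classifiers rather than keyword matching.

# Hypothetical sketch: screen each user message before the normal chatbot
# pipeline runs, routing mental-health-related inputs to referrals instead
# of generated text. Keyword lists and referral text are placeholders.

CRISIS_TERMS = {"suicide", "suicidal", "kill myself", "end my life", "self-harm"}
DISTRESS_TERMS = {"depressed", "depression", "hopeless", "anxiety", "panic attack"}

CRISIS_REFERRAL = (
    "It sounds like you may be in crisis. Please contact emergency services "
    "or a crisis line such as 988 (US) right away."
)
SUPPORT_REFERRAL = (
    "I'm not able to provide mental healthcare. A campus counseling center "
    "or a licensed therapist can offer real support."
)

def screen_message(text: str) -> str | None:
    """Return a referral if the input looks mental-health-related;
    return None so the normal reply pipeline handles everything else."""
    lowered = text.lower()
    if any(term in lowered for term in CRISIS_TERMS):
        return CRISIS_REFERRAL
    if any(term in lowered for term in DISTRESS_TERMS):
        return SUPPORT_REFERRAL
    return None

if __name__ == "__main__":
    for msg in ["I feel hopeless lately", "What's the weather tomorrow?"]:
        print(msg, "->", screen_message(msg))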