Minority mental health patients face many health inequities that may stem from implicit bias and a lack of cultural awareness among their healthcare providers. I analyzed the current literature evaluating implicit bias among healthcare providers and the culturally specific life traumas that Latinos and African Americans face that can impact their mental health. Additionally, I researched a current mental health assessment tool, the Child and Adolescent Trauma Survey (CATS), and evaluated its suitability for use with Latino and African American patients. Face-to-face interviews with two healthcare providers were also used to analyze the CATS for its applicability to these patients. Results showed that this assessment was not sufficient to capture the culturally specific life traumas of minority patients. Based on the literature review and the analysis of the provider interviews, a novel assessment tool, the Culturally Traumatic Events Questionnaire (CTEQ), was created to address the gaps in existing mental health assessment tools used with minority patients.
The recent popularity of ChatGPT has called into question the future of many lines of work, among them psychotherapy. This thesis aims to determine whether AI chatbots should be used by undergraduates with depression as a form of mental healthcare. Because of barriers to care such as understaffed campus counseling centers, stigma, and issues of accessibility, AI chatbots could perhaps bridge the gap between this demographic and receiving help. This research includes findings from studies, meta-analyses, reports, and Reddit threads documenting people's experiences using ChatGPT as a therapist. Based on these findings, only AI chatbots designed specifically for mental health can be considered appropriate for psychotherapeutic purposes. Chatbots built purposefully to discuss mental health with users can provide support to undergraduates with mild to moderate symptoms of depression. AI chatbots that promise companionship should never be used as a form of mental healthcare, and ChatGPT should generally be avoided as well, except perhaps to ask for referrals to resources. Chatbots not focused on mental health should be trained to respond with referrals to mental health resources and emergency services when they detect inputs related to mental health, and especially to suicidality. In the future, AI chatbots could be used to notify mental health professionals of reported symptom changes in their patients, and as pattern detectors that help individuals with depression understand fluctuations in their symptoms. AI more broadly could also be used to enhance therapist training.
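The detect-and-refer behavior recommended in the second abstract can be illustrated with a minimal sketch. Everything below is an assumption for illustration: the keyword lists, the helper name `screen_message`, and the use of substring matching are hypothetical, and a deployed system would use a trained classifier rather than keywords. The 988 Suicide & Crisis Lifeline is a real US resource.

```python
# Minimal sketch of a pre-response screen for a general-purpose chatbot.
# Hypothetical keyword lists and helper; real systems would use a trained
# classifier, not substring matching.

CRISIS_TERMS = {"suicide", "suicidal", "kill myself", "self-harm", "end my life"}
MENTAL_HEALTH_TERMS = {"depressed", "depression", "anxiety", "hopeless", "therapy"}

CRISIS_REFERRAL = (
    "It sounds like you may be in crisis. Please contact the 988 Suicide & "
    "Crisis Lifeline (call or text 988 in the US) or local emergency services."
)
RESOURCE_REFERRAL = (
    "I'm not able to provide mental healthcare, but a campus counseling "
    "center or a licensed therapist can. Would you like help finding one?"
)

def screen_message(text: str) -> str | None:
    """Return a referral message if the input mentions mental health,
    prioritizing suicidality; return None to let the chatbot respond normally."""
    lowered = text.lower()
    if any(term in lowered for term in CRISIS_TERMS):
        return CRISIS_REFERRAL
    if any(term in lowered for term in MENTAL_HEALTH_TERMS):
        return RESOURCE_REFERRAL
    return None

if __name__ == "__main__":
    # Triggers the resource referral, since "depressed" is a flagged term.
    print(screen_message("I've been feeling depressed lately"))
```

The screen runs before the chatbot generates a reply, so crisis inputs are intercepted with a referral rather than an open-ended conversational response, matching the thesis's recommendation that non-mental-health chatbots refer rather than counsel.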