Barrett, The Honors College at Arizona State University proudly showcases the work of undergraduate honors students by sharing this collection exclusively with the ASU community.

Barrett accepts high-performing, academically engaged undergraduate students and works with them in collaboration with all of the other academic units at Arizona State University. All Barrett students complete a thesis or creative project, which is an opportunity to explore an intellectual interest and produce an original piece of scholarly research. The thesis or creative project is supervised by, and defended before, a faculty committee. Students are able to engage with professors who are nationally recognized in their fields and committed to working with honors students. Completing a Barrett thesis or creative project is an opportunity for undergraduate honors students to contribute to the ASU academic community in a meaningful way.


Description
Information Measurement Theory (IMT) is a concept devised to explain how information works in the universe. At its core, it states that 100% of information exists in the universe at any one time, and that with enough perception, any event can be predicted from the initial conditions preceding it. With this idea in mind, the author of IMT developed the Kashiwagi Solution Model (KSM), which deals with how people best utilize the information present in the universe. Simply put, the ideas presented by KSM encourage people to think more logically through the utilization of relevant information. The following thesis details an autobiographical case study focused on the life of a college student undergoing severe depressive symptoms during the course of their academic career. Concepts stemming from IMT and KSM are then used to determine the root causes of the depression in order to prevent it from ever happening again. The case study acts as a guide for others, to help them deal with similar situations in their own lives, while providing evidence that the concepts detailed by IMT and KSM are factually relevant.
ContributorsChauhan, Amit (Author) / Kashiwagi, Dean (Thesis director) / Kashiwagi, Jacob (Committee member) / Barrett, The Honors College (Contributor)
Created2016-05
Description

The recent popularity of ChatGPT has brought into question the future of many lines of work, among them, psychotherapy. This thesis aims to determine whether AI chatbots should be used by undergraduates with depression as a form of mental healthcare. Because of barriers to care such as understaffed campus counseling centers, stigma, and issues of accessibility, AI chatbots could perhaps bridge the gap between this demographic and receiving help. This research includes findings from studies, meta-analyses, reports, and Reddit posts from threads documenting people's experiences using ChatGPT as a therapist. Based on these findings, only mental health AI chatbots specifically can be considered appropriate for psychotherapeutic purposes. Certain chatbots that are designed purposefully to discuss mental health with users can provide support to undergraduates with mild to moderate symptoms of depression. AI chatbots that promise companionship should never be used as a form of mental healthcare. ChatGPT should generally be avoided as a form of mental healthcare, except perhaps to ask for referrals to resources. Chatbots not focused on mental health should be trained to respond with referrals to mental health resources and emergency services when they detect inputs related to mental health, and especially to suicidality. In the future, AI chatbots could be used to notify mental health professionals of reported symptom changes in their patients, and to detect patterns that help individuals with depression understand fluctuations in their symptoms. AI more broadly could also be used to enhance therapist training.

ContributorsSimmons, Emily (Author) / Bronowitz, Jason (Thesis director) / Grumbach, Elizabeth (Committee member) / Barrett, The Honors College (Contributor) / Department of Psychology (Contributor)
Created2023-05