Matching Items (5)
Description
This study examined the relationship between gender, in interaction with interpersonal problem type, and psychotherapy outcome. A sample of 200 individuals who sought psychotherapy at a counselor training facility completed the Outcome Questionnaire-45 (OQ-45) and the reduced version of the Inventory of Interpersonal Problems (IIP-32). The study examined whether gender (male or female) was related to treatment outcome, and whether this relationship was moderated by two interpersonal distress dimensions: dominance and affiliation. A hierarchical regression analysis indicated that gender did not predict psychotherapy treatment outcome, and that neither dominance nor affiliation moderated the relationship between gender and outcome.
Contributors: Hoffmann, Nicole (Author) / Tracey, Terence (Thesis advisor) / Kinnier, Richard (Committee member) / Homer, Judith (Committee member) / Arizona State University (Publisher)
Created: 2013
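
The abstract above describes a two-step hierarchical regression with moderation terms. A minimal sketch of that kind of analysis follows, for illustration only; this is not the author's actual code, and the file name therapy_outcomes.csv and the column names oq45_change, gender, dominance, and affiliation are assumptions made for the example (Python with pandas and statsmodels).

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical dataset: one row per client, with an OQ-45 change score,
    # gender, and IIP-32 dominance and affiliation dimension scores.
    df = pd.read_csv("therapy_outcomes.csv")

    # Step 1: main effects of gender and the interpersonal dimensions.
    step1 = smf.ols("oq45_change ~ gender + dominance + affiliation",
                    data=df).fit()

    # Step 2: add the gender-by-dimension interaction terms; a significant
    # interaction would indicate moderation.
    step2 = smf.ols("oq45_change ~ gender + dominance + affiliation"
                    " + gender:dominance + gender:affiliation",
                    data=df).fit()

    # F-test on the R-squared change between the nested steps.
    f_value, p_value, df_diff = step2.compare_f_test(step1)
    print(f"R2 step 1: {step1.rsquared:.3f}, R2 step 2: {step2.rsquared:.3f}")
    print(f"F change ({df_diff:.0f} df): F = {f_value:.2f}, p = {p_value:.3f}")

A nonsignificant F-test for the R-squared change between the steps would match the reported result: the interaction terms add no predictive value, so neither dimension moderates the gender-outcome relationship.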
Description
Christian psychotherapy appears to be especially useful for Christian clients seeking therapy, and it is a growing preference among this population; thus, research on the efficacy and effectiveness of Christian interventions is needed. This study reviews 13 effectiveness studies and 21 efficacy studies of Christian psychotherapeutic interventions across various areas of psychotherapy. The majority of both types of studies reported positive outcomes for Christian psychotherapy, and overall, Christian psychotherapy shows promise as an effective alternative to secular therapy. The need for further research in most areas is discussed.
Contributors: Rodriguez, Gina Alexandra (Author) / Valiente, Carlos (Thesis director) / Seeley, Bridget (Committee member) / Skinner, Tad (Committee member) / Barrett, The Honors College (Contributor) / School of Music (Contributor) / Department of Psychology (Contributor)
Created: 2014-05
Description

The recent popularity of ChatGPT has called into question the future of many lines of work, among them psychotherapy. This thesis aims to determine whether AI chatbots should be used by undergraduates with depression as a form of mental healthcare. Because of barriers to care such as understaffed campus counseling centers, stigma, and issues of accessibility, AI chatbots could help bridge the gap between this demographic and receiving help. This research draws on studies, meta-analyses, reports, and Reddit posts from threads documenting people's experiences using ChatGPT as a therapist. Based on these findings, only AI chatbots built specifically for mental health can be considered appropriate for psychotherapeutic purposes. Chatbots designed purposefully to discuss mental health with users can provide support to undergraduates with mild to moderate symptoms of depression. AI chatbots that promise companionship should never be used as a form of mental healthcare. ChatGPT should generally be avoided as a form of mental healthcare, except perhaps to ask for referrals to resources. Chatbots not focused on mental health should be trained to respond with referrals to mental health resources and emergency services when they detect inputs related to mental health, and especially to suicidality. In the future, AI chatbots could be used to notify mental health professionals of reported symptom changes in their patients, and as pattern detectors to help individuals with depression understand fluctuations in their symptoms. AI more broadly could also be used to enhance therapist training.

Contributors: Simmons, Emily (Author) / Bronowitz, Jason (Thesis director) / Grumbach, Elizabeth (Committee member) / Barrett, The Honors College (Contributor) / Department of Psychology (Contributor)
Created: 2023-05