Tess Neal is an Assistant Professor of Psychology in the ASU New College of Interdisciplinary Arts and Sciences and is a founding faculty member of the [Program on Law and Behavioral Science](http://lawpsych.asu.edu/). Dr. Neal has published one edited book and more than three dozen peer-reviewed publications in such journals as PLOS ONE; American Psychologist; Psychology, Public Policy, and Law; and Criminal Justice and Behavior. Neal is the recipient of the 2016 Saleem Shah Award for Early Career Excellence in Psychology and Law, co-awarded by the American Psychology-Law Society and the American Academy of Forensic Psychology. She was named a 2016 "Rising Star" by the Association for Psychological Science, a designation that recognizes outstanding psychological scientists in the earliest stages of their research career post-PhD "whose innovative work has already advanced the field and signals great potential for their continued contributions." She directs the ASU [Clinical and Legal Judgment Lab](http://psych-law.lab.asu.edu).

Description

The Sixth Amendment guarantees defendants the right to trial by an impartial jury. Attorneys are expected to obtain information about potential juror biases and then deselect biased jurors. Social networking sites may offer useful information about potential jurors. Although some attorneys and trial consultants have begun searching online sources for information about jurors, the privacy rights of potential jurors’ online content have yet to be defined by case law. Two studies explored the issue of possible intrusion into juror privacy. First, an active jury venire was searched for online content. Information was found for 36% of the jurors; however, 94% of that information was found through simple Google searches. Only 6% of the information we found was unique to sites other than Google. We concluded that searching for potential jurors online is feasible, but that systematically searching sites other than Google is generally not an effective search strategy. In our second study we surveyed attorneys, trial consultants, law students, and undergraduate students about ethical and privacy issues in the use of public domain information for jury selection. Participants evidenced concern about the rights of jurors, the rights of the defendant and accuser, and the role of tradition in court processes.

Contributors: Neal, Tess M.S. (Author) / Cramer, Robert J. (Author) / Ziemke, Mitchell H. (Author) / Brodsky, Stanley L. (Author)
Created: 2013
Description

This project began as an attempt to develop systematic, measurable indicators of bias in written forensic mental health evaluations focused on the issue of insanity. Although the forensic clinicians observed in this study did vary systematically in their report-writing behaviors on several of the indicators of interest, the data are most useful in demonstrating how and why bias is hard to ferret out. Naturalistic data were used in this project (i.e., 122 real forensic insanity reports), which in some ways is a strength. However, given the nature of bias and the problem of inferring whether a particular judgment is biased, naturalistic data also made arriving at conclusions about bias difficult. This paper describes the nature of bias – including why it is a special problem in insanity evaluations – and why it is hard to study and document. It details the efforts made to find systematic indicators of potential bias, how this effort was partly successful, and how and why it failed. The lessons these efforts yield for future research are described. We close with a discussion of the limitations of this study and future directions for work in this area.

Contributors: Neal, Tess M.S. (Author)
Created: 2018-04-19
Description

Prisoners sentenced to death must be competent for execution before they can actually be executed (Ford v. Wainwright, 1986). For many mental health professionals, the decision whether to conduct competence-for-execution evaluations may be fraught with complex ethical issues. Mental health professionals who do not personally support capital punishment may find this decision particularly difficult, but should seriously consider the consequences of their choice. This article applies Bush, Connell, and Denney’s (2006) eight-step ethical decision-making model to the question of whether to conduct or abstain from conducting competence-for-execution evaluations. This article does not propose what decision an individual evaluator should make regarding this work, but rather presents a systematic guide for mental health professionals (particularly those who do not support capital punishment) to consider.

Contributors: Neal, Tess M.S. (Author)
Created: 2010