Matching Items (2)

Description
The Internet has made it possible to exchange information at a rapid rate. With this extraordinary ability, media companies and various other organizations have been able to communicate thoughts and information to an extremely large audience. As a result, news subscribers are overwhelmed with biased information, which makes it very easy to be misinformed. Unfortunately, there is currently no way to stay truly informed without spending countless hours searching the Internet for different viewpoints and ultimately using that information to formulate a sound understanding. This project (nicknamed "Newsie") solves this problem by providing news subscribers with many news sources for every topic, thereby saving them time and ultimately paving the way to a more informed society. Since one of the main goals of this project is to provide information to the largest number of people, Newsie is designed with availability in mind. Unsurprisingly, the most accessible method of communication is the Internet, and more specifically, a website. Users will be able to access Newsie via a webpage and easily view the most recent headlines with their corresponding articles from several sources. Another goal of the project is to classify different articles and sources based on their bias. After reading articles, users will be able to vote on their biases. This provides a crowdsourced method of determining bias.
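The abstract does not say how Newsie turns individual votes into a bias rating. A minimal sketch of one plausible aggregation scheme, assuming each vote is a score on a left-to-right scale and an article's rating is the mean of its votes (the names BiasTally and record_vote are illustrative, not part of Newsie):

```python
# Illustrative sketch only: Newsie's actual vote-aggregation logic is not
# described in the abstract. This assumes each vote is a number from -1.0
# (left-leaning) to +1.0 (right-leaning) and that an article's bias rating
# is the average of the votes it has received so far.
from collections import defaultdict
from statistics import mean


class BiasTally:
    """Hypothetical crowdsourced bias tally for articles."""

    def __init__(self) -> None:
        # Maps an article identifier to the list of user-submitted scores.
        self._votes: dict[str, list[float]] = defaultdict(list)

    def record_vote(self, article_id: str, score: float) -> None:
        """Store one user's bias score for an article (-1.0 to +1.0)."""
        if not -1.0 <= score <= 1.0:
            raise ValueError("score must be between -1.0 and +1.0")
        self._votes[article_id].append(score)

    def bias_rating(self, article_id: str) -> float | None:
        """Return the mean bias score, or None if no one has voted yet."""
        votes = self._votes[article_id]
        return mean(votes) if votes else None


# Example: three readers rate the same article.
tally = BiasTally()
tally.record_vote("article-42", -0.5)
tally.record_vote("article-42", 0.0)
tally.record_vote("article-42", -0.25)
print(tally.bias_rating("article-42"))  # -0.25
```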
Contributors: Alimov, Robert Joseph (Author) / Meuth, Ryan (Thesis director) / Franceschini, Enos (Committee member) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-12
Description

This chapter integrates, from cognitive neuroscience, cognitive psychology, and social psychology, the basic science of bias in human judgment as relevant to judgments and decisions by forensic mental health professionals. Forensic mental health professionals help courts make decisions in cases when some question of psychology pertains to the legal issue, such as in insanity cases, child custody hearings, and psychological injuries in civil suits. The legal system itself and many people involved, such as jurors, assume mental health experts are “objective” and untainted by bias. However, basic psychological science from several branches of the discipline suggests the law’s assumption about experts’ protection from bias is wrong. Indeed, several empirical studies now show clear evidence of (unintentional) bias in forensic mental health experts’ judgments and decisions. In this chapter, we explain the science of how and why human judgments are susceptible to various kinds of bias. We describe dual-process theories from cognitive neuroscience, cognitive psychology, and social psychology that can help explain these biases. We review the empirical evidence to date specifically about cognitive and social psychological biases in forensic mental health judgments, weaving in related literature about biases in other types of expert judgment, with hypotheses about how forensic experts are likely affected by these biases. We close with a discussion of directions for future research and practice.

Contributors: Neal, Tess M.S. (Author) / Hight, Morgan (Author) / Howatt, Brian C. (Author) / Hamza, Cassandra (Author)
Created: 2017-04-30