Description
This paper explores how marginalist economics defines and inevitably constrains Victorian sensation fiction's content and composition. I argue that economic intuition implies that sensationalist heroes and antagonists, writers and readers all pursued a fundamental, "rational" aim: the attainment of pleasure. So although "sensationalism" took on connotations of moral impropriety in the Victorian age, sensation fiction primarily involves experiences of pain on the page that excite the reader's pleasure. As such, sensationalism as a whole can be seen as a conformist product, one which mirrors the effects of all commodities on the market, rather than as a rebellious one. Indeed, contrary to modern and contemporary critics' assumptions, sensation fiction may not be as scandalous as it seems.
ContributorsFischer, Brett Andrew (Author) / Bivona, Daniel (Thesis director) / Looser, Devoney (Committee member) / Barrett, The Honors College (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Economics Program in CLAS (Contributor) / School of Politics and Global Studies (Contributor) / Department of English (Contributor)
Created2014-12
Description
Through collection of survey data on the characteristics of college debaters, disparities in participation and success for women and racial and ethnic minorities are measured. This study then uses econometric tools to assess whether there is an in-group judging bias in college debate that systematically disadvantages female and minority participants. Debate is used as a testing ground for competing economic theories of taste-based and statistical discrimination, applied to a higher education context. The study finds persistent disparities in participation and success for female participants. Judges are more likely to vote for debaters who share their gender. There is also a significant disparity in the participation of racial and ethnic minority debaters and judges, as well as female judges.
ContributorsVered, Michelle Nicole (Author) / Silverman, Daniel (Thesis director) / Symonds, Adam (Committee member) / Dillon, Eleanor (Committee member) / Barrett, The Honors College (Contributor) / Economics Program in CLAS (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / School of Politics and Global Studies (Contributor)
Created2014-12
Description
Covering subsequences with sets of permutations arises in many applications, including event-sequence testing. Given a set of subsequences to cover, one is often interested in knowing the minimum number of permutations required to cover each subsequence, and in finding an explicit construction of such a set of permutations that has size close to or equal to the minimum possible. The construction of such permutation coverings has proven to be computationally difficult. While many examples for permutations of small length have been found, and strong asymptotic behavior is known, there are few explicit constructions for permutations of intermediate lengths. Most of these are generated from scratch using greedy algorithms. We explore a different approach here. Starting with a set of permutations with the desired coverage properties, we compute local changes to individual permutations that retain the total coverage of the set. By choosing these local changes so as to make one permutation less "essential" in maintaining the coverage of the set, our method attempts to make a permutation completely non-essential, so that it can be removed without sacrificing total coverage. We develop a post-optimization method to do this and present results on sequence covering arrays and other types of permutation covering problems, demonstrating that it is surprisingly effective.
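The coverage bookkeeping behind the "non-essential permutation" idea can be sketched in a few lines. This is a minimal illustration rather than the thesis's post-optimization algorithm: it only tests whether a permutation's t-subsequences are already covered by the rest of the set and removes such permutations; all function names here are invented for the example.

```python
from itertools import combinations, permutations

def covered(perm, t):
    """All length-t subsequences of a permutation, in relative order."""
    return set(combinations(perm, t))

def total_coverage(perms, t):
    """Union of the t-subsequences covered by a set of permutations."""
    cov = set()
    for p in perms:
        cov |= covered(p, t)
    return cov

def remove_nonessential(perms, t):
    """Drop any permutation whose t-subsequences are already covered
    by the remaining permutations, preserving total coverage."""
    perms = list(perms)
    target = total_coverage(perms, t)
    i = 0
    while i < len(perms):
        rest = perms[:i] + perms[i + 1:]
        if total_coverage(rest, t) == target:
            perms = rest  # non-essential: safe to remove
        else:
            i += 1
    return perms

# A duplicate permutation is trivially non-essential and gets removed.
perms = list(permutations(range(3))) + [(0, 1, 2)]
reduced = remove_nonessential(perms, 3)
```

The thesis's method goes further: it applies coverage-preserving local changes to shift a permutation's coverage burden onto others until this removal test succeeds.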
ContributorsMurray, Patrick Charles (Author) / Colbourn, Charles (Thesis director) / Czygrinow, Andrzej (Committee member) / Barrett, The Honors College (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Department of Physics (Contributor)
Created2014-12
Description
Bots tamper with social media networks by artificially inflating the popularity of certain topics. In this paper, we define what a bot is, we detail different motivations for bots, we describe previous work in bot detection and observation, and then we perform bot detection of our own. For our bot detection, we are interested in bots on Twitter that tweet Arabic extremist-like phrases. A testing dataset is collected using the honeypot method, and five different heuristics are measured for their effectiveness in detecting bots. The model underperformed, but we have laid the groundwork for a largely untapped focus of bot detection: the diffusion of extremist ideals through bots.
ContributorsKarlsrud, Mark C. (Author) / Liu, Huan (Thesis director) / Morstatter, Fred (Committee member) / Barrett, The Honors College (Contributor) / Computing and Informatics Program (Contributor) / Computer Science and Engineering Program (Contributor) / School of Mathematical and Statistical Sciences (Contributor)
Created2015-05
Description
The OMFIT (One Modeling Framework for Integrated Tasks) modeling environment and the BRAINFUSE module have been deployed on the PPPL (Princeton Plasma Physics Laboratory) computing cluster with modifications that make it possible to apply artificial neural networks (NNs) to the TRANSP databases for the JET (Joint European Torus), TFTR (Tokamak Fusion Test Reactor), and NSTX (National Spherical Torus Experiment) devices. This development has facilitated the investigation of NNs for predicting heat transport profiles in JET, TFTR, and NSTX, and has prompted additional investigations into how else NNs may be of use to scientists at PPPL. In applying NNs to these devices for heat transport prediction, the primary goal is to reproduce the success reported by Meneghini et al. in using NNs for heat transport prediction in DIII-D. Reproducing those results is important because doing so would provide scientists at PPPL with a quick and efficient toolset for reliably predicting heat transport profiles much faster than any existing computational methods allow; progress towards this goal is outlined in this report, and potential additional applications of the NN framework are presented.
ContributorsLuna, Christopher Joseph (Author) / Tang, Wenbo (Thesis director) / Treacy, Michael (Committee member) / Orso, Meneghini (Committee member) / Barrett, The Honors College (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Department of Physics (Contributor)
Created2015-05
Description
Twitter, the microblogging platform, has grown in prominence to the point that the topics that trend on the network are often the subject of news coverage and other traditional media. By predicting trends on Twitter, it could be possible to anticipate the next major topic of interest to the public. With this motivation, this paper develops a model for trends that leverages previous work with k-nearest-neighbors and dynamic time warping. The development of this model provides insight into the length and features of trends, and the model successfully generalizes to identify 74.3% of trends in the time period of interest. The model developed in this work helps explain why particular words trend on Twitter.
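The k-nearest-neighbors-with-DTW combination the abstract references can be sketched as follows. This is an illustrative reimplementation of the generic technique, not the author's model: `dtw` computes the classic dynamic-programming warping distance, and `knn_label` takes a majority vote over the k nearest labeled series; the hashtag-volume data is invented.

```python
def dtw(a, b):
    """Dynamic time warping distance between two numeric sequences."""
    inf = float("inf")
    n, m = len(a), len(b)
    D = [[inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of: insertion, deletion, or match
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

def knn_label(query, labeled, k=3):
    """Classify a series by majority vote of its k DTW-nearest neighbors."""
    dists = sorted((dtw(query, series), label) for series, label in labeled)
    votes = [label for _, label in dists[:k]]
    return max(set(votes), key=votes.count)

# Hypothetical hashtag-volume series labeled as trending or flat.
history = [([0, 0, 1, 5, 9], "trend"), ([0, 1, 4, 8, 10], "trend"),
           ([1, 1, 1, 1, 1], "flat"), ([2, 2, 1, 2, 2], "flat")]
```

DTW's advantage over pointwise distance here is that two trends with the same shape but slightly different onset times still compare as similar.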
ContributorsMarshall, Grant A (Author) / Liu, Huan (Thesis director) / Morstatter, Fred (Committee member) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor) / School of Mathematical and Statistical Sciences (Contributor)
Created2015-05
Description
A model has been developed to modify Euler-Bernoulli beam theory for wooden beams, using visible properties of wood knot-defects. Treating knots in a beam as a system of two ellipses that change the local bending stiffness has been shown to improve the fit of a theoretical beam displacement function to edge-line deflection data extracted from digital imagery of experimentally loaded beams. In addition, an Ellipse Logistic Model (ELM) has been proposed, using L1-regularized logistic regression, to predict the impact of a knot on the displacement of a beam. By classifying a knot as severely positive or negative, versus mildly positive or negative, ELM can flag knots that lead to large changes in beam deflection, while not over-emphasizing knots that may not be a problem. Using ELM with a regression-fit Young's Modulus on three-point bending of Douglas Fir, it is possible to estimate the effects a knot will have on the shape of the resulting displacement curve.
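The named classifier, L1-regularized logistic regression, can be fit from scratch with proximal gradient descent. This is a generic sketch of that technique, not the thesis's fitted ELM; the toy data, function names, and hyperparameters are invented for illustration.

```python
import math

def soft_threshold(w, t):
    """Proximal operator of the L1 penalty (soft-thresholding)."""
    return math.copysign(max(abs(w) - t, 0.0), w)

def fit_l1_logistic(X, y, lam=0.01, lr=0.5, iters=3000):
    """L1-regularized logistic regression via proximal gradient descent.
    X: rows of features, y: labels in {0, 1}."""
    n, d = len(X), len(X[0])
    w, b = [0.0] * d, 0.0
    for _ in range(iters):
        gw, gb = [0.0] * d, 0.0
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            z = max(min(z, 35.0), -35.0)  # guard against exp overflow
            p = 1.0 / (1.0 + math.exp(-z))
            for j in range(d):
                gw[j] += (p - yi) * xi[j] / n
            gb += (p - yi) / n
        # gradient step on the smooth loss, then the L1 proximal step
        w = [soft_threshold(wj - lr * gj, lr * lam) for wj, gj in zip(w, gw)]
        b -= lr * gb  # intercept left unregularized
    return w, b

def predict(w, b, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b > 0 else 0

# Toy, linearly separable data: the label follows the first feature.
X = [[0.0, 0.3], [0.2, 0.8], [1.0, 0.1], [0.9, 0.7]]
y = [0, 0, 1, 1]
w, b = fit_l1_logistic(X, y)
```

The L1 penalty drives weights of uninformative features toward exactly zero, which matches the model-selection role it plays in ELM.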
Created2015-05
Description
According to the Tax Policy Center, a joint project of the Brookings Institution and Urban Institute, the Earned Income Tax Credit (EITC) will provide 26 million households with 60 billion dollars of reduced taxes and refunds in 2015: resources that serve to lift millions of families above the federal poverty line. Responding to the popularity of EITC programs and recent discussion of its expansion for childless adults, I select three comparative case studies of state-level EITC reform from 2005 to 2013. Each state represents a different kind of policy reform: the creation of a supplemental credit in Connecticut, credit reduction in New Jersey, and finally credit expansion for childless adults in Maryland. For each case study, I use Current Population Survey panel data from the March Supplement to complete a differences-in-differences (DD) analysis of EITC policy changes. Specifically, I analyze effects of policy reform on total earned income, employment and usual hours worked. For comparison groups, I construct unique counterfactual populations of northeastern U.S. states, using people of color with less than a college degree as my treatment group for their increased sensitivity to EITC policy reform. I find no statistically significant effects of policy creation in Connecticut, significant decreases in employment and hours worked in New Jersey, and finally, significant increases in earnings and hours worked in Maryland. My work supports the findings of other empirical work, suggesting that awareness of new supplemental EITC programs is critical to their effectiveness while demonstrating that these types of programs can affect the labor supply and outcomes of eligible groups.
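The core DD estimator behind such an analysis is simple to state in code. The sketch below uses made-up weekly-hours numbers; the study itself estimates DD effects in a regression framework on CPS microdata with the comparison groups described above.

```python
def did_estimate(pre_treat, post_treat, pre_ctrl, post_ctrl):
    """Difference-in-differences: the change for the treated group
    minus the change for the comparison group."""
    mean = lambda xs: sum(xs) / len(xs)
    return ((mean(post_treat) - mean(pre_treat))
            - (mean(post_ctrl) - mean(pre_ctrl)))

# Invented weekly-hours data: treated hours rise by 5, control by 2,
# so the DD estimate attributes +3 hours to the policy change.
effect = did_estimate([30, 32], [35, 37], [28, 30], [30, 32])
```

Subtracting the control group's change nets out trends (seasonality, the business cycle) shared by both groups, which is why the comparison-group construction matters so much in the study.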
ContributorsRichard, Katherine Rose (Author) / Dillon, Eleanor Wiske (Thesis director) / Silverman, Daniel (Committee member) / Herbst, Chris (Committee member) / Barrett, The Honors College (Contributor) / School of International Letters and Cultures (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Economics Program in CLAS (Contributor)
Created2015-05
Description
Many fear that the growth of automation and artificial intelligence will lead to massive unemployment, since human labor would no longer be needed. Although automation does displace workers from their current jobs, the total net effect this period of advancement will have on jobs is unclear. One possible solution to help displaced workers is a Universal Basic Income (UBI): a set payment paid to all members of society regardless of working status. Compared to current unemployment programs, a UBI does not restrict participants in how to spend the money and is more inclusive. This paper examines the effects of a UBI on a person's motivation to work through a study of current college students. There is reason to believe that a UBI will lead to fewer people working, as people may become dependent on a base payment to meet their basic needs and stop looking for work. In addition, some people may drop out of their current jobs and rely on a UBI as their main form of income. The current literature does not offer a consensus opinion on this relationship, and more studies are being completed with the threat of mass unemployment looming. This study measures the effects of a UBI on participants' willingness to work and then applies these results to the current economic model. With these results and the updated economic model, a decision about future policies surrounding a UBI can be made.
ContributorsAgarwal, Raghav (Author) / Pulido Hernadez, Carlos (Thesis director) / Foster, William (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / Department of Economics (Contributor) / Barrett, The Honors College (Contributor)
Created2018-05
Description
In many systems, it is difficult or impossible to measure the phase of a signal. Direct recovery from magnitude is an ill-posed problem. Nevertheless, with a sufficiently large set of magnitude measurements, it is often possible to reconstruct the original signal using algorithms that implicitly impose regularization conditions on this ill-posed problem. Two such algorithms were examined: alternating projections, utilizing iterative Fourier transforms with manipulations performed in each domain on every iteration, and phase lifting, converting the problem to that of trace minimization, allowing for the use of convex optimization algorithms to perform the signal recovery. These recovery algorithms were compared on a basis of robustness as a function of signal-to-noise ratio. A second problem examined was that of unimodular polyphase radar waveform design. Under a finite signal energy constraint, the maximal energy return of a scene operator is obtained by transmitting the eigenvector of the scene Gramian associated with the largest eigenvalue. It is shown that if instead the problem is considered under a power constraint, a unimodular signal can be constructed starting from such an eigenvector that will have a greater return.
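The alternating-projections idea can be illustrated on a toy one-dimensional problem: iterate between enforcing the measured Fourier magnitudes and enforcing a real, non-negative signal (the error-reduction variant of the method). This is an illustrative sketch with an invented test signal, not the thesis's implementation; because phase retrieval recovers a signal only up to trivial ambiguities, the check at the end is that the magnitude mismatch does not grow, which error reduction guarantees.

```python
import cmath
import random

def dft(x):
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * cmath.pi * j * k / n) for k in range(n))
            for j in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[j] * cmath.exp(2j * cmath.pi * j * k / n) for j in range(n)) / n
            for k in range(n)]

def error_reduction(mags, iters=100, seed=0):
    """Alternate between the measured-magnitude constraint in the Fourier
    domain and real non-negativity in the signal domain."""
    rng = random.Random(seed)
    x = [rng.random() for _ in range(len(mags))]
    errs = []
    for _ in range(iters):
        X = dft(x)
        errs.append(sum((abs(Xj) - m) ** 2 for Xj, m in zip(X, mags)))
        # keep the current phases, impose the measured magnitudes
        X = [m * (Xj / abs(Xj)) if abs(Xj) > 1e-12 else complex(m)
             for m, Xj in zip(mags, X)]
        # project back onto real, non-negative signals
        x = [max(v.real, 0.0) for v in idft(X)]
    return x, errs

true = [4.0, 1.0, 0.0, 0.0, 2.0, 0.0]
mags = [abs(z) for z in dft(true)]
recovered, errs = error_reduction(mags)
```

The non-negativity projection here plays the role of the implicit regularization the abstract mentions: without some signal-domain constraint, magnitude measurements alone cannot pin down a unique signal.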
ContributorsJones, Scott Robert (Author) / Cochran, Douglas (Thesis director) / Diaz, Rodolfo (Committee member) / Barrett, The Honors College (Contributor) / Electrical Engineering Program (Contributor) / School of Mathematical and Statistical Sciences (Contributor)
Created2014-05