Matching Items (591)
Description

Background.
Type 2 Diabetes Mellitus (T2DM) is a leading cause of health disparities among Hispanic populations, which are disproportionately afflicted by T2DM. A growing body of research argues that diabetes treatment interventions should be culturally sensitive in order to address the needs of their target populations. Nonetheless, there is little consensus regarding the necessary components of a culturally sensitive intervention. This review examines the contents, activities, and strategies that have been implemented in culturally sensitive diabetes treatment interventions. It also observes how interventions handle complex issues such as the heterogeneity of Hispanic populations and communities. The overarching research questions examined in this study were: 1) “What are the core components of the culturally tailored diabetes interventions currently implemented with Hispanic populations in the US, and why are they needed?” and 2) “How are studies evaluating the impact of their interventions, and how can the proposed study designs be improved?”
Method.
A systematic review across 3 databases was used to identify culturally sensitive diabetes treatment interventions (CSDTIs) developed for Hispanic populations. Accordingly, we searched for studies designed to treat Hispanic individuals already diagnosed with T2DM. All identified studies provided information on the core components of these culturally sensitive interventions, while only studies that included a control or comparison group were used to assess how the studies evaluated outcomes.
Results.
First, we examined intervention effects across two study designs, covering a total of 17 interventions. Our review of the first design (Design #1 Studies) includes 12 studies that developed a culturally sensitive intervention and either evaluated it using a one-group pretest-posttest design or did not evaluate it at all. The second design (Design #2 Studies) includes 5 studies, each a two-group randomized controlled field study that conducted pre-post analyses comparing the culturally adapted intervention against a control or comparison group. The heterogeneity of these studies made a conventional meta-analysis impossible.
Second, in a review section focused on describing culturally sensitive core components, we examined all 17 studies to characterize the types of culturally sensitive components incorporated into the diabetes treatment interventions. This analysis yielded a list of 11 general types of culturally sensitive components included within these 17 interventions. For the articles that used control or comparison groups, we examined how the interventions evaluated different outcome measures and how they reached their conclusions regarding success.
Discussion.
The culturally sensitive aspects identified in these articles were used to address diverse issues, including: (a) communication barriers, (b) the inclusion of culturally relevant content that connects to Hispanic/Latinx patients’ lives, (c) the selection of appropriate channels and settings for interventions, and (d) specific cultural values, traditions, and beliefs that can either help or hinder healthy behaviors. It should be noted that Hispanic populations are extremely heterogeneous, so an intervention that is culturally sensitive for some sectors of a Hispanic community may not be sensitive for other sectors of that same community. The issue of heterogeneity of Hispanic communities was addressed well by the authors of some articles and ignored by others.
Conclusions.
Ultimately, it was impossible to draw quantitative conclusions regarding the efficacy or effectiveness of these two types of culturally sensitive diabetes treatment interventions (CSDTIs) as delivered to their targeted samples of Hispanic participants. An emerging conclusion is that factors such as ethics, cost, and lack of community acceptance may contribute to the higher proportion of one-group pretest-posttest designs and the lower proportion of more rigorous scientific designs. Regarding the latter, some communities oppose the use of randomized controlled studies within their community, and that objection may explain the low number of such studies. Viable and rigorous alternatives to RCTs have been proposed to address this community concern. In this review, the author sought to compare culturally adapted interventions with their unaltered or minimally altered evidence-based counterparts, although various difficulties are associated with conducting these analyses.
Core components of CSDTIs for Hispanic adults were identified, and their purposes were explained. Additionally, suggestions were made for strengthening study designs, to help advance our knowledge of CSDTIs in future research.
Created: 2019-05
Description

This study attempts to reconcile the gap in the literature between the abundant research on the social consequences of sanctions and the consistent lack of information regarding their economic effectiveness. I apply a modified neoclassical growth model to analyze the extent to which sanctions imposed by the US and UN impact the real per capita GDP growth rate. Using the original data, I modify the model employed in the Neuenkirch and Neumeier (2015) study by replacing a fixed effects model with time trends. The results are more aligned with previous economic research on sanctions: sanctions imposed by the US have a moderate but statistically significant effect, corresponding to about a 1.5 percent decline in the GDP growth rate. Sanctions imposed by the UN are similarly negative, corresponding to about a 0.9 percent decline in GDP growth, but the effect is not statistically significant. While I cannot reject the conclusion of the original authors, I believe this model provides a more fitting analysis of the impact sanctions impose on GDP growth.
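As a rough illustration of the modeling choice described above (swapping country fixed effects for time trends), the following Python sketch contrasts the two specifications on a hypothetical panel. The file name, column names, and the use of country-specific linear trends are assumptions for illustration, not the thesis's actual code or data.

```python
# Hedged sketch: fixed-effects vs. time-trend specifications on a panel of
# country-year observations. All names below are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("sanctions_panel.csv")  # columns assumed: country, year, growth, us_sanction, un_sanction

# Fixed-effects style specification (country dummies).
fe = smf.ols("growth ~ us_sanction + un_sanction + C(country)", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["country"]}
)

# Time-trend specification: country-specific linear trends replace the dummies.
tt = smf.ols("growth ~ us_sanction + un_sanction + C(country):year", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["country"]}
)

print(fe.params[["us_sanction", "un_sanction"]])
print(tt.params[["us_sanction", "un_sanction"]])
```

Comparing the sanction coefficients across the two fits mirrors the kind of check the abstract describes, with clustered standard errors at the country level as one reasonable default.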
Contributors: Hendricks-Costello, Caitlyn (Author) / Silverman, Daniel (Thesis director) / Mendez, Jose (Committee member) / Department of Economics (Contributor) / School of Politics and Global Studies (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created: 2019-05
Description

The physics of waves governs much of the world, in multiple forms such as electromagnetic waves. Mathematicians and physicists have developed equations that describe how waves evolve over time while moving through space. Because these are partial differential equations, their solutions must be approximated. This study introduces a new numerical scheme for performing the approximation that is highly stable and computationally efficient. The scheme is formulated for Maxwell’s equations, employing spatial and temporal staggering to achieve fourth-order phase accuracy. It is then compared to the traditional Yee scheme and the Runge-Kutta 3 scheme in one-dimensional applications, revealing accuracy similar to the Runge-Kutta 3 scheme while requiring fewer computations per time step. Simulations are then performed in the two-dimensional case. First, no boundary conditions are implemented, causing reflection at the edges of the spatial domain. Next, the simulation is conducted with absorbing boundary conditions, simulating wave propagation over an infinite spatial domain. These results are compared to those of a large-domain simulation in which the wave propagation does not reach the boundaries. Comparing the simulations, it is concluded that the numerical scheme is stable and highly accurate when absorbing boundary conditions are employed. Finally, the scheme is tested in two dimensions with wave propagation through nonlinear media, as opposed to the prior simulations, which were performed as if in a vacuum. Spectral analysis of the resulting waves from a long-time simulation shows angular frequencies that match those expected from theory. The scheme is therefore concluded to be powerful in one-dimensional, two-dimensional, and nonlinear simulations, all while being computationally efficient.
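For context, here is a minimal Python sketch of the traditional one-dimensional Yee scheme that serves as the baseline comparison above. It uses normalized units (c = 1, vacuum), a Gaussian initial pulse, and simple reflecting boundaries; it is not the fourth-order staggered scheme the thesis introduces.

```python
# Minimal 1D Yee (FDTD) scheme in normalized units, second-order accurate.
import numpy as np

nx, nt = 400, 800
x = np.linspace(0.0, 1.0, nx)
dx = x[1] - x[0]
dt = 0.5 * dx                            # Courant number 0.5 for stability

Ez = np.exp(-((x - 0.5) / 0.05) ** 2)    # E field at integer grid points
Hy = np.zeros(nx - 1)                    # H field at half grid points (staggered)

for _ in range(nt):
    Hy += dt / dx * (Ez[1:] - Ez[:-1])        # update H from the spatial difference of E
    Ez[1:-1] += dt / dx * (Hy[1:] - Hy[:-1])  # update E from the spatial difference of H
    # Ez[0] and Ez[-1] are held fixed: reflecting (PEC-like) boundaries
```

The spatial staggering of Ez and Hy and the half-step time offset are the same ingredients the abstract mentions; the thesis's scheme extends this idea to higher phase accuracy.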
Contributors: Kirvan, Alex Ander (Author) / Moustaoui, Mohamed (Thesis director) / Kostelich, Eric (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created: 2019-05
Description

Professor Alarcon’s lab is producing proton beam detectors, and this project is focused on informing the decision as to which detector layout is more effective at producing an accurate backprojection for an equal number of data channels. The comparison is between “square pad” detectors and “wire pad” detectors. The square pad detector consists of a grid of identically sized square pads, each of which collects its own data. The wire pad detector consists of large rectangular pads that span the entire detector in one direction, with 2 additional layers of identical pads, each rotated by 60° from the previous layer. To test each design, Python was used to simulate Gaussian beams of varying amplitude, position, and size, and to integrate them using each of the two methods. The integrated signals were then backprojected and fit to a Gaussian function, and the error between the backprojected parameters and the original beam parameters was measured.
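The Python sketch below illustrates the square-pad half of that workflow under assumed detector dimensions: simulate a 2D Gaussian beam, sum it over a grid of square pads, and recover the beam centroid from the pad sums. A moment estimate stands in here for the Gaussian fit described above, and the pad count and beam parameters are made-up examples.

```python
# Hedged sketch: Gaussian beam integrated over square pads, then backprojected.
import numpy as np

npads, oversample = 16, 8                 # 16x16 pads, 8x8 fine samples per pad
n = npads * oversample
coords = np.linspace(-1.0, 1.0, n)        # detector assumed to span [-1, 1] x [-1, 1]
X, Y = np.meshgrid(coords, coords)

x0, y0, sigma, amp = 0.13, -0.27, 0.2, 1.0
beam = amp * np.exp(-((X - x0) ** 2 + (Y - y0) ** 2) / (2 * sigma ** 2))

# "Integrate" the beam over each square pad by summing the fine samples it covers.
pads = beam.reshape(npads, oversample, npads, oversample).sum(axis=(1, 3))

# Backproject: estimate the beam centroid from the pad sums (moment method).
pad_centers = np.linspace(-1.0, 1.0, npads + 1)[:-1] + 1.0 / npads
total = pads.sum()
x_est = (pads.sum(axis=0) * pad_centers).sum() / total
y_est = (pads.sum(axis=1) * pad_centers).sum() / total
print(f"true ({x0:.3f}, {y0:.3f})  estimated ({x_est:.3f}, {y_est:.3f})")
```

The wire-pad case would replace the square-grid sum with sums along three sets of strips rotated by 60°, which is the comparison the project set out to make.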
Contributors: Foley, Brendan (Author) / Alarcon, Ricardo (Thesis director) / Galyaev, Eugene (Committee member) / Department of Physics (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created: 2019-05
Description

The objective of this paper is to find and describe trends in fast-Fourier-transformed accelerometer data that can be used to predict the mechanical failure of large vacuum pumps used in industrial settings, such as providing drinking water. Using three-dimensional plots of the data, this paper suggests how a model can be developed to predict the mechanical failure of vacuum pumps.
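A minimal Python sketch of the kind of preprocessing implied above: windowed FFTs of an accelerometer trace whose spectral peaks can then be tracked over time, for example in a three-dimensional time-frequency-amplitude plot. The sampling rate, window length, and synthetic vibration signal are chosen purely for illustration.

```python
# Hedged sketch: windowed FFTs of a (synthetic) accelerometer signal.
import numpy as np

fs = 2000                                   # assumed sample rate, Hz
t = np.arange(0, 60.0, 1.0 / fs)            # one minute of data
signal = np.sin(2 * np.pi * 120 * t) + 0.3 * np.random.randn(t.size)  # stand-in for pump vibration

win = 4096
freqs = np.fft.rfftfreq(win, d=1.0 / fs)
spectra = []
for start in range(0, t.size - win, win):
    seg = signal[start:start + win] * np.hanning(win)   # Hann window to reduce leakage
    spectra.append(np.abs(np.fft.rfft(seg)))
spectra = np.array(spectra)                 # rows: time windows, columns: frequency bins

# A drifting or growing dominant peak across rows is the sort of trend one
# would look for as a precursor to mechanical failure.
print("dominant frequency (Hz):", freqs[spectra.mean(axis=0).argmax()])
```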
Contributors: Halver, Grant (Author) / Taylor, Tom (Thesis director) / Konstantinos, Tsakalis (Committee member) / Fricks, John (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created: 2019-05
Description

Non-Destructive Testing (NDT) is integral to preserving the structural health of materials. Techniques that fall under the NDT category are able to evaluate the integrity and condition of a material without permanently altering any of its properties. Additionally, they can typically be used while the material is in active use, rather than requiring downtime for inspection.
The two general categories of structural health monitoring (SHM) systems are passive and active monitoring. Active SHM systems use an input of energy to monitor the health of a structure (such as sound waves in ultrasonics), while passive systems do not. As such, passive SHM tends to be more desirable. A system could be permanently fixed to a critical location, passively accepting signals until it records a damage event, then localizing and characterizing the damage. This is the goal of acoustic emissions testing.
When certain types of damage occur, such as matrix cracking or delamination in composites, the corresponding release of energy creates sound waves, or acoustic emissions, that propagate through the material. Audio sensors fixed to the surface can pick up data in both the time and frequency domains of the wave. With proper data analysis, a time of arrival (TOA) can be calculated for each sensor, allowing for localization of the damage event. The frequency data can be used to characterize the damage.
In traditional acoustic emissions testing, the TOA combined with the wave velocity and information about signal attenuation in the material is used to localize events. However, for complex geometries or anisotropic materials (such as carbon fibre composites), velocity and attenuation can vary wildly with the direction of interest. In these cases, localization can instead be based on the differences in arrival time for each sensor pair. This technique is called Delta T mapping, and it is the main focus of this study.
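As a simple illustration of arrival-time-difference localization, the Python sketch below performs a grid search under a constant-wave-speed assumption. Delta T mapping replaces the analytic distances used here with an empirically measured map of arrival-time differences, but the matching idea is similar. Sensor positions, wave speed, and the source location are invented for the example.

```python
# Hedged sketch: localize a source from pairwise arrival-time differences.
import numpy as np
from itertools import combinations

sensors = np.array([[0.0, 0.0], [0.5, 0.0], [0.0, 0.5], [0.5, 0.5]])  # metres (assumed)
c = 3000.0                                  # assumed wave speed, m/s
true_src = np.array([0.12, 0.34])

toa = np.linalg.norm(sensors - true_src, axis=1) / c          # simulated arrival times
pairs = list(combinations(range(len(sensors)), 2))
measured_dt = np.array([toa[i] - toa[j] for i, j in pairs])   # pairwise differences

# Grid search: pick the candidate point whose predicted pairwise differences
# best match the measured ones.
xs = ys = np.linspace(0.0, 0.5, 101)
best, best_err = None, np.inf
for x in xs:
    for y in ys:
        d = np.linalg.norm(sensors - np.array([x, y]), axis=1) / c
        err = np.sum((np.array([d[i] - d[j] for i, j in pairs]) - measured_dt) ** 2)
        if err < best_err:
            best, best_err = (x, y), err
print("true", true_src, "estimated", best)
```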
Contributors: Briggs, Nathaniel (Author) / Chattopadhyay, Aditi (Thesis director) / Papandreou-Suppappola, Antonia (Committee member) / Skinner, Travis (Committee member) / Mechanical and Aerospace Engineering Program (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created: 2019-05
Description

This Creative Project was carried out in coordination with the capstone project, Around the Corner Imaging with Terahertz Waves. That capstone project deals with a system designed to implement Around the Corner, or Non-Line-of-Sight (NLoS), Imaging. This document discusses the creation of a GUI using MATLAB to control the Terahertz Imaging system. The GUI was developed in response to a need for synchronization, ease of operation, easy parameter modification, and data management. Along the way, many design decisions were made, ranging from choosing a software platform to determining how variables should be passed. These decisions and considerations are discussed in this document. The resulting GUI meets the design criteria and can be used by anyone wishing to use the Terahertz Imaging System for further research in the field of Around the Corner or NLoS Imaging.
Contributors: Wood, Jacob Cannon (Author) / Trichopoulos, Georgios (Thesis director) / Aberle, James (Committee member) / Electrical Engineering Program (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created: 2019-05
Description

This project looks at the change in strikeout patterns over the past 19 years of Major League Baseball. New research in 2001 revolutionized the field of pitching statistics, and non-coincidentally, the number of strikeouts has ballooned since then. I first detail the statistical nature of the increase, looking at where the additional strikeouts are coming from. A discussion follows of why this has happened, referencing changes in baseball strategy and the optimization of talent usage. The changes in the ways MLB teams use their pitching staffs are largely the cause of this increase. Similar research is cited to confirm that these strategy changes are valid and are having the effect of increasing strikeouts in the game. Strikeout numbers are then compared to other pitching statistics over the years to determine whether the increase has had any effect on other pitching metrics. Lastly, overall team success is examined to verify whether the increased focus on strikeouts has created positive results for major league teams. Teams making the MLB playoffs consistently ranked much higher than non-qualifying teams in terms of strikeout rates. Also included in the project are the details of data acquisition and manipulation, to ensure the figures used are valid. Ideas for future research and further work on the topic are included, as the amount of data available in this field is quite staggering. Further analysis could dive into the ways pitches themselves are changing, rather than looking at pitching outcomes. Overall, the project details and explains a major shift in the way baseball has been played over the last 19 years, complete with both pure data analysis and supplementary commentary and explanation.
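A hypothetical Python/pandas sketch of the kind of aggregation described above: per-season strikeout rates and a playoff versus non-playoff comparison. The file and column names are illustrative, not the project's actual dataset.

```python
# Hedged sketch: season-level strikeout rates and playoff vs. non-playoff split.
import pandas as pd

teams = pd.read_csv("team_pitching.csv")   # assumed columns: season, team, strikeouts, innings_pitched, made_playoffs
teams["k_per_9"] = 9 * teams["strikeouts"] / teams["innings_pitched"]

# League-wide trend in strikeouts per nine innings by season.
print(teams.groupby("season")["k_per_9"].mean())

# Do playoff teams post higher strikeout rates than non-playoff teams?
print(teams.groupby(["season", "made_playoffs"])["k_per_9"].mean().unstack())
```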
Contributors: Casalena, Jontito (Author) / Doig, Stephen (Thesis director) / Pomrenke, Jacob (Committee member) / Department of Information Systems (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created: 2019-05
Description

This honors thesis explores and models the flow of air around a cylindrical arrow that rotates as it moves through the air. The model represents the airflow around an archery arrow after it is released from the bow and rotates while it flies through the air. This situation is important in archery because an understanding of the airflow allows archers to predict the flight of the arrow. As a result, archers can improve their accuracy and ability to hit targets. However, few computational fluid dynamics simulations modeling the airflow around a rotating archery arrow exist. This thesis attempts to further the understanding of the airflow around a rotating archery arrow by creating a mathematical model to numerically simulate the airflow around the arrow in the presence of this rotation. The thesis uses a linearized approximation of the Navier-Stokes equations to model the airflow around the arrow and explains the reasoning for using this simplification of the fully nonlinear Navier-Stokes equations. It then describes the discretization of the linearized equations using the finite difference method and the boundary conditions applied to them. A MATLAB code solves the resulting system of equations to obtain a numerical simulation of the airflow around the rotating arrow. The results of the simulation for each velocity component and the pressure distribution are displayed. The thesis then discusses the results of the simulation, and the MATLAB code is analyzed to verify the convergence of the solution. Appendix A includes the full MATLAB code used for the flow simulation. Finally, the thesis explains potential future research topics, ideas, and improvements to the code that can help further the understanding and produce more realistic simulations of the airflow around a flying archery arrow.
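For reference, one common linearization of the incompressible Navier-Stokes equations is the Oseen-type approximation about a uniform free-stream velocity; the thesis's exact linearized system is not reproduced in the abstract, so the block below is only an illustrative form.

```latex
% Illustrative Oseen-type linearization about a uniform free stream \mathbf{U}
% (the thesis's actual linearized system may differ).
\[
\begin{aligned}
\frac{\partial \mathbf{u}}{\partial t} + (\mathbf{U}\cdot\nabla)\,\mathbf{u}
  &= -\frac{1}{\rho}\,\nabla p + \nu\,\nabla^{2}\mathbf{u},\\
\nabla \cdot \mathbf{u} &= 0,
\end{aligned}
\]
```

Here u is the perturbation velocity, p the pressure, rho the density, and nu the kinematic viscosity; dropping the nonlinear advection term is what makes a finite-difference solve of the resulting linear system tractable.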
Contributors: Cholinski, Christopher John (Author) / Tang, Wenbo (Thesis director) / Herrmann, Marcus (Committee member) / Mechanical and Aerospace Engineering Program (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created: 2019-05
Description

The impact of the 2008 Great Recession was felt on a global level. While many European countries moved to implement large fiscal adjustments in response to the financial crisis, various other economic consequences were felt, such as inflation, public debt growth, and a decrease in purchasing power. One consequence that typically occurs in every recession is a demand shock within the employment sector. As firms are put into tight financial positions, employers are forced to make employment decisions to cut costs for long-term sustainability, such as laying off workers or reducing their working hours.

This paper aims to investigate how weekly working hours are impacted by shocks to the economy across European countries. Using the 2008 recession as the basis, an empirical analysis was conducted with panel data for 32 countries over 33 years, with average weekly working hours across four occupational groups as the variable of interest, and various economic indicators such as GDP growth as independent variables. Additionally, countries were split up and grouped based on geographical location to examine potential country and region-specific trends.
Over time, there is a decreasing trend in weekly working hours across all observed occupations and countries. This decreasing trend continues during the 2008 recession, but the slope of the decrease during the recession is not significant relative to the entire time period. However, when disaggregated into occupational groups, with a distinction between full-time and part-time workers, the trends in working hours are much more noticeable, both during the recession and over the entire time frame of observation.
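A hedged Python sketch of the kind of panel regression suggested above: weekly hours regressed on a linear time trend, a recession indicator, and GDP growth, estimated separately by occupational group. The data file, column names, and recession window are assumptions for illustration only.

```python
# Hedged sketch: per-occupation regressions of weekly hours on a time trend,
# a 2008-2009 recession dummy, and GDP growth, with country fixed effects.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("weekly_hours_panel.csv")   # assumed columns: country, year, occupation, hours, gdp_growth
df["recession"] = df["year"].between(2008, 2009).astype(int)

for occ, grp in df.groupby("occupation"):
    model = smf.ols("hours ~ year + recession + gdp_growth + C(country)", data=grp).fit(
        cov_type="cluster", cov_kwds={"groups": grp["country"]}
    )
    print(occ, model.params[["year", "recession"]].round(3).to_dict())
```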
Contributors: Dong, William (Author) / Veramendi, Gregory (Thesis director) / Bick, Alexander (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / Department of Economics (Contributor) / Barrett, The Honors College (Contributor)
Created: 2019-05