Matching Items (9)
Description

There has been a tremendous amount of innovation in policing over the last 40 years, from community and problem-oriented policing to hot spots and intelligence-led policing. Many of these innovations have been subjected to empirical testing, with mixed results on effectiveness. The latest innovation in policing is the Bureau of Justice Assistance's Smart Policing Initiative (SPI). Created in 2009, the SPI provides funding to law enforcement agencies to develop and test evidence-based practices to address crime and disorder. Researchers have not yet tested the impact of the SPI on the funded agencies, particularly with regard to core principles of the Initiative. The most notable of these is the collaboration between law enforcement agencies and their research partners. The current study surveyed SPI agencies and their research partners on key aspects of their Initiative, and uses mean score comparisons and qualitative responses to evaluate this partnership and determine the extent of its value and effect. It also seeks to identify the areas of police agencies' crime analysis and research units that are most in need of enhancement. Findings indicate that the research partners are actively involved in a range of problem-solving activities under the Smart Policing Initiative, that they have positively influenced police agencies' research and crime analysis functions, and that, to a lesser extent, they have positively impacted police agencies' tactical operations. Additionally, personnel, technology, and training were found to be the main areas of the crime analysis and research units that still need to be enhanced. The thesis concludes with a discussion of the implications of these findings for police policy and practice.
Contributors: Martin-Roethele, Chelsie (Author) / White, Michael D. (Thesis advisor) / Ready, Justin (Committee member) / D'Anna, Matthew (Committee member) / Arizona State University (Publisher)
Created: 2013
Description

The computation of the fundamental mode in structural moment frames provides valuable insight into the physical response of the frame to dynamic or time-varying loads. In standard practice, it is not necessary to solve for all n mode shapes of a structural system; it is therefore practical to limit the analysis to some number r of significant mode shapes. Current building codes, such as those of the American Society of Civil Engineers (ASCE), require certain classes of structures to achieve 90% effective mass participation as a way to estimate the accuracy of the computed base shear. A parametric study was performed on data collected from the analysis of a large number of framed structures. The purpose of this study was to develop rules for the number r of significant modes required to meet the ASCE code requirements. The study was based on the implementation of an algorithm and a computer program developed in the past. The algorithm is based on Householder transformations, QR factorization, and inverse iteration, and it extracts a requested number s (s << n) of predominant mode shapes and periods. Only the first r (r < s) of these modes are accurate. To verify the accuracy of the algorithm, a variety of building frames were analyzed using commercially available structural software (RISA-3D) as a benchmark. The salient features of the algorithm are presented briefly in this study.
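To illustrate the 90% effective mass participation criterion referenced above, the following is a minimal dense-matrix sketch in Python (not the thesis's Householder/QR/inverse-iteration algorithm, which avoids a full eigensolution); K and M are assumed to be the stiffness and mass matrices of a lumped-mass frame, and the influence vector assumes uniform base excitation.

```python
import numpy as np
from scipy.linalg import eigh

def modes_for_target_participation(K, M, target=0.90):
    """Count how many modes r are needed to reach the target cumulative
    effective mass participation (e.g. 90%), using a full eigensolution."""
    w2, phi = eigh(K, M)                 # K*phi = w^2 * M*phi, lowest modes first
    r_vec = np.ones(K.shape[0])          # influence vector for uniform base motion
    total_mass = r_vec @ M @ r_vec
    cumulative = 0.0
    for i in range(phi.shape[1]):
        L = phi[:, i] @ M @ r_vec          # modal excitation factor
        m_gen = phi[:, i] @ M @ phi[:, i]  # generalized (modal) mass
        cumulative += L**2 / m_gen         # effective modal mass of mode i
        if cumulative / total_mass >= target:
            return i + 1, np.sqrt(w2[:i + 1])  # r and the circular frequencies used
    return phi.shape[1], np.sqrt(w2)
```

For a shear-building model with a diagonal mass matrix, total_mass is simply the sum of the story masses, and the returned r is the mode count the ASCE-style check asks for.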
Contributors: Grantham, Jonathan (Author) / Fafitis, Apostolos (Thesis advisor) / Attard, Thomas (Committee member) / Houston, Sandra (Committee member) / Hjelmstad, Keith (Committee member) / Arizona State University (Publisher)
Created: 2014
Description

The objective of this thesis is to investigate the various types of energy end-uses to be expected in future high-efficiency single-family residences. For this purpose, this study has analyzed monitored data from 14 houses in the 2013 Solar Decathlon competition and segregated the energy consumption patterns into various residential end-uses (such as lights, refrigerators, washing machines, and so on). The analysis was not straightforward, since these homes were operated according to schedules previously determined by the contest rules. The analysis approach allowed the isolation of the comfort energy use by the heating, ventilation, and air conditioning (HVAC) systems. HVAC systems are the biggest contributors to energy consumption during the operation of a building, and therefore are a prime concern for energy performance during building design and operation. Both steady-state and dynamic models of comfort energy use, which take into account variations in indoor and outdoor temperatures, solar radiation, and the thermal mass of the building, were explicitly considered. Steady-state inverse models are frequently used for thermal analysis to evaluate HVAC energy performance. These are fast and accurate, offer great flexibility for mathematical modifications, and can be applied to a variety of buildings. The results are presented as a horizontal study that compares energy consumption across homes to arrive at a generic rather than a unique model, to be used in future discussions in the context of ultra-efficient homes. It is suggested that similar analyses of energy-use data comparing the performance of a variety of ultra-efficient technologies be conducted to provide more accurate indications of consumption by end use for future single-family residences. These can be used alongside the Residential Energy Consumption Survey (RECS) and the Leading Indicator for Remodeling Activity (LIRA) indices to assist in planning and policy making related to the residential energy sector.
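As a concrete illustration, one common steady-state inverse model is the three-parameter change-point cooling model. The sketch below fits such a model by grid search and least squares; it is a generic formulation rather than the specific model used in the thesis, and t_out (outdoor temperature) and energy (HVAC energy per period) are hypothetical inputs.

```python
import numpy as np

def fit_3p_cooling_model(t_out, energy, n_candidates=50):
    """Fit E = b0 + b1 * max(T_out - T_cp, 0) by grid search over the
    change-point temperature T_cp and least squares for b0, b1."""
    best = None
    for t_cp in np.linspace(t_out.min(), t_out.max(), n_candidates):
        x = np.maximum(t_out - t_cp, 0.0)              # cooling "driver" above the change point
        A = np.column_stack([np.ones_like(x), x])      # columns: intercept, slope term
        coef, *_ = np.linalg.lstsq(A, energy, rcond=None)
        sse = float(np.sum((energy - A @ coef) ** 2))
        if best is None or sse < best[0]:
            best = (sse, t_cp, coef)
    sse, t_cp, (b0, b1) = best
    return {"base_load": b0, "cooling_slope": b1, "change_point": t_cp, "sse": sse}
```

The base load captures non-HVAC end-uses, while the slope above the change point captures weather-dependent comfort energy, which is the quantity the horizontal comparison across homes targets.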
Contributors: Garkhail, Rahul (Author) / Reddy, T Agami (Thesis advisor) / Bryan, Harvey (Committee member) / Addison, Marlin (Committee member) / Arizona State University (Publisher)
Created: 2014
Description

The world of healthcare can be seen as dynamic, often an area where technology and science meet to serve a greater good for humanity. This relationship has worked well for the last century, as evidenced by the change in average life expectancy. Over the greater part of the last five decades, average life expectancy at birth increased globally by almost 20 years. In the United States specifically, life expectancy has grown from 50 years in 1900 to 78 years in 2009, a 56% increase in just over a century. As beneficial as this increase is for humanity, it also means real issues are approaching in the healthcare world. A larger older population will need more healthcare services, but there will be fewer young professionals to provide those services. Technology and science will need to continue pushing boundaries in order to develop and provide the solutions needed to give the aging world population sufficient healthcare. One solution sure to help provide a brighter future for healthcare is mobile health (m-health). M-health can provide a means for healthcare professionals to treat more patients with less work expenditure and with more personalized healthcare advice, which will lead to better treatments. This paper discusses one area of m-health devices in particular: human breath analysis devices. The current laboratory methods of breath analysis, and why these methods are not adequate for common healthcare practice, are discussed in detail. Then, more specifically, mobile breath analysis devices are discussed. The topic encompasses the challenges that need to be met in developing such devices, possible solutions to these challenges, two real examples of mobile breath analysis devices, and finally possible future directions for m-health technologies.
Contributors: Lester, Bryan (Author) / Forzani, Erica (Thesis advisor) / Xian, Xiaojun (Committee member) / Trimble, Steve (Committee member) / Arizona State University (Publisher)
Created: 2012
Description

This research proposes that a cross-cultural disconnect exists between Japanese and American English in the realm of bodily functions used as metaphor. Perhaps nowhere is this notion illustrated more clearly than by a cartoon that was inspired by recent tragic events in Japan. On the afternoon of Friday, March 11, 2011, the northeast coast of Japan was struck by a massive earthquake and tsunami that caused immeasurable loss of life and property and catastrophic damage to the nuclear power plant in Fukushima Prefecture. In the immediate wake of these events, Japanese artist Hachiya Kazuhiko, determined to make the situation comprehensible to children, created a cartoon in which he anthropomorphized the damaged Fukushima Daiichi reactor and likened the dangers associated with it to illness and bodily functions. The cartoon garnered considerable attention, both in Japan and abroad, but the reactions of English speakers appeared to differ from those of Japanese speakers, suggesting a possible cross-cultural disconnect. This research into the reactions to the cartoon and other relevant literature (both in English and Japanese), viewed against federal regulations regarding the broadcast of "obscenity" in the United States, commentary on American society, and perceptions of similar language in American cartoons, clearly indicates that negative attitudes toward the use of bodily functions as metaphor exist in the United States, while the same usage is seen differently in Japan.
Contributors: Hacker, Michael (Author) / Adams, Karen (Thesis advisor) / Gelderen, Elly van (Thesis advisor) / Prior, Matthew (Committee member) / Arizona State University (Publisher)
Created: 2012
Description

Modern computer systems are complex engineered systems involving a large collection of individual parts, each with many parameters, or factors, affecting system performance. One way to understand these complex systems and their performance is through experimentation. However, most modern computer systems involve such a large number of factors that thorough experimentation on all of them is impossible. An initial screening step is thus necessary to determine which factors are relevant to the system's performance and which factors can be eliminated from experimentation.

Factors may impact system performance in different ways. A factor at a specific level may significantly affect performance as a main effect, or in combination with other factors as an interaction. For screening, it is necessary both to identify the presence of these effects and to locate the factors responsible for them. A locating array is a relatively new experimental design that causes every main effect and interaction to occur and distinguishes all sets of d main effects and interactions from each other in the tests where they occur. This design is therefore helpful in screening complex systems.

The process of screening using locating arrays involves multiple steps. First, a locating array is constructed for all possibly significant factors. Next, the system is executed for all tests indicated by the locating array and a response is observed. Finally, the response is analyzed to identify the significant system factors for future experimentation. However, simply constructing a reasonably sized locating array for a large system is no easy task, and analyzing the response of the tests presents additional difficulties due to the large number of possible predictors and the inherent imbalance in the experimental design itself. Further complications can arise from noise in the system or errors in testing.
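As a rough illustration of the analysis step, the sketch below uses greedy forward selection over main effects and two-way interactions; it is a simplified stand-in for the thesis's analysis algorithm (which must also cope with the design's imbalance and with noise), and X, y, and max_terms are hypothetical inputs.

```python
import numpy as np
from itertools import combinations

def forward_select_terms(X, y, max_terms=5):
    """Greedily pick the main effects and two-way interactions that most
    reduce residual error. X: tests x factors matrix of 0/1 factor settings
    from the screening design; y: observed responses."""
    n, k = X.shape
    candidates = [(("main", i), X[:, i]) for i in range(k)]
    candidates += [(("interaction", i, j), X[:, i] * X[:, j])
                   for i, j in combinations(range(k), 2)]
    chosen, design = [], [np.ones(n)]            # start from an intercept-only model
    for _ in range(max_terms):
        best = None
        for name, col in candidates:
            if any(name == c for c, _ in chosen):
                continue                          # skip terms already in the model
            A = np.column_stack(design + [col])
            coef, *_ = np.linalg.lstsq(A, y, rcond=None)
            sse = float(np.sum((y - A @ coef) ** 2))
            if best is None or sse < best[0]:
                best = (sse, name, col)
        chosen.append((best[1], best[2]))
        design.append(best[2])
    return [name for name, _ in chosen]
```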

This thesis has three contributions. First, it provides an algorithm to construct locating arrays using the Lovász Local Lemma with Moser-Tardos resampling. Second, it gives an algorithm to analyze the system response efficiently. Finally, it studies the robustness of the analysis to the heavy-hitters assumption underlying the approach as well as to varying amounts of system noise.
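The first contribution can be illustrated, in highly simplified form, by the Moser-Tardos resampling loop below, written for plain pairwise coverage rather than the locating property; k (factors), v (levels), and n (tests) are hypothetical parameters, and n must be large enough for the Lovász Local Lemma condition to hold or the loop may not terminate.

```python
import random
from itertools import combinations

def moser_tardos_pairwise_cover(k, v, n, seed=0):
    """Randomly fill an n x k array over v levels, then repeatedly find a
    pair of columns missing some level combination (a 'bad event') and
    resample the entries that event depends on, until every pair of
    columns covers all v*v level pairs."""
    rng = random.Random(seed)
    A = [[rng.randrange(v) for _ in range(k)] for _ in range(n)]

    def some_uncovered_pair():
        for c1, c2 in combinations(range(k), 2):
            seen = {(A[r][c1], A[r][c2]) for r in range(n)}
            if len(seen) < v * v:
                return c1, c2
        return None

    bad = some_uncovered_pair()
    while bad is not None:
        c1, c2 = bad
        for r in range(n):               # resample both columns of the bad event
            A[r][c1] = rng.randrange(v)
            A[r][c2] = rng.randrange(v)
        bad = some_uncovered_pair()
    return A
```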
Contributors: Seidel, Stephen (Author) / Syrotiuk, Violet R. (Thesis advisor) / Colbourn, Charles J. (Committee member) / Montgomery, Douglas C. (Committee member) / Arizona State University (Publisher)
Created: 2018
Description

In the past 10 to 15 years, there has been a tremendous increase in the number of photovoltaic (PV) modules being both manufactured and installed in the field. Power plants in the hundreds of megawatts are continually being brought online as the world turns toward greener and more sustainable energy. For this reason, and to calculate the levelized cost of energy (LCOE), it is becoming increasingly important to understand the behavior of these systems as a whole by determining two key pieces of data: the rate at which modules are degrading in the field, and the trend (linear or nonlinear) that the degradation follows. Non-intrusive measurements are preferable to periodic, intrusive in-field current-voltage (I-V) measurements for obtaining these data, since owners do not want to lose money by turning their systems off, and intrusive testing raises safety concerns and may breach installer warranty terms. In order to understand the degradation behavior of PV systems, there is a need for highly accurate performance modeling. In this thesis, 39 commercial PV power plants from the hot-dry climate of Arizona are analyzed to develop an understanding of the rate and trend of degradation seen by crystalline silicon PV modules. A total of three degradation rates were calculated for each power plant based on three methods: Performance Ratio (PR), Performance Index (PI), and raw kilowatt-hours. These methods were validated against in-field I-V measurements obtained by the Arizona State University Photovoltaic Reliability Lab (ASU-PRL). With the use of highly accurate performance models, the resulting degradation rates may be used by system owners to claim a warranty from PV module manufacturers or other responsible parties.
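As a simple illustration of how a degradation rate can be extracted from a performance time series, the sketch below fits a straight line to a monthly Performance Ratio series and reports the slope as a percentage per year; this is a generic least-squares approach, not necessarily the exact procedure behind the three methods in the thesis, and months/pr are hypothetical inputs.

```python
import numpy as np

def annual_degradation_rate(months, pr):
    """Estimate degradation in %/year from a monthly Performance Ratio series.
    months: elapsed months since commissioning; pr: PR for each month."""
    slope, intercept = np.polyfit(months, pr, 1)   # PR change per month
    return 100.0 * (slope * 12.0) / intercept      # % of initial PR per year (negative = degrading)
```

By convention, the magnitude of a negative slope is reported as the degradation rate; seasonal effects are typically smoothed or corrected before fitting.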
Contributors: Raupp, Christopher (Author) / Tamizhmani, Govindasamy (Thesis advisor) / Srinivasan, Devarajan (Committee member) / Rogers, Bradley (Committee member) / Arizona State University (Publisher)
Created: 2016
Description

Construction work is ergonomically hazardous, as it requires numerous awkward postures, heavy lifting, and other forceful exertions. Prolonged repetition and overexertion have a cumulative effect on workers, often resulting in work-related musculoskeletal disorders (WMSDs). The United States spends approximately $850 billion a year on WMSDs. Mechanical installation workers experience serious overexertion injuries at rates exceeding the national average for all industries and all construction workers, second only to laborers. The main contributing factors of WMSDs are ergonomic loads and extreme stresses due to incorrect postures. The motivation for this study is to reduce WMSDs among mechanical system (HVAC system) installation workers. To achieve this goal, it is critical to reduce the ergonomic loads and extreme postures of these installers. This study has the following specific aims: (1) to measure the ergonomic loads on specific body regions (shoulders, back, neck, and legs) for different HVAC installation activities; and (2) to investigate how different activity parameters (material characteristics, equipment, workers, etc.) affect the severity and duration of ergonomic demands. The study focuses on the following activities: (1) layout, (2) ground assembly of ductwork, and (3) installation of duct and equipment at ceiling height using different methods. The researcher observed and analyzed 15 HVAC installation activities among three Arizona mechanical contractors. The activities were analyzed ergonomically using a postural guide developed from the RULA and REBA methods. The simultaneous analysis of the production tasks and the ergonomic loads identified the tasks with the highest postural loads for different body regions and the influence of the different work variables on extreme body postures. Based on this analysis, the results support recommendations to mitigate long-duration activities and exposure to extreme postures. These recommendations can potentially reduce risk, improve productivity, and lower injury costs in the long term.
Contributors: Hussain, Sanaa Fatima (Author) / Mitropoulos, Panagiotis (Thesis advisor) / Wiezel, Avi (Committee member) / Guarascio-Howard, Linda (Committee member) / Arizona State University (Publisher)
Created: 2011
Description

Parkinson’s Disease is one of the most complicated and prevalent neurodegenerative diseases in the world. Previous analyses of Parkinson’s disease have identified both speech and gait deficits throughout the progression of the disease. There has been minimal research into the correlation between the speech and gait deficits in those diagnosed with Parkinson’s, although the similar pathology and origins of both deficits strongly suggest that such a correlation exists. This exploratory study aims to establish correlations between the gait and speech deficits in those diagnosed with Parkinson’s disease. Using previously identified motor and speech measurements and tasks, I conducted a correlational study of individuals with Parkinson’s disease at baseline. There were correlations between multiple speech and gait variability outcomes. The expected correlations ranged from average harmonics-to-noise ratio values against anticipatory postural adjustment lateral peak distance to average shimmer values against anticipatory postural adjustment lateral peak distance. There were also unexpected outcomes, ranging from F2 variability against the average number of steps in a turn to intensity variability against step duration variability. I also analyzed the speech changes over 1 year as a secondary outcome of the study. Finally, I found that the averages and variabilities of the primary speech outcomes increased over 1 year. This study serves as a basis for further treatment that may be able to simultaneously treat both speech and gait deficits in those diagnosed with Parkinson’s. The exploratory study also indicates multiple targets for further investigation to better understand cohesive and compensatory mechanisms.
Contributors: Belnavis, Alexander Salvador (Author) / Peterson, Daniel (Thesis advisor) / Daliri, Ayoub (Committee member) / Berisha, Visar (Committee member) / Arizona State University (Publisher)
Created: 2022