Matching Items (10)
Description
The wood-framing trade has not been sufficiently investigated to understand work task sequencing and coordination among crew members. A new mental framework for a performing crew was developed and tested through four case studies. This framework ensured team performance similar to that provided by task micro-scheduling in planning software. It also allowed evaluation of the effect of individual coordination within the crew on the crew's productivity. Using design information, a list of micro-activities/tasks and their predecessors was automatically generated for each piece of lumber in the four wood frames. The task precedence was generated by applying elementary geometrical and technological reasoning to each frame. Then, the duration of each task was determined based on observations from videotaped activities. Primavera (P6) resource leveling rules were used to calculate the sequencing of tasks and the minimum duration of the whole activity for various crew sizes. The results showed quick convergence towards the minimum production time and made it possible to use information from Building Information Models (BIM) to automatically establish optimal crew sizes for frames. The Late Start (LS) leveling priority rule gave the shortest duration in every case. However, the logic of the LS rule is too complex to be conveyed to the framing crew. Therefore, a new mental framework of a well-performing framer was developed and tested to ensure high coordination. This mental framework, based on five simple rules, can be easily taught to the crew and ensures crew productivity congruent with that provided by the LS logic. The case studies indicate that once the worst framer in the crew exceeds 11% deviation from applying the said five rules, every additional percent of deviation reduces the productivity of the whole crew by about 4%.
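The leveling experiment above can be illustrated with a minimal sketch: a greedy resource-constrained scheduler that, whenever a crew member is free, starts the ready task with the smallest Late Start. All task names, durations, and precedence links below are invented toy values, not data from the thesis, and the scheduler is a simplified stand-in for P6's leveling engine.

```python
# Hedged sketch of resource leveling with a Late Start (LS) priority rule,
# in the spirit of the P6 experiment described above. Toy data only.

def late_starts(tasks, succ, horizon):
    """Backward pass: LS = (earliest successor LS, or horizon) - duration."""
    ls = {}
    def visit(t):
        if t not in ls:
            ends = [visit(s) for s in succ.get(t, [])]
            ls[t] = (min(ends) if ends else horizon) - tasks[t]
        return ls[t]
    for t in tasks:
        visit(t)
    return ls

def level(tasks, pred, crew_size):
    """Greedy leveling: start ready tasks in LS order. Returns makespan."""
    succ = {}
    for t, ps in pred.items():
        for p in ps:
            succ.setdefault(p, []).append(t)
    ls = late_starts(tasks, succ, horizon=sum(tasks.values()))
    done, running, clock = set(), [], 0          # running: (finish, task)
    while len(done) < len(tasks):
        active = {t for _, t in running}
        ready = sorted((t for t in tasks
                        if t not in done and t not in active
                        and all(p in done for p in pred.get(t, []))),
                       key=lambda t: ls[t])      # LS priority rule
        for t in ready[:crew_size - len(running)]:
            running.append((clock + tasks[t], t))
        clock = min(f for f, _ in running)       # advance to next finish
        done |= {t for f, t in running if f <= clock}
        running = [(f, t) for f, t in running if f > clock]
    return clock

# Toy frame: a bottom plate, two studs, then sheathing (minutes)
tasks = {"plate": 4, "stud_a": 3, "stud_b": 3, "sheath": 5}
pred = {"stud_a": ["plate"], "stud_b": ["plate"], "sheath": ["stud_a", "stud_b"]}
print(level(tasks, pred, crew_size=1), level(tasks, pred, crew_size=2))
```

On this toy frame the makespan converges quickly with crew size (15 minutes for one framer, 12 for two or more), mirroring the quick convergence toward minimum production time reported above.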
ContributorsMaghiar, Marcel M (Author) / Wiezel, Avi (Thesis advisor) / Mitropoulos, Panagiotis (Committee member) / Cooke, Nancy J. (Committee member) / Arizona State University (Publisher)
Created2011
Description
Highly automated vehicles require drivers to remain aware enough to take over during critical events. Driver distraction is a key factor that prevents drivers from reacting adequately, and thus there is a need for an alert to help drivers regain situational awareness and be able to act quickly and successfully should a critical event arise. This study examines two aspects of alerts that could help facilitate driver takeover: mode (auditory and tactile) and direction (towards and away). Auditory alerts appear to be somewhat more effective than tactile alerts, though both modes produce significantly faster reaction times than no alert. Alerts moving towards the driver also appear to be more effective than alerts moving away from the driver. Future research should examine how multimodal alerts differ from single-mode alerts, and whether higher-fidelity alerts influence takeover times.
ContributorsBrogdon, Michael A (Author) / Gray, Robert (Thesis advisor) / Branaghan, Russell (Committee member) / Chiou, Erin (Committee member) / Arizona State University (Publisher)
Created2018
Description
Catastrophe events occur rather infrequently, but upon their occurrence can lead to colossal losses for insurance companies. Due to their size and volatility, catastrophe losses are often treated separately from other insurance losses. In fact, many property and casualty insurance companies feature a department or team that focuses solely on modeling catastrophes. Setting reserves for catastrophe losses is difficult due to their unpredictable and often long-tailed nature. Determining loss development factors (LDFs) to estimate the ultimate loss amounts for catastrophe events is one method for setting reserves. In an attempt to help Company XYZ set more accurate reserves, the research conducted focuses on estimating LDFs for catastrophes which have already occurred and been settled. Furthermore, the research describes the process used to build a linear model in R to estimate LDFs for Company XYZ's closed catastrophe claims from 2001 to 2016. This linear model was used to predict a catastrophe's LDFs based on the age in weeks of the catastrophe during the first year. Backtesting was also performed, as was a comparison between the estimated ultimate losses and actual losses. Future research considerations were also proposed.
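The core idea, a linear model predicting an LDF from a catastrophe's age in weeks, can be sketched briefly. The data points, the log-age transform, and the `predict_ldf` name below are illustrative assumptions, not the thesis's actual model, data, or R code.

```python
# Hedged sketch: predicting a catastrophe's loss development factor (LDF)
# from its age in weeks with a simple linear model. Data are invented.
import numpy as np

ages = np.array([4.0, 8, 13, 26, 39, 52])           # age in weeks at evaluation
ldfs = np.array([2.8, 2.1, 1.7, 1.3, 1.15, 1.05])   # observed LDFs to ultimate

# Assumed form: LDF ~ a + b * log(age). Development typically decays
# with age, so a log transform keeps the relationship roughly linear.
b, a = np.polyfit(np.log(ages), ldfs, 1)

def predict_ldf(age_weeks):
    """Estimated LDF for a catastrophe of the given age (weeks)."""
    return a + b * np.log(age_weeks)

print(round(float(predict_ldf(20)), 3))
```

Backtesting such a model amounts to fitting on early evaluations and comparing predicted ultimates against the settled amounts, as the thesis describes.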
ContributorsSwoverland, Robert Bo (Author) / Milovanovic, Jelena (Thesis director) / Zicarelli, John (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / Barrett, The Honors College (Contributor)
Created2018-05
Description
Tolerance specification for manufacturing components from 3D models is a tedious task and often requires the expertise of “detailers”. The work presented here is part of a larger ongoing project aimed at automating tolerance specification to aid less experienced designers by producing consistent geometric dimensioning and tolerancing (GD&T). Tolerance specification can be separated into two major tasks: tolerance schema generation and tolerance value specification. This thesis focuses on the latter part of automated tolerance specification, namely tolerance value allocation and analysis. The tolerance schema (sans values) required prior to these tasks has already been generated by the auto-tolerancing software. This information is communicated through a constraint tolerance feature graph file developed previously at the Design Automation Lab (DAL) and is consistent with the ASME Y14.5 standard.

The objective of this research is to allocate tolerance values that ensure the assemblability conditions are satisfied. Assemblability refers to “the ability to assemble/fit a set of parts in a specified configuration given a nominal geometry and its corresponding tolerances”. Assemblability is determined by the clearances between the mating features. These clearances are affected by the accumulation of tolerances in tolerance loops, and hence the tolerance loops are extracted first. Once tolerance loops have been identified, initial tolerance values are allocated to the contributors in these loops. It is highly unlikely that the initial allocation would satisfy the assemblability requirements. Overlapping loops have to be satisfied simultaneously and progressively; hence, tolerances need to be re-allocated iteratively. This is done with the help of the tolerance analysis module.

The tolerance allocation and analysis module receives the constraint graph, which contains all basic dimensions and mating constraints from the generated schema. The tolerance loops are detected by traversing the constraint graph. The initial allocation distributes the tolerance budget, computed from the clearance available in the loop, among its contributors in proportion to the associated nominal dimensions. The analysis module subjects the loops to 3D parametric variation analysis and estimates the variation parameters for the clearances. The re-allocation module uses hill-climbing heuristics derived from the distribution parameters to select a loop. Re-allocation of the tolerance values is done using the sensitivities and the weights associated with the contributors in the stack.
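The initial allocation step described above can be sketched in a few lines: split a loop's clearance budget across its contributors in proportion to their nominal dimensions. The feature names and numbers below are illustrative only, not from the thesis.

```python
# Hedged sketch of the initial allocation step: distribute a tolerance
# loop's clearance budget among contributors in proportion to their
# nominal dimensions. Names and values are invented for illustration.

def allocate(clearance, nominals):
    """Proportional split of the clearance budget across a tolerance loop."""
    total = sum(nominals.values())
    return {name: clearance * dim / total for name, dim in nominals.items()}

# A three-contributor stack in one loop (dimensions in mm)
loop = {"pin_dia": 10.0, "hole_dia": 10.2, "plate_thk": 5.0}
tols = allocate(clearance=0.5, nominals=loop)
print(tols)
```

By construction the allocated tolerances sum to the clearance budget, so the loop closes; the iterative re-allocation described above then adjusts these shares using sensitivities and weights.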

Several test cases have been run with this software, and the desired user-input acceptance rates were achieved. Three test cases are presented and the output of each module is discussed.
ContributorsBiswas, Deepanjan (Author) / Shah, Jami J. (Thesis advisor) / Davidson, Joseph (Committee member) / Ren, Yi (Committee member) / Arizona State University (Publisher)
Created2016
Description
On-line dynamic security assessment (DSA) analysis has been developed and applied in several power dispatching control centers. Existing DSA applications are limited by assumptions about present system operating conditions and by computational speed. To overcome these obstacles, this research developed a novel two-stage DSA system to provide periodic security prediction in real time. The major contribution of this research is an open-source on-line DSA system incorporating Phasor Measurement Unit (PMU) data and forecast load. Pre-fault prediction can provide a more accurate assessment of the system and minimize the disadvantage of the low computational speed of time-domain simulation.

This thesis describes the development of the novel two-stage on-line DSA scheme using phasor measurement and load forecasting data. The computational scheme of the new system determines steady-state stability and identifies endangerments in a small time frame near real time. The new on-line DSA system periodically examines system status and predicts system endangerments in the near future every 30 minutes. System real-time operating conditions are determined by state estimation using phasor measurement data. The assessment of transient stability is carried out by running a time-domain simulation using the forecast operating point as the initial condition. The forecast operating point is calculated by DC optimal power flow based on the forecast load.
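The DC optimal power flow step above rests on the DC power flow approximation, in which bus angles follow from the susceptance matrix and net injections. A minimal sketch on an invented 3-bus system (the line susceptances and injections below are toy values, not from the thesis):

```python
# Hedged sketch: the DC power flow approximation underlying the forecast
# operating point, on an invented 3-bus system. B is the susceptance
# (Laplacian-like) matrix; bus 0 is the slack bus.
import numpy as np

lines = {(0, 1): 10.0, (1, 2): 8.0, (0, 2): 5.0}   # line susceptances (p.u.)
n = 3
B = np.zeros((n, n))
for (i, j), b in lines.items():
    B[i, i] += b; B[j, j] += b
    B[i, j] -= b; B[j, i] -= b

P = np.array([0.0, 1.5, -1.5])   # net injections: generation at bus 1, load at bus 2

theta = np.zeros(n)              # bus angles (rad); slack angle fixed at 0
theta[1:] = np.linalg.solve(B[1:, 1:], P[1:])

flows = {ln: b * (theta[ln[0]] - theta[ln[1]]) for ln, b in lines.items()}
print(flows)
```

In the thesis's scheme, replacing `P` with forecast injections yields the forecast operating point that seeds the transient time-domain simulation.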
ContributorsWang, Qiushi (Author) / Karady, George G. (Thesis advisor) / Pal, Anamitra (Committee member) / Holbert, Keith E. (Committee member) / Arizona State University (Publisher)
Created2017
Description
The 21st century will be the site of numerous changes in education systems in response to a rapidly evolving technological environment where existing skill sets and career structures may cease to exist or, at the very least, change dramatically. Likewise, the nature of work will also change to become more automated and more technologically intensive across all sectors, from food service to scientific research. Simply having technical expertise or the ability to process and retain facts will in no way guarantee success in higher education or a satisfying career. Instead, the future will value those educated in a way that encourages collaboration with technology, critical thinking, creativity, clear communication skills, and strong lifelong learning strategies. These changes pose a challenge for higher education’s promise of employability and success post-graduation. Addressing how to prepare students for a technologically uncertain future is challenging. One possible model for education to prepare students for the future of work can be found within the Maker Movement. However, it is not fully understood what parts of this movement are most meaningful to implement in education more broadly, and higher education in particular. Through the qualitative analysis of nearly 160 interviews of adult makers, young makers and young makers’ parents, this dissertation unpacks how makers are learning, what they are learning, and how these qualities are applicable to education goals and the future of work in the 21st century. This research demonstrates that makers are learning valuable skills to prepare them for the future of work in the 21st century. Makers are learning communication skills, technical skills in fabrication and design, and developing lifelong learning strategies that will help prepare them for life in an increasingly technologically integrated future. This work discusses what aspects of the Maker Movement are most important for integration into higher education.
ContributorsWigner, Aubrey (Author) / Lande, Micah (Thesis advisor) / Allenby, Braden (Committee member) / Bennett, Ira (Committee member) / Arizona State University (Publisher)
Created2017
Description
Parts are always manufactured with deviations from their nominal geometry for many reasons, such as inherent inaccuracies in the machine tools and environmental conditions. It is a designer's job to devise a proper tolerance scheme that allows a manufacturer reasonable freedom for imperfections without compromising performance. It takes years of experience and strong practical knowledge of the device function, the manufacturing process, and GD&T standards for a designer to create a good tolerance scheme. There are almost no theoretical resources to help designers in GD&T synthesis. As a result, designers often create inconsistent and incomplete tolerance schemes that lead to high assembly scrap rates. The Auto-Tolerancing project was started in the Design Automation Lab (DAL) to investigate the degree to which tolerance synthesis can be automated. Tolerance synthesis includes tolerance schema generation (sans tolerance values) and tolerance value allocation. This thesis aims to address tolerance schema generation. To develop an automated tolerance schema synthesis toolset, the to-be-toleranced features need to be identified, the required tolerance types should be determined, a scheme for computer representation of the GD&T information needs to be developed, the sequence of control should be identified, and a procedure for creating datum reference frames (DRFs) should be developed. The first three steps define the architecture of the tolerance schema generation module, while the last two set up a base for creating a proper tolerance scheme with the help of GD&T good-practice rules obtained from experts. The GD&T scheme recommended by this module is used by the tolerance value allocation/analysis module to complete the process of automated tolerance synthesis. Various test cases were studied to verify the suitability of this module. The results show that the software-generated schemas are proper enough to address assemblability issues (first-order tolerancing).
Since this novel technology is at an early stage of development, further research and case studies will help improve the software toward more comprehensive tolerance schemas that cover design intent (second-order tolerancing) and cost optimization (third-order tolerancing).
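The rule-driven flavor of such schema generation can be suggested with a tiny lookup sketch: tolerance types are assigned to features by (geometry, role) class, with a default for unmatched features. The mapping below is a simplified illustration, not the expert rule base described in the thesis.

```python
# Hedged sketch of a lookup-driven first pass at tolerance schema
# generation. The mapping is invented for illustration only.

TOL_BY_FEATURE = {
    ("plane", "mating"):   ["flatness", "position"],
    ("cylinder", "fit"):   ["size", "position"],
    ("cylinder", "datum"): ["size", "perpendicularity"],
}

def schema(features):
    """features: (name, geometry, role) triples -> {name: tolerance types}."""
    return {name: TOL_BY_FEATURE.get((geom, role), ["profile"])
            for name, geom, role in features}

parts = [("base_face", "plane", "mating"),
         ("bolt_hole", "cylinder", "fit"),
         ("boss", "cone", "locating")]   # unmatched pair falls back to profile
print(schema(parts))
```

A real implementation, as the thesis notes, also needs DRF construction and a sequence of control; this sketch covers only the type-assignment idea.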
ContributorsHejazi, Sayed Mohammad (Author) / Shah, Jami J. (Thesis advisor) / Davidson, Joseph K. (Committee member) / Hansford, Dianne (Committee member) / Arizona State University (Publisher)
Created2016
Description
At least 30 data centers either broke ground or hit the planning stages around the United States over the past two years. On such technically complex projects, Mechanical, Electrical and Plumbing (MEP) systems make up a huge portion of the construction work, which makes the data center market very promising for MEP subcontractors in the coming years. However, specialized subcontractors such as electrical subcontractors are struggling to keep crews motivated. Because of the hard work involved, the construction industry is not appealing to young workers. According to The Center for Construction Research and Training, from 1985 to 2015 the percentage of workers aged 16 to 19 decreased by 67%, of those aged 20 to 24 by 49%, and of those aged 25 to 34 by 32%. Furthermore, the construction industry has been lagging behind other industries in combatting its decline in productivity. Electrical activities, especially cable pulling, are some of the most physically unsafe, tedious, and labor-intensive electrical processes on data center projects. The motivation for this research is the need to take a closer look at how this process is done and to find improvement opportunities. This thesis focuses on one potential restructuring of the cable pulling and termination process; the goal of this restructuring is optimization for automation. Through process mapping, this thesis presents a proposed cable pulling and termination process that uses automation to draw on the best abilities of humans and robots/machines. It also provides a methodology for process improvement that is applicable to the electrical scope of work as well as that of other construction trades.
ContributorsHammam, MennatAllah (Author) / Parrish, Kristen (Thesis advisor) / Ayer, Steven (Committee member) / Irish, Elizabeth (Committee member) / Arizona State University (Publisher)
Created2020
Description
AARP estimates that 90% of seniors wish to remain in their homes during retirement. Seniors need assistance as they age; historically, they have received it from family members, nursing homes, or Continuing Care Retirement Communities. For seniors not wanting any of these options, there have been very few alternatives. Now, the emergence of the continuing care at home (CCaH) program is providing hope for a different method of elder care moving forward. CCaH programs offer services such as skilled nursing care, care coordination, emergency response systems, aid with personal and health care, and transportation. Such services allow seniors to continue to live in their own homes with assistance as their health deteriorates over time. Currently, only 30 CCaH programs exist. With the growth of the elderly population in the coming years, this model seems poised for growth.
ContributorsSturm, Brendan (Author) / Milovanovic, Jelena (Thesis director) / Hassett, Matthew (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / Economics Program in CLAS (Contributor) / Barrett, The Honors College (Contributor)
Created2019-05
Description

The objective of this study is to build a model using R and RStudio that automates ratemaking procedures for Company XYZ's actuaries in their commercial general liability pricing department. The purpose and importance of this objective is to allow actuaries to work more efficiently and effectively by using a model that outputs the results they would otherwise have had to code and calculate on their own. Instead of spending time working toward these results, the actuaries can analyze the findings, strategize accordingly, and communicate with business partners. The model was built from R code that was later converted to Shiny, a package within RStudio that allows for building interactive web applications. The final result is a Shiny app that first takes in multiple datasets from Company XYZ's data warehouse and displays different views of the data so that actuaries can make selections on development and trend methods. The app outputs the re-created ratemaking exhibits showing the resulting developed and trended loss and premium, as well as the experience-based indicated rate level change based on prior selections. The ratemaking process and Shiny app functionality are detailed in this report.
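One of the development calculations an app like this automates is computing age-to-age loss development factors from a cumulative loss triangle. A minimal sketch of the standard volume-weighted calculation, with an invented triangle (the thesis's actual data and R code are not reproduced here):

```python
# Hedged sketch: volume-weighted age-to-age development factors from a
# cumulative loss triangle. Triangle values are invented for illustration.
import numpy as np

# Rows: accident years; columns: cumulative losses at 12/24/36 months
triangle = np.array([
    [1000.0, 1500.0, 1650.0],
    [1100.0, 1700.0, np.nan],
    [1200.0, np.nan, np.nan],
])

def age_to_age(tri):
    """Volume-weighted development factor for each column transition."""
    factors = []
    for j in range(tri.shape[1] - 1):
        known = ~np.isnan(tri[:, j + 1])     # rows developed to the next age
        factors.append(tri[known, j + 1].sum() / tri[known, j].sum())
    return factors

f = age_to_age(triangle)
print([round(x, 4) for x in f])
```

Chaining these factors out to ultimate, and applying trend, yields the developed and trended losses that feed the indicated rate level change described above.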

ContributorsGilkey, Gina (Author) / Zicarelli, John (Thesis director) / Milovanovic, Jelena (Committee member) / Barrett, The Honors College (Contributor) / School of Mathematical and Statistical Sciences (Contributor)
Created2022-05