Matching Items (9)
Description

Workplace productivity is a result of many factors, and among them is the setup of the office and its resultant noise level. The conversations and interruptions that come with converting an office to an open plan can foster innovation and creativity, or they can be distracting and harm the performance of employees. Through simulation, the impact of different types of office noise was studied along with other changing conditions, such as the number of people in the office. When productivity per person, defined in terms of mood and focus, was measured, it was found that the effect of noise was positive in some scenarios and negative in others. In simulations where employees were performing very similar tasks, noise (and its correlates, such as the number of employees) was beneficial. On the other hand, when employees were engaged in a variety of different types of tasks, noise had a negative overall effect. This indicates that workplaces that group their employees by common job functions may be more productive than workplaces where the problems and products that employees are working on are varied throughout the workspace.
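A minimal sketch of how such a simulation could be structured, assuming hypothetical mood and focus attributes and a noise level driven by headcount; the attribute names and the task-similarity rule are illustrative, not the author's actual model:

```python
import random

def simulate_office(n_employees, task_similarity, hours=8, seed=0):
    """Toy sketch: noise scales with headcount; its effect on per-person
    productivity (mood * focus) flips sign with task similarity."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_employees):
        mood = rng.uniform(0.5, 1.0)
        focus = rng.uniform(0.5, 1.0)
        noise = 0.02 * n_employees                      # more people -> more noise
        # Similar tasks: chatter is useful; varied tasks: chatter distracts.
        effect = noise * (1 if task_similarity > 0.5 else -1)
        total += (mood * focus) * (1 + effect) * hours
    return total / n_employees                          # productivity per person

print(simulate_office(20, task_similarity=0.9))  # grouped by common job function
print(simulate_office(20, task_similarity=0.2))  # varied tasks across the workspace
```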
Contributors: Hall, Mikaela Starrantino (Author) / Pavlic, Theodore P. (Thesis director) / Cooke, Nancy (Committee member) / Industrial, Systems (Contributor) / Barrett, The Honors College (Contributor)
Created: 2017-05
Description

The first step in process improvement is to scope the problem; the next is to measure the current process. If data is not readily available and cannot be collected manually, then a measurement system must be implemented. General Dynamics Mission Systems (GDMS) is a lean company that is always seeking to improve. One of their current bottlenecks is the incoming inspection department. This department is responsible for finding defects in purchased parts and is critical to the high-reliability products GDMS produces. To stay competitive and hold their market share, a decision was made to optimize incoming inspection. This proved difficult because no data was being collected. Early steps in many process improvement methodologies, such as Define, Measure, Analyze, Improve and Control (DMAIC), include data collection; however, no measurement system was in place, resulting in no available data for improvement. The solution to this problem was to design and implement a Management Information System (MIS) that tracks a variety of data. This provides the company with data that can be used for analysis and improvement. The first stage of the MIS was developed in Microsoft Excel with Visual Basic for Applications because of the low cost and overall effectiveness of the software. Excel allows updates to be made quickly and allows GDMS to collect data immediately. Stage two would move the MIS to more robust software, such as Access or MySQL. This thesis focuses only on stage one of the MIS; GDMS will proceed with stage two.
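A minimal sketch, in Python rather than the thesis's Excel/VBA implementation, of the kind of inspection record such an MIS might capture and log for later DMAIC analysis; the field names are assumptions for illustration only:

```python
import csv
from dataclasses import dataclass, asdict
from datetime import date

@dataclass
class InspectionRecord:
    """Hypothetical fields an incoming-inspection MIS might track."""
    part_number: str
    supplier: str
    received: date
    inspected: date
    qty_inspected: int
    qty_defective: int

    @property
    def defect_rate(self) -> float:
        return self.qty_defective / self.qty_inspected if self.qty_inspected else 0.0

records = [InspectionRecord("PN-1001", "Supplier A", date(2017, 1, 3),
                            date(2017, 1, 5), 50, 2)]

# Persist to CSV so the data is available for later analysis and improvement.
with open("incoming_inspection_log.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(asdict(records[0])) + ["defect_rate"])
    writer.writeheader()
    for r in records:
        writer.writerow({**asdict(r), "defect_rate": r.defect_rate})
```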

Contributors: Diaz, Angel (Author) / McCarville, Daniel R. (Thesis director) / Pavlic, Theodore (Committee member) / Industrial, Systems (Contributor) / Barrett, The Honors College (Contributor)
Created: 2017-05
Description

The overall energy consumption around the United States has not been reduced even with the advancement of technology over the past decades. Deficiencies exist between design and actual energy performance. Energy Infrastructure Systems (EIS) are impacted when the amount of energy production cannot be accurately and efficiently forecasted. Inaccurate engineering assumptions can result when there is a lack of understanding of how energy systems operate in real-world applications. Energy systems are complex, which results in unknown system behaviors due to an unknown structural system model. Currently, reverse engineering lacks the data mining techniques needed to develop efficient structural system models. In this project, a new type of reverse engineering algorithm was applied to a year's worth of energy data collected from an ASU research building, MacroTechnology Works, to identify the structural system model. Developing and understanding structural system models is the first step in creating accurate predictive analytics for energy production. The associative network of the building's data is highlighted to accurately depict the structural model. This structural model will enhance energy infrastructure systems' energy efficiency, reduce energy waste, and narrow the gaps between energy infrastructure design, planning, operation and management (DPOM).
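The thesis does not spell out the algorithm, but one common way to expose an associative network from a year of building time-series data is a pairwise-correlation graph; a hedged sketch under that assumption, with hypothetical sensor columns:

```python
import numpy as np
import pandas as pd

# Hypothetical hourly readings for one year (8760 hours), one column per subsystem.
rng = np.random.default_rng(0)
data = pd.DataFrame({
    "chiller_kw":  rng.normal(100, 10, 8760),
    "ahu_kw":      rng.normal(40, 5, 8760),
    "lighting_kw": rng.normal(25, 3, 8760),
})
data["ahu_kw"] += 0.5 * data["chiller_kw"]        # inject a real dependency

# Build an associative network: an edge wherever |correlation| exceeds a threshold.
corr = data.corr()
threshold = 0.5
edges = [(a, b, round(corr.loc[a, b], 2))
         for i, a in enumerate(corr.columns)
         for b in corr.columns[i + 1:]
         if abs(corr.loc[a, b]) >= threshold]
print(edges)   # candidate structural links for the system model
```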
Contributors: Camarena, Raquel Jimenez (Author) / Chong, Oswald (Thesis director) / Ye, Nong (Committee member) / Industrial, Systems (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-12
Description

Based on findings of previous studies, there was speculation that two well-known experimental design software packages, JMP and Design Expert, produced varying power outputs given the same design and user inputs. For context and scope, another popular experimental design software package, Minitab® Statistical Software version 17, was added to the comparison. The study compared multiple test cases run on the three software packages, focusing on 2^k and 3^k factorial designs and adjusting the standard-deviation effect size, the number of categorical factors, the number of levels, the number of factors, and the number of replicates. All six cases were run on all three programs at one, two, and three replicates each, where possible. There was an issue at the one-replicate stage, however: Minitab does not allow full factorial designs with only one replicate, and Design Expert will not provide power outputs for a single replicate unless there are three or more factors. From the analysis of these results, it was concluded that the differences between JMP 13 and Design Expert 10 were well within the margin of error and likely caused by rounding. The differences between JMP 13, Design Expert 10, and Minitab 17, on the other hand, indicated a fundamental difference in the way Minitab addresses power calculation compared to the latest versions of JMP and Design Expert. This is most likely caused by Minitab's use of dummy variable coding as its default, instead of the orthogonal coding that the other two packages use by default. Although dummy variable and orthogonal coding produce equivalent model results for factorial designs, the coding method affects the power calculations. All three programs can be configured to use either coding method, but the exact instructions are difficult to find, so a follow-up guide on changing the coding of factorial variables would address this issue.
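To make the coding distinction concrete, a small generic illustration of the two codings for a single two-level factor (not one of the study's actual test cases): the design matrices, and therefore the information matrices used in power calculations, differ even though the fitted models are equivalent.

```python
import numpy as np

levels = ["low", "high", "low", "high"]

# Orthogonal (effects) coding: -1 / +1 (the default the study attributes to JMP and Design Expert).
orthogonal = np.array([-1 if x == "low" else 1 for x in levels])

# Dummy (treatment) coding: 0 / 1 (the default the study attributes to Minitab).
dummy = np.array([0 if x == "low" else 1 for x in levels])

X_orth = np.column_stack([np.ones(4), orthogonal])
X_dummy = np.column_stack([np.ones(4), dummy])

# The information matrices X'X differ, so power computed from a fixed
# "effect size" input can differ between packages using different defaults.
print(X_orth.T @ X_orth)
print(X_dummy.T @ X_dummy)
```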

Contributors: Armstrong, Julia Robin (Author) / McCarville, Daniel R. (Thesis director) / Montgomery, Douglas (Committee member) / Industrial, Systems (Contributor) / Barrett, The Honors College (Contributor)
Created: 2017-05
Description

The widespread use of statistical analysis in sports, particularly baseball, has made it increasingly necessary for small and mid-market teams to find ways to maintain their analytical advantages over large-market clubs. In baseball, an opportunity exists for teams with limited financial resources to sign players under team control to long-term contracts before other teams can bid for their services in free agency. If small and mid-market clubs can successfully identify talented players early, clubs can save money, achieve cost certainty, and remain competitive for longer periods of time. These deals are also advantageous to players, since they receive job security and greater financial dividends earlier in their careers. The objective of this paper is to develop a regression-based predictive model that teams can use to forecast the performance of young baseball players with limited Major League experience. Several tasks were conducted to achieve this goal: (1) Data was obtained from Major League Baseball and Lahman's Baseball Database and sorted using Excel macros for easier analysis. (2) Players were separated into three positional groups based on similar fielding requirements and offensive profiles: Group I comprised first and third basemen, Group II comprised second basemen, shortstops, and center fielders, and Group III comprised left and right fielders. (3) Based on the context of baseball and the nature of offensive performance metrics, only players who achieved more than 200 plate appearances within the first two years of their major league debut were included in the analysis. (4) The statistical software package JMP was used to create regression models for each group and analyze the residuals for any irregularities or normality violations. Once the models were developed, slight adjustments were made to improve the accuracy of the forecasts and identify opportunities for future work. It was discovered that Group I and Group III were the easiest player groupings to forecast, while Group II required several attempts to improve the model.
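A hedged sketch of the filtering-and-regression step described above, using a simple least-squares fit per positional group; the 200-plate-appearance cutoff and the three groups mirror the text, but the column names, feature set, and data values are hypothetical:

```python
import numpy as np
import pandas as pd

# Hypothetical player-season data in the spirit of Lahman's Baseball Database.
players = pd.DataFrame({
    "player_id": ["a01", "b02", "c03", "d04"],
    "group":     ["I", "II", "III", "II"],       # positional grouping from the text
    "pa_first_two_years": [412, 150, 633, 508],  # plate appearances, years 1-2
    "obp_early": [0.330, 0.290, 0.365, 0.341],
    "slg_early": [0.420, 0.360, 0.510, 0.455],
    "war_year4": [2.1, 0.3, 4.0, 2.8],           # outcome to forecast
})

# Keep only players with more than 200 PA in their first two seasons.
eligible = players[players["pa_first_two_years"] > 200]

# Fit one simple linear model per positional group (ordinary least squares).
for grp, sub in eligible.groupby("group"):
    X = np.column_stack([np.ones(len(sub)), sub["obp_early"], sub["slg_early"]])
    y = sub["war_year4"].to_numpy()
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    print(grp, np.round(coef, 2))
```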
Contributors: Jack, Nathan Scott (Author) / Shunk, Dan (Thesis director) / Montgomery, Douglas (Committee member) / Borror, Connie (Committee member) / Industrial, Systems (Contributor) / Barrett, The Honors College (Contributor)
Created: 2013-05
Description

Lean philosophy is a set of practices aimed at reducing waste in an industry or enterprise. By eliminating the aspects of a system that do not add value, the process can flow continuously and, as a result, achieve a shorter cycle time. With a shorter cycle time, fewer resources are consumed, and effort can be distributed appropriately to achieve maximum efficiency. Relatedly, Six Sigma aims to reduce the variability of a system, and in turn reduce the number of defects and improve the overall quality of a product or process. For this reason, Lean and Six Sigma go hand in hand: cutting out non-value-adding steps in a process increases efficiency, and perfecting the steps that remain improves quality. Both aspects are important to the success of a business practice. The DNASU Plasmid Repository would be a major beneficiary of the Lean Six Sigma process. The process of cloning DNA requires great attention to detail and time in order to avoid defects. For instance, any mistake made in the bacterial growth process, such as contamination, results in a significant amount of wasted time. In addition, the DNA purification steps also require vigilant observation, since the procedure is highly susceptible to small mistakes that can have large impacts. The goal of this project is to integrate Lean Six Sigma methodology into the DNASU laboratory. By applying numerous aspects of Lean Six Sigma, the DNA repository will be able to improve the efficiency and quality of its processes and achieve its highest rate of success.
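As a concrete illustration of the Six Sigma side of such a project, a short defects-per-million-opportunities (DPMO) and sigma-level calculation of the kind commonly used to baseline a process; the plasmid-prep numbers below are invented for illustration, not data from DNASU:

```python
from statistics import NormalDist

def dpmo(defects, units, opportunities_per_unit):
    """Defects per million opportunities."""
    return defects / (units * opportunities_per_unit) * 1_000_000

def sigma_level(dpmo_value, shift=1.5):
    """Conventional short-term sigma level with the standard 1.5-sigma shift."""
    return NormalDist().inv_cdf(1 - dpmo_value / 1_000_000) + shift

# Invented example: 12 contaminated preps out of 400 runs, 3 failure opportunities each.
d = dpmo(defects=12, units=400, opportunities_per_unit=3)
print(round(d), round(sigma_level(d), 2))
```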

Contributors: Morton, Haley (Author) / McCarville, Daniel (Thesis director) / Eyerly, Ann (Committee member) / Taylor, Clayton (Committee member) / Barrett, The Honors College (Contributor) / Industrial, Systems & Operations Engineering Prgm (Contributor)
Created: 2023-05
Description

This paper analyzes the impact of the December 2022 winter storm on Southwest Airlines (SWA). The storm caused delays and cancellations for all airlines, but SWA was the only major airline that was unable to recover fully. The disruption was unique due to the higher volume of people traveling during the holiday season and the lack of good alternative transportation for stranded passengers. The paper explains SWA's point-to-point (PTP) model, which allows the airline to offer competitive ticket prices, and the organizational factors that have helped it hold a significant market share. The paper also discusses previous failures of SWA's IT and aircraft maintenance management systems and its outdated crewing system, which were not addressed until after the storm. The paper uses AnyLogic agent-based modeling to investigate why SWA was affected so severely and why it took so long to recover.
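The thesis itself uses AnyLogic; purely as an illustrative stand-in, a tiny Python sketch of how cancellations can cascade through a point-to-point network when crews end up out of position. All rates and the recovery rule are hypothetical, not drawn from the SWA analysis.

```python
import random

def simulate_ptp_day(n_flights=100, storm_cancel_rate=0.3, reposition_prob=0.4, seed=1):
    """Toy point-to-point cascade: a cancelled leg strands its crew, and each
    stranded crew cancels a later leg unless it can be repositioned in time."""
    rng = random.Random(seed)
    cancelled = 0
    stranded_crews = 0
    for _ in range(n_flights):
        if rng.random() < storm_cancel_rate:          # weather cancellation
            cancelled += 1
            stranded_crews += 1
        elif stranded_crews and rng.random() > reposition_prob:
            cancelled += 1                            # knock-on cancellation
        elif stranded_crews:
            stranded_crews -= 1                       # crew successfully repositioned
    return cancelled

print(simulate_ptp_day(reposition_prob=0.8))  # responsive crew-recovery system
print(simulate_ptp_day(reposition_prob=0.2))  # outdated crewing system struggles
```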

Contributors: Bray, Mariana (Author) / McCarville, Daniel (Thesis director) / Kucukozyigit, Ali (Committee member) / Barrett, The Honors College (Contributor) / Industrial, Systems & Operations Engineering Prgm (Contributor) / School of Mathematical and Statistical Sciences (Contributor)
Created: 2023-05