The ASU COVID-19 testing laboratory process was developed so that the lab could operate as the primary testing site for all ASU staff, students, and specified external individuals. Samples are collected at various collection sites, including a walk-in site at the SDFC and several drive-up sites on campus; analysis is conducted on the ASU campus, and results are distributed virtually to all patients via the Health Services patient portal. The following is a literature review of past implementations of various process improvement techniques and how they can be applied to the ABCTL testing process to achieve laboratory goals. (abstract)
Project management is the crucial component for managing and mitigating the inherent risks associated with changes in technology and innovation. The procedures used to track the schedule, budget, and scope of projects in conventional fields such as engineering, manufacturing, and construction are essential to a project's success. Cost overruns, schedule changes, and other natural risks must be managed effectively. But what happens when a project manager is tasked with delivering an attraction that must withstand harsh weather conditions and millions of guests every year, for a company with arguably the highest standards for quality and guest satisfaction? This describes the project managers at Walt Disney Imagineering (WDI): the projects they oversee have tight budgets and aggressive schedules, and they require a bit more pixie dust than other engineering projects. The universal truth, however, is that no matter the size or scope of the endeavor, project management processes are absolutely essential to ensuring that every team member can collaborate effectively to deliver the best product.
Based on the findings of previous studies, there was speculation that two well-known experimental design software packages, JMP and Design Expert, produced different power outputs given the same design and user inputs. For context and scope, another popular experimental design software package, Minitab® Statistical Software version 17, was added to the comparison. The study compared multiple test cases run on the three software packages, focusing on 2^k and 3^k factorial designs and adjusting the standard deviation, effect size, number of categorical factors, levels, number of factors, and replicates. All six cases were run on all three programs, and each was attempted with one, two, and three replicates. There was an issue at the single-replicate stage, however: Minitab does not allow single-replicate full factorial designs, and Design Expert will not provide power outputs for a single replicate unless there are three or more factors. From the analysis of these results, it was concluded that the differences between JMP 13 and Design Expert 10 were well within the margin of error and likely caused by rounding. The differences between JMP 13, Design Expert 10, and Minitab 17, on the other hand, indicated a fundamental difference in the way Minitab addresses the power calculation compared to the latest versions of JMP and Design Expert. This was likely caused by Minitab defaulting to dummy variable coding, whereas the other two packages default to orthogonal coding. Although dummy variable and orthogonal coding yield equivalent model results for factorial designs, the choice of coding does affect the power calculations. All three programs can be adjusted to use either method of coding, but clear instructions for doing so are difficult to find, so a follow-up guide on changing the coding for factorial variables would address this issue.
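To illustrate the coding distinction the study points to, the sketch below is an illustrative assumption only, not the actual algorithm used by JMP, Design Expert, or Minitab. It computes the power of a main-effect test in a 2^2 factorial design under dummy (0/1) versus orthogonal (-1/+1) coding. Because the regression coefficient represents a different level-to-level change under each coding, entering the same nominal effect size yields a different noncentrality parameter and therefore a different reported power.

```python
# Minimal sketch (hypothetical example): power for one main effect in a
# 2^2 full factorial design under dummy versus orthogonal factor coding.
import numpy as np
from scipy import stats

def factorial_power(coding, effect=1.0, sigma=1.0, replicates=2, alpha=0.05):
    # Two-factor, two-level full factorial (2^2) design points.
    levels = [(a, b) for a in (0, 1) for b in (0, 1)]
    if coding == "orthogonal":
        # Orthogonal (effect) coding: recode 0/1 as -1/+1.
        levels = [(2 * a - 1, 2 * b - 1) for a, b in levels]
    X = np.array([[1.0, a, b] for a, b in levels])
    X = np.repeat(X, replicates, axis=0)      # replicate each design point
    n, p = X.shape
    # Variance of the estimated coefficient for factor A (second column).
    var_beta = sigma ** 2 * np.linalg.inv(X.T @ X)[1, 1]
    # Noncentrality parameter for testing that coefficient against zero.
    # "effect" is interpreted as the coefficient value, so the same number
    # implies a different level-to-level difference under each coding.
    ncp = effect ** 2 / var_beta
    df_error = n - p
    f_crit = stats.f.ppf(1 - alpha, 1, df_error)
    return 1 - stats.ncf.cdf(f_crit, 1, df_error, ncp)

for coding in ("dummy", "orthogonal"):
    print(coding, round(factorial_power(coding), 3))
```

Run as written, the two codings report noticeably different power for the same nominal inputs, which is consistent with the kind of discrepancy described above; the actual effect-size conventions in each commercial package may differ from this simplified sketch.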