Although these commercial and research systems are still in testing, it is important to understand how AVs are being marketed to the general public and how they are perceived, so that one day they may be effectively adopted into everyday life. People do not want to share the road with a car they do not trust, so two questions arise: why don't people trust these vehicles, and how can companies and researchers improve their trustworthiness?
The next question is: what will these changes in roles and responsibilities look like for the auditors of the future? Cognitive technology will assuredly present new issues for which humans will have to find solutions.
• How will humans be able to test the accuracy and completeness of the decisions derived by cognitive systems?
• If cognitive computing systems rely on supervised learning, what is the most effective way to train systems?
• How will cognitive computing fare in an industry whose regulations are ever-changing?
• Will cognitive technology enhance the quality of audits?
In order to answer these questions and many more, I plan to examine how cognitive technologies evolved into their present-day uses. Based on this historical trajectory, stakeholder interviews, and industry research, I will forecast what auditing jobs may look like in the near future, taking into account rapid advances in cognitive computing.
The conclusions forecast a future in auditing that is much more accurate, timely, and pleasant. Cognitive technologies allow auditors to test entire populations of transactions, to tackle audit issues on a more continuous basis, to alleviate the overload of work that occurs after fiscal year-end, and to focus on client interaction.
Fabrication of Alignment and Chemical Gradient Scaffold for Tendon-Bone Repair using Electrospinning
Based on the findings of previous studies, there was speculation that two well-known experimental design software packages, JMP and Design Expert, produced differing power outputs given the same design and user inputs. For context and scope, another popular experimental design software package, Minitab® Statistical Software version 17, was added to the comparison. The study compared multiple test cases run on the three software packages, focusing on 2^k and 3^k factorial designs and varying the standard deviation, effect size, number of categorical factors, levels, number of factors, and replicates. All six cases were run on all three programs at one, two, and three replicates each, where possible. There was an issue at the one-replicate stage, however: Minitab does not allow single-replicate full factorial designs, and Design Expert will not provide power outputs for a single replicate unless there are three or more factors. From the analysis of these results, it was concluded that the differences between JMP 13 and Design Expert 10 were well within the margin of error and likely caused by rounding. The differences between JMP 13, Design Expert 10, and Minitab 17, on the other hand, indicated a fundamental difference in the way Minitab performs the power calculation compared to the latest versions of JMP and Design Expert. This was likely caused by Minitab's use of dummy-variable coding as its default, rather than the orthogonal coding that is the default in the other two packages. Although dummy-variable and orthogonal coding for factorial designs do not change the fitted results, the two methods do affect the power calculations. All three programs can be adjusted to use either coding method, but the exact instructions for doing so are difficult to find, and thus a follow-up guide on changing the coding for factorial variables would address this issue.
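The power comparison above can be made concrete with a small sketch. The function below is not the calculation used by JMP, Design Expert, or Minitab — their internal methods are exactly what the study probes — but a standard textbook approximation using the noncentral F distribution, under the assumptions that the design is a 2^k full factorial, the model is saturated (so the error degrees of freedom are N − 2^k), and the effect of interest is a main effect of size delta (the difference between the two level means). The function name and its None convention for undefined power are illustrative choices.

```python
import scipy.stats as st

def factorial_power(k, replicates, delta, sigma, alpha=0.05):
    """Approximate power to detect a main effect of size `delta` in a
    2^k full factorial with `replicates` replicates and noise sd `sigma`.
    Assumes a saturated model, so error df = N - 2^k."""
    N = replicates * 2**k          # total number of runs
    df_err = N - 2**k              # error degrees of freedom
    if df_err <= 0:
        # No error df left: power is undefined (compare the
        # single-replicate limitation noted in the abstract).
        return None
    # Noncentrality parameter for the F test of one main effect;
    # delta/2 is the coefficient under orthogonal (-1/+1) coding.
    ncp = N * (delta / (2 * sigma)) ** 2
    f_crit = st.f.ppf(1 - alpha, 1, df_err)
    return 1 - st.ncf.cdf(f_crit, 1, df_err, ncp)
```

For example, `factorial_power(3, 2, 2.0, 1.0)` evaluates a 2^3 design with two replicates, and `factorial_power(3, 1, 2.0, 1.0)` returns None because a single replicate of a saturated 2^3 model leaves no error degrees of freedom — mirroring why some packages refuse the one-replicate case.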