Matching Items (177)
Description
For as long as humans have been working, they have been looking for ways to get that work done better, faster, and more efficiently. Over the course of human history, mankind has created innumerable spectacular inventions, all with the goal of making the economy and daily life more efficient. Today, innovations and technological advancements are happening at a pace never seen before, and technologies like automation and artificial intelligence are poised to once again fundamentally alter the way people live and work in society. Whether society is prepared or not, robots are coming to replace human labor, and they are coming fast. In many areas artificial intelligence has already disrupted entire industries of the economy. As people continue to make advancements in artificial intelligence, more industries will be disrupted, more jobs will be lost, and entirely new industries and professions will be created in their wake. The future of the economy and society will be determined by how humans adapt to the rapid innovations that are taking place every single day. In this paper I will examine the extent to which automation will take the place of human labor in the future, project the potential effect of automation on future unemployment, and consider what individuals and society will need to do to adapt and keep pace with rapidly advancing technology. I will also look at the history of automation in the economy. For centuries humans have been advancing technology to make their everyday work more productive and efficient, and for centuries this has forced humans to adapt to new technology through training and education. The thesis will additionally examine the ways in which the U.S. education system will have to adapt to meet the demands of the advancing economy, and how job retraining programs must be modernized to prepare workers for the changing economy.
ContributorsCunningham, Reed P. (Author) / DeSerpa, Allan (Thesis director) / Haglin, Brett (Committee member) / School of International Letters and Cultures (Contributor) / Department of Finance (Contributor) / Barrett, The Honors College (Contributor)
Created2018-05
Description
Pandora is a play exploring our relationship with gendered technology through the lens of artificial intelligence. Can women be subjective under patriarchy? Do robots who look like women have subjectivity? Hoping to create a better version of ourselves, The Engineer must navigate the loss of her creation, and Pandora must navigate their new world. The original production premiered March 27-28, 2018, with a cast of Caitlin Andelora, Rikki Tremblay, and Michael Tristano Jr.
ContributorsToye, Abigail Elizabeth (Author) / Linde, Jennifer (Thesis director) / Abele, Kelsey (Committee member) / Department of Information Systems (Contributor) / Economics Program in CLAS (Contributor) / Barrett, The Honors College (Contributor)
Created2018-05
Description
This paper details the specification and implementation of a single-machine blockchain simulator. It also includes a brief introduction to the history and underlying concepts of blockchain, with explanations of features such as decentralization, openness, trustlessness, and consensus. The introduction features a brief overview of public interest and current implementations of blockchain before stating potential use cases for blockchain simulation software. The paper then gives a brief literature review of blockchain's role, both as a disruptive technology and as a foundational technology. The literature review also addresses the potential and difficulties of using blockchain in Internet of Things (IoT) networks, and describes the limitations of blockchain in general regarding computational intensity, storage capacity, and network architecture. Next, the paper gives the specification for a generic blockchain structure, with summaries of the behaviors and purposes of transactions, blocks, nodes, miners, public and private key cryptography, signature validation, and hashing. Finally, the author gives an overview of their specific implementation of the blockchain using C/C++ and OpenSSL. The overview includes a brief description of all the classes and data structures involved in the implementation, including their function and behavior. While the implementation meets the requirements set forward in the specification, the results are more qualitative and intuitive, as time constraints did not allow for quantitative measurements of the network simulation. The paper concludes by discussing potential applications for the simulator and the possibility of future hardware implementations of blockchain.
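The chained-hash structure this specification describes — each block committing to its transactions and to the hash of the previous block — can be sketched as follows. This is a minimal illustrative sketch in Python, not the author's C/C++ and OpenSSL implementation; the function names and transaction format here are invented for illustration.

```python
import hashlib
import json
import time

def block_hash(block: dict) -> str:
    """Hash a block's contents deterministically with SHA-256."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def make_block(transactions: list, prev_hash: str) -> dict:
    """Assemble a block that commits to its transactions and its predecessor."""
    return {
        "timestamp": time.time(),
        "transactions": transactions,
        "prev_hash": prev_hash,
    }

# Build a two-block chain: each block stores the hash of its predecessor,
# so altering any earlier block invalidates every later link.
genesis = make_block(["alice pays bob 5"], prev_hash="0" * 64)
second = make_block(["bob pays carol 2"], prev_hash=block_hash(genesis))
assert second["prev_hash"] == block_hash(genesis)
```

The tamper-evidence property follows directly: mutating `genesis` changes `block_hash(genesis)`, so `second["prev_hash"]` no longer matches, and the mismatch propagates down the chain.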
ContributorsRauschenbach, Timothy Rex (Author) / Vrudhula, Sarma (Thesis director) / Nakamura, Mutsumi (Committee member) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created2017-12
Description
The Experimental Data Processing (EDP) software is a C++ GUI-based application to streamline the process of creating a model for structural systems based on experimental data. EDP is designed to process raw data, filter the data for noise and outliers, create a fitted model to describe that data, complete a probabilistic analysis to describe the variation between replicates of the experimental process, and analyze the reliability of a structural system based on that model. In order to help design the EDP software to perform the full analysis, the probabilistic and regression modeling aspects of this analysis have been explored. The focus has been on creating and analyzing probabilistic models for the data, adding multivariate and nonparametric fits to raw data, and developing computational techniques that allow for these methods to be properly implemented within EDP. For creating a probabilistic model of replicate data, the normal, lognormal, gamma, Weibull, and generalized exponential distributions have been explored. Goodness-of-fit tests, including the chi-squared, Anderson-Darling, and Kolmogorov-Smirnov tests, have been used in order to analyze the effectiveness of each of these probabilistic models in describing the variation of parameters between replicates of an experimental test. An example using Young's modulus data for a Kevlar-49 Swath stress-strain test was used in order to demonstrate how this analysis is performed within EDP. In order to implement the distributions, numerical solutions for the gamma, beta, and hypergeometric functions were implemented, along with an arbitrary-precision library to store numbers that exceed the precision of double-precision floating-point arithmetic. To create a multivariate fit, the multilinear solution was created as the simplest solution to the multivariate regression problem. This solution was then extended to solve nonlinear problems that can be linearized into multiple separable terms.
These problems were solved analytically with the closed-form solution for the multilinear regression, and then by using a QR decomposition to solve numerically while avoiding the numerical instabilities associated with matrix inversion. For nonparametric regression, or smoothing, the loess method was developed as a robust technique for filtering noise while maintaining the general structure of the data points. The loess solution was created by addressing concerns associated with simpler smoothing methods, including the running mean, running line, and kernel smoothing techniques, and combining the strengths of each of these methods to resolve those issues. The loess smoothing method involves weighting each point in a partition of the data set, and then fitting either a line or a polynomial within that partition. Both linear and quadratic methods were applied to a carbon fiber compression test, showing that the quadratic model was more accurate but the linear model had a shape that was more effective for analyzing the experimental data. Finally, the EDP program itself was explored to consider its current functionalities for processing data, as described by shear tests on carbon fiber data, and the future functionalities to be developed. The probabilistic and raw data processing capabilities were demonstrated within EDP, and the multivariate and loess analyses were demonstrated using R. As the functionality and relevant considerations for these methods have been developed, the immediate goal is to finish implementing and integrating these additional features into a version of EDP that performs a full streamlined structural analysis on experimental data.
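The QR route to the multilinear fit described above can be sketched briefly. This is an illustrative Python/NumPy sketch, not code from EDP itself (which is written in C++); the function name and synthetic data are invented for illustration.

```python
import numpy as np

def fit_multilinear_qr(X: np.ndarray, y: np.ndarray) -> np.ndarray:
    """Least-squares coefficients via QR: factor the design matrix A = QR,
    then solve R b = Q^T y, never forming the inverse of A^T A."""
    A = np.column_stack([np.ones(len(X)), X])  # prepend an intercept column
    Q, R = np.linalg.qr(A)                     # reduced QR factorization
    return np.linalg.solve(R, Q.T @ y)         # coefficients [b0, b1, ..., bk]

# Synthetic data: y = 2 + 3*x1 - x2 plus small noise
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = 2 + 3 * X[:, 0] - X[:, 1] + rng.normal(scale=0.01, size=200)
b = fit_multilinear_qr(X, y)  # recovers coefficients close to [2, 3, -1]
```

Solving R b = Qᵀy sidesteps the instability the abstract mentions: forming and inverting XᵀX squares the condition number of the design matrix, while the QR factorization works with the matrix directly.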
ContributorsMarkov, Elan Richard (Author) / Rajan, Subramaniam (Thesis director) / Khaled, Bilal (Committee member) / Chemical Engineering Program (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Ira A. Fulton School of Engineering (Contributor) / Barrett, The Honors College (Contributor)
Created2016-05
Description
The purpose of our research was to develop recommendations and strategies for Company A's data center group in the context of the server CPU chip industry. We used data collected from the International Data Corporation (IDC) that was provided by our team coaches, along with data that is accessible on the internet. As the server CPU industry expands and transitions to cloud computing, Company A's Data Center Group will need to expand its server CPU chip product mix to meet new demands of the cloud industry and to maintain high market share. Company A boasts leading performance with its x86 server chips and a 95% market segment share. The cloud industry is dominated by seven companies that Company A calls "The Super 7": Amazon, Google, Microsoft, Facebook, Alibaba, Tencent, and Baidu. In the long run, the growing market share of the Super 7 could give them substantial buying power over Company A, which could lead to discounts and margin compression for Company A's main growth engine. Additionally, the substantial growth of the Super 7 could fuel the development of their own design teams and a push toward making their own server chips internally, which would be detrimental to Company A's data center revenue. We first researched the server industry and key terminology relevant to our project. We narrowed our scope by focusing most on the cloud computing aspect of the server industry. We then researched what Company A has already been doing in the context of cloud computing and what it is currently doing to address the problem. Next, using our market analysis, we identified key areas we think Company A's data center group should focus on. Using the information available to us, we developed the strategies and recommendations that we think will help Company A's Data Center Group position itself well in an extremely fast-growing cloud computing industry.
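A market-sizing forecast of the kind described above reduces to a present-value calculation over the obtainable slice of each year's market. The sketch below is purely illustrative Python; the market sizes, obtainable share, margin, and discount rate are invented placeholders, not figures from the thesis.

```python
def discounted_gross_profit(annual_gross_profits: list, discount_rate: float) -> float:
    """Present value of a stream of forecast annual gross profits,
    discounting year t's profit by (1 + r)^t."""
    return sum(gp / (1 + discount_rate) ** t
               for t, gp in enumerate(annual_gross_profits, start=1))

# Hypothetical five-year forecast for one segment (figures in $M, all assumed)
obtainable_share = 0.10                     # assumed share of the addressable market
market_sizes = [400, 480, 576, 691, 829]    # assumed ~20% annual market growth
margin = 0.55                               # assumed gross margin
profits = [m * obtainable_share * margin for m in market_sizes]
pv = discounted_gross_profit(profits, discount_rate=0.08)
```

Ranking segments by this discounted figure, rather than by raw market size, is what lets a smaller but faster-growing segment beat a larger static one.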
ContributorsJurgenson, Alex (Co-author) / Nguyen, Duy (Co-author) / Kolder, Sean (Co-author) / Wang, Chenxi (Co-author) / Simonson, Mark (Thesis director) / Hertzel, Michael (Committee member) / Department of Finance (Contributor) / Department of Management (Contributor) / Department of Information Systems (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / School of Accountancy (Contributor) / WPC Graduate Programs (Contributor) / Barrett, The Honors College (Contributor)
Created2016-05
Description
Multiple sclerosis is a neurological disease that attacks the nerves in the central nervous system of the brain and spinal cord. The severity of multiple sclerosis varies from person to person and with the progression of the disease. Roughly 2.5 million people suffer from this disease; their lives change dramatically upon diagnosis, with no established way to ease the adjustment to a new lifestyle. With the number of people diagnosed with multiple sclerosis increasing, and with a majority of those people being diagnosed in their early 20s, there is a need for an application that will help patients manage their health. Multiple sclerosis leads to a lifestyle change, which includes various treatment options as well as routine doctor appointments. The creation of the myMS Specialist application will allow patients with multiple sclerosis to live a more comfortable lifestyle while they easily track and manage their health through their mobile devices. Our application has seven components that all play an important role in adjusting to the new everyday lifestyle of a patient with multiple sclerosis. All seven components are largely intertwined with each other to help patients recognize patterns in their diet, sleep, exercise, and the weather that cause their symptoms to worsen. Our application not only connects to a patient's doctor, so that the doctor has full access to information at all times, but also provides beneficial data to help further the understanding of multiple sclerosis. This application will be marketed and available for purchase not only to patients but also to doctors. It is our goal to lessen the burden of a new lifestyle on each patient, create constant communication with one's doctor, and provide beneficial data to researchers.
ContributorsSaenz, Devon (Co-author) / Peterson, Tyler (Co-author) / Chomina-Chavez, Aram (Thesis director) / Staats, Cody (Committee member) / W. P. Carey School of Business (Contributor) / Herberger Institute for Design and the Arts (Contributor) / School of Accountancy (Contributor) / Sandra Day O'Connor College of Law (Contributor) / Barrett, The Honors College (Contributor)
Created2016-05
Description
The foundations of legacy media, especially the news media, are not as strong as they once were. A digital revolution has changed the industry's operating models, and journalistic organizations are trying to find their place in the new market. This project is intended to analyze the effects of new and emerging technologies on the journalism industry. Five different categories of technology will be explored: the semantic web, automation software, data analysis and aggregators, virtual reality, and drone journalism. The potential of these technologies will be broken down according to four guidelines: ethical implications, effects on the reportorial process, business impacts, and changes to the consumer experience. Upon examination, it is apparent that no single technology will offer the journalism industry the remedy it has been searching for. Some combination of emerging technologies, however, may form the basis for the next generation of news. Findings are presented on a website that features video, visuals, linked content, and original graphics: http://www.explorenewstech.com/
Created2016-05
Description
Company X has developed RealSense™ technology, a depth sensing camera that provides machines the ability to capture three-dimensional spaces along with motion within these spaces. The goal of RealSense was to give machines human-like senses, such as knowing how far away objects are and perceiving the surrounding environment. The key issue for Company X is how to commercialize RealSense's depth recognition capabilities. This thesis addresses the problem by examining which markets to address and how to monetize this technology. The first part of the analysis identified potential markets for RealSense. This was achieved by evaluating current markets that could benefit from the camera's gesture recognition, 3D scanning, and depth sensing abilities. After identifying seven industries where RealSense could add value, a model of the available, addressable, and obtainable market sizes was developed for each segment. Key competitors and market dynamics were used to estimate the portion of the market that Company X could capture. These models provided a forecast of the discounted gross profits that could be earned over the next five years. These forecasted gross profits, combined with an examination of the competitive landscape and synergistic opportunities, resulted in the selection of the three segments thought to be most profitable to Company X. These segments are smart home, consumer drones, and automotive. The final part of the analysis investigated entrance strategies. Company X's competitive advantages in each space were found by examining the competition, both for the RealSense camera in general and other technologies specific to each industry. Finally, ideas about ways to monetize RealSense were developed by exploring various revenue models and channels.
ContributorsDunn, Nicole (Co-author) / Boudreau, Thomas (Co-author) / Kinzy, Chris (Co-author) / Radigan, Thomas (Co-author) / Simonson, Mark (Thesis director) / Hertzel, Michael (Committee member) / WPC Graduate Programs (Contributor) / Department of Psychology (Contributor) / Department of Finance (Contributor) / School of Accountancy (Contributor) / Department of Economics (Contributor) / School of Mathematical and Statistical Science (Contributor) / W. P. Carey School of Business (Contributor) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created2016-05
Description
Financial statements are one of the most important, if not the most important, documents for investors. These statements are prepared quarterly and yearly by the company's accounting department, and are then audited in detail by a large external accounting firm. Investors use these documents to determine the value of the company, and trust that the company was truthful in its statements and that the auditing firm correctly audited the company's financial statements for any mistakes in its books and balances. Mistakes on a company's financial statements can be costly. However, financial fraud on the statements can be outright disastrous. Penalties for accounting fraud can include individual lifetime prison sentences, as well as company fines of billions of dollars. As students in the accounting major, it is our responsibility to ensure that financial statements are accurate and truthful to protect ourselves, other stakeholders, and the companies we work for. This ethics game takes the stories of Enron, WorldCom, and Lehman Brothers and uses them to help students identify financial fraud and how it can be prevented, as well as the consequences of unethical decisions in financial reporting. The Enron scandal involved CEO Kenneth Lay and his successor Jeffrey Skilling hiding losses in their financial statements with the help of their auditing firm, Arthur Andersen. Enron collapsed in 2001; Lay was convicted of fraud but died before sentencing, and his conspirator Skilling was sentenced to 24 years in prison. In the WorldCom scandal, CEO Bernard "Bernie" Ebbers booked line costs as capital expenses (overstating WorldCom's assets), and created fraudulent accounts to inflate revenue and WorldCom's profit. Ebbers was sentenced to 25 years in prison and lost his title as WorldCom's Chief Executive Officer. Lehman Brothers took advantage of an accounting loophole known as Repo 105 that let the firm temporarily move roughly $50 billion in liabilities off its balance sheet. No one at Lehman Brothers was sentenced to jail since the transactions were technically considered legal, but Lehman was the largest investment bank to fail and the only large financial institution that was not bailed out by the U.S. government.
ContributorsPanikkar, Manoj Madhuraj (Author) / Samuelson, Melissa (Thesis director) / Ahmad, Altaf (Committee member) / Department of Information Systems (Contributor) / School of Accountancy (Contributor) / Barrett, The Honors College (Contributor)
Created2016-05
Description
The constant evolution of technology has greatly shifted the way in which we gain knowledge and information. This, in turn, has an effect on how we learn. Long gone are the days where students sit in libraries for hours flipping through numerous books to find one specific piece of information. With the advent of Google, modern-day students are able to arrive at the same information within 15 seconds. This technology, the internet, is reshaping the way we learn. As a result, the academic integrity policies that are set forth at the college level seem to be outdated, often prohibiting the use of technology as a resource for learning. The purpose of this paper is to explore why exactly these resources are prohibited. By contrasting a subject such as Computer Science with the Humanities, the paper explores the need for the internet as a resource in some fields as opposed to others. Taking a look at the knowledge presented in Computer Science, the course structure, and the role that professors play in teaching this knowledge, this thesis evaluates the epistemology of Engineering subjects. By juxtaposing Computer Science with the less technology-reliant humanities subjects, it is clear that one common policy outlining academic integrity does not suffice for an entire university. Instead, there should be amendments made to the policy specific to each subject, in order to best foster an environment of learning at the university level. In the conclusion of this thesis, Arizona State University's Academic Integrity Policy is analyzed and suggestions are made to remove ambiguity in the language of the document, in order to promote learning at the university.
ContributorsMohan, Sishir Basavapatna (Author) / Brake, Elizabeth (Thesis director) / Martin, William (Committee member) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created2016-05