Description
E-commerce has rapidly become a mainstay in today's economy, and many websites have built themselves around providing a platform for independent sellers. Sites such as Etsy, Storenvy, Redbubble, and Society6 are increasingly popular options for anyone looking to open their own online store. With this project, I attempted to examine the effects of four different marketing techniques on sales in an online store. I opened a shop on Etsy and tracked sales in connection with promotion through social media, selling products in-person at a convention, holding a holiday tie-in sale, and using price anchoring. Social media accounts were opened on Facebook, Tumblr, and Instagram to promote the shop over the course of the project period, and Etsy's web analytics were used to track which sites directed the most traffic to the shop. I attended a convention in mid-January 2016 where I sold my products and distributed business cards with a discount code to track sales resulting from being at the convention. A holiday sale was held in conjunction with Valentine's Day to look at whether holidays influenced purchases. Lastly, a significantly more expensive product was temporarily put in the shop to see whether it produced a price anchoring effect (that is, encouraged sales of the less expensive products by making them seem affordable in comparison). While the volume of sales data was too small to draw statistically significant conclusions, the project was a highly instructive experience in the process of opening a small online store. The decision-making steps outlined may be helpful to other students looking to open their own online shop.
Contributors: Chen, Candice Elizabeth (Author) / Moore, James (Thesis director) / Sanford, Adriana (Committee member) / Harrington Bioengineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
Company X has developed RealSense™ technology, a depth sensing camera that provides machines the ability to capture three-dimensional spaces along with motion within these spaces. The goal of RealSense was to give machines human-like senses, such as knowing how far away objects are and perceiving the surrounding environment. The key issue for Company X is how to commercialize RealSense's depth recognition capabilities. This thesis addresses the problem by examining which markets to address and how to monetize this technology. The first part of the analysis identified potential markets for RealSense. This was achieved by evaluating current markets that could benefit from the camera's gesture recognition, 3D scanning, and depth sensing abilities. After identifying seven industries where RealSense could add value, a model of the available, addressable, and obtainable market sizes was developed for each segment. Key competitors and market dynamics were used to estimate the portion of the market that Company X could capture. These models provided a forecast of the discounted gross profits that could be earned over the next five years. These forecasted gross profits, combined with an examination of the competitive landscape and synergistic opportunities, resulted in the selection of the three segments thought to be most profitable to Company X. These segments are smart home, consumer drones, and automotive. The final part of the analysis investigated entrance strategies. Company X's competitive advantages in each space were found by examining the competition, both for the RealSense camera in general and other technologies specific to each industry. Finally, ideas about ways to monetize RealSense were developed by exploring various revenue models and channels.
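The market-sizing step described above (available, addressable, and obtainable market, then discounted gross profits over five years) can be illustrated with a short worked example. The C sketch below is not taken from the thesis; every figure (market size, growth rate, obtainable share, margin, and discount rate) is a placeholder assumption used only to show the shape of such a forecast.

```c
#include <stdio.h>

/* Illustrative five-year discounted gross profit forecast for one market
 * segment. Every input below is a placeholder assumption, not a figure
 * from the thesis. */
int main(void) {
    double market_size  = 500e6; /* year-1 addressable market, USD (assumed)   */
    double growth       = 0.15;  /* annual market growth rate (assumed)        */
    double share        = 0.05;  /* obtainable share of the market (assumed)   */
    double gross_margin = 0.40;  /* gross margin on captured revenue (assumed) */
    double discount     = 0.10;  /* annual discount rate (assumed)             */

    double discount_factor = 1.0;
    double total_npv = 0.0;
    for (int year = 1; year <= 5; year++) {
        discount_factor *= 1.0 + discount;  /* (1 + r)^year */
        double gross_profit = market_size * share * gross_margin;
        double discounted   = gross_profit / discount_factor;
        total_npv += discounted;
        printf("Year %d: gross profit $%.1fM, discounted $%.1fM\n",
               year, gross_profit / 1e6, discounted / 1e6);
        market_size *= 1.0 + growth;        /* grow the segment for next year */
    }
    printf("Five-year discounted gross profit: $%.1fM\n", total_npv / 1e6);
    return 0;
}
```

Summing the discounted yearly gross profits in this way gives one number per segment that can be compared across industries when choosing which markets to pursue.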
Contributors: Dunn, Nicole (Co-author) / Boudreau, Thomas (Co-author) / Kinzy, Chris (Co-author) / Radigan, Thomas (Co-author) / Simonson, Mark (Thesis director) / Hertzel, Michael (Committee member) / WPC Graduate Programs (Contributor) / Department of Psychology (Contributor) / Department of Finance (Contributor) / School of Accountancy (Contributor) / Department of Economics (Contributor) / School of Mathematical and Statistical Science (Contributor) / W. P. Carey School of Business (Contributor) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
This paper presents work that was done to create a system capable of facial expression recognition (FER) using deep convolutional neural networks (CNNs) and test multiple configurations and methods. CNNs are able to extract powerful information about an image using multiple layers of generic feature detectors. The extracted information can be used to understand the image better through recognizing different features present within the image. Deep CNNs, however, require training sets that can be larger than a million pictures in order to fine tune their feature detectors. For the case of facial expression datasets, none of these large datasets are available. Due to this limited availability of data required to train a new CNN, the idea of using naïve domain adaptation is explored. Instead of creating and using a new CNN trained specifically to extract features related to FER, a previously trained CNN originally trained for another computer vision task is used. Work for this research involved creating a system that can run a CNN, can extract feature vectors from the CNN, and can classify these extracted features. Once this system was built, different aspects of the system were tested and tuned. These aspects include the pre-trained CNN that was used, the layer from which features were extracted, normalization used on input images, and training data for the classifier. Once properly tuned, the created system returned results more accurate than previous attempts on facial expression recognition. Based on these positive results, naïve domain adaptation is shown to successfully leverage advantages of deep CNNs for facial expression recognition.
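The pipeline described above (extract feature vectors from a layer of a pre-trained CNN, then classify those vectors) can be sketched in miniature. The C example below illustrates only the final classification stage, using a simple nearest-centroid rule over tiny, made-up feature vectors and expression labels; the thesis's actual feature dimensions, classifier, and training procedure are not reproduced here and may differ.

```c
#include <stdio.h>
#include <float.h>

#define DIM     4   /* toy feature-vector length; real CNN features are far longer */
#define CLASSES 3   /* illustrative expression labels only */

/* Per-class centroids of feature vectors taken from a pre-trained CNN layer.
 * These numbers are invented for illustration. */
static const double centroid[CLASSES][DIM] = {
    {0.9, 0.1, 0.2, 0.7},   /* "happy"   */
    {0.2, 0.8, 0.6, 0.1},   /* "sad"     */
    {0.5, 0.5, 0.4, 0.4},   /* "neutral" */
};
static const char *label[CLASSES] = {"happy", "sad", "neutral"};

/* Squared Euclidean distance between two feature vectors. */
static double sq_dist(const double *a, const double *b) {
    double d = 0.0;
    for (int i = 0; i < DIM; i++) {
        double diff = a[i] - b[i];
        d += diff * diff;
    }
    return d;
}

/* Assign an extracted feature vector to the nearest class centroid. */
static int classify(const double *features) {
    int best = 0;
    double best_d = DBL_MAX;
    for (int c = 0; c < CLASSES; c++) {
        double d = sq_dist(features, centroid[c]);
        if (d < best_d) { best_d = d; best = c; }
    }
    return best;
}

int main(void) {
    /* A feature vector as it might come out of the chosen CNN layer. */
    double sample[DIM] = {0.85, 0.15, 0.25, 0.65};
    printf("Predicted expression: %s\n", label[classify(sample)]);
    return 0;
}
```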
Contributors: Eusebio, Jose Miguel Ang (Author) / Panchanathan, Sethuraman (Thesis director) / McDaniel, Troy (Committee member) / Venkateswara, Hemanth (Committee member) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
In the medical industry, there have been promising advances in delivering new types of healthcare to the public. As of 2015, there was a 98% Premarket Approval rate, a 38% increase since 2010. In addition, 41 new novel drugs were approved for clinical usage in 2014, whereas the average from 2005 to 2013 was 25. However, the research process for creating and delivering new healthcare to the public remains remarkably inefficient. It takes, on average, 15 years and over $900 million by one estimate to discover a novel drug for clinical usage, with a success rate of less than 12%. Medical devices do not fare much better. Between 2005 and 2009, there were over 700 recalls per year. In addition, it takes at minimum 3.25 years for a 510(k) exempt premarket approval. Furthermore, a time lag exists: it takes 17 years for only 14% of medical discoveries to be implemented clinically. Coupled with these inefficiencies, government funding for medical research has been decreasing since 2002 (2.5% of Gross Domestic Product) and is predicted to fall to 1.5% of Gross Domestic Product by 2019. Translational research, defined simply as the conversion of bench-side discoveries to clinical usage, has been on the rise since the 1990s. This may be driving the increased premarket approvals and new novel drug approvals. At the very least, it is worth considering, as translational research is directly related to healthcare practices. In this paper, I propose to improve the outcomes of translational research in order to better deliver advancing healthcare to the public. I suggest that the Best Value Performance Information Procurement System (BV PIPS) should be adapted in the selection process of translational research projects to fund. BV PIPS has been shown to increase the efficiency and success rate of delivering projects and services. Over 17 years of research, with $6.3 billion of projects and services delivered, shows that BV PIPS achieves 98% customer satisfaction, minimizes management effort by 90%, and utilizes 50% less manpower and effort. Using the University of Michigan-Coulter Foundation Program's funding process as a baseline and standard for the current selection of translational research projects to fund, I offer changes to this process based on BV PIPS that may improve it. As the concepts implemented in this process are congruent with the literature on successful translational research, this new model for selecting translational research projects to fund may reduce costs, increase efficiency, and increase success. This may then lead to more Premarket Approvals, more new novel drug approvals, quicker delivery to market, and fewer recalls.
Contributors: Del Rosario, Joseph Paul (Author) / Kashiwagi, Dean (Thesis director) / Kashiwagi, Jacob (Committee member) / Harrington Bioengineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
The constant evolution of technology has greatly shifted the way in which we gain knowledge and information. This, in turn, has an effect on how we learn. Long gone are the days when students sit in libraries for hours flipping through numerous books to find one specific piece of information. With the advent of Google, modern-day students are able to arrive at the same information within 15 seconds. This technology, the internet, is reshaping the way we learn. As a result, the academic integrity policies set forth at the college level seem outdated, often prohibiting the use of technology as a resource for learning. The purpose of this paper is to explore why exactly these resources are prohibited. By contrasting a subject such as Computer Science with the Humanities, the paper explores the need for the internet as a resource in some fields as opposed to others. By examining the knowledge presented in Computer Science, the course structure, and the role that professors play in teaching this knowledge, this thesis evaluates the epistemology of Engineering subjects. By juxtaposing Computer Science with the less technology-reliant humanities subjects, it becomes clear that one common policy outlining academic integrity does not suffice for an entire university. Instead, amendments specific to each subject should be made to the policy in order to best foster an environment of learning at the university level. In the conclusion of this thesis, Arizona State University's Academic Integrity Policy is analyzed and suggestions are made to remove ambiguity in the language of the document, in order to promote learning at the university.
Contributors: Mohan, Sishir Basavapatna (Author) / Brake, Elizabeth (Thesis director) / Martin, William (Committee member) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
Breast and other solid tumors exhibit high and varying degrees of intra-tumor heterogeneity, resulting in targeted therapy resistance and other challenges that make the management and treatment of these diseases rather difficult. Due to the presence of admixtures of non-neoplastic cells with polyclonal cell populations, it is difficult to define cancer genomes in patient samples. By isolating tumor cells from normal cells and enriching distinct clonal populations, clinically relevant genomic aberrations that drive disease can be identified in patients in vivo. An in-depth analysis of clonal architecture and tumor heterogeneity was performed in a stage II chemoradiation-naïve breast cancer from a sixty-five-year-old patient. DAPI-based DNA content measurements and DNA content-based flow sorting were used to isolate nuclei from distinct clonal populations of diploid and aneuploid tumor cells in surgical tumor samples. We combined DNA content-based flow cytometry and ploidy analysis with high-definition array comparative genomic hybridization (aCGH) and next-generation sequencing technologies to interrogate the genomes of multiple biopsies from the breast cancer. The detailed profiles of ploidy, copy number aberrations, and mutations were used to recreate and map the lineages present within the tumor. The clonal analysis revealed driver events for tumor progression: a heterozygous germline BRCA2 mutation converted to homozygosity within the tumor by a copy number event, and the constitutive activation of Notch and Akt signaling pathways. The highlighted approach has broad implications for the study of tumor heterogeneity by providing a uniquely high-resolution view of polyclonal tumors that can advance effective therapies and the clinical management of patients with this disease.
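As a rough illustration of how copy number aberrations are called from aCGH data of the kind used above, the C sketch below computes per-probe log2 tumor-to-reference ratios and flags gains and losses. The probe names, intensities, and the ±0.3 cutoff are invented assumptions for illustration, not values or thresholds from the study.

```c
#include <stdio.h>
#include <math.h>

/* Minimal aCGH-style copy-number calling: each probe has a tumor and a
 * reference intensity; a log2(tumor/reference) ratio above or below a
 * threshold is flagged as a gain or loss. All numbers are illustrative. */
struct probe {
    const char *locus;
    double tumor;
    double reference;
};

int main(void) {
    struct probe probes[] = {
        {"chr13:BRCA2",  410.0,  980.0}, /* apparent loss (invented numbers) */
        {"chr17:ERBB2", 2100.0,  700.0}, /* apparent gain (invented numbers) */
        {"chr2:control", 820.0,  790.0}, /* roughly balanced                 */
    };
    const double cutoff = 0.3;           /* assumed log2-ratio threshold */
    int n = (int)(sizeof probes / sizeof probes[0]);

    for (int i = 0; i < n; i++) {
        double log2r = log2(probes[i].tumor / probes[i].reference);
        const char *call = "neutral";
        if (log2r > cutoff)       call = "gain";
        else if (log2r < -cutoff) call = "loss";
        printf("%-14s log2 ratio %+.2f -> %s\n", probes[i].locus, log2r, call);
    }
    return 0;
}
```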
Contributors: Laughlin, Brady Scott (Author) / Ankeny, Casey (Thesis director) / Barrett, Michael (Committee member) / Barrett, The Honors College (Contributor) / Harrington Bioengineering Program (Contributor) / School for the Science of Health Care Delivery (Contributor)
Created: 2015-05
Description
The purpose of this project was to construct and write code for a vehicle to take advantage of the benefits of combining stepper motors with mecanum wheels. This process involved building the physical vehicle, designing a custom PCB for the vehicle, writing code for the onboard microprocessor, and implementing motor control algorithms.
Contributors: Davis, Severin Jan (Author) / Burger, Kevin (Thesis director) / Vannoni, Greg (Committee member) / Barrett, The Honors College (Contributor) / School of International Letters and Cultures (Contributor) / Computer Science and Engineering Program (Contributor)
Created: 2015-05
Description
This project was centered on designing a processor model (using the C programming language) based on the ColdFire computer architecture that runs on third-party software known as Open Virtual Platforms (OVP). The end goal is to have a fully functional processor that can run ColdFire instructions and utilize peripheral devices in the same way as the hardware used in the embedded systems lab at ASU. This would cut down on the substantial amount of time students spend commuting to the lab. Having the processor directly at their disposal would also encourage them to spend more time outside of class learning the hardware and familiarizing themselves with development on an embedded microcontroller. The model is intended to be accurate, fast, and reliable. These aspects were achieved through rigorous unit testing and use of the OVP platform, which provides instruction-accurate simulations at hundreds of MIPS (millions of instructions per second) for the specified model. The end product was able to accurately simulate a subset of the ColdFire instructions at very high rates.
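An instruction-accurate processor model of this kind is built around a fetch-decode-execute loop. The C sketch below shows that structure for a deliberately tiny, invented two-byte encoding; it is not the ColdFire instruction set, and the real model runs inside the OVP simulation framework rather than as a standalone program.

```c
#include <stdio.h>
#include <stdint.h>

/* Greatly simplified fetch-decode-execute loop in the spirit of an
 * instruction-accurate processor model. The toy encoding (opcode byte,
 * then a dest/src or dest/immediate nibble pair) is invented for
 * illustration and is not the ColdFire encoding. */
enum { OP_HALT = 0x0, OP_MOVEI = 0x1, OP_ADD = 0x2 };

int main(void) {
    uint8_t program[] = {
        OP_MOVEI, 0x05,   /* D0 <- 5        */
        OP_MOVEI, 0x17,   /* D1 <- 7        */
        OP_ADD,   0x01,   /* D0 <- D0 + D1  */
        OP_HALT,  0x00,
    };
    uint32_t d[4] = {0};  /* data registers D0..D3 */
    unsigned pc = 0;      /* program counter       */

    for (;;) {
        uint8_t opcode  = program[pc];        /* fetch  */
        uint8_t operand = program[pc + 1];
        uint8_t dst = operand >> 4;           /* decode */
        uint8_t src = operand & 0x0F;
        pc += 2;

        if (opcode == OP_HALT) break;         /* execute */
        else if (opcode == OP_MOVEI) d[dst] = src;      /* load immediate */
        else if (opcode == OP_ADD)   d[dst] += d[src];  /* register add   */
    }
    printf("D0=%u D1=%u\n", (unsigned)d[0], (unsigned)d[1]); /* expect D0=12 D1=7 */
    return 0;
}
```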
Contributors: Dunning, David Connor (Author) / Burger, Kevin (Thesis director) / Meuth, Ryan (Committee member) / Barrett, The Honors College (Contributor) / Computer Science and Engineering Program (Contributor)
Created: 2014-12
Description
My main goal for my thesis builds on the research I started in the summer of 2010 regarding the creation of a continuous-time TBI sensor. Specific goals include characterizing the proteins in the sensing targets while immobilized, while free in solution, and while in free solution in the blood.
Contributors: Haselwood, Brittney (Author) / LaBelle, Jeffrey (Thesis director) / Pizziconi, Vincent (Committee member) / Cook, Curtiss (Committee member) / Harrington Bioengineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2011-12
Description
This paper presents the design and evaluation of a haptic interface for augmenting human-human interpersonal interactions by delivering the facial expressions of an interaction partner to an individual who is blind, using a visual-to-tactile mapping of facial action units and emotions. Pancake shaftless vibration motors are mounted on the back of a chair to provide vibrotactile stimulation in the context of a dyadic (one-on-one) interaction across a table. This work explores the design of spatiotemporal vibration patterns that can be used to convey the basic building blocks of facial movements according to the Facial Action Coding System. A behavioral study was conducted to explore the factors that influence the naturalness of conveying affect using vibrotactile cues.
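One way such spatiotemporal vibration patterns could be represented is as an ordered list of frames, each driving one motor in the back-mounted grid at a given intensity for a given duration. The C sketch below uses an assumed 3x3 motor layout and an invented upward-sweep pattern, and prints the frames instead of commanding real PWM hardware; the actual system's layout and patterns may differ.

```c
#include <stdio.h>

/* Sketch of a spatiotemporal vibrotactile pattern: an ordered list of
 * frames, each activating one motor in a back-mounted grid at a given
 * intensity for a given duration. Layout, pattern, and values are
 * illustrative assumptions; a real driver would set PWM duty cycles. */
struct frame {
    int row, col;      /* motor position in the grid   */
    int intensity;     /* 0-255 drive level            */
    int duration_ms;   /* how long this motor stays on */
};

static void play_pattern(const struct frame *pattern, int n_frames) {
    for (int i = 0; i < n_frames; i++) {
        const struct frame *f = &pattern[i];
        /* In hardware this would set a PWM duty cycle and then wait. */
        printf("motor (%d,%d): intensity %3d for %d ms\n",
               f->row, f->col, f->intensity, f->duration_ms);
    }
}

int main(void) {
    /* Upward sweep along the middle column of an assumed 3x3 motor grid. */
    struct frame sweep_up[] = {
        {2, 1, 180, 120},
        {1, 1, 200, 120},
        {0, 1, 220, 160},
    };
    play_pattern(sweep_up, 3);
    return 0;
}
```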
Contributors: Bala, Shantanu (Author) / Panchanathan, Sethuraman (Thesis director) / McDaniel, Troy (Committee member) / Barrett, The Honors College (Contributor) / Computer Science and Engineering Program (Contributor) / Department of Psychology (Contributor)
Created: 2014-05