Matching Items (1,156)

Description
The advertising agency, in its variety of forms, is one of the most powerful forces in the modern world. Its products are seen globally through various multimedia outlets, and they strongly impact culture and the economy. Since its conception in 1843 by Volney Palmer, the advertising agency has evolved into the recognizable—and unrecognizable—firms scattered around the world today. In the United States alone, there are roughly 13,400 agencies, many of which also have branches in other countries. The evolution of the modern advertising agency coincided with, and even preceded, some of the major inflection points in history. Understanding how and why changes in advertising agencies affected these inflection points offers insight into the relationship between advertising, business, and societal values.

In the pages ahead, we will explore the future of the advertising industry. We will analyze our research to uncover the underlying trends pointing toward what is to come and apply those findings to our understanding of advertising in the future.
Contributors: Harris, Chase (Co-author) / Potthoff, Zachary (Co-author) / Gray, Nancy (Thesis director) / Samper, Adriana (Committee member) / Department of Information Systems (Contributor) / Department of Marketing (Contributor) / Herberger Institute for Design and the Arts (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-05
Description
The current trend of interconnected devices, or the Internet of Things (IoT), has led to the popularization of single-board computers (SBCs), primarily because of their small form factor and low price. This has produced unique networks of devices that can have unstable network connections and minimal processing power. Many parallel programming libraries are intended for use in high-performance computing (HPC) clusters, which, unlike the IoT environment described, generally rely on very consistent network speeds and topologies. A significant number of software choices make up what is referred to as the HPC stack, or parallel processing stack. My thesis focused on building an HPC stack that would run on the SBC known as the Raspberry Pi. The intention in making this Raspberry Pi cluster is to research the performance of MPI implementations in an IoT environment, which influenced the design choices of the cluster. This thesis is a compilation of my research efforts in creating this cluster as well as an evaluation of the software that was chosen to create the parallel processing stack.
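As a rough illustration of the kind of smoke test such a cluster's MPI stack might run (the mpi4py bindings and the test itself are assumptions for illustration, not the specific software evaluated in the thesis):

```python
# Hypothetical MPI "hello" check for a Raspberry Pi cluster (illustrative only).
from mpi4py import MPI

comm = MPI.COMM_WORLD              # communicator spanning every launched process
rank = comm.Get_rank()             # this process's index within the communicator
size = comm.Get_size()             # total number of processes across the cluster

# Each node reports its hostname; rank 0 gathers them to confirm the stack
# actually spans the nodes listed in the hostfile.
hostname = MPI.Get_processor_name()
names = comm.gather(hostname, root=0)
if rank == 0:
    print(f"{size} MPI processes responded from: {sorted(set(names))}")
```

A test like this would typically be launched with something like `mpiexec -n 4 --hostfile hosts python3 mpi_hello.py`, where `hosts` lists the cluster nodes (both file names are placeholders).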
Contributors: O'Meara, Braedon Richard (Author) / Meuth, Ryan (Thesis director) / Dasgupta, Partha (Committee member) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-05
Description
Breast microcalcifications are a potential indicator of cancerous tumors. Current visualization methods are either uncomfortable or impractical. Impedance measurement studies have been performed, but not in a clinical setting, due to low sensitivity and specificity. We hope to overcome this challenge with the development of a highly accurate impedance probe on a biopsy needle. With this technique, microcalcifications and the surrounding tissue could be differentiated more efficiently and comfortably than with current biopsy procedures. We have developed and tested a functioning prototype for a biopsy needle that uses bioimpedance sensors to detect microcalcifications in the human body. In the final prototype, a waveform generator sends a sine wave at a relatively low frequency (<1 kHz) into the pre-amplifier, which both stabilizes and amplifies the signal. A modified Howland bridge is then used to drive a steady AC current through the electrodes. The voltage difference across the electrodes is then used to calculate the impedance between them. In our testing, the microcalcifications we are looking for have a noticeably higher impedance than the surrounding breast tissue; this spike in impedance signals the presence of the calcifications, which are then sampled for examination by radiology.
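As a simple illustration of the measurement principle described above (the actual current and voltage levels used in the prototype are not stated here, so the example values are hypothetical): with the Howland stage forcing a known constant current through the electrodes, the tissue impedance follows directly from Ohm's law.

```latex
% Impedance from the measured electrode voltage and the known source current.
Z = \frac{V_{\text{electrodes}}}{I_{\text{source}}},
\qquad \text{e.g. } V = 10\,\text{mV},\ I = 100\,\mu\text{A}
\;\Rightarrow\; Z = \frac{10 \times 10^{-3}\,\text{V}}{100 \times 10^{-6}\,\text{A}} = 100\,\Omega .
```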
Contributors: Wen, Robert Bobby (Co-author) / Grula, Adam (Co-author) / Vergara, Marvin (Co-author) / Ramkumar, Shreya (Co-author) / Kozicki, Michael (Thesis director) / Ranjani, Kumaran (Committee member) / School of Molecular Sciences (Contributor) / Electrical Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-05
Description
The town of Guadalupe, Arizona, has a long history of divided residents and high poverty rates. The high levels of poverty in the town can be attributed to numerous factors, most notably high rates of drug abuse, low high school graduation rates, and teen pregnancy. The town has identified youth disengagement as one of its most pressing issues today. There are currently a handful of residents and community members passionate about finding a solution to this issue. After working with Guadalupe's Ending Hunger Task Force and resident youth, I set out to create a program design for a Guadalupe Youth Council. This council will contribute to combating youth disengagement. The program design will assist the Task Force in creating a standing youth council and deciding on the structure and role the council will have in the town. I offer learning outcomes and suggestions to the Task Force, youth council staff, and the youth of the youth council. This study contains an analysis of relevant literature, youth focus group results and data, and an account of how the information gathered has contributed to the design of the youth council. The results of this study contain recommendations about four themes within the program design of a youth council: size, recruitment, activities and engagement, and adult support. The results also explore how the youth council will impact the power, policy, and behavior of Guadalupe youth.
Contributors: Balderas, Erica Theresa (Author) / Wang, Lili (Thesis director) / Avalos, Francisco (Committee member) / School of Community Resources and Development (Contributor) / Department of Information Systems (Contributor) / W.P. Carey School of Business (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-05
Description
This thesis discusses three recent optimization problems that seek to reduce disease spread on arbitrary graphs by deleting edges, and it discusses three approximation algorithms developed for these problems. Important definitions are presented, including the Linear Threshold and Triggering Set models and the set function properties of submodularity and monotonicity. Also, important results regarding the Linear Threshold model and computation of the influence function are presented along with proof sketches. The three main problems are formally presented, and NP-hardness results along with proof sketches are presented where applicable. The first problem seeks to reduce the spread of infection over the Linear Threshold process by making use of an efficient tree data structure. The second problem seeks to reduce the spread of infection over the Linear Threshold process while preserving the PageRank distribution of the input graph. The third problem seeks to minimize the spectral radius of the input graph. The algorithms designed for these problems are described in writing and with pseudocode, and their approximation bounds are stated along with time complexities. The discussion of these algorithms considers how they could see real-world use. Challenges and the ways in which these algorithms do or do not overcome them are noted. Two related works, one which presents an edge-deletion disease spread reduction problem over a deterministic threshold process and the other which considers a graph modification problem aimed at minimizing worst-case disease spread, are compared with the three main works to provide interesting perspectives. Furthermore, a new problem is proposed that could avoid some issues faced by the three main problems described, and directions for future work are suggested.
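For reference, a standard statement of the Linear Threshold activation rule mentioned above (notation follows the common formulation; the thesis's own notation may differ):

```latex
% Each node v has nonnegative incoming edge weights b_{u,v} with
% \sum_{u \in N(v)} b_{u,v} \le 1, and a threshold \theta_v drawn uniformly from [0,1].
% An inactive node v becomes active in the next step once its currently active
% neighbors A_t carry enough total weight:
\sum_{u \in A_t \cap N(v)} b_{u,v} \;\ge\; \theta_v .
```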
Contributors: Stanton, Andrew Warren (Author) / Richa, Andrea (Thesis director) / Czygrinow, Andrzej (Committee member) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2018-05
Description
In the last few years, billion-dollar companies like Yahoo and Equifax have had data breaches causing millions of people’s personal information to be leaked online. Other billion-dollar companies like Google and Facebook have gotten in trouble for abusing people’s personal information for financial gain as well. In this new age of technology, where everything is being digitized and stored online, people all over the world are concerned about what is happening to their personal information and how they can trust it is being kept safe. This paper describes, first, the importance of protecting user data; second, one easy tool that companies and developers can use to help ensure that their users’ information (credit card information specifically) is kept safe, and how to implement that tool; and finally, the future work and research that needs to be done. The solution I propose is a software tool that keeps credit card data secured. It is only a small step towards achieving a completely secure, anonymized data system, but when implemented correctly, it can reduce the risk of credit card data being exposed to the public. The software tool is a script that can scan every viable file in any given system, server, or other file-structured Linux system and detect whether there are any visible credit card numbers that should be hidden.
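A minimal sketch of the kind of scan described above (this is not the thesis's actual script; the file filtering, reporting behavior, and pattern below are assumptions): it walks a directory tree, looks for 13-16 digit runs, and flags those that pass the Luhn checksum.

```python
# Hypothetical credit-card-number scanner (illustrative sketch, not the thesis tool).
import os
import re
import sys

CANDIDATE = re.compile(r"\b(?:\d[ -]?){13,16}\b")  # loose 13-16 digit pattern

def luhn_ok(digits: str) -> bool:
    """Return True if the digit string passes the Luhn checksum."""
    total, parity = 0, len(digits) % 2
    for i, ch in enumerate(digits):
        d = int(ch)
        if i % 2 == parity:     # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def scan(root: str) -> None:
    """Walk the tree under `root` and report lines containing likely card numbers."""
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, "r", errors="ignore") as fh:
                    for lineno, line in enumerate(fh, 1):
                        for match in CANDIDATE.finditer(line):
                            digits = re.sub(r"[ -]", "", match.group())
                            if luhn_ok(digits):
                                print(f"{path}:{lineno}: possible card number")
            except OSError:
                continue  # unreadable or special file; skip it

if __name__ == "__main__":
    scan(sys.argv[1] if len(sys.argv) > 1 else ".")
```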
Contributors: Pappas, Alexander (Author) / Zhao, Ming (Thesis director) / Kuznetsov, Eugene (Committee member) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2020-05
Description
This thesis project was conducted to create a practical tool to help micro and small local food enterprises identify potential strategies and sources of finance. Currently, many of these enterprises are unable to obtain the financial capital needed to start up or maintain operations.

The sources and strategies of finance studied, and ultimately included in the tool, were Loans, Equity, Membership, Crowdfunding, and Grants. The tool designed is a matrix that takes into account various criteria of the business (e.g., business lifecycle, organizational structure, business performance) and generates a financial plan based on these criteria and how they align with the selected business strategies. After strategies are found, stakeholders can search an institutional database, created in conjunction with the matrix tool, to find possible institutional providers of financing that relate to those strategies.

The tool has shown promise in identifying sources of finance for micro and small local food enterprises in practical use with hypothetical business cases; however, further practical use is necessary to provide additional input and revise the tool as needed. Ultimately, the tool will likely become fully user-friendly, and stakeholders will not need the assistance of an expert to use it. Finally, despite the promise of the tool itself, the fundamental and underlying problem that many of these businesses face (a lack of infrastructure and knowledge) still exists. While this tool can also support capacity-building efforts for both those seeking and those providing finance, an institutional attitude adjustment towards social and alternative enterprises is necessary in order to further simplify the process of obtaining finance.
Contributors: Dwyer, Robert Francis (Author) / Wiek, Arnim (Thesis director) / Forrest, Nigel (Committee member) / Department of Finance (Contributor) / Department of Information Systems (Contributor) / Barrett, The Honors College (Contributor)
Created: 2020-05
Description
Political polarization is the coalescence of political parties -- and the individuals of which parties are composed -- around opposing ends of the ideological spectrum. Political parties in the United States have always been divided; however, in recent years this division has only intensified. Recently, polarization has also wound its way to the Supreme Court and the nomination processes of justices to the Court. This paper examines how prevalent polarization in the Supreme Court nomination process has become by looking specifically at the failed nomination of Judge Merrick Garland and the confirmations of now-Justices Neil Gorsuch and Brett Kavanaugh. This is accomplished by comparing the ideologies and qualifications of the three most recent nominees to those of previous nominees, as well as analyzing the ideological composition of the Senate at the times of the individual nominations.
Contributors: Joss, Jacob (Author) / Hoekstra, Valerie (Thesis director) / Critchlow, Donald (Committee member) / Computer Science and Engineering Program (Contributor) / School of Politics and Global Studies (Contributor) / Barrett, The Honors College (Contributor)
Created: 2020-05
Description
The original version of Helix, the one I pitched when first deciding to make a video game for my thesis, is an action-platformer, with the intent of metroidvania-style progression and an interconnected world map.

The current version of Helix is a turn-based role-playing game, with the intent of roguelike gameplay and a dark fantasy theme. We will first explore the challenges that came with programming my own game - not quite from scratch, but also without a prebuilt engine - then transition into game design and how Helix has evolved from its original form to what we see today.
Contributors: Discipulo, Isaiah K (Author) / Meuth, Ryan (Thesis director) / Kobayashi, Yoshihiro (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2020-05
Description
Object localization is used to determine the location of a device, an important aspect of applications ranging from autonomous driving to augmented reality. Commonly used localization techniques include global positioning systems (GPS), simultaneous localization and mapping (SLAM), and positional tracking, but all of these methodologies have drawbacks, especially in high-traffic indoor or urban environments. Using recent improvements in the field of machine learning, this project proposes a new localization method that uses networks with several wireless transceivers and can be implemented without heavy computational loads or high costs. This project aims to build a proof-of-concept prototype and demonstrate that the proposed technique is feasible and accurate.

Modern communication networks heavily depend upon an estimate of the communication channel, which represents the distortions that a transmitted signal undergoes as it travels to the receiver. A channel can become quite complicated due to signal reflections, delays, and other undesirable effects and, as a result, varies significantly from one location to another. This localization system seeks to take advantage of this distinctness by feeding channel information into a machine learning algorithm, which is trained to associate channels with their respective locations. A device in need of localization would then only need to calculate a channel estimate and present it to this algorithm to obtain its location.

As an additional step, the effect of location noise is investigated in this report. After the localization system described above demonstrated promising results, the team showed that the system is robust to noise on its location labels. In doing so, the team demonstrates that this system could be implemented in a continued-learning environment, in which some user agents report their estimated (noisy) locations over a wireless communication network, so that the model can be deployed in an environment without extensive data collection prior to release.
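A hedged sketch of the channel-to-location idea described above (synthetic data and a generic scikit-learn regressor stand in for the team's actual channel estimates and model, which are not specified here):

```python
# Illustrative only: map synthetic "channel" feature vectors to 2-D positions,
# training on noisy location labels and evaluating against the clean ones.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Stand-in dataset: each row imitates a flattened channel estimate, each label
# is the (x, y) position at which that channel would have been measured.
n_samples, n_features = 2000, 64
X = rng.normal(size=(n_samples, n_features))
true_map = rng.normal(size=(n_features, 2))
y = X @ true_map + rng.normal(scale=0.1, size=(n_samples, 2))

# Corrupt the training labels to mimic noisy self-reported locations.
y_noisy = y + rng.normal(scale=0.5, size=y.shape)

idx = rng.permutation(n_samples)
train, test = idx[:1600], idx[1600:]

model = MLPRegressor(hidden_layer_sizes=(128, 64), max_iter=500, random_state=0)
model.fit(X[train], y_noisy[train])          # train on noisy labels only

pred = model.predict(X[test])
err = np.sqrt(np.sum((pred - y[test]) ** 2, axis=1)).mean()  # mean distance error
print(f"mean localization error vs. clean positions: {err:.3f} (arbitrary units)")
```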
Contributors: Chang, Roger (Co-author) / Kann, Trevor (Co-author) / Alkhateeb, Ahmed (Thesis director) / Bliss, Daniel (Committee member) / Electrical Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2020-05