Matching Items (43)
Description

For my Thesis Project, I worked to operationalize an algorithmic trading application called Trading Dawg. Over the year, I was able to implement several analysis models, including accuracy, performance, volume, and hyperparameter analysis. With these improvements, we are in a strong position to create valuable tools in the algorithmic trading space.

Contributors: Payne, Colton (Author) / Shakarian, Paulo (Thesis director) / Brandt, William (Committee member) / Barrett, The Honors College (Contributor) / Computer Science and Engineering Program (Contributor) / Department of Finance (Contributor)
Created: 2023-05
Description

The genre of world music and its market’s reliance on musical exoticism, othering, and the audience’s insatiable quest for musical authenticity have influenced and shaped the way artists construct and negotiate their musical representation. With the popularization of democratized music platforms such as Bandcamp, artists have greater autonomy in terms of artistic representation and musical distribution in the online world. Although the internet has in some ways disrupted the old power structures of the music industry, the old forms of world music marketing have been reinscribed into a new context. Old stereotypes and narratives of authenticity in world music have permeated the digital representation of artists and their music. Music recommendation algorithms also shape the way artists are represented in digital environments. Semantic descriptors such as social tags play a vital role in musical identification and recommendation systems implemented by streaming platforms. The use of social tags such as #worldmusic homogenizes diverse cultural sounds into a single umbrella genre. The #worldmusic tag also creates avenues for old stereotypes and narratives of authenticity to re-emerge. This re-emergence of the old tropes of world music creates less equitable recommendation and representational outcomes for musicians operating within the genre. In the age of streaming, where does world music belong? How do artists negotiate representation online? This thesis explores the dynamics of representation and the projections of “authenticity” between world music artists and record labels inside Bandcamp’s digital ecosystem. By juxtaposing the traditional framework of “world music” marketing with new and evolving methods of distribution and artistic representation, it is possible to see how digital media are reshaping but also reproducing some of the old paradigms of world music.
I also propose that a new framework needs to be established to study the impact digital streaming has on the genre of world music. This new framework, which I call “World Music 3.0,” will encompass how algorithms, tech companies, and the democratization of musical practices interact within a globalized community.
Contributors: Cureno, Eric Leonel (Author) / Fossum, Dave (Thesis advisor) / Hayes, Lauren (Committee member) / Paine, Garth (Committee member) / Arizona State University (Publisher)
Created: 2021
Description

Bad actor reporting has recently grown in popularity as an effective method for social media attacks and harassment, but many mitigation strategies have yet to be investigated. In this study, we created a simulated social media environment of 500,000 users, and let those users create and review a number of posts. We then created four different post-removal algorithms to analyze the simulation, each algorithm building on previous ones, and evaluated them based on their accuracy and effectiveness at removing malicious posts. This thesis work concludes that a trust-reward structure within user report systems is the most effective strategy for removing malicious content while minimizing the removal of genuine content. This thesis also discusses how the structure can be further enhanced to accommodate real-world data and provide a viable solution for reducing bad actor online activity as a whole.
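The trust-reward structure this abstract describes could be sketched roughly as follows. The class, update factors, and removal threshold below are illustrative assumptions for exposition, not the thesis's actual simulation code:

```python
class Reporter:
    """A user whose report carries weight proportional to their trust score."""
    def __init__(self, trust=1.0):
        self.trust = trust

def review_reports(post_is_malicious, reporters, threshold=2.0):
    """Remove a post when the trust-weighted mass of reports crosses a
    threshold, then adjust each reporter's trust once the post's true
    nature is known: correct reports are rewarded, false ones penalized."""
    weight = sum(r.trust for r in reporters)
    removed = weight >= threshold
    for r in reporters:
        if post_is_malicious:
            r.trust = min(r.trust * 1.1, 5.0)   # reward, capped
        else:
            r.trust = max(r.trust * 0.5, 0.1)   # penalize, floored
    return removed
```

The key property of such a structure is that coordinated bad-actor reporting decays its own influence over time, while reports from users with good track records gain weight.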

Contributors: Yang, Lucas (Author) / Atkinson, Robert (Thesis director) / O'Neil, Erica (Committee member) / Barrett, The Honors College (Contributor) / Computer Science and Engineering Program (Contributor)
Created: 2023-05
Description

This paper explores the inner workings of algorithms that computers may use to play Chess. First, we discuss the classical Alpha-Beta algorithm and several improvements, including Quiescence Search, Transposition Tables, and more. Next, we examine the state-of-the-art Monte Carlo Tree Search algorithm and relevant optimizations. After that, we consider a recent algorithm that transforms Alpha-Beta into a “Rollout” search, blending it with Monte Carlo Tree Search under the rollout paradigm. We then discuss our C++ Chess Engine, Homura, and explain its implementation of a hybrid algorithm combining Alpha-Beta with MCTS. Finally, we show that Homura can play master-level Chess at a strength currently exceeding that of our backtracking Alpha-Beta.
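The classical Alpha-Beta algorithm the abstract starts from can be illustrated with a minimal negamax sketch with a transposition table. This is a generic textbook version, not Homura's C++ code; a real engine would also store bound flags (exact/lower/upper) in each table entry rather than raw scores:

```python
def alpha_beta(state, depth, alpha, beta, table, children, evaluate):
    """Negamax alpha-beta with a (simplified) transposition table.
    `children(state)` yields successor states; `evaluate(state)` scores a
    position from the perspective of the side to move."""
    key = (state, depth)
    if key in table:
        return table[key]                # reuse a previously searched position
    moves = children(state)
    if depth == 0 or not moves:
        return evaluate(state)
    best = float("-inf")
    for child in moves:
        score = -alpha_beta(child, depth - 1, -beta, -alpha,
                            table, children, evaluate)
        best = max(best, score)
        alpha = max(alpha, score)
        if alpha >= beta:                # beta cutoff: opponent avoids this line
            break
    table[key] = best
    return best
```

Quiescence Search extends this by searching captures past `depth == 0` until the position is "quiet," avoiding horizon-effect misevaluations.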

Contributors: Moore, Evan (Author) / Kobayashi, Yoshihiro (Thesis director) / Kambhampati, Subbarao (Committee member) / Barrett, The Honors College (Contributor) / Computer Science and Engineering Program (Contributor)
Created: 2023-05
Description

The sudden turn to artificial intelligence has been widely supported because of the many proposed positive outcomes of using such technologies to support or replace humans. Automating tedious processes and removing potential human error is exciting for society, but some concerns must be addressed. This essay aims to understand how artificial intelligence can automate domains in ways that significantly impact underprivileged and underrepresented groups. It addresses the potentially devastating effects of algorithmic biases and AI’s contribution to perpetual economic inequality by surveying different domains, such as the justice system and the real estate industry. Without society broadly understanding the potential negative side effects on systems that matter, the rapid growth of artificial intelligence is a recipe for disaster. Everyone must become educated about AI’s current and potential implications before it is too late to stop its damaging effects.

Contributors: Terhune, Alexandra (Author) / Pofahl, Geoffrey (Thesis director) / Koretz, Lora (Committee member) / Barrett, The Honors College (Contributor) / Dean, W.P. Carey School of Business (Contributor)
Created: 2023-05
Description

This Honors thesis was written in partial fulfillment of the requirements for a Bachelor of Science in Human Systems Engineering with Honors. The project consists of a literature review that explores the uses and applications of Machine Learning and Artificial Intelligence techniques in the field of commercial aviation. After a brief introduction and an explanation of the algorithms most commonly used in aviation, it explores the applications of Machine Learning techniques for risk reduction, for improving in-flight operations, and for pilot selection, training, and assessment.
Contributors: Inderberg, Laura (Author) / Gray, Robert (Thesis director) / Demir, Mustafa (Committee member) / Barrett, The Honors College (Contributor) / Human Systems Engineering (Contributor) / Dean, W.P. Carey School of Business (Contributor)
Created: 2023-12
Description

Networks are a versatile modeling tool for the cyber and physical infrastructure that characterizes society. They can be used to describe system spatiotemporal dynamics, including the distribution of commodities, the movement of agents, and data transmission. This flexibility has resulted in the widespread use of network optimization techniques for decision-making in telecommunications, transportation, and commerce, among other systems. However, realistic network problems are typically large-scale and require integer variables to incorporate design or logical system constraints. This makes such problems hard to solve and limits their wide applicability to applied problems. This dissertation studies four large-scale optimization problems with underlying network structure in different application domains, including wireless sensor networks, wastewater monitoring, and scheduling. The problems of interest are formulated as mixed-integer optimization problems. The proposed solution approaches in this dissertation include branch-and-cut and heuristic algorithms, which are enhanced with network-based valid inequalities and network reduction techniques. The first chapter studies a relay node placement problem in wireless sensor networks, with and without the presence of transmission obstacles in the deployment region. The proposed integer linear programming approach leverages the underlying network structure to produce valid inequalities and network reduction heuristics, which are incorporated in the branch-and-bound exploration. The solution approach outperforms the equivalent nonlinear model and solves instances with up to 1000 sensors within reasonable time. The second chapter studies the continuous version of the maximum capacity (widest) path interdiction problem and introduces the first known polynomial-time algorithm to solve it, using a combination of binary search and the discrete version of Newton’s method.
The third chapter explores the service agent transport interdiction problem in autonomous vehicle systems, where an agent schedules service tasks in the presence of an adversary. This chapter proposes a single stage branch-and-cut algorithm to solve the problem, along with several enhancement techniques to improve scalability. The last chapter studies the optimal placement of sensors in a wastewater network to minimize the maximum coverage (load) of placed sensors. This chapter proposes a branch-and-cut algorithm enhanced with network reduction techniques and strengthening constraints.
Contributors: Mitra, Ankan (Author) / Sefair, Jorge A (Thesis advisor) / Mirchandani, Pitu (Committee member) / Grubesic, Anthony (Committee member) / Byeon, Geunyeong (Committee member) / Arizona State University (Publisher)
Created: 2023
Description

Water is one of the most valuable natural resources, yet it is extremely challenging to manage. According to earlier research in the field, many Water Distribution Systems (WDSs) around the world lose more than 40 percent of the clean water pumped into them to leaks before the water reaches consumers. By reducing the amount of water leaked, distribution system managers can reduce the money, resources, and energy wasted on finding and repairing leaks and on producing and pumping water, increase system reliability, and more easily satisfy the present and future needs of all consumers. However, obtaining this information preemptively and with sufficient accuracy can be complex and time-consuming. For large utilities like SRP, which move large volumes of water from various water bodies around the Phoenix area, it is even more crucial to efficiently locate and characterize leaks. In a busy city like Phoenix, it is not practical to start digging everywhere whenever a loss in pressure is reported at the destination.

Keeping this in mind, non-invasive geophysical methods deserve attention. There is considerable potential for this line of work to help in broader environmental crises as well, particularly in places where water theft is widespread and is conducted through leaks in the distribution system. Methods such as acoustic sensing and ground-penetrating radar (GPR) have shown good results, and the work done in this thesis helps establish the limits within which they can be used in the Phoenix area.

The concrete pipes used by SRP would not generate acoustic signals strong enough to be effectively picked up by a hydrophone at the opening, so GPR would be helpful in finding the initial location of the leak: water escaping from the leak wets the surrounding sand and therefore shows a clear contrast on the GPR. The frequency spectrum can then be examined around that point and compared against a location known to be leak-free.
Contributors: Srivastava, Siddhant (Author) / Lee, Taewoo (Thesis advisor) / Kwan, Beomjin (Committee member) / Kim, Jeonglae (Committee member) / Arizona State University (Publisher)
Created: 2019
Description

This thesis addresses the following fundamental maximum throughput routing problem: Given an arbitrary edge-capacitated n-node directed network and a set of k commodities, with source-destination pairs (s_i,t_i) and demands d_i> 0, admit and route the largest possible number of commodities -- i.e., the maximum throughput -- to satisfy their demands.

The main contributions of this thesis are three-fold: First, a bi-criteria approximation algorithm is presented for this all-or-nothing multicommodity flow (ANF) problem. This algorithm is the first to achieve a constant approximation of the maximum throughput with an edge capacity violation ratio that is at most logarithmic in n, with high probability. The approach used is based on a version of randomized rounding that keeps splittable flows, rather than approximating those via a non-splittable path for each commodity: This allows it to work for arbitrary directed edge-capacitated graphs, unlike most of the prior work on the ANF problem. The algorithm also works if a weighted throughput is considered, where the benefit gained by fully satisfying the demand for commodity i is determined by a given weight w_i>0. Second, a derandomization of the algorithm is presented that maintains the same approximation bounds, using novel pessimistic estimators for Bernstein's inequality. In addition, it is shown how the framework can be adapted to achieve a polylogarithmic fraction of the maximum throughput while maintaining a constant edge capacity violation, if the network capacity is large enough. Lastly, one important aspect of the randomized and derandomized algorithms is their simplicity, which lends to efficient implementations in practice. The implementations of both randomized rounding and derandomized algorithms for the ANF problem are presented and show their efficiency in practice.
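The randomized-rounding idea underlying the first contribution can be sketched in a few lines. This is a generic illustration of one rounding pass, with a hypothetical `scale` knob; the thesis's actual scheme (splittable flows, capacity-violation analysis, and the derandomization via pessimistic estimators for Bernstein's inequality) is substantially more involved:

```python
import random

def round_admissions(fractional_x, scale=0.5, rng=random.random):
    """One randomized-rounding pass: given fractional admission values
    x_i in [0, 1] from a relaxed (LP) solution, admit commodity i
    independently with probability scale * x_i.  A smaller scale lowers
    expected throughput but also lowers the chance of overloading edges."""
    return [rng() < scale * x for x in fractional_x]
```

In expectation this preserves a constant fraction of the fractional throughput, and concentration inequalities bound how badly the random choices can overload any single edge.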
Contributors: Chaturvedi, Anya (Author) / Richa, Andréa W. (Thesis advisor) / Sen, Arunabha (Committee member) / Schmid, Stefan (Committee member) / Arizona State University (Publisher)
Created: 2020
Description

We consider programmable matter as a collection of simple computational elements (or particles) that self-organize to solve system-wide problems of movement, configuration, and coordination. Here, we focus on the compression problem, in which the particle system gathers as tightly together as possible, as in a sphere or its equivalent in the presence of some underlying geometry. Within this model a configuration of particles can be represented as a unique closed self-avoiding walk on the triangular lattice. In this paper we will examine the bias parameter of a Markov chain based algorithm that solves the compression problem under the geometric amoebot model, for particle systems that begin in a connected configuration with no holes. This bias parameter, $\lambda$, determines the behavior of the algorithm. It has been shown that for $\lambda > 2+\sqrt{2}$, with all but exponentially small probability, the algorithm achieves compression. Additionally, the same algorithm can be used for expansion for small values of $\lambda$; in particular, for all $0 < \lambda < \sqrt{\tau}$, where $\lim_{n\to\infty} (p_n)^{1/n} = \tau$. This research will focus on improving approximations of the lower bound of $\tau$. Toward this end we will examine algorithmic enumeration and series analysis for self-avoiding polygons.
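Biased Markov chains of this kind typically decide each move with a Metropolis filter, so that the stationary distribution weights a configuration by $\lambda$ raised to its number of lattice edges. The sketch below shows only that generic acceptance step, with hypothetical parameter names; the amoebot model's actual move-validity rules (connectivity, hole-freeness) are omitted:

```python
import random

def metropolis_accept(num_edges_before, num_edges_after, lam, rng=random.random):
    """Accept a proposed particle move with probability min(1, lam^delta),
    where delta is the change in the number of occupied lattice edges.
    Moves that gain edges are favored when lam > 1 (compression) and
    disfavored when lam < 1 (expansion)."""
    ratio = lam ** (num_edges_after - num_edges_before)
    return rng() < min(1.0, ratio)
```

With `lam` above the compression threshold, edge-gaining moves are always accepted while edge-losing moves succeed only rarely, which drives the system toward tightly packed configurations.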
Contributors: Lough, Kevin James (Author) / Richa, Andrea (Thesis director) / Fishel, Susanna (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2019-05