Random Python Program Generator for JavaScript

Description

For a beginning computer science student to succeed in future studies, the most important task is learning to understand code. One of the greatest indicators of student success in beginning programming courses is the ability to read code and predict its output, as this shows that the student truly understands what each line of code is doing. Yet few tools available to students today focus on helping them improve their ability to read code. The goal of the random Python program generator is to give students a tool with which to practice this important skill.

The program writes randomly generated, syntactically correct Python 3 code in order to provide students with an unlimited supply of examples to study. The end goal of the project is an interactive tool where beginning programming students can click a button to generate a random code snippet, check whether their predicted output is correct, and get a line-by-line explanation of the code. The tool currently lacks a front end, but it can already write Python code that includes assignment statements, delete statements, if statements, and print statements, and it supports boolean, float, integer, and string variable types.
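As a hedged illustration of the kind of generator this abstract describes, the sketch below randomly assembles assignment, if, and print statements for a few variable types. All names and structure are assumptions made for illustration; the project's actual generator is not shown here.

```python
import random

# Hypothetical sketch of a random snippet generator: it emits a few
# assignment statements, a trivially true if statement, and a print
# statement, mirroring the statement and variable types the abstract lists.
TYPES = {
    "int": lambda: str(random.randint(0, 9)),
    "float": lambda: f"{random.uniform(0, 9):.1f}",
    "bool": lambda: random.choice(["True", "False"]),
    "str": lambda: repr(random.choice(["cat", "dog", "sun"])),
}

def random_snippet(num_vars=3):
    names = [f"var{i}" for i in range(num_vars)]
    lines = []
    for name in names:
        literal = random.choice(list(TYPES.values()))()
        lines.append(f"{name} = {literal}")
    # A simple if statement and a print so the snippet has observable output.
    lines.append(f"if {names[0]} == {names[0]}:")
    lines.append(f"    print({names[-1]})")
    return "\n".join(lines)

if __name__ == "__main__":
    snippet = random_snippet()
    print(snippet)   # show the generated code
    exec(snippet)    # executing it reveals the output the student should predict
```

A front end of the kind the abstract envisions would generate a snippet, hide the output, and reveal it only after the student submits a prediction.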

Contributors

Created

Date Created
2019-05

Detecting Propaganda Bots on Twitter Using Machine Learning

Description

Propaganda bots are malicious bots on Twitter that spread divisive opinions and support political accounts. This project is about detecting propaganda bots on Twitter using machine learning. Once I began to observe patterns among propaganda followers on Twitter, I determined that I could train algorithms to detect these bots. The paper focuses on my process of training classifiers and using them to create a user-facing server that performs prediction functions automatically. The project's learning goals were detailed, and their focus was to learn some form of machine learning architecture. I needed to learn aspects of large-scale data handling and how to maintain these datasets for training use, and I needed to develop a server that would execute these functionalities on command. I wanted to design a full-stack system that let me build every aspect of a user-facing server that executes predictions using the classifiers I design.
Throughout this project, I set a number of learning goals that would define its success. I needed to learn the supporting libraries that would help me design this system. I also learned to use the Twitter API and to build the infrastructure behind it that allowed me to collect large amounts of data for machine learning. I needed to become familiar with common machine learning libraries in Python in order to create the algorithms and pipelines that make predictions based on Twitter data.
This paper details the steps and decisions involved in collecting this data and applying it to machine learning algorithms. I determined how to create labelled data using pre-existing Botometer ratings, along with the confidence thresholds I needed to label data for training. I used the scikit-learn library to build classifiers that best detect these bots, and I applied a number of pre-processing routines, including natural language processing and data analysis techniques, to refine the classifiers' precision. I eventually moved to remotely hosted versions of the system on Amazon Web Services instances in order to collect larger amounts of data and train more advanced classifiers. This leads to the details of my final implementation of a user-facing server, hosted on AWS and interfacing over Gmail's IMAP server.
Finally, the current and future development of this system is laid out, including more advanced classifiers, better data analysis, conversion to third-party Twitter data collection systems, and additional user features. I detail what I have learned from this exercise and what I hope to continue working on.
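As a hedged illustration of the classifier-training step described above, the sketch below fits a scikit-learn classifier on account-level features with labels derived from Botometer-style scores. The feature names, the labeling threshold, and the input file are assumptions for illustration only, not the project's actual data or pipeline.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Hypothetical account-level dataset; columns and threshold are illustrative.
df = pd.read_csv("accounts.csv")  # e.g. follower counts, tweet rate, Botometer score
df["is_bot"] = (df["botometer_score"] >= 0.8).astype(int)

X = df[["followers_count", "friends_count", "tweets_per_day"]]
y = df["is_bot"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

clf = RandomForestClassifier(n_estimators=100, random_state=42)
clf.fit(X_train, y_train)

# Precision/recall per class approximates the "classifier precision" the
# abstract refers to refining.
print(classification_report(y_test, clf.predict(X_test)))
```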

Contributors

Agent

Created

Date Created
2019-05

An Extendable Python IoT Server and Task Handler

Description

The Internet of Things (IoT) is a term used to refer to the billions of Internet-connected, embedded devices that communicate with one another to share data or perform actions. One of the core uses of this network is the ability of its devices and services to interact with one another to automate daily tasks and routines. For example, IoT devices are often used to automate tasks within the household, such as turning the lights on and off or starting the coffee pot. However, designing a modular system to create and schedule these routines is a difficult task.

Current IoT integration utilities attempt to simplify this task, but most fail to satisfy a requirement many users have for such a system: simple integration with third-party devices. This project seeks to solve this issue through the creation of an easily extendable, modular integration utility. It is open source and does not require a cloud-based server; users host the server themselves. With a server and data controller implemented in pure Python and a library for embedded ESP8266 microcontroller-powered devices, the solution seeks to satisfy both casual users and those interested in developing their own integrations.
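A minimal sketch of the kind of extendable task handler described above is shown below: integrations register named actions, and scheduled routines invoke them. The class, method names, and example action are assumptions for illustration, not the project's actual API.

```python
import time
from typing import Callable, Dict

class TaskHandler:
    """Illustrative registry/scheduler for IoT routines (not the project's API)."""

    def __init__(self):
        self._actions: Dict[str, Callable[[], None]] = {}
        self._routines = []  # each entry: [next_run_time, interval_seconds, action_name]

    def register(self, name: str, action: Callable[[], None]) -> None:
        # Third-party integrations would add their device actions here.
        self._actions[name] = action

    def schedule(self, name: str, interval_seconds: float) -> None:
        self._routines.append([time.time() + interval_seconds, interval_seconds, name])

    def run_forever(self) -> None:
        while True:
            now = time.time()
            for routine in self._routines:
                if now >= routine[0]:
                    self._actions[routine[2]]()
                    routine[0] = now + routine[1]
            time.sleep(0.1)

handler = TaskHandler()
handler.register("lights_on", lambda: print("turning lights on"))
handler.schedule("lights_on", interval_seconds=5)
# handler.run_forever()  # blocking loop; commented out for illustration
```

In a design like this, an ESP8266 device library would only need to call a registration endpoint to make its actions schedulable by the server.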

Contributors

Agent

Created

Date Created
2018-05

ConstrictR and ConstrictPy: R Package and Python Tool for Microbiome Analysis

Description

I, Christopher Negrich, am the sole author of this paper, but the tools described were designed in collaboration with Andrew Hoetker. ConstrictR (constrictor) and ConstrictPy are an R package and a Python tool designed together. ConstrictPy implements the functions and methods defined in ConstrictR and adds data handling, data parsing, input/output (I/O), and a user interface to increase usability. ConstrictR implements a variety of common data analysis methods used for statistical and subnetwork analysis. The majority of these methods are inspired by Lionel Guidi's 2016 paper, Plankton networks driving carbon export in the oligotrophic ocean. Additional methods were added to expand functionality, usability, and applicability to different areas of data science.

Both ConstrictR and ConstrictPy are currently publicly available and usable; however, both are ongoing projects. ConstrictR is available at github.com/cnegrich and ConstrictPy is available at github.com/ahoetker. Currently, ConstrictR has implemented functions for descriptive statistics, correlation, covariance, rank, sparsity, and weighted correlation network analysis, with clustering, centrality, profiling, error handling, and data parsing methods to be released soon. ConstrictPy has fully implemented and integrated the features in ConstrictR, along with functions for I/O and conversion between pandas and R data frames, with a full-featured user interface to be released soon.

Both ConstrictR and ConstrictPy are designed to work with minimal dependencies and maximum available information on the algorithms implemented. As a result, ConstrictR depends only on base R (v3.4.4) functions, with no libraries imported, and ConstrictPy depends only on pandas, Rpy2, and ConstrictR. This was done to increase the longevity and independence of these tools. Additionally, all mathematical information is documented alongside the code, increasing the available information on how these tools function. Although neither tool is in its final version, this paper documents the code, mathematics, and instructions for use, along with plans for future work, for the current versions of ConstrictR (v0.0.1) and ConstrictPy (v0.0.1).
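As a hedged illustration of the pandas-to-R bridging described above, the sketch below round-trips a pandas data frame through an R data.frame using rpy2's conversion layer. It assumes the rpy2 3.x converter API; the column names and data are made up and do not come from ConstrictPy itself.

```python
import pandas as pd
from rpy2 import robjects
from rpy2.robjects import pandas2ri
from rpy2.robjects.conversion import localconverter

# Illustrative only: convert a pandas DataFrame to an R data.frame and back,
# the kind of bridging ConstrictPy performs between its interface and ConstrictR.
df = pd.DataFrame({"taxon_a": [1.0, 2.0, 3.0], "taxon_b": [0.5, 0.1, 0.9]})

with localconverter(robjects.default_converter + pandas2ri.converter):
    r_df = robjects.conversion.py2rpy(df)          # pandas -> R data.frame
    round_trip = robjects.conversion.rpy2py(r_df)  # R data.frame -> pandas

print(round_trip)
```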

Contributors

Agent

Created

Date Created
2018-05

Sunny: Arizona State University's Chatbot for Students

Description

Next year, Arizona State University is launching a chatbot that will place the university's knowledge and services right into the palms of its students' hands. Currently named Sunny, this virtual assistant will be able to answer questions regarding all aspects of college life, from orientation to housing, financial aid, schedules, intramurals, and more. Over the last semester, I have met with members of the Sunny development team to discuss their design and implementation plans. With their information, plus a bit of outside research, I was able to combine several frameworks and technologies to build a prototype for Sunny. Prototypes allow developers to evaluate their designs early on, giving them ample time to make any necessary adjustments. I am confident that the Sunny development team will be able to learn from my basic implementation, from its triumphs and its failures, to create the best possible chatbot for the students attending Arizona State University.

Contributors

Agent

Created

Date Created
2019-12

Backend Construction of a Web Service

Description

A growing number of stylists, or cosmetologists, are finding it harder to afford basic necessities such as rent. However, the ever-increasing presence of smartphones and the growing demand for on-demand services like Uber and Uber Eats create a unique opportunity for stylists: Clippr. Clippr is an application that aims to connect individual stylists directly to their customers. The application gives stylists a platform to create and display their own prices, services, and portfolio, while customers get the benefit of finding a stylist who suits them and booking instantly. This project outlines the backend for the Clippr application. It goes over the framework, the REST API, and the various functionalities of the application, and it also covers the work still needed to launch the application successfully.
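Purely as an illustration of the kind of REST endpoints such a backend exposes, the sketch below defines create and fetch routes for stylist profiles. Flask, the route names, and the fields are assumptions; the abstract does not name Clippr's actual framework or API.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)
stylists = {}  # in-memory stand-in for a real database

@app.route("/stylists", methods=["POST"])
def create_stylist():
    # A stylist registers a profile with services and prices they set themselves.
    data = request.get_json()
    stylist_id = len(stylists) + 1
    stylists[stylist_id] = {
        "name": data.get("name"),
        "services": data.get("services", []),
        "prices": data.get("prices", {}),
    }
    return jsonify({"id": stylist_id}), 201

@app.route("/stylists/<int:stylist_id>", methods=["GET"])
def get_stylist(stylist_id):
    # Customers look up a stylist's profile before booking.
    stylist = stylists.get(stylist_id)
    if stylist is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(stylist)

if __name__ == "__main__":
    app.run(debug=True)
```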

Contributors

Agent

Created

Date Created
2017-05

A Novel Historical Safety Metric for Evaluating Road Networks

Description

37,461 automobile accident fatalities occurred in the United States in 2016 ("Quick Facts 2016", 2017). Improving the safety of roads has traditionally been approached by governmental agencies, including the National Highway Traffic Safety Administration and State Departments of Transportation. In past literature, automobile crash data is analyzed using time-series prediction techniques to identify road segments and/or intersections likely to experience future crashes (Lord & Mannering, 2010). After dangerous zones have been identified, road modifications can be implemented, improving public safety. This project introduces a historical safety metric for evaluating the relative danger of roads in a road network. The historical safety metric can be used to update the routing choices of individual drivers, improving public safety by avoiding historically more dangerous routes. The metric is constructed using crash frequency, severity, location, and traffic information. An analysis of publicly available crash and traffic data in Allegheny County, Pennsylvania is used to generate the historical safety metric for a specific road network. Methods for evaluating routes based on the presented historical safety metric are included, using the Mann-Whitney U test to evaluate the significance of routing decisions. The evaluation method presented requires that routes have at least 20 crashes to be compared with significance testing. The safety of the road network is visualized using a heatmap presenting the distribution of the metric throughout Allegheny County.
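As a hedged sketch of the route-comparison step described above, the example below applies a Mann-Whitney U test to per-crash safety-metric samples from two candidate routes. The numbers are invented for illustration and are not drawn from the Allegheny County data.

```python
from scipy.stats import mannwhitneyu

# Hypothetical per-crash metric values for two routes (at least 20 each, per
# the evaluation method's minimum sample requirement).
route_a = [2.1, 3.4, 1.8, 4.0, 2.7, 3.1, 2.9, 3.8, 2.2, 3.0,
           2.5, 3.3, 1.9, 2.8, 3.6, 2.4, 3.2, 2.6, 3.7, 2.0]
route_b = [4.5, 5.1, 3.9, 4.8, 5.3, 4.2, 4.9, 5.0, 4.4, 4.7,
           5.2, 4.1, 4.6, 5.4, 4.3, 4.0, 5.5, 4.8, 4.9, 4.2]

stat, p_value = mannwhitneyu(route_a, route_b, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("The two routes differ significantly on the historical safety metric.")
```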

Contributors

Agent

Created

Date Created
2017-12

Immersion of Choice

Description

My creative project centers on the question "How can the audience's choices influence dancers' improvisation?" This dance work seeks to redefine the relationship between audience and performers through the integration of audience, technology, and movement in real time. The topic is drawn from the fields of Computer Science and Dance: to answer my main question, I need to explore how the theory of Computer Science and the fundamentals of a web application can be interconnected with the elements of dance improvisation. This topic interests me because it combines two studies that do not seem related. When I am coding a web application, I can insert blocks of code; this parallels dance improvisation, where I have a movement vocabulary and can insert different moves based on the context. The idea of gathering data from an audience in real time also interests me. Data is most useful when a story can be deduced from it, and figuring out how to use dance to create and tell a story about the collected data intrigues me as well.

The main goals of my Creative Project are to learn the skills needed to develop a web application using the knowledge and theory I am acquiring through Computer Science, as well as the skills needed to produce a performance piece. My objective for the overall project is to create an audience-interactive experience that presents choices for dancers and creates a connection between two completely different studies: Computer Science and Dance. The audience enters answers to preset questions via an online voting application, the stage background screen shows the results as a chart of percentages, and the dancers then serve as a live interpretation of those results. This Creative Project serves as a gateway between the work cultivated in my studies and the real world.

The methods involved exploring movement qualities in improvisation, communicating with my cast about what worked best for the transitions between each section of the piece, and testing the web applications. I learned the importance of having structure within improvisational movement for the purpose of choreography; structure provides direction, clarity, and a sense of unification for the dancers. I also learned the basics of the Python programming language in order to develop the two real-time web applications. Learning Python adds to my skillset of programming languages, builds on my knowledge of Computer Science, and will let me develop more real-world applications in the future.
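As a hedged illustration of the voting mechanism described above, the sketch below tallies audience answers to a preset question and reports percentages suitable for charting on the background screen. Flask, the route names, and the question are assumptions for illustration, not the project's actual web application.

```python
from collections import Counter
from flask import Flask, jsonify, request

app = Flask(__name__)
QUESTION = "Which quality should the dancers explore next?"
CHOICES = ["sharp", "fluid", "suspended"]
votes = Counter()

@app.route("/vote", methods=["POST"])
def vote():
    # The audience submits one of the preset choices from their phones.
    choice = request.get_json().get("choice")
    if choice not in CHOICES:
        return jsonify({"error": "unknown choice"}), 400
    votes[choice] += 1
    return jsonify({"ok": True})

@app.route("/results")
def results():
    # Percentages per choice, the values a results chart would display.
    total = sum(votes.values()) or 1
    return jsonify({c: round(100 * votes[c] / total, 1) for c in CHOICES})

if __name__ == "__main__":
    app.run(debug=True)
```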

Contributors

Agent

Created

Date Created
2018-05

Raspberry Pi Radio: Programming a Multiple Source Music Player

Description

The purpose of this project was to program a Raspberry Pi to play music from both local storage on the Pi and from Internet radio stations such as Pandora. The Pi also needs to be able to play various file formats, such as MP3 and FLAC. Finally, the project is to be driven by a mobile app running on a smartphone or tablet. To achieve this, a client-server design was employed in which the Raspberry Pi acts as the server and the mobile app is the client. The server functionality was achieved with a Python script that listens on a socket and calls various executables that handle the different formats of music being played. The client functionality was achieved by programming an Android app in Java that sends encoded commands to the server, which the server decodes before playing the music the command dictates. The designs for both the client and the server are easily extensible and allow future modifications to the project to be made easily.
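A hedged sketch of the server side of such a design is shown below: a Python script listens on a socket, decodes simple text commands from the mobile client, and hands playback off to an external player via a subprocess. The command format, port, and player executable are assumptions for illustration, not the project's actual protocol.

```python
import socket
import subprocess

HOST, PORT = "0.0.0.0", 5000
player = None  # handle to whichever external player process is running

def handle(command: str):
    global player
    if command.startswith("PLAY_LOCAL "):
        path = command.split(" ", 1)[1]
        if player:
            player.terminate()
        # mpg123 stands in for whatever executable handles the file's format.
        player = subprocess.Popen(["mpg123", path])
    elif command == "STOP":
        if player:
            player.terminate()
            player = None

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
    server.bind((HOST, PORT))
    server.listen(1)
    while True:
        conn, _ = server.accept()
        with conn:
            data = conn.recv(1024)
            if data:
                handle(data.decode("utf-8").strip())
```

The Android client would simply open a TCP connection and send strings like "PLAY_LOCAL /music/song.mp3", which keeps the protocol easy to extend with new commands.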

Contributors

Agent

Created

Date Created
2015-05

Software for Agent-Based Computational Economics

Description

Agent-based modeling has been used in computer science to simulate complex phenomena. The introduction of agent-based models into the field of economics (Agent-Based Computational Economics, or ACE) is not new; however, making model environments simpler to design for individuals without a background in computer science or computer engineering is a constantly evolving topic. The issue is a trade-off between how much is handled by the framework and how much control the modeler retains, as well as what tools exist to allow the user to develop insights from the behavior of the model. The solutions examined in this thesis are the construction of a simplified grammar for model construction, the design of an economics-based library to assist in ACE modeling, and examples of how to construct interactive models.
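For readers unfamiliar with ACE, the sketch below is a minimal agent-based economic model: agents hold wealth and trade one unit with a random partner each step. It is an illustrative assumption, not the thesis's grammar or library, and is written in Python purely for concreteness.

```python
import random

class Agent:
    """A trivially simple economic agent holding a wealth balance."""
    def __init__(self, wealth=10):
        self.wealth = wealth

def step(agents):
    # Each agent with positive wealth gives one unit to a random partner.
    for agent in agents:
        if agent.wealth > 0:
            partner = random.choice(agents)
            agent.wealth -= 1
            partner.wealth += 1

agents = [Agent() for _ in range(100)]
for _ in range(1000):
    step(agents)

wealths = sorted(a.wealth for a in agents)
print("poorest:", wealths[0], "richest:", wealths[-1])  # emergent inequality
```

Even this tiny model shows the design tension the abstract names: the loop and agent class are fully under the modeler's control, while a framework would hide them behind higher-level constructs.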

Contributors

Agent

Created

Date Created
2016-05