Matching Items (11)

An Introduction to Fractal Geometry and its Application in the Simulation of Nature

Description

Once labeled mathematical "monsters," fractals are a compelling concept that dates back hundreds of years. The idea that a shape or set of information can be infinitely deconstructed into multiple smaller copies of itself is both confusing and brilliant. Throughout their history, however, many scientists and mathematicians have repeatedly dismissed the applicability of self-similarity. The purpose of this study is to trace the development of fractal geometry and demonstrate its widely ignored usefulness. While many students and professionals are unaware of this alternative system for describing natural processes and shapes, several disciplines can benefit from applying fractal geometry to their work.
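
The core idea of self-similarity is easiest to see in code. Below is a minimal illustrative sketch, not taken from the thesis, of the Koch curve, one of the classic self-similar "monsters": each segment is recursively replaced by four smaller copies of itself.

```python
import math

def koch(p0, p1, depth):
    """Return the list of points of a Koch curve between p0 and p1."""
    if depth == 0:
        return [p0, p1]
    (x0, y0), (x1, y1) = p0, p1
    dx, dy = (x1 - x0) / 3.0, (y1 - y0) / 3.0
    a = (x0 + dx, y0 + dy)          # 1/3 of the way along the segment
    b = (x0 + 2 * dx, y0 + 2 * dy)  # 2/3 of the way along
    # Apex of the equilateral bump: rotate the segment third (dx, dy) by 60 degrees.
    angle = math.radians(60)
    peak = (a[0] + dx * math.cos(angle) - dy * math.sin(angle),
            a[1] + dx * math.sin(angle) + dy * math.cos(angle))
    pts = []
    for q0, q1 in [(p0, a), (a, peak), (peak, b), (b, p1)]:
        pts.extend(koch(q0, q1, depth - 1)[:-1])  # drop shared endpoints
    return pts + [p1]

print(len(koch((0.0, 0.0), (1.0, 0.0), 4)))  # 4^4 = 256 segments -> 257 points
```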

Date Created
  • 2017-05

Calculat3d: A 3D Graphing Calculator

Description

Modern curricula require students to purchase expensive handheld calculators, which has created a market with little competition or incentive for improvement. The purpose of this project was to create a competitive, free alternative for use outside the classroom by students who cannot afford, for example, a TI-82, which costs approximately $100. Calculat3d is an Android application that matches the general-purpose functionality of the TI-82, including calculations, basic statistical functions, graphing, and creating programs. Additionally, a programming language and interpreter were created so that programs can be written inside Calculat3d and used alongside calculations, expanding the functionality of the calculator. Graphing is also included in Calculat3d, but extended to three dimensions, unlike the TI calculator, which is limited to two.
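
As a hedged illustration of what embedding a calculator language involves (Calculat3d's actual language and interpreter are not shown in this abstract), here is a minimal recursive-descent evaluator for arithmetic expressions with precedence and parentheses:

```python
import operator
import re

TOKEN = re.compile(r"\d+\.?\d*|[-+*/()]")

def evaluate(expr):
    """Evaluate +, -, *, / and parentheses with standard precedence."""
    tokens = TOKEN.findall(expr)
    pos = 0

    def peek():
        return tokens[pos] if pos < len(tokens) else None

    def take():
        nonlocal pos
        tok = tokens[pos]
        pos += 1
        return tok

    def atom():
        if peek() == "(":
            take()
            val = addsub()
            take()  # consume the closing ")"
            return val
        return float(take())

    def muldiv():
        val = atom()
        while peek() in ("*", "/"):
            op = operator.mul if take() == "*" else operator.truediv
            val = op(val, atom())
        return val

    def addsub():
        val = muldiv()
        while peek() in ("+", "-"):
            op = operator.add if take() == "+" else operator.sub
            val = op(val, muldiv())
        return val

    return addsub()

print(evaluate("(2 + 3) * 4 / 2"))  # 10.0
```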

Date Created
  • 2018-05

Game Engine for 2D Fighting Games with Simple DirectMedia Layer

Description

This project is a game engine for 2D fighting games built with Simple DirectMedia Layer (SDL) and C++. The engine's goal is to model the genre's conventions for dynamically handling combat between two characters. Characters can occupy a variety of states that drive their animations while also responding to the environment based on key status flags. A playable test game built on the engine served as the subject of a user study. The test game demonstrates the engine's capabilities, and its limitations and missing features are discussed.
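
A common way to model such character states is a finite state machine that gates transitions and locks the character for the duration of an animation. The sketch below is illustrative only; the state names, frame counts, and API are hypothetical (and the engine itself is written in C++, not Python):

```python
class Character:
    # Which states may follow which, mirroring genre conventions such as
    # "you cannot act while in hitstun".
    TRANSITIONS = {
        "idle":    {"walk", "attack", "block", "hitstun"},
        "walk":    {"idle", "attack", "block", "hitstun"},
        "attack":  {"idle", "hitstun"},
        "block":   {"idle", "hitstun"},
        "hitstun": {"idle"},
    }

    def __init__(self):
        self.state = "idle"
        self.frames_left = 0  # frames remaining before the state may change

    def try_set_state(self, new_state):
        if self.frames_left == 0 and new_state in self.TRANSITIONS[self.state]:
            self.state = new_state
            # Committed states lock the character for a fixed animation length.
            self.frames_left = {"attack": 12, "hitstun": 20}.get(new_state, 0)
            return True
        return False

    def tick(self):
        """Advance one frame; return to idle when the lock expires."""
        if self.frames_left > 0:
            self.frames_left -= 1
            if self.frames_left == 0:
                self.state = "idle"

p = Character()
p.try_set_state("attack")
print(p.state)                   # attack
print(p.try_set_state("walk"))   # False: locked until the attack animation ends
```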

Date Created
  • 2018-05

Procedural Content Generation Using Noise

Description

Procedural content generation refers to the algorithmic creation of data using controlled randomness. These algorithms can generate complex environments and geological formations as an alternative to building environments manually, using photogrammetry, or by other means. Geological formations and the surrounding terrain can be created using noise-based algorithms such as Perlin noise. However, interpreting noise in this manner poses a number of challenges due to its pseudo-random nature. We discuss how to generate noise, how to render it, and the challenges in interpreting it.
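
To make the pipeline concrete, here is a minimal sketch of lattice noise in the spirit of Perlin noise: hash integer lattice points to pseudo-random values, smoothly interpolate between them, and sum octaves into a terrain-like heightfield. Perlin noise proper interpolates gradients rather than values, and the hash constants here are illustrative:

```python
import math

def hash01(ix, iy, seed=0):
    """Deterministic pseudo-random value in [0, 1) for an integer lattice point."""
    h = (ix * 374761393 + iy * 668265263 + seed * 144664) & 0xFFFFFFFF
    h = ((h ^ (h >> 13)) * 1274126177) & 0xFFFFFFFF
    return (h ^ (h >> 16)) / 2**32

def smoothstep(t):
    return t * t * (3 - 2 * t)  # eases interpolation so lattice lines vanish

def value_noise(x, y, seed=0):
    ix, iy = math.floor(x), math.floor(y)
    fx, fy = x - ix, y - iy
    sx, sy = smoothstep(fx), smoothstep(fy)
    # Bilinear interpolation of the four surrounding lattice values.
    top = hash01(ix, iy, seed) * (1 - sx) + hash01(ix + 1, iy, seed) * sx
    bot = hash01(ix, iy + 1, seed) * (1 - sx) + hash01(ix + 1, iy + 1, seed) * sx
    return top * (1 - sy) + bot * sy

def fbm(x, y, octaves=4):
    """Fractal Brownian motion: sum octaves at doubling frequency, halving amplitude."""
    total = sum(value_noise(x * 2**o, y * 2**o, seed=o) / 2**o for o in range(octaves))
    return total / sum(1 / 2**o for o in range(octaves))

print(round(fbm(3.7, 1.2), 4))  # a repeatable heightfield sample in [0, 1)
```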

Date Created
  • 2021-05

ConstrictR and ConstrictPy: R Package and Python Tool for Microbiome Analysis

Description

I, Christopher Negrich, am the sole author of this paper, but the tools described were designed in collaboration with Andrew Hoetker. ConstrictR (constrictor) and ConstrictPy are an R package and a Python tool designed together. ConstrictPy implements the functions and methods defined in ConstrictR and adds data handling, data parsing, input/output (I/O), and a user interface to increase usability. ConstrictR implements a variety of common data analysis methods used for statistical and subnetwork analysis. The majority of these methods are inspired by Lionel Guidi's 2016 paper, "Plankton networks driving carbon export in the oligotrophic ocean." Additional methods were added to expand functionality, usability, and applicability to different areas of data science.

Both ConstrictR and ConstrictPy are publicly available and usable; however, both are ongoing projects. ConstrictR is available at github.com/cnegrich and ConstrictPy at github.com/ahoetker. Currently, ConstrictR implements functions for descriptive statistics, correlation, covariance, rank, sparsity, and weighted correlation network analysis, with clustering, centrality, profiling, error handling, and data parsing methods to be released soon. ConstrictPy fully implements and integrates the features of ConstrictR, along with functions for I/O and conversion between pandas and R data frames; a full-featured user interface is to be released soon.

Both tools are designed with minimal dependencies and maximal documentation of the algorithms implemented: ConstrictR depends only on base R (v3.4.4) functions, with no libraries imported, and ConstrictPy depends only on pandas, rpy2, and ConstrictR. This was done to increase the longevity and independence of the tools. Additionally, all mathematical information is documented alongside the code, increasing the available information on how the tools function. Although neither tool is in its final version, this paper documents the code, mathematics, and instructions for use, along with plans for future work, for the current versions of ConstrictR (v0.0.1) and ConstrictPy (v0.0.1).
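
As a hedged sketch of the kind of step weighted correlation network analysis involves (the actual ConstrictR/ConstrictPy function names and signatures are not reproduced here), the following builds a thresholded correlation adjacency matrix with pandas:

```python
import numpy as np
import pandas as pd

def correlation_network(df, method="spearman", threshold=0.7):
    """Return an adjacency matrix keeping only strong pairwise correlations."""
    corr_abs = df.corr(method=method).abs()       # variables are columns
    adj = corr_abs.where(corr_abs >= threshold, 0.0)
    for v in adj.columns:
        adj.loc[v, v] = 0.0                       # drop self-correlations
    return adj

# Synthetic data: columns A-D are independent; E is strongly tied to A.
rng = np.random.default_rng(0)
data = pd.DataFrame(rng.normal(size=(50, 4)), columns=list("ABCD"))
data["E"] = data["A"] * 0.9 + rng.normal(scale=0.1, size=50)
print(correlation_network(data).round(2))  # only the A-E edge should survive
```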

Date Created
  • 2018-05

Automating fixture setups based on point cloud data & CAD model

Description

Metal castings are selectively machined based on dimensional control requirements. To ensure that all finished surfaces are fully machined, each as-cast part must be measured and then adjusted optimally in its fixture. This thesis addresses two parts of that process: data translation and feature-fitting of point clouds measured on each cast part.

First, a CAD model of the finished part must be communicated to the machine shop for performing the various machining operations on the casting. The data flow must include GD&T specifications along with any special notes for the machinist. Current data exchange among digital applications is limited to the translation of CAD geometry alone, via STEP AP203. Therefore, an algorithm was developed to read, store, and translate the data from a CAD file (for example, SolidWorks or CREO) to a standard, machine-readable format (the ACIS format, *.sat).

Second, the geometry of cast parts varies from piece to piece, so the fixture set-up parameters for each part must be adjusted individually. To determine these adjustments predictively, the datum surfaces and to-be-machined surfaces are scanned individually and the point clouds reduced to feature fits. The scanned data are stored as separate point cloud files. The labels associated with the datum and to-be-machined (TBM) features are extracted from the *.sat file and matched against the point cloud file names to identify the data for each feature. The point cloud data and the CAD model are then used to fit the appropriate features (features at maximum material condition (MMC) for datums and at least material condition (LMC) for TBM features) using the existing normative feature fitting (nFF) algorithm. Once feature fitting is complete, a global datum reference frame (GDRF) is constructed based on the locating method that will be used to machine the part; the locating method is extracted from a fixture library that specifies the type of fixturing used. All entities are transformed from their local coordinate systems into the GDRF. The nominal geometry, fitted features, and GD&T information are then stored in a neutral file format called the Constraint Tolerance Feature (CTF) graph. The final outputs are used to identify the locations of the critical features on each part, which in turn establish the adjustments for its setup prior to machining, in another module that is not part of this thesis.
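
One primitive underlying such feature fitting is a least-squares fit of a geometric feature to scanned points. The sketch below fits a plane via singular value decomposition; it is a simplified stand-in, not the thesis's normative feature fitting (nFF) algorithm, which fits features at material condition:

```python
import numpy as np

def fit_plane(points):
    """Fit a plane to an (N, 3) point cloud; return (centroid, unit normal)."""
    centroid = points.mean(axis=0)
    # The right singular vector with the smallest singular value of the
    # centered cloud is the direction of least variance, i.e. the plane normal.
    _, _, vt = np.linalg.svd(points - centroid)
    return centroid, vt[-1]

# Synthetic datum surface: a slightly noisy z = 0 plane.
rng = np.random.default_rng(1)
pts = np.column_stack([rng.uniform(0, 10, 500),
                       rng.uniform(0, 10, 500),
                       rng.normal(0, 0.01, 500)])
c, n = fit_plane(pts)
print(np.round(n, 3))  # approximately [0, 0, 1] (up to sign)
```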

Date Created
  • 2016

Developing a Neural Network Based Adaptive Task Selection System for an Undergraduate Level Organic Chemistry Course

Description

In the last decade, the immense growth of computational power, enhanced data storage capabilities, and the increasing popularity of online learning systems have made adaptive learning systems more widely available. Parallel to these infrastructure enhancements, more researchers have begun studying adaptive task selection systems, concluding that suggesting tasks appropriate to students' needs may increase students' learning gains.

This work built an adaptive task selection system for undergraduate organic chemistry students using a deep learning algorithm. The proposed model is based on a recurrent neural network (RNN) architecture built with Long Short-Term Memory (LSTM) cells that recommends organic chemistry practice questions to students based on their previous question selections.

For this study, educational data were collected from the Organic Chemistry Practice Environment (OPE) that is used in the Organic Chemistry course at Arizona State University. The OPE has more than three thousand questions. Each question is linked to one or more knowledge components (KCs) to enable recommendations that precisely address the knowledge that students need. Subject matter experts made the connection between questions and related KCs.

A linear model derived from students' exam results was used to identify skilled students. The neural network based recommendation system was trained on those skilled students' problem-solving attempt sequences, so that the trained system recommends the questions most likely to improve learning gains. The model was evaluated by measuring the accuracy of its predicted questions against learners' actual task selections. The proposed model accurately predicted not only the learners' actual task selections but also the correctness of their answers.
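
A minimal sketch of such a model, assuming PyTorch, is shown below; the layer sizes, vocabulary handling, and training details are illustrative rather than the thesis's actual configuration:

```python
import torch
import torch.nn as nn

NUM_QUESTIONS = 3000  # OPE has more than three thousand questions

class NextQuestionModel(nn.Module):
    """Predict a student's next question from their attempt sequence."""
    def __init__(self, embed_dim=64, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(NUM_QUESTIONS, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, NUM_QUESTIONS)

    def forward(self, question_ids):
        # question_ids: (batch, sequence) of question indices
        x = self.embed(question_ids)
        out, _ = self.lstm(x)
        # Score every question as a candidate for the next selection.
        return self.head(out[:, -1])

model = NextQuestionModel()
history = torch.randint(0, NUM_QUESTIONS, (1, 10))  # one student, 10 attempts
scores = model(history)
print(scores.topk(5).indices)  # top-5 recommended next questions (untrained)
```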

Date Created
  • 2020

Exploring the Impact of Augmented Reality on Collaborative Decision-Making in Small Teams

Description

While significant qualitative, user-study-focused research has been done on augmented reality, relatively few studies have examined multiple co-located users collaborating synchronously in augmented reality. Recognizing the need for more collaborative user studies in augmented reality and the value such studies present, a user study of collaborative decision-making in augmented reality was conducted to investigate the following research question: "Does presenting data visualizations in augmented reality influence the collaborative decision-making behaviors of a team?" The study evaluates how viewing data visualizations with augmented reality headsets impacts collaboration in small teams, compared with viewing them together on a single 2D desktop monitor as a baseline. Teams of two participants performed closed- and open-ended evaluation tasks to collaboratively analyze data visualized both in augmented reality and on a desktop monitor. Multiple means of collecting and analyzing data were employed to develop a well-rounded context for results and conclusions, including software logging of participant interactions, qualitative analysis of video recordings of participant sessions, and pre- and post-study participant questionnaires. The results indicate that augmented reality does not significantly change the quantity of team member communication but does impact the means and strategies participants use to collaborate.

Date Created
  • 2020

Automating GD&T schema for mechanical assemblies

Description

Parts are always manufactured with deviations from their nominal geometry for many reasons, such as inherent inaccuracies in machine tools and environmental conditions. It is the designer's job to devise a proper tolerance scheme that allows a manufacturer reasonable freedom for imperfections without compromising performance. It takes years of experience and strong practical knowledge of the device's function, the manufacturing process, and GD&T standards for a designer to create a good tolerance scheme. There are almost no theoretical resources to help designers in GD&T synthesis; as a result, designers often create inconsistent and incomplete tolerance schemes that lead to high assembly scrap rates.

The Auto-Tolerancing project was started in the Design Automation Lab (DAL) to investigate the degree to which tolerance synthesis can be automated. Tolerance synthesis includes tolerance schema generation (sans tolerance values) and tolerance value allocation; this thesis addresses tolerance schema generation. To develop an automated tolerance schema synthesis toolset, the to-be-toleranced features must be identified, the required tolerance types determined, a scheme for computer representation of the GD&T information developed, the sequence of control identified, and a procedure for creating datum reference frames (DRFs) devised. The first three steps define the architecture of the tolerance schema generation module, while the last two establish a basis for creating a proper tolerance scheme with the help of GD&T good-practice rules obtained from experts. The GD&T scheme recommended by this module is used by the tolerance value allocation/analysis module to complete the process of automated tolerance synthesis.

Various test cases were studied to verify the suitability of this module. The results show that the software-generated schemas are sufficient to address assemblability issues (first-order tolerancing). Since this technology is at an early stage of development, further research and case studies will help the software produce more comprehensive tolerance schemas that cover design intent (second-order tolerancing) and cost optimization (third-order tolerancing).
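
As an illustration of what a computer representation of a tolerance schema might look like (the thesis's actual representation is not reproduced here, and all names below are hypothetical), the following sketch links features to tolerance callouts and their datum reference frames:

```python
from dataclasses import dataclass, field

@dataclass
class ToleranceCallout:
    tol_type: str            # e.g. "position", "flatness", "perpendicularity"
    value: float             # tolerance zone size in mm
    datums: list = field(default_factory=list)  # ordered datum reference frame

@dataclass
class Feature:
    name: str
    kind: str                # e.g. "plane", "cylinder"
    callouts: list = field(default_factory=list)

# A minimal schema: datum plane A controls the position of a hole.
datum_a = Feature("A", "plane", [ToleranceCallout("flatness", 0.05)])
hole = Feature("hole_1", "cylinder",
               [ToleranceCallout("position", 0.2, datums=["A"])])

for f in (datum_a, hole):
    for c in f.callouts:
        frame = "|".join(c.datums) or "(none)"
        print(f"{f.name}: {c.tol_type} {c.value} mm, DRF {frame}")
```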

Date Created
  • 2016

Smooth surfaces for video game development

Description

The video game graphics pipeline has traditionally rendered scenes using a polygonal approach. Advances in modern graphics hardware now allow the rendering of parametric methods. This thesis explores various smooth-surface rendering methods that can be integrated into a video game graphics engine. Moving from the polygonal domain to parametric or smooth surfaces has its share of issues, and there is an inherent need to address the rendering bottlenecks that could hamper such a move. The game engine needs to choose an appropriate method based on the in-game characteristics of each object: characters and animated objects need more sophisticated methods, whereas static objects can use simpler techniques. Scaling the polygon count across hardware platforms is another important factor. Considerable control over tessellation levels, whether imposed by hardware limitations or by the application, is needed to render the mesh adaptively without significant loss in performance. This thesis explores several methods that help game engine developers make correct design choices by optimally balancing these trade-offs when rendering the scene with smooth surfaces. It proposes a novel technique for adaptive tessellation of triangular meshes that vastly improves speed and tessellation count. It develops an approximate method for rendering Loop subdivision surfaces on tessellation-enabled hardware. A taxonomy and evaluation of the methods are provided, and a unified rendering system that provides automatic level of detail by switching among the methods is proposed.
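
The refinement step at the heart of subdivision surfaces can be sketched briefly. The following splits each triangle into four at its edge midpoints; Loop subdivision additionally applies smoothing weights to old and new vertex positions, which are omitted here for brevity:

```python
def subdivide(vertices, triangles):
    """One refinement step: split every triangle into four at edge midpoints."""
    vertices = list(vertices)
    midpoint_cache = {}  # shared edges reuse the same midpoint vertex

    def midpoint(i, j):
        key = (min(i, j), max(i, j))
        if key not in midpoint_cache:
            vi, vj = vertices[i], vertices[j]
            vertices.append(tuple((a + b) / 2 for a, b in zip(vi, vj)))
            midpoint_cache[key] = len(vertices) - 1
        return midpoint_cache[key]

    new_triangles = []
    for a, b, c in triangles:
        ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
        new_triangles += [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]
    return vertices, new_triangles

verts = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
tris = [(0, 1, 2)]
verts, tris = subdivide(verts, tris)
print(len(verts), len(tris))  # 6 vertices, 4 triangles
```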

Date Created
  • 2011