Matching Items (5)

Description
Preventive maintenance is a practice that has become popular in recent years, largely due to the increased dependency on electronics and other mechanical systems in modern technologies. The main idea of preventive maintenance is to address maintenance issues before they fully appear or disrupt processes and daily operations. One of the most important parts is being able to predict failures in the system so that they can be fixed before they turn into large issues. One area where preventive maintenance is a major part of daily activity is the automotive industry. Automobile owners are encouraged to take their cars in for maintenance on a routine schedule (based on mileage or time), or when their car signals that there is an issue (low oil levels, for example). Although this level of maintenance is enough when people are in charge of cars, the rise of autonomous vehicles, specifically self-driving cars, changes that. Instead of a human looking at a car and diagnosing any issues, the car needs to be able to do this itself. The objective of this project was to create such a system. The Electronics Preventive Maintenance System (EPMS) is an internal system designed to meet these criteria and more. The EPMS is composed of a central computer that monitors all major electronic components in an autonomous vehicle through standard off-the-shelf sensors. The central computer compiles the sensor data and is able to sort and analyze the readings. The filtered data is run through several mathematical models, each of which diagnoses issues in a different part of the vehicle. The data for each component in the vehicle is compared to preset operating conditions, which are chosen to encompass the full normal range of output. If the sensor data falls outside these margins, the warning and the deviation are recorded and a severity level is calculated. In addition to the component-level models, there is also a vehicle-wide model, which predicts how urgently the vehicle needs maintenance. All of these results are analyzed by a simple heuristic algorithm, which makes a decision about the vehicle's health status and sends it to the Fleet Management System. This system allows for accurate, effortless monitoring of all parts of an autonomous vehicle, as well as predictive modeling that allows the system to determine maintenance needs. With this system, human inspectors are no longer necessary for a fleet of autonomous vehicles. Instead, the Fleet Management System is able to oversee inspections, and the system operator can set parameters that decide when to send cars for maintenance. All the models used for the sensor and component analysis are tailored specifically to the vehicle. The models and operating margins are created using empirical data collected during normal testing operations. The system is modular and can be used on a variety of vehicle platforms, including autonomous underwater vehicles and aerial vehicles.
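As a point of reference, the margin check and severity calculation described above might look something like the minimal C++ sketch below; the component names, ranges, and severity formula are illustrative assumptions, not the models from the thesis.

```cpp
#include <iostream>
#include <string>
#include <vector>

// Hypothetical operating margins for one monitored component.
struct OperatingRange {
    std::string component;
    double low;   // lowest reading considered normal
    double high;  // highest reading considered normal
};

// Severity grows with how far the reading falls outside its normal range.
// 0.0 means "within margins"; larger values mean a larger relative deviation.
double severity(const OperatingRange& range, double reading) {
    if (reading >= range.low && reading <= range.high) return 0.0;
    double deviation = (reading < range.low) ? range.low - reading
                                             : reading - range.high;
    return deviation / (range.high - range.low);  // deviation relative to the normal span
}

int main() {
    std::vector<OperatingRange> ranges = {
        {"battery_temperature_C", 10.0, 45.0},
        {"motor_current_A", 0.0, 30.0},
    };
    std::vector<double> readings = {52.0, 12.5};  // sample sensor values

    for (size_t i = 0; i < ranges.size(); ++i) {
        double s = severity(ranges[i], readings[i]);
        if (s > 0.0)
            std::cout << ranges[i].component << " out of range, severity " << s << "\n";
        else
            std::cout << ranges[i].component << " nominal\n";
    }
}
```

A vehicle-wide model and heuristic decision layer, as the abstract describes, would then combine per-component severities into a single health status.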
Contributors: Mian, Sami T. (Author) / Collofello, James (Thesis director) / Chen, Yinong (Committee member) / School of Mathematical and Statistical Sciences (Contributor) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-05
Description
To introduce students to computer science and robotics in an exciting and engaging manner, certain teaching techniques should be used. In recent years, some of the most popular approaches have been Visual Programming Languages, which are meant to introduce the problem-solving skills and basic programming constructs inherent to all modern languages by allowing users to write programs visually rather than textually. By bypassing the need to learn syntax, students can focus on the thinking behind developing an algorithm and see immediate results, which helps generate excitement for the field and reduces disinterest caused by startup complexity and burnout. The Introduction to Engineering course at Arizona State University supports this approach by teaching students the basics of autonomous maze-traversing algorithms and by using ASU VIPLE, a Visual Programming Language developed to connect with and direct real-world robots. However, some startup time is needed to learn how to interface with these robots using ASU VIPLE. That is why the HTML5 Autonomous Robot Web Simulator was created: by encouraging students to use the simulator, the problem solving behind autonomous maze-traversing algorithms can be introduced more quickly and with immediate affirmation. Our goal was to improve this simulator and add features so that it could be accessed and used for a wider variety of introductory computer science lessons. Features scattered across past implementations of robotic simulators were aggregated into a cross-platform solution. Upon initial development, a classroom test group revealed usability concerns and offered a demonstration of the students' mental models. Mean time for task completion was 8.1 minutes, compared to 2 minutes for the authors. The simulator was updated in response to test group feedback and new instructor requirements. The new implementation reduces programming overhead while maintaining a learning environment with support for even the most complex applications.
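For readers unfamiliar with the kind of algorithm students write, a right-hand wall follower is one of the simplest maze-traversing strategies; the sketch below is a generic C++ illustration under assumed sensor names and units, not the simulator's or VIPLE's actual interface.

```cpp
#include <iostream>

// Abstract distance readings a simulated robot might report (in simulator units).
struct Sensors {
    double front;  // distance to the nearest obstacle ahead
    double right;  // distance to the nearest obstacle on the right
};

enum class Move { Forward, TurnLeft, TurnRight };

// Right-hand rule: keep a wall on the right, turn right when the wall
// opens up, drive forward while the path is clear, turn left when blocked.
Move rightHandStep(const Sensors& s, double clearance = 1.0) {
    if (s.right > clearance) return Move::TurnRight;  // wall ended: follow it around
    if (s.front > clearance) return Move::Forward;    // path ahead is clear
    return Move::TurnLeft;                            // boxed in: turn away from the wall
}

int main() {
    Sensors s{0.4, 2.0};  // open space to the right, wall ahead
    switch (rightHandStep(s)) {
        case Move::Forward:   std::cout << "forward\n";    break;
        case Move::TurnLeft:  std::cout << "turn left\n";  break;
        case Move::TurnRight: std::cout << "turn right\n"; break;
    }
}
```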
Contributors: Rodewald, Spencer (Co-author) / Patel, Ankit (Co-author) / Chen, Yinong (Thesis director) / Chattin, Linda (Committee member) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2016-12
Description
Education in computer science is a difficult endeavor, with learning a new programming language being a barrier to entry, especially for college freshmen and high school students. Learning a first programming language requires understanding the syntax of the language, the algorithms to use, and any additional complexities the language carries. Oftentimes this becomes a deterrent from learning computer science at all. Especially in high school, students may not want to spend a year or more simply learning the syntax of a programming language. In order to overcome these issues, as well as to mitigate the issues caused by Microsoft discontinuing its Visual Programming Language (VPL), we have decided to implement a new VPL, ASU-VPL, based on Microsoft's VPL. ASU-VPL provides an environment where users can focus on algorithms and worry less about syntactic issues. ASU-VPL was built with the concepts of Robot as a Service and workflow-based development in mind. As such, ASU-VPL is designed with the intention of allowing web services to be added to the toolbox (e.g., WSDL and REST services). ASU-VPL has strong support for multithreaded operations, including event-driven development, and is built with Microsoft VPL users in mind. It provides support for many different robots, including Lego's third-generation robots (i.e., the EV3) and any open-platform robots. To demonstrate the capabilities of ASU-VPL, this paper details the creation of an Intel Edison based robot and the use of ASU-VPL to program both the Intel-based robot and an EV3 robot. This paper also discusses the differences between ASU-VPL and Microsoft VPL, as well as the differences between developing for the EV3 and for an open-platform robot.
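To illustrate the event-driven, multithreaded style that ASU-VPL diagrams express visually, the plain C++ sketch below shows a toy sensor thread firing events to a handler; it is only an analogy with assumed names and values, not ASU-VPL's generated code or API.

```cpp
#include <chrono>
#include <functional>
#include <iostream>
#include <thread>

// A toy event-driven program: a sensor thread publishes distance "events"
// and a handler reacts to each one, which is the pattern a VPL diagram
// (blocks wired to events) lets students build without writing syntax.
int main() {
    std::function<void(double)> onDistance = [](double d) {
        if (d < 0.5) std::cout << "obstacle ahead: turn\n";
        else         std::cout << "path clear: drive forward\n";
    };

    std::thread sensor([&onDistance]() {
        double fakeReadings[] = {2.0, 1.2, 0.4, 0.3};  // simulated ultrasonic data
        for (double d : fakeReadings) {
            onDistance(d);  // fire the "distance measured" event
            std::this_thread::sleep_for(std::chrono::milliseconds(100));
        }
    });

    sensor.join();
    return 0;
}
```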
Contributors: De Luca, Gennaro (Author) / Chen, Yinong (Thesis director) / Cheng, Calvin (Committee member) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2015-12
Description
In this update to the ESPBot, we have introduced new libraries for a small OLED display and a beeper. This functionality can easily be expanded to multiple beepers and displays, but doing so requires more GPIO pins, or for the user to forgo some of the infrared sensors or the ultrasonic sensor. We have also relocated some of the pins. The display can be updated to show one of four predefined shapes or to display user-defined text. New shapes can be added by defining new methods within display.ino and calling the appropriate functions while parsing the JSON data in viple.ino. The beeper can be controlled by user-defined input to play any frequency for any amount of time; a function to play the "Happy Birthday" song is also included. More songs can be added by defining new methods within beeper.ino and calling the appropriate functions while parsing the JSON data in viple.ino. More functionality could be added to allow the user to input a list of frequencies along with a list of durations, so that the user can define their own songs or sequences on the fly.
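Below is a hedged sketch of how such a user-supplied frequency/duration list could drive the beeper, written in the Arduino style the .ino files suggest; the pin number, function name, and use of tone() are assumptions (ESP32 cores without tone() would use the LEDC functions instead), not the ESPBot's actual code.

```cpp
#include <Arduino.h>

const int BEEPER_PIN = 25;  // hypothetical GPIO pin for the beeper

// Play a user-defined sequence: frequencies[i] (Hz) for durations[i] (ms).
// A frequency of 0 is treated as a rest.
void playSequence(const int* frequencies, const int* durations, int length) {
    for (int i = 0; i < length; i++) {
        if (frequencies[i] > 0) {
            tone(BEEPER_PIN, frequencies[i], durations[i]);  // may be ledcWriteTone on ESP32
        }
        delay(durations[i]);  // wait for the note (or rest) to finish
        noTone(BEEPER_PIN);   // stop the note before the next one
        delay(20);            // short gap so repeated notes are distinct
    }
}

void setup() {
    pinMode(BEEPER_PIN, OUTPUT);
    // First few notes of a simple ascending scale as a demonstration.
    int freqs[] = {262, 294, 330, 349};  // C4, D4, E4, F4
    int times[] = {250, 250, 250, 500};
    playSequence(freqs, times, 4);
}

void loop() {}
```

In the same spirit, the frequency and duration arrays could be filled from the JSON payload parsed in viple.ino rather than hard-coded.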
Contributors: Welfert, Monica Michelle (Co-author) / Nguyen, Van (Co-author) / Chen, Yinong (Thesis director) / Nakamura, Mutsumi (Committee member) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2019-12
Description
A common design of multi-agent robotic systems requires a centralized master node, which coordinates the actions of all the agents. The multi-agent system designed in this project enables coordination between the robots and reduces the dependence on a single node in the system. This design change reduces the complexity of the central node and makes the system more adaptable to changes in its topology. The final goal of this project was to have a group of robots collaboratively claim positions in pre-defined formations and navigate to those positions using pose data transmitted by a localization server.
Planning coordination between robots in a multi-agent system requires each robot to know the positions of the other robots. To address this, the localization server tracked visual fiducial markers attached to the robots and relayed their poses to every robot at a rate of 20 Hz using the MQTT communication protocol. Each robot used this data to inform a potential fields path-planning algorithm and navigate to its target position.
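The following is a minimal sketch of the attractive/repulsive step at the core of a potential fields planner, assuming 2D poses already received from the localization server; the gains, names, and influence radius are illustrative, and the MQTT transport is omitted.

```cpp
#include <cmath>
#include <iostream>
#include <vector>

struct Vec2 {
    double x, y;
};

// One potential-fields step: attraction toward the goal plus repulsion
// from every other robot's reported pose within a given influence radius.
Vec2 potentialFieldStep(const Vec2& self, const Vec2& goal,
                        const std::vector<Vec2>& others,
                        double kAttract = 1.0, double kRepel = 0.5,
                        double influence = 1.5) {
    Vec2 force{kAttract * (goal.x - self.x), kAttract * (goal.y - self.y)};
    for (const Vec2& o : others) {
        double dx = self.x - o.x, dy = self.y - o.y;
        double dist = std::sqrt(dx * dx + dy * dy);
        if (dist > 1e-6 && dist < influence) {
            // Repulsion grows sharply as the neighbor gets closer.
            double scale = kRepel * (1.0 / dist - 1.0 / influence) / (dist * dist);
            force.x += scale * (dx / dist);
            force.y += scale * (dy / dist);
        }
    }
    return force;  // drive along this vector, scaled by the robot's speed limits
}

int main() {
    Vec2 self{0.0, 0.0}, goal{2.0, 1.0};
    std::vector<Vec2> others = {{0.5, 0.2}};  // a nearby teammate's pose
    Vec2 f = potentialFieldStep(self, goal, others);
    std::cout << "command vector: (" << f.x << ", " << f.y << ")\n";
}
```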
This project was unable to address all of the challenges facing true distributed multi-agent coordination and needed to make concessions in order to meet deadlines. Further research would focus on shoring up these deficiencies and developing a more robust system.
Contributors: Thibeault, Quinn (Author) / Meuth, Ryan (Thesis director) / Chen, Yinong (Committee member) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2019-05