Description
This thesis introduces the Model-Based Development of Multi-iRobot Toolbox (MBDMIRT), a Simulink-based toolbox designed to provide the means to acquire and practice the Model-Based Development (MBD) skills necessary to design real-time embedded systems. The toolbox was developed in the Cyber-Physical System Laboratory at Arizona State University. The MBDMIRT toolbox runs under MATLAB/Simulink to simulate the movements of multiple iRobots and, after verification by simulation, to control multiple physical iRobots accordingly. It adopts Simulink/Stateflow, which exemplifies an approach to MBD, to program the behaviors of the iRobots. The MBDMIRT toolbox reuses and augments the open-source MATLAB-Based Simulator for the iRobot Create from Cornell University to run the simulation. To command the physical iRobots, the MBDMIRT toolbox applies the MATLAB Toolbox for the iRobot Create (MTIC) from the United States Naval Academy. The MBDMIRT toolbox supports a timer in both simulation and control, based on the local clock of the PC running the toolbox. In addition to the built-in sensors of an iRobot, the toolbox can simulate four user-added sensors: an overhead localization system (OLS), sonar sensors, a camera, and Light Detection And Ranging (LIDAR). While controlling a physical iRobot, the toolbox supports the StarGazer OLS manufactured by HAGISONIC, Inc.
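
The Cornell simulator is designed to mirror the MTIC command set, so in principle the same script can drive both the simulated and the physical robot. The following is a minimal sketch in that style; the exact function names and signatures vary by toolbox version, and the port argument to RoombaInit is an assumption here, not taken from the thesis.

    % Illustrative sketch: drive forward until a bump is sensed, then stop.
    serPort = RoombaInit(3);                  % open the link (COM port 3 assumed)
    SetFwdVelAngVelCreate(serPort, 0.2, 0);   % 0.2 m/s forward, no rotation
    tStart = tic;                             % timer based on the PC's local clock
    while toc(tStart) < 30                    % give up after 30 s
        [bumpR, bumpL, ~, ~, ~, bumpF] = BumpsWheelDropsSensorsRoomba(serPort);
        if bumpR || bumpL || bumpF
            break;                            % contact detected
        end
        pause(0.1);                           % poll sensors at roughly 10 Hz
    end
    SetFwdVelAngVelCreate(serPort, 0, 0);     % halt the robot

In the MBDMIRT workflow itself, such behaviors would be expressed as Stateflow charts rather than hand-written scripts; the sketch only illustrates the underlying MTIC-style command layer.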
Contributors: Su, Shih-Kai (Author) / Fainekos, Georgios E. (Thesis advisor) / Sarjoughian, Hessam S. (Committee member) / Artemiadis, Panagiotis K. (Committee member) / Arizona State University (Publisher)
Created: 2012
Description
Human fingertips contain thousands of specialized mechanoreceptors that enable effortless physical interactions with the environment. Haptic perception capabilities enable grasp and manipulation in the absence of visual feedback, as when reaching into one's pocket or wrapping a belt around oneself. Unfortunately, state-of-the-art artificial tactile sensors and processing algorithms are no match for their biological counterparts. Tactile sensors must not only meet stringent practical specifications for everyday use, but their signals must be processed and interpreted within hundreds of milliseconds. Control of artificial manipulators, ranging from prosthetic hands to bomb defusal robots, requires a constant reliance on visual feedback that is not entirely practical. To address this, we conducted three studies aimed at advancing artificial haptic intelligence. First, we developed a novel, robust, microfluidic tactile sensor skin capable of measuring normal forces on flat or curved surfaces, such as a fingertip. The sensor consists of microchannels in an elastomer filled with a liquid metal alloy. The fluid serves as both electrical interconnects and tunable capacitive sensing units, and enables functionality despite substantial deformation. The second study investigated the use of a commercially available, multimodal tactile sensor (BioTac sensor, SynTouch) to characterize edge orientation with respect to a body-fixed reference frame, such as a fingertip. Trained on data from a robot testbed, a support vector regression model was developed to relate haptic exploration actions to perception of edge orientation. The model performed comparably to humans for estimating edge orientation. Finally, the robot testbed was used to perceive small, finger-sized geometric features. The efficiency and accuracy of different haptic exploratory procedures and supervised learning models were assessed for estimating feature properties such as type (bump, pit), order of curvature (flat, conical, spherical), and size. This study highlights the importance of tactile sensing in situations where other modalities fail, such as when the finger itself blocks line of sight. Insights from this work could be used to advance tactile sensor technology and haptic intelligence for artificial manipulators that improve quality of life, such as prosthetic hands and wheelchair-mounted robotic hands.
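
As a hedged illustration of the second study's approach, the sketch below fits a support vector regression model mapping tactile features to edge orientation. The data are synthetic placeholders (19 features per touch, echoing the BioTac's 19 impedance-sensing electrodes), not the thesis dataset, and fitrsvm from MATLAB's Statistics and Machine Learning Toolbox stands in for whatever SVR implementation the study actually used.

    % Regress edge orientation (degrees) from tactile features.
    rng(0);                                   % reproducible synthetic data
    X = randn(200, 19);                       % placeholder feature vectors
    thetaDeg = 180 * rand(200, 1) - 90;       % placeholder labels in [-90, 90] deg
    mdl = fitrsvm(X, thetaDeg, ...
        'KernelFunction', 'gaussian', ...     % RBF-kernel SVR
        'Standardize', true);                 % z-score features before training
    thetaHat = predict(mdl, X(1:5, :));       % estimated orientations for new touches

With real exploration data in X, model quality would then be judged against human performance on the same edge-orientation task, which is the comparison the study reports.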
Contributors: Ponce Wong, Ruben Dario (Author) / Santos, Veronica J. (Thesis advisor) / Artemiadis, Panagiotis K. (Committee member) / Helms Tillery, Stephen I. (Committee member) / Posner, Jonathan D. (Committee member) / Runger, George C. (Committee member) / Arizona State University (Publisher)
Created: 2013