Matching Items (3)
Description
In recent years, environment mapping has garnered significant interest in both industrial and academic settings as a viable means of generating comprehensive virtual models of the physical world. These maps are created using simultaneous localization and mapping (SLAM) algorithms that combine depth contours with visual imaging information to create rich, layered point clouds. Given the recent advances in virtual reality technology, these generated point clouds can be imported into the Oculus Rift or a similar headset for virtual reality implementation. This project deals with the robotic implementation of RGB-D SLAM algorithms on mobile ground robots to generate complete point clouds that can be processed off-line and imported into virtual reality engines for viewing in the Oculus Rift. This project uses a ground robot along with a Kinect sensor to collect RGB-D data of the surrounding environment and build point cloud maps using SLAM software. These point clouds are then exported as object or polygon files for post-processing in software engines such as MeshLab or Unity. The point clouds generated from the SLAM software can be viewed in the Oculus Rift as is; however, these maps are mainly empty space and can be further optimized for virtual viewing. Additional techniques such as meshing and texture meshing were applied to the raw point cloud maps and tested on the Oculus Rift. The aim of this project was to increase the potential applications for virtual reality by taking a robotic mapping approach to virtual reality environment development. This project was successful in achieving its objective. The following report details the processes used in developing a remotely controlled robotic platform that can scan its environment and generate viable point cloud maps. These maps are then processed off-line and ported into virtual reality software for viewing through the Oculus Rift.
ContributorsUdupa, Shreya (Author) / Artemiadis, Panagiotis (Thesis director) / Chickamenahalli, Shamala (Committee member) / Mechanical and Aerospace Engineering Program (Contributor) / Economics Program in CLAS (Contributor) / Barrett, The Honors College (Contributor)
Created2017-05
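The abstract above describes exporting SLAM-generated point clouds as object (OBJ) or polygon (PLY) files for post-processing in MeshLab or Unity. As a minimal sketch of that export step (the point coordinates, the `write_obj` helper, and the file name are hypothetical illustrations, not the project's actual tooling; a real pipeline would use the SLAM software's own exporter), point cloud vertices can be written in Wavefront OBJ format like so:

```python
# Minimal sketch: dump (x, y, z) points as OBJ vertices so the cloud can be
# opened in MeshLab or imported into Unity. Faces would be added by a later
# meshing step (e.g. Poisson surface reconstruction in MeshLab).

def write_obj(points, path):
    """Write each point as an OBJ 'v' (vertex) line."""
    with open(path, "w") as f:
        f.write("# point cloud export\n")
        for x, y, z in points:
            f.write(f"v {x:.6f} {y:.6f} {z:.6f}\n")

# Hypothetical sample points standing in for Kinect RGB-D output.
points = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
write_obj(points, "cloud.obj")
```

Since OBJ is plain text, the exported file can be inspected or edited directly before meshing.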
Description

In this experiment, a haptic glove with vibratory motors on the fingertips was tested against the standard HTC Vive controller to see if the additional vibrations provided by the glove increased immersion in common gaming scenarios where haptic feedback is provided. Specifically, two scenarios were developed: an explosion scene containing a small and a large explosion, and a box interaction scene that allowed participants to touch a box virtually with their hand. At the start of this project, it was hypothesized that the haptic glove would have a significant positive impact in at least one of these scenarios. Nine participants took part in the study, and immersion was measured through a post-experiment questionnaire. Statistical analysis of the results showed that the haptic glove did have a significant impact on immersion in the box interaction scene, but not in the explosion scene. In the end, I conclude that since this haptic glove does not significantly increase immersion across all scenarios when compared to the standard Vive controller, it should not be used as a replacement in its current state.

ContributorsGriffieth, Alan P (Author) / McDaniel, Troy (Thesis director) / Selgrad, Justin (Committee member) / Computing and Informatics Program (Contributor) / School of Mathematical and Statistical Sciences (Contributor) / Economics Program in CLAS (Contributor) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created2021-05
Description

This project, called the Zoom Room, explores the use of virtual reality (VR) for workspace productivity: it is where Zoom and VR meet to form an enhanced, productive workspace for users. Equipped with two 3D printers that show how a 3D printer moves and the intricate parts that make one up, the room is much more than a standard meeting space. It is a place to analyze machines and meet with others in a virtual environment.

ContributorsWang, David (Author) / Johnson-Glenberg, Mina (Thesis director) / Surovec, Victor (Committee member) / Barrett, The Honors College (Contributor) / Economics Program in CLAS (Contributor) / Computer Science and Engineering Program (Contributor)
Created2023-05