Matching Items (10)
Description
Dale and Edna is a hybrid animated film and videogame experienced in virtual reality, with dual storylines whose potential meanings multiply through player interaction. Developed and played within Unreal Engine 4 using the HTC Vive, Oculus, or PlayStation VR, Dale and Edna allows players to passively enjoy the film element of the project or take part in the active videogame portion. Exploring the virtual story world yields more information about that world, which may or may not alter the audience’s perception of it. The film portion of the project is a static narrative whose plot cannot be altered by players within the virtual world. In the static plot, the characters Dale and Edna discover and subsequently combat an alien invasion that appears bent on demolishing Dale’s prize pumpkin. However, the aliens in the film plot are merely projections created by AR headsets reflecting Jimmy’s gameplay on his tablet. The audience is thus invited to question their perception of reality through the combined use of VR and AR. The game element is a dynamic narrative scaffold that does not unfold as a traditional narrative might; instead, what a player observes and interacts with in the sandbox level determines the meaning that player takes away from the project. Both elements of the project feature modular code construction so developers can return to the film and game portions and make additions. This paper analyzes the chronological development of the project along with the guiding philosophy revealed in the result.
Keywords: virtual reality, film, videogame, sandbox
Contributors: Kemp, Adam Lee (Co-author) / Kemp, Bradley (Co-author) / Kemp, Claire (Co-author) / LiKamWa, Robert (Thesis director) / Gilfillan, Daniel (Committee member) / Arts, Media and Engineering Sch T (Contributor) / Thunderbird School of Global Management (Contributor) / School of Film, Dance and Theatre (Contributor) / School of International Letters and Cultures (Contributor) / Barrett, The Honors College (Contributor)
Created: 2019-05
Description
Can a skill taught in a virtual environment be used in the physical world? This idea is explored by creating a virtual reality game for the HTC Vive that teaches users how to play the drums. The game focuses on developing the user's muscle memory, improving the user's ability to play music as they hear it in their head, and refining the user's sense of rhythm. Several features were included to achieve this, such as a score, different levels, a demo feature, and a metronome. The game was tested for its ability to teach and for its overall enjoyability with a small sample group. Most participants noted that they felt their sense of rhythm and drumming skill would improve by playing the game. From the findings of this project, it can be concluded that, while it should not be considered a complete replacement for traditional instruction, a virtual environment can be successfully used as a learning aid and practice tool.
Contributors: Dinapoli, Allison (Co-author) / Tuznik, Richard (Co-author) / Kobayashi, Yoshihiro (Thesis director) / Nelson, Brian (Committee member) / Computer Science and Engineering Program (Contributor) / School of International Letters and Cultures (Contributor) / Computing and Informatics Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2017-12
Description
Virtual Reality (hereafter VR) and Mixed Reality (hereafter MR) have opened a new line of applications and possibilities. Amid this vast network of potential applications, little research has been done on providing real-time collaboration capability between users of VR and MR. The idea of this thesis study is to develop and test a real-time collaboration system between VR and MR. The system works much like a Google document, where two or more users can see what the others are doing (e.g., writing, modifying, viewing). Similarly, the system developed during this study enables users in VR and MR to collaborate in real time.

The study of developing a real-time cross-platform collaboration system between VR and MR considers a scenario in which users of multiple devices are connected to a multiplayer network and guided to perform various tasks concurrently.

Usability testing was conducted to evaluate participant perceptions of the system. Users were required to assemble a chair in alternating turns; thereafter, users completed a survey and gave an audio interview. Results collected from the participants showed positive feedback toward using VR and MR for collaboration. However, there are several limitations with the current generation of devices that hinder mass adoption; devices with better performance factors will lead to wider adoption.
Contributors: Seth, Nayan Sateesh (Author) / Nelson, Brian (Thesis advisor) / Walker, Erin (Committee member) / Atkinson, Robert (Committee member) / Arizona State University (Publisher)
Created: 2017
Description
Paper assessment remains an essential formal assessment method in today's classes. However, it is difficult to track student learning behavior on physical papers. This thesis presents a new educational technology, the Web Programming Grading Assistant (WPGA). WPGA serves not only as a grading system but also as a feedback-delivery tool that connects paper-based assessments to digital space. I designed a classroom study and collected data from ASU computer science classes. I tracked and modeled students' reviewing and reflecting behaviors based on their use of WPGA, and analyzed students' reviewing efforts in terms of frequency, timing, and association with academic performance. Results showed that students put extra emphasis on reviewing prior to exams, and their efforts demonstrated a desire to review formal assessments regardless of whether the assessments counted toward academic performance or only attendance. In addition, all students paid more attention to reviewing quizzes and exams toward the end of the semester.
Contributors: Huang, Po-Kai (Author) / Hsiao, I-Han (Thesis advisor) / Nelson, Brian (Committee member) / VanLehn, Kurt (Committee member) / Arizona State University (Publisher)
Created: 2017
Description
This thesis is based on bringing together three components: non-Euclidean geometric worlds, virtual reality, and environmental puzzles in video games. While all three exist in their own right in the world of video games, as well as combined in pairs, there are virtually no examples of all three together. Non-Euclidean environmental puzzle games have existed in various forms for around ten years, short environmental puzzle games in virtual reality have appeared in roughly the past five years, and non-Euclidean virtual reality exists mainly as short non-game demos from the past few years. This project seeks to bring these components together to create a proof of concept for how such a game should function, particularly the integration of non-Euclidean virtual reality in the context of a video game. To do this, the project used a Unity package that builds worlds in a non-Euclidean way through a custom system for transforms, collisions, and rendering, rather than Unity’s built-in components. This was used in conjunction with the SteamVR integration for Unity to create a cohesive and immersive player experience.

Contributors: Verhagen, Daniel William (Author) / Kobayashi, Yoshihiro (Thesis director) / Nelson, Brian (Committee member) / Computer Science and Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
Description
Java Mission-planning and Analysis for Remote Sensing (JMARS) is geospatial software that provides mission-planning and data-analysis tools with access to orbital data for planetary bodies such as Mars and Venus. Using JMARS, terrain scenes can be prepared with an assortment of data layers along with any additional data sets. These scenes can then be exported into the JMARS extended reality platform, which includes both augmented reality and virtual reality experiences. JMARS VR Viewer is a virtual reality experience that allows users to view three-dimensional terrain data in a fully immersive and interactive way. The tool also provides a collaborative environment in which users can host a terrain scene and analyze the data together. The purpose of the project is to design a set of interactions in virtual reality that address three questions: (1) how do we make sense of large, complex geospatial datasets; (2) how can we design interactions that help users understand layered data in both individual and collaborative work environments; and (3) what are the effects of these interfaces on the user’s cognitive load?

Contributors: Wang, Olivia (Author) / LiKamWa, Robert (Thesis director) / Gold, Lauren (Committee member) / Barrett, The Honors College (Contributor) / Computer Science and Engineering Program (Contributor)
Created: 2023-05
165907-Thumbnail Image.png
Contributors: Voitek, Julian (Author) / Thorn, Seth (Thesis director) / LiKamWa, Robert (Committee member) / Barrett, The Honors College (Contributor) / Arts, Media and Engineering Sch T (Contributor)
Created: 2022-05
Description
Computer-based auditory training programs (CBATPs) are used as at-home aural rehabilitation solutions for individuals with hearing impairment, most commonly recipients of cochlear implants or hearing aids. However, recent advancements in spatial audio and immersive gameplay have not yet been incorporated into these programs. Isle Aliquo, a virtual reality CBATP, is designed to reformat traditional rehabilitation exercises into virtual 3D space. The program explores how outcomes for the aural exercises of detection, discrimination, direction, and identification can be improved by incorporating directional spatial audio, as well as how the experience can be made more engaging to improve adherence to training routines. Fundamentals of professional aural rehabilitation and current CBATP design inform the structure of the exercise modules found in Isle Aliquo.

Contributors: Voitek, Julian (Author) / Thorn, Seth (Thesis director) / LiKamWa, Robert (Committee member) / Barrett, The Honors College (Contributor) / Arts, Media and Engineering Sch T (Contributor)
Created: 2022-05
165905-Thumbnail Image.jpg
Contributors: Voitek, Julian (Author) / Thorn, Seth (Thesis director) / LiKamWa, Robert (Committee member) / Barrett, The Honors College (Contributor) / Arts, Media and Engineering Sch T (Contributor)
Created: 2022-05
165906-Thumbnail Image.jpg
Contributors: Voitek, Julian (Author) / Thorn, Seth (Thesis director) / LiKamWa, Robert (Committee member) / Barrett, The Honors College (Contributor) / Arts, Media and Engineering Sch T (Contributor)
Created: 2022-05