Matching Items (8)

Description
Emerging technologies, such as augmented reality (AR), are growing in popularity and accessibility at a fast pace. Developers are building more and more games and applications with this technology but few have stopped to think about what the best practices are for creating a good user experience (UX). Currently, there are no universally accepted human-computer interaction guidelines for augmented reality because it is still relatively new. This paper examines three features - virtual content scale, indirect selection, and virtual buttons - in an attempt to discover their impact on the user experience in augmented reality. A Battleship game was developed using the Unity game engine with Vuforia, an augmented reality platform, and built as an iOS application to test these features. The hypothesis was that both virtual content scale and indirect selection would result in a more enjoyable and engaging user experience whereas the virtual button would be too confusing for users to fully appreciate the feature. Usability testing was conducted to gauge participants' responses to these features. After playing a base version of the game with no additional features and then a second version with one of the three features, participants rated their experiences and provided feedback in a four-part survey. It was observed during testing that people did not inherently move their devices around the augmented space and needed guidance to navigate the game. Most users were fascinated with the visuals of the game and two of the tested features. It was found that movement around the augmented space and feedback from the virtual content were critical aspects in creating a good user experience in augmented reality.
ContributorsBauman, Kirsten (Co-author) / Benson, Meera (Co-author) / Olson, Loren (Thesis director) / LiKamWa, Robert (Committee member) / School of the Arts, Media and Engineering (Contributor) / Barrett, The Honors College (Contributor)
Created2018-05
Description
Traditional sports coaching involves face-to-face instruction with athletes or playing back 2D videos of athletes’ training. However, if the coach is not in the same area as the athlete, the coach cannot see the athlete’s full body and thus cannot give precise guidance, limiting the athlete’s improvement. To address these challenges, this paper proposes Augmented Coach, an augmented reality platform where coaches can view, manipulate and comment on athletes’ movement volumetric video data remotely via the network. In particular, this work includes (a) capturing the athlete’s movement video data with Kinects and converting it into point cloud format, (b) transmitting the point cloud data to the coach’s Oculus headset via a 5G or wireless network, and (c) the coach’s commenting on the athlete’s joints. In addition, the evaluation of Augmented Coach includes not only an assessment of its performance on five metrics in wireless and 5G network environments, but also the coaches’ and athletes’ experience of using it. The results show that Augmented Coach enables coaches to instruct athletes from a distance and provide effective feedback for correcting athletes’ motions over the network.
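The capture step described above, converting Kinect depth frames into point clouds, typically uses the standard pinhole back-projection. A minimal sketch follows; the function name, camera intrinsics, and synthetic depth map are illustrative assumptions, not details taken from the thesis:

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth image (meters) into an N x 3 point cloud
    using the pinhole camera model, as in a Kinect-style capture step."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx   # horizontal offset from the optical center
    y = (v - cy) * z / fy   # vertical offset from the optical center
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]  # drop pixels with no depth reading

# Tiny synthetic depth map: a flat surface 2 m from the camera.
depth = np.full((4, 4), 2.0)
cloud = depth_to_point_cloud(depth, fx=500.0, fy=500.0, cx=2.0, cy=2.0)
print(cloud.shape)  # (16, 3)
```

Each frame's point cloud could then be serialized and streamed to the headset, which is where the 5G-versus-wireless comparison in the evaluation would come into play.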
ContributorsQiao, Yunhan (Author) / LiKamWa, Robert (Thesis advisor) / Bansal, Ajay (Committee member) / Jayasuriya, Suren (Committee member) / Arizona State University (Publisher)
Created2023
Description
As the prevalence of augmented reality (AR) technology continues to increase, so too have methods for improving the appearance and behavior of computer-generated objects. This is especially significant as AR applications now expand to territories outside of the entertainment sphere and can be utilized for numerous purposes, including but not limited to education, specialized occupational training, retail and online shopping, design, marketing, and manufacturing. Due to the nature of AR technology, where computer-generated objects are placed into a real-world environment, a decision has to be made regarding the visual connection between the tangible and the intangible. Should the objects blend seamlessly into their environment or purposefully stand out? It is not purely a stylistic choice. A developer must consider how their application will be used; in many instances an optimal user experience is facilitated by mimicking the real world as closely as possible, and even simpler applications, such as those built primarily for mobile devices, can benefit from realistic AR. The struggle here lies in creating an immersive user experience that is not reliant on computationally expensive graphics or heavy-duty models. The research contained in this thesis provides several ways of achieving photorealistic rendering in AR applications using a range of techniques, all of which are supported on mobile devices. These methods can be employed within the Unity Game Engine and incorporate shaders, render pipelines, node-based editors, post-processing, and light estimation.
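The light-estimation technique mentioned above usually works by sampling the real scene's brightness and color cast and feeding those values into the virtual object's lighting. A minimal sketch of that blending step follows; the function and values are illustrative assumptions, not code from the thesis:

```python
import numpy as np

def apply_light_estimate(albedo, ambient_intensity, color_correction):
    """Tint a virtual object's base color (albedo, RGB in [0, 1]) by an
    estimated ambient brightness and per-channel color correction, the
    way AR light estimation feeds a shader's lighting terms."""
    lit = albedo * ambient_intensity * color_correction
    return np.clip(lit, 0.0, 1.0)  # keep the result a valid color

# A grey virtual object under warm, dim real-world lighting.
albedo = np.array([0.5, 0.5, 0.5])
shaded = apply_light_estimate(albedo, 0.6, np.array([1.0, 0.9, 0.8]))
print(shaded)  # [0.3  0.27 0.24]
```

Matching the virtual object's lighting to the camera feed in this way is one of the cheapest routes to the seamless blending the abstract describes, since it avoids heavy global-illumination computation on the mobile device.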
ContributorsSchanberger, Schuyler Catherine (Author) / LiKamWa, Robert (Thesis director) / Jayasuriya, Suren (Committee member) / Arts, Media and Engineering Sch T (Contributor) / Barrett, The Honors College (Contributor)
Created2020-05
Description
Augmented Reality (AR) is a tool increasingly available to young learners and educators. This paper documents and analyzes the creation of an AR app used as a tool to teach fractions to young learners and enhance their engagement in the classroom. As an emerging technology reaching diffusion into the general populace, AR presents a unique opportunity to engage users in the digital and real world. Additionally, AR can be enabled on most modern phones and tablets; therefore, it is extremely accessible and has a low barrier to entry. To integrate AR into the classroom in an affordable way, I created leARn, an AR application intended to help young learners understand fractions. leARn is an application intended to be used alongside traditional teaching methods, in order to enhance the engagement of students in the classroom. Throughout the development of the product, I not only considered usability and design, but also the effectiveness of the app in the classroom. Moreover, due to collaboration with Arizona State University Research Enterprises, I tested the application in a classroom with sixth, seventh and eighth grade students. This paper presents the findings from that testing period and analysis of the educational effectiveness of the concept based on data received from students.
ContributorsVan Dobben, Maureen Veronica (Author) / LiKamWa, Robert (Thesis director) / Swisher, Kimberlee (Committee member) / Arts, Media and Engineering Sch T (Contributor) / Barrett, The Honors College (Contributor)
Created2019-05
Description

Augmented Reality (AR), especially when used with mobile devices, enables the creation of applications that can help chemistry students learn everything from basic to advanced concepts. In chemistry specifically, the 3D representation of molecules and chemical structures is of vital importance to students, and yet when printed in 2D, as in textbooks and lecture notes, those vital 3D concepts can be quite hard to understand. ARsome Chemistry is an app that aims to utilize AR to display complex and simple molecules in 3D and to actively teach students these concepts through quizzes and other features. The ARsome Chemistry app uses image target recognition to allow students to hand-draw or print line-angle structures or chemical formulas of molecules and then scan those targets to get a 3D representation of the molecule. Students can use their fingers and the touch screen to zoom, rotate, and highlight different portions of the molecule to gain a better understanding of its 3D structure. The app also features the ability to utilize image recognition to let students quiz themselves on drawing line-angle structures and show their work to the camera for the app to check. ARsome Chemistry is an accessible and cost-effective study aid platform that gives students on-demand, interactive, 3D representations of complex molecules.

ContributorsEvans, Brandon (Author) / LiKamWa, Robert (Thesis director) / Johnson, Mina (Committee member) / Barrett, The Honors College (Contributor) / Computer Science and Engineering Program (Contributor)
Created2022-05
Description

Video playback is currently the primary method coaches and athletes use in sports training to give feedback on the athlete's form and timing. Athletes will commonly record themselves using a phone or camera when practicing a sports movement, such as shooting a basketball, to then send to their coach for feedback on how to improve. In this work, we present Augmented Coach, an augmented reality tool for coaches to give spatiotemporal feedback through a 3-dimensional point cloud of the athlete. The system allows coaches to view a pre-recorded video of their athlete in point cloud form, and provides them with the proper tools in order to go frame by frame to both analyze the athlete's form and correct it. The result is a fundamentally new concept of an interactive video player, where the coach can remotely view the athlete in a 3-dimensional form and create annotations to help improve their form. We then conduct a user study with subject matter experts to evaluate the usability and capabilities of our system. As indicated by the results, Augmented Coach successfully acts as a supplement to in-person coaching, since it allows coaches to break down the video recording in a 3-dimensional space and provide feedback spatiotemporally. The results also indicate that Augmented Coach can be a complete coaching solution in a remote setting. This technology will be extremely relevant in the future as coaches look for new ways to improve their feedback methods, especially in a remote setting.
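The interactive video player described above needs to attach the coach's annotations to specific frames so feedback can be surfaced spatiotemporally during playback. A minimal sketch of such a frame-indexed annotation store follows; the class and field names are hypothetical, not taken from the Augmented Coach implementation:

```python
from dataclasses import dataclass, field

@dataclass
class Annotation:
    joint: str   # e.g. "right_elbow"; which body part the note targets
    note: str    # the coach's comment for this frame

@dataclass
class AnnotatedRecording:
    """Maps each point cloud frame index to the coach's annotations,
    so playback can surface feedback at the right moment in the clip."""
    annotations: dict = field(default_factory=dict)

    def annotate(self, frame: int, joint: str, note: str) -> None:
        self.annotations.setdefault(frame, []).append(Annotation(joint, note))

    def feedback_at(self, frame: int) -> list:
        return self.annotations.get(frame, [])

rec = AnnotatedRecording()
rec.annotate(42, "right_elbow", "Keep the elbow tucked on release")
print(len(rec.feedback_at(42)))  # 1
print(len(rec.feedback_at(0)))   # 0
```

Keying feedback by frame rather than by wall-clock time keeps annotations aligned with the point cloud sequence even when the coach scrubs back and forth frame by frame.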

ContributorsDbeis, Yasser (Author) / Channar, Sameer (Co-author) / Richards, Connor (Co-author) / LiKamWa, Robert (Thesis director) / Jayasuriya, Suren (Committee member) / Barrett, The Honors College (Contributor) / Computer Science and Engineering Program (Contributor)
Created2022-05
Description

Video playback is currently the primary method coaches and athletes use in sports training to give feedback on the athlete’s form and timing. Athletes will commonly record themselves using a phone or camera when practicing a sports movement, such as shooting a basketball, to then send to their coach for feedback on how to improve. In this work, we present Augmented Coach, an augmented reality tool for coaches to give spatiotemporal feedback through a 3-dimensional point cloud of the athlete. The system allows coaches to view a pre-recorded video of their athlete in point cloud form, and provides them with the proper tools in order to go frame by frame to both analyze the athlete’s form and correct it. The result is a fundamentally new concept of an interactive video player, where the coach can remotely view the athlete in a 3-dimensional form and create annotations to help improve their form. We then conduct a user study with subject matter experts to evaluate the usability and capabilities of our system. As indicated by the results, Augmented Coach successfully acts as a supplement to in-person coaching, since it allows coaches to break down the video recording in a 3-dimensional space and provide feedback spatiotemporally. The results also indicate that Augmented Coach can be a complete coaching solution in a remote setting. This technology will be extremely relevant in the future as coaches look for new ways to improve their feedback methods, especially in a remote setting.

ContributorsChannar, Sameer (Author) / Dbeis, Yasser (Co-author) / Richards, Connor (Co-author) / LiKamWa, Robert (Thesis director) / Jayasuriya, Suren (Committee member) / Barrett, The Honors College (Contributor) / Dean, W.P. Carey School of Business (Contributor) / Computer Science and Engineering Program (Contributor)
Created2022-05
Description

Video playback is currently the primary method coaches and athletes use in sports training to give feedback on the athlete’s form and timing. Athletes will commonly record themselves using a phone or camera when practicing a sports movement, such as shooting a basketball, to then send to their coach for feedback on how to improve. In this work, we present Augmented Coach, an augmented reality tool for coaches to give spatiotemporal feedback through a 3-dimensional point cloud of the athlete. The system allows coaches to view a pre-recorded video of their athlete in point cloud form, and provides them with the proper tools in order to go frame by frame to both analyze the athlete’s form and correct it. The result is a fundamentally new concept of an interactive video player, where the coach can remotely view the athlete in a 3-dimensional form and create annotations to help improve their form. We then conduct a user study with subject matter experts to evaluate the usability and capabilities of our system. As indicated by the results, Augmented Coach successfully acts as a supplement to in-person coaching, since it allows coaches to break down the video recording in a 3-dimensional space and provide feedback spatiotemporally. The results also indicate that Augmented Coach can be a complete coaching solution in a remote setting. This technology will be extremely relevant in the future as coaches look for new ways to improve their feedback methods, especially in a remote setting.

ContributorsRichards, Connor (Author) / Dbeis, Yasser (Co-author) / Channar, Sameer (Co-author) / LiKamWa, Robert (Thesis director) / Jayasuriya, Suren (Committee member) / Barrett, The Honors College (Contributor) / Computer Science and Engineering Program (Contributor) / School of International Letters and Cultures (Contributor)
Created2022-05