Description
Research has shown that learning processes can be enriched and enhanced by affective interventions. The goal of this dissertation was to design, implement, and evaluate an affective agent that provides affective support in real time to enrich the student’s learning experience and performance by inducing and/or maintaining a productive learning path. This work combined research and best practices from affective computing, intelligent tutoring systems, and educational technology to address the design and implementation of an affective agent and corresponding pedagogical interventions. It included the incorporation of the affective agent into an Exploratory Learning Environment (ELE) adapted for this research.

A gendered, three-dimensional, animated, human-like character accompanied by text- and speech-based dialogue visually represented the proposed affective agent. The agent’s pedagogical interventions considered inputs from the ELE (interface, model-building, and performance events) and from the user (emotional and cognitive events). The user’s emotional events, captured by biometric sensors and processed by a decision-level fusion algorithm for a multimodal system, combined with the events from the ELE to inform the production-rule-based behavior engine that defined and triggered pedagogical interventions. These interventions focused on affective dimensions and took the form of affective dialogue prompts and animations.
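The architecture described above can be sketched in miniature: per-modality emotion decisions are fused at the decision level, and the fused label plus an ELE event drives a small production-rule engine. The modality names, weights, emotion labels, rules, and intervention names below are illustrative assumptions for the sketch, not the dissertation’s actual implementation (which the abstract does not specify in detail).

```python
# Hypothetical sketch of decision-level fusion feeding a production-rule
# behavior engine. All labels, weights, and rules are illustrative.

def fuse_decisions(modality_outputs, weights):
    """Weighted-vote decision-level fusion: each modality classifier emits
    its own emotion label; the label with the highest total weight wins."""
    scores = {}
    for modality, label in modality_outputs.items():
        scores[label] = scores.get(label, 0.0) + weights.get(modality, 1.0)
    return max(scores, key=scores.get)

# Production rules: each pairs a condition on (fused emotion, ELE event)
# with an intervention; the first matching rule fires.
RULES = [
    (lambda emotion, event: emotion == "frustrated" and event == "task_failed",
     "empathetic_prompt"),
    (lambda emotion, event: emotion == "bored" and event == "idle",
     "engagement_prompt"),
    (lambda emotion, event: emotion == "engaged",
     "no_intervention"),
]

def select_intervention(emotion, ele_event):
    for condition, intervention in RULES:
        if condition(emotion, ele_event):
            return intervention
    return "no_intervention"

# Example: face and EDA sensors disagree with posture; fusion resolves it.
outputs = {"face": "frustrated", "eda": "frustrated", "posture": "engaged"}
weights = {"face": 0.5, "eda": 0.3, "posture": 0.2}
emotion = fuse_decisions(outputs, weights)
print(emotion, select_intervention(emotion, "task_failed"))
# → frustrated empathetic_prompt
```

Decision-level (late) fusion, as opposed to feature-level fusion, keeps each sensor pipeline independent, which is one reason it is a common choice for multimodal affect detection.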

An experiment was conducted to assess the impact of the affective agent, Hope, on the student’s learning experience and performance. In terms of the student’s learning experience, the effect of the agent was analyzed in four components: perception of the instructional material, perception of the usefulness of the agent, ELE usability, and the affective responses from the agent triggered by the student’s affective states.

Additionally, in terms of the student’s performance, the effect of the agent was analyzed in five components: tasks completed, time spent solving a task, planning time while solving a task, usage of the provided help, and attempts to successfully complete a task. The findings from the experiment did not show the anticipated effect of the agent; however, they provided insights for improving diverse components in the design of affective agents, as well as for the design of behavior engines and algorithms to detect, represent, and handle affective information.
Contributors: Chavez Echeagaray, Maria Elena (Author) / Atkinson, Robert K (Thesis advisor) / Burleson, Winslow (Thesis advisor) / Graesser, Arthur C. (Committee member) / VanLehn, Kurt (Committee member) / Walker, Erin A (Committee member) / Arizona State University (Publisher)
Created: 2018
Description
The present study explored the use of augmented reality (AR) technology to support cognitive modeling in an art-based learning environment. The AR application used in this study made visible the thought processes and observational techniques of art experts for the learning benefit of novices through digital annotations, overlays, and side-by-side comparisons that, when viewed on a mobile device, appear directly on works of art.

Using a 2 x 3 factorial design, this study compared learner outcomes and motivation across technologies (audio-only, video, AR) and groupings (individuals, dyads) with 182 undergraduate and graduate students who were self-identified art novices. Learner outcomes were measured by post-activity spoken responses to a painting reproduction with the pre-activity response as a moderating variable. Motivation was measured by the sum score of a reduced version of the Instructional Materials Motivational Survey (IMMS), accounting for attention, relevance, confidence, and satisfaction, with total time spent in learning activity as the moderating variable. Information on participant demographics, technology usage, and art experience was also collected.

Participants were randomly assigned to one of six conditions that differed by technology and grouping before completing a learning activity in which they viewed four high-resolution, printed-to-scale painting reproductions in a gallery-like setting while listening to audio-recorded conversations of two experts discussing the actual paintings. All participants listened to the expert conversations, but the video and AR conditions also received visual supports via a mobile device.

Though no main effects were found for technology or grouping, the findings did include statistically significantly higher learner outcomes on the elements of design subscale (the characteristics most represented by the visual supports of the AR application) than in the audio-only conditions. When participants saw digital representations of line, shape, and color directly on the paintings, they were more likely to identify those same features in the post-activity painting. Seeing what the experts saw, in a situated environment, yielded evidence that participants began to view paintings in a manner similar to the experts, which speaks to the value of the temporal and spatial contiguity afforded by AR in cognitive modeling learning environments.
Contributors: Shapera, Daniel Michael (Author) / Atkinson, Robert K (Thesis advisor) / Nelson, Brian C (Committee member) / Erickson, Mary (Committee member) / Arizona State University (Publisher)
Created: 2016