Creators: Arts, Media and Engineering Sch T; Chavez Echeagaray, Maria Elena
A gendered, three-dimensional, animated, human-like character, accompanied by text- and speech-based dialogue, visually represented the proposed affective agent. The agent's pedagogical interventions considered inputs from the ELE (interface, model-building, and performance events) and from the user (emotional and cognitive events). The user's emotional events, captured by biometric sensors and processed by a decision-level fusion algorithm for a multimodal system, were combined with the events from the ELE to inform the production-rule-based behavior engine that defined and triggered pedagogical interventions. These interventions focused on affective dimensions and took the form of affective dialogue prompts and animations.
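The production-rule idea described above can be sketched as a small matching loop: fused emotion estimates and ELE events are checked against condition-action rules, and matching rules trigger interventions. This is a minimal illustration only; the event names, thresholds, and interventions below are invented, not taken from the thesis.

```python
from dataclasses import dataclass

@dataclass
class Event:
    source: str       # "emotion" (from sensor fusion) or "ele" (from the environment)
    name: str         # e.g. "frustration" or "task_failed" (illustrative labels)
    intensity: float  # normalized to 0.0 .. 1.0

# Each production rule pairs a predicate over the recent event window
# with the intervention it triggers. Thresholds here are hypothetical.
RULES = [
    (lambda evs: any(e.source == "emotion" and e.name == "frustration"
                     and e.intensity > 0.7 for e in evs),
     "empathetic_dialogue_prompt"),
    (lambda evs: any(e.source == "ele" and e.name == "task_failed"
                     for e in evs),
     "offer_hint_animation"),
]

def select_interventions(recent_events):
    """Return the interventions whose conditions match the event window."""
    return [action for condition, action in RULES if condition(recent_events)]
```

A decision-level fusion step would sit upstream of this loop, combining per-sensor emotion classifications into the single `Event` stream the rules consume.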
An experiment was conducted to assess the impact of the affective agent, Hope, on the student’s learning experience and performance. In terms of the student’s learning experience, the effect of the agent was analyzed in four components: perception of the instructional material, perception of the usefulness of the agent, ELE usability, and the affective responses from the agent triggered by the student’s affective states.
Additionally, in terms of the student's performance, the effect of the agent was analyzed in five components: tasks completed, time spent solving a task, planning time while solving a task, usage of the provided help, and attempts needed to successfully complete a task. The experiment did not yield the anticipated effects of the agent; however, the results provided insights for improving several components of the design of affective agents, as well as the behavior engines and algorithms used to detect, represent, and handle affective information.
Affective video games are still a relatively new field of research and entertainment. Even
so, because video games are a form of entertainment media, emotion plays a large role in them as a whole.
This project seeks to gain an understanding of what emotions are most prominent during game
play. From there, a system will be created wherein the game will record the player’s facial
expressions and interpret those expressions as emotions, allowing the game to adjust its difficulty
to create a more tailored experience.
The first portion of this project, understanding the relationship between emotions and
games, was done by recording myself as I played three different games of different genres for
thirty minutes each. The same emotion-evaluation system later used in the game I created
was used to evaluate these recordings.
After the data was interpreted, I created three different versions of the same game, based
on a template created by Stan’s Assets, which was a version of the arcade game Stacker. The
three versions of the game were: one where no changes were made to the gameplay
experience and the game simply recorded the player's face and extrapolated emotions from that
recording; one where the speed increased in an attempt to maintain a certain level of positive
emotions; and a third where, in addition to increasing the speed of the game, it also decreased
the speed in an attempt to minimize negative emotions.
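The third variant described above amounts to a two-sided feedback loop on game speed. The sketch below is an assumption-laden illustration of that idea, not the project's actual code; the thresholds, step size, and speed bounds are made-up tuning parameters.

```python
# Hypothetical tuning parameters for an emotion-driven speed controller.
POSITIVE_TARGET = 0.6    # desired level of positive emotion (0..1)
NEGATIVE_LIMIT = 0.4     # tolerated level of negative emotion (0..1)
SPEED_STEP = 0.1
MIN_SPEED, MAX_SPEED = 0.5, 3.0

def adjust_speed(speed, positive, negative):
    """Return a new game speed given current emotion estimates in [0, 1]."""
    if negative > NEGATIVE_LIMIT:
        speed -= SPEED_STEP   # ease off to reduce negative emotion
    elif positive < POSITIVE_TARGET:
        speed += SPEED_STEP   # push harder to keep positive emotion up
    return max(MIN_SPEED, min(MAX_SPEED, speed))
```

The second variant corresponds to the same loop with only the speed-increase branch; the abstract's closing observation suggests that combining both branches is where instability can creep in.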
Together, these tests show that a player's emotional experience depends heavily on how
well the game is tailored toward a particular emotion. Additionally, when creating a system
meant to interact with these emotions, it is easier to build a one-dimensional system that focuses
on one emotion (or range of emotions) than a more complex one, since a system that targets
multiple emotions at once begins to become unstable and can lead to undesirable gameplay effects.