Matching Items (4)
Item 137004
Description
Brain-computer interface (BCI) technology establishes communication between the brain and a computer, allowing users to control devices, machines, or virtual objects using their thoughts. This study investigates optimal conditions for learning to operate such an interface. It compares two biofeedback methods, which dictate the relationship between brain activity and the movement of a virtual ball in a target-hitting task. Preliminary results indicate that a method in which the position of the virtual object relates directly to the amplitude of brain signals is most conducive to success. In addition, this research explores learning in the context of neural signals during training with a BCI task. Specifically, it investigates whether subjects can adapt to parameters of the interface without guidance. The experiment prompts subjects to modulate brain signals spectrally, spatially, and temporally, as well as differentially, in order to discriminate between two different targets. However, subjects are given neither knowledge of these desired changes nor instruction on how to move the virtual ball. Preliminary analysis of signal trends suggests that some successful participants are able to adapt brain wave activity in certain pre-specified locations and frequency bands over time in order to achieve control. Future studies will further explore these phenomena, and these methods will inform future BCI projects, giving insight into the creation of more intuitive and reliable BCI technology.
Contributors: Lancaster, Jenessa Mae (Co-author) / Appavu, Brian (Co-author) / Wahnoun, Remy (Co-author, Committee member) / Helms Tillery, Stephen (Thesis director) / Barrett, The Honors College (Contributor) / Harrington Bioengineering Program (Contributor) / Department of Psychology (Contributor)
Created: 2014-05
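The "direct" biofeedback method described in the abstract above maps the amplitude of brain signals onto the position of the virtual ball. The actual mapping used in the study is not given here, but the following Python sketch illustrates one plausible form of it: band power from a single EEG channel in a pre-specified frequency band is normalized against a resting baseline and scaled linearly to a one-dimensional ball position. The sampling rate, frequency band, and gain below are assumptions for the example, not values from the study.

```python
# Illustrative sketch of a "direct" amplitude-to-position biofeedback mapping.
# All parameters (sampling rate, frequency band, gain) are assumptions for the
# example and are not taken from the study described above.
import numpy as np

FS = 256          # sampling rate in Hz (assumed)
BAND = (8, 12)    # pre-specified frequency band in Hz (assumed)

def band_power(eeg_window, fs=FS, band=BAND):
    """Mean spectral power of one EEG window within the chosen band."""
    freqs = np.fft.rfftfreq(len(eeg_window), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(eeg_window)) ** 2 / len(eeg_window)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

def ball_position(power, baseline, gain=1.0):
    """Scale band power relative to a resting baseline to a position in [-1, 1]."""
    return float(np.clip(gain * (power - baseline) / baseline, -1.0, 1.0))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    baseline = band_power(rng.standard_normal(FS))   # one second of resting "EEG"
    window = rng.standard_normal(FS)                  # one second of task "EEG"
    print("ball position:", ball_position(band_power(window), baseline))
```

Under a mapping of this kind, learning to hit a target amounts to learning to raise or suppress power in the chosen band, which is consistent with the abstract's observation that successful participants adapt activity in pre-specified locations and frequency bands over time.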
Item 136952
Description
Motor behavior is prone to variable conditions and deviates further in disorders affecting the nervous system. A combination of environmental and neural factors determines the amount of uncertainty. Although the influence of these factors on estimating endpoint positions has been examined, the role of limb configuration in endpoint variability has been mostly ignored. Characterizing the influence of arm configuration (i.e., intrinsic factors) would allow greater comprehension of sensorimotor integration and assist in interpreting exaggerated movement variability in patients. In this study, subjects were placed in a 3-D virtual reality environment and asked to move from a starting position to one of three targets in the frontal plane, with and without visual feedback of the moving limb. Alternating visual feedback across trials increased uncertainty between the planning and execution phases. The starting limb configurations, adducted and abducted, were varied in separate blocks; arm configurations were set up by rotating the arm about the shoulder-hand axis so that endpoint position was maintained. The investigation hypothesized that 1) patterns of endpoint variability would depend on the starting arm configuration and 2) any differences observed would be more apparent in conditions that withheld visual feedback. The results indicated that there were differences in endpoint variability between arm configurations in both visual conditions, but the differences increased when visual feedback was withheld. Overall, this suggests that in the presence of visual feedback, planning of movements in 3D space mostly uses coordinates that are independent of arm configuration. On the other hand, without visual feedback, planning of movements in 3D space relies substantially on intrinsic coordinates.
Contributors: Rahman, Qasim (Author) / Buneo, Christopher (Thesis director) / Helms Tillery, Stephen (Committee member) / Barrett, The Honors College (Contributor) / Harrington Bioengineering Program (Contributor)
Created: 2014-05
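The analysis in the abstract above contrasts endpoint variability across starting arm configurations and visual-feedback conditions. The study's own statistics are not reproduced here; the Python sketch below, using simulated endpoints, shows one common way such variability can be summarized: the 3-D covariance of the movement endpoints in each condition, reduced to a scalar generalized variance.

```python
# Illustrative summary of endpoint variability per condition. The endpoint data
# below are simulated placeholders; they are not the study's measurements.
import numpy as np

def generalized_variance(endpoints):
    """Determinant of the 3x3 endpoint covariance: a scalar measure of 3-D spread."""
    return float(np.linalg.det(np.cov(np.asarray(endpoints).T)))

rng = np.random.default_rng(1)
conditions = {
    ("adducted", "vision"):    rng.normal(0.0, 0.5, size=(40, 3)),
    ("adducted", "no vision"): rng.normal(0.0, 0.9, size=(40, 3)),
    ("abducted", "vision"):    rng.normal(0.0, 0.6, size=(40, 3)),
    ("abducted", "no vision"): rng.normal(0.0, 1.2, size=(40, 3)),
}

for (arm, vision), endpoints in conditions.items():
    print(f"{arm:9s} {vision:10s} generalized variance = "
          f"{generalized_variance(endpoints):.3f}")
```

Comparing these per-condition spreads between arm configurations, with and without vision, is the kind of contrast the hypotheses above call for.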
Description
Biofeedback music is the integration of physiological signals with audible sound for aesthetic purposes, in which an individual's mental state corresponds to musical output. This project looks into how sounds can be drawn from the meditative and attentive states of the brain using the MindWave Mobile EEG biosensor from NeuroSky. With the MindWave and an Arduino microcontroller, sonic output is attained by taking in the data collected by the MindWave and, in real time, running code that translates it into user-constructed sound output. The input is scaled from 0 to 100, measuring the 'attentive' state of the mind by observing alpha waves, and this information is passed to the microcontroller. Sound output is produced by routing this value to the Musical Instrument Shield and varying the musical tonality with different chords and different delays between notes. The manipulation of alpha states highlights the control, or lack thereof, available to the performer and touches on the question of how much control over the output there really is, much as the experimentalist Alvin Lucier explored with his concepts in brainwave music.
Contributors: Quach, Andrew Duc (Author) / Helms Tillery, Stephen (Thesis director) / Feisst, Sabine (Committee member) / Barrett, The Honors College (Contributor) / Herberger Institute for Design and the Arts (Contributor) / Harrington Bioengineering Program (Contributor)
Created: 2014-05
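The pipeline in the abstract above scales the MindWave's 0-100 attention value into musical parameters on the Arduino and Musical Instrument Shield. The exact chords and note delays the project used are not specified here, so the Python sketch below is only a hypothetical version of that mapping: the attention value selects a note from a small chord table and sets the delay between notes, with higher attention yielding higher, faster output.

```python
# Hypothetical attention-to-sound mapping in the spirit of the project above:
# a 0-100 attention value picks a note from a chord table and sets the note delay.
# The chord voicings and timing constants are illustrative, not from the project.

CHORDS = [
    [48, 52, 55],  # C major, low register (MIDI note numbers)
    [50, 53, 57],  # D minor
    [55, 59, 62],  # G major
    [60, 64, 67],  # C major, higher register
]

def attention_to_sound(attention):
    """Map a 0-100 attention value to a (MIDI note, delay in ms) pair."""
    attention = max(0, min(100, attention))
    chord = CHORDS[attention * len(CHORDS) // 101]   # higher attention -> higher chord
    note = chord[attention % len(chord)]             # pick a note within that chord
    delay_ms = 600 - 5 * attention                   # higher attention -> faster notes
    return note, delay_ms

if __name__ == "__main__":
    for level in (10, 45, 80, 100):
        print(level, attention_to_sound(level))
```

On the actual hardware, a mapping of this kind would run inside the Arduino loop, with the chosen note sent to the Musical Instrument Shield and the delay governing when the next note plays.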
Item 136933
Description
Motor behavior is prone to variable conditions and deviates further in disorders affecting the nervous system. A combination of environmental and neural factors determines the amount of uncertainty. Although the influence of these factors on estimating endpoint positions has been examined, the role of limb configuration in endpoint variability has been mostly ignored. Characterizing the influence of arm configuration (i.e., intrinsic factors) would allow greater comprehension of sensorimotor integration and assist in interpreting exaggerated movement variability in patients. In this study, subjects were placed in a 3-D virtual reality environment and asked to move from a starting position to one of three targets in the frontal plane, with and without visual feedback of the moving limb. Alternating visual feedback across trials increased uncertainty between the planning and execution phases. The starting limb configurations, adducted and abducted, were varied in separate blocks; arm configurations were set up by rotating the arm about the shoulder-hand axis so that endpoint position was maintained. The investigation hypothesized that 1) patterns of endpoint variability would depend on the starting arm configuration and 2) any differences observed would be more apparent in conditions that withheld visual feedback. The results indicated that there were differences in endpoint variability between arm configurations in both visual conditions, but the differences increased when visual feedback was withheld. Overall, this suggests that in the presence of visual feedback, planning of movements in 3D space mostly uses coordinates that are independent of arm configuration. On the other hand, without visual feedback, planning of movements in 3D space relies substantially on intrinsic coordinates.
Contributors: Rahman, Qasim (Author) / Buneo, Christopher (Thesis director) / Helms Tillery, Stephen (Committee member) / Barrett, The Honors College (Contributor) / Harrington Bioengineering Program (Contributor)
Created: 2014-05