Matching Items (57)
Description
Determining the characteristics of an object during a grasping task requires a combination of mechanoreceptors in the muscles and fingertips. The width of a person's finger aperture during the grasp may also affect how accurately that person determines hardness. These experiments aim to investigate how an individual perceives hardness across a gradient of varying hardness levels. The trend in the responses is assumed to follow a general psychometric function. This provides information about subjects' ability to differentiate between two very different objects, and about their tendency toward chance-level guessing when two similar objects are presented. After obtaining these data, it is then important to test varying finger apertures in an object-grasping task. This allows insight into the effect of aperture on the obtained psychometric function, ultimately providing information about tactile and haptic feedback for further application in neuroprosthetic devices. Three separate experiments were performed to test the effect of finger aperture on object hardness differentiation. The first experiment tested a one-finger pressing motion on a hardness gradient of ballistic gelatin cubes. Subjects were asked to compare the hardness of one cube to another, which produced the S-curve characteristic of a psychometric function. The second experiment used the Phantom haptic device in a similar setup, with a precision-grip grasping motion instead. This showed a more linear curve: the percentage reported harder increased as the hardness of the second presented cube increased, which was attributed to both the limitations of the experimental setup and the scale of the overall hardness gradient. The third experiment then tested the effect of three finger apertures in the same experimental setup. By providing three separate testing scenarios in the precision-grip task, the experiment demonstrated that finger aperture has no significant effect on an individual's ability to perceive hardness.
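As a rough illustration of the psychometric (S-shaped) function described above, the sketch below fits a logistic curve to hypothetical "proportion reported harder" data; the stimulus values, response proportions, and the specific logistic form are illustrative assumptions, not results from the thesis.

```python
# Minimal sketch: fitting a logistic psychometric function to hypothetical
# hardness-comparison responses (illustrative values, not data from the thesis).
import numpy as np
from scipy.optimize import curve_fit

def psychometric(x, alpha, beta):
    """Logistic psychometric function: alpha is the point of subjective
    equality, beta controls the slope (discrimination sensitivity)."""
    return 1.0 / (1.0 + np.exp(-(x - alpha) / beta))

# Hypothetical stimulus levels: hardness of the comparison cube relative to
# the reference, and the proportion of trials it was reported as harder.
hardness_diff = np.array([-3, -2, -1, 0, 1, 2, 3], dtype=float)
prop_harder = np.array([0.05, 0.15, 0.35, 0.50, 0.70, 0.90, 0.97])

(alpha, beta), _ = curve_fit(psychometric, hardness_diff, prop_harder, p0=[0.0, 1.0])
print(f"PSE = {alpha:.2f}, slope parameter = {beta:.2f}")

# A common summary taken from the fitted curve: the just-noticeable
# difference, here the distance from the 50% point to the 75% point.
jnd = beta * np.log(3.0)
print(f"Approximate JND = {jnd:.2f}")
```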
Contributors: Maestas, Gabrielle Elise (Author) / Helms Tillery, Stephen (Thesis director) / Tanner, Justin (Committee member) / Barrett, The Honors College (Contributor) / Harrington Bioengineering Program (Contributor)
Created: 2015-05
Description
Currently, assistive robots and prostheses have a difficult time giving and receiving objects to and from humans. While many attempts have been made to program handover scenarios into robotic control algorithms, the algorithms typically lack at least one important feature: intuitiveness, safety, or efficiency. By performing a study to better understand human-to-human handovers, we observe trends that could inspire controllers for object handovers with robots. Ten pairs of human subjects handed over a cellular-phone-shaped, instrumented object using a key pinch while 3D force and motion tracking data were recorded. It was observed that during handovers, humans apply a compressive force on the object and maintain a linear relationship between grip force and load force while both agents are grasping the object (referred to as the "mutual grasp period"). Results also suggested that object velocity during the mutual grasp period is driven by the receiver, while the duration of the mutual grasp period is driven by the preference of the slower agent involved in the handover. Ultimately, these findings will inspire the development of robotic handover controllers to advance seamless physical interaction between humans and robots.
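The sketch below illustrates the kind of linear grip-to-load force relationship described for the mutual grasp period, estimated by a least-squares fit; the force traces, sampling duration, and coefficients are synthetic assumptions, not the study's data.

```python
# Minimal sketch: estimating a linear grip-force-to-load-force relationship
# over a hypothetical mutual grasp period (synthetic traces, not study data).
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)                  # s, assumed mutual grasp duration
load_force = 2.0 + 1.5 * t                      # N, load gradually transferred
grip_force = 0.5 + 1.8 * load_force + 0.05 * rng.standard_normal(t.size)  # N

# Least-squares fit grip = slope * load + intercept; the slope is the kind of
# grip-to-load ratio referred to in the abstract.
slope, intercept = np.polyfit(load_force, grip_force, 1)
print(f"grip/load slope = {slope:.2f}, intercept = {intercept:.2f} N")
```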
Contributors: Chang, Eric (Author) / Helms Tillery, Stephen (Thesis director) / Santos, Veronica (Committee member) / Barrett, The Honors College (Contributor) / Mechanical and Aerospace Engineering Program (Contributor)
Created: 2015-05
Description
A previous study demonstrated that learning to lift an object is context-based and that, in the presence of both memory and visual cues, the sensorimotor memory acquired while manipulating an object in one context interferes with performance of the same task in the presence of visual information about a different context (Fu et al., 2012).
The purpose of this study is to determine whether the primary motor cortex (M1) plays a role in sensorimotor memory. It was hypothesized that temporary disruption of M1 after learning to minimize tilt of an 'L'-shaped object would negatively affect the retention of sensorimotor memory and thus reduce interference between the memory acquired in one context and the visual cues used to perform the same task in a different context.
Significant findings were shown in blocks 1, 2, and 4. In block 3, subjects displayed an insignificant amount of learning; however, it cannot be concluded that there is full interference in block 3. I therefore examined three effects in the statistical analysis: the main effect of block, the main effect of trial, and the block-by-trial interaction. The block effect had a p-value of 0.001 and the trial effect a p-value of less than 0.001; both indicate that learning occurred. The block-by-trial interaction had a p-value of 0.002 (< 0.05), indicating significant interaction between sensorimotor memories. Based on these results, interference is present in all blocks, but not enough to justify the use of TMS to reduce interference, because only a partial reduction of interference from the control experiment is observed. The time delay between context switches may be the issue. By reducing the time delay between blocks 2 and 3 from 10 minutes to 5 minutes, I hope to see significant learning occur from the first trial to the second.
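A minimal sketch of the kind of block-by-trial repeated-measures analysis summarized above is given below, using synthetic object-tilt data; the software choice (statsmodels), the column names, and the data values are assumptions for illustration, not the thesis's actual analysis pipeline.

```python
# Minimal sketch of a block x trial repeated-measures ANOVA of the kind
# summarized above, on synthetic tilt data (column names are hypothetical).
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(1)
rows = []
for subject in range(1, 11):
    for block in range(1, 5):
        for trial in range(1, 9):
            # Simulated peak object tilt that decreases across trials (learning)
            # and differs slightly between blocks (context effects).
            tilt = 10.0 - 0.8 * trial + 0.5 * block + rng.normal(0.0, 1.0)
            rows.append({"subject": subject, "block": block,
                         "trial": trial, "tilt": tilt})
data = pd.DataFrame(rows)

# Reports F and p-values for the block effect, the trial effect, and the
# block-by-trial interaction, mirroring the three effects discussed above.
result = AnovaRM(data, depvar="tilt", subject="subject",
                 within=["block", "trial"]).fit()
print(result)
```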
Contributors: Hasan, Salman Bashir (Author) / Santello, Marco (Thesis director) / Kleim, Jeffrey (Committee member) / Helms Tillery, Stephen (Committee member) / Barrett, The Honors College (Contributor) / W. P. Carey School of Business (Contributor) / Harrington Bioengineering Program (Contributor)
Created: 2014-05
Description
Electroencephalography (EEG) used simultaneously with video monitoring can record detailed patient physiology during a seizure to aid diagnosis. However, current patient monitoring systems typically require a patient to stay in view of a fixed camera, limiting their freedom of movement. The goal of this project is to design an automatic patient monitoring system with software that tracks patient movement in order to increase a patient's mobility. This report discusses the impact of an automatic patient monitoring system and the design steps used to create and test a functional prototype.
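As one plausible building block for this kind of tracking software (the thesis's actual design is not specified here), the sketch below localizes a moving patient in a video feed with OpenCV background subtraction; the camera index and thresholds are assumptions.

```python
# Minimal sketch: tracking a moving patient in a video feed with background
# subtraction, one plausible component of an automatic monitoring system
# (camera index and thresholds are assumptions).
import cv2

cap = cv2.VideoCapture(0)                      # hypothetical camera index
subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=32)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)
    # Take the largest moving contour as the patient.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        # The box center could drive a pan/tilt command to keep the patient in view.
    cv2.imshow("monitor", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```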
Contributors: Bui, Robert Truong (Author) / Frakes, David (Thesis director) / Helms Tillery, Stephen (Committee member) / Barrett, The Honors College (Contributor) / Electrical Engineering Program (Contributor)
Created: 2014-05
Description
Ultrasound is a sound wave that produces acoustic pressure and is most commonly known as a noninvasive technique for bodily imaging. However, high-intensity focused ultrasound (HIFU) can also be used for noninvasive therapy. An example of this is the treatment of tumors in the kidneys, as the sound waves of HIFU interact with tissues in the body. For this thesis, the necessary parameters for ultrasonic stimulation of the central nervous system in rats were characterized.
Contributors: Hughes, Brett William (Co-author) / Castel, Nikki (Co-author) / Hillen, Brian (Thesis director) / Helms Tillery, Stephen (Committee member) / Lozano, Cecil (Committee member) / Barrett, The Honors College (Contributor) / Department of Chemistry and Biochemistry (Contributor)
Created: 2013-05
Description
Brain-computer interface (BCI) technology establishes communication between the brain and a computer, allowing users to control devices, machines, or virtual objects using their thoughts. This study investigates optimal conditions for learning to operate such an interface. It compares two biofeedback methods, which dictate the relationship between brain activity and the movement of a virtual ball in a target-hitting task. Preliminary results indicate that a method in which the position of the virtual object relates directly to the amplitude of brain signals is most conducive to success. In addition, this research explores learning in the context of neural signals during training with a BCI task. Specifically, it investigates whether subjects can adapt to parameters of the interface without guidance. The experiment requires subjects to modulate brain signals spectrally, spatially, and temporally, as well as differentially in order to discriminate between two targets. However, subjects are given no knowledge of these desired changes, nor any instruction on how to move the virtual ball. Preliminary analysis of signal trends suggests that some successful participants are able to adapt brain-wave activity in certain pre-specified locations and frequency bands over time in order to achieve control. Future studies will further explore these phenomena, and future BCI projects will be informed by these methods, giving insight into the creation of more intuitive and reliable BCI technology.
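To make the "signal amplitude maps directly to position" biofeedback idea concrete, the sketch below rescales alpha-band power from a synthetic EEG segment into a vertical ball position; the frequency band, sampling rate, calibration bounds, and signal are illustrative assumptions rather than the study's actual parameters.

```python
# Minimal sketch of the amplitude-to-position biofeedback scheme: alpha-band
# power from a synthetic EEG segment is rescaled to a 0-1 ball position.
import numpy as np
from scipy.signal import welch

fs = 256                                         # Hz, assumed sampling rate
t = np.arange(0, 2.0, 1.0 / fs)
eeg = 10e-6 * np.sin(2 * np.pi * 10 * t) \
      + 5e-6 * np.random.default_rng(3).standard_normal(t.size)

freqs, psd = welch(eeg, fs=fs, nperseg=fs)
band = (freqs >= 8) & (freqs <= 13)              # alpha band
alpha_power = np.sum(psd[band]) * (freqs[1] - freqs[0])

# Map band power onto a 0-1 vertical position using fixed calibration bounds
# (assumed to come from a calibration run).
power_min, power_max = 1e-12, 1e-10              # V^2
ball_y = float(np.clip((alpha_power - power_min) / (power_max - power_min), 0.0, 1.0))
print(f"alpha power = {alpha_power:.2e}, ball position = {ball_y:.2f}")
```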
Contributors: Lancaster, Jenessa Mae (Co-author) / Appavu, Brian (Co-author) / Wahnoun, Remy (Co-author, Committee member) / Helms Tillery, Stephen (Thesis director) / Barrett, The Honors College (Contributor) / Harrington Bioengineering Program (Contributor) / Department of Psychology (Contributor)
Created: 2014-05
Description
Motor behavior is prone to variable conditions and deviates further in disorders affecting the nervous system. A combination of environmental and neural factors impacts the amount of uncertainty. Although the influence of these factors on estimating endpoint positions has been examined, the role of limb configuration in endpoint variability has been mostly ignored. Characterizing the influence of arm configuration (i.e., intrinsic factors) would allow greater comprehension of sensorimotor integration and assist in interpreting exaggerated movement variability in patients. In this study, subjects were placed in a 3-D virtual reality environment and were asked to move from a starting position to one of three targets in the frontal plane, with and without visual feedback of the moving limb. Alternating visual feedback across trials increased uncertainty between the planning and execution phases. The starting limb configurations, adducted and abducted, were varied in separate blocks; arm configurations were set up by rotating the arm about the shoulder-hand axis so that endpoint position was maintained. The investigation hypothesized that: 1) patterns of endpoint variability would depend on the starting arm configuration, and 2) any observed differences would be more apparent in conditions that withheld visual feedback. The results indicated that there were differences in endpoint variability between arm configurations in both visual conditions, but the differences in variability increased when visual feedback was withheld. Overall, this suggests that in the presence of visual feedback, planning of movements in 3D space mostly uses coordinates that are independent of arm configuration. Without visual feedback, on the other hand, planning of movements in 3D space relies substantially on intrinsic coordinates.
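One common way to quantify the 3-D endpoint variability compared across conditions in studies like this is the covariance of reach endpoints, summarized by the volume and orientation of a variability ellipsoid; the sketch below does this on synthetic endpoints, and the target location and spread values are assumptions, not the study's data.

```python
# Minimal sketch: summarizing 3-D endpoint variability for one condition
# (e.g., abducted arm, no visual feedback) from the covariance of reach
# endpoints. Data here are synthetic, not the study's.
import numpy as np

rng = np.random.default_rng(2)
target = np.array([0.30, 0.10, 0.45])                    # m, hypothetical target
endpoints = target + rng.multivariate_normal(
    mean=np.zeros(3),
    cov=np.diag([4e-4, 1e-4, 9e-4]),                     # m^2, assumed spread
    size=30)

cov = np.cov(endpoints, rowvar=False)                    # 3x3 endpoint covariance
eigvals, eigvecs = np.linalg.eigh(cov)

# Volume of the 1-SD variability ellipsoid and orientation of its long axis,
# two summaries that could be compared across arm configurations and
# visual-feedback conditions.
volume = 4.0 / 3.0 * np.pi * np.prod(np.sqrt(eigvals))
major_axis = eigvecs[:, np.argmax(eigvals)]
print(f"ellipsoid volume = {volume:.2e} m^3, major axis = {np.round(major_axis, 2)}")
```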
Contributors: Rahman, Qasim (Author) / Buneo, Christopher (Thesis director) / Helms Tillery, Stephen (Committee member) / Barrett, The Honors College (Contributor) / Harrington Bioengineering Program (Contributor)
Created: 2014-05
Description
The goal of this project was to use the sense of touch to investigate tactile cues during multidigit rotational manipulations of objects. A robotic arm and hand equipped with three multimodal tactile sensors were used to gather data about skin deformation during rotation of a haptic knob. Three different rotation speeds and two levels of rotation resistance were used to investigate tactile cues during knob rotation. In the future, this multidigit task can be generalized to similar rotational tasks, such as opening a bottle or turning a doorknob.
Contributors: Challa, Santhi Priya (Author) / Santos, Veronica (Thesis director) / Helms Tillery, Stephen (Committee member) / Barrett, The Honors College (Contributor) / Mechanical and Aerospace Engineering Program (Contributor) / School of Earth and Space Exploration (Contributor)
Created: 2014-05
Description
Biofeedback music is the integration of physiological signals with audible sound for aesthetic purposes, in which an individual's mental state corresponds to musical output. This project looks into how sounds can be drawn from the meditative and attentive states of the brain using the MindWave Mobile EEG biosensor from NeuroSky. With the MindWave and an Arduino microcontroller, sonic output is attained by taking in the data collected by the MindWave and, in real time, running code that translates it into user-constructed sound output. The input, scaled from 0 to 100, measures the ‘attentive’ state of the mind by observing alpha waves, and this information is passed to the microcontroller. Sound output is produced by routing this data into the Musical Instrument Shield and varying the musical tonality with different chords and note delays. The manipulation of alpha states highlights the control, or lack thereof, available to the performer and touches on the question of how much control over the output there really is, much like the experimentalist Alvin Lucier explored with his concepts in brainwave music.
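The sketch below illustrates the mapping logic described above, translating a 0-100 attention value into a note and a note delay; the actual project ran this kind of logic on an Arduino with the Musical Instrument Shield, so the scale, note numbers, and timing values here are purely illustrative assumptions.

```python
# Minimal sketch of the described mapping: a 0-100 "attention" value is
# translated into a MIDI note from a fixed scale and a note delay.
C_MAJOR_PENTATONIC = [60, 62, 64, 67, 69, 72]   # MIDI note numbers (assumed scale)

def attention_to_note(attention: int) -> tuple[int, float]:
    """Map a NeuroSky-style attention value (0-100) to (MIDI note, delay in s)."""
    attention = max(0, min(100, attention))
    index = attention * (len(C_MAJOR_PENTATONIC) - 1) // 100
    note = C_MAJOR_PENTATONIC[index]
    delay = 1.0 - 0.008 * attention              # higher attention -> faster notes
    return note, delay

for value in (10, 50, 90):
    print(value, attention_to_note(value))
```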
Contributors: Quach, Andrew Duc (Author) / Helms Tillery, Stephen (Thesis director) / Feisst, Sabine (Committee member) / Barrett, The Honors College (Contributor) / Herberger Institute for Design and the Arts (Contributor) / Harrington Bioengineering Program (Contributor)
Created: 2014-05