Protein- and gene-circuit-level synthetic bioengineering can require years to develop a single target. Phage-assisted continuous evolution (PACE) is a powerful new tool for rapidly engineering new genes and proteins, but the method requires an automated cell culture system, making it inaccessible to non-industrial research programs. Complex protein functions, such as specific binding, require similarly dynamic PACE selection that can be alternately induced or suppressed with heat-labile chemicals such as tetracycline. Selection conditions must be controlled continuously over days, with adjustments made every few minutes. To make PACE experiments accessible to the broader community, we designed dedicated cell culture hardware and integrated optogenetically controlled plasmids. The low-cost, open-source platform allows a user to conduct PACE with continuous monitoring and precise control of evolution using light.
Following a 1991 study indicating that kinesthetic information affects visual processing when moving an arm in extrapersonal space (Helms Tillery et al.) [1], this research suggests that utilizing virtual-reality (VR) technology will lead to faster and more accurate data acquisition. Previous methods for conducting such research used ultrasonic systems of ultrasound emitters and microphones to track distance from the speed of sound; this made the experimentation process long and the spatial data difficult to synthesize. The purpose of this paper is to present the progress I have made in capturing spatial data using VR technology to enhance previous research in the field of neuroscience. The experimental setup was built around the Oculus Quest 2 VR headset and its hand controllers. The experiment simulation was created with the Unity game engine, which was used to build an interactive 3D VR world for the Oculus. The resulting simulation allows the user to interact with a ball in the VR environment without seeing their own body. The simulation can be used in combination with real-time motion capture cameras to record live spatial data during trials, although spatial data from the VR environment itself has not yet been collected.