Matching Items (1,517)
Filtering by
- All Subjects: Computer Science
- All Subjects: Mechanical Engineering

Typically, the complete loss or severe impairment of a sense such as vision or hearing is compensated for through sensory substitution, i.e., the use of an alternative sense to receive the same information. For individuals who are blind or visually impaired, the alternative senses have predominantly been hearing and touch. In movies, visual content has been made accessible to visually impaired viewers through audio description, an additional narration that describes scenes, the characters involved, and other pertinent details. However, because audio descriptions should not overlap with dialogue, sound effects, or musical scores, there is limited time to convey information, often resulting in stunted and abridged descriptions that leave out many important visual cues and concepts. This work proposes a multimodal approach to sensory substitution for movies that provides complementary information through haptics, pertaining to the positions and movements of actors, in addition to a film's audio description and audio content. In a ten-minute presentation of five movie clips to ten individuals who were visually impaired or blind, the proposed method was found to provide a nearly twofold increase in the perception of actors' movements in scenes. Moreover, participants appreciated the overall concept of providing a visual perspective on film through haptics and found it useful.

Reverse engineering gene regulatory networks (GRNs) is an important problem in systems biology. Learning GRNs is challenging due to the inherent complexity of real regulatory networks and the heterogeneity of samples in available biomedical data. Real-world biological data are commonly collected from broad surveys (profiling studies) and aggregate highly heterogeneous biological samples. Popular methods for learning GRNs simplistically assume a single universal regulatory network corresponding to the available data, neglecting regulatory network adaptation due to changes in underlying conditions, cellular phenotype, or both. This dissertation presents a novel computational framework for learning the common regulatory interactions and networks underlying different sets of relatively homogeneous samples in real-world biological data. The characteristic set of samples/conditions and the corresponding regulatory interactions define the cellular context; context, in this dissertation, represents the deterministic transcriptional activity within a specific cellular regulatory mechanism. The major contributions of this framework include: modeling and learning context-specific GRNs; associating enriched samples with contexts to interpret contextual interactions using biological knowledge; pruning extraneous edges from the context-specific GRNs to improve the precision of the final GRNs; integrating multi-source data to learn inter- and intra-domain interactions and increase confidence in the obtained GRNs; and learning combinatorial conditioning factors from the data to identify regulatory cofactors. The framework, Expattern, was applied to both real-world and synthetic data. Analysis of NCI60 drug activity and gene expression data yielded interesting insights into drug mechanisms of action. Application to refractory cancer and Glioblastoma multiforme data yielded GRNs that were readily annotated with context-specific phenotypic information. The refractory cancer GRNs also displayed associations between distinct cancers that were not observed through clustering alone. Performance comparisons on multi-context synthetic data show that Expattern performs better than other comparable methods.
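
The abstract does not give Expattern's algorithmic details, but the general idea of context-specific network learning can be illustrated with a minimal sketch: partition samples into relatively homogeneous groups and estimate a separate interaction network for each group. The clustering choice (KMeans) and the network estimator (graphical lasso) below are illustrative stand-ins, not the methods used by Expattern.

```python
# Minimal sketch of context-specific network learning (illustrative only;
# KMeans + graphical lasso are stand-ins for, not parts of, Expattern).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.covariance import GraphicalLassoCV

def learn_context_networks(expression, n_contexts=3):
    """expression: (n_samples, n_genes) matrix of expression values."""
    # Group samples into putative cellular contexts.
    contexts = KMeans(n_clusters=n_contexts, n_init=10).fit_predict(expression)
    networks = {}
    for c in range(n_contexts):
        samples = expression[contexts == c]
        # Sparse inverse covariance as a simple proxy for regulatory interactions.
        model = GraphicalLassoCV().fit(samples)
        networks[c] = (np.abs(model.precision_) > 1e-3).astype(int)
    return contexts, networks
```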

This thesis focuses on the continued extension, validation, and application of combined thermal-structural reduced order models for nonlinear geometric problems. The first part of the thesis addresses the determination of the temperature distribution and structural response induced by an oscillating flux on the top surface of a flat panel. This flux is introduced as a simplified representation of the thermal effects of an oscillating shock on a panel of a supersonic/hypersonic vehicle. Accordingly, a random acoustic excitation is also considered to act on the panel, and the level of the thermo-acoustic excitation is assumed to be large enough to induce a nonlinear geometric response of the panel. Both the temperature distribution and the structural response are determined using recently proposed reduced order models, and a complete one-way thermal-structural coupling is enforced. A steady-state analysis of the thermal problem is first carried out; its results are then used in the governing equations of the structural reduced order model, with and without the acoustic excitation. A detailed validation of the reduced order models is carried out by comparison with a few full finite element (Nastran) computations. The computational expedience of the reduced order models permits a detailed parametric study of the response as a function of the frequency of the oscillating flux. The corresponding structural ROM equations are seen to be of Mathieu type with a Duffing nonlinearity (originating from the nonlinear geometric effects) and an external harmonic excitation (associated with the thermal moment terms on the panel). A dominant resonance is observed and explained. The second part of the thesis extends the formulation of the combined thermal-structural reduced order modeling method to include temperature-dependent structural properties, specifically the elasticity tensor and the coefficient of thermal expansion. These properties were assumed to vary linearly with the local temperature, and it was found that the linear stiffness coefficients and the "thermal moment" terms then become cubic functions of the temperature generalized coordinates, while the quadratic and cubic stiffness coefficients are only linear functions of these coordinates. A first validation of this reduced order modeling strategy was successfully carried out.
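
The abstract characterizes the structural ROM equations as Mathieu type with a Duffing nonlinearity and an external harmonic excitation. A schematic single-mode form consistent with that description (not the thesis's actual multi-mode ROM equations) is:

```latex
\ddot{q} + 2\zeta\omega_0\,\dot{q}
  + \omega_0^{2}\bigl[1 + \varepsilon\cos(\Omega t)\bigr]\,q
  + \gamma\,q^{3}
  = F_{\mathrm{th}}\cos(\Omega t) + f_{\mathrm{ac}}(t),
```

where the parametric (Mathieu) term \(\varepsilon\cos(\Omega t)\) represents the stiffness modulation induced by the oscillating flux, \(\gamma q^{3}\) the Duffing nonlinearity arising from the nonlinear geometric effects, \(F_{\mathrm{th}}\cos(\Omega t)\) the thermal-moment forcing at the flux frequency \(\Omega\), and \(f_{\mathrm{ac}}(t)\) the random acoustic excitation.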

Damage assessment and residual useful life estimation (RULE) are essential for aerospace, civil, and naval structures. Structural Health Monitoring (SHM) attempts to automate the process of damage detection and identification. Multiscale modeling is a key element of SHM: it not only provides important information on the physics of failure, such as damage initiation and growth, but its output can also be used as "virtual sensing" data for detection and prognosis. The current research is part of an ongoing multidisciplinary effort to develop an integrated SHM framework for metallic aerospace components. In this thesis, a multiscale model has been developed by bridging the relevant length scales: micro, meso, and macro (or structural) scale. Microstructural representations obtained from material characterization studies are used to define the length scales and to capture the size and orientation of the grains at the micro level. Parametric studies are conducted to estimate the material parameters used in the constitutive model. Numerical simulations and experiments are performed to investigate the effects of Representative Volume Element (RVE) size, defect area fraction, and defect distribution. A multiscale damage criterion accounting for crystal orientation effects is developed and applied to predict the initial stage of fatigue cracking. A damage evolution rule based on strain energy density is modified to incorporate crystal plasticity at the microscale (local level). Optimization approaches are used to calculate a global damage index, which is used to predict RVE failure; potential cracking directions are simultaneously provided by the damage criterion. A wave propagation model is coupled with the damage model to detect changes in sensing signals due to plastic deformation and damage growth.
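
The abstract does not state the modified evolution rule explicitly; one plausible schematic form of a strain-energy-density-based local damage measure with an orientation-dependent critical value, and of a weighted global index (the weights \(w_g\) standing in for the optimization step), is:

```latex
d_g \;=\; \frac{\sum_{n} \Delta W^{p}_{g,(n)}}{W_{c}(\theta_g)},
\qquad
D_{\mathrm{global}} \;=\; \sum_{g=1}^{N_{\mathrm{grains}}} w_g\, d_g,
```

where \(\Delta W^{p}_{g,(n)}\) is the plastic strain energy density accumulated in grain \(g\) during cycle \(n\), \(W_{c}(\theta_g)\) a critical value depending on the crystal orientation \(\theta_g\), and RVE failure is flagged when \(D_{\mathrm{global}}\) reaches a threshold. This is only an illustrative form, not the thesis's actual criterion.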

Given the process of tumorigenesis, biological signaling pathways have become of interest in the field of oncology. Many of the regulatory mechanisms that are altered in cancer are directly related to signal transduction and cellular communication. Thus, identifying signaling pathways that have become deregulated may provide useful information for better understanding the altered regulatory mechanisms within cancer. Many methods created to measure the distinct activity of signaling pathways have relied strictly upon transcription profiles. With advancements in comparative genomic hybridization techniques, copy number data have become extremely useful, providing valuable information about the genomic landscape of cancer. The purpose of this thesis is to develop a methodology that incorporates both gene expression and copy number data to identify signaling pathways that have become deregulated in cancer. The central idea is that copy number data may significantly assist in identifying signaling pathway deregulation by corroborating the aberrant activity measured in gene expression profiles. The method was then applied to four subtypes of breast cancer, resulting in the identification of signaling pathways associated with distinct functionalities for each subtype.
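
A minimal sketch of the central idea is to combine expression and copy number evidence into a pathway-level deregulation score; the z-score aggregation and weighting below are illustrative assumptions, not the specific statistic developed in the thesis.

```python
# Illustrative pathway deregulation score combining expression and copy number.
# The aggregation scheme is a placeholder, not the thesis's actual method.
import numpy as np

def pathway_deregulation(expr_z, cn_logratio, pathway_genes, alpha=0.5):
    """expr_z: gene -> tumor-vs-normal expression z-score.
    cn_logratio: gene -> copy-number log-ratio.
    pathway_genes: genes in the signaling pathway.
    alpha: weight on expression evidence vs. copy number evidence."""
    scores = []
    for g in pathway_genes:
        e = expr_z.get(g, 0.0)
        c = cn_logratio.get(g, 0.0)
        # Copy number change in the same direction as expression strengthens the evidence.
        scores.append(alpha * abs(e) + (1 - alpha) * abs(c) * np.sign(e * c))
    return float(np.mean(scores)) if scores else 0.0
```
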
Buildings in the United States account for over 68 percent of the electricity consumed, 39 percent of total energy use, and 38 percent of carbon dioxide emissions. By the year 2035, about 75 percent of the U.S. building sector will be either new or renovated. The energy efficiency requirements of current building codes would therefore have a significant impact on future energy use; hence, one of the most widely accepted strategies for slowing and ultimately reversing the growth of GHG emissions is the stringent adoption of building energy codes. A large number of building energy codes exist, along with many studies stating the energy savings achievable through code compliance. However, most codes are difficult to comprehend and require an extensive understanding of the code, its compliance paths, and all mandatory and prescriptive requirements, as well as a strategy for converting these into energy model inputs. This paper simplifies the entire process by providing an easy-to-use interface for code compliance and energy simulation through a spreadsheet-based tool, ECCO, the Energy Code COmpliance Tool. The tool provides a platform for a more detailed analysis of building codes as they apply to individual buildings in each climate zone. It also facilitates quick building energy simulation to determine the energy savings achieved through code compliance. This process is highly beneficial not only for code compliance but also for identifying parameters that can be improved for energy efficiency. Code compliance is simplified through a series of parametric runs that generate the minimally compliant baseline building and a 30-percent-beyond-code building. The tool is intended as an effective solution for architects and engineers performing initial analyses, as well as for jurisdictions as a front-end diagnostic check for code compliance.
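
The parametric-run idea can be sketched as a loop that overrides the proposed design's inputs with code-prescribed values to form the minimally compliant baseline, then compares simulated energy use. The parameter names, values, and the run_simulation hook below are hypothetical placeholders, not ECCO's actual spreadsheet fields or engine.

```python
# Hypothetical sketch of the baseline-generation step behind a code-compliance tool.
# Parameter names, values, and run_simulation() are placeholders, not ECCO's interface.

CODE_MINIMUMS = {                     # prescriptive requirements for one climate zone (illustrative values)
    "wall_u_value": 0.064,            # Btu/h-ft2-F
    "roof_u_value": 0.048,
    "window_shgc": 0.40,
    "lighting_power_density": 0.90,   # W/ft2
}

def make_baseline(design_inputs):
    """Replace each proposed value with the code minimum to form the baseline model."""
    baseline = dict(design_inputs)
    baseline.update(CODE_MINIMUMS)
    return baseline

def percent_savings(design_inputs, run_simulation):
    """Energy savings of the proposed design relative to the minimally compliant baseline."""
    baseline_use = run_simulation(make_baseline(design_inputs))
    proposed_use = run_simulation(design_inputs)
    return 100.0 * (baseline_use - proposed_use) / baseline_use
```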

A method of determining nanoparticle temperature from fluorescence intensity levels is described. Intracellular processes are often tracked through fluorescence tagging, and the ideal temperatures for many of these processes are unknown. Through fluorescence-based thermometry, cellular processes such as intracellular enzyme movement can be studied and their respective temperatures established simultaneously. Polystyrene and silica nanoparticles are synthesized with a variety of temperature-sensitive dyes, including BODIPY, rose Bengal, Rhodamine 6G, Rhodamine 700, Rhodamine 800, Nile Blue A, and Nile Red. Photographs are taken with a QImaging QM1 Questar EXi Retiga camera while the particles are heated from 25 to 70 °C and excited at 532 nm with a Coherent DPSS-532 laser. The photographs are converted to intensity images in MATLAB and analyzed for fluorescence intensity, and plots of each dye's intensity versus temperature are generated in MATLAB. Regression curves are created to describe the change in fluorescence intensity with temperature. Dyes are compared as the nanoparticle core material is varied. Large particles are also created to match the camera's optical resolution capabilities, and it is established that intensity values increase proportionally with nanoparticle size. Nile Red yielded the closest-fit model, with R² values greater than 0.99 for a second-order polynomial fit. By contrast, Rhodamine 6G yielded an R² value of only 0.88 for a third-order polynomial fit, making it the least reliable dye for temperature measurements using the polynomial model. Of particular interest in this work is Nile Blue A, whose fluorescence-temperature curve had a markedly different shape from those of the other dyes. It is recommended that future work examine a broader range of dyes and nanoparticle sizes and use multiple excitation wavelengths to better quantify each dye's quantum efficiency. Further research into the effects of nanoparticle size on fluorescence intensity should also be considered, as the particles used here greatly exceed 2 µm. In addition, Nile Blue A should be investigated further to determine why its fluorescence-temperature curve did not take on the shape characteristic of a temperature-sensitive dye in these experiments.
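
The regression step described above amounts to fitting a low-order polynomial to measured intensity versus temperature and reporting R². The short sketch below uses NumPy as an illustrative substitute for the MATLAB workflow described in the abstract, and the example arrays are placeholder values, not measurements from the thesis.

```python
# Illustrative intensity-vs-temperature calibration fit (stand-in for the MATLAB workflow).
import numpy as np

def fit_intensity_curve(temperature_C, intensity, degree=2):
    """Fit a polynomial calibration curve and return (coefficients, R^2)."""
    coeffs = np.polyfit(temperature_C, intensity, degree)
    predicted = np.polyval(coeffs, temperature_C)
    ss_res = np.sum((intensity - predicted) ** 2)
    ss_tot = np.sum((intensity - np.mean(intensity)) ** 2)
    return coeffs, 1.0 - ss_res / ss_tot

# Example with placeholder data (not measurements from the thesis):
temps = np.array([25.0, 35.0, 45.0, 55.0, 65.0, 70.0])
signal = np.array([1.00, 0.93, 0.84, 0.73, 0.61, 0.55])  # normalized intensity
coeffs, r2 = fit_intensity_curve(temps, signal)
```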

Navigating non-linear structures is a challenge for all users when the space is large, but the problem is most pronounced when the users are blind or visually impaired. Such users access digital content through screen readers like JAWS, which read out the text on the screen. However, presenting non-linear narratives in this manner, without visual cues or information about spatial dependencies, is very inefficient for these users. The NSDL Science Literacy StrandMaps are visual layouts that help students and teachers browse educational resources. A Strandmap shows relationships between concepts and how they build upon one another across grade levels. NSDL Strandmaps are non-linear narratives that need to be presented effectively to users who are blind. A good summary of a Strandmap can give users an idea of the concepts it explains, helping them decide whether or not to view the map. In addition, a preview-based navigation mechanism can help users decide which direction to take, based on a preview of the upcoming content in each direction. Given a non-linear narrative like a Strandmap, which has both text and structure, and a word limit w, the goal of this thesis is to find the best way to create its summary. The following approaches are considered: a purely text-based approach using a multi-document text summarizer; a purely structure-based approach using PageRank; and approaches combining both text and structure, namely a CUTS-based approach (topic segmentation) and PageRank with Content. Since no reference summaries for such structures were available, user studies were conducted to evaluate these algorithms. The PageRank with Content approach performed best. Another important conclusion was that text and structure are intertwined in a Strandmap by design.
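
The best-performing approach (PageRank with Content) can be sketched as biasing PageRank on the Strandmap graph with content-derived node weights and then taking the text of the top-ranked concepts up to the word limit w. The weighting scheme below (term-frequency scores used as the personalization vector) is an illustrative assumption rather than the exact formulation in the thesis.

```python
# Illustrative sketch of "PageRank with Content" summarization on a Strandmap-like graph.
# The content weighting is a simple term-frequency score, not the thesis's exact method.
import networkx as nx
from collections import Counter

def summarize(graph, node_text, topic_terms, word_limit):
    """graph: nx.DiGraph of concept nodes; node_text: node -> concept/benchmark text."""
    # Content score: how often topic terms appear in each node's text.
    weights = {n: 1 + sum(Counter(node_text[n].lower().split())[t] for t in topic_terms)
               for n in graph}
    ranks = nx.pagerank(graph, personalization=weights)
    summary, words = [], 0
    for node in sorted(ranks, key=ranks.get, reverse=True):
        text = node_text[node]
        if words + len(text.split()) > word_limit:
            break
        summary.append(text)
        words += len(text.split())
    return " ".join(summary)
```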

To catch the smartest criminals in the world, digital forensics examiners need a means of collaborating and sharing information with each other and with outside experts that is not prohibitively difficult. However, standard operating procedures and the rules of evidence generally disallow the use of the collaboration software and techniques that are currently available, because they do not fully adhere to the dictated procedures for the handling, analysis, and disclosure of items relating to cases. The aim of this work is to conceive and design a framework that provides a completely new architecture that 1) can perform fundamental functions that are common and necessary to forensic analyses, and 2) is structured such that it is possible to include collaboration-facilitating components without changing the way users interact with the system sans collaboration. This framework is called the Collaborative Forensic Framework (CUFF). CUFF is constructed from four main components: Cuff Link, Storage, Web Interface, and Analysis Block. With the Cuff Link acting as a mediator between components, CUFF is flexible in both its method of deployment and the technologies used in its implementation. The details of a realization of CUFF are given, which uses a combination of Java, the Google Web Toolkit, Django with Apache for a RESTful web service, and an Ubuntu Enterprise Cloud using Eucalyptus. The functionality of CUFF's components is demonstrated by the integration of an acquisition script designed for Android OS-based mobile devices that use the YAFFS2 file system. While this work has obvious application to examination labs that work under the mandate of judicial or investigative bodies, security officers at any organization would benefit from the improved ability to cooperate in electronic discovery efforts and internal investigations.

A new method of adaptive mesh generation for the computation of fluid flows is investigated. The method utilizes gradients of the flow solution to adapt the size and stretching of elements or volumes in the computational mesh as is commonly done in the conventional Hessian approach. However, in the new method, higher-order gradients are used in place of the Hessian. The method is applied to the finite element solution of the incompressible Navier-Stokes equations on model problems. Results indicate that a significant efficiency benefit is realized.
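
For reference, the conventional Hessian approach mentioned above sizes and stretches mesh elements using a metric built from the second derivatives of a solution variable u; the method investigated here replaces the Hessian with higher-order gradients. A schematic form of the standard Hessian metric (the higher-order generalization itself is not reproduced in the abstract) is:

```latex
\mathcal{M}(\mathbf{x}) \;=\; \lvert H(u) \rvert \;=\; R\,\lvert \Lambda \rvert\,R^{\mathsf{T}},
\qquad
h_i(\mathbf{x}) \;\propto\; \lvert \lambda_i \rvert^{-1/2},
```

where \(H(u)\) is the Hessian of \(u\), \(R\) and \(\Lambda\) its eigenvector and eigenvalue matrices, \(\lambda_i\) the i-th eigenvalue, and \(h_i\) the local element size along the corresponding eigenvector direction.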