Matching Items (67)
Description

This study investigates the presence of dynamical patterns of interpersonal coordination in extended deceptive conversations across multimodal channels of behavior. Using a novel "devil’s advocate" paradigm, we experimentally elicited deception and truth across topics in which conversational partners either agreed or disagreed, and where one partner was surreptitiously asked to argue an opinion opposite of what he or she really believed. We focus on interpersonal coordination as an emergent behavioral signal that captures interdependencies between conversational partners, both as the coupling of head movements over the span of milliseconds, measured via a windowed lagged cross correlation (WLCC) technique, and more global temporal dependencies across speech rate, using cross recurrence quantification analysis (CRQA). Moreover, we considered how interpersonal coordination might be shaped by strategic, adaptive conversational goals associated with deception. We found that deceptive conversations displayed more structured speech rate and higher head movement coordination, the latter with a peak in deceptive disagreement conversations. Together the results allow us to posit an adaptive account, whereby interpersonal coordination is not beholden to any single functional explanation, but can strategically adapt to diverse conversational demands.
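
As a hedged illustration of the head-movement analysis described above, the sketch below computes a windowed lagged cross-correlation surface for two movement time series; the window length, step, and maximum lag are illustrative assumptions, not values from the study.

```python
import numpy as np

def wlcc(x, y, window=120, step=30, max_lag=30):
    """Windowed lagged cross-correlation: one Pearson r per (window, lag) pair."""
    lags = np.arange(-max_lag, max_lag + 1)
    rows = []
    for start in range(max_lag, len(x) - window - max_lag, step):
        a = x[start:start + window]
        rows.append([np.corrcoef(a, y[start + lag:start + lag + window])[0, 1]
                     for lag in lags])
    return lags, np.array(rows)

# The peak |r| in each row gives a per-window estimate of coordination strength,
# and the lag at which that peak occurs indicates which partner is leading.
```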

Created: 2017-06-02
Description

Lack of biodiversity data is a major impediment to prioritizing sites for species representation. Because comprehensive species data are not available in any planning area, planners often use surrogates (such as vegetation communities, or mapped occurrences of a well-inventoried taxon) to prioritize sites. We propose and demonstrate the effectiveness of predicted rarity-weighted richness (PRWR) as a surrogate in situations where species inventories may be available for a portion of the planning area. Use of PRWR as a surrogate involves several steps. First, rarity-weighted richness (RWR) is calculated from species inventories for a q% subset of sites. Then random forest models are used to model RWR as a function of freely available environmental variables for that q% subset. This function is then used to calculate PRWR for all sites (including those for which no species inventories are available), and PRWR is used to prioritize all sites. We tested PRWR on plant and bird datasets, using the species accumulation index to measure efficiency of PRWR. Sites with the highest PRWR represented species with median efficiency of 56% (range 32%–77% across six datasets) when q = 20%, and with median efficiency of 39% (range 20%–63%) when q = 10%. An efficiency of 56% means that selecting sites in order of PRWR rank was 56% as effective as having full knowledge of species distributions in PRWR's ability to improve on the number of species represented in the same number of randomly selected sites. Our results suggest that PRWR may be able to help prioritize sites to represent species if a planner has species inventories for 10%–20% of the sites in the planning area.
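
A minimal sketch of the PRWR workflow described above, assuming a hypothetical sites-by-species presence/absence table and a sites-by-predictors table of environmental variables; the subset fraction and random-forest settings are illustrative.

```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

def rarity_weighted_richness(presence: pd.DataFrame) -> pd.Series:
    """RWR per site: sum of 1/(number of occupied sites) over species present."""
    rarity = 1.0 / presence.sum(axis=0)
    return (presence * rarity).sum(axis=1)

def predicted_rwr(presence: pd.DataFrame, env: pd.DataFrame, q: float = 0.20) -> pd.Series:
    """Compute RWR on a q% surveyed subset, model it from environmental
    variables with a random forest, and return PRWR for every site, ranked."""
    surveyed = env.sample(frac=q, random_state=0).index
    rwr = rarity_weighted_richness(presence.loc[surveyed])
    rf = RandomForestRegressor(n_estimators=500, random_state=0)
    rf.fit(env.loc[surveyed], rwr)
    return pd.Series(rf.predict(env), index=env.index).sort_values(ascending=False)
```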

Created: 2016-10-27
Description

Through the mathematical study of two models we quantify some of the theories of co-development and co-existence of focused groups in the social sciences. This work attempts to develop the mathematical framework behind the social sciences of community formation. By using well-developed theories and concepts from ecology and epidemiology we hope to extend the theoretical framework of organizing and self-organizing social groups and communities, including terrorist groups. The main goal of our work is to gain insight into the role of recruitment and retention in the formation and survival of social organizations. Understanding the underlying mechanisms of the spread of ideologies under competition is a fundamental component of this work. Here, contacts between core and non-core individuals extend beyond their physical meaning to include indirect interaction and the spread of ideas through phone conversations, emails, media sources, and other similar means.

This work focuses on the dynamics of formation of interest groups, whether ideological, economic, or ecological, and thus we explore questions such as: how do interest groups initiate and co-develop by interacting within a common environment, and how do they sustain themselves? Our results show that building and maintaining the core group is essential for the existence and survival of an extreme ideology. Our research also indicates that in the absence of competitive ability (i.e., the ability to take members from the other core group or share prospective members), the social organization or group that is more committed to its group ideology and manages to strike the right balance between investment in recruitment and retention will prevail. Thus, with no cross interaction between two social groups, a single trade-off of these efforts can support only a single organization. The more effort an organization puts into recruiting and retaining its members, the more effective it will be in transmitting the ideology to other vulnerable individuals and thus converting them to believers.
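
The models themselves are not reproduced in this abstract; as a loosely analogous, contagion-style sketch (an assumption, not the authors' equations), the code below integrates a single core-group system in which recruitment proceeds through contact and retention is the inverse of a defection rate.

```python
from scipy.integrate import solve_ivp

def core_group(t, y, Lambda=2.0, beta=0.4, delta=0.1, mu=0.02):
    """S: susceptible (non-core) individuals, C: committed core members."""
    S, C = y
    recruit = beta * S * C / (S + C)      # contact-driven recruitment
    return [Lambda - recruit - mu * S,    # dS/dt
            recruit - (delta + mu) * C]   # dC/dt: defection (low retention) + removal

sol = solve_ivp(core_group, (0, 300), [99.0, 1.0])
print(sol.y[:, -1])  # the core persists only if recruitment outweighs defection
```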

Created: 2013-09-11
Description

Previous studies in building energy assessment clearly state that to meet sustainable energy goals, existing buildings, as well as new buildings, will need to improve their energy efficiency. Thus, meeting energy goals relies on retrofitting existing buildings. Most building energy models are bottom-up engineering models, meaning these models calculate energy demand of individual buildings through their physical properties and energy use for specific end uses (e.g., lighting, appliances, and water heating). Researchers then scale up these model results to represent the building stock of the region studied.

Studies reveal that there is a lack of information about the building stock and associated modeling tools, and this lack of knowledge affects the assessment of building energy efficiency strategies. The literature suggests that the level of complexity of energy models needs to be limited. The accuracy of these energy models can be improved by reducing the number of input parameters, alleviating the need for users to make many assumptions about building construction and occupancy, among other factors. To mitigate the need for assumptions and the resulting model inaccuracies, the authors argue buildings should be described in a regional stock model with a restricted number of input parameters. One commonly accepted method of identifying critical input parameters is sensitivity analysis, which requires a large number of model runs that are time consuming and may require high processing capacity.

This paper utilizes the Energy, Carbon and Cost Assessment for Building Stocks (ECCABS) model, which calculates the net energy demand of buildings and presents aggregated and individual-building-level demand for specific end uses, e.g., heating, cooling, lighting, hot water, and appliances. The model has already been validated using Swedish, Spanish, and UK building stock data. This paper discusses potential improvements to this model by assessing the feasibility of using stepwise regression to identify the most important input parameters, using data from the UK residential sector. The paper presents the results of stepwise regression and compares these to sensitivity analysis; finally, the paper documents the advantages and challenges associated with each method.
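
A minimal forward-selection sketch of the idea, assuming a hypothetical data frame of candidate building inputs X and a net-energy-demand target y; the paper's exact stepwise criterion may differ.

```python
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.feature_selection import SequentialFeatureSelector

def important_inputs(X: pd.DataFrame, y: pd.Series, k: int = 5) -> list:
    """Forward stepwise selection of the k inputs that best explain demand."""
    selector = SequentialFeatureSelector(LinearRegression(),
                                         n_features_to_select=k,
                                         direction="forward", cv=5)
    selector.fit(X, y)
    return list(X.columns[selector.get_support()])
```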

Contributors: Arababadi, Reza (Author) / Naganathan, Hariharan (Author) / Parrish, Kristen (Author) / Chong, Oswald (Author) / Ira A. Fulton Schools of Engineering (Contributor)
Created: 2015-09-14
Description

Construction waste management has become extremely important due to stricter disposal and landfill regulations and a shrinking number of available landfills. Extensive work has been done on waste treatment and management in the construction industry. Concepts like deconstruction, recyclability, and Design for Disassembly (DfD) are examples of better construction waste management methods. Although some authors and organizations have published rich guides addressing DfD principles, only a few buildings have so far been developed in this area. This study aims to identify the challenges in the current practice of deconstruction activities and the gaps between its theory and implementation. Furthermore, it aims to provide insights into how DfD can create opportunities to turn these concepts into strategies that can be widely adopted by construction industry stakeholders in the near future.

Contributors: Rios, Fernanda (Author) / Chong, Oswald (Author) / Grau, David (Author) / Julie Ann Wrigley Global Institute of Sustainability (Contributor)
Created: 2015-09-14
Description

The United States generates the most waste among OECD countries, and waste generation has adverse effects. One of the most serious is greenhouse gas, especially CH4, which contributes to global warming. However, the amount of waste generated is not decreasing, and the United States recycling rate, which could reduce waste generation, is only 26%, lower than in other OECD countries. Thus, waste generation and greenhouse gas emissions should decrease, and for that to happen, identifying the causes should be a priority. The research objective is to verify whether the Environmental Kuznets Curve relationship is supported for waste generation and GDP across the U.S. Moreover, the study also confirmed that total waste generation and recycled waste influence carbon dioxide emissions from the waste sector. Annual U.S. data from 1990 to 2012 were used. The data were collected from various sources, and the Granger causality test was applied to identify causal relationships. The results showed that there is no causality between GDP and waste generation, but total waste generation and recycling significantly increase and decrease, respectively, greenhouse gas emissions from the waste sector. This implies that waste generation will not decrease even if GDP increases, and that if waste generation decreases or the recycling rate increases, greenhouse gas emissions will decrease. Based on these results, it is expected that waste generation and carbon dioxide emissions from the waste sector can be reduced more efficiently.
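
A hedged sketch of the causality check, assuming two annual series already loaded as pandas objects (the names and lag order are illustrative; the paper's preprocessing is not specified here).

```python
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

def granger(cause: pd.Series, effect: pd.Series, maxlag: int = 2):
    """Test whether `cause` Granger-causes `effect` (second column -> first)."""
    data = pd.concat([effect, cause], axis=1).diff().dropna()  # first-difference to reduce trends
    return grangercausalitytests(data.values, maxlag=maxlag)

# e.g. granger(total_waste, ch4_emissions); granger(gdp, total_waste)
```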

Contributors: Lee, Seungtaek (Author) / Kim, Jonghoon (Author) / Chong, Oswald (Author) / Ira A. Fulton Schools of Engineering (Contributor)
Created: 2016-05-20
Description

As construction continues to be a leading industry in the number of injuries and fatalities annually, several organizations and agencies are working avidly to ensure the number of injuries and fatalities is minimized. The Occupational Safety and Health Administration (OSHA) is one such effort to assure safe and healthful working conditions for working men and women by setting and enforcing standards and by providing training, outreach, education and assistance. Given the large databases of OSHA historical events and reports, a manual analysis of the fatality and catastrophe investigation content is a time-consuming and expensive process. This paper aims to evaluate the strength of unsupervised machine learning and Natural Language Processing (NLP) in supporting safety inspections and reorganizing the accident database at the state level. After collecting construction accident reports from the OSHA Arizona office, the methodology consists of preprocessing the accident reports and weighting terms in order to apply a data-driven, unsupervised K-Means-based clustering approach. The proposed method classifies the collected reports into four clusters, each representing a type of accident. The results show that construction accidents in the state of Arizona are caused by falls (42.9%), struck-by-object incidents (34.3%), electrocutions (12.5%), and trench collapses (10.3%). The findings of this research empower state and local agencies with a customized presentation of the accidents fitting their regulations and weather conditions. What is applicable to one climate might not be suitable for another; therefore, such rearrangement of the accident database at the state level is a necessary prerequisite to enhancing local safety applications and standards.
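
A minimal sketch of the pipeline described above (TF-IDF term weighting followed by K-Means with four clusters); the file name and column are hypothetical placeholders, not artifacts from the study.

```python
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

reports = pd.read_csv("osha_az_accidents.csv")["narrative"]       # hypothetical source
tfidf = TfidfVectorizer(stop_words="english", max_df=0.8, min_df=5)
X = tfidf.fit_transform(reports)

km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)

# Label clusters by inspecting their highest-weighted terms (falls, struck-by, ...)
terms = tfidf.get_feature_names_out()
for c, center in enumerate(km.cluster_centers_):
    print(c, [terms[i] for i in center.argsort()[::-1][:8]])
```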

Contributors: Chokor, Abbas (Author) / Naganathan, Hariharan (Author) / Chong, Oswald (Author) / El Asmar, Mounir (Author) / Ira A. Fulton Schools of Engineering (Contributor)
Created: 2016-05-20
Description

This paper describes a novel method for displaying data obtained by three-dimensional medical imaging, by which the position and orientation of a freely movable screen are optically tracked and used in real time to select the current slice from the data set for presentation. With this method, which we call a “freely moving in-situ medical image”, the screen and imaged data are registered to a common coordinate system in space external to the user, at adjustable scale, and are available for free exploration. The three-dimensional image data occupy empty space, as if an invisible patient is being sliced by the moving screen. A behavioral study using real computed tomography lung vessel data established the superiority of the in situ display over a control condition with the same free exploration, but displaying data on a fixed screen (ex situ), with respect to accuracy in the task of tracing along a vessel and reporting spatial relations between vessel structures. A “freely moving in-situ medical image” display appears from these measures to promote spatial navigation and understanding of medical data.
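
As a hedged illustration of the core resampling step (not the authors' implementation), the sketch below extracts an oblique slice from a volume given the tracked screen's pose, expressed as an origin and two in-plane unit vectors in volume coordinates.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def oblique_slice(volume, origin, u, v, height=256, width=256):
    """Sample `volume` on the plane spanned by unit vectors u and v at `origin`."""
    jj, ii = np.meshgrid(np.arange(height), np.arange(width), indexing="ij")
    coords = (origin[:, None, None]
              + u[:, None, None] * ii + v[:, None, None] * jj)   # shape (3, H, W)
    return map_coordinates(volume, coords, order=1, mode="nearest")

# In the described system, origin, u, and v would be updated each frame from the
# optical tracker after registering screen and volume to a common coordinate system.
```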

Contributors: Shukla, Gaurav (Author) / Klatzky, Roberta L. (Author) / Wu, Bing (Author) / Wang, Bo (Author) / Galeotti, John (Author) / Chapmann, Brian (Author) / Stetten, George (Author) / New College of Interdisciplinary Arts and Sciences (Contributor)
Created: 2017-08-23
Description

Essential or enduring understandings are often defined as the underlying core concepts or “big ideas” we’d like our students to remember when much of the course content has been forgotten. The central dogma of molecular biology and how cellular information is stored, used, and conveyed is one of the essential understandings students should retain after a course or unit in molecular biology or genetics. An additional enduring understanding is the relationships between DNA sequence, RNA sequence, mRNA production and processing, and the resulting polypeptide/protein product. A final big idea in molecular biology is the relationship between DNA mutation and polypeptide change. To engage students in these essential understandings in a Genetics course, I have developed a hands-on activity to simulate VDJ recombination. Students use a foldable type activity to splice out regions of a mock kappa light chain gene to generate a DNA sequence for transcription and translation. Students fold the activity several different times in multiple ways to “recombine” and generate several different DNA sequences. They then are asked to construct the corresponding mRNA and polypeptide sequence of each “recombined” DNA sequence and reflect on the products in a write-to-learn activity.
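
A toy illustration of the DNA, mRNA, and polypeptide relationship the activity targets, assuming Biopython is available; the segment sequences are invented for illustration and are not from the actual kappa light chain gene.

```python
from Bio.Seq import Seq

v_segments = {"V1": "ATGGACATTGTG", "V2": "ATGGAAATTGTT"}   # invented segments
j_segment = "TGGACGTTCGGT"

for name, v in v_segments.items():
    dna = Seq(v + j_segment)        # one possible "recombined" coding sequence
    mrna = dna.transcribe()         # T -> U
    protein = mrna.translate()      # read codons into amino acids
    print(name, dna, "->", mrna, "->", protein)
```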

Created: 2017-08-11
Description

The elongases of very long chain fatty acids (ELOVL or ELO) are essential in the biosynthesis of fatty acids longer than C14. Here, two ELO full-length cDNAs (TmELO1, TmELO2) from the yellow mealworm (Tenebrio molitor L.) were isolated and their functions characterized. The open reading frame (ORF) lengths of TmELO1 and TmELO2 were 1005 bp and 972 bp, respectively, and the corresponding peptide sequences each contained several conserved motifs, including the histidine-box motif HXXHH. Phylogenetic analysis demonstrated high similarity with the ELOs of Tribolium castaneum and Drosophila melanogaster. Both TmELO genes were expressed at various levels in eggs, 1st and 2nd instar larvae, mature larvae, pupae, and male and female adults. Injection of dsTmELO1 but not dsTmELO2 RNA into mature larvae significantly increased mortality, although RNAi did not produce any obvious changes in fatty acid composition in the survivors. Heterologous expression of TmELO genes in yeast revealed that TmELO1 and TmELO2 function to synthesize long chain and very long chain fatty acids.

Contributors: Zheng, Tianxiang (Author) / Li, Hongshuang (Author) / Han, Na (Author) / Wang, Shengyin (Author) / Hackney Price, Jennifer (Author) / Wang, Minzi (Author) / Zhang, Dayu (Author) / New College of Interdisciplinary Arts and Sciences (Contributor)
Created: 2017-09-08