Matching Items (5)

Finite Element Analysis of Microstructural Weak and Strong Links for Shock-Induced Damage in Metallic Materials

Description

Understanding damage evolution, particularly as it relates to the local nucleation and growth kinetics of spall failure in metallic materials subjected to shock loading, is critical to national security. This work uses computational modeling to elucidate which microstructural characteristics have the greatest impact on damage localization in metallic materials, since knowledge of these characteristics is critical to improving such materials. The numerical framework consists of a user-defined material model, implemented in a user subroutine run in ABAQUS/Explicit, that accounts for crystal plasticity, grain boundary effects, void nucleation and initial growth, and both isotropic and kinematic hardening to model incipient spall.

Finite element simulations were performed on copper bicrystal models to isolate the boundary effects between two grains. Two types of simulations were performed: experimentally verified cases to validate the constitutive model, and idealized cases intended to determine the microstructural characteristics that define the weakest links with respect to spall damage. Grain boundary effects on damage localization were studied by varying the grain boundary orientation with respect to the shock direction and the crystallographic properties of each grain in the bicrystal. Varying these parameters produces a mismatch in Taylor factor across the grain boundary and along the shock direction. The experimentally verified cases model specific damage sites found in flyer plate impact tests on copper multicrystals in which the Taylor factor mismatch across the grain boundary and along the shock direction are both high or both low. For the idealized cases, the grain boundary orientation and the crystallography of the grains are chosen so that the Taylor factor mismatch along the grain boundary normal and along the shock direction are maximized or minimized. A grain boundary perpendicular to the shock direction maximizes the Taylor factor mismatch, while a parallel grain boundary minimizes it. Furthermore, it is known that <1 1 1> crystals have the highest Taylor factor, while <0 0 1> crystals have nearly the lowest. Permuting these extremes for the mismatch along the grain boundary normal and along the shock direction yields the four idealized cases studied in this work.

Results of the simulations demonstrate that the material model is capable of predicting damage localization, as it reproduces damage sites found experimentally. However, these results are qualitative, since further calibration is required to produce quantitatively accurate results. Moreover, comparisons of void nucleation and void growth rates suggest that nucleation contributes more to the total void volume fraction in bicrystals with a high property mismatch across the interface, suggesting that nucleation is the dominant characteristic in the propagation of damage in the material. Further work recalibrating the simulation parameters and modeling different bicrystal orientations is needed to verify these results.
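
To make the construction of the four idealized cases concrete, a minimal sketch follows. It assumes the commonly cited Taylor factors for uniaxial loading of FCC crystals along <0 0 1> and <1 1 1> (roughly 2.45 and 3.67); the thesis's own values are not given here, so these numbers are illustrative only.

```python
# Illustrative Taylor factors for uniaxial loading along FCC crystal
# directions; commonly cited values, assumed here for illustration only.
TAYLOR = {"<0 0 1>": 2.449, "<1 1 1>": 3.674}

def gb_mismatch(grain_a, grain_b):
    """Taylor factor mismatch between the two grains of a bicrystal."""
    return abs(TAYLOR[grain_a] - TAYLOR[grain_b])

# Extreme crystallographic pairings used to build the idealized cases.
pairings = [("<0 0 1>", "<1 1 1>"),   # maximal mismatch
            ("<1 1 1>", "<1 1 1>")]   # minimal (zero) mismatch

for a, b in pairings:
    print(f"{a} | {b}: Taylor factor mismatch = {gb_mismatch(a, b):.3f}")

# Crossing the two pairings with the two grain boundary orientations
# (perpendicular vs. parallel to the shock direction) yields the four
# idealized cases described in the abstract.
```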

Date Created
  • 2014-12

Modeling, simulation and analysis for software-as-service in cloud

Description

Software-as-a-Service (SaaS) has received significant attention in recent years as major computer companies such as Google, Microsoft, Amazon, and Salesforce adopt this approach to developing software and systems. Cloud computing is a computing infrastructure that enables rapid delivery of computing resources as a utility in a dynamic, scalable, and virtualized manner. Computer simulations are widely used to analyze the behavior of software and to test it before full implementation. Simulation can further benefit SaaS applications in a cost-effective way by taking advantage of cloud properties such as customizability, configurability, and multi-tenancy.

This research introduces modeling, simulation, and analysis for Software-as-a-Service in the cloud. It covers the following topics: service modeling, policy specification, code generation, dynamic simulation, timing, and event and log analysis. Moreover, the framework integrates key advantages of the cloud: configurability, multi-tenancy, scalability, and recoverability.

The work is organized into the following chapters:

Multi-Tenancy Simulation Software-as-a-Service.

Policy Specification for the MTA Simulation Environment.

Model-Driven PaaS-Based SaaS Modeling.

Dynamic Analysis and Dynamic Calibration for Timing Analysis.

Event-Driven Service-Oriented Simulation Framework.

LTBD: A Triage Solution for SaaS.
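
The event-driven, service-oriented simulation chapter implies a discrete-event core. The following is a minimal sketch of such a loop, assuming nothing about the dissertation's actual framework; the Simulator class and the tenant-aware request handler are hypothetical stand-ins.

```python
import heapq
import itertools

class Simulator:
    """Minimal discrete-event loop: callbacks fire in timestamp order."""
    def __init__(self):
        self.clock = 0.0
        self._queue = []
        self._seq = itertools.count()  # tie-breaker for simultaneous events

    def schedule(self, delay, callback):
        heapq.heappush(self._queue, (self.clock + delay, next(self._seq), callback))

    def run(self):
        while self._queue:
            self.clock, _, callback = heapq.heappop(self._queue)
            callback()

sim = Simulator()

def request(tenant, service_time):
    # A hypothetical multi-tenant service call: completion is itself an
    # event, which could in turn schedule logging or billing events.
    print(f"t={sim.clock:4.1f}  {tenant}: request enters service ({service_time}s)")
    sim.schedule(service_time, lambda: print(f"t={sim.clock:4.1f}  {tenant}: done"))

sim.schedule(0.0, lambda: request("tenant-A", 2.5))
sim.schedule(1.0, lambda: request("tenant-B", 0.5))
sim.run()
```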

Date Created
  • 2015

Mechanisms and models of agropastoral spread during the Neolithic in the west Mediterranean: the Cardial Spread Model

Description

This dissertation examines the various factors and processes that have been proposed as explanations for the spread of agriculture in the west Mediterranean. The expansion of the Neolithic in the west Mediterranean (the Impresso-Cardial Neolithic) is characterized by a rapid spread of agricultural subsistence and material culture from the southern portion of the Italian peninsula to the western coast of the Iberian peninsula. To address this unique case, four conceptual models of Neolithic spread have been proposed: the Wave of Advance, the Capillary Spread Model, the Maritime Pioneer Colonization Model, and the Dual Model. An agent-based model, the Cardial Spread Model, was built to simulate each conceptual spread model in a spatially explicit environment for comparison with evidence from the archaeological record. Chronological information on the arrival of the Neolithic was used to create a map of its initial appearance (a chronosurface) throughout the study area. The results of each conceptual spread model were then compared to the chronosurface in order to evaluate the relative performance of each model. These experiments suggest that the Dual and Maritime Pioneer Colonization models best fit the available chronological and spatial distribution of the Impresso-Cardial Neolithic.

For the purpose of informing agent movement and improving the fit of the conceptual spread models, a variety of paleoenvironmental maps were tested within the Cardial Spread Model. The outcome of these experiments suggests that topographic slope was an important factor in settlement location and that rivers were important vectors of transportation for early Neolithic migration. This research demonstrates the application of techniques still rare in archaeological analysis, agent-based modeling and the incorporation of paleoenvironmental information, and provides a valuable tool that future researchers can use to further evaluate and formulate new models of Neolithic expansion.
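
As an illustration of the general technique (not of the Cardial Spread Model itself), the sketch below runs a toy wave-of-advance-style spread on a grid and records each cell's first arrival time, the simulated counterpart of a chronosurface; all parameters are invented.

```python
import random

random.seed(42)

# Toy demic-diffusion spread: farming converts neighbouring cells with a
# fixed per-step probability, and we record first arrival times -- the
# simulated analogue of a chronosurface.
SIZE, P_SPREAD, STEPS = 30, 0.3, 300
arrival = [[None] * SIZE for _ in range(SIZE)]
arrival[0][0] = 0  # point of Neolithic arrival (illustrative)

for t in range(1, STEPS):
    newly_colonised = []
    for r in range(SIZE):
        for c in range(SIZE):
            if arrival[r][c] is not None:
                continue
            # a cell converts if any farming orthogonal neighbour spreads to it
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if (0 <= nr < SIZE and 0 <= nc < SIZE
                        and arrival[nr][nc] is not None
                        and random.random() < P_SPREAD):
                    newly_colonised.append((r, c))
                    break
    for r, c in newly_colonised:
        arrival[r][c] = t

colonised = [t for row in arrival for t in row if t is not None]
print(f"{len(colonised)}/{SIZE * SIZE} cells colonised; "
      f"mean arrival step {sum(colonised) / len(colonised):.1f}")

# Model fit against the empirical record could then be scored as, e.g.,
# the mean absolute difference between simulated and observed arrival
# times over the cells where both are defined.
```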

Date Created
  • 2016

A framework for screening experiments and modelling in complex systems

Description

Complex systems are pervasive in science and engineering. Examples include complex engineered networks such as the internet, the power grid, and transportation networks. The complexity of such systems arises not just from their size, but also from their structure, their operation (including control and management), their evolution over time, and the involvement of people in their design and operation. Our understanding of such systems is limited because their behaviour cannot be characterized using traditional techniques of modelling and analysis.

As a step in model development, statistically designed screening experiments may be used to identify the main effects and interactions that most significantly affect a system's response. However, traditional approaches to screening are ineffective for complex systems because of the size of the experimental design. Consequently, the factors considered are often restricted, but this automatically restricts the interactions that may be identified as well. Alternatively, the designs are restricted to identifying only main effects, but this fails to consider any possible interactions among the factors.

To address this problem, a specific combinatorial design termed a locating array is proposed as a screening design for complex systems. Locating arrays exhibit logarithmic growth in the number of factors because their focus is on identification rather than on measurement, which makes it practical to consider an order of magnitude more factors in experimentation than traditional screening designs allow.
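
The defining property can be checked directly. The sketch below tests a (1, 2)-locating property under one standard reading of the definition: every two-way interaction must be covered by a distinct set of rows, so interactions remain identifiable from which rows respond. The toy design is a full factorial, used only because it trivially satisfies the property; actual locating arrays are far smaller relative to the number of factors.

```python
from itertools import combinations

def rows_covering(array, interaction):
    """Indices of rows in which every (factor, level) pair of the
    interaction appears."""
    return frozenset(i for i, row in enumerate(array)
                     if all(row[f] == lvl for f, lvl in interaction))

def is_1_2_locating(array, levels):
    """Check that all two-way interactions have pairwise-distinct
    covering row sets (one reading of the (1, 2)-locating property)."""
    k = len(levels)
    interactions = [((f1, l1), (f2, l2))
                    for f1, f2 in combinations(range(k), 2)
                    for l1 in range(levels[f1])
                    for l2 in range(levels[f2])]
    seen = {}
    for t in interactions:
        rho = rows_covering(array, t)
        if rho in seen and seen[rho] != t:
            return False  # two interactions would be indistinguishable
        seen[rho] = t
    return True

# Toy stand-in: a 2^3 full factorial on three binary factors.
design = [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]
print(is_1_2_locating(design, levels=[2, 2, 2]))  # True
```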

As a proof of concept, a locating array is applied to screen for main effects and low-order interactions on the response of average Transmission Control Protocol (TCP) throughput in a simulation model of a mobile ad hoc network (MANET). A MANET is a collection of mobile wireless nodes that self-organize without the aid of any centralized control or fixed infrastructure. The full-factorial design for the MANET considered is infeasible (with over 10^43 design points), yet a locating array has only 421 design points.

In conjunction with the locating array, a "heavy hitters" algorithm is developed to identify the influential main effects and two-way interactions, correcting for the non-normal distribution of the average throughput and for the uneven coverage of terms in the locating array. The significance of the identified main effects and interactions is validated independently using the statistical software JMP.
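
The dissertation's heavy hitters algorithm is not reproduced here. As a stand-in for the same general task, the sketch below runs greedy forward selection of main effects and two-way interactions against a synthetic, deliberately non-normal response; the design, the planted effects, and the selection rule are all invented for illustration.

```python
from itertools import combinations

import numpy as np

rng = np.random.default_rng(7)

# Synthetic screening data: 421 runs of 20 binary factors with a skewed
# (non-normal) response driven by one main effect and one interaction.
n, k = 421, 20
X = rng.integers(0, 2, size=(n, k)).astype(float)
y = 3.0 * X[:, 2] + 2.0 * X[:, 5] * X[:, 11] + rng.lognormal(0.0, 0.5, n)

# Candidate terms: all main effects plus all two-way interactions.
terms = [(i,) for i in range(k)] + list(combinations(range(k), 2))
cols = np.column_stack([X[:, t].prod(axis=1) for t in terms])
centred = cols - cols.mean(axis=0)
norms = np.linalg.norm(centred, axis=0)

# Greedy forward selection: repeatedly pick the term most correlated
# with the current residual, then refit by least squares (an orthogonal
# matching pursuit, standing in for the heavy hitters procedure).
selected, resid = [], y - y.mean()
for _ in range(2):
    best = int(np.argmax(np.abs(centred.T @ resid) / norms))
    selected.append(best)
    A = np.column_stack([np.ones(n), cols[:, selected]])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta

print("identified terms:", [terms[i] for i in selected])
# should recover the planted terms (2,) and (5, 11)
```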

The statistical characteristics used to evaluate traditional screening designs, including the covariance matrix, fraction of design space, and aliasing, are also applied to locating arrays. The results lend additional support to the use of locating arrays as screening designs.

The use of locating arrays as screening designs for complex engineered systems is promising, as they yield useful models. This facilitates quantitative evaluation of architectures and protocols and contributes to our understanding of complex engineered networks.

Date Created
  • 2015

Examining the impact of experimental design strategies on the predictive accuracy of quantile regression metamodels for computer simulations of manufacturing systems

Description

This thesis explores the impact of different experimental design strategies on the development of quantile regression based metamodels of computer simulations. The objective is to compare the predictive accuracy of five experimental design strategies, each of which is used to develop metamodels of a computer simulation of a semiconductor manufacturing facility. The five strategies comprise two traditional experimental designs, sphere packing and I-optimal, along with three hybrid design strategies developed for this research, which combine desirable properties from each of the more traditional approaches. The three hybrid design strategies are arbitrary, centroid clustering, and clustering hybrid. Each strategy is analyzed and compared over a common experimental design space, investigating four densities of design point placement in three different experimental regions to predict four different percentiles of the cycle time distribution of a semiconductor manufacturing facility.

Results confirm that the predictive accuracy of quantile regression metamodels depends on both the location and the density of the design points placed in the experimental region. They also show that the sphere packing design strategy has the best overall performance in terms of predictive accuracy. However, the centroid clustering hybrid design strategy, developed for this research, has the best predictive accuracy in cases where only limited simulation resources are available for developing a quantile regression metamodel.
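
As a sketch of the metamodeling step only (the design strategies themselves are not reproduced), the snippet below fits quantile regression metamodels at several percentiles to synthetic cycle-time data using statsmodels; the single utilization factor and the data-generating process are invented for illustration.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)

# Synthetic stand-in for simulation output: cycle time grows nonlinearly
# and becomes noisier as factory utilization rises (invented data).
n = 200
utilization = rng.uniform(0.5, 0.95, n)
cycle_time = 10.0 / (1.0 - utilization) + rng.gamma(2.0, 2.0, n) * utilization

# Quadratic metamodel in utilization: intercept, linear, squared terms.
X = sm.add_constant(np.column_stack([utilization, utilization ** 2]))

# One quantile regression metamodel per percentile of interest.
for q in (0.50, 0.75, 0.90, 0.95):
    fit = sm.QuantReg(cycle_time, X).fit(q=q)
    print(f"q={q:.2f}  coefficients: {np.round(fit.params, 2)}")
```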

Date Created
  • 2016