Matching Items (8)

Method to Systematically Optimize the Formula Society of Automotive Engineering Race Car

Description

The purpose of this paper is to provide a new and improved design method for the Formula Society of Automotive Engineering (FSAE) team. I accomplish five tasks in this paper: 1. I describe how the FSAE team currently designs their car, which allows the reader to understand where flaws might arise in their design method. 2. I describe the key aspects of systems engineering design; this is the backbone of the method I am proposing, and understanding its key concepts is essential so that they can be applied to the FSAE design method. 3. I discuss what is available in the literature about race car design and optimization, describing what other FSAE teams are doing and how that differs from systems engineering design. 4. I describe what the FSAE team at Arizona State University (ASU) should do to improve their approach to race car design, detailing how the systems engineering method works and how it can and should be applied to the way they design their car. 5. I describe how the team should implement this method, because the method is useless if it is not incorporated into their design process. I include an interview with their brakes team leader, Colin Twist, to give an example of their current design method and to show how it can be improved with the new one. This paper provides a framework for the FSAE team to develop a new design method that will help them accomplish their overall goal of succeeding at the national competition.

Date Created
2015-05

Intervention Strategies for the DoD Acquisition Process Using Simulation

Description

The current Enterprise Requirements and Acquisition Model (ERAM), a discrete event simulation of the major tasks and decisions within the DoD acquisition system, identifies several what-if intervention strategies to improve program completion time. However, processes that contribute to the program acquisition completion time were not explicitly identified in the simulation study. This research seeks to determine the acquisition processes that contribute significantly to total simulated program time in the acquisition system for all programs reaching Milestone C. Specifically, this research examines the effect of increased scope management, technology maturity, and decreased variation and mean process times in post-Design Readiness Review contractor activities by performing additional simulation analyses. Potential policies are formulated from the results to further improve program acquisition completion time.
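
The intervention this abstract studies, reducing the mean and the variation of post-Design Readiness Review contractor activities, can be sketched with a toy Monte Carlo simulation. The stage names and lognormal parameters below are hypothetical placeholders, not ERAM's actual task network:

```python
import random

# Hypothetical stages with (mu, sigma) of the underlying normal; sampled
# durations are in months. ERAM's real task network is far larger.
STAGES = {
    "requirements": (1.2, 0.4),
    "milestone_b_review": (0.5, 0.3),
    "contractor_design": (1.5, 0.5),   # post-Design Readiness Review activity
    "contractor_build": (1.6, 0.5),    # post-Design Readiness Review activity
    "milestone_c_review": (0.7, 0.3),
}

def mean_completion(stages, runs=5000, seed=0):
    """Average end-to-end program completion time over many simulated runs."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(runs):
        total += sum(rng.lognormvariate(mu, sigma)
                     for mu, sigma in stages.values())
    return total / runs

# What-if intervention: shrink the mean and the variation of the
# post-Design Readiness Review contractor activities.
baseline = mean_completion(STAGES)
intervention = dict(STAGES)
for stage in ("contractor_design", "contractor_build"):
    mu, sigma = intervention[stage]
    intervention[stage] = (mu * 0.9, sigma * 0.5)
improved = mean_completion(intervention)
```

Comparing `baseline` against `improved` mirrors the what-if analysis described above; drawing real policy conclusions would require ERAM's validated task network and distributions.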

Date Created
2013-05

System-level synthesis of dataplane subsystems for MPSoCs

Description

In recent years we have witnessed a shift towards multi-processor systems-on-chip (MPSoCs) to address the demands of embedded devices (such as cell phones, GPS devices, and luxury car features). Highly optimized MPSoCs are well-suited to tackle the complex applications desired by the end user. These MPSoCs incorporate a constellation of heterogeneous processing elements (PEs): general-purpose PEs and application-specific integrated circuits (ASICs). A typical MPSoC is composed of an application processor, such as an ARM Cortex-A9 with a cache-coherent memory hierarchy, and several application sub-systems. Each of these sub-systems is composed of highly optimized instruction processors, graphics/DSP processors, and custom hardware accelerators. Typically, these sub-systems utilize scratchpad memories (SPMs) rather than supporting cache coherency. The overall architecture is an integration of the various sub-systems through a high-bandwidth system-level interconnect, such as a Network-on-Chip (NoC).

The shift to MPSoCs has been fueled by three major factors: demand for high performance, the use of component libraries, and short design turnaround time. As customers continue to desire ever more complex applications on their embedded devices, the performance demands on these devices continue to increase, and designers have turned to MPSoCs to address this demand. By using pre-made IP libraries, designers can quickly piece together an MPSoC that meets the application demands of the end user with minimal time spent designing new hardware. The use of MPSoCs thus allows designers to generate new devices very quickly, reducing the time to market.

In this work, a complete MPSoC synthesis design flow is presented. We first present a technique to address the synthesis of the interconnect architecture, particularly the Network-on-Chip (NoC). We then address the synthesis of the memory architecture of an MPSoC sub-system. Lastly, we present a co-synthesis technique to generate the functional and memory architectures simultaneously. The validity and quality of each synthesis technique is demonstrated through extensive experimentation.
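
As a reading aid, the platform structure described above (a host processor plus scratchpad-based sub-systems over a shared interconnect) can be captured in a minimal data model. The class and field names here are illustrative only and are not part of the dissertation's synthesis flow:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class PE:
    """A processing element: general-purpose core, DSP, or accelerator."""
    name: str
    kind: str  # "cpu", "dsp", or "accelerator"

@dataclass
class Subsystem:
    """An application sub-system using a private scratchpad memory (SPM)
    rather than cache-coherent memory."""
    name: str
    pes: List[PE]
    spm_kib: int  # scratchpad size in KiB

@dataclass
class MPSoC:
    """Sub-systems integrated through a system-level interconnect (e.g. a NoC)."""
    host: PE
    subsystems: List[Subsystem] = field(default_factory=list)

    def all_pes(self) -> List[PE]:
        """Host processor plus every PE in every sub-system."""
        return [self.host] + [pe for s in self.subsystems for pe in s.pes]

# A tiny example platform in the style the abstract describes.
soc = MPSoC(host=PE("cortex_a9", "cpu"))
soc.subsystems.append(
    Subsystem("video", [PE("vdsp", "dsp"), PE("venc", "accelerator")], spm_kib=256)
)
```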

Date Created
2013

Quantum resilience

Description

Quantum resilience is a pragmatic theory that allows systems engineers to formally characterize the resilience of systems. As a generalized theory, it not only clarifies resilience in the literature, but also can be applied to all disciplines and domains of discourse. Operationalizing resilience in this manner permits decision-makers to compare and contrast system deployment options for suitability in a variety of environments and allows for consistent treatment of resilience across domains. Systems engineers, whether planning future infrastructures or managing ecosystems, are increasingly asked to deliver resilient systems. Quantum resilience provides a way forward that allows specific resilience requirements to be specified, validated, and verified.

Quantum resilience makes two very important claims. First, resilience cannot be characterized without recognizing both the system and the valued function it provides. Second, resilience is not about disturbances, insults, threats, or perturbations. To avoid crippling infinities, characterization of resilience must be accomplishable without disturbances in mind. In light of this, quantum resilience defines resilience as the extent to which a system delivers its valued functions, and characterizes resilience as a function of system productivity and complexity. System productivity vis-à-vis specified “valued functions” involves (1) the quanta of the valued function delivered, and (2) the number of systems (within the greater system) which deliver it. System complexity is defined structurally and relationally and is a function of a variety of items including (1) system-of-systems hierarchical decomposition, (2) interfaces and connections between systems, and (3) inter-system dependencies.
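
The ingredients named above suggest a simple computation: productivity from the quanta delivered and the number of delivering systems, complexity from hierarchy, interfaces, and dependencies. The combining functions below are placeholders, since the abstract does not state a closed-form formula:

```python
def productivity(quanta_delivered, delivering_systems):
    """Productivity of a valued function: how much of it is delivered and
    by how many systems (both factors named in the abstract)."""
    return quanta_delivered * delivering_systems

def complexity(hierarchy_levels, interfaces, dependencies):
    """Structural/relational complexity: an additive placeholder over the
    three factors named in the abstract."""
    return hierarchy_levels + interfaces + dependencies

def resilience(prod, cplx):
    """Resilience as a function of productivity and complexity; this ratio
    is illustrative, not the thesis's actual characterization."""
    return prod / cplx

# Comparing two deployment options for the same valued function, as a
# decision-maker might: a single system vs. three redundant systems.
option_a = resilience(productivity(100, 1), complexity(2, 4, 3))
option_b = resilience(productivity(100, 3), complexity(3, 9, 6))
```

Under these placeholder functions, the redundant deployment scores higher despite its added complexity, which is the kind of option comparison the abstract describes.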

Among the important features of quantum resilience is that it can be implemented in any systems engineering tool that provides sufficient design and specification rigor (i.e., one that supports standards like the Lifecycle and Systems Modeling languages and frameworks like the DoD Architecture Framework). Further, this can be accomplished with minimal software development and has been demonstrated in three model-based systems engineering tools, two of which are commercially available, well-respected, and widely used. This pragmatic approach assures transparency and consistency in the characterization of resilience in any discipline.

Date Created
2015

Test-based falsification and conformance testing for cyber-physical systems

Description

In this dissertation, two problems are addressed in the verification and control of Cyber-Physical Systems (CPS):

1) Falsification: given a CPS, and a property of interest that the CPS must satisfy under all allowed operating conditions, does the CPS violate, i.e. falsify, the property?

2) Conformance testing: given a model of a CPS, and an implementation of that CPS on an embedded platform, how can we characterize the properties satisfied by the implementation, given the properties satisfied by the model?

Both problems arise in the context of Model-Based Design (MBD) of CPS: in MBD, the designers start from a set of formal requirements that the system-to-be-designed must satisfy, and a first model of the system is created. Because it may not be possible to formally verify the CPS model against the requirements, falsification instead tests whether the model satisfies them by searching for behaviors that violate them.

In the first part of this dissertation, I present improved methods for finding falsifying behaviors of CPS when properties are expressed in Metric Temporal Logic (MTL). These methods leverage the robust semantics of MTL formulae: if a falsifier exists, it is in the neighborhood of local minimizers of the robustness function. The proposed algorithms compute descent directions of the robustness function in the space of initial conditions and input signals, and provably converge to local minima of the robustness function.
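
A toy version of robustness-guided falsification can make the idea concrete. The one-parameter plant below is a hypothetical stand-in for a CPS simulation, and simple sampling stands in for the dissertation's provably convergent descent-direction algorithms:

```python
import random

def simulate(gain, steps=50):
    """Hypothetical plant: a first-order system whose output grows with
    the input gain; stands in for an arbitrary CPS simulator."""
    x, trace = 0.0, []
    for _ in range(steps):
        x = 0.9 * x + gain
        trace.append(x)
    return trace

def robustness(trace, limit=8.0):
    """Robust semantics of the MTL-style property 'always x < limit':
    a positive margin means the property holds on this trace;
    a negative value means the trace is a falsifying behavior."""
    return limit - max(trace)

def falsify(lo=0.0, hi=1.0, samples=200, seed=1):
    """Search the input space for the trace of minimum robustness."""
    rng = random.Random(seed)
    best_gain, best_rob = None, float("inf")
    for _ in range(samples):
        g = rng.uniform(lo, hi)
        r = robustness(simulate(g))
        if r < best_rob:
            best_gain, best_rob = g, r
    return best_gain, best_rob
```

For this plant the steady state is roughly ten times the gain, so gains above about 0.8 drive the robustness negative; the search returns such a gain as a falsifying input.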

The initial model of the CPS is then iteratively refined by modeling previously ignored phenomena, adding more functionality, etc., with each refinement resulting in a new model. Many of the refinements in the MBD process described above do not provide an a priori guaranteed relation between the successive models. Thus, the second problem above arises: how can we quantify the distance between two successive models M_n and M_{n+1}? If M_n has been verified to satisfy the specification, can it be guaranteed that M_{n+1} also satisfies the same, or some closely related, specification?

This dissertation answers both questions for a general class of CPS, and properties expressed in MTL.

Date Created
2015

Standardized strategic assessment framework for small and medium enterprises in high-tech manufacturing industry

Description

A fundamental question in the field of strategic management is how companies achieve sustainable competitive advantage. The Market-Oriented Theory (MOT), the Resource-Based Model, and their complementary perspective try to answer this fundamental question. The primary goal of this study is to lay the groundwork for the Standardized Strategic Assessment Framework (SSAF). The SSAF, which consists of a set of six models, aids in the evaluation and assessment of the current and future strategic positioning of Small and Medium Enterprises (SMEs). The SSAF was visualized using IDEF0, a systems engineering tool. A secondary goal is the development of models to explain relationships between a company's resources, capabilities, and competitive strategy within the SSAF. The models considered within the SSAF include the R&D activities model, the product innovation model, the process innovation model, the operational excellence model, and the export performance model. Only one of them, the R&D activities model, was explained in depth and developed as a transformational-system model. In the R&D activities model, the following question drives the investigation: do a company's R&D inputs (tangible, intangible, and human resources) affect its R&D activities (basic research, applied research, and experimental development)? Based on this research question, eight hypotheses were formulated for the R&D activities model. To analyze these hypotheses, survey questions were developed for the R&D model, and the survey was sent to academic staff and industry experts for survey instrument validation. Based on that validation, content validity was established, and the questions, format, and scales were improved for future research application.

Date Created
2012

Institutional management for infrastructure resilience

Description

To improve the resilience of complex, interdependent infrastructures, we need to better understand the institutions that manage infrastructures and the work that they do. This research demonstrates that a key aspect of infrastructure resilience is the adequate institutional management of infrastructures. This research analyzes the institutional dimension of infrastructure resilience using sociotechnical systems theory and, further, investigates the critical role of institutions for infrastructure resilience using a thorough analysis of water and energy systems in Arizona.

Infrastructure is not static, but dynamic. Institutions play a significant role in designing, building, maintaining, and upgrading dynamic infrastructures. Institutions create the appearance of infrastructure stability while dynamically changing infrastructures over time, which is resilience work. The resilience work of different institutions and organizations sustains, recovers, adapts, reconfigures, and transforms the physical structure on short, medium, and long temporal scales.

To better understand and analyze the dynamics of sociotechnical infrastructure resilience, this research examines several case studies. The first is the social and institutional arrangements for the allocation of resources from Hoover Dam. This research uses an institutional analysis framework and draws on the institutional landscape of water and energy systems in Arizona. In particular, this research illustrates how institutions contribute to differing resilience work at temporal scales while fabricating three types of institutional threads: lateral, vertical, and longitudinal threads.

This research also highlights the importance of institutional interdependence as a critical challenge for improving infrastructure resilience. Institutional changes in one system can disrupt other systems’ performance. The research examines this through case studies that explore how changes to water governance impact the energy system in Arizona. Groundwater regulations affect the operation of thermoelectric power plants which withdraw groundwater for cooling. Generation turbines, droughts, and water governance are all intertwined via institutions in Arizona.

This research, finally, expands and applies the interdependence perspective to a case study of forest management in Arizona. In a nutshell, the perilous combination of chronic droughts and the engineering resilience perspective jeopardizes urban water and energy systems. Wildfires caused by dense forests have legitimized an institutional transition, from thickening forests to thinning trees in Arizona.

Date Created
2019

Design & Analysis of a 21st Century, Scalable, Student-Centric Model of Innovation at the Collegiate Level

Description

The Luminosity Lab, located at Arizona State University, is a prototype for a novel model of interdisciplinary, student-led innovation. The model’s design was informed by the following desired outcomes: i) the model would be well-suited for the 21st century, ii) it would attract, motivate, and retain the university’s strongest student talent, iii) it would operate without the oversight of faculty, and iv) it would work towards the conceptualization, design, development, and deployment of solutions that would positively impact society. This model of interdisciplinary research was tested at Arizona State University across four academic years with participation of over 200 students, who represented more than 20 academic disciplines. The results have shown successful integration of interdisciplinary expertise to identify unmet needs, design innovative concepts, and develop research-informed solutions. This dissertation analyzes Luminosity’s model to determine the following: i) Can a collegiate, student-driven interdisciplinary model of innovation designed for the 21st century perform without faculty management? ii) What are the motivators and culture that enable student success within this model? and iii) How does Luminosity differ from traditional research opportunities and learning experiences?
Through a qualitative, grounded theory analysis, this dissertation examines the phenomena of the students engaging in Luminosity’s model, who have demonstrated their ability to serve as the principal investigators and innovators in conducting substantial discovery, research, and innovation work through full project life cycles. This study supports a theory that highly talented students often feel limited by the pace and scope of their college educations, and yearn for experiences that motivate them with agency, achievement, mastery, affinity for colleagues, and a desire to impact society. Through the cumulative effect of these motivators and an organizational design that facilitates a bottom-up approach to student-driven innovation, Luminosity has established itself as a novel model of research and development in the collegiate space.

Date Created
2020