Description
The heat transfer enhancements available from expanding the cross-section of a boiling microchannel are explored analytically and experimentally. Evaluation of the literature on critical heat flux in flow boiling and associated pressure drop behavior is presented with predictive critical heat flux (CHF) and pressure drop correlations. An optimum channel configuration allowing maximum CHF while reducing pressure drop is sought. A perturbation of the channel diameter is employed to examine CHF and pressure drop relationships from the literature with the aim of identifying those adequately general and suitable for use in a scenario with an expanding channel. Several CHF criteria are identified which predict an optimizable channel expansion, though many do not. Pressure drop relationships admit improvement with expansion, and no optimum presents itself. The relevant physical phenomena surrounding flow boiling pressure drop are considered, and a balance of dimensionless numbers is presented that may be of qualitative use. The design, fabrication, inspection, and experimental evaluation of four copper microchannel arrays of different channel expansion rates with R-134a refrigerant is presented. Optimum rates of expansion which maximize the critical heat flux are considered at multiple flow rates, and experimental results are presented demonstrating these optima. The effect of expansion on the boiling number is considered, and experiments demonstrate that expansion produces a notable increase in the boiling number in the region explored, though no optima are observed. A significant decrease in the pressure drop across the evaporator is observed with the expanding channels, and no optima appear. Discussion of the significance of this finding is presented, along with possible avenues for future work.
Contributors: Miner, Mark (Author) / Phelan, Patrick E (Thesis advisor) / Baer, Steven (Committee member) / Chamberlin, Ralph (Committee member) / Chen, Kangping (Committee member) / Herrmann, Marcus (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
Modern day gas turbine designers face the problem of hot mainstream gas ingestion into rotor-stator disk cavities. To counter this ingestion, seals are installed on the rotor and stator disk rims and purge air, bled off from the compressor, is injected into the cavities. It is desirable to reduce the supply of purge air as this decreases the net power output as well as efficiency of the gas turbine. Since the purge air influences the disk cavity flow field and effectively the amount of ingestion, the aim of this work was to study the cavity velocity field experimentally using Particle Image Velocimetry (PIV). Experiments were carried out in a model single-stage axial flow turbine set-up that featured blades as well as vanes, with purge air supplied at the hub of the rotor-stator disk cavity. Along with the rotor and stator rim seals, an inner labyrinth seal was provided which split the disk cavity into a rim cavity and an inner cavity. First, static gage pressure distribution was measured to ensure that nominally steady flow conditions had been achieved. The PIV experiments were then performed to map the velocity field on the radial-tangential plane within the rim cavity at four axial locations. Instantaneous velocity maps obtained by PIV were analyzed sector-by-sector to understand the rim cavity flow field. It was observed that the tangential velocity dominated the cavity flow at low purge air flow rate, its dominance decreasing with increase in the purge air flow rate. Radially inboard of the rim cavity, negative radial velocity near the stator surface and positive radial velocity near the rotor surface indicated the presence of a recirculation region in the cavity whose radial extent increased with increase in the purge air flow rate. Qualitative flow streamline patterns are plotted within the rim cavity for different experimental conditions by combining the PIV map information with ingestion measurements within the cavity as reported in Thiagarajan (2013).
Contributors: Pathak, Parag (Author) / Roy, Ramendra P (Thesis advisor) / Calhoun, Ronald (Committee member) / Lee, Taewoo (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
Virtual Patient Simulations (VPS) are web-based exercises involving simulated patients in virtual environments. This study investigates the utility of VPS for increasing medical student clinical reasoning skills, collaboration, and engagement. Many studies indicate that VPS provide medical students with essential practice in clinical decision making before they encounter real-life patients. The utility of a recursive, inductive VPS for increasing clinical decision-making skills, collaboration, or engagement is unknown. Following a design-based methodology, VPS were implemented in two phases with two different cohorts of first-year medical students: spring and fall of 2013. Participants were 108 medical students and six of their clinical faculty tutors. Students collaborated in teams of three to complete a series of virtual patient cases, submitting a ballpark diagnosis at the conclusion of each session. Student participants subsequently completed an electronic, 28-item Exit Survey. Finally, students participated in a randomized controlled trial comparing traditional (tutor-led) and VPS case instruction methods. This sequence of activities rendered quantitative and qualitative data that were triangulated during data analysis to increase the validity of findings. After practicing through four VPS cases, student triad teams selected the accurate ballpark diagnosis 92 percent of the time. Pre-post test results revealed that PPT instruction was significantly more effective than VPS after 20 minutes of instruction, producing higher learning gains, though both modalities supported significant gains in clinical reasoning. Students collaborated well and held rich clinical discussions; the central phenomenon that emerged was "synthesizing evidence inductively to make clinical decisions."
Using an inductive process, student teams collaborated to analyze patient data, and in nearly all instances successfully solved the case, while remaining cognitively engaged. This is the first design-based study regarding virtual patient simulation, reporting iterative phases of implementation and design improvement, culminating in local theories (petite generalizations) about VPS design. A thick, rich description of environment, process, and findings may benefit other researchers and institutions in designing and implementing effective VPS.
Contributors: McCoy, Lise (Author) / Wetzel, Keith (Thesis advisor) / Ewbank, Ann (Thesis advisor) / Simon, Harvey (Committee member) / Arizona State University (Publisher)
Created: 2014
Description
As robotic systems are used in increasingly diverse applications, the interaction of humans and robots has become an important area of research. In many applications of physical human-robot interaction (pHRI), the robot and the human can be seen as cooperating to complete a task with some object of interest. Often these applications are in unstructured environments where many paths can accomplish the goal. This creates a need for the ability to communicate a preferred direction of motion between both participants in order to move in a coordinated way. This communication method should be bidirectional in order to fully utilize both the robot's and the human's capabilities. Moreover, in cooperative tasks between two humans, one human often operates as the leader of the task and the other as the follower, and these roles may switch during the task as needed. The need for communication extends to this leader-follower switching: not only must the desire to switch roles be communicated, but the switching process itself must be controlled. Impedance control has been used as a way of dealing with some of the complexities of pHRI. This investigation examined whether impedance control can be utilized as a way of communicating a preferred direction between humans and robots. The first set of experiments tested whether a human could detect a preferred direction of a robot by grasping and moving an object coupled to the robot. The second set tested the reverse case: whether the robot could detect the preferred direction of the human. The ability to detect the preferred direction was shown to be up to 99% effective. Using these results, a control method allowing a human and robot to switch leader and follower roles during a cooperative task was implemented and tested. This method proved successful 84% of the time. The control method was then refined using adaptive control, resulting in lower interaction forces and a success rate of 95%.
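The abstract does not give the controller details, but the classic impedance relation it builds on can be sketched as a virtual spring-damper between the robot and a reference position. In this hypothetical fragment, an anisotropic stiffness is one way a robot could render a "preferred direction" that a human partner can feel; all gains and names are illustrative assumptions, not the thesis implementation:

```python
# Minimal sketch of an impedance control law for pHRI (hypothetical
# parameters). The robot renders a virtual spring-damper between the
# end effector and a reference position; an anisotropic stiffness
# resists off-axis motion more strongly, which a human partner
# perceives as a preferred direction.

def impedance_force(k, b, x_ref, x, v):
    """Rendered force per axis: F_i = k_i (x_ref_i - x_i) - b_i v_i."""
    return [k[i] * (x_ref[i] - x[i]) - b[i] * v[i] for i in range(len(x))]

def simulate(mass, k, b, x_ref, x0, v0, dt=0.001, steps=5000):
    """Integrate a point mass under the impedance force (explicit Euler)."""
    x, v = list(x0), list(v0)
    for _ in range(steps):
        f = impedance_force(k, b, x_ref, x, v)
        for i in range(len(x)):
            v[i] += f[i] / mass * dt
            x[i] += v[i] * dt
    return x

# Stiffer along y than x: the mass still settles to the reference, but
# the anisotropy shapes how strongly off-axis disturbances are resisted.
final = simulate(mass=1.0, k=[50.0, 400.0], b=[30.0, 80.0],
                 x_ref=[1.0, 1.0], x0=[0.0, 0.0], v0=[0.0, 0.0])
```

With these (assumed) overdamped gains the simulated mass converges to the reference on both axes within the five simulated seconds.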
Contributors: Whitsell, Bryan (Author) / Artemiadis, Panagiotis (Thesis advisor) / Santello, Marco (Committee member) / Santos, Veronica (Committee member) / Arizona State University (Publisher)
Created: 2014
Description
The slider-crank mechanism is popularly used in internal combustion engines to convert the reciprocating motion of the piston into rotary motion. This research discusses an alternate mechanism proposed by Wiseman Technology Inc., which involves replacing the crankshaft with a hypocycloid gear assembly. The unique hypocycloid gear arrangement allows the piston and the connecting rod to move in a straight line, creating a perfect sinusoidal motion. To analyze the performance advantages of the Wiseman mechanism, engine simulation software was used. The Wiseman engine with the hypocycloid piston motion was modeled in the software, and the engine's simulated output was compared to that of a conventional engine of the same size. The software was also used to analyze the multi-fuel capabilities of the Wiseman engine using a contra piston. The engine's performance was studied while operating on diesel, ethanol, and gasoline fuel. Further, a scaling analysis of future Wiseman engine prototypes was carried out to understand how the performance of the engine is affected by increasing the output power and cylinder displacement. It was found that the existing Wiseman engine produced about 7% less power at peak speeds than the slider-crank engine of the same size. It also produced lower torque and was about 6% less fuel efficient than the slider-crank engine. These results were consistent with dynamometer tests performed in the past. The four-stroke diesel variant of the same Wiseman engine performed better than the two-stroke gasoline version, as well as the slider-crank engine, in all aspects. The Wiseman engine using a contra piston showed poor fuel efficiency while operating on E85 fuel, but it produced higher torque and about 1.4% more power than while running on gasoline.
While analyzing the effects of the engine size on the Wiseman prototypes, it was found that the engines performed better in terms of power, torque, fuel efficiency and cylinder BMEP as their displacements increased. The 30 horsepower (HP) prototype, while operating on E85, produced the most optimum results in all aspects and the diesel variant of the same engine proved to be the most fuel efficient.
Contributors: Ray, Priyesh (Author) / Redkar, Sangram (Thesis advisor) / Mayyas, Abdel Ra'Ouf (Committee member) / Meitz, Robert (Committee member) / Arizona State University (Publisher)
Created: 2014
Description
Current federal and state education mandates were developed to make schools accountable for student performance, with the rationale that schools, teachers, and students will improve through the administration of high-stakes tests. Public schools are mandated to adhere to three accountability systems: national, state, and local. Additional elements include the recent implementation of the Common Core standards and newly devised state accountability systems that are granted through waivers as an alternative to the accountability mandates in the No Child Left Behind (NCLB) Act of 2001. Teachers' voices have been noticeably absent from the accountability debates, but as studies show, as primary recipients of accountability sanctions, many teachers withdraw, "burn out," or leave the profession altogether. The present study is based on the premise that teachers are vital to student achievement, and that their perspectives and understandings are therefore a resource for educational reform, especially in light of the accountability mandates under NCLB. With that premise as a starting point, this dissertation examines practicing urban teachers' experiences of accountability in culturally and linguistically diverse schools. To fulfill these goals, this qualitative study used individual and focus group interviews and observations with veteran elementary school teachers in an urban Southwestern public school district to ascertain practices they perceive to be effective. The study's significance lies in informing stakeholders, researchers, and policymakers of practicing teachers' input on accountability mandates in diverse urban schools.
Contributors: Gishey, Rhiannon L (Author) / Mccarty, Teresa L (Thesis advisor) / Fischman, Gustavo E (Committee member) / Ikeler, Susan (Committee member) / Arizona State University (Publisher)
Created: 2013
Description
Creative design lies at the intersection of novelty and technical feasibility. These objectives can be achieved through cycles of divergence (idea generation) and convergence (idea evaluation) in conceptual design. The focus of this thesis is on the latter aspect. The evaluation may involve any aspect of technical feasibility and may be desired at component, sub-system or full system level. Two issues that are considered in this work are: 1. Information about design ideas is incomplete, informal and sketchy 2. Designers often work at multiple levels; different aspects or subsystems may be at different levels of abstraction Thus, high fidelity analysis and simulation tools are not appropriate for this purpose. This thesis looks at the requirements for a simulation tool and how it could facilitate concept evaluation. The specific tasks reported in this thesis are: 1. The typical types of information available after an ideation session 2. The typical types of technical evaluations done in early stages 3. How to conduct low fidelity design evaluation given a well-defined feasibility question A computational tool for supporting idea evaluation was designed and implemented. It was assumed that the results of the ideation session are represented as a morphological chart and each entry is expressed as some combination of a sketch, text and references to physical effects and machine components. Approximately 110 physical effects were identified and represented in terms of algebraic equations, physical variables and a textual description. A common ontology of physical variables was created so that physical effects could be networked together when variables are shared. This allows users to synthesize complex behaviors from simple ones, without assuming any solution sequence. A library of 16 machine elements was also created and users were given instructions about incorporating them. 
To support quick analysis, differential equations are transformed to algebraic equations by replacing differential terms with steady-state differences; only steady-state behavior is considered, and interval arithmetic is used for modeling. The tool is implemented in MATLAB, and a number of case studies demonstrate how it works.
Contributors: Khorshidi, Maryam (Author) / Shah, Jami J. (Thesis advisor) / Wu, Teresa (Committee member) / Gel, Esma (Committee member) / Arizona State University (Publisher)
Created: 2014
Description
Public demands for accountability and educational change are at an all-time high. No Child Left Behind set the stage for public accountability of educators, and the recently created Race to the Top grant raised the stakes of public school accountability even more with the creation of national standards and assessments, as well as public accountability of individual teacher performance based on student test scores. This high-stakes context has placed pressure on local schools to change their instructional practices rapidly to ensure students are learning what they need to in order to perform well on looming Partnership for Assessment of Readiness for College and Careers (PARCC) exams. The purpose of this mixed methods action research study was to explore a shared leadership model and discover the impact of a change facilitation team using the Concerns Based Adoption Model tools on the speed and quality of innovation diffusion at a Title One elementary school. The nine-member change facilitation team received support for 20 weeks in the form of professional development and ongoing team coaching as a means to empower teacher-leaders to more effectively take on the challenges of change. Eight of those members participated in this research. This approach draws on the research on change, learning organizations, and coaching. Quantitative results from the Change Facilitator Stages of Concern Questionnaire were triangulated with qualitative data from interviews, field notes, and Innovation Configuration Maps. Results show the impact on instructional innovation when teacher-leadership is leveraged to support change. Further, there is an important role for change coaches when leading change initiatives. Implications from this study can be used to support other site leaders grappling with instructional innovation, and the findings call for additional research.
Contributors: Cruz, Jennifer (Author) / Zambo, Debby (Thesis advisor) / Foulger, Teresa (Committee member) / Tseunis, Paula (Committee member) / Arizona State University (Publisher)
Created: 2014
Description
This thesis contains the applications of the ASU mathematical model (Tolerance Maps, T-Maps) to the construction of T-Maps for patterns of line profiles. Previously, Tolerance Maps were developed for patterns of features such as holes, pins, slots, and tabs to control their position. The T-Maps developed in this thesis are fully compatible with the ASME Y14.5 Standard. A pattern of square profiles, both linear and 2D, is used throughout this thesis to illustrate the idea of constructing the T-Maps for line profiles. The Standard defines two ways of tolerancing a pattern of profiles: Composite Tolerancing and Multiple Single Segment Tolerancing. Further, in the composite tolerancing scheme, there are two different ways to control the entire pattern: repeating a single datum or two datums in the secondary datum reference frame. T-Maps are constructed for all the different specifications. The Standard also describes a way to control the coplanarity of discontinuous surfaces using a profile tolerance, and T-Maps have been developed for this case as well. Since verification of manufactured parts relative to the tolerance specifications is crucial, a least squares fit approach, which was developed earlier for line profiles, has been extended to patterns of line profiles. For a pattern, two tolerances are specified, and the manufactured profile needs to lie within the tolerance zones established by both of these tolerances. An i-Map representation of the manufactured variation, located within the T-Map, is also presented in this thesis.
Contributors: Rao, Shyam Subramanya (Author) / Davidson, Joseph K. (Thesis advisor) / Arizona State University (Publisher)
Created: 2014
Description
Conformance of a manufactured feature to the applied geometric tolerances is determined by analyzing the point cloud that is measured on the feature. To that end, a geometric feature is fitted to the point cloud, and the results are assessed to see whether the fitted feature lies within the specified tolerance limits. Coordinate Measuring Machines (CMMs) use feature-fitting algorithms that incorporate least-squares estimates as a basis for obtaining minimum, maximum, and zone fits. However, a comprehensive set of algorithms addressing the fitting procedure (all datums, targets) for every tolerance class is not available. Therefore, a library of algorithms is developed to aid the process of feature fitting and tolerance verification. This work addresses linear, planar, circular, and cylindrical features only. The set of algorithms described conforms to the international standards for GD&T. In order to reduce the number of points to be analyzed, and to identify the possible candidate points for linear, circular, and planar features, 2D and 3D convex hulls are used. For minimum, maximum, and Chebyshev cylinders, geometric search algorithms are used. The algorithms are divided into three major categories: least-squares, unconstrained, and constrained fits. Primary datums require one-sided unconstrained fits for their verification. Secondary datums require one-sided constrained fits for their verification. For size and other tolerance verifications, both unconstrained and constrained fits are required.
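As an illustration of the least-squares starting point described above (a sketch only, not the library's actual code), the orthogonal least-squares line for a 2D point set has a closed form: it passes through the centroid, with its direction along the principal axis of the scatter. The maximum orthogonal deviation of the measured points can then be compared against the tolerance zone:

```python
import math

# Orthogonal (total) least-squares line fit for a 2-D point cloud,
# of the kind CMM feature fitting uses as a first estimate.

def fit_line(points):
    """Return (centroid, angle) of the orthogonal least-squares line.

    The principal-axis angle has the closed form
    theta = 0.5 * atan2(2*Sxy, Sxx - Syy) from the centered moments.
    """
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    sxx = sum((p[0] - cx) ** 2 for p in points)
    syy = sum((p[1] - cy) ** 2 for p in points)
    sxy = sum((p[0] - cx) * (p[1] - cy) for p in points)
    theta = 0.5 * math.atan2(2.0 * sxy, sxx - syy)
    return (cx, cy), theta

def max_deviation(points, centroid, theta):
    """Largest orthogonal distance from the fitted line, for comparison
    against the tolerance zone half-width."""
    nx, ny = -math.sin(theta), math.cos(theta)  # unit normal to the line
    return max(abs((p[0] - centroid[0]) * nx + (p[1] - centroid[1]) * ny)
               for p in points)

# Hypothetical measured points, nearly collinear along y = x.
pts = [(0.0, 0.01), (1.0, 0.99), (2.0, 2.02), (3.0, 2.98)]
c, th = fit_line(pts)
dev = max_deviation(pts, c, th)  # accept if dev <= half the zone width
```

For these near-collinear sample points the fitted angle is close to 45 degrees and the residual deviation is a few hundredths of a unit; an actual one-sided or zone fit would then refine this estimate, as the abstract describes.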
Contributors: Mohan, Prashant (Author) / Shah, Jami (Thesis advisor) / Davidson, Joseph K. (Committee member) / Farin, Gerald (Committee member) / Arizona State University (Publisher)
Created: 2014