Matching Items (846)
Description
This study tested the effects of two kinds of cognitive, domain-based preparation tasks on learning outcomes after engaging in a collaborative activity with a partner. The collaborative learning method of interest was termed "preparing-to-interact," and is supported in theory by the Preparation for Future Learning (PFL) paradigm and the Interactive-Constructive-Active-Passive (ICAP) framework. The current work combined these two cognitive-based approaches to design collaborative learning activities that can serve as alternatives to existing methods, which carry limitations and challenges. The "preparing-to-interact" method avoids the need to train students in specific collaboration skills or to guide/script their dialogic behaviors, while giving students the opportunity to acquire the prior knowledge needed to make their discussions maximally productive for learning. The study used a 2x2 experimental design, investigating the factors of Preparation (No Prep and Prep) and Type of Activity (Active and Constructive) on deep and shallow learning. The sample comprised community college students in introductory psychology classes; the domain tested was "memory," in particular, concepts related to the process of remembering/forgetting information. Results showed that Preparation was a significant factor affecting deep learning, while shallow learning was not affected differently by the interventions. Essentially, with time-on-task and content equalized across all conditions, time spent individually preparing by working on the task alone and then discussing the content with a partner produced deeper learning than engaging in the task jointly for the duration of the learning period. Type of Activity was not a significant factor in learning outcomes; however, exploratory analyses showed evidence of Constructive-type behaviors leading to deeper learning of the content.
Additionally, a novel method of multilevel analysis (MLA) was used to examine the data and account for the dependency between partners within dyads. This work showed that "preparing-to-interact" is a way to maximize the benefits of collaborative learning. When students are first cognitively prepared, they seem to make the most efficient use of discussion and engage more deeply with the content, leading to deeper knowledge of it. Additionally, in using MLA to account for subject nonindependence, this work raises new questions about the validity of statistical analyses for dyadic data.
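The dyadic dependency that motivates MLA can be quantified with an intraclass correlation. The sketch below, with invented partner scores (not the study's data), computes the ANOVA-based ICC for groups of two; a high value means partners' scores are non-independent, so analyses treating individuals as independent units would overstate the effective sample size.

```python
# Invented post-test scores, one (partner_a, partner_b) tuple per dyad.
dyads = [(78, 74), (65, 61), (90, 88), (55, 60), (82, 79)]

def dyadic_icc(pairs):
    """ANOVA-based intraclass correlation for groups of size two."""
    n = len(pairs)
    grand = sum(a + b for a, b in pairs) / (2 * n)
    # Between-dyad mean square: spread of dyad means around the grand mean.
    msb = 2 * sum(((a + b) / 2 - grand) ** 2 for a, b in pairs) / (n - 1)
    # Within-dyad mean square: disagreement between partners in a dyad.
    msw = sum((a - (a + b) / 2) ** 2 + (b - (a + b) / 2) ** 2
              for a, b in pairs) / n
    return (msb - msw) / (msb + msw)

icc = dyadic_icc(dyads)  # close to 1.0 here: partners score alike
```

With an ICC this high, a multilevel model with dyad as the grouping level is the appropriate analysis.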
ContributorsLam, Rachel Jane (Author) / Nakagawa, Kathryn (Thesis advisor) / Green, Samuel (Committee member) / Stamm, Jill (Committee member) / Arizona State University (Publisher)
Created2013
Description
This study empirically evaluated the effectiveness of the instructional design, learning tools, and role of the teacher in three versions of a semester-long, high-school remedial Algebra I course to determine what impact self-regulated learning skills and learning pattern training have on students' self-regulation, math achievement, and motivation. The 1st version was a business-as-usual traditional classroom teaching mathematics with direct instruction. The 2nd version of the course provided students with self-paced, individualized Algebra instruction with a web-based, intelligent tutor. The 3rd version of the course combined self-paced, individualized instruction on the web-based, intelligent Algebra tutor with a series of e-learning modules on self-regulated learning knowledge and skills that were distributed throughout the semester. A quasi-experimental, mixed methods evaluation design was used by assigning pre-registered, high-school remedial Algebra I class periods made up of an approximately equal number of students to one of the three study conditions or course versions: (a) the control course design, (b) the web-based, intelligent tutor only course design, and (c) the web-based, intelligent tutor + SRL e-learning modules course design. While no statistically significant differences in SRL skills, math achievement, or motivation were found between the three conditions, effect-size estimates provide suggestive evidence that using the SRL e-learning modules based on the ARCS motivation model (Keller, 2010) and Let Me Learn learning pattern instruction (Dawkins, Kottkamp, & Johnston, 2010) may help students regulate their learning and improve their study skills while using a web-based, intelligent Algebra tutor, as evidenced by positive impacts on math achievement, motivation, and self-regulated learning skills.
The study also explored predictive analyses using multiple regression and found that predictive models based on independent variables aligned to student demographics, learning mastery skills, and ARCS motivational factors are helpful for further refining course design and for designing learning evaluations that measure achievement, motivation, and self-regulated learning in web-based learning environments, including intelligent tutoring systems.
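Because the group comparisons above rest on effect-size estimates rather than significance tests, the standardized mean difference is the relevant quantity. A minimal sketch of Cohen's d with a pooled-variance denominator, using invented condition scores (the numbers and group labels are illustrative, not the study's data):

```python
# Invented end-of-semester math scores for two conditions.
tutor_srl = [72, 68, 75, 80, 77, 70, 74, 79]  # tutor + SRL modules
control   = [65, 70, 62, 68, 66, 71, 64, 69]  # traditional classroom

def cohens_d(a, b):
    """Standardized mean difference with a pooled-variance denominator."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    pooled = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
    return (ma - mb) / pooled ** 0.5

d = cohens_d(tutor_srl, control)
```

An effect size reported this way stays interpretable even when small samples leave null-hypothesis tests underpowered, which is exactly the situation the abstract describes.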
ContributorsBarrus, Angela (Author) / Atkinson, Robert K (Thesis advisor) / Van de Sande, Carla (Committee member) / Savenye, Wilhelmina (Committee member) / Arizona State University (Publisher)
Created2013
Description
This study examined the relations between cognitive ability, socioemotional competency (SEC), and achievement in gifted children. Data were collected on children between the ages of 8 and 15 years (n = 124). Children were assessed via teacher reports of SEC, standardized cognitive assessment, and standardized achievement assessment. Composite achievement significantly correlated with all areas of SEC on the Devereux Student Strengths Assessment (DESSA), as did cognitive ability. Composite cognitive ability significantly correlated with composite achievement, as well as with achievement in all subject areas assessed. Achievement scores tended to be higher in older age groups than in younger age groups. When gender differences were found (in some areas of SEC and in language achievement), females tended to score higher. Gender moderated the relation between SEC and composite achievement. The areas of SEC that best predicted achievement, over and above the other SEC scales, were Optimistic Thinking, Self-Awareness, and Relationship Skills. While cognitive scores did not significantly predict achievement when controlling for SEC, SEC did significantly predict achievement over and above cognitive ability scores. Overall findings suggest that SEC may be important in children's school achievement; thus it is important for schools and families to promote the development of SEC in gifted children, especially in the areas of optimism and self-awareness.
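The "over and above" claims correspond to a hierarchical regression: fit achievement on cognitive ability alone, then refit with SEC added and compare R². A stdlib-only sketch with invented scores (all names and numbers are hypothetical, not the study's data):

```python
def ols_r2(predictors, y):
    """R-squared of a least-squares fit of y on the predictor columns
    (intercept included), via the normal equations."""
    n = len(y)
    rows = [[1.0] + list(r) for r in predictors]
    p = len(rows[0])
    # Build the augmented normal-equations system (A'A | A'y).
    M = [[sum(rows[i][r] * rows[i][c] for i in range(n)) for c in range(p)]
         + [sum(rows[i][r] * y[i] for i in range(n))] for r in range(p)]
    # Gauss-Jordan elimination with partial pivoting.
    for col in range(p):
        piv = max(range(col, p), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(p):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * b for a, b in zip(M[r], M[col])]
    beta = [M[r][-1] / M[r][r] for r in range(p)]
    fitted = [sum(b * v for b, v in zip(beta, row)) for row in rows]
    ybar = sum(y) / n
    ss_res = sum((yi - fi) ** 2 for yi, fi in zip(y, fitted))
    ss_tot = sum((yi - ybar) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

# Invented data: cognitive composite, SEC composite, achievement.
cog = [95, 102, 110, 98, 120, 105, 88, 115, 100, 108]
sec = [60, 45, 70, 52, 55, 66, 48, 58, 72, 50]
ach = [71, 62, 82, 66, 75, 78, 60, 75, 80, 67]

r2_cog = ols_r2([[c] for c in cog], ach)                   # cognitive only
r2_full = ols_r2([[c, s] for c, s in zip(cog, sec)], ach)  # + SEC
gain = r2_full - r2_cog  # variance SEC explains over and above cognition
```

A nonzero `gain` is the regression analogue of the abstract's finding that SEC predicts achievement beyond cognitive ability.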
ContributorsKong, Tiffany (Author) / Caterino, Linda (Thesis advisor) / Naglieri, Jack (Committee member) / Brulles, Dina (Committee member) / Arizona State University (Publisher)
Created2013
Description
The ability to design high performance buildings has acquired great importance in recent years due to numerous federal, societal and environmental initiatives. However, this endeavor is much more demanding in terms of designer expertise and time. It requires a whole new level of synergy between automated performance prediction and the human capability to perceive, evaluate and ultimately select a suitable solution. While performance prediction can be highly automated through the use of computers, performance evaluation cannot, unless it is with respect to a single criterion. The need to address multi-criteria requirements makes it more valuable for a designer to know the "latitude" or "degrees of freedom" available in changing certain design variables while achieving preset criteria such as energy performance, life cycle cost, and environmental impacts. This requirement can be met by a decision support framework based on near-optimal "satisficing" rather than purely optimal decision making techniques. Currently, such a comprehensive design framework is lacking, which is the basis for undertaking this research. The primary objective of this research is to facilitate a complementary relationship between designers and computers for Multi-Criterion Decision Making (MCDM) during high performance building design. It is based on the application of Monte Carlo approaches to create a database of solutions using deterministic whole building energy simulations, along with data mining methods to rank variable importance and reduce the multi-dimensionality of the problem. A novel interactive visualization approach is then proposed which uses regression-based models to show dynamically how varying these important variables affects the multiple criteria, while providing a visual range or band of variation of the different design parameters.
The MCDM process has been incorporated into an alternative methodology for high performance building design referred to as the Visual Analytics based Decision Support Methodology (VADSM). VADSM is envisioned to be most useful during the conceptual and early design performance modeling stages by providing a set of potential solutions that can be analyzed further for final design selection. The proposed methodology can be used for new building design synthesis as well as for evaluating retrofits and operational deficiencies in existing buildings.
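The Monte Carlo plus data-mining step can be sketched in miniature: sample the design space through a performance model, then rank the design variables by strength of association with the criterion. Here a toy linear-plus-noise function stands in for the whole-building energy simulation, and a simple correlation stands in for the study's data-mining importance measures; all variable names, ranges, and coefficients are invented.

```python
import random

random.seed(7)

# Stand-in performance model: annual energy-use index as a function of
# three design variables. A real study would call a deterministic
# whole-building energy simulation here.
def energy_use(wwr, r_value, cop):
    return 120 + 60 * wwr - 2.5 * r_value - 8 * cop + random.gauss(0, 2)

# Monte Carlo: sample the design space to build a database of solutions.
samples = []
for _ in range(2000):
    wwr = random.uniform(0.2, 0.8)   # window-to-wall ratio
    r = random.uniform(10, 40)       # envelope insulation R-value
    cop = random.uniform(2.0, 5.0)   # HVAC coefficient of performance
    samples.append(((wwr, r, cop), energy_use(wwr, r, cop)))

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    vx = sum((a - mx) ** 2 for a in xs) ** 0.5
    vy = sum((b - my) ** 2 for b in ys) ** 0.5
    return cov / (vx * vy)

# Rank variable importance by |correlation| with the criterion, a
# simple stand-in for the data-mining step described in the abstract.
names = ["window-to-wall ratio", "insulation R-value", "HVAC COP"]
ys = [e for _, e in samples]
ranking = sorted(
    ((abs(pearson([v[i] for v, _ in samples], ys)), names[i])
     for i in range(3)),
    reverse=True)
```

Dropping the lowest-ranked variables is what reduces the multi-dimensionality of the problem before the interactive visualization stage.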
ContributorsDutta, Ranojoy (Author) / Reddy, T Agami (Thesis advisor) / Runger, George C. (Committee member) / Addison, Marlin S. (Committee member) / Arizona State University (Publisher)
Created2013
Description
The microelectronics industry continues to move toward smaller devices and reduced form factors, resulting in new challenges. The reduction in device and interconnect solder bump sizes has led to increased current density in these small solder joints. The higher levels of electromigration that occur due to increased current density are a great concern, affecting the reliability of entire microelectronic systems. This paper reviews electromigration in Pb-free solders, focusing specifically on Sn-0.7 wt.% Cu solder joints. The effects of texture, grain orientation, and grain-boundary misorientation angle on electromigration and intermetallic compound (IMC) formation are studied through EBSD analysis performed on actual C4 bumps.
ContributorsLara, Leticia (Author) / Tasooji, Amaneh (Thesis advisor) / Lee, Kyuoh (Committee member) / Krause, Stephen (Committee member) / Arizona State University (Publisher)
Created2013
Description
Magnetic Resonance Imaging using spiral trajectories has many advantages in speed, efficiency of data acquisition, and robustness to motion- and flow-related artifacts. The increase in sampling speed, however, requires high performance from the gradient system. Hardware inaccuracies from system delays and eddy currents can cause spatial and temporal distortions in the encoding gradient waveforms. This causes sampling discrepancies between the actual and the ideal k-space trajectory. Reconstruction assuming an ideal trajectory can result in shading and blurring artifacts in spiral images. Current methods to estimate such hardware errors require many modifications to the pulse sequence, phantom measurements, or specialized hardware. This work presents a new method to estimate time-varying system delays for spiral-based trajectories. It requires a minor modification of a conventional stack-of-spirals sequence and analyzes data collected on three orthogonal cylinders. The method is fast, robust to off-resonance effects, requires no phantom measurements or specialized hardware, and estimates variable system delays for the three gradient channels over the data-sampling period. Initial results are presented for acquired phantom and in-vivo data, which show a substantial reduction in artifacts and improvement in image quality.
ContributorsBhavsar, Payal (Author) / Pipe, James G (Thesis advisor) / Frakes, David (Committee member) / Kodibagkar, Vikram (Committee member) / Arizona State University (Publisher)
Created2013
Description
Vehicle type choice is a significant determinant of fuel consumption and energy sustainability; larger, heavier vehicles consume more fuel and expel twice as many pollutants as their smaller, lighter counterparts. Over the course of the past few decades, vehicle type choice has seen a vast shift, with many households making more trips in larger vehicles with lower fuel economy. During the 1990s, SUVs were the fastest growing segment of the automotive industry, comprising 7% of the total light vehicle market in 1990 and 25% in 2005. More recently, due to rising oil prices, greater environmental awareness, the desire to reduce dependence on foreign oil, and the availability of new vehicle technologies, many households are considering newer vehicles with better fuel economy, such as hybrids and electric vehicles, over the SUVs or low fuel economy vehicles they may already own. The goal of this research is to examine how vehicle miles traveled, fuel consumption, and emissions may be reduced through shifts in vehicle type choice behavior. Using the 2009 National Household Travel Survey data, it is possible to develop a model to estimate household travel demand and total fuel consumption. Given a vehicle type choice shift scenario, the model can then calculate the potential fuel consumption savings that would result from such a shift. In this way, it is possible to estimate fuel consumption reductions under a wide variety of scenarios.
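The scenario arithmetic behind such a model is simple at heart: fuel use is miles traveled divided by fuel economy, so a shift scenario is the difference of two such terms scaled by the number of shifting households. All figures below (mileage, mpg values, fleet size, shift share) are invented for illustration:

```python
def annual_fuel_gallons(vmt, mpg):
    """Annual fuel use: vehicle miles traveled divided by fuel economy."""
    return vmt / mpg

vmt = 12000                              # assumed annual miles per vehicle
suv = annual_fuel_gallons(vmt, 18)       # SUV at 18 mpg
hybrid = annual_fuel_gallons(vmt, 45)    # hybrid at 45 mpg
savings = suv - hybrid                   # gallons saved per shifted vehicle

# Scale to a scenario: 10% of a hypothetical 1,000,000 SUV households shift.
scenario_savings = 0.10 * 1_000_000 * savings
```

A full model would draw the VMT and fleet-composition inputs from the household travel survey rather than assume them, but the per-scenario calculation has this structure.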
ContributorsChristian, Keith (Author) / Pendyala, Ram M. (Thesis advisor) / Chester, Mikhail (Committee member) / Kaloush, Kamil (Committee member) / Ahn, Soyoung (Committee member) / Arizona State University (Publisher)
Created2013
Description
The construction industry in India suffers from major time and cost overruns. Data from government and industry reports suggest that projects suffer 20 to 25 percent time and cost overruns. Waste of resources has been identified as a major source of inefficiency. Despite a substantial increase in the past few years, demand for professionals and contractors still exceeds supply by a large margin. The traditional methods adopted in the Indian construction industry may not meet the needs of this dynamic environment, as they have produced large inefficiencies. Innovative approaches to procurement and project management can satisfy these needs and bring added value. The problems faced by the Indian construction industry are very similar to those faced by other developing countries. The objective of this paper is to discuss and analyze the economic concerns and inefficiencies, and to investigate a model that both explains the structure of the Indian construction industry and provides a framework to improve efficiency. The Best Value (BV) model is examined as an approach to be adopted in lieu of the traditional approach. This could result in efficient construction projects, which until now have been a rarity, by minimizing cost overruns and delays.
ContributorsNihas, Syed (Author) / Kashiwagi, Dean (Thesis advisor) / Sullivan, Kenneth (Committee member) / Kashiwagi, Jacob (Committee member) / Arizona State University (Publisher)
Created2013
Description
The objective of this thesis was to compare various approaches for classifying 'good' and 'bad' parts via non-destructive resonance testing methods by collecting and analyzing experimental data in the frequency and time domains. A laser scanning vibrometer was employed to measure vibration in samples in order to determine spectral characteristics such as natural frequencies and amplitudes. Statistical pattern recognition tools such as the Hilbert-Huang transform, Fisher's discriminant, and neural networks were used to classify unknown samples as defective or not. In this work, finite element analysis software packages (ANSYS 13.0 and NASTRAN NX8.0) were used to obtain estimates of resonance frequencies in 'good' and 'bad' samples. Furthermore, a system identification approach was used to generate Auto-Regressive Moving Average with eXogenous input (ARMAX), Box-Jenkins, and Output Error models from experimental data that can be used for classification.
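As a toy view of the classification task, consider a single resonance-frequency feature: a linear discriminant then reduces to a threshold between the class means (the thesis itself uses richer spectral features and methods such as Fisher's discriminant and neural networks; the frequencies below are invented, not measured data).

```python
# Invented first-mode resonance frequencies (Hz) for known samples.
good = [1200.5, 1198.2, 1201.1, 1199.6, 1200.9]  # parts without defects
bad  = [1185.3, 1187.9, 1184.1, 1186.5, 1188.0]  # parts with defects

def mean(xs):
    return sum(xs) / len(xs)

# With one feature, a linear discriminant boundary is a single
# threshold; here we place it midway between the class means.
threshold = (mean(good) + mean(bad)) / 2

def classify(freq_hz):
    """Label an unknown sample by which side of the threshold it falls on."""
    return "good" if freq_hz > threshold else "bad"
```

Defects typically shift natural frequencies, which is why even this one-dimensional rule separates the invented classes; the real comparison in the thesis is over which richer method separates them most reliably.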
ContributorsJameel, Osama (Author) / Redkar, Sangram (Thesis advisor) / Arizona State University (Publisher)
Created2013
Description
Current emphasis on adequate academic progress, monitored by standardized assessments, has increased focus on student acquisition of required skills. Reading ability can be assessed through student achievement on Oral Reading Fluency (ORF) measures. This study investigated the effectiveness of using ORF measures to predict achievement on high stakes tests. Study participants included 312 students across four Title 1 elementary schools in a Southwestern United States school district utilizing the Response to Intervention (RTI) model. Participants' ORF scores from their first through third grade years and their third grade standardized achievement test scores were collected, along with information regarding reading interventions. Pearson product-moment correlations were used to determine how ORF scores and specific reading skills were related. Correlations were also used to assess whether ORF scores from the fall, winter, or spring were most related to high stakes test scores. Additionally, the difference between computer-based and instructor-led interventions in predicting high stakes test scores was assessed. Results indicated that correlation coefficients were larger between ORF and reading comprehension scores than between ORF and basic reading skills. ORF scores from the spring were more highly related to high stakes tests than those from other times of the year. Students' ORF scores were more strongly related to high stakes tests for students in computer-based interventions than for those in instructor-led interventions. In predicting third grade high stakes test scores, first grade ORF scores accounted for the most variance in the non-intervention sample, while third grade ORF scores accounted for the most variance in the intervention sample.
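The workhorse statistic here is the Pearson product-moment correlation. A small sketch with invented ORF and test scores (the numbers are illustrative, not the study's data):

```python
def pearson_r(x, y):
    """Pearson product-moment correlation between two score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Invented scores: spring ORF (words correct per minute) vs. scaled
# high-stakes reading test scores for six students.
orf = [62, 85, 110, 95, 70, 120]
test = [410, 455, 510, 480, 430, 525]
r = pearson_r(orf, test)
```

Squaring r gives the proportion of variance in test scores accounted for by ORF, which is the sense in which one grade's ORF scores can account for more variance than another's.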
ContributorsDevena, Sarah (Author) / Caterino, Linda (Thesis advisor) / Balles, John (Committee member) / Mathur, Sarup (Committee member) / Arizona State University (Publisher)
Created2013