Description

The purpose of this study was to investigate the effects of instructor response prompts and rubrics on students' performance in an asynchronous discussion-board assignment, their learning achievement on an objective-type posttest, and their reported satisfaction levels. Researchers who have studied asynchronous computer-mediated student discussion transcripts have found evidence of mostly mid-level critical thinking skills, with fewer examples of lower- or higher-order thinking skills. Some researchers suggest that instructors may facilitate increased demonstration of higher-order critical thinking skills within asynchronous discussion-board activities. However, little empirical evidence is available to compare the use of different external supports to facilitate students' critical thinking performance and learning achievement in blended learning environments.

Results of the present study indicate that response prompts and rubrics can affect students' discussion performance, learning, and satisfaction ratings. The results, however, are complex, perhaps mirroring the complexity of instructor-led online learning environments. Regarding discussion-board performance, presenting students with a rubric tended to yield higher scores on most aspects, that is, on overall performance as well as depth and breadth of performance, though these differences were not significant. In contrast, instructor prompts tended to yield lower scores on aspects of discussion-board performance; on breadth, this main-effect difference was significant. Interactions also indicated significant differences on several aspects of discussion-board performance, in most cases indicating that the combination of rubric and prompt was detrimental to scores.

Learning performance on the quiz again showed the effectiveness of rubrics: students who received the rubric earned significantly higher scores, with no main effects or interactions for instructor prompts. Regarding student satisfaction, again, the picture is complicated. Results indicated that, in some instances, the integration of prompts resulted in lower satisfaction ratings, particularly in students' perceptions of the amount of work required, learning in the partially online format, and student-to-student interaction. Based on these results, design considerations are proposed to support rubric use and explicit feedback in asynchronous discussions in support of student learning.
Contributors: Giacumo, Lisa (Author) / Savenye, Wilhelmina (Thesis advisor) / Nelson, Brian (Committee member) / Legacy, Jane (Committee member) / Bitter, Gary (Committee member) / Arizona State University (Publisher)
Created: 2012
Description

The purpose of this instructional design and development study was to describe, evaluate, and improve the instructional design process and the work of interdisciplinary design teams. A National Science Foundation (NSF) funded Transforming Undergraduate Education in Science (TUES) project was the foundation for this study. The project developed new curriculum materials to teach content on unsaturated soils in undergraduate geotechnical engineering classes, a subset of civil engineering. The study describes the instructional design (ID) processes employed by the team members as they assessed the need, developed the materials, disseminated the learning unit, and evaluated its effectiveness, along with the role the instructional design process played in the success of the learning materials with regard to student achievement and faculty and student attitudes. Learning data were collected from undergraduate geotechnical engineering classes at eight partner universities across the country and Puerto Rico over three phases of implementation. Data collected from students and faculty included pretest/posttest scores and attitudinal survey responses. The findings indicated significant learning growth among students whose faculty were provided all of the learning materials. The findings also indicated overall faculty and student satisfaction with the instructional materials. Observational and anecdotal data were also collected in the form of team meeting notes, personal observations, interviews, and design logs. These data indicated a preference for working on an interdisciplinary instructional design team. Together, these data supported the analysis of the ID process, providing a basis for descriptive and inferential findings used to suggest improvements to the ID process and the work of interdisciplinary instructional design teams.
Contributors: Ornelas, Arthur (Author) / Savenye, Wilhelmina C. (Thesis advisor) / Atkinson, Robert (Committee member) / Bitter, Gary (Committee member) / Houston, Sandra (Committee member) / Arizona State University (Publisher)
Created: 2015
Description

While predicting completion in Massive Open Online Courses (MOOCs) has been an active area of research in recent years, predicting completion in self-paced MOOCs, the fastest-growing segment of open online courses, has largely been ignored. Using learning analytics and educational data mining techniques, this study examined data generated by over 4,600 individuals working in a self-paced, open-enrollment college algebra MOOC over a period of eight months.

Although just 4% of these students completed the course, models were developed that could correctly predict, nearly 80% of the time, which students would complete the course and which would not, based on each student's first day of work in the online course. Logistic regression was used as the primary tool to predict completion, focusing on variables associated with self-regulated learning (SRL) and demographic variables available from survey information gathered as students begin edX courses (the MOOC platform employed).

The strongest SRL predictor was the amount of time students spent in the course on their first day. The number of math skills obtained the first day and the pace at which these skills were gained were also predictors, although pace was negatively correlated with completion. Prediction models using only SRL data obtained on the first day in the course correctly predicted course completion 70% of the time, whereas models based on first-day SRL and demographic data made correct predictions 79% of the time.
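The modeling approach described in this abstract can be sketched in code. The sketch below is illustrative only: the feature names (first-day minutes, skills mastered, pace) follow the abstract, but the data are synthetic, and the coefficients and completion rate are assumptions chosen to mirror the reported directions of effect (positive for time and skills, negative for pace), not the study's actual dataset or results.

```python
# Hypothetical sketch: predicting MOOC completion from first-day
# self-regulated learning (SRL) features with logistic regression.
# Data are synthetic; names and effect sizes are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000

# Synthetic first-day SRL features (illustrative names):
minutes = rng.gamma(shape=2.0, scale=30.0, size=n)   # time in course on day 1
skills = rng.poisson(lam=3.0, size=n)                # math skills mastered
pace = skills / np.maximum(minutes / 60.0, 0.1)      # skills per hour

# Completion is generated to depend positively on time and skills and
# negatively on pace, mirroring the direction of effects in the abstract;
# the intercept keeps the completion rate low, as in the study.
logit = 0.04 * minutes + 0.5 * skills - 0.6 * pace - 5.0
completed = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

X = np.column_stack([minutes, skills, pace])
X_tr, X_te, y_tr, y_te = train_test_split(X, completed, random_state=0)

model = LogisticRegression(max_iter=1000)
model.fit(X_tr, y_tr)
accuracy = model.score(X_te, y_te)
print(f"held-out accuracy: {accuracy:.2f}")
```

In practice, with completion rates this low, raw accuracy is an optimistic metric (predicting "no one completes" already scores well), so a study like this would typically also examine sensitivity to the rare completer class.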
Contributors: Cunningham, James Allan (Author) / Bitter, Gary (Thesis advisor) / Barber, Rebecca (Committee member) / Douglas, Ian (Committee member) / Arizona State University (Publisher)
Created: 2017