Carnegie Mellon University

SUCCESS STORY

How a redesigned college-level course led to improved outcomes and better allocation of resources

Data informed the redesign of C@CM, an introductory course required for all incoming freshmen at CMU. The changes moved the course from a traditional instruction model delivered in computer labs to an online learning experience, supplemented by lab time only for students who need face-to-face support. The changes brought about a dramatically improved success rate (98 percent of students passed the most recent offering) and allowed both students and faculty to spend more time focused on core courses.

Two keys to the redesigned approach are the careful collection of learner interaction data from students' use of the course and a regular, iterative improvement cycle designed to leverage both quantitative and qualitative data.

How it worked:

Computing @ Carnegie Mellon (C@CM) is the only course required for all CMU undergraduates. Taught to more than 1,500 students each year, the course focuses on a range of skills essential for success at the university, including the responsible, efficient and safe use of the university's electronic resources; effective approaches for finding, evaluating and using information; and critical metacognitive skills.

C@CM is unique in that it has no single faculty instructor responsible for the course; instead, the C@CM team works with individual colleges and support partners across the university to develop appropriate learning activities. During the semester, the course is taught by trained undergraduate teaching assistants (TAs). For two decades, the course was taught in CMU computer labs using traditional instructional methods. This approach was expensive, both in lab use and in the size of the undergraduate TA staff needed to support the incoming class. Students also expressed continued dissatisfaction with the course, driven largely by the broad range of students' background experience: mixing expert and novice computer users in the same course created an unsatisfying learning experience for both.

A large-scale effort to reengineer the course was undertaken in 2010, rebuilding it as an Open Learning Initiative (OLI) online learning environment. This approach provided students with a set of carefully designed online instructional activities, informed by best practices in learning and technology. The OLI course was designed to provide a self-directed experience, scaffolded with additional human support. This delivery model offers students the opportunity to work ahead and attempt the final exam early, an option for those with pre-existing knowledge or schedule constraints, while a minimum schedule and required milestones provide the structure to ensure that students don't fall behind.

The data used:

The OLI approach rests on those same two keys: careful collection of learner interaction data from students' use of the course, and a regular, iterative improvement cycle designed to leverage both quantitative and qualitative data. While OLI is specifically designed for data capture and use with regard to the online learning activities, the C@CM team embraced this research-based approach for the entire course, from delivery and attendance to the support model and incentive structure.

This approach has left the team with exceptionally rich data from multiple sources, including:

  • OLI Learning Interaction data for all learning activities — how did students answer questions? What mistakes did they make? Where did they need help?
  • OLI Assessment data — how well did students achieve the learning outcomes as measured by the final exam?
  • OLI Timing and Engagement data — how did students use the course materials? For how long?
  • Semantic data, tying individual learning activities and questions to student-centered, measurable learning outcomes and sub-skills.
  • Learning Estimates produced by the OLI Learning Dashboard.
  • Gradebook information, including OLI use, exams, attendance and other graded activities.
  • Rich demographic information from the university’s student information system.
  • Extensive qualitative data from student surveys.
  • Outreach, attendance and other information captured by the C@CM staff throughout the delivery of the course.

Using this data, the C@CM team has worked through four cycles of improvement, with exceptional results. In approaching the course materials, the team has been able to use standard OLI and DataShop tools to identify problematic learning activities and outcomes, making appropriate corrections and improvements to the OLI course over time. The team was also able to identify areas where students were already well prepared and reduce the number of activities there; these targeted improvements raised learner productivity, reducing the amount of time students needed while improving course success rates.

A more thoughtful look at the data also revealed where the course development team could best spend its time. During preliminary development, significant time and resources were invested in creating high-quality walk-through videos for different course elements. But students' usage and success data indicated that these videos were not contributing to student success, particularly when compared to other, more active learn-by-doing opportunities. This allowed the development team to focus more of its time and resources on the activities that would best support learners.

Similarly, initial considerations for how to improve the course after preliminary use focused on pass rates for the entire population: with 91 percent of students passing the course, the team initially looked to create new activities to assist the remaining 9 percent. A closer look at the data, however, revealed a much different picture of success for students who engaged with the online learning materials:

          Engaged   Did not engage
  Pass    96%       67%
  Fail    4%        33%

The insight that course success depends on engagement allowed for a different improvement approach. Recognizing that students who work through the online course materials in a timely manner and seek support when necessary pass the course, the team was able to focus its attention on the appropriate areas: additional options for students who were engaging appropriately but still not passing. A modified incentive structure was created to encourage the non-engaging students to make better use of the course.
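
To make this kind of analysis concrete, the sketch below shows how such a pass-rate breakdown could be computed from a gradebook export, using Python and pandas. The file name and the engaged/passed columns are illustrative assumptions, not the actual OLI or C@CM data schema.

    import pandas as pd

    # Hypothetical gradebook export; the file name and the columns
    # (student_id, engaged, passed) are illustrative, not the actual
    # OLI or C@CM schema.
    gradebook = pd.read_csv("gradebook.csv")

    # Cross-tabulate outcome against engagement, normalizing within each
    # engagement group to get the pass/fail rate for that group, which
    # reproduces the shape of the table above.
    rates = pd.crosstab(gradebook["passed"], gradebook["engaged"],
                        normalize="columns")
    print(rates.round(2))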

Other areas where data has been leveraged to improve the course:
Exploratory analysis of demographic information revealed that a student's major or college did not have an impact on success, with one exception: students from two specific colleges were failing the course at nearly four times the rate of students from other colleges. A deeper analysis revealed two different problems: additional support structures were necessary for one college, while scheduling was a major factor for students from the other. Changes to the support model have been very effective for the first college, whose students now pass at rates similar to those of other colleges. The scheduling challenge has been more difficult to resolve, but the team is working with that college's faculty and will be trying some new approaches.
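
As an illustration of how this kind of exploratory demographic analysis might look, the sketch below groups outcomes by college and flags colleges whose failure rate is well above the overall rate. The file, column names and threshold are hypothetical; the real analysis would draw on the university's student information system.

    import pandas as pd

    # Hypothetical join of course outcomes with student-information-system
    # demographics; the file and column names (student_id, college, passed)
    # are illustrative only.
    df = pd.read_csv("outcomes_with_demographics.csv")

    # Failure rate per college, compared with the overall failure rate.
    by_college = df.groupby("college")["passed"].agg(
        fail_rate=lambda s: 1 - s.mean())
    overall_fail = 1 - df["passed"].mean()

    # Flag anomalous colleges; the 2x threshold is an illustrative choice.
    outliers = by_college[by_college["fail_rate"] > 2 * overall_fail]
    print(outliers)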


The larger impact:

This careful, iterative use of data to improve the learning experience over a four-year period has had a number of valuable effects:

  • The early access option allowed a large population of students to successfully complete and pass the course before arriving on campus, which meant they could spend more of their time on campus on core courses.
  • Students are entering the core courses better prepared, which means a better use of faculty time.
  • Delivery costs have been reduced, with more than 50 percent fewer TAs and lab sessions needed, which has allowed more resources to be devoted to core courses. Instructional teams can also focus more attention on learners and on iterative improvement.
  • Labs are composed of learners who most need assistance, which means the TAs’ time is being better used. Data has enabled this in two ways: by supporting the development of an appropriate support model and by providing live dashboards to TAs to better guide their instruction.
  • The improved productivity has allowed both the C@CM content and its learning outcomes to be expanded.
  • Overall, CMU has seen a dramatically improved success rate for the course — 98 percent of learners passed C@CM during the most recent academic year.