Carnegie Mellon University

Eberly Center

Teaching Excellence & Educational Innovation

Quick-Fire Talks

Teams That Work: Developing Effective Teams Through Evidence-Based Skills Assessment Discovery

Carla Bevins, Tepper School of Business

Teamwork makes the dream work, right? Yet we can all pinpoint times in our own classes when our students’ team projects were not as successful as we had hoped. Instead of starting with the task, let us look at the people on the teams and the skills and experiences they bring to the table. By using individual leadership-style and conflict-management-style assessments in conjunction with Tuckman’s Stages of Group Development, we can teach our students how to build robust teams. Students focus on each team member’s strengths before delving into the assigned task, leading to more meaningful, successful project outcomes. Through in-class lectures, discussions, and activities, students learn how to develop successful teams and how to transfer these skills to future teams. This instructional strategy is essential for any class focused on high-performance team development and the growth of communication skills.

Virtual Reality (VR) App: A Supplemental Instructional Tool 

Kim Hyatt, Heinz

Fear of public speaking is a real and common anxiety for professionals and students alike. In response, I developed a virtual reality (VR) application for classroom use that helps improve public speaking skills using artificial intelligence, machine learning, and reflection. Students record a speech in VR and receive immersive, real-time, and post-presentation feedback on their performance. After practice sessions, students present in front of their peers to gauge their improvement. Multiple types of data were collected: (1) user experiences in terms of self-efficacy (i.e., how confident students feel about their ability to be public speakers and to continue improving); (2) system log data to understand which features of the system are tied to user outcomes; (3) student performances (rubric scores) before and after using the VR app; and (4) student self-assessments and reflections. This talk will share features of the VR app and some results from the pilot study.

Improving Peer Feedback in the Classroom

Jessica Hammer, HCII

Peer feedback, in which students comment on one another’s work, helps both feedback providers and feedback receivers learn. However, left to themselves, students struggle to be appropriately critical, to justify the feedback they provide, and to make sense of the feedback they receive. The EOTA (Experiences, Observations, Theories, Advice) process addresses all three of these issues through a structured, non-digital feedback format. This talk presents the EOTA method as used for live in-class feedback in the design classroom; the method can also be extended to asynchronous or online environments, or to any teaching context that leverages studio-style critique or peer feedback on design projects.

Does Active Learning Help? If So, How Best to Debrief It?

Mike McCarthy & Marty Barrett, Heinz

Highly scripted labs overseen by teaching assistants can be a scalable way to increase active learning in computer science courses with burgeoning enrollments. Despite positive student feedback on lab activities, some students were still not achieving learning goals, as reflected in their performance on exam questions. We attempted to increase student learning by adding short “lab debrief” exercises to the lecture sessions immediately following labs. Lab debriefs used multiple-choice questions and active learning in pairs of students. We evaluated three research questions. First, does adding brief active learning exercises to lectures improve student outcomes on lab content in a computer science course? Second, following active learning, what is the best way to debrief the exercise with the entire class? Specifically, does student learning differ when instructors debrief the rationale for all the answer choices versus only the correct answer? Third, does the impact of active learning depend on the type of question (recall versus application)?