Exams in Hybrid/Remote Environments

The process of creating exams necessarily foregrounds considerations related to equity, level of difficulty, and accuracy of measurement. Exams in hybrid and remote modalities involve added complexities, including but not limited to exam format, use of technology, academic integrity, and accommodations for students with documentation from the Office of Disability Resources. While acknowledging that there is no single solution that will meet all needs, the following suggestions are designed to help instructors identify approaches that speak to these considerations.

Faculty who transition to remote exams report that it is insufficient to simply transfer questions from paper to online formats (Cramp et al., 2019); rather, time and attention should be directed to redesigning exams for remote formats (Böhmer et al., 2018). In particular, the following design considerations are important:

  • Students should be able to easily discern how to navigate within the exam. If possible, instructions should be itemized and distinct from the exam questions (Parshall et al., 2002). 
  • Consider how students will submit their responses. Exams that are written on paper and must be scanned and uploaded at the end of the exam period can create additional pressure points for students. If students can submit text-only responses, the quiz function in Canvas can be used for such exams, eliminating the need to scan and upload. Alternatively, editable exam templates and fillable PDF documents can also allow students to save and upload typed responses. (See also an instructor guide and student guide on using Gradescope assignments.)
  • If students are required to submit responses in formats other than text (e.g., drawings, figures) that do require scanning and uploading, factor in the additional time for these steps when designing the exam. Scanning apps such as CamScanner and Scannable allow students to scan documents from a smartphone or tablet.
  • The exam format and logistics – including technology needs – should be explicitly communicated in advance, and students should have the opportunity (and even encouragement and/or requirement) to practice in advance, so that technology problems are surfaced earlier rather than later and so that students have some familiarity with the format before exam time.
  • Consider using open-ended questions (with short or long answers) as well as/instead of multiple-choice questions. Open-ended question formats make it harder for students to share answers. 
  • For exams in which students have to perform a calculation or solve a problem, ask them to show their work or explain their approach to improve assessment accuracy and mitigate cheating risks.
  • Open-book, open-note exams are another option for hybrid/remote environments. These exams tend to focus more on whether students can apply concepts and how well they can explain their approach, rather than recalling facts or solving simple problems. 
  • For written assignments, create questions that require critical thinking, as these types of responses may mitigate opportunities for cheating (McNabb & Olmstead, 2009) and plagiarism (Heckler et al., 2013).
At the beginning of your exam/evaluation, include an explicit reminder to students of what is appropriate/inappropriate collaboration or use of resources for the exam/evaluation they are about to take. You can also note the potential range of consequences and ask students to acknowledge that they have reviewed and understand these expectations prior to beginning the assessment. Such reminders may help to reduce the likelihood of violations (Corrigan-Gibbs et al., 2015). 
  • For planning purposes, you can find information about your students’ time zones on course rosters.
  • To offer flexibility for students in different time zones, allow students to take Canvas exams/quizzes with a fixed time allotment and a flexible start window, so that all students, regardless of time zone, can participate. Alternatively, create a different version of the quiz for students in other time zones.
  • If it’s possible to create two forms of your assessment (see below), you can offer them to students at different times.
  • If you use multiple-choice questions, you can randomize the order of the answer choices for each question (Sullivan, 2016). This can be done when setting up a quiz in Canvas. 
  • Similarly, instructors who wish to substitute numbers in parameterized exam questions can do so within Canvas or OLI. Here, the system generates a version of the question for each student by inserting a number (from a range you indicate) for a given parameter of the question; an illustrative sketch of this idea appears after this list. To do this in Canvas, see the instructions on Creating a simple formula question and Creating a question with a single variable. Doing this in OLI does not require the creation of an entire OLI course. For more information on implementing this strategy with either tool, please email eberly-assist@andrew.cmu.edu.
  • Consider using two versions of the exam that are equivalent in difficulty but use slightly different questions (Chiesl, 2007). Note: Some instructors plan to write a set of exam questions that covers the most essential learning objectives for the course (Kinzie, 2020) and then to (randomly) sample from this set to create two different exams, with the intent that the randomization will, on average, address equivalence issues (a sketch of this sampling approach appears after this list).
    • You can do this manually, or you can create a “test bank” in Canvas from which to draw when creating an assessment. This approach reduces the perceived practicality of cheating (Sullivan, 2016) and may be especially relevant for students taking exams in different time zones.
    • Following the exam, you may wish to review the grade distributions across exam versions to determine whether a correction is needed to account for differences in difficulty. 
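
To make the parameterization idea concrete, the following is a minimal, illustrative Python sketch of how a per-student numeric variant can be generated from a question template. The template, parameter names, and ranges are hypothetical examples, not Canvas or OLI internals; in practice, Canvas formula questions and OLI perform this substitution for you once you define the parameters and their ranges.

```python
import random

# Hypothetical question template: {mass} and {accel} are the parameters.
# The question, names, and ranges are purely illustrative.
TEMPLATE = ("A cart of mass {mass} kg accelerates at {accel} m/s^2. "
            "What net force (in N) acts on it?")
RANGES = {"mass": (2, 10), "accel": (1, 5)}

def make_variant(template, ranges, rng):
    """Draw one value per parameter from its range and fill in the template."""
    values = {name: rng.randint(low, high) for name, (low, high) in ranges.items()}
    question = template.format(**values)
    answer = values["mass"] * values["accel"]  # F = m * a for this example
    return question, answer

rng = random.Random(2024)  # fixed seed so the generated variants are reproducible
for student in ["student_01", "student_02", "student_03"]:
    question, answer = make_variant(TEMPLATE, RANGES, rng)
    print(f"{student}: {question}  [key: {answer} N]")
```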
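
Similarly, here is a minimal sketch of the sampling strategy for building two exam forms from a shared question set, followed by a simple post-exam comparison of score distributions. The question bank, form sizes, and scores below are placeholder values for illustration only; in practice you would draw from your own question bank (e.g., one maintained in Canvas) and use your actual grade data.

```python
import random
import statistics

# Hypothetical question bank keyed by learning objective.
QUESTION_BANK = {
    "objective_1": ["Q1a", "Q1b", "Q1c", "Q1d"],
    "objective_2": ["Q2a", "Q2b", "Q2c", "Q2d"],
    "objective_3": ["Q3a", "Q3b", "Q3c", "Q3d"],
}

def build_two_forms(bank, per_objective=1, seed=0):
    """Sample non-overlapping questions per objective to build Form A and Form B."""
    rng = random.Random(seed)
    form_a, form_b = [], []
    for objective, questions in bank.items():
        picks = rng.sample(questions, 2 * per_objective)  # distinct questions
        form_a.extend(picks[:per_objective])
        form_b.extend(picks[per_objective:])
    return form_a, form_b

form_a, form_b = build_two_forms(QUESTION_BANK)
print("Form A:", form_a)
print("Form B:", form_b)

# After grading, compare score distributions on the two forms; a notable gap
# in the means may suggest one form was harder and a correction is warranted.
scores_a = [82, 75, 90, 68, 88]  # placeholder scores, for illustration only
scores_b = [79, 71, 85, 74, 80]
print("Form A mean/sd:", statistics.mean(scores_a), round(statistics.stdev(scores_a), 1))
print("Form B mean/sd:", statistics.mean(scores_b), round(statistics.stdev(scores_b), 1))
```

Sampling each objective the same number of times keeps the two forms aligned to the same learning objectives, which is what supports the equivalence argument noted above.
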
In a mastery-format exam, students can retake the exam (or different versions of the exam) multiple times in order to demonstrate mastery. Early attempts do not affect the student’s final grade. This focus on mastery over memorization allows students to demonstrate their knowledge more fully and can lower their stress. Mastery exams can follow many different formats and have been used in multiple departments, including Computer Science, the Information Networking Institute (INI), and Statistics and Data Science.
Although the majority of scholarship on oral exams has been conducted outside of the U.S., the education literature offers some evidence of the benefits of oral assessment approaches, including opportunities to focus on deep understanding as opposed to recall (Iannone & Simpson, 2012) as well as live prompting and correction by the instructor (Douglas & Knighten, 2014). Additionally, oral assessment formats minimize opportunities for plagiarism (Joughin, 1998). That said, student anxiety and uncertainty around oral question formats may be high (Huxham et al., 2010; Iannone & Simpson, 2015), so it is important to implement this approach with great care and consideration for a number of complexities, including elevated anxiety, implicit bias, and equity. For example, to reduce bias, instructors using this approach are advised to select the sub-sample of questions for oral review in advance and in some randomized manner. Important additional information regarding administration of oral exams is available here.
Multiple technology tools can be used during exam sessions to provide students with access to instructors and TAs, similar to what they would experience in a physical classroom. An added benefit of this approach is that instructors and TAs can likewise monitor students throughout the exam. One approach that has been used in the School of Computer Science combines Gradescope, Piazza, Zoom, and Slack.

Advance preparation and testing will help students demonstrate what they have learned without distractions and unnecessary anxiety. If technology is required for students during your exams/evaluations – for completing the assessment activity and/or for remote proctoring purposes – these strategies are strongly recommended:

  • Ensure that all students have the necessary technology and that it works properly for them in their remote learning environment. Enrollment Services sent a survey to all undergraduate and graduate students in July to inventory basic technology needs such as reliable internet access, computer/laptop, webcam, smartphone, and headset. Although this data collection provides important early information, student needs may change throughout the semester. If students report needs related to technology, their Student Affairs college liaison can work with them to provide support and identify resources.
  • Avoid using a technology that is new to students. Use technology tools that you have already successfully used with students (e.g., in prior online activity or assessment). 
  • Conduct a trial run with the technology. Schedule a trial run in which instructors and students practice using the planned exam-administration technologies. Do this far enough in advance that any technical glitches or gaps in students’ remote-working environments can be addressed (Cramp et al., 2019).
  • The trial run should contain all of the question types on the actual exam. This not only allows students to test the functionality of their technology but also affords an opportunity for meaningful review of content.
  • If students use any assistive technology, make sure it works with the designated exam technology. If you have questions about the compatibility of technology with assistive devices, please reach out to the Office of Disability Resources at access@andrew.cmu.edu.
  • Consider allowing soft deadlines. If your exam may be sensitive to student connectivity issues or to the time required to download, scan, and upload files, provide some extra time (above and beyond your exam-completion time) for these logistics, and be flexible if students report technical problems.
  • Provide a communication channel for students to contact you if technical issues arise during the exam session.

References

Böhmer, C., Feldmann, N., & Ibsen, M. (2018, April). E-exams in engineering education—online testing of engineering competencies: Experiences and lessons learned. In 2018 IEEE Global Engineering Education Conference (EDUCON) (pp. 571-576).

Chiesl, N. (2007). Pragmatic methods to reduce dishonesty in web-based courses. Quarterly Review of Distance Education, 8(3), 203.

Corrigan-Gibbs, H., Gupta, N., Northcutt, C., Cutrell, E., & Thies, W. (2015). Deterring cheating in online environments. ACM Transactions on Computer-Human Interaction (TOCHI), 22(6), 1-23.

Cramp, J., Medlin, J. F., Lake, P., & Sharp, C. (2019). Lessons learned from implementing remotely invigilated online exams. Journal of University Teaching & Learning Practice, 16(1), 10.

Heckler, N. C., Forde, D. R., & Bryan, C. H. (2013). Using writing assignment designs to mitigate plagiarism. Teaching Sociology, 41(1), 94-105.

Huxham, M., Campbell, F., & Westwood, J. (2010). Oral versus written assessments: A test of student performance and attitudes. Assessment & Evaluation in Higher Education, 37(1), 125-136.

Iannone, P., & Simpson, A. (2012). Oral assessment in mathematics: implementation and outcomes. Teaching Mathematics and Its Applications, 31(4), 179-190.

Iannone, P., & Simpson, A. (2015). Students’ views of oral performance assessment in mathematics: straddling the “assessment of” and “assessment for” learning divide. Assessment & Evaluation in Higher Education, 40(7), 971-987.

Joughin, G. (1998). Dimensions of oral assessment. Assessment & Evaluation in Higher Education, 23(4), 367-378.

Kinzie, J. (2020). How to Reorient Assessment and Accreditation in the Time of COVID‐19 Disruption. Assessment Update, 32(4), 4-5.

McNabb, L., & Olmstead, A. (2009). Communities of integrity in online courses: Faculty member beliefs and strategies. Journal of Online Learning and Teaching, 5(2), 208-23.

Parshall, C., Spray, J., Kalohn, J. & Davey, T. (2002). Practical issues in computer-based testing. New York, NY: Springer-Verlag.

Sullivan, D. P. (2016). An integrated approach to preempt cheating on asynchronous, objective, online assessments in graduate business classes. Online Learning, 20(3), 195-209.