Tips for Remote Assessments

The UTM Office of the Vice-Principal Academic & Dean (OVPAD) recently undertook a review of issues around remote assessments (assignments, quizzes, tests, and exams administered online), including an evaluation of the impact that the switch to remote learning may have had on academic integrity in the Winter 2020 term. Two main data sets were examined: (1) a year-over-year analysis of GWR (grade withheld pending review) requests made during the “transition period” (March 16 - May 6, 2020), and (2) consultations with UTM instructors who participated in beta testing ProctorU for exams. Key observations and recommendations are provided below to guide instructors on syllabus design, assessment design, the pros and cons of exam software, and class management practices related to academic integrity and online assessment for the upcoming Fall 2020 and Winter 2021 terms.

Sections
Syllabus Design
Assessment Design
Exam Software
Class Management

 

Syllabus Design

The review found that the majority of GWR requests submitted during the transition period pertained to assessments weighted at 25% or more of the final grade. The following practices can help reduce academic integrity issues.

  • Clearly delineate when peer collaboration is acceptable. Based on in-class experiences (e.g., labs, group work), students may have the impression that collaboration is permitted when completing equivalent assessments in a remote environment. Always make expectations for assessment completion clear.

  • Monitor relevant online resources. Instructors must anticipate that their assessment questions will be posted to third-party websites (Easy Edu, Chegg, etc.) and shared over social media (even Twitter!). Instructors are encouraged to include copyright statements in their syllabi and bring copyright claims against websites hosting their course materials.

  • Be mindful of student stress levels and amenable to accommodations. Students registered with Accessibility Services may require accommodations (e.g., additional time or multiple breaks) that can be challenging to implement in remote assessments. Instructors may contact Accessibility Services in advance (access.utm@utoronto.ca) and request assistance. Further information and advice on tests and other topics can be found at https://www.utm.utoronto.ca/accessibility/covid-19-updates-faculty.

 

Assessment Design

The review suggested that question type was a major factor in the frequency of academic offences.

  • Open-book assessments do not necessarily prevent academic offences; instead, they can create the (mis)perception that it is fair to consult any resource available online. Instructors must be explicit about what is allowed in assessments and advise students on the risks of engaging with unauthorized aids.

  • Adjusting the difficulty of an open-book assessment is not a solution. Academic offences have occurred both when questions were too easy (and “Google-able”) and when they were too hard. Increasing the difficulty simply to defeat online searches may punish students who try to complete the assessment honestly and may drive them to use unauthorized aids, especially if they feel that their peers are also using such aids.

  • Randomization and versioning may help to reduce (and to identify) academic offences but can be circumvented. Some cases involved students submitting answers to other versions of the assessment, revealing that they had communicated with peers (via, e.g., WhatsApp or Facebook) during the assessment period; a version-assignment sketch follows this list.

  • Essay-like questions seem to result in fewer academic offences; however, plagiarism on essay tests is easier in a remote environment. Cases included answers copied and pasted from a variety of sources, including class lecture slides. Please be aware that Turnitin is integrated only with Quercus Assignments, not with Quercus Quizzes.

  • Control the time that students have to complete assessments. A too-generous time window creates more opportunity for the assessment to be shared and circulated. It is a deterrent if communicating with peers or searching for solutions online would leave too little time to complete the assessment itself.

  • Use Quercus Quiz features to support academic integrity, as appropriate. For example, break assessments into distinct, time-limited parts with no returning to a previous part; this also allows breaks to be inserted into long tests/exams (e.g., two 50-minute sections separated by a 10-minute break). Set a time limit for Quercus Quizzes, allow only one attempt, and do not let students see their responses until everyone has completed the quiz (a configuration sketch follows this list).
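
As noted in the randomization item above, version assignment is most useful when each student's version is recorded deterministically. The following is a minimal Python sketch of one way to do this; the version labels, roster, and student numbers are illustrative placeholders, not drawn from the review.

    # Deterministically assign each student to one of several assessment
    # versions. A stable hash of the student number keeps the mapping
    # reproducible across runs, so a mismatched submission stands out later.
    import hashlib

    VERSIONS = ["A", "B", "C", "D"]   # illustrative version labels

    def assigned_version(student_id: str) -> str:
        digest = hashlib.sha256(student_id.encode("utf-8")).hexdigest()
        return VERSIONS[int(digest, 16) % len(VERSIONS)]

    # Record the mapping when the assessment is released; a student who
    # submits answers keyed to a different version warrants a closer look.
    roster = ["1004567890", "1009876543", "1001234567"]   # placeholder ids
    for sid in roster:
        print(sid, "->", assigned_version(sid))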

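For the Quercus Quiz settings above, note that Quercus is U of T's instance of Canvas, so the same options can also be set programmatically through the Canvas LMS REST API. The sketch below is illustrative only: COURSE_ID and API_TOKEN are placeholders, and parameter names should be verified against the current Canvas API documentation before use.

    # Sketch: creating a timed, single-attempt Quercus (Canvas) quiz with
    # hidden results via the Canvas REST API.
    import requests

    API_BASE = "https://q.utoronto.ca/api/v1"   # Quercus Canvas instance
    API_TOKEN = "YOUR_API_TOKEN"                # placeholder token
    COURSE_ID = 12345                           # placeholder course id

    payload = {
        "quiz": {
            "title": "Term Test, Part 1",
            "quiz_type": "assignment",
            "time_limit": 50,                 # minutes
            "allowed_attempts": 1,            # single attempt only
            "shuffle_answers": True,          # randomize answer order
            "one_question_at_a_time": True,   # display one question at a time
            "cant_go_back": True,             # no returning to earlier questions
            "hide_results": "always",         # hide responses until released
        }
    }

    resp = requests.post(
        f"{API_BASE}/courses/{COURSE_ID}/quizzes",
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        json=payload,
    )
    resp.raise_for_status()
    print("Created quiz:", resp.json()["id"])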
 

Exam Software

There are two types of exam software currently in use at U of T: remote proctoring and exam delivery. Remote proctoring software, such as ProctorU and Examity, runs on a student’s computer and uses the student’s webcam, along with academic integrity algorithms, to monitor their behaviour while they take the exam. Exam delivery software, such as ExamSoft, also runs on a student’s computer but delivers the exam in-app while locking down the computer to ensure that students cannot refer to notes or search for answers online. U of T has existing agreements with ProctorU, Examity, and ExamSoft, but only ProctorU is integrated with Quercus at present.

As ProctorU does not have the capacity to deliver live proctoring for large courses, the ProctorU pilot group used the “ProctorU AutoLaunch/Record” service, which records students via webcam as well as all activity on their computer screens. Recordings are later reviewed by ProctorU invigilators, who flag suspected academic offences for the instructor’s attention.

  • Remote proctoring requires a very large time investment. Some instructors commented that the additional workload from onboarding students, answering student questions, and reviewing flagged recordings amounted to teaching an entire extra course.

  • ProctorU does not scale well to large classes. Onboarding a large volume of students at the same time can overload ProctorU’s servers and cause technical issues for both students and ProctorU. If using ProctorU with a large class, consider implementing staggered start times (a scheduling sketch follows this list).

  • Hold a practice run to resolve onboarding and technical issues in advance. The first experience with ProctorU can be very unpredictable; completion rates in the pilot are reported to have ranged from roughly 40% to 98%. Instructors had no issues with deferred exams, as there were fewer students to manage and instructors knew what to expect of the service by then. Note that ProctorU charges for practice runs; remember to include this in cost estimates.

  • ProctorU’s academic integrity reports are not very helpful. Instructors noticed synchronization issues between the webcam recordings and screen recordings, which made reviewing the reports cumbersome. Flagged activities were also often unclear, and flagging practices were inconsistent across ProctorU invigilators.
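
For the staggered start times suggested in the scaling item above, the following minimal sketch computes cohort start times for a large class. The cohort size, gap, and dates are illustrative assumptions, not ProctorU requirements.

    # Sketch: spreading ProctorU onboarding load by assigning staggered
    # start times to cohorts of students. All parameters are illustrative.
    from datetime import datetime, timedelta

    def staggered_starts(n_students, cohort_size, first_start, gap_minutes):
        """Yield (cohort number, start time, students in cohort) tuples."""
        n_cohorts = -(-n_students // cohort_size)   # ceiling division
        for i in range(n_cohorts):
            size = min(cohort_size, n_students - i * cohort_size)
            yield i + 1, first_start + timedelta(minutes=i * gap_minutes), size

    for cohort, start, size in staggered_starts(
            n_students=420, cohort_size=100,
            first_start=datetime(2020, 12, 10, 9, 0), gap_minutes=20):
        print(f"Cohort {cohort}: {size} students start at {start:%H:%M}")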

IMPORTANT NOTE: If you want to use exam software for your course, you must create a plan that provides alternatives for students who cannot or do not wish to use the software; this plan is submitted as part of the software use request. You must also contact I&ITS well before the start of the term to confirm whether your request can be supported, as capacity on U of T’s licenses is very limited.

 

Class Management

It is important to remember that the great majority of our students do want to learn and be successful in their studies.

  • Promote a culture of academic integrity from the beginning of the course. Instructors can set expectations around remote assessments and help students understand the importance of academic integrity for their own learning. Instructors can remind students of, and ask them to commit to, academic integrity guidelines by making the first question on a test an academic integrity agreement. Some instructors have also had students collaborate on writing a “student code of behaviour.”