IS OPEN SOURCE SAFE FOR DIGITAL EXAMS?


One of the fears we typically encounter is that online exams will increase academic dishonesty among students, on the assumption that computers provide students with more opportunities to cheat. But this is not necessarily the case.

According to a randomized response survey, an online setting for exams is not conducive to more cheating among students (Grijalva et al., 2006). Only an estimated 3-4 percent of respondents had cheated, which “suggests that academic dishonesty in a single online class is not greater than estimates of cheating in a traditional class” (Ibid., p. 13). A later study from another university finds that the number of students cheating is much higher, but that cheating rates in traditional and online settings remain comparable (Watson & Sottile, 2010). A third study suggests that an online setting is even less conducive to cheating than the traditional approach (Stuber-McEwen et al., 2009).

So, the fear of an increase in academic dishonesty is not supported by data, but that does not mean it should be disregarded; it remains a valid concern in general. And it should inform the choice of which approach to online exams an educational institution takes.

Open Source or Proprietary Code?

One of the tools providing security to online exams is a lockdown browser: a full-screen browser window in which the student sits the exam. It cannot be exited before the end of the exam, and all actions within the window can be restricted and controlled by the administrator of the exam, e.g. access to resources, applications or other websites. This allows permissions in the lockdown browser to be standardised to reflect the institutional guidelines for specific exams.
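As a minimal sketch of how such standardised, administrator-controlled permissions might be represented — the field names and URL below are illustrative assumptions, not WISEflow's actual configuration format — the essential idea is a deny-by-default policy:

```python
# Hypothetical lockdown-browser policy: everything is denied unless the
# exam administrator has explicitly allowed it. All names below are
# illustrative assumptions, not any vendor's real configuration schema.
ALLOWED_URLS = {"https://exams.example.edu/flow"}  # hypothetical exam URL
ALLOW_CLIPBOARD = False    # no copying material out of the exam window
ALLOW_OTHER_APPS = False   # no switching to other applications

def is_request_allowed(url: str) -> bool:
    """Check every navigation request against the allowlist; anything
    the administrator has not explicitly permitted is blocked."""
    return url in ALLOWED_URLS
```

Because the default is to deny, an institution only has to enumerate what a given exam permits, which is what makes per-exam standardisation practical.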

There are a number of decisions involved in choosing a lockdown browser. One of the earliest crossroads we encountered developing WISEflow was whether to go down the open source road or to go forward with a professionally developed tool with proprietary code for our lockdown browser. Open source has some positive sides and is widely used in developing new applications, as the ‘pre-made building blocks’ make development much more agile and keep development costs down. As such, it is a very attractive solution economically, and it provides an easy starting point, as the code is often well-documented.

However, in the end, we chose a professionally developed tool for several reasons. Our primary reason was that open source has some security issues that become very apparent when used in an exam and assessment environment. Instead, we moved forward with a tool with proprietary code, as this solution provided us with better security and a trusted partner to hold accountable regarding the security of the lockdown browser.

Technology is constantly under pressure, and this includes lockdown browsers. To counter external threats and attacks, we have a specific and well-known partner to collaborate with in maintaining and securing the lockdown browser.

This solution has proven itself, as the tech-savviness of students has grown rapidly alongside technological advances. This means that open source lockdown browsers are becoming increasingly vulnerable to modifications and alterations, allowing students to compromise the security of exams.

Exploits in Open Source Exam Software

The vulnerability of open source programs can differ widely depending on their use. For a lockdown browser for exams, tech-savvy students with the intention of cheating can try to find loopholes in the code or alter the code in order to compromise security measures, as the code by definition is openly accessible. The basic fact that it is open source “…could be a disadvantage with regard to security, as any examinee can access and modify the source code” (Søgaard, 2016, p. 6). For his thesis, a student in computer science at the Norwegian University of Science and Technology investigated the vulnerabilities of the lockdown browser his university used for digital exams (Heintz, 2017).

The lockdown browser in question (Safe Exam Browser) is an open source product, distributed under a licence which enables the student to view and modify the code of the lockdown browser. For example, by looking at the source code, he is able to understand how the browser generates the browser request hash for exam keys, which it uses to validate that the user in question is running the lockdown browser with the correct version and setup (Ibid., pp. 17, 19). With easy and open access to the code of the browser, the student figures out how to circumvent the security measures meant to keep users from running modified versions of the browser, enabling him to launch his own version of the exam browser and still generate a correct exam key (Ibid., p. 42).
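The weakness exploited here can be illustrated with a small sketch. The construction below (a SHA-256 digest over the requested URL and an exam key) is an illustrative assumption modelled on the mechanism described in the thesis, not Safe Exam Browser's exact algorithm; the point is that once the algorithm is readable in the source code, a modified browser can reproduce valid hashes just as well as the official one:

```python
import hashlib

def request_hash(url: str, exam_key: str) -> str:
    """Hypothetical browser request hash: a digest of the requested URL
    combined with a per-exam key, sent with each request so the server
    can check that an approved browser build and setup is in use."""
    return hashlib.sha256((url + exam_key).encode("utf-8")).hexdigest()

# Once the source is public, a stripped-down clone knows the algorithm
# too, so the legitimate browser and the clone are indistinguishable
# to the server: both produce the same hash for the same inputs.
official = request_hash("https://exam.example.edu/paper1", "exam-key")
cloned = request_hash("https://exam.example.edu/paper1", "exam-key")
assert official == cloned
```

In other words, the check only verifies knowledge of the algorithm and the key, not that the binary making the request is the unmodified browser.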

This is the worst-case scenario for digital exams, as it potentially enables the student to strip the lockdown browser of all intended security measures while keeping the interface identical to the secure software. As such, it becomes almost impossible for invigilators to identify attempts at academic misconduct by the student in question during the exam. Meanwhile, the student has access to every resource the educational institution has sought to exclude, such as the possibility of communicating with others during the exam, even outside of the exam room.

The specific issues in these examples are, of course, for their developers to patch. But the root of the problem is more difficult to manage. As the entire point of open source is distributing code under a licence that grants users the possibility of studying, experimenting with and distributing it, it becomes inherently less safe than proprietary code. This matters little in some circumstances, but for exams, security must be one of the highest priorities.

Better Support with Proprietary Exam Solutions 

There are plenty of reasons to use open source if your focus is purely agility and cost-saving, but for exams – and lockdown browser exams especially – it is not the best choice. Beyond the source code being better guarded against misuse, proprietary software has other advantages. By partnering with a trusted third-party vendor, we also gain access to their support, which can speed up troubleshooting and setup.

Open source programs more often rely on an active community to develop and support the code, which makes the code reliant on the goodwill of internet forum members. In these cases, when loopholes are found, they are not fixed as fast as with a designated support and development team, as dependencies on open source code often entail waiting for patches to become available. This can be critical for high-stakes activities such as exams.

Proprietary Code for Academic Integrity

What is really at stake when choosing an exam browser is the academic integrity of the institution, and academic integrity is already at risk, as reported academic misconduct is rising steadily. The Guardian reports that Russell Group universities experienced a 40% increase in academic misconduct from 2014/15 to 2016/17.

One of the many reasons cited for this development is an increase in students feeling stressed and anxious about the perceived expectations of their performance in exams. According to an IPPR report, the number of undergraduates disclosing mental illness to their educational institution has almost quintupled, from 3,145 in 2006/07 to 15,395 in 2015/16. The vast majority of these disclosures involve depression and anxiety, which can have a serious impact on the ethical decisions students make in testing circumstances such as exams (Kouchaki & Desai, 2014).

This development is an important issue in and of itself, and should definitely be an area of focus, but until a change is manifested within higher education, it also becomes important to decrease the number of opportunities students have to make poor judgements in relation to exams. And in this light, open source poses a considerable problem for digital exams.

References

Grijalva, Therese C., Joe Kerkvliet, and Clifford Nowell. “Academic Honesty and Online Courses”. College Student Journal (2006).

Heintz, Aleksander. “Cheating at Digital Exams – Vulnerabilities and Countermeasures”. Norwegian University of Science and Technology, Department of Computer Science, 2017.

Kouchaki, M., and S. Desai. “Anxious, Threatened, and Also Unethical: How Anxiety Makes Individuals Feel Threatened and Commit Unethical Acts”. Journal of Applied Psychology (2014). DOI: 10.1037/a0037796.

Søgaard, Thea Marie. “Mitigation of Cheating Threats in Digital BYOD Exams”. Norwegian University of Science and Technology, Department of Computer and Information Science, 2016.

Stuber-McEwen, Donna, Phillip Wiseley, and Susan Hoggatt. “Point, Click, and Cheat: Frequency and Type of Academic Dishonesty in the Virtual Classroom”. Online Journal of Distance Learning Administration 12.3 (2009), pp. 1–9.

Watson, George, and James Sottile. “Cheating in the Digital Age: Do Students Cheat More in Online Courses?” Online Journal of Distance Learning Administration 13.1 (2010).
