An online “proctor” who can scan a student’s home and take control of the mouse on their computer as the student takes an exam. A remote-learning platform that collects face scans and voiceprints of students. Virtual classrooms where strangers can pop up out of the blue and see who’s in class.

These three unnerving scenarios aren’t hypothetical. Rather, they are stark, real-life examples of how remote learning during the pandemic – at both the K-12 and college levels – has become riddled with threats to students’ privacy.

As a scholar who studies privacy, I believe all the electronic eyes watching students today have created privacy concerns that merit more attention.

That is why, increasingly, aggrieved students, parents and digital privacy advocates are seeking to hold schools and technology platforms accountable for running afoul of student privacy laws.

Concerns and criticisms

For instance, the American Civil Liberties Union of Massachusetts has accused that state of lacking sufficient measures to protect the privacy of school and college students.

Students are taking steps to force universities to stop using invasive software such as proctoring apps, which some schools and colleges use to make sure that students don’t cheat on exams. They have filed numerous petitions asking administrators and teachers to end the use of these apps. In a letter to the California Supreme Court, the Electronic Frontier Foundation, an international nonprofit that defends digital rights, wrote that the use of remote-proctoring technologies is essentially the same as spying.

A series of security breaches illustrates why students and privacy advocates are fighting against online proctoring apps.

For instance, in July 2020, online proctoring service ProctorU suffered a cyberbreach in which sensitive personal information for 444,000 students – including their names, email addresses, home addresses, phone numbers and passwords – was leaked. This data then became available in online hacker forums. Cybercriminals can use such information to launch phishing attacks, steal people’s identities and fraudulently obtain loans in their names.

A public petition filed by students at Washington State University expressed concerns about ProctorU’s weak security practices. The petition had over 1,900 signatures as of Nov. 5.

Thousands of students have had sensitive information leaked online.
Bill Hinton/Getty Images

Students compelled to share sensitive data

Some online proctoring firms have engaged in activities that violate students’ privacy. The CEO of the online proctoring company Proctorio, for instance, violated a student’s privacy by posting the student’s chat logs on the social news forum Reddit.

To use online proctoring apps, students are required to provide full access to their devices, including all personal files. They are also asked to turn on their computer’s video camera and microphone. Some national advocacy groups of parents, teachers and community members argue that requiring students to turn on their cameras during virtual classes or exams, with their rooms in the background for a stranger to watch, would violate their civil rights.

Fair information practices, a set of principles established by the International Association of Privacy Professionals, require that information be collected by fair means. Online proctoring apps rely on methods that can cause anxiety and stress among many students and are thus unfair.

When students are forced to reveal sensitive information against their wishes, it can harm them psychologically. Some students also experience physical symptoms resulting from stress and anxiety. One student vomited from the stress of a statistics exam. She did so at her desk at home because no bathroom breaks were permitted.

Poor technology performance

These privacy-invasive proctoring tools rely on artificial intelligence, which affects certain groups of students more adversely than others.

For instance, these programs may flag body-focused repetitive behaviors associated with health conditions such as trichotillomania and chronic tic disorder as cheating.

Artificial intelligence also performs poorly at identifying the faces of students who are ethnic minorities or have darker skin. In some cases, such students undergo extra hassles: they may have to contact the technical support team to resolve the issue and thus get less than the allotted time to finish the exam.

One student who experienced this snafu blamed the situation on “racist technology.”

Lawsuits and regulatory concerns

Schools and providers of remote-learning technology are facing several lawsuits and regulatory actions.

For example, an Illinois parent has sued Google. The lawsuit alleges that Google’s G Suite for Education apps illegally collected children’s biometric data, such as facial scans and voiceprints – the measurable characteristics of a person’s voice that can identify them. The suit claims these practices violate the Illinois Biometric Information Privacy Act.

In April, a group of California parents filed a similar federal lawsuit against Google over G Suite.

In some cases, officials have taken action to reduce the adverse privacy effects posed by remote-learning technologies with weak security. For instance, New York City’s Department of Education banned the video communications app Zoom due to privacy and other concerns. Many instances were reported in which Zoom’s weak cybersecurity failed to prevent a form of harassment known as “Zoombombing,” in which intruders gain access to virtual classrooms.

In such situations, schools face two major problems. First, video, audio and chat sessions captured in Zoom recordings contain personally identifiable information such as faces, voices and names. These education records are thus subject to the Family Educational Rights and Privacy Act, which is meant to protect the privacy of student education records.

Such information should not be accessible to anyone who is not in the class. Second, when teachers cannot prevent unintended participants from joining a virtual class, the school is in violation of the Family Educational Rights and Privacy Act.

What K-12 schools and universities can do

The increasing scrutiny of and criticism directed at privacy-invasive software, which can resemble spyware, may require schools and universities to reconsider its use. One option could be to opt for open-note, open-book exams that don’t require proctoring.

In general, artificial intelligence is not yet well developed. For instance, to make sure that artificial intelligence algorithms can accurately detect cheating on exams, they may need to be trained on millions of images and videos of students cheating. This has not yet happened in most areas, including remote learning. The artificial intelligence industry has been described as being at an infant stage of development. Even simpler algorithms such as facial recognition applications have been trained mainly to identify white males and, consequently, misidentify ethnic minorities. Thus, I don’t believe this technology is currently appropriate for remote proctoring.

This article was originally published at theconversation.com