Just a few short years ago, personal digital assistants like Amazon's Alexa, Apple's Siri and Google Assistant sounded futuristic. Now, the future is here, and it is embedded, augmented and ubiquitous.

Digital assistants will be present in your office, home, car, hotel, phone and many other places. They have recently undergone a massive transformation and run on operating systems fuelled by artificial intelligence (AI). They observe and collect data in real time and have the ability to pull information from different sources, such as smart devices and cloud services, and put that information into context using AI to make sense of the situation. Although we have come a long way in the design and execution of these AI technologies, there is still more work to be done in this arena.

Much of the data that these digital assistants collect and use includes personal, potentially identifiable and possibly sensitive information. Can Alexa or other personal digital assistants violate the privacy and security of our data? Possibly. There is a dark side to these virtual assistants.

My expertise is in data privacy, data governance and artificial intelligence. I was previously the Information and Privacy Officer with the Ontario Information and Privacy Commissioner's Office.

Welcoming service

Imagine the following situation.

You are expecting some guests. Your first guest arrives, and the outdoor security camera on your porch captures her walking up to your house. A polite voice welcomes her and unlocks the door. Once she is inside, your digital assistant explains to your guest that you are on your way and will be home soon. Through your home audio system, your digital assistant plays a selection of your guest's favourite songs (from your Spotify friends network). Your digital assistant asks your guest whether pumpkin spice is still her preferred coffee flavour or whether she prefers another: French vanilla or Colombian. Soon after, your guest picks up her coffee from the digital coffee machine. Its welcoming duties now complete, your digital assistant goes silent, and while waiting for you, your guest makes a few phone calls.

It is fascinating how a digital assistant can accurately and autonomously validate the identity of your guest, select her favourite songs, remember her preferred coffee flavour and manage the smart appliances in your home.

As smart devices become more ubiquitous, maintaining our privacy and that of others requires a new and concerted collective effort.

Hosting assistants

But does your digital assistant’s behaviour concern you?

Digital assistants can record our conversations, images and many other pieces of sensitive personal information, including our location via our smartphones. They use our data for machine learning to improve themselves over time. Their software is developed and maintained by companies that are constantly thinking of new ways to collect and use our data.

Like other computer programs, the fundamental issue with these digital assistants is that they are vulnerable to technical and process failures. Digital assistants can also be hacked remotely, resulting in breaches of users' privacy.

For example, an Oregon couple had to unplug their Alexa device, Amazon's virtual assistant, after their private conversation was recorded and sent to one of the friends on their contact list.

In another incident, a German man accidentally received access to 1,700 Alexa audio files belonging to a complete stranger. The files revealed the person's name, habits, jobs and other sensitive information.

Awareness privilege

The increasing popularity and availability of personal digital assistants has resulted in a widening of the so-called digital divide. The interesting paradox is that people who are aware of and sensitive to issues of privacy typically limit their use of digital tools, while users who are less inclined to protect their privacy extensively incorporate personal assistants into their digital lives.

Digital assistants either record data continuously or wait for a wake word to become activated. They do not limit data collection to the owners' or authorized users' information. Personal digital assistants may collect and process unapproved users' personal data, such as their voices.

In a digitally divided society, someone who is privacy savvy would not invite such equipment into their life, while others may accept or rationalize such behaviour.

Smart devices connect the user to household appliances, with the promise of improving quality of life by making household management easier and more efficient.

Respecting others’ privacy

In this age of ubiquitous devices and internet access, how should we deal with this paradox and respect each other's space and choices?

Let's revisit our imaginary personal digital assistant. It had to process different sources of information about the guest to operate as a smart host. Did the digital assistant use all that data to feed its algorithms or to invade the guest's privacy? Depending on who you ask, the answer will be different.

Our etiquette-conscious upbringing tells us that we have a social and ethical responsibility to respect each other's values when it comes to digital technologies. But the implications and growth of these technologies have been so significant and rapid that we have not yet been able to redefine our social norms and expectations.

For instance, as a host, do we have an ethical obligation to inform our guests about our personal digital assistant? Is it polite for a house guest to ask the host to turn their digital tools off? Should we inquire about the presence of smart tools and digital assistants before arriving at a friend's house, a hotel or an AirBnB?

The answer to these questions is yes, according to etiquette expert Daniel Post Senning. Senning explains that etiquette is strongest when you use it as a tool for self-assessment. Would we like to know that we are being recorded in a business meeting or a private gathering? How would we prefer to be asked to turn digital tools off if we are hosting? The etiquette rules are universal: be considerate, honest and kind.

Inform your colleagues and guests that your digital devices may record their voices, images or other information. Ask your host to turn off digital assistants if you are not comfortable having them around. But be considerate: you may not want to ask your host to turn off digital assistants in the presence of someone who is elderly or has a disability and depends on those tools.

Maintaining our collective privacy

Privacy is a social norm that we have to work together to maintain. First of all, we need to educate ourselves about cybersafety and the potential risks of digital technologies. We should also be proactive in keeping current with the latest news on these technologies and take action when required.

The government's role in this complex paradigm is critical. We need stronger privacy laws to address privacy issues related to personal digital assistants. Right now, companies such as Amazon, Google and Apple are making the rules.

Other jurisdictions have developed and implemented regulations such as Europe's General Data Protection Regulation (GDPR), which provides oversight of data collection for a wide range of household devices. Canada should follow suit.


This article was originally published at theconversation.com