Alexa, how do I defend my group’s knowledge from you?
Has Alexa infiltrated your company yet? There are an estimated 3.25 billion digital voice assistants in use worldwide, and that number is expected to grow to eight billion by 2023. It goes without saying that if you haven’t seen any of these devices on employee desks yet, you will soon.
The proliferation of smart speakers
From the multitude of Alexa devices to Google Home, Apple HomePod and other smart home devices, there are ears everywhere. The devices are trained to listen to what you say and to respond when necessary. In fact, they’ll listen even before you say “Alexa” or “Hey Siri” because they need to know when you’re talking to them.
Amazon, Google, and Apple have all been in the news lately because their smart speakers may have breached user privacy. The companies let human auditors listen to recordings collected by the devices – without users being aware of it. Even as those practices change, the risk remains that a device will record you when you don’t want it to.
This is something to think about as digital assistants like Microsoft’s Cortana and Alexa for Business find their way into the workplace. Smart speakers start with simple tasks: a user can check the calendar and add meetings, create and track to-do lists, read and send email and instant messages, and ask general questions. Ultimately, these devices will also become the point of contact for business applications, financial reports, system monitoring, customer support, and other custom integrations.
More than 18,000 companies already use Alexa in some way, and the total grows when you include devices from Apple, Google, and other manufacturers. McDonald’s, for example, uses voice-activated devices to accept job applications. Corporations and law firms also use them in a variety of ways, placing them in individual offices and in public areas such as break rooms and conference rooms.
With wider usage, security concerns increase.
Trade secret, phishing, and hacking concerns
If these devices are not properly configured and monitored, they can cause serious business problems. For example, a family in Oregon discovered last year that their smart home device had not only recorded a private conversation without their consent or knowledge, but had also accidentally sent the recording to someone on their contact list.
It is very easy to trigger a device unintentionally, which means it may be capturing background comments and confidential business conversations. And if a smart speaker is given access to a company’s platforms to boost productivity and efficiency, a bad actor could shamelessly exploit those same avenues.
Recently, researchers used Alexa’s “skills” and Google’s “actions” to build proof-of-concept malicious apps – and it was surprisingly easy to pull off. When a user asks for something seemingly simple and harmless, like a horoscope reading, the app responds as expected while performing additional tasks in the background, such as eavesdropping or phishing. The researchers’ apps could trick users into believing the device was not recording or, worse, into speaking a password that grants access to sensitive files.
The devices could also be used to monitor employees, tracking traits such as tone and mood with AI, or even accessing employees’ health statistics. Digital voice assistants could use previously collected data to predict a person’s future actions and then verify whether those predictions were correct.
Data protection and legal issues
As GDPR considerations find their way into the U.S. legal system and states take the lead on data protection laws – such as California’s recent consumer privacy law – there is growing demand for transparency and disclosure. To comply, users must understand and consent to what a device is doing before any private data is collected. This is why you can now sign in to your Alexa, Google, and other accounts and review the stored data files and transcripts.
Legal disclosure requirements can also be triggered by these devices. Regardless of when or how a digital voice assistant captures data, that data may need to be collected, analyzed, and produced for legal or investigative purposes – meaning the devices may create more effort and expense than anticipated. When it comes to the courts, it doesn’t matter where the data resides: if it is potentially relevant, it must be collected for review and analysis.
Data from such speakers is already used in criminal cases. For example, data and records from Amazon Alexa have been used in murder cases in Arkansas and New Hampshire. Other digital devices, including Fitbits and pacemakers, have been used in additional cases.
When it comes to corporate data, smart speakers can capture information that never appears in email or formal documents such as business reports – for example, a private conversation in the break room or complaints aired in an office. That data could surface in a document request, whether the recording was intentional or not.
Create an implementation plan
The current arrival of smart speakers in the workplace resembles the earlier arrival of smartphones. As then, companies typically respond in one of three ways:
- Ignore it and wait until it’s a problem (or hope it goes away).
- Completely ban smart speakers, which can be difficult to enforce.
- Create formal guidelines that describe acceptable and unacceptable uses.
The third option makes the most sense and has to happen at some point anyway, so it’s best to start now and plan a deliberate, well-managed rollout.
Your policy should:
- Include a formal company statement as to which smart speakers, if any, may be used.
- Provide use cases, especially as part of the normal and routine operations of your company, that give employees clear examples of acceptable and unacceptable uses.
- Detail the areas where smart speakers can and cannot be used, e.g., areas where particularly sensitive information is routinely discussed.
- Encourage the use of privacy options like microphone mute and camera blocks built into or available as add-ons to most smart speakers.
- Create a formal training program to ensure that all employees are aware of the new guidelines for using smart speakers, and then follow up with routine refresher courses.
- Routinely survey employees about their use of smart speakers and ask for examples that can inform updates to company policy.
- Stay up to date on compliance solutions that inevitably come into play with widespread business use.
Smart speakers have the potential to help businesses and law firms get more work done – and to do it in an easier, more effective way. But the truly “smart” part of the speaker still comes from people. Make sure you put their skills to work in smart, well-planned ways.