A recent legal article highlighted Amazon’s Alexa device, suggesting that Alexa may listen to and record patients’ Protected Health Information (PHI).
Healthcare Workers File Complaint Against Amazon Alleging the Alexa Device Listens to Protected Information

On June 30, 2021, a group of healthcare workers filed suit against Amazon.com, Inc., alleging that Amazon’s Alexa device listens to and stores audio recordings, including recordings that may contain highly sensitive information such as protected health information under HIPAA. The class-action lawsuit, filed in the state of Washington, alleges that Amazon violates state and federal wiretap laws as well as state consumer protection statutes. Among other things, the plaintiffs seek certification as a class; an order declaring that Amazon’s acts and practices violate various state and federal laws, including those relating to wiretapping and consumer protection; injunctive relief; and damages.
Amazon notes that Alexa is triggered by a specific “wake word”; once the wake word is spoken, the device listens and responds to user commands. This is how the company has advertised the device for many years. The suit alleges that the device then initiates a process to record the audio and permanently store it. It further alleges that the device sometimes misidentifies a “wake word” and records users without their intent or knowledge; in those cases, the device records the audio, stores it, and sends it to human analysts, and sometimes to third parties, for review. The plaintiffs, all healthcare workers in some capacity, allege that Alexa may have captured HIPAA-protected information without their intent or knowledge. They also allege that Amazon did not disclose that actual humans listen to these recordings until 2020, after the plaintiffs had purchased their devices. According to the plaintiffs, the only way to stop Amazon from making these recordings is to mute the device’s microphone or unplug it, thereby defeating the device’s functionality. The case should serve as a cautionary tale for healthcare workers to be cognizant of Alexa and similar devices in work settings and other areas where discussions with or about patients may take place.
We do not know how the court will rule in this case. However, it is a reminder to always assume that someone, or something, may be listening when you are discussing a patient’s PHI.