Can your smart home be used against you in court? Does it have the right to remain silent?

by Brian Heater | Posted March 12, 2017

On a November 2015 morning in Bentonville, Arkansas, first responders discovered a corpse floating in a hot tub. The home’s resident, James Andrew Bates, told authorities he’d found the body of his friend Victor Collins that morning. He’d gone to bed at 1 AM, while Collins and another friend stayed up drinking.

This past December, The Information reported that authorities had subpoenaed Amazon over the case. Police considered Bates a suspect in what they believed was a murder after signs of a struggle were found at the scene, and they hoped his Echo might hold some insight into what happened the night before.

Amazon initially pushed back against the request, citing First Amendment protections, but ultimately conceded when Bates agreed to allow the information to be handed over to police.

While Amazon’s fight has been rendered moot, this case lays the groundwork for some tough and important conversations to come, raising a slew of fascinating questions about these technologies. What do devices like the Echo or Google Home actually record and save? Have we, as consumers, effectively surrendered a reasonable right to privacy from corporations and the government by bringing such devices into our homes?

“It’s like this perfect test case,” says Andrew Ferguson, a professor of law at the University of the District of Columbia. “Alexa is only one of the smart devices in that guy’s house. I don’t know if all of them were on or recording, but if you were going to set up a hypothetical situation to decide if the internet of things could be used as an investigative tool, you’ve got this mysterious hot tub murder.”

A reasonable expectation of privacy

The question of how much privacy we can reasonably expect when installing a home assistant is complex and unresolved. In a sense, people who buy an Echo or Home know what they’re getting themselves into from the very basic fact that they’ve purchased an internet-connected device, with built-in microphones, that is designed (in some sense) to always be listening — and it’s created by companies that thrive on tailoring ads based on the boatloads of data they collect from users.

Still, constant recording and storage is another question entirely. Home assistants are designed to have an ear open at all times, monitoring their surroundings for keywords like “Alexa,” “Google” or “Siri.” But once a user consents by introducing such a device into their home, are its manufacturers bound by law to only record and store the information their products were designed to act upon? Or has the consumer effectively waived those rights?

“As a legal matter, it’s unresolved, which is part of what worries us about the whole thing,” ACLU senior analyst Jay Stanley tells TechCrunch. “I think most people don’t expect that snippets of their conversation might accidentally get picked up. [Smart assistants] do hear trigger words when trigger words are not intended.”

Even with the best of intentions, such devices leave open the possibility of collecting unintended information, courtesy of advanced recording technologies capable of firing up from across the room. Stanley covered the topic recently in an article penned for the ACLU, inspired by an encounter with an Echo at a friend’s dinner party.

“The group’s conversation became self-conscious as we began joking about the Echo listening in. Joking or not, in short order, our host walked over and unplugged it,” he writes. “It is exactly this kind of self-consciousness and chilling effects that surveillance — or even the most remote threat of surveillance — casts over otherwise freewheeling private conversations, and is the reason people need ironclad assurance that their devices will not — cannot — betray them.”

It’s a familiar feeling, surely, to anyone who’s ever covered a webcam with electrical tape for fear of snooping.

“I would push back against a legal argument that said categorically that users don’t have a reasonable expectation of privacy when they’ve installed one of these devices in their home,” says Electronic Frontier Foundation Senior Staff Attorney and Civil Liberties Director David Greene. “You are trusting that third party to assert your rights, to notify you when your information is being sought. To me those things are independent of your reasonable expectation of privacy.”

There also seem to be some lingering legal questions regarding disclosure. It’s not entirely clear whether companies are legally bound to notify users about the kinds of information they gather or how they ultimately act upon it. Some touch on the idea in publicly available privacy policies (which, like terms of service and EULAs, are rarely given a second thought by most users), but such disclosures, while welcome, don’t appear to be a legal obligation.

“It’s pretty much the Wild West,” explains Stanley. “I can’t think of any legal requirements that would [force them to disclose what they’re recording]. It’s caveat emptor, let the buyer beware.”

What the smart home hears

There is, of course, the risk of confusion in disclosure. Back in early 2015, Samsung’s seemingly endless parade of bad luck continued when language in the privacy policy for its Smart TVs made it sound very much like the company was going out of its way to capture and transmit sensitive information:

Please be aware that if your spoken words include personal or other sensitive information, that information will be among the data captured and transmitted to a third party through your use of Voice Recognition.

The company issued a public apology of sorts with the blunt title “Samsung Smart TVs Do Not Monitor Living Room Conversations” and tweaked the language accordingly — only to be put back on its heels earlier this week when WikiLeaks detailed what was claimed to be a secret spying operation on behalf of the CIA and British intelligence that allowed Samsung smart TVs to spy on users when the sets appeared to be off.

After dealing with its own myriad privacy dust-ups over the years, Google seems to be taking a fairly transparent approach to disclosure. We reached out to the company to ask how its Home device handles voice monitoring, and it issued the following statement:

All the devices that come with the Google Assistant are designed with privacy in mind. Google only stores voice-based queries received immediately after recognizing the hotwords ‘OK Google’ or ‘Hey Google.’ After Google Home hears the hotword, its LEDs light up to indicate that it is listening. Hotword detection runs locally on the Google Home device and if the hotword is not detected, the audio snippet stays local on the device and is immediately discarded. If the hotword is recognized, the data, including the query contents, are sent to Google servers for analyzing and storage in My Activity.

Google goes on to explain that the history of user interaction with the Assistant is stored in a manner similar to the way the company handles Search history, allowing users to control and edit the voice queries after the fact.
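Neither company publishes its detection code, but the flow Google describes boils down to a simple local gate: audio is checked for the hotword on the device itself, and only audio that passes the gate leaves the home. The snippet below is a purely illustrative sketch of that pattern, not Google’s actual pipeline; the names are hypothetical, a toy string match stands in for the on-device acoustic model, and a list stands in for the cloud.

```python
# Illustrative sketch of the local-gating pattern Google describes.
# Real devices run an acoustic model over raw audio on-device; here a toy
# string match stands in for that detector, and a list stands in for the cloud.

HOTWORDS = ("ok google", "hey google")  # hotwords named in Google's statement


def hotword_in(frame: str) -> bool:
    """Toy stand-in for on-device hotword detection."""
    return any(h in frame.lower() for h in HOTWORDS)


def handle_frame(frame: str, uploaded: list) -> None:
    """Send a frame upstream only if the hotword fires; otherwise drop it."""
    if hotword_in(frame):
        uploaded.append(frame)  # stand-in for streaming the query to servers
    # else: the snippet stays local and is immediately discarded


if __name__ == "__main__":
    sent_to_cloud = []
    for frame in ["dinner-party chatter", "hey google, what's the weather?"]:
        handle_frame(frame, sent_to_cloud)
    print(sent_to_cloud)  # only the hotword-bearing frame is retained
```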

Amazon, on the other hand, has yet to offer us a similar response. In a pre-holiday report that refers to the Echo as a “box of intelligence,” Amazon told NBC simply, “Echo and Alexa were designed with privacy and security as part of the design, not an afterthought.” That seems to be the extent of its feedback on the subject, a statement that refers at least in part to the encryption the company applies before sending information to the cloud.

In most cases, the company has referred inquiries to its FAQ, which details how users can disable the microphone and delete voice recordings, along with some tidbits about how those ever-important wake words work: “Amazon Echo and Echo Dot use on-device keyword spotting to detect the wake word. When these devices detect the wake word, they stream audio to the Cloud, including a fraction of a second of audio before the wake word.”
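That “fraction of a second of audio before the wake word” implies a small rolling buffer that is continuously overwritten on the device and flushed upstream only once the wake word is spotted. Amazon doesn’t publish how that buffering works, so the following is just an illustrative sketch of the general ring-buffer pattern, with made-up frame sizes, names and text “frames”:

```python
from collections import deque

# Illustrative sketch only: Amazon doesn't publish its buffering code, so the
# frame size, names and text "frames" here are made up to show the pattern of
# keeping a short pre-roll locally and flushing it only after the wake word.

PREROLL_FRAMES = 2   # assume two ~250 ms frames, roughly a "fraction of a second"
WAKE_WORD = "alexa"


def stream_after_wake(frames):
    """Yield only the frames that would be streamed to the cloud."""
    preroll = deque(maxlen=PREROLL_FRAMES)  # rolling window, overwritten locally
    awake = False
    for frame in frames:
        if awake:
            yield frame                     # query audio after the wake word
        elif WAKE_WORD in frame.lower():
            yield from preroll              # flush the brief pre-roll buffer
            yield frame
            awake = True
        else:
            preroll.append(frame)           # older audio never leaves the device


if __name__ == "__main__":
    mic = ["private chat", "more chat", "even more chat", "alexa", "play some jazz"]
    print(list(stream_after_wake(mic)))
    # -> ['more chat', 'even more chat', 'alexa', 'play some jazz']
```

In this toy run, the earliest frame is overwritten in the buffer and never leaves the device; only the brief pre-roll, the wake word and the query that follows are streamed.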

Know your rights

What, precisely, did authorities think they were going to get from Alexa? And was Amazon afraid of what such disclosure could ultimately reveal about the data it collects?

“I assume part of what was going on in [law enforcement’s] minds was, here you are, you’re intoxicated in a hot tub and you say, ‘Alexa, how do I get rid of a dead body?’” Ferguson says with a laugh. “Or, ‘How do I clean up blood?’” he continues. “Those would be wonderfully damning admissions that possibly could have been picked up, just as a lot of people’s Google searches are insights into their minds and what’s going on with them.”

The laws governing precisely what access the government has to information collected by smart home devices are similarly up in the air, another fact the Arkansas murder case highlights. Unlike the recent San Bernardino shooter case, wherein Apple argued that providing an encryption key would open a backdoor vulnerability for eavesdropping and other malicious activity, Amazon’s Echo case rested firmly on the shoulders of the First Amendment.

“The responses may contain expressive material, such as a podcast, an audiobook, or music requested by the user,” Amazon argued. “Second, the response itself constitutes Amazon’s First Amendment-protected speech.” The company seemed to offer little in the way of pushback with regard to offering up Bates’s purchase history, but it cited a 2010 ruling in its favor stating, “[t]he fear of government tracking and censoring one’s reading, listening and viewing choices chills the exercise of First Amendment rights.”

In that case, the company was joined by the ACLU in arguing against the North Carolina Department of Revenue’s attempt to acquire customer purchase histories. While the company appears to have given up similar data this time around, it once again invoked the First Amendment, citing a potential violation of the privacy and anonymity tied to the items a user buys. It also made a claim for the Echo’s responses as a sort of protected speech.

Amazon added that it would refuse to offer up the data “unless the Court finds that the State has met its heightened burden for compelled production of such material.”

A cynical (though not necessarily incorrect) look at Amazon’s argument points to the PR potential of appearing to be a staunch defender of both the First Amendment and user privacy. That point is, in part, driven home by a fairly significant portion of the company’s 90-page legal filing that essentially reads like an advertisement for the Echo.

But while the optics of such a fight reflect well on Amazon at a time when companies increasingly face privacy backlash, the fact is that, for better or worse, it may ultimately be incumbent upon companies to wage those battles on behalf of the user.

“I’m pleased to see companies asserting these privacy concerns as they arise,” says Greene. “This is a new area. I think law enforcement understands that these connected devices are a rich source of information. There will be, whether we like it or not, a legal regime that will be put in place when law enforcement gets access to these things.”

New technologies invariably present new challenges for old laws. Traditionally, one expects that a warrant would be required to access this manner of information, as part of the Fourth Amendment’s protection against unreasonable searches and seizures. But do voice recordings gathered by a piece of technology like a home assistant belong to the user or the company? And what if that information is stored on an off-site server, rather than locally? What about when it’s recorded outside of the home?

“Notable in the litigation of the Bates case is the fact that there was no Fourth Amendment argument,” explains Ferguson. “In part because that reasonable expectation of privacy has been given up to the third party of Amazon and with a lawful subpoena, it makes it difficult for the individual defendant to claim their Fourth Amendment rights were violated. I think that is a statement of where we are in terms of how our current constitutional protections are not adequate to protect us from these new smart devices that collect information from us in a whole host of ways.”

There’s a consensus among those we spoke with that, at the very least, the laws governing the acquisition of data collected by IoT devices and their ilk ought to be revisited. There are simply too many factors at play to assume they’ll be adequately served by precedent like 1986’s Electronic Communications Privacy Act (ECPA). And while Amazon was ultimately forced to throw in the towel in the Arkansas case, the fight over this kind of information is clearly only beginning.

Risk versus reward

There’s another emerging consensus of sorts among the legal experts I spoke to about the case: they largely seem to be steering clear of smart assistants in their own homes. Granted, people who have made a career of monitoring civil rights violations are naturally a bit touchy when it comes to even the remotest potential for introducing an additional backdoor.

Stanley couches his own decision not to adopt a home assistant as a cost-benefit analysis.

“If we didn’t adopt technology because of privacy concerns, there would be a lot of technologies people wouldn’t use right now,” he says. “I think people use technologies that are troubling from a privacy point of view, but they also feel uneasy about it. One of the risks is that we end up in this twilight zone where everyone knows that their privacy is not being protected but also tries to adapt and live in the real world. We can do better than that.”

Perhaps clearer legal precedent will help address some of the uneasiness surrounding these devices. But even while that continues to shake out in court, it seems safe to say that where manufacturers are concerned, a little disclosure can go a long way. Even when the fine print is there in writing, however, it’s important to be mindful of every piece of new equipment you make room for in your home.

And, as is the case with any piece of electronics, maybe unplug it every once in a while.


