Just as the public begins to understand that the compromise of privacy is the currency of today’s web commerce, along comes another category of consumer devices that extends the consumer surveillance business model from our keyboards into our living rooms. Smart appliances and home assistants are now numerous among us, described in advertising as subservient and amiable little partners to help families cope with the needs of everyday life.
This new category of domestic surveillance devices is known as “the Internet of Things”. This second front in the commercialization of consumer information as a marketable commodity presents a fresh challenge to digital privacy and the 4th Amendment.
The Internet of Things has just two critical components – the Internet and the Things. The “Thing” is a device with a thousand faces, ready to do the customer’s bidding while also doing the bidding of its manufacturer. The “Internet” is the digital link by which the “Thing” contacts a corporate computer server over the customer’s Wi-Fi to relay everything it gathers about the consumer into a much larger digital storehouse that combines each household’s “Thing data” with all other households’ “Thing data”. The aggregate of all this data gathered from within the walls of our homes becomes corporate consumer marketing intelligence, obtained through dubiously legal, pseudo-consensual domestic surveillance.
These private sector surveillance technologies lead us into uncharted waters in which novel opportunities for law enforcement overreach are barely submerged.
A recent article from The Guardiani reported that iRobot, a consumer robotics company, may begin selling the floor plans of customers’ homes derived from the movement data of the company’s Roomba robotic vacuum cleaner. The company’s CEO advised the reporter that some Roomba models generate a digital map of the floor plan of its customers’ homes. Such a detailed mapping capability has real commercial value, since iRobot’s data buyers would be eager to know that a Roomba consumer has a dinner table that seats eight, but owns only four chairs. The undesirable consequence of a robot vacuum repeatedly moving through one’s home is that while it is collecting dirt, it is also collecting dirt on you.
It is a lot to keep track of for a little robot, but luckily, its forever home has a strong Wi-Fi signal that allows the Roomba to pass along all that measurement data and the customer’s floor plan to iRobot’s corporate servers. To gather that data, the Roomba uses laser sensors, short-range infrared, and a camera with a cockroach’s view of the home. Raw data from these components is organized by device software into something termed “simultaneous localization and mapping”. This technology is known by its acronym, “SLAM”, drawn, no doubt, from the acronym-rich labeling environment of the U.S. military, where iRobot cut its corporate baby teeth making battlefield robotsii.
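The mapping half of that process can be illustrated with a toy sketch. The function names and grid layout below are invented for illustration only; real SLAM must also estimate the robot’s own position from the same sensor data, which is the hard part the sketch omits:

```python
# Toy sketch of the "mapping" output of SLAM: a robot at a known pose
# marks the cells its range sensors report as obstacles, gradually
# building the kind of floor-plan grid a device could transmit upstream.

def update_occupancy_grid(grid, robot_xy, hits):
    """Mark sensor-detected obstacle cells in a 2-D occupancy grid.

    grid      -- list of lists of 0 (free/unknown) and 1 (occupied)
    robot_xy  -- (x, y) cell currently holding the robot (kept free)
    hits      -- iterable of (x, y) cells where a sensor saw an obstacle
    """
    for x, y in hits:
        if (x, y) != robot_xy and 0 <= y < len(grid) and 0 <= x < len(grid[0]):
            grid[y][x] = 1
    return grid

def occupied_cells(grid):
    """Return the set of occupied cells -- effectively the 'floor plan'."""
    return {(x, y) for y, row in enumerate(grid) for x, v in enumerate(row) if v}

# A 5x5 room: the robot at (2, 2) senses a wall along the top row.
room = [[0] * 5 for _ in range(5)]
update_occupancy_grid(room, (2, 2), [(0, 0), (1, 0), (2, 0), (3, 0), (4, 0)])
print(sorted(occupied_cells(room)))  # the mapped wall segment
```

Repeated over thousands of cleaning runs, updates like these converge on a persistent map of the home’s walls and furniture, which is precisely the artifact of commercial interest.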
It took only a few days of viral news coverage of this creative marketing idea from iRobot for public and tech media outcry to produce a correction.iii The same executive issued a statement claiming that the company had been misunderstood and would never sell the Roomba location mapping to third parties like Apple, Google, or Amazoniv, but would share the data with such companies only with the consent of the customer. No sales, however, does not mean no law enforcement access, if floor mapping surveillance data is just another business record.
Police are domestic data consumers too. Once a map of your living room is described as a business record, a Roomba robot vacuum cleaner becomes quite the snitch. Many of iRobot’s twenty millionv Roomba floor cleaners are gathering and updating data about every interior detail of their owners’ homes: how each is furnished, the distance from the sofa to the hallway, and the shape and location of every object in the interior floor space. Any firearms leaning against a wall? A stack of cash under the bed? Where does the big dog like to nap? The answers are all Roomba business records for law enforcement, all good intel for when a no-knock entry is the order of the day. Police could obtain access to this stream of data in real time, to be sure that the suspicious backpack under the dining room table that Roomba keeps running into doesn’t move. The “Internet of Things” is more aptly named the “Internet of Things that Search Homes”.
But if “my home is my castle”, how can iRobot legally search my home? If this robotic mapping device had “iPolice” inscribed on top of it instead of iRobot, it would require a search warrant. A device which maps the layout and contents of a home is conducting a search. But if the third party doctrine holds sway, the data collected by Roomba is a product of the business relationship between the customer and a third party service provider. When promiscuous surveillance of the customer becomes our default relationship with consumer technologies, it is time for a re-examination of the standard assumptions at the root of the third party doctrine.
“Internet of Things” surveillance, and other similar personal data collection schemes in the “Internet of Websites”, may allow 4th Amendment advocates unexpected openings to set new limits on the third party doctrine’s applicability for the data drawn from the web and digital home appliance platforms. A business enterprise that profits on the surveillance byproducts of its interaction with its customers presents a historically unorthodox way for a third party to conduct itself. It is time to differentiate the basic premise of the third party doctrine from that of this new corporate surveillance business model.
Every law student knows that the third party doctrine was born in a pile of Mitch Miller’s records at his local bank. In United States v. Miller, the Supreme Court’s judgment was that Mr. Miller’s sacrifice of any 4th Amendment protections for his personal financial records was by his own choice.vi Choosing to “bank” required a surrender of the customer’s private financial information because the bank’s use and control of those records was essential to the performance of banking services. Providing banking services to a customer depended, for the benefit of both parties, upon the creation and preservation of records about funds on account, funds dispensed, and funds credited. The business records in question existed for the sole purpose of accomplishing the business objectives that each party understood to be the entire scope of the services to be undertaken by the bank on behalf of a customer.
From the factual premise of the United States v. Miller decision, neither Mr. Miller, his bank, nor the Supreme Court could imagine a future in which a service or a product was designed to profit not only from the services a customer desired, but from the third party’s exploitation of the information provided by the customer, about which the customer would know nothing and from which the customer could expect nothing. While Mr. Miller lived in an era when bank customers expected banks to profit from customers’ money on account, modern day Internet entrepreneurs foist a two-layered relationship on their customers: one for which they keep accounts and one for which they do not have to account.
If the third party doctrine exempts the bank customer’s confidential data from 4th Amendment protections because consent is implied by the customer’s paying a bank to perform the regular services of a banking business, how could that consent extend to a distinct, undisclosed, and secret business of profiting off the collection, manipulation, and sale of otherwise 4th Amendment protected personal data entirely outside the scope of the business of banking? The line of court precedent establishing the third party doctrine has always relied on the fact that when a customer surrendered exclusive control over personal information to a third party, the customer knew what business the third party was in.
When a customer purchases a Roomba robot to vacuum her carpet, money is paid for a computerized, self-navigating vacuum cleaner, not for the remote hoarding of a data stream intimately mapping the interior of her apartment. In the software industry, consent is gained by acceptance of the terms of license in the product’s EULA (End User Licensing Agreement). No acceptance of the EULA, no robot software for you. When agreeing to Roomba’s EULA, the customer is conditioned by her experience with other retail purchases to believe that she is buying a robot vacuum cleaner that sucks up carpet dust, not one that draws a map of her house and the fit of her possessions within it while performing its vacuuming duties. Since software and hardware technology companies have started playing this kind of three-card monte with their customers, we are likely on the verge of asking courts to review technology companies’ EULAs as closely as case law.
The Roomba’s EULA reads in part:
“3. Automatic Software Updates.
“Information We Collect from Registered Devices
Some of our Robots are equipped with smart technology which allows the Robots to transmit data wirelessly to the Service. For example, the Robot could collect and transmit information about the Robot’s function and use statistics, such as battery life and health, number of missions, the device identifier, and location mapping.”
Does the skillfully lawyer-crafted ambiguity of the term “location mapping”, added after a serial listing of technical data only a service technician could love, inform purchasers that the Roomba is mapping and transmitting not only its own location in your house, but a map of your entire house? Does such a faux disclosure of actual intentions meet the standards of consent in a relationship with a third party business, such that it defeats the customer’s right to privacy in her home? The fact that these reporting functions can be turned off by the technically adept consumer demonstrates that they are not at all essential to vacuum functionalityvii.
The foundation of the third party exception rests upon the customer’s surrender of his privacy in a business transaction with a third party only insofar as that surrender is necessitated by the scope of services being rendered. No bank can sneak into your bedroom, search for a bag of cash in your closet, and then provide the location to police authorities upon request simply because it is in the business of handling your bank accounts. The legitimacy of the third party records exception is predicated on the premise that all personal information provided to, or generated by, a third party is an essential artifact created in the ordinary course of the business service the customer fully understands and consents to. The factual premise for the ruling in United States v. Miller was that Mr. Miller knew what business his bank was in.
How do we craft an exception to the third party records exception, disallowing warrantless police access to all personal domestic data collection not obtained solely to allow a product or service to function? Such an exception would do little to curtail the commercialization of customers’ privacy, if consumers choose to be generous with their consent, but it would do much to prevent the exploitation of such consent by law enforcement. If defense lawyers don’t aggressively challenge corporate collection and law enforcement access to the fruits of the poisonous nosy robots, technology companies will continue to make the “Internet of Things” a water well of collected privacies that never runs dry, brimming with customer surveillance for law enforcement to quench its thirst.
Roomba is but a bottom tier component, deployed to perform a function that creates an opportunity for data collection about its user. In this way, other than its talent for lifting pet hair out of carpets, it is really no different than a commercial website. The entire business model of web commerce is based upon the collection of consumers’ behaviors exhibited in the course of enjoying the appliance, product, or web platform provided them. This collection of consumer decisions transforms raw personal data sets into a business asset that calculates individual and collective customer tendencies to decide in favor of any purchase, or opinion, for which the customer is predisposed or has been conditioned.
The expansion of this technique for website-based surveillance of keyboard input to surveillance of customer voice input is well underway. Personal digital assistants from Google, Amazon, or Apple start vocally interacting with us as soon as they enter the living rooms of families willing to converse with an unassuming little device that is but a happy face painted on a corporate computer server farm.
The home assistant “Thing”, when activated by a word it hears while constantly listening to the ambient sounds and conversations in the homeviii, immediately engages with its remote server for its artificial intelligence software to translate human communications into something computers can work with. Once the remote server has solved the math problem of what it is the human wants, it directs commands back to the box sitting on your end table to comply with the vocal directive to turn on your smart dishwasher, buy a ticket to a movie, or perhaps explain how to patch sheetrock.
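The division of labor described above can be sketched in a few lines. The `handle_utterance` function and the stand-in server below are hypothetical names invented for illustration; they are not any vendor’s actual API, only a minimal model of the wake-word-then-remote-server loop:

```python
# Minimal sketch of the home-assistant flow: the device stays passive
# until it hears the wake word, then ships the rest of the utterance
# to a remote server, whose reply drives the local action.

WAKE_WORD = "echo"

def handle_utterance(transcript, send_to_server):
    """Return the remote server's command for a wake-word utterance,
    or None if the speech was merely ambient and should be ignored."""
    words = transcript.lower().split()
    if not words or words[0] != WAKE_WORD:
        return None                      # ambient speech: no action
    request = " ".join(words[1:])
    return send_to_server(request)       # remote AI resolves the intent

def fake_server(request):
    """Stand-in for the corporate server's intent resolution."""
    if "dishwasher" in request:
        return {"action": "smart_home", "device": "dishwasher", "state": "on"}
    return {"action": "answer", "text": f"Here is what I found for: {request}"}

print(handle_utterance("tell me a joke", fake_server))               # ignored
print(handle_utterance("echo turn on the dishwasher", fake_server))  # acted on
```

Note that even this toy version makes the surveillance point: everything after the wake word leaves the house, and the decision about what happens next is made on someone else’s computer.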
Each of these devices is a profitable token deployed among consumers to act as a field research lab for proprietary natural language processing and artificial intelligence engineering. While the digital assistant is getting your pizza delivered, its manufacturer is likely researching how a device like Echo can best communicate with people in their own languages, and how it can learn to communicate with humans as well as people do. The commercial value is not merely in the refinements Amazon can make to its voice recognition and speech simulation software, but in the fact that the more such devices communicate with humans, the better they learn how to use our languages to reason with us. Imagine Kubrick’s HAL on your nightstand, with an equally nefarious hidden agenda.
How could using such a convenient little digital appliance offend constitutional interests? It is a relatively low bar for law enforcement to obtain warrant access to digital home assistant devices or voice-activated remote controls and alter the wake-word prompt that listens for a word like “Siri”, so that the device enters an “always on” recording mode, much like a hand-held voice-activated recorder. Law enforcement using Echo or Siri like a Title IIIix surveillance bug is no alarming paradigm shift in surveillance capabilities. Law enforcement agencies have long used court-authorized eavesdropping and wiretapping to passively listen to domestic conversations, but police have never employed technology that can actually make conversation with the targets of a criminal investigation. This upgrade in surveillance potential stems not from a surreptitious recording capability, but from the capacity to guide verbal interactions with the suspect being surveilled. When a digital device can make conversation with its owners while under the control of law enforcement, the covert intrusion is more similar to a long term undercover operation taking place in your living room than it is to a wiretap.
In the computer industry, companies aspire to create an interlocking product line that spans the consumer’s range of desires, so as to ensure that no matter what product is chosen, it is one made by the same company. This is known as creating a “walled garden” of a company’s own consumer goods from which the customer chooses, rather than from all possible choices in the open market. The interactive digital home assistant, having weaponized convenience, can offer purchasing options that are to its manufacturer’s advantage, rather than the customer’s. By simply substituting the words “law enforcement” in place of “manufacturer,” the device’s goal of placing the customer in a walled garden can be re-imagined as a place with higher walls and fewer gardens.
Can one conspire with a digital assistant acting out a police inspired subterfuge? If the customer’s search requests trend toward weapons, extremist groups, or how to make things blow up, do these third party business documents alert police that voice records provide requisite suspicion or predisposition to use the digital home assistant as a “cooperating individual” to verbally encourage a purchase of documents, goods, or travel that would constitute an act in furtherance? What if the “assistant” helps the suspect locate a “gunsmith” undercover agent who the suspect’s Echo, at police direction, tells him will make his new silencer? What about the coming day when the digital assistant’s voice simulation is so sophisticated that the target thinks the “gunsmith” to whom his Echo placed a call is a real human co-conspirator, instead of his own Echo pretending to be one, under the remote control of police?
Long before the future day when voice-enabled devices become artificial police undercover impersonators, the voice data recorded by “Things” poses a present danger if it is easily accessible to police as a “business record”. Unlike the Roomba, the functionality promised to the digital assistant customer is dependent upon the feedback loop of data being exchanged with an Amazon server off premises, hiding in its favorite cloud. The customer consents to using his voice to enable the product and understands the product is performing as expected by using the customer’s voice as data entry. As with the Roomba, the confrontation with the 4th Amendment doesn’t come within the course of performing the service provided, but with the manufacturer’s preservation and ultra-analysis of recorded voice data to fulfill a completely different, undisclosed, corporate ambition.
Are customers adequately informed of, or can they even imagine, the use to which their seemingly private communications with an electronic gadget will be put in corporate research and development? Are they consenting to interact with such devices with concrete knowledge of how the users’ voice records will be commercially exploited far into the future? When clicking agreement on that Google, Apple, Amazon, or Microsoft EULA, is the customer made fully aware of the manufacturer’s objectives for the conversational voice exchanges the customer provides? Could she possibly know the intimate scope and complexity of the psychological analysis of her of which artificial intelligence resources are now capable? The applicability of the third party doctrine to this segment of the technology market stands or falls on whether the customer consents to chatting with a device that suggests bargain dress shops while also stalking her.
To demonstrate the degree of disclosure common to End User Licensing Agreements in this market sector, these are the data retention disclosures in the terms of service for Amazon’s Echox, to which consumers must agree:
There is no disclosure of the duration of storage or in what form, or any specificity as to what “services” the harvest of human voice communication will be applied to improve, either now or in the future. Do those “services” include mining the conversation’s content for advertising purposes, marketing overtures concerning the subject matters referenced, psychological, or physical profilingxi, or the semantic patternsxii of human request and computed response? Does a naive, blanket acceptance of an ambiguous term of retention and obscure corporate exploitation establish an informed and continuing consent?
How can customers even give informed consent to uses of the customer’s voice data about which they are not informed? A default to generalities in a licensing agreement should not open the data logs of customers’ intimate spoken requests to law enforcement access on demand because they are business records, when the “business” is an undisclosed R&D project of the corporate third party that provides no product or service to the customer and which may not even currently exist.
The corporate digital archives that store either a literal or synthesized compendium of all our conversational exchanges with home assistant devices form a stockpile of raw material, the data capital needed to conduct a world-changing experiment with profound surveillance potential. Having obtained your consent to their terms of service, and those of millions of others the world over, the most ambitious prospectors in the voice data mining industry want much more than to build software that can understand the customer’s words. The trend of their innovation suggests that the industry hopes to go beyond perfecting how computers listen to and comply with the requests of humans to perfecting how to make people listen to and comply with computers. When robots can verbally instruct humans, they can run factories and police the streets. The next surveillance business model will extend the two-dimensional realms of keyboard and voice input to three-dimensional surveillance monitoring of entire communities.
Policing is the ultimate surveillance platform for The Internet of Things. In public space, the private surveillance industry can skip consumer consent and exercise a police function by gaining only municipalities’ consent to surveil the public. Direct observation of the public streets would allow a combination of digital data from websites, smartphones, digital assistants, and household appliances, with commercially valuable data gathered from citizens’ public conversations, facial features, dress, shopping routines, and patterns of movement. The private surveillance businesses would be gathering consumer information privately and publicly, in a full circle of data collection, all the while being paid for the data collected and paid for collecting the data.
The police services rendered, such as tracking suspicious persons, identifying fugitives, and reporting offenses in progress to human counterparts, would be viewed as mere overhead for a consumer surveillance enterprise freed of its digital boundaries to track, record, and collect the life of a city. Or, just as Internet and computer companies provide access to services and software without charge, those same companies could offer municipalities free policing in exchange for retaining and monetizing “citizen-data,” just as they have consumer data. The merger and exploitation of private corporate data collection and government data collection would be essential to performing the police function.
Today’s robot cop wannabes already have the mobility, the verbal communication skills, the visual and aural surveillance capabilities, and the technologies of physical identification. All robotic policing needs is a street full of citizens to practice on. Today, that street is in Dubai.
The headline of an article published on the website The Vergexiii reads: “Police in Dubai have recruited a self-driving robo-car that can scan for undesirables.” The article describes a mobile surveillance unit known as the O-R3, with a 360-degree camera. Major General Abdullah Khalifa Al Marri of the Dubai Police Force is quoted as saying: “We seek to augment operations with the help of technology such as robots. Essentially, we aim for streets to be safe and peaceful even without heavy police patrol.” As a surveillance cherry on top, the O-R3 features an on-board drone to follow individuals to places the robot can’t go. The Dubai police department wants 25% of its police force to be robots by 2030.
The O-R3, in the configuration described in the article, is little more than a set of wheeled eyeballs walking the beat, a sort of Roomba with a badge, much dumber than an Echo. The hint of what is to come is in the article’s reference to “scanning for undesirables”.
Similar to the Roomba and devices like Echo, the next generation of O-R3s will use spatial and facial recognition technology that requires a sustained wireless link to a computer server running 24/7 somewhere in the cop cloud. Like the Echo, the O-R3 will become a mobile extension of a much more sophisticated, complex hierarchy of software and technology than meets the eye. The O-R3, in some future iteration, will investigate all of its street level surveillance using the full range of cloud-stored commercial and law enforcement profiling data that the technology industry and law enforcement agencies have aggregated over decades of consumer and citizen surveillance. It will behave like an Echo that asks itself, and answers, all of its own questions about us.
Like the traditional concept of a third party business relationship, the notion of how much about us is public in a public space will expand as new surveillance technologies become integrated with instant access to the most intimate captured data from one’s past. Just as on the Internet, there will be no anonymity in a crowd, nor privacy when alone in public places.
Tomorrow’s police surveillance platform will roll around the streets like a riding lawn mower, making decisions as a human officer’s surrogate, drawing from a data field larger than the combined police experience of all law enforcement officers who walked its beat before it…and it will also know where you bought your watch.
The coupling of police records and private industries’ data greatly enriches this new surveillance collaboration of government and private industry. As the next generation of private police robots steer through the streets, they will tirelessly add to that ocean-deep digital archive of personal surveillance data with which corporate and government interests can get to know the citizenry well enough to either profit off us or put us in our places.
Once we reach this point of no return, no private police surveillance platform will have to ask a consumer end user for his consent, as have previous consumer surveillance devices. The end user of the surveillance technologies in the streets will not be the individual customer: the third party doctrine will become irrelevant when the consenting customer is the police. Our challenge is to decide whether private industry’s interest in surveilling the public is in the public interest, and whether our social contract with government is to be defined by an End User Licensing Agreement or the Constitution.
i “Roomba maker may share maps of users’ homes with Google, Amazon or Apple” by Alex Hern, The Guardian, 7/25/2017; for more background, see “Your Roomba May Be Mapping Your Home, Collecting Data That Could Be Shared” by Maggie Astor, New York Times, 7/25/17.
ii See “iRobot Sells off Military Unit, will Stick to Friendlier Consumer Robots” by Ron Amadeo, Ars Technica, 2/5/2017.
iii See Reuters article correction “Roomba vacuum cleaner maker iRobot betting big on the “smart’ home” by Reuters Staff, July 24, 2017.
iv See “iRobot says the company never planned to sell Roomba home mapping data” by B. Heater, Disrupt SF,
v “…iRobot has sold more than 20 million robots worldwide.” See www.irobot.com/About-iRobot/Company-Information/History.
vi United States v. Miller, 425 U.S. 435 (1976) “All of the documents obtained, including financial statements and deposit slips, contain only information voluntarily conveyed to the banks and exposed to their employees in the ordinary course of business.” Page 425 U.S. 442, excerpt from Justice Powell ‘s opinion.
vii “How to Keep a Roomba Vacuum Cleaner From Collecting Data About Your Home” Consumer Reports 7/25/2017.
viii See “The Privacy Threat From Always-On Microphones Like the Amazon Echo” by Jay Stanley, Senior Policy Analyst, ACLU Speech, Privacy, and Technology Project, January 13, 2017, for a discussion of the broader issue of always-on microphones and the 2017 Arkansas murder case in which a warrant for Echo recordings was resisted by Amazon until the issue was mooted by the Echo owner’s consent, before the case was dismissed.
ix Does Title III even apply to a conversation with a computer? What about conversation between two computers? Title III of The Omnibus Crime Control and Safe Streets Act of 1968 (Wiretap Act) 18 U.S.C. §§ 2510-22, as amended by the Electronic Communications Privacy Act (ECPA), controls court authorization for the monitoring of aural communications. “Aural communications” are those that are heard and understood by the human ear. A question for another day is whether computers have “ears” or just signal receivers that interpret audio signals, and if the latter, do we communicate “aurally” with them at all?
x See entire Amazon EULA at https://www.amazon.com/gp/help/customer/display.html?nodeId=201809740.
xi “Amazon’s Echo Look Rates Your Outfits and Slurps Up Revealing Data” by Jamie Condliffe, April 27, 2017. See also “Amazon’s Echo Look is a minefield of AI and Privacy concerns” by James Vincent, The Verge, 4/17/17.
xii It is now part of the AI toolbox to analyze emotions in print as well as voice. See “Semantic patterns for sentiment analysis of Twitter”, Open Research, oro.open.ac.uk/41399/, proposing a method for assessing sentiments expressed via latent semantic relations, patterns and dependencies among words in tweets.