
How Your Digital Trails Wind Up in the Police’s Hands

Phone calls. Web searches. Location tracks. Smart speaker requests. They've become crucial tools for law enforcement, often without users' knowledge.

Michael Williams’ every move was being tracked without his knowledge—even before the fire. In August, Williams, an associate of R&B star and alleged rapist R. Kelly, allegedly used explosives to destroy a potential witness’s car. When police arrested Williams, the evidence cited in a Justice Department affidavit was drawn largely from his smartphone and online behavior: text messages to the victim, cell phone records, and his search history.

The investigators served Google a “keyword warrant,” asking the company to provide information on any user who had searched for the victim’s address around the time of the arson. Police narrowed the search, identified Williams, then filed another search warrant for two Google accounts linked to him. They found other searches: the “detonation properties” of diesel fuel, a list of countries that do not have extradition agreements with the US, and YouTube videos of R. Kelly’s alleged victims speaking to the press. Williams has pleaded not guilty.

Data collected for one purpose can always be used for another. Search history data, for example, is collected to refine recommendation algorithms or build online profiles, not to catch criminals. Usually. Smart devices like speakers, TVs, and wearables keep such precise details of our lives that they've been used as both incriminating and exonerating evidence in murder cases. Speakers don't have to overhear crimes or confessions to be useful to investigators. They keep time-stamped logs of all requests, alongside details of the user's location and identity. Investigators can access these logs and use them to verify a suspect's whereabouts or even catch them in a lie.
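
To make that concrete, here is a minimal sketch in Python of what such a time-stamped request log and such a whereabouts check might look like. The record fields and the helper function are hypothetical illustrations, not any vendor's actual schema or API.

```python
from datetime import datetime, timezone

# Hypothetical smart-speaker request log. Field names are illustrative,
# not any vendor's real schema.
speaker_log = [
    {"account": "user-4821",
     "request": "set a timer for 20 minutes",
     "timestamp": datetime(2020, 8, 11, 21, 47, tzinfo=timezone.utc)},
    {"account": "user-4821",
     "request": "play the evening news",
     "timestamp": datetime(2020, 8, 11, 23, 5, tzinfo=timezone.utc)},
]

def requests_during(log, start, end):
    """Return every logged request inside a time window -- the kind of
    filter that could confirm or contradict a suspect's stated whereabouts."""
    return [entry for entry in log if start <= entry["timestamp"] <= end]

window = requests_during(
    speaker_log,
    datetime(2020, 8, 11, 21, 0, tzinfo=timezone.utc),
    datetime(2020, 8, 11, 22, 0, tzinfo=timezone.utc),
)
print(window)  # one request at 21:47 -- someone was home and speaking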

It isn’t just speakers or wearables. In a year when some in Big Tech pledged support for the activists demanding police reform, those same companies still sold devices and furnished apps that give government agencies access to far more intimate data, from far more people, than traditional warrants and police methods would permit.

A November report in Vice found that users of the popular Muslim Pro app may have had data on their whereabouts sold to government agencies. Any number of apps ask for location data to provide, say, local weather or to track your exercise habits. The Vice report found that X-Mode, a data broker, collected Muslim Pro users’ location data, gathered ostensibly for prayer reminders, then sold it to others, including federal agencies. Both Apple and Google banned developers from transferring data to X-Mode, but it had already collected data from millions of users.

The problem isn’t any one app, but an overcomplicated, under-scrutinized system of data collection. In December, Apple began requiring developers to disclose key details about their privacy policies in a “nutrition label” for apps. Users “consent” to most forms of data collection when they click “Agree” after downloading an app, but privacy policies are notoriously incomprehensible, and people often don’t know what they’re agreeing to.

An easy-to-read summary like Apple’s nutrition label is useful, but not even developers know where the data their apps collect will eventually end up. (Many developers contacted by Vice admitted they didn’t even know X-Mode accessed user data.)

The pipeline between commercial and state surveillance is widening as we adopt more always-on devices and serious privacy concerns are dismissed with a click of “I Agree.” The nationwide debate on policing and racial equity this summer brought that quiet cooperation into stark relief. Despite lagging diversity numbers, indifference to white nationalism, and mistreatment of nonwhite employees, several tech companies raced to offer public support for Black Lives Matter and reconsider their ties to law enforcement.

Amazon, which committed millions to racial equity groups this summer, promised to pause (but not stop) sales of facial-recognition technology to police after defending the practice for years. But the company also noted an increase in police requests for user data, including the internal logs kept by its smart speakers.

Google’s support for racial equity included donations and doodles, but law enforcement agencies increasingly rely on “geofence warrants.” In these cases, police request data from Google or another tech company on all the devices near the scene of an alleged crime around the time it occurred. Google returns an anonymized list of users, which police narrow down before requesting identifying data on specific suspects.
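
A rough sketch of the space-and-time filter a geofence request describes, under the assumption that location data arrives as per-device pings; the record format, radius, and function names here are hypothetical, not how any company’s internal systems actually work.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from math import radians, sin, cos, asin, sqrt

@dataclass
class Ping:
    device_id: str   # anonymized at this stage of the process
    lat: float
    lon: float
    timestamp: datetime

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))

def geofence_hits(pings, lat, lon, radius_m, start, end):
    """Every device seen inside the fence during the time window --
    the anonymized candidate list that police then narrow down."""
    return [
        p for p in pings
        if start <= p.timestamp <= end
        and haversine_m(p.lat, p.lon, lat, lon) <= radius_m
    ]

pings = [
    Ping("anon-17", 41.8790, -87.6359, datetime(2020, 8, 11, 2, 10, tzinfo=timezone.utc)),
    Ping("anon-52", 41.9000, -87.7000, datetime(2020, 8, 11, 2, 15, tzinfo=timezone.utc)),
]
hits = geofence_hits(
    pings, lat=41.8789, lon=-87.6358, radius_m=150,
    start=datetime(2020, 8, 11, 2, 0, tzinfo=timezone.utc),
    end=datetime(2020, 8, 11, 3, 0, tzinfo=timezone.utc),
)
print([p.device_id for p in hits])  # ['anon-17'] -- inside the fence and window
```

Note that every device that happens to pass through the fence during the window lands on the candidate list, which is precisely the overbreadth critics object to.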

As with keyword warrants, police get anonymized data on a large group of people for whom no tailored warrant has been filed. Between 2017 and 2018, Google reported a 1,500 percent increase in geofence requests. Apple, Uber, and Snapchat have also received similar requests for the data of large groups of anonymous users.

Civil rights organizations have called on Google to disclose how often it fulfills these geofence and keyword requests. A magistrate judge in a Chicago case said the practice “ensures an overbroad scope” and questioned whether it violates Fourth Amendment protections against invasive searches. Similarly, a forensic expert who specializes in extracting data from IoT devices like speakers and wearables questioned whether it is possible to tailor such a search: data from a smart speaker might link investigators to a laptop, then to a smartphone, then to a smart TV. Connecting these devices is marketed as a convenience for consumers, but it also widens law enforcement’s access to our data.

These warrants dramatically expand police access to our private information. In some cases, the way apps collect data on us turns them into surveillance tools whose reach rivals anything police could achieve through traditional warrants and methods.

The solution isn’t simply for people to stop buying IoT devices or for tech companies to stop sharing data with the government. But “equity” demands that users be aware of the digital breadcrumbs they leave behind as they use electronic devices, and of how state agents capitalize on both obscure systems of data collection and our own ignorance.

