The EFF unveiled the Atlas of Surveillance today. It’s a database of surveillance tech used by law enforcement across the country. Anyone can use it to see what spying technology their state’s LE uses. You can download datasets, too.
We specifically focused on the most pervasive technologies, including drones, body-worn cameras, face recognition, cell-site simulators, automated license plate readers, predictive policing, camera registries, and gunshot detection. Although we have amassed more than 5,000 datapoints in 3,000 jurisdictions, our research only reveals the tip of the iceberg and underlines the need for journalists and members of the public to continue demanding transparency from criminal justice agencies.
A report from NBC News describes a tool from Grayshift (maker of the GrayKey device) called Hide UI, which until now has been kept secret from the public.
But another tool, previously unknown to the public, doesn’t have to crack the code that people use to unlock their phones. It just has to log the code as the user types it in.
Software called Hide UI, created by Grayshift, a company that makes iPhone-cracking devices for law enforcement, can track a suspect’s passcode when it’s entered into a phone, according to two people in law enforcement, who asked not to be named out of fear of violating non-disclosure agreements.
This is called a keylogger, and it is neither new nor revolutionary. It would be cheaper for police to use pen and paper to write down a suspect’s passcode, although there is that pesky Fifth Amendment.
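A keylogger is conceptually tiny: sit between the keyboard and whatever is reading the input, and keep a silent copy of every key. The Python sketch below is my own toy illustration of that concept, not how Hide UI actually works:

```python
# Toy keylogger concept: a simulated lock screen where a hidden hook
# records every keystroke before the passcode is assembled. This is an
# analogy for illustration only, not Grayshift's implementation.

captured = []  # the "hidden" copy of everything the user types

def record_keystroke(key: str) -> None:
    """Silently append each key the user presses."""
    captured.append(key)

def unlock_screen(keystrokes) -> str:
    """Simulated lock screen: collects digits into the entered passcode."""
    code = ""
    for key in keystrokes:
        record_keystroke(key)  # the logger sees the key first
        code += key
    return code
```

The point is that the passcode never has to be cracked at all; it only has to be observed once while the owner types it.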
Forensics company Cellebrite, mainly known for its iPhone hacking capabilities, released a report on top digital intelligence trends for 2020. One thing that stuck out to me:
…over 70 percent of officers are still asking witnesses and victims to surrender their devices…However, most people do not want to have their primary communication device taken away for an indefinite period. To combat this issue, 67 percent of agency management believe that mobility technology is important or very important to the agency’s long-term digital evidence strategy and 72 percent of investigators believe it is important to conduct in-the-field extractions of this data.
In other words, it sounds to me like LE wants the capability to extract data from devices on site, instead of sending it to a lab. Fast action is important for LE, but it may also be too fast for people to think about those pesky rights they have before handing their phone over.
Motherboard built a database of over 500 iPhones that law enforcement have tried to unlock. Many of them couldn’t be unlocked at all.
Out of 516 analyzed cases, 295 were marked as executed. Officials from the FBI, DEA, DHS Homeland Security Investigations, and the Bureau of Alcohol, Tobacco, Firearms and Explosives were able to extract data from iPhones in investigations ranging from arson, to child exploitation, to drug trafficking. And investigators executed warrants against modern iPhones, not just older models.
As mentioned, this provides useful data instead of the usual anecdotes. You can find the database here.
Apple has blocked Clearview AI’s iPhone app, saying it violated the terms of its enterprise program because the app wasn’t for internal use.
Clearview AI gained notoriety for partnering with law enforcement on facial recognition, using its database of billions of scraped images from the web. But someone just stole its list of clients.
…Clearview AI disclosed to its customers that an intruder “gained unauthorized access” to its list of customers, to the number of user accounts those customers had set up, and to the number of searches its customers have conducted. The notification said the company’s servers were not breached and that there was “no compromise of Clearview’s systems or network.”
Meanwhile, law enforcement on end-to-end encryption: “Who needs that kind of encryption, other than maybe the military? We don’t even — in law enforcement — use encryption like that.”
Andrew wrote that Apple scans uploaded iCloud content for child abuse imagery, and a search warrant reveals it scans emails too.
Inside a lab in New York worth US$10 million, specialists are trying to brute force their way into iPhones and iPads.
What’s going on in the isolation room is important, if silent, forensic work. All of the phones are hooked up to two powerful computers that generate random numbers in an attempt to guess the passcode that locked each device. At night, technicians can enlist other computers in the office, harnessing their unused processing power to create a local supercomputer network.
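The arithmetic behind that kind of brute forcing is straightforward: an n-digit passcode has 10^n possibilities, so worst-case cracking time is the keyspace divided by the guess rate. Here is a quick sketch with made-up round numbers, since the article doesn’t disclose the lab’s actual guess rates:

```python
def brute_force_hours(digits: int, guesses_per_second: float) -> float:
    """Worst-case hours to exhaust an all-digit passcode keyspace."""
    keyspace = 10 ** digits          # e.g. 10,000 codes for 4 digits
    seconds = keyspace / guesses_per_second
    return seconds / 3600

# At a hypothetical 10 guesses per second, a 4-digit passcode falls in
# well under an hour, while a 6-digit one can take more than a day.
```

This is also why every extra passcode digit matters: each one multiplies the attacker’s worst case by ten.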
According to Apple’s Legal Process Guidelines, there is a lot of data that the company can provide to law enforcement.
Apple had plans to introduce end-to-end encryption for iCloud backups, but canceled it two years ago after the FBI complained.
In a long read from NYT, Kashmir Hill writes about a startup called Clearview AI that works with law enforcement on facial recognition.
You take a picture of a person, upload it and get to see public photos of that person, along with links to where those photos appeared. The system — whose backbone is a database of more than three billion images that Clearview claims to have scraped from Facebook, YouTube, Venmo and millions of other websites — goes far beyond anything ever constructed by the United States government or Silicon Valley giants.
Starting January 20, 2020, Police Scotland will use devices called cyber kiosks to analyze the contents of smartphones during investigations.
Police Scotland will only examine a digital device where there is a legal basis and where it is necessary, justified and proportionate to the incident or crime under investigation.
Cyber kiosks used by Police Scotland will not be enabled to store data from digital devices. Once an examination is complete, all device data is securely deleted from the cyber kiosk.
Cellebrite, a company specializing in hacking smartphones for law enforcement, has acquired BlackBag Technologies, a company specializing in hacking computers for law enforcement. This will let Cellebrite offer law enforcement an “all-in-one” forensic solution to cover smartphones, laptops, desktops, and cloud data.
It also means offering a broad array of field acquisition capabilities including consent-based evidence collection along with an integrated solution set that provides access, insight and evidence management to facilitate and control large-scale deployments and orchestrate the entire digital intelligence operation.
Cellebrite offers all of these capabilities to law enforcement, but the FBI still wants Apple to create a backdoored version of iOS.
When a company like Cellebrite or Grayshift uses its devices to break into your iPhone, it’s not just your local data that can be accessed. Using various types of “cloud forensics” or cloud extraction technology, they can get your data in the cloud as well. It’s a long read but worth it.
Cellebrite’s UFED Cloud Analyzer, for example, uses login credentials that can be extracted from the device to then pull a history of searches, visited pages, voice search recordings and translations from Google web history and view text searches conducted with Chrome and Safari on iOS devices backed up to iCloud.
A judge recently ruled that law enforcement can search the DNA database GEDmatch, overriding the choices of its more than one million users.
In the wake of that attention-grabbing case, GEDmatch changed its policies in May 2018 to make it less easy for police to access their data. Users now have to opt in to having their data made available to police; information they upload is set to private by default. Rogers told the NYT that as of October, less than 15% of current users, 185,000 out of 1.3 million, have opted in to sharing their data with police.
Documents reveal that New York City law enforcement has a partnership with Cellebrite to hack iPhones.
Previously, if law enforcement wanted to get into newer devices, they had to send the phones to one of Cellebrite’s digital forensics labs, located in New Jersey and Virginia. But Cellebrite’s new UFED Premium program gave law enforcement the ability to “unlock and extract data from all iOS and high-end Android devices” on their own, using software installed on computers in their offices.
I’ve always wondered if eventually Apple will remove the Lightning port from the iPhone once wireless charging becomes the norm. Side effects may include better waterproofing and worsened hacking.
Ring, the Amazon-owned surveillance company that sells doorbell cameras, is partnering with 400 more police forces across the U.S.
The partnerships let police automatically request the video recorded by homeowners’ cameras within a specific time and area, helping officers see footage from the company’s millions of Internet-connected cameras installed nationwide, the company said. Officers don’t receive ongoing or live-video access, and homeowners can decline the requests, which Ring sends via email thanking them for “making your neighborhood a safer place.”
Previous Ring coverage: Here, and here.
The Sarasota County Sheriff’s office compiled a list of 15 apps that they believe pose a danger to young children. Here are the apps on the list:
MeetMe, Grindr, Skout, WhatsApp, TikTok, Badoo, Bumble, Snapchat, Kik, LiveMe, Holla, Whisper, Ask.fm, Calculator%, Hot or Not.
An app called what3words saved a group of friends after they got lost. Police told them to download the app and they were quickly found.
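what3words works by dividing the world into small grid squares and giving each square a unique three-word name, which is far easier to read over the phone than raw coordinates. The sketch below shows the underlying addressing idea with a made-up ten-word list; the real service uses a proprietary shuffle over tens of thousands of words, so this is only the concept:

```python
# Toy sketch of the what3words idea: number every cell in a grid, then
# spell that number in "base wordlist" as three words. Word list and
# encoding here are invented for illustration.

WORDS = ["apple", "brick", "cloud", "delta", "ember",
         "flint", "grove", "harbor", "ivy", "juno"]  # toy 10-word list

def cell_to_words(cell_index: int) -> str:
    """Encode a grid-cell index (0 <= index < len(WORDS)**3) as three words."""
    base = len(WORDS)
    assert 0 <= cell_index < base ** 3
    first, rest = divmod(cell_index, base * base)
    second, third = divmod(rest, base)
    return ".".join([WORDS[first], WORDS[second], WORDS[third]])
```

With ten words this toy grid covers only 1,000 cells; a 40,000-word list covers 40,000³ (64 trillion) cells, enough for 3-meter squares over the whole planet.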
A couple of weeks ago I shared news that Amazon is requiring police to promote its Ring surveillance cameras. Not that bad, I thought, since at least the police had to have the owner’s permission. But I was too optimistic: Amazon gives police talking points on how to persuade owners, and police can reportedly even obtain the video footage if the owner says no.
As reported by GovTech on Friday, police can request Ring camera footage directly from Amazon, even if a Ring customer declines to provide police with the footage. It’s a workaround that allows police to essentially “subpoena” anything captured on Ring cameras.
Things like government surveillance and hacking are precisely why I will never buy smart home products.

Update: A Ring spokesperson emailed me a correction: “The reports that police can obtain any video from a Ring doorbell within 60 days is false. Ring will not release customer information in response to government demands without a valid and binding legal demand properly served on us. Ring objects to overbroad or otherwise inappropriate demands as a matter of course.”