Researchers found that Zoom uses its own encryption scheme, with meeting keys sometimes issued by servers located in China.
Some of the key management systems — 5 out of 73, in a Citizen Lab scan — seem to be located in China, with the rest in the United States. Interestingly, the Chinese servers are at least sometimes used for Zoom chats that have no nexus in China. The two Citizen Lab researchers, Bill Marczak and John Scott-Railton, live in the United States and Canada. During a test call between the two, the shared meeting encryption key “was sent to one of the participants over TLS from a Zoom server apparently located in Beijing,” according to the report.
I don’t have further commentary on Zoom, other than asking, “How will this end?”
Along with the recent news that Zoom sent user data to Facebook (it has since stopped), we now learn that its video calls don't use end-to-end encryption, despite the company marketing them as such.
…But despite this misleading marketing, the service actually does not support end-to-end encryption for video and audio content, at least as the term is commonly understood. Instead it offers what is usually called transport encryption, explained further below.
It just keeps getting worse for Zoom. It’s unfortunate the company has chosen such tactics, because it really is one of the better video calling apps out there.
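The distinction Citizen Lab draws, between transport encryption and true end-to-end encryption, comes down to who holds the keys. Here is a toy sketch of that difference. This is not real cryptography (the "cipher" is a throwaway XOR keystream, and all the names and keys are hypothetical); it only shows why a relay server can read content under transport encryption but not under end-to-end encryption:

```python
# Toy illustration (NOT real cryptography) of transport encryption vs.
# end-to-end encryption. The "cipher" is a keyed XOR keystream built from
# SHA-256, used purely so both sides of the demo can encrypt and decrypt.
import hashlib
from itertools import count

def keystream(key: bytes, length: int) -> bytes:
    out = b""
    for counter in count():
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        if len(out) >= length:
            return out[:length]

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # XOR is its own inverse, so the same call encrypts and decrypts.
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

message = b"meeting notes"

# Transport encryption: each hop is encrypted, but the relay server holds
# the session keys, so it sees the plaintext when it relays the message.
alice_server_key = b"alice<->server"
server_bob_key = b"server<->bob"
on_the_wire = xor_cipher(alice_server_key, message)
seen_by_server = xor_cipher(alice_server_key, on_the_wire)  # server decrypts
relayed = xor_cipher(server_bob_key, seen_by_server)        # then re-encrypts
print(seen_by_server)  # b'meeting notes' -- the server can read it

# End-to-end encryption: only Alice and Bob hold the key; the server
# relays ciphertext it cannot decrypt.
alice_bob_key = b"alice<->bob only"
e2e_ciphertext = xor_cipher(alice_bob_key, message)
print(xor_cipher(alice_bob_key, e2e_ciphertext))  # b'meeting notes' for Bob only
```

The point is structural: with transport encryption the provider necessarily holds a decryption key, so it (or anyone who compels it) can read the content; with end-to-end encryption it cannot.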
Andrew found seven alternatives to Apple apps to use if you don't want your data shared with the FBI, including Bitwarden, Cryptomator, and more.
A flaw found in Intel chips lets attackers decrypt your hard drive, among other things. It can’t be fixed, only mitigated with patches.
Introduced by Senators Lindsey Graham and Richard Blumenthal, the EARN IT Act would force companies to "earn" their Section 230 liability protections by fighting online child exploitation.
Though it seems wholly focused on reducing child exploitation, the EARN IT Act has definite implications for encryption. If it became law, companies might not be able to earn their liability exemption while offering end-to-end encrypted services. This would put them in the position of either having to accept liability or remove encryption protections altogether.
The bill I linked to yesterday is separate from the EARN IT Act, but together they show that companies are being pressured on two fronts.
Let’s Encrypt announced on Saturday, February 29 that it discovered a bug in its Certification Authority Authorization (CAA) code.
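For context, CAA is a DNS record type that lets a domain owner restrict which certificate authorities may issue certificates for it; the bug concerned rechecking those records before issuance. The basic authorization logic looks roughly like this sketch. Hard-coded records stand in for live DNS lookups, and details of RFC 8659 (the "issuewild" tag, critical flags, and the required recheck timing that the bug involved) are omitted:

```python
# Simplified sketch of a CAA authorization check. Real CAs must also honor
# "issuewild", critical flags, and recheck CAA shortly before issuance --
# the rechecking detail is what the Let's Encrypt bug involved.
CAA_RECORDS = {  # hypothetical zone data: domain -> list of (tag, value)
    "example.com": [("issue", "letsencrypt.org")],
    "locked.example.org": [("issue", ";")],  # ";" forbids all issuance
}

def caa_allows(domain: str, ca: str) -> bool:
    # Climb from the requested name toward the root; the first name with
    # any CAA records is the relevant record set.
    labels = domain.split(".")
    for i in range(len(labels)):
        name = ".".join(labels[i:])
        records = CAA_RECORDS.get(name)
        if records is not None:
            issuers = [value for tag, value in records if tag == "issue"]
            return ca in issuers  # the CA must be explicitly listed
    return True  # no CAA records anywhere: any CA may issue

print(caa_allows("www.example.com", "letsencrypt.org"))   # True
print(caa_allows("www.example.com", "other-ca.example"))  # False
print(caa_allows("no-caa.example.net", "letsencrypt.org"))  # True
```

Note that the empty-issuer record `";"` is how a domain forbids all issuance, which is why `locked.example.org` above rejects every CA.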
Sir Andrew Parker is the head of MI5, the UK’s domestic security service. He wants tech firms to provide “exceptional access” to encrypted messages.
In an ITV interview to be broadcast on Thursday, Sir Andrew Parker says he has found it “increasingly mystifying” that intelligence agencies like his are not able to easily read secret messages of terror suspects they are monitoring.
Bah, this is smoke and mirrors. As the head of a security agency he knows that restricting backdoors to the good guys is impossible.
Starting today, Firefox will begin rolling out support for encrypted DNS over HTTPS for U.S.-based users.
We’re enabling DoH by default only in the US. If you’re outside of the US and would like to enable DoH, you’re welcome to do so by going to Settings, then General, then scroll down to Networking Settings and click the Settings button on the right. Here you can enable DNS over HTTPS by clicking the checkbox.
You can choose between Cloudflare and NextDNS. As I mentioned in my roundup of DNS services, I’ve been using NextDNS for the past couple weeks and I love it.
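Under the hood, a DoH client wraps an ordinary binary DNS query in an HTTPS request. Here is a minimal sketch of how the GET form of such a request is assembled per RFC 8484; the resolver URL is Cloudflare's Mozilla endpoint, and nothing is actually sent over the network here:

```python
# Sketch of how a DoH client forms a GET request (RFC 8484): the DNS query
# is built in ordinary wire format, then base64url-encoded (without padding)
# into the "dns" URL parameter. The query ID is 0, as the RFC recommends
# for cache-friendly GET requests.
import base64
import struct

def dns_query(name: str, qtype: int = 1) -> bytes:
    """Build a minimal DNS query in wire format (qtype 1 = A record)."""
    header = struct.pack(">HHHHHH", 0, 0x0100, 1, 0, 0, 0)  # ID 0, RD=1, 1 question
    qname = b"".join(
        bytes([len(label)]) + label.encode("ascii")
        for label in name.split(".")
    ) + b"\x00"
    question = qname + struct.pack(">HH", qtype, 1)  # QTYPE, QCLASS=IN
    return header + question

def doh_url(resolver: str, name: str) -> str:
    payload = base64.urlsafe_b64encode(dns_query(name)).rstrip(b"=")
    return f"{resolver}?dns={payload.decode('ascii')}"

url = doh_url("https://mozilla.cloudflare-dns.com/dns-query", "example.com")
print(url)
# A real client would GET this URL with the header
# "Accept: application/dns-message" and parse the binary answer.
```

Because the whole exchange rides over ordinary HTTPS on port 443, your ISP sees only a TLS connection to the resolver, not which names you looked up.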
Andrew Orr joins host Kelly Guimont for Security Friday, discussing a new data breach and keeping your ISP from selling your web history.
Two phrases that you’ll often hear in security are “bank-level security” and “military-grade encryption.” But what do they mean?
In certain areas of the U.S., some AT&T users found they couldn't access their inboxes in the encrypted email app Tutanota.
Starting on January 25th 2020, we have had constant complaints from AT&T mobile users who were unable to access their encrypted Tutanota mailbox. While AT&T seemed willing to fix this when we reached out to them, the issue is still not solved and reports from users keep coming in.
While some AT&T users confirmed the block, others said that they were able to access Tutanota. As AT&T has not fixed the issue after more than two weeks, we are reaching out publicly in the hope of getting the attention of the right people at AT&T.
Signal creator Moxie Marlinspike is growing the Signal Foundation and adding new features to the app thanks to money from WhatsApp cofounder Brian Acton.
Since then, Marlinspike’s nonprofit has put Acton’s millions—and his experience building an app with billions of users—to work. After years of scraping by with just three overworked full-time staffers, the Signal Foundation now has 20 employees. For years a bare-bones texting and calling app, Signal has increasingly become a fully featured, mainstream communications platform. With its new coding muscle, it has rolled out features at a breakneck speed…
I wish I could use Signal but none of my friends use it.
Four years ago a federal judge held Francis Rawls in contempt when he refused to decrypt hard drives for police.
The practical result is that, at least in federal court, someone can only be imprisoned for 18 months for refusing to open an encrypted device. That’s probably a harsh-enough penalty to induce most people to comply with decryption orders. But suspects in child-pornography cases might be tempted to “forget” the passwords on their encrypted device if doing so could save them from a conviction and a much longer prison term.
What an interesting case, and I remember reading about it four years ago. I wonder if the court was trying to set a precedent for passwords and the Fifth Amendment.
In a report from the Financial Times (paywall), a letter signed by 129 non-profits, think tanks, and academics urges Facebook to reconsider encrypting its apps. The signatories use the “think of the children” argument because encryption could enable more child sexual abuse. But Justin Myles Holmes says we should think of the children and enable end-to-end encryption for them, so their data isn’t used and abused by corporations precisely like Facebook.
If we fail to take action now, we risk a world in which unsavory actors – domestic and foreign – have built rich, comprehensive profiles for every one of our children, following the trajectories of their education, home life, consumer habits, health, and on and on. These profiles will then be used to manipulate their behavior not only as consumers, but as voters and participants in all those corners of society which, in order for freedom and justice to prevail, require instead that these kids mature into functional, free-thinking adults.
Vicki Boykis wrote yesterday about Apple’s approach to privacy, its current flaws, and how the company should do better (I agree!).
So, here we are, in 2020, with Apple in a bit of a pickle. It’s becoming so big that it’s not prioritizing security. At the same time, it needs to advertise privacy as a key differentiator as consumer tastes change. And, at the same time, it’s about to get canclled [sic] by the FBI, China, and Russia.
And while it’s thinking over all of these things, it’s royally screwing over the consumer who came in search of a respite from being tracked.
Senator Lindsey Graham is drafting a bill [PDF] that could penalize companies using end-to-end encryption.
Although the measure doesn’t directly mention encryption, it would require that companies work with law enforcement to identify, remove, report and preserve evidence related to child exploitation — which critics said would be impossible to do for services such as WhatsApp that are encrypted from end-to-end.
If technology companies don’t certify that they are following the best practices set by the 15-member commission, they would lose the legal immunity they currently enjoy under Section 230 relating to child exploitation and abuse laws. That would open the door to lawsuits for “reckless” violations of those laws, a lower standard than contained in current statutes.
Of all the dumb things this administration has done, attacking encryption is a doozy. It’s not clear how much this would impact Apple, since the company does in fact scan for child abuse images. But iMessage and a few other services are end-to-end encrypted.
Inside a lab in New York worth US$10 million, specialists are trying to brute force their way into iPhones and iPads.
What’s going on in the isolation room is important, if silent, forensic work. All of the phones are hooked up to two powerful computers that generate random numbers in an attempt to guess the passcode that locked each device. At night, technicians can enlist other computers in the office, harnessing their unused processing power to create a local supercomputer network.
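The arithmetic behind this work is simple: a 4-digit passcode has only 10^4 possibilities. Here is a toy sketch of the exhaustion loop; the key-derivation function and passcode are made up, and real iOS devices entangle the passcode with a per-device hardware key and rate-limit guesses in the Secure Enclave, which is why real forensic attempts are slow and must run against the device itself:

```python
# Toy sketch of passcode exhaustion. The derive() function is a stand-in
# for the real, hardware-bound key derivation; the target passcode "7351"
# is hypothetical.
import hashlib
from itertools import product
from string import digits

def derive(passcode: str) -> bytes:
    # Stand-in for the real (hardware-entangled, deliberately slow) KDF.
    return hashlib.sha256(passcode.encode()).digest()

target = derive("7351")  # hypothetical 4-digit passcode to recover

def brute_force(alphabet: str, length: int, target: bytes):
    attempts = 0
    for combo in product(alphabet, repeat=length):
        attempts += 1
        guess = "".join(combo)
        if derive(guess) == target:
            return guess, attempts
    return None, attempts

guess, attempts = brute_force(digits, 4, target)
print(guess)                # 7351
print(attempts <= 10_000)   # True: at most 10**4 candidates for a 4-digit PIN
```

Each extra digit multiplies the search space by 10, and switching to a long alphanumeric passphrase multiplies it by the alphabet size per character, which is why a strong passcode is the single best defense against this kind of lab.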
Bryan Chaffin and Andrew Orr join host Kelly Guimont to discuss Apple’s decision not to encrypt backups, and what data Apple can share.
According to Apple’s Legal Process Guidelines, there is a lot of data that the company can provide to law enforcement.