It works by storing a unique digital fingerprint for every photo found on these trusted sites. It also saves a signature of every photo you see while you browse with the extension installed.
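The "digital fingerprint" idea can be sketched with a simple perceptual average hash, one common way to fingerprint images so that recompressed or lightly edited copies still match. This is a minimal illustration, not the extension's actual algorithm: the 8x8 grayscale grid is assumed to be precomputed (a real tool would downsample with an image library), and the function names are made up for this example.

```python
def average_hash(pixels):
    """Compute a 64-bit fingerprint from an 8x8 grid of grayscale values.

    Each bit is 1 if that pixel is brighter than the grid's mean, else 0.
    Visually similar images produce hashes that differ in only a few bits.
    """
    mean = sum(pixels) / len(pixels)
    bits = 0
    for value in pixels:
        bits = (bits << 1) | (1 if value > mean else 0)
    return bits

def hamming_distance(a, b):
    """Count differing bits; a small distance suggests the same photo."""
    return bin(a ^ b).count("1")

# A recompressed copy (slightly shifted brightness) hashes identically,
# while a very different image is far away in Hamming distance.
original = [0] * 32 + [255] * 32
recompressed = [5] * 32 + [250] * 32
inverted = [255] * 32 + [0] * 32
```

Because the hash encodes brightness relative to the image's own mean, uniform changes from recompression tend not to flip bits, which is what lets a browser extension recognize a photo it has seen before even if the file bytes differ.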
Tim Cook continues to raise his profile, tweeting more publicity photos of his visits to lots of places, including a Normandy war cemetery. Bryan and Jeff reexamine the idea that Mr. Cook may be thinking of political office. They also talk about Dow Jones’s brief flirtation with publishing fake news about Apple, and how Apple has changed the way on/off buttons work in iOS.
The report sent shares of $AAPL up as trading bots reacted without being able to tell it was fake.
I know it’s coming. I know it’s unavoidable. But that doesn’t keep me from being terrified of this inevitable future when fake things are indistinguishable from reality. Adobe has its VoCo technology in testing, and that’s scary enough, but now University of Washington researchers have demonstrated the ability to match speech to a generated video. In the demonstration video, they used real speech from former president Barack Obama and matched it to artificially generated video of him speaking those same words. It’s easy to see this tech being used to match falsified speech to falsified video. And while some aspects of UW’s artificially generated video look fake, this is a demonstration, not a finished product. Within a few years, the ability to perfectly fake video and speech together will be available on our smartphones. The end result will be an ever-greater cynicism, a reluctance to believe anything you see. It’s inevitable, scary, and the technology is impressive as all heck. It will also be a huge test of democracy. Not only can anyone be made to say something they didn’t, anyone could also deny saying something they really did say, claiming to be the victim of this technology. The Atlantic has a good story with a lot more information on the university project.
InVID’s Chrome plugin is the front end to a sophisticated backend that sifts through metadata, information from the videos themselves, and social media information that a journalist—or anyone—could then use to determine whether a video is “fake.”
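The kind of metadata sifting described above can be illustrated with a toy provenance check. This is purely an illustrative sketch, not InVID's actual logic or API: the field names (`upload_time`, `claimed_event_time`, `original_url`, `reposted_urls`) are assumptions for the example, and a real verification backend draws on many more signals, such as keyframe analysis, reverse image search, and social media activity.

```python
from datetime import datetime

def basic_provenance_checks(meta):
    """Return a list of red flags from a video's metadata.

    `meta` is a dict with illustrative keys: upload_time and
    claimed_event_time (datetime), original_url (str or None),
    and reposted_urls (list of str).
    """
    flags = []
    # A video uploaded before the event it supposedly shows is suspect.
    if meta["upload_time"] < meta["claimed_event_time"]:
        flags.append("uploaded before the event it claims to show")
    # If only reposts exist and the original can't be traced, flag it.
    if meta.get("reposted_urls") and not meta.get("original_url"):
        flags.append("only reposts found; original source unknown")
    return flags
```

A journalist-facing tool would present flags like these as leads for human judgment, not as a verdict, which matches the article's framing that the plugin helps someone *determine* whether a video is fake rather than deciding for them.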
The proliferation of “fake news” has been blamed in part on social media companies’ hands-off approach to curation. Charlotte Henry argues this is one area where social media can take its cues from Apple and its heavily curated approach to Apple News.