Microsoft has announced new tools to help combat deepfakes, Computing reports. The goal is to counter content such as video, audio, or photographs that has been edited to make it appear that someone did or said something they, in fact, never did.
According to the company, its first tool – dubbed ‘Video Authenticator’ – can analyse an image or video clip to determine whether it has been edited using AI. The tool then provides a confidence score indicating the likelihood that the media has been manipulated. In the case of a video, the tool shows a confidence score in real time on each frame as the video plays… According to Microsoft, the tool works by ‘detecting the blending boundary of the deepfake and subtle fading or greyscale elements that might not be detectable by the human eye.’ The company said that Video Authenticator was created using the public FaceForensics++ dataset and has been tested on the DeepFake Detection Challenge Dataset.
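To make the per-frame scoring idea concrete, here is a minimal sketch in Python. Microsoft's actual model and API are not public, so `score_frame` below is a purely hypothetical stand-in: it uses a trivial sharp-transition heuristic where the real tool would run a trained classifier looking for blending boundaries and greyscale artefacts.

```python
def score_frame(frame):
    """Hypothetical detector: return a manipulation confidence in [0, 1].

    This toy heuristic counts abrupt intensity jumps between neighbouring
    greyscale pixels, standing in for a real model that detects blending
    boundaries and subtle fading, as Microsoft describes.
    """
    flat = [px for row in frame for px in row]
    jumps = sum(1 for a, b in zip(flat, flat[1:]) if abs(a - b) > 64)
    return min(1.0, jumps / max(1, len(flat)))

def score_video(frames):
    """Return one confidence score per frame, as the tool reportedly does."""
    return [score_frame(f) for f in frames]

# Two toy 2x3 greyscale 'frames': one smooth, one with sharp transitions.
smooth = [[10, 12, 11], [11, 10, 12]]
choppy = [[0, 255, 0], [255, 0, 255]]
scores = score_video([smooth, choppy])
print(scores)
```

The point of the sketch is only the shape of the interface: a video becomes a sequence of frames, each frame is scored independently, and the per-frame scores can be rendered in real time as the clip plays.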
Check It Out: Microsoft Unveils New Tools to Help Fight Deepfakes