Have you ever wondered if a video or audio recording was made by AI to trick you into thinking it's real? Well, now there's a Chrome browser extension for that.
Hiya's Deepfake Voice Detector is an extension that can detect AI-generated, deepfaked audio just about anywhere online, including in videos or recordings published to X/Twitter, YouTube, and Facebook. Once installed, it requires creating an account with a verified email address before you can use it.
Using it involves granting access for each test. The extension listens to only a few seconds of audio before making its decision. Because it relies on audio alone, videos without sound can't be checked. Once you've asked it to analyze a video playing in your active Chrome tab, a history of your past checks appears below in the extension in case you'd like to revisit any results.
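Hiya hasn't said exactly how the extension grabs those few seconds of audio, but Chrome offers a standard tabCapture API that extensions can use to record a tab's soundtrack. The sketch below is a rough illustration of that general pattern, not Hiya's actual code; the analyzeClip call at the end is a hypothetical stand-in for whatever detection service the clip gets sent to.

```typescript
// Minimal sketch of grabbing a few seconds of tab audio from a Chrome
// extension page. Illustrates the generic chrome.tabCapture pattern only;
// Hiya's real implementation is not public.
function captureFewSeconds(durationMs: number): Promise<Blob> {
  return new Promise((resolve, reject) => {
    chrome.tabCapture.capture({ audio: true, video: false }, (stream) => {
      if (!stream) {
        reject(new Error(chrome.runtime.lastError?.message ?? "Capture failed"));
        return;
      }
      const recorder = new MediaRecorder(stream);
      const chunks: Blob[] = [];
      recorder.ondataavailable = (e) => chunks.push(e.data);
      recorder.onstop = () => {
        stream.getTracks().forEach((t) => t.stop()); // release the tab's audio
        resolve(new Blob(chunks, { type: "audio/webm" }));
      };
      recorder.start();
      setTimeout(() => recorder.stop(), durationMs); // only a few seconds needed
    });
  });
}

// Hypothetical usage: record five seconds, then score the clip.
// const clip = await captureFewSeconds(5000);
// const score = await analyzeClip(clip); // analyzeClip is an assumed helper
```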
Hiya says it's releasing the tool now to help prevent political deepfakes from tricking viewers in the weeks leading up to the US federal election. Major political figures have already been extensively deepfaked, from the President Biden robocalls earlier this year to Vice President Kamala Harris deepfakes to AI images imagining Taylor Swift fans supporting Donald Trump.
"At Hiya, we’re committed to combating scams, fraud, and misinformation," says Hiya President Kush Parikh in a statement. "Deepfakes are becoming harder to detect, making it difficult to discern between real and fabricated content. Voice cloning, while an incredible technological innovation, is also being exploited by cybercriminals for scams and even to try to influence major events like elections."
PCMag tested the Hiya extension on a number of videos, including one in which deepfakes of Trump and Biden exchange jabs in a sit-down chat. Hiya's extension gave the video an "Authenticity Score" of 59/100, along with a note: "Our models are uncertain about this voice." That's somewhat reassuring, but it's unfortunate the extension couldn't determine with more certainty that the video was a deepfake.
It is often accurate, though. In PCMag's testing, Hiya correctly labeled a real video of Elon Musk talking about crypto as "likely authentic." The extension also correctly flagged obvious deepfakes of Musk as "likely a deepfake," along with a deepfake of Vice President Kamala Harris speaking with her running mate, Governor Tim Walz, giving both low trust scores.
It identified a deepfaked video of Will Smith criticizing Biden, viewed 118,000 times, though some users on X seem to have been fooled into thinking it's real. And it flagged a video of a supposed former student of Walz's and a video of Anderson Cooper as deepfaked, as well.
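Hiya hasn't published how its Authenticity Score translates into those verdicts, but the labels PCMag saw are consistent with simple thresholding on a 0-to-100 score. Here's a hypothetical sketch of that kind of mapping; the cutoff values are illustrative guesses, not Hiya's actual thresholds:

```typescript
// Hypothetical mapping from a 0-100 authenticity score to a verdict label.
// The thresholds are illustrative assumptions, not Hiya's published values.
type Verdict = "likely a deepfake" | "uncertain" | "likely authentic";

function verdictFromScore(score: number): Verdict {
  if (score < 40) return "likely a deepfake"; // low trust score
  if (score > 70) return "likely authentic";  // high confidence it's real
  return "uncertain"; // e.g., the 59/100 Trump/Biden clip PCMag tested
}
```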
If you're unsure about a Hiya result, you can try getting it to analyze several different parts of a recording to see if that yields a clearer reading. Otherwise, you'll have to do your own research and generally be wary of sensationalized content published on social media. There are other ways to spot deepfakes, too, like asking a person on a livestream to turn sideways or listening for an off-sounding lilt in someone's voice. Bad deepfakes may be blurry or sound like audio clips chopped together, but good deepfakes can be harder to spot.
In a Hiya survey, 13% of Americans said they've seen a deepfaked video, but more may have seen one without knowing it. In the past, deepfaked YouTube "livestreams" featuring "Elon Musk" or fake copies of crypto figures like Ethereum co-founder Vitalik Buterin have duped victims into falling for scams.
Access to Hiya's extension is free for now during the election season, though it's unclear whether the company will start charging later. Other deepfake detection tools have been around for years, and YouTube recently announced its own deepfake detection tool for its video site, though it will only be available to select creators to start.