Audio deepfakes are emerging as a powerful new tool in information warfare during a year of big elections around the world, as artificial intelligence-powered voice-cloning tools proliferate online.
On Monday, the office of New Hampshire’s attorney-general said it was investigating possible voter suppression after receiving complaints that an “artificially generated” voice in the likeness of US President Joe Biden was robocalling voters, urging them not to vote in the state’s presidential primary.
Researchers have also warned that the use of realistic but faked voice clips imitating politicians and leaders is likely to spread, following instances in 2023 of allegedly synthetic audio created to influence politics and elections in the UK, India, Nigeria, Sudan, Ethiopia and Slovakia.