Google to require disclosures for digitally altered election ads
Google is making it mandatory for advertisers to disclose election ads containing content that has been digitally altered to depict real or realistic-looking people or events.
The update forms part of its political content policy and will require advertisers running digitally altered ads to tick the ‘altered or synthetic content’ checkbox in their campaign settings.
Once the box is ticked, the search giant will generate its own in-ad disclosure for feeds and Shorts on mobile phones, and for in-stream ads on computers, mobile phones, and TVs.
For other formats, the advertiser is responsible for providing a prominent disclosure themselves.
The update comes during a year in which more than two billion people globally are heading to the polls.
The synthetic election content that must be disclosed includes computer-generated audio, imagery, and video.
Facebook and Instagram parent Meta has been taking similar steps to tackle the spread of misinformation.
Ahead of the recent EU elections, the firm created a team to tackle deepfakes and disinformation as part of its fact-checking network.
Meta’s internal deepfake watchdog was formed to “identify potential threats and put specific mitigations in place,” such as removing deceptive ads before they go live and debunking misleading AI-generated content.