
AI drives election year messaging boom

AI tools have proved both good and perilous in global democracies, writes RICCARDO AMATI of the Mobile Ecosystem Forum.

In the 2024 year of elections, including the general election in South Africa, messaging apps have become the dominant medium for political campaigns, revolutionising voter engagement. 

This shift is driven by the ubiquity of smartphones, declining trust in traditional media, and the powerful capabilities of AI, which have enabled more direct and personalised communication between candidates and voters. While this transformation offers more inclusive voter participation, it also raises concerns about the ethical use of AI and the potential for voter manipulation, highlighting the urgent need for regulatory oversight.

“AI allows campaigns to move beyond broad-based messaging to highly targeted communication strategies,” said Katie Harbath, head of international affairs at DUCO Experts and former public policy director at Facebook, in an interview with MEF’s “Perspectives” podcast. She emphasised that, while AI lets campaigns analyse voter data in real time and adapt their messaging swiftly, this precision also comes with the responsibility to ensure voters are informed rather than manipulated.

The rise of mobile messaging in political campaigns

With a record number of people either having voted or set to vote in 64 countries—representing about 49% of the global population—2024 has proven to be an acid test for election messaging. Mobile platforms like WhatsApp, Telegram, and Signal, along with SMS texts, have seen explosive growth in political usage due to their unparalleled reach and immediacy. In countries like India, Brazil, and the United States, these apps have become essential tools for political campaigns, effectively bypassing traditional media channels. 

In South Africa, which has more than 22-million smartphone users (Statista), political text messaging has been a key tool since at least 2016, when the opposition Democratic Alliance implemented a hybrid messaging system combining SMS, push notifications, and voice messages to engage voters. 

This approach ensured that messages reached all devices, regardless of app ownership. As smartphone app usage increased, South African political parties also developed mobile-optimised applications, focusing on supporting campaigns, particularly for canvassing and Election Day operations.
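
To make the idea of a hybrid, device-agnostic cascade concrete, here is a minimal, purely illustrative Python sketch of a channel-fallback dispatcher of the kind described above. All names, channels and the data model are hypothetical assumptions for illustration; this is not the Democratic Alliance’s actual system.

```python
# Illustrative sketch only: a generic channel-fallback dispatcher.
# Assumed/hypothetical: the Voter model and the three delivery stubs below.

from dataclasses import dataclass


@dataclass
class Voter:
    phone: str
    has_app: bool      # can receive push notifications via a campaign app
    accepts_sms: bool  # has opted in to SMS


def send_push(phone: str, text: str) -> bool:
    """Placeholder for push delivery via a campaign app; True on success."""
    return False  # in this demo, assume no app is installed


def send_sms(phone: str, text: str) -> bool:
    """Placeholder for SMS delivery via an aggregator; True on success."""
    print(f"SMS to {phone}: {text}")
    return True


def send_voice(phone: str, text: str) -> bool:
    """Placeholder for a text-to-speech voice call as the last resort."""
    print(f"Voice call to {phone}: {text}")
    return True


def dispatch(voter: Voter, text: str) -> str:
    """Try the richest channel first, then fall back so the message
    reaches every device regardless of app ownership."""
    if voter.has_app and send_push(voter.phone, text):
        return "push"
    if voter.accepts_sms and send_sms(voter.phone, text):
        return "sms"
    send_voice(voter.phone, text)
    return "voice"


if __name__ == "__main__":
    voter = Voter("+27-82-000-0000", has_app=False, accepts_sms=True)
    print(dispatch(voter, "Polling stations open at 07:00 on election day."))
```

The point of the sketch is the ordering: push where an app exists, SMS where it does not, and voice as the catch-all, which is what allows a single campaign message to reach feature phones and smartphones alike.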

In the highly contested May election, the African National Congress (ANC) lost its majority for the first time in 30 years. Ahead of the vote, the Institute for Security Studies (ISS) flagged the use of AI-generated deepfakes and disinformation campaigns. Researchers also identified tactics like “follow trains” and “hashjacking” to manipulate algorithms and create echo chambers. These methods, often undetected by followers, were particularly prevalent on social media, which has about 26-million users in South Africa.

The sheer volume of political messaging in the 2024 U.S. elections underscores the transformative role these platforms play. 

Thomas Peters, founder and CEO of RumbleUp—a peer-to-peer texting platform widely used by the Republican Party—highlighted this shift, noting that 2024 is expected to see a 50% increase in political messaging compared to 2022, when about 16-billion messages were sent.

“I would be surprised if we didn’t at least reach 25-billion political messages, both sides, between now and November 5th of this year,” he told a panel audience at the MEF Leadership Forum Americas in Miami, Florida.

Peters’ prediction may well have been surpassed, though figures are not yet available. In the campaign’s final ten days in particular, millions of Americans were bombarded with nonstop messages. Some of Harris’ texts warned, “The very future of the republic is at stake,” urging donations as low as $7. Meanwhile, Trump’s campaign, though focused on selling hats, mirrored the Democrats in using scare tactics, fake deadlines, and the illusion of personal messages from high-profile figures. 

What unfolded during the U.S. presidential campaign highlights the growing reliance on mobile messaging and the scale at which candidates are investing in these new forms of communication. The reasons are clear. According to research conducted by cybersecurity firm Proofpoint, most U.S. adults (60%) prefer to get their news through digital media. An even higher percentage (86%) often or sometimes consume news via a smartphone, tablet or computer. And most of the U.S. voting population (97%) has access to mobile messaging.

Mobile phone political messaging offers significant benefits, such as increasing voter engagement, particularly among younger and disenfranchised voters. It also provides a direct and efficient channel for candidates to inform the electorate about key issues, ultimately enhancing democratic participation.

The power and risks of AI in political messaging

The elections held around the world in 2024 were the first to take place in the era of chatbots.

In the U.S. campaign, OpenAI reported that ChatGPT blocked some 250,000 requests to create deepfakes, as users tried to generate fake articles, posts, and altered images of Trump and Harris. Among the artificial photos were images of Taylor Swift endorsing Republicans, Kamala Harris as a communist general, and Trump saving ducks and kittens to fuel a fake news story in Ohio. The goal was not to deceive but to influence: using memes to provoke and reinforce a narrative that required no facts to attack the opposition.

Similarly, a fake robocall from “President Joe Biden” highlighted the growing concern over misinformation. According to a McAfee study, 43% of Americans are worried about deepfakes influencing elections, 37% about undermining trust in the media, and 43% about impersonating public figures. 

In South Africa, a video surfaced during the 2024 election campaign that appeared to show Donald Trump endorsing a political party. Though poorly synced and accompanied by emojis, it highlighted the ease with which AI-generated content can be manipulated to falsely depict political endorsements. While AI-driven disinformation has not been widespread in South African campaigning, manipulated content and false claims, including borderline incitement, have raised serious concerns; one example was a threatening tweet aimed at ActionSA president Herman Mashaba. And while AI can positively influence voter education, as seen with Rivonia Circle’s “Thoko the Bot,” there is a clear need for critical engagement with online content. The media plays a vital role in providing accurate, timely information, particularly during elections.

In India, where general elections were held from April 19 to June 1, AI-driven deepfake technology was used to impersonate both deceased and living politicians, delivering messages under the guise of familiar leaders. According to cybersecurity firm McAfee, more than one in five Indians (22%) reported encountering political content that was later discovered to be a deepfake. About 75% had encountered deepfake content in the 12 months before March, with many expressing concern about its use for impersonating public figures, undermining trust in media, and influencing elections.

Similarly, in Pakistan, AI was used to simulate messages from imprisoned political leader Imran Khan, creating a scenario where voters received communications from a figure currently behind bars. In the U.S., AI-generated messages have impersonated candidates, further blurring the line between reality and manipulation. 

In the UK, ahead of the July 4 snap election, deepfake audio clips of Labour Party leader Keir Starmer spread rapidly on social media before being debunked by fact-checkers, as did similar clips of Slovak opposition leader Michal Šimečka ahead of his country’s vote. 

AI has significantly enhanced the effectiveness of mobile messaging by enabling campaigns to precisely target and personalise communication. However, this capability also introduces significant risks. The line between persuasion and manipulation is becoming increasingly blurred, particularly as AI-generated deepfakes grow more sophisticated. Anna Quint, Executive Director of Campaign Verify, warned in an interview with MEF that AI-generated content could genuinely sway an election, especially at the local level, where budgets are smaller and the electorate is less likely to recognise deepfakes as fabrications.

Regulatory challenges and future implications

The rise of mobile messaging during elections has exposed significant regulatory gaps. Unlike public social media platforms, messaging apps operate within closed networks, making it difficult for regulators to monitor content and curb the spread of misinformation. This regulatory void raises concerns about the integrity of the electoral process and the potential for abuse.

Anna Quint emphasised that while tools like Campaign Verify can verify the identity of political campaigns, broader regulatory measures are needed to address the content and governance of these messages. Without comprehensive regulation, political messaging risks becoming the “wild west.”
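
For illustration only, here is a minimal Python sketch of how sender-identity verification of this general kind can work in principle, using a generic HMAC-signed token. The token format, key handling and names are hypothetical assumptions; this is not Campaign Verify’s actual mechanism, protocol or API.

```python
# Illustrative sketch only: a generic signed sender-identity token.
# Assumed/hypothetical: the shared secret, token format and campaign IDs.

import hashlib
import hmac

REGISTRY_SECRET = b"secret-held-by-the-verification-authority"  # hypothetical


def issue_token(campaign_id: str) -> str:
    """The verification authority signs a campaign ID after vetting it."""
    sig = hmac.new(REGISTRY_SECRET, campaign_id.encode(), hashlib.sha256).hexdigest()
    return f"{campaign_id}.{sig}"


def verify_token(token: str) -> bool:
    """A messaging provider checks the signature before accepting traffic."""
    try:
        campaign_id, sig = token.rsplit(".", 1)
    except ValueError:
        return False
    expected = hmac.new(REGISTRY_SECRET, campaign_id.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)


if __name__ == "__main__":
    token = issue_token("us-senate-candidate-2024-001")
    print(verify_token(token))                        # True: vetted sender
    print(verify_token("spoofed-campaign.deadbeef"))  # False: rejected
```

The limitation Quint points to is visible even in this toy example: a valid signature proves only who is sending, not whether what they send is truthful, which is why identity verification alone cannot substitute for content and governance rules.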

Looking ahead, governments and tech companies must collaborate to create regulations that balance privacy with transparency, ensuring accountability in political messaging. The future will likely see increased regulation, stronger data privacy measures, and further integration of advanced technologies like AI, AR, and VR to engage voters in new ways.

The tech accord to combat the deceptive use of AI in election campaigns, promoted by 25 leading companies, including Amazon, Google, Meta and OpenAI, on the margins of the Munich Security Conference in February, is a step in the right direction. But it remains nothing more than a letter of intent.

The 2024 elections have marked a turning point in political communication, with mobile messaging becoming central to campaign strategies. This offers new opportunities for voter engagement, but also presents ethical and regulatory challenges. As Katie Harbath stated, “The future of political messaging is bright, but it’s up to us to ensure it serves the public good, not just political interests.” 

The potential for both positive innovation and manipulation is unprecedented. Protecting the integrity of elections and democracy will require careful consideration. 
