Hannah Campbell: AI, campaigning and how to win in the ‘Kemi Rap’ era

Hannah Campbell specialises in data and AI workforce transformation. She was the Parliamentary candidate for Telford in 2024 and is currently the Regional Deputy Chairman (Political) for the West Midlands.

The 2024 general election was widely billed as the first “AI election”. It was not. Artificial intelligence appeared largely at the margins: modest productivity gains, a handful of deepfakes, and limited use by Reform to generate TikTok content. Yet to dismiss 2024 as overhyped would be a mistake. The cultural and technological shift is now unmistakable.

When an AI-assisted video of Kemi Badenoch “rapping” in response to the Budget went viral, it marked a turning point. The clip was satirical, quickly consumed, and widely shared. It demonstrated that political culture has already changed. AI is no longer experimental. It is becoming part of the core infrastructure of campaigns.

The next election will not be defined by whether AI appears, but by who uses it well. This transition will be visible in the May 2026 local elections and, by the next general election, AI will be embedded in the operations of every serious campaign.

This is not optional. Campaigns are being asked to deliver more with fewer volunteers, tighter budgets, and an electorate whose attention has shifted decisively towards short-form video, satire and decentralised online creators. Those who use AI to scale operations, sharpen messaging and produce high-impact content will outperform those who do not. For the Conservative Party, failing to keep pace would be a strategic error.

How AI Has Already Entered UK Campaigning:

Despite the hype, AI’s most common uses in 2024 were practical rather than transformative. Campaigns used it to draft emails, leaflets and social media posts; generate multiple message variations in seconds; speed up rebuttal and opposition analysis; add subtitles and edit video; and improve back-office efficiency.

In effect, AI has already become a productivity engine. It allows local associations to match national-level production values and to respond at speed in a compressed media cycle.

This matters in an environment where timing often determines reach.

Academic assessments also confirm that AI-enabled disinformation surfaced in 2024. The scale was limited, but the direction of travel is clear. The tools are improving, misuse is becoming cheaper, and voters are increasingly uncertain about what to trust. Future campaigns will operate in an environment shaped by synthetic content, rapid iteration and narratives created outside formal party structures.

The Opportunities AI Brings:

AI enables campaigns to do more with less. Volunteer numbers are falling, digital expectations are rising, and budgets remain constrained. AI can generate targeted messaging, graphics, canvassing routes, donor communications and briefing packs in minutes, freeing scarce human time for strategy and persuasion.

It also sharpens political judgement. AI can analyse sentiment, test arguments, identify weak lines and surface emerging issues before a human team has convened. It does not replace political instinct, but it accelerates it.

Mobilisation and engagement can also improve. Chatbots and automated tools can help voters with registration, voter ID requirements and polling logistics, reducing friction and allowing campaign teams to focus on persuasion rather than administration.

Crucially, AI levels the playing field. Associations and candidates without professional creative teams can now produce high-quality content. Campaign capability is being democratised, but only if people are trained and confident enough to use the technology. That requires investment, support and leadership from the centre.

The most significant cultural shift, however, is the rise of AI-enabled video satire. Younger voters increasingly consume politics through humour, remix culture and short-form video rather than leaflets or long policy documents. AI is now the engine powering much of this ecosystem.

Groups such as Crewkerne Gazette illustrate this vividly, producing parody songs and videos using AI-enhanced voices or imagery of political figures. These are not official communications, yet they shape perceptions, embed narratives and reach audiences that formal campaigns struggle to access.

Political satire is not new, but access has changed. You no longer need a production studio to reach millions. AI-powered satire spreads quickly and cheaply, often outside formal campaign rules, and the boundary between parody and misinformation is increasingly blurred.

Risks to Democratic Integrity:

The risks are real. Deepfakes and engineered deception are becoming easier to produce, and misuse will increase. Even a small number of convincing falsehoods can erode public trust. When voters assume any clip might be fake, democratic accountability weakens.

Hyper-personalised persuasion also raises concerns. AI-driven interactions can be persuasive yet opaque, creating an uneven playing field. Microtargeted messages fragment public debate by delivering claims to audiences unseen by others, limiting scrutiny and challenge.

Independent creators may also cross ethical lines that political parties would avoid. Campaigns may benefit or suffer reputationally, but they cannot control the outcome.

Unlabelled synthetic content creates serious exposure. It can breach electoral law around imprints and misleading material, trigger defamation claims where individuals are falsely represented, and raise data protection issues where targeting lacks proper safeguards. Platforms may remove content at critical moments, and regulators may intervene. Most damaging of all, voters punish perceived manipulation. Once trust is lost, it is difficult to recover.

Using AI for Attacks – Legal Boundaries:

AI-assisted attacks must sit within clear legal limits. It is lawful to analyse opponents’ records, highlight genuine inconsistencies, and produce parody or satire, provided it is clearly labelled and would not reasonably be understood as fact.

However, AI-generated audio, video or imagery that fabricates words, actions or behaviour risks breaching electoral law, particularly the prohibition on false statements about a candidate’s character or conduct under the Representation of the People Act. Outsourcing such material to “independent” creators does not remove exposure where campaigns encourage, coordinate or knowingly benefit. The test is whether voters are misled, not who created the content.

What Conservatives Should Do Now:

The Conservative Party operates within a framework overseen by the Electoral Commission and cannot dictate the direction of regulation. What it can do is ensure its own practices are robust, credible and defensible.

That means setting clear internal standards now: no synthetic impersonation intended to mislead; consistent labelling of AI-generated content; and a presumption that campaign communications are traceable and attributable. These safeguards reflect existing law, public expectation and basic political common sense.

Regulation should remain targeted and proportionate, focused on deception, impersonation and covert interference rather than legitimate creativity or satire. Where work is already under way through regulators or Parliament, Conservatives should engage constructively. Shaping outcomes from within the system is more effective than objecting once rules are fixed.

Above all, the party should lead by example. If Conservatives use AI openly, responsibly and competently, that approach becomes the benchmark. In practice, regulators tend to codify behaviour that already works.

The Strategic Imperative:

The 2024 election was the AI election that was not. The next will be the opposite. As volunteer numbers decline, budgets tighten and video dominates political communication, AI will sit at the heart of campaign success.

AI will shape how future elections are fought, whether parties prepare for it or not. Those that embed it early, train their people and use it responsibly will define the terms of political competition. The rest will spend their time reacting to a campaign environment they no longer control.
