AI-generated ‘documentary’ targeting Muslims exposes the rise of ‘slopaganda’ as a weapon for the far right
Fake documentary about a Muslim sex trafficking ring in an Irish town offers a case study in how cheap content is weaponised to spread hatred
DUBLIN (MNTV) — A YouTube video presented as an independent documentary recently surfaced on Irish social media, claiming to tell the story of a Muslim man running a sex trafficking and drug operation in the town of Clonmel. It had all the hallmarks of investigative journalism: a female narrator recounting a disturbing personal testimony, dramatic pacing, and a channel name — Gael Force Media — that suggested a credible media outlet.
Almost none of it was real. The footage was generated by artificial intelligence. The narrator’s voice was synthetic. The story it told was thin on verifiable detail — no specific locations, no dates, no named sources — and instead relied on recycled anti-immigrant tropes about Muslim men and sexual violence. YouTube itself flagged the video as “altered or synthetic content.”
As reported by the Journal.ie, the video represents a growing phenomenon that researchers have begun calling “slopaganda” — a term combining AI slop, the flood of cheap synthetic content degrading the internet, with propaganda, the deliberate use of information to advance a political agenda.
From geopolitics to local hate
The term entered mainstream usage around February, when the US-Israeli war on Iran produced a wave of AI-generated content from multiple sides — viral Lego-style videos from Iranian accounts mocking Trump, and Trump's own posts, which depicted everything from fighter jets dumping waste on protesters to images of himself as the Pope and as Christ.
But the Journal.ie’s reporting demonstrates that the phenomenon has already migrated from the level of geopolitical conflict down to the street-level politics of immigration and Islamophobia. The Clonmel video was not produced by a state actor or a sophisticated disinformation network. It was made by a fringe operation with links to Sinne na Daoine, an anti-immigrant “community safety” group that the video itself name-checks toward the end — functioning less as journalism than as a recruitment tool wrapped in the aesthetics of a documentary.
The Journal.ie contacted Gael Force Media to verify the claims and to request contact with the woman whose testimony was used, but received no response. Irish police would not confirm any details of the investigation the video alleged.
Why it works
The video’s production quality is low, its AI origins are obvious to anyone paying attention, and its claims are unsubstantiated. By conventional standards, it is not convincing. But that assessment misunderstands how this kind of content operates.
Slopaganda does not need to persuade through quality. It needs only to introduce the possibility that something might be true — to plant a seed of suspicion that persists even after the viewer scrolls past. In this case, the seed is a familiar one: the idea that Muslim men in Irish towns pose a sexual threat to women. The format borrows credibility from the documentary genre while the AI-generated presentation provides plausible deniability — there is no identifiable narrator to hold accountable, no real person whose claims can be challenged or fact-checked.
For fringe political groups, that combination is powerful. The content only needs to go viral once to achieve its purpose. And the obscured authorship — no human face, no traceable voice, no byline — makes it difficult to assign responsibility, even when the political affiliations of those behind the content are not especially hidden.
A pattern already visible
The Clonmel video is not an isolated experiment. During Ireland’s fuel protests in April, AI-generated posters calling for convoys and road blockades circulated widely on social media. Across Europe and the United States, synthetic content targeting Muslim communities — from fabricated crime statistics to deepfake testimonials — has become an increasingly common feature of far-right online ecosystems.
What distinguishes slopaganda from older forms of propaganda is not sophistication but accessibility. Historically, effective political propaganda required technical skill — the graphic design of a recruitment poster, the cinematography of a campaign film. Generative AI has eliminated that barrier entirely. Anyone with a laptop and a prompt can now produce content that mimics the format of journalism, testimony, or documentary filmmaking, and social media platforms will distribute it without meaningful friction.
The result is a media environment in which the distinction between authentic reporting and synthetic fabrication is becoming harder to maintain — and in which communities already targeted by prejudice, particularly Muslims, find themselves the subject of content that looks like evidence but is manufactured from nothing.
The word “slopaganda” may yet make Oxford’s word-of-the-year shortlist for 2026. The phenomenon it describes, however, is not a quirk of internet culture. It is a new infrastructure for hatred — cheap to build, difficult to trace, and designed to leave its mark even on those who recognise it for what it is.