‘Warhammer’ legend Jervis Johnson has issued a stark warning about the unchecked spread of artificial intelligence, arguing that generative AI risks becoming “the asbestos of the internet” if the creative industries fail to treat it with caution.
Johnson, one of the principal designers behind Warhammer 40,000, Necromunda, and Blood Bowl, spoke out in support of Games Workshop’s decision to bar generative AI from its creative pipelines. In his view, the current wave of AI tools is wildly overhyped and fundamentally incompatible with the kind of original, authored work that built the Warhammer universe into a global phenomenon.
Most of what passes for AI “innovation” today, he suggested, is far less impressive than its marketing. “Most of the stuff that I’ve seen doesn’t seem to actually quite match up to the hype,” Johnson said in a recent interview. He referenced a line that resonated with him: AI could be to the digital world what asbestos was to construction, initially celebrated as a miracle material and later recognized as a pervasive hazard that would take decades to remove.
The comparison is pointed. Asbestos was once everywhere: cheap, flexible, and apparently safe. It was layered into buildings and products at scale before the long‑term damage became impossible to ignore. Johnson fears generative AI could follow a similar trajectory online: rapidly embedded into tools, workflows, and platforms, only for creators and businesses to discover too late that it undermines originality, corrodes trust, and pollutes the cultural landscape.
His comments come against the backdrop of Games Workshop’s explicit ban on generative AI in any part of its creative process. The company, founded in 1975 and publicly traded since the mid‑1990s, has built its brand on distinctive worlds, hand‑crafted miniatures, bespoke artwork, and tightly authored narratives. Its statement on AI makes clear that all official content, including art, text, rules, and designs, must be produced by human creators, not machine‑generated systems.
For Johnson, that stance is not just about legal compliance or brand protection; it is about recognizing where value is actually created. Generative AI models, by design, remix and reassemble patterns from vast datasets of existing human work. To him, this is fundamentally parasitic: it leans on other people’s creativity without truly contributing anything new.
The tension is especially visible in tabletop gaming, where artwork, lore, and rules are intimately tied to a designer’s voice. Warhammer’s grimdark aesthetic, its dense history of factions, and its intricate rulesets were not produced by algorithms; they were painstakingly developed over decades by writers, artists, sculptors, and game designers. Johnson suggests that treating AI outputs as equivalent to that labour risks flattening the very qualities that made such worlds compelling.
Beyond questions of artistic integrity, Johnson’s “asbestos” analogy also speaks to a longer‑term, structural concern. Once generative AI is deeply woven into content pipelines, websites, design tools, and even operating systems, it may prove extremely difficult to disentangle. Companies could find themselves dependent on tools that erode the uniqueness of their IP. Audiences might lose the ability to distinguish between authentic, purposeful human expression and automated filler.
There is also the issue of cultural noise. As more AI tools churn out articles, images, and scripts at industrial scale, the internet risks being swamped by derivative content. Johnson’s warning implies that this flood could bury genuinely original voices under layers of formulaic output that looks competent on the surface but is hollow underneath, “a bit rubbish,” as he put it.
His scepticism challenges a prevailing narrative that generative AI is an inevitable, unambiguously positive step forward. Instead, he frames it as a technology whose costs have been underplayed, whose limitations are glossed over, and whose long‑term damage may become visible only after it has permeated every level of creative production.
From a creator’s perspective, there are several specific risks that make the asbestos comparison feel uncomfortably apt:
1. Erosion of craft
When studios replace early‑career artists, writers, or concept designers with AI tools, they remove the ladder that once allowed new talent to develop. Over time, that can hollow out the profession itself. The industry may reap short‑term savings but face a shortage of experienced creators later.
2. Legal and ethical grey zones
Generative models are trained on enormous datasets that often include copyrighted material. While some argue this falls under broad notions of “transformative use,” many artists and writers view it as unconsented exploitation. If courts and regulators later clamp down, studios that bet heavily on AI‑generated content could find themselves on unstable ground.
3. Brand dilution
Distinctive brands like Warhammer rely on a consistent, recognizable style. Flooding a setting with AI‑generated visuals or text, even if superficially on‑model, can make it feel generic. Over time, that undermines the aura of authenticity that drives fan loyalty and merchandising.
4. Data feedback loops
As AI‑generated content proliferates online, newer models may increasingly be trained on prior AI output. This feedback loop can degrade quality, reinforce clichés, and lock in existing biases. The internet becomes more homogenous and less surprising.
5. Loss of trust
Readers, players, and consumers are increasingly asking a simple question: “Did a person actually make this?” If companies cannot answer clearly, or choose not to, audience trust may erode. In niche, community‑driven hobbies such as tabletop gaming, that trust is a critical asset.
Johnson’s perspective does not deny that AI can have useful applications. Many creators quietly use software for spell‑checking, layout, upscaling, or other mundane tasks. The line he and Games Workshop appear to draw is between tools that support human authorship and systems that attempt to replace it. The former extend craft; the latter, in his view, cannibalize it.
There is also a cultural dimension specific to Warhammer and similar franchises. Fans have always invested their own creativity into the hobby-designing custom armies, writing fan fiction, painting miniatures, building terrain. If corporate owners began to flood official channels with machine‑made content, it could feel like an intrusion into a space defined by human imagination and effort.
Looking ahead, Johnson’s warning suggests a path for more responsible AI use in creative industries:
– Transparency by default: Clearly labeling AI‑assisted or AI‑generated work allows audiences to make informed decisions about what they value and support.
– Human‑first pipelines: Using AI only as a support tool, never as the primary generator of core creative assets, helps maintain the central role of human authorship.
– Opt‑in training data: Building models trained on consented, compensated datasets respects the rights of the very creators whose work fuels AI advances.
– Preserving training grounds: Protecting entry‑level creative roles, even when AI might appear cheaper in the short term, safeguards the long‑term health of the talent pool.
The metaphor of AI as “the asbestos of the internet” is not just a dramatic soundbite. It encapsulates a fear that, unless handled with restraint and foresight, generative AI could become embedded in online culture in ways that are costly, time‑consuming, and painful to reverse. For someone like Jervis Johnson, whose career has been defined by careful world‑building and long‑term thinking about game systems, that prospect is alarming.
His stance adds an influential creative voice to a growing pushback against uncritical AI adoption in entertainment and publishing. It challenges studios, publishers, and tech companies to ask a more uncomfortable question than “What can AI do for us right now?” Instead, Johnson urges them to consider what kind of creative ecosystem they will be living with in ten or twenty years, and whether today’s shortcuts are worth the risk of spending those decades trying to clean up the mess.
