The Deeper Problems with AI in Creative Work

“But AI will get better at writing stories, won’t it?” This response to my rant about AI-generated content misses the point entirely. Even if AI writing improved dramatically tomorrow, fundamental problems would remain that no algorithm refinement can fix.

The environmental cost alone should give us pause. Training even a relatively small LLM produces carbon emissions comparable to the lifetime emissions of several cars, and every query consumes significant energy on top of that. Game development already has a substantial carbon footprint—why increase it when human writers remain perfectly capable?

These systems aren’t creating—they’re processing vast quantities of human-created work without permission or compensation. This massive appropriation of creative labour affects writers who haven’t consented to having their work used this way. Many are struggling financially only to see their work exploited to build tools that threaten their livelihoods. The distinction between content being accessible to read and being available for commercial exploitation is crucial (something even Microsoft seems confused about). Publishing a story doesn’t grant permission for it to be processed as training data for systems designed to replace writers.

Copyright law protects creative works and provides creators with control over how they are used. AI training methodologies undermine this protection, resulting in ongoing litigation as creators fight for their rights. The legal foundation of these systems grows shakier by the day. Or it doesn’t, which is even more concerning.

AI systems also concentrate power in the hands of wealthy technology companies. The computational resources required are immense, creating a troubling dynamic where tech giants control the infrastructure of creative production. Tools become dependent on proprietary black-box systems with no transparency. It’s not democratising creativity (whatever the fuck that means)—it’s centralising control under corporations whose interests don’t align with individual creators.

AI-generated creativity devalues creative labour by treating it as just another process to optimise (whatever ‘optimise’ even means in the case of creativity). This affects how writers and artists are compensated, how creative work is valued, and whether people can make a living through creative expression. This is, of course, part of a broader pattern where human labour is treated as an inefficiency rather than the essential source of value it is.

What’s at stake isn’t just quality—it’s an (ethical?) vision of creativity where human expression is valued, creative labour is respected, and technology enhances rather than replaces human capabilities. Technology in creative fields should support human creativity, not simulate it.

Human creativity isn’t something to be optimised away—it’s something to be cherished and protected. I’m so sick of saying this.


Cover Image by bohdanchreptak from Pixabay