Why Shouldn’t You Use AI-Generated Images for Your Writing?
AI provides a quick fix for visuals to accompany your words, but there are drawbacks for you and other creatives

The age of artificial intelligence is upon us — or so we’re told, anyway. One thing I’ve certainly noticed as an editor, writer, and visual artist who uses multiple online writing platforms is a rise in the use of AI-generated images by writers.
AIs — or, more accurately, generative models such as large language models (LLMs) and image generators, which are in reality a far cry from the kind of sci-fi-esque artificial intelligence the term “AI” evokes — are being pushed into every app, platform, phone, and computer these days.
It’s hardly surprising that some writers take advantage of the seemingly endless possibilities to create the perfect accompaniment to their carefully written words. As far as AI goes, your imagination is the limit, or so it would seem.
But there are reasons you should maybe stop and reconsider before generating an image for your next personal essay, think piece, or book review — and it’s not just AI’s jarring inability to render the right number of fingers on a human hand.
Full disclosure: I train AIs for a living, though not in image generation. I realise that makes me a hypocrite. But I’m also poor and disabled. It allows me to work from home and preserve my mental health, and also be around to help my physically disabled partner. I’m no AI evangelist, though, as you’ll see. And I don’t use AI in my work.
The great AI art heist
One of the major problems with using AI-generated art is how the images are generated in the first place. An AI isn’t dreaming up that imagery by itself; it’s remixing the thousands of human-made art pieces it has been ‘fed’.
AI companies like OpenAI and Anthropic scrape the web to obtain their “datasets”, swallowing up creative content that visual artists and writers — like you — put on the web. None of these human artists or writers are paid for the use of their work, of course, and copyright notices and watermarking are routinely ignored. Meanwhile, many visual artists are finding they’re losing their jobs to AI.
No wonder there’s been massive backlash against AI-generated imagery. There are increasingly sophisticated tools available, developed by those who are sick of having their data stolen, designed to “poison” and confuse AI web scrapers and prevent people’s hard work from being swallowed up and spat back out by an LLM.
AI companies are hardly coy about their models’ abilities to imitate the work of the real artists whose creations their models are trained on. OpenAI recently touted their ChatGPT model’s ability to mimic the art of Studio Ghibli’s Hayao Miyazaki, who is himself heavily opposed to AI-generated art, calling it “an insult to life itself”.
So next time you think about using an AI-generated image for your hard-written article, stop and think: would you be happy for some visual artist to generate text for their image that heavily mimics your stolen words?
Uncanny art
No matter how much data AI models are fed to generate their “art”, it is still noticeably AI-generated. All AI art has a certain “look”, and for those of us with a discerning eye it can be an immediate turn-off.
One reason for this is what’s known as the “uncanny valley”. This term was originally coined by roboticists who noticed that making robots seem human-like made people feel more at ease with them — but only up to a point. When robots too closely mimic human appearance and behaviour but are still discernibly not-quite-human, it produces a sense of unease or disgust.
This negative sweet-spot is the uncanny valley, and it applies not just to robots but to computer-generated images in visual media, such as movies and video games, as well as to AI-generated imagery. You might think you’re using a striking image that will grab readers’ attention — and it might, just not in the way you want.
That AI-generated image you just used for your latest prose might be generating disgust rather than interest.
AI boycott
With growing opposition to AI usage in general, not least due to the mass-theft of data discussed above, some people deliberately avoid anything with even a whiff of AI to it — myself included.
As mentioned, AI-generated images stick out like a sore thumb due to their uncanniness, and might also lead readers to dismiss your work on principle. Put simply, by using AI-generated art, you’re reducing your audience.
The opposition to AI-generated art has been measured in a recent study, which found that even when presented with two pieces of human-made art, people are less likely to prefer the piece they are told was made by an AI. This was linked to the belief that artistic creativity is an inherently human pursuit, rather than to any preference for the artistic quality of the pieces themselves.
In short, people prefer art made by humans, and dislike art they think is made by an AI because they dislike the thought of AI-generated art at a philosophical level. By extension, it’s fair to assume they aren’t going to be interested in the work of someone who uses AI-generated art, either.
What else are you using AI for?
Just as there’s been a massive backlash to AI-generated imagery, there’s also been a backlash to the use of AI by writers — and understandably so. One of the reasons we read is to gain the opinions, experience, and perspective of others. If your writing is generated by AI, not only is it not really your writing, it’s inauthentic — a mere imitation of human experience.
And if I spot an AI-generated image at the top of your article, I can’t help but wonder what else you’ve used AI for. Did you actually write the article, or did ChatGPT?
If I suspect you might have used AI to write your article, I immediately doubt your abilities and wonder whether your writing is worth my time. I’m not alone in that sentiment: research into AI use in the workplace shows that people view coworkers who use AI at work as lazy and incompetent.
Personally, given my work, I can tell pretty much immediately if something is written by AI. There are very obvious tells, often even without needing to read a single word. Generally, though, people aren’t that great at detecting if written content is AI-generated or not.
Think again about the human art in the study mentioned above: people disliked it purely because they were told it was generated by AI. By using an AI-generated image, you’re priming people to read your article as AI-generated, too. And if they suspect that, they might just not read it at all.
Just (don’t) do it
There’s really no reason to use AI to generate images for your writing — there are plenty of free-to-use image sources, like Wikimedia Commons, Unsplash, and Pexels.
What’s more, using AI-generated imagery might even be hurting your ability to reach your audience. One thing is for certain: using AI-generated images does hurt other creatives. So maybe skip ChatGPT for your article’s main image next time, yeah?
Absolutely agree 100%, though there are circumstances where I think AI image generation is useful (we mustn’t call it ‘art’, for sure; art requires a human, or at least sentient, touch).
For example, I do a lot of tabletop roleplaying, and while I love to encourage my players to make art (at least half my group, myself included, enjoy making visual art), it’s really handy to be able to quickly knock up some visual references, especially for my players without a mind’s eye.
It’s also sometimes useful, for my fiction writing, to mock up visual references for myself when I need a quick reference and don’t have the time or skill to draw the things I want.
I would *never* use these images in a published work, but the technology is absolutely useful in a non-commercial and R&D capacity for an individual content creator.
Thank you, I agree. I’d also list the environmental impact of AI as a reason not to use it. That said, I’m not a writer, nor a visual artist, so I don’t know firsthand the struggles of people in those professions, and from a systemic perspective, I find it hard to fault people for using generative AI if their income depends on churning out content (like writers for so many digital publications these days).