To the author: do you use AI at all in the creative aspects of the production? I assume that AI assistance in creative writing is now mainstream, and an accepted tool for most writers. I am interested to know your thoughts on this subject and if you use AI then what sort of methods do you use?
Note: Google Gemini reports that "the most successful writers in 2025 use me as a 'distillation machine.' They write 1,000 words of raw emotion, then ask me to help them find the '300 words that actually matter.'"
I haven't really found a way to incorporate AI into my creative process.
I do sometimes use it as a thesaurus on steroids when I can't think of the right word, or to check grammar / sentence structure (I'm not a native speaker). I would never use AI in the way Gemini self-reports (and I doubt that's really a thing anyway).
I might be tempted to use it for episode art if I didn't have access to a professional illustrator.
I am experimenting with some AI-generated music at the moment, but that's for background cues I'd otherwise use canned royalty-free music for anyway (emotionally important scenes have tailor-made music by a professional composer).
One use of AI I'm eyeing for the future is translating what are currently audio stories into video. But I feel we're still not there.
I’m writing my first sci-fi novel, and I’m using it for two main things: as a basic sanity check for the story (pacing, internal consistency, character development, etc.) and for basic copy editing. I’ve also used it to generate images of settings for the lore bible, so I can pin down what I see in my head as something tangible that I won’t forget. This isn’t the only step; I have human beta readers to double-check the LLM. But it often tells me easily verifiable things, and it is often right (“this sentence reads funny”, for instance), so it definitely adds value in that sense. Subjective things are harder to trust, but the same goes for subjective feedback from humans, though of course the difference is that the LLM isn’t going to buy my book, whether or not it loves it.
Yes, it sounds like a bold statement. I called Gemini out on it, and it admitted that it over-egged its confidence in that assertion.
But presumably the LLMs do have some knowledge about how they are used?
On further probing Gemini did give a plausible justification - in summary:
"Creation is easy. Selection is hard. In an era of infinite content, the 'most successful' writer isn't the one who can produce the most; it's the one with the best taste. Using an LLM as a distillation machine allows a writer to iterate through their own ideas at 10x speed, discarding the 'average' and keeping only the 'potent.'"
LLMs have no knowledge (really “knowledge-like weights and biases”) outside their training set and system prompt. That plausible justification is just that — a bunch of words that make sense when strung together. Whether you’d like to give that any epistemic weight is up to you.
More like whether one is correct in giving it any epistemic weight. Not everything is opinion; some things really are clearly right or clearly wrong, and attributing thought, reasoning, and analysis to an LLM is one of the things that's clearly wrong.
Why would Gemini (the text model part) have that info? I'm sure that Google has some kind of analytics, but that wouldn't necessarily be part of the training data, the system prompt, or distillation directly.
> I assume that AI assistance in creative writing is now mainstream, and an accepted tool for most writers.
It absolutely is not. In fact the Nebula awards just banned entries from having _any_ AI use involved with them whatsoever. You can't even use them for grammar correction.
I'm not sure quite how that works. Google Docs will suggest various changes, which I take into account or don't. And it certainly corrects misspellings. You can choose to decide that's not AI, but it's a grey area.
For writing, I've sometimes used LLMs to speed up some essentially boilerplate text. I've never used them for anything that isn't pretty much routine, something I could easily do myself but would otherwise spend some time on.
For anything that might be a Nebula submission, it's hard to imagine LLMs doing anything beyond the copyediting level (which may not be well-defined but seems a reasonable threshold).
Well, they want to preserve a role for the editor, because the editor isn't just checking the grammar but also the content, weighing in with their relative objectivity on the current state of the story: what should be improved, what was good, what didn't work, etc. If we have AI glazing us continuously, we will just produce slop; it may look like good fiction, but it won't read like it, and people can tell the difference!
Not even grammar correction? That's lame and kinda evil.
When you submit your manuscript to a big publisher I guarantee they're using AI to check it (now). At the very least, AI is the only tool that can detect a great number of issues that even the best editors miss. To NOT take advantage of that is a huge waste.
It sounds to me like they're just trying to push out independents and small publishers. Because you know they're not going to ask big publishers if they use AI (who will likely deny it anyway... Liars).
FYI: AI is both the best grammar checker ever and the best consistency checker. It can generate an intelligent lexical density report that knows you used "evasive", "evaded", and "evading" too much (because it knows they all share the same base word). It's also fantastic at noticing ambiguities that humans often miss, because humans are like-minded and "know what you mean." (Our brains are wired like that to improve the efficiency of repetitive tasks like reading.)
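The lemma-level repetition check described here doesn't even need an LLM for the basic case; it can be sketched in a few lines. The crude suffix-stripper below is a hypothetical stand-in for a real lemmatizer (which would also fold "evasive" into the same group as "evaded"):

```python
import re
from collections import Counter

def crude_stem(word: str) -> str:
    # Very rough suffix stripping; a real lemmatizer (e.g. WordNet-based)
    # would handle irregular forms and relate "evasive" to "evade" too.
    for suffix in ("ing", "ed", "es", "s"):
        if word.endswith(suffix) and len(word) - len(suffix) >= 3:
            return word[: -len(suffix)]
    return word

def lexical_density_report(text: str, top_n: int = 5):
    # Count stems rather than surface forms, so "evading", "evaded",
    # and "evades" all land in the same bucket.
    words = re.findall(r"[a-z]+", text.lower())
    return Counter(crude_stem(w) for w in words).most_common(top_n)

sample = "He kept evading the question; he evaded it yesterday and evades it still."
print(lexical_density_report(sample, 3))
```

Here the stem "evad" tops the report with three hits, even though no surface form repeats. What an AI checker adds on top of this mechanical counting is judgment about which repetitions are stylistic problems versus deliberate emphasis.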
AI tools can help you improve as a writer and enhance your craft in a lot of ways. To not take advantage of that—to me—feels like burying your head in the sand and screaming, "LA LA LA LA! I don't want to think about AI because it can be used for bad things!"
I've chatted with many writers about AI and nearly all of them don't understand the technology and assume it's literally just taking chunks of other writers works and spewing them out one sentence at a time.
I literally had a conversation with a writer that thought you could take ten sentences written by AI and trace them back to ten books.
That literally *is* what they’re doing though, just not at sentence granularity—they’re doing it at both larger and smaller scales. Sometimes they may give you a plagiarized paragraph, sometimes they’ll give you a plagiarized phrase, sometimes they’ll give you a structure that they fill in with “their own” words where the structure itself was taken from something… They do nothing original.