In case you missed it, speculation about the effects of A.I. on the publishing industry is bubbling up again.
For instance: Over on X yesterday, a bunch of authors struck up a conversation about A.I. that included this item:
That tweet caught the attention of bestselling author Larry Correia, who replied:
I see a lot of newbie authors thinking that AI is going to be some super tool, but the part you guys are missing is that writing is the fun part. Editing is the hard part. So you’re giving the fun part to the machine, and then going through and doing painstaking clean up to humanize it. At least if you want it to not suck ass.
So let’s say that I, as a very experienced author who we’ve established is pretty good at this shit, tells an AI give me a story about X, Y, and Z. And it spits it out for me in seconds. Yay.
Except then I need to take that AI generated manuscript and make it not read like it was written by a soulless autocorrect with a severe personality disorder.
What lazy authors will do is just take that AI dreck, do a quick editing pass (if that) and throw it out on the internet to try and make a quick buck. Slap an AI cover on it. They’ll spam Amazon, sell to some dupes, make a few bucks, maybe. And flood the market with shit.
Related: A.I. Writing
So back to me, an actual working pro with a name and a reputation for a certain level of quality and an existing fan base who pays my bills. I’ve got this AI generated manuscript, but I need to bring it up to snuff, otherwise my customers are going to read it and go what the fuck is this bullshit? And never buy one of my books again.
So this is the part you newbs don’t get, writing/creating is fun. Editing is WORK. So now I need to clean up every single fucking line of this AI generated manuscript because the machine doesn’t know shit about emotions. It doesn’t know shit about how things feel. It can only regurgitate what others have written before. It is a compulsive liar. It makes shit up, but it isn’t creative. It’s got no soul. It’s got no enthusiasm, and that’s the biggest one that I’ll come back around to.
Not to be contrarian, but I’m one of those oddballs who enjoys editing as much as writing. But I understand that most writers dread editing their work. Don’t stress if you’re one of them though, because I’m glad to offer you my services.
Back to Larry:
Because a collab should be a synergistic endeavor that results in something more than the sum of its parts. To do that requires putting in actual work. With this, you just teamed up to collaborate with a fucking robot that’s got the humanity and sense of humor of an impaired Speak & Spell.
By the time you get done painstakingly redoing every line of cloying bullshit, congrats, you could’ve just wrote the fucking book you wanted to begin with.
The creation is the fun part. I’ve said this many many many times, the writer’s greatest weapon is contagious enthusiasm. If I’m having fun writing it, I know you guys will have fun reading it. That’s it. That’s the big fucking secret.
AI has no enthusiasm.
Related: How Big Tech’s Novelty Obsession Killed Software
If an author isn’t having fun writing, you can tell when you read it. It’s a vibe. It’s a feeling. You just know. If the author was having a blast you know it. The scenes where a good author was grinning or crying or doing a triumphant fuck yeah fist pump, you fucking know. Because reader and author are both human, you fucking GET IT.
The AI doesn’t. It can’t. It can fake it. It can uncanny valley its way through a book, and it will probably get better and better at faking it, but it isn’t human, and good storytelling is a profoundly human endeavor.
Can confirm. As sci fi grandmaster John C. Wright has said, fiction writing is the closest we have to telepathy.
This is the same reason the big media corporate entertainment of the day sucks so bad. It’s made by a committee, and committees don’t have enthusiasm. And fake enthusiasm will never replace real contagious enthusiasm. If the creator doesn’t give a shit, why should the audience?
AI can produce a TON of vapid soulless shit, but hey, so can modern Disney! In fact, when the creator doesn’t give a shit about his art, not only does the audience feel it, the audience gets pissed off.
So if you want to produce tons of unenthusiastic shit product and roll the dice hoping it somehow sticks and makes a buck, great. But if you actually give a shit about what you’re saying, then just fucking SAY IT.
My comment:
Larry is correct that visibility is the biggest challenge facing new authors.
And the prediction that A.I.-generated books will exponentially muddy the already murky ocean of Amazon seems to be the industry consensus.
Authors currently publishing through KDP know that Amazon has had A.I. guidelines for a while now. Here’s their policy:
According to Amazon’s guidelines, users who upload A.I.-generated manuscripts have to disclose it, even if they did extensive editing.
That doesn’t mean people won’t just lie, though.
The elephant in the room is why. One attribute the A.I. writing advocates I’ve seen have in common is that none of them has any idea how the publishing business works.
That guy in the first screencap above said that A.I. will let non-authors compete with authors. What he’s missing is that A.I.-generated novels will take extensive editing to be salable, and professional-level writing skill is a prerequisite for pro-tier editing chops.
So this is how the whole prospect comes off:
And even if generating novels with A.I. did save time, that advantage only mattered back in the rapid release era, which never really lived up to the hype, and which A.I. writing helped kill.
That’s just one reason why I’ve been pivoting away from Amazon and concentrating more on crowdfunding.
Kickstarter’s A.I. guidelines are even stricter than Amazon’s, with failure to disclose which project elements are A.I. and which are manmade punishable by suspension. And attempts to dodge KS’s policy can get you banned.
The inundation of KDP with A.I. writing looks like an even better argument for neopatronage.
Get early access to my works in progress, the chance to influence my books, and a VIP invite to my exclusive Discord.
Sign up at Patreon or SubscribeStar now.
Dark fantasy minus the grim plus heroes you can root for battling overwhelming odds
Any reader who would read AI books is not too different from one who would put up with that 20 to 50k author who was caught plagiarizing years after the fact. Nothing happened to him despite the revelations. That’s because consoomers don’t care about intent or art, they only want to ingest the next product, preferably with the right trope list checked off. They want beltline production before quality and that’s what they’ll pay for.
Most readers, however, are not like this. At the end of the day, someone reads because they’re interested in another perspective, even if it’s for a silly adventure book. This new gimmick is not a perspective, it’s just a mechanism for slapping tropes together in the “right order” just like Hollywood has been training you to do with their writing books.
I do not understand the fear of AI.
“I do not understand the fear of AI.”
My answer to this is based on my observations.
Anytime someone criticizes AI and concludes ‘I don’t understand the fear’, I see it as a shortsighted statement because the critic is usually harping on current problems with AI. AI’s primary function is to learn. This is what a lot of people fear. AI is proving it can be taught and function as desired given time. That’s the short answer.
To give you an example, how many people are saying “LOL, AI can’t make hands, it’s over AI bros, AI is dead and will never be a thing!”? None. AI has learned. If anyone is still saying it, they’re being dishonest or using a crappy, outdated AI image generation program.
You (not you specifically) can mock these people for spending 10k hours training an AI to write a book instead of writing it themselves in that time, but these guys are determined to make AI work.
I have a co-worker who keeps up with the developments in AI and he’s shown me some incredible stuff you wouldn’t believe. AI is capable of so much more than just writing books or making images, yet the peasantry is too busy squabbling over text and image generation. In a broad sense, it’s ok to have fear/concern about AI and its future in our everyday lives, not just the creative field. It’s not an irrational fear.
To ask the bigger question here: “Why is AI being pushed so hard in the arts?” The answer is simple: art is a big industry, and AI will save a lot of time and a lot of money when it finally gets it right enough to replace the jobs of a lot of artists. Becoming irrelevant is a legit concern, since there seems to be a cultural movement to let robots do everything.
Independent artists will not be able to keep up with the production of works compared to AI and could wind up being buried. I think it’d be a similar situation to Youtube creators who got buried by the algo that rewarded news stations pumping out content 24-7. This is why we have so many full time/pro youtubers. I think the only way around this will be neopatronage and living off the stipend you get from your patrons, but even then we won’t know how AI will affect that in the long run.
We can sit here and say the consumer spending habits aren’t right, but think about the generations growing up in the age of AI. What about their perspective? They won’t see AI generated things like we do; to them it’s the norm, and they will likely support it without knowing the difference or being significantly swayed by other options. To recognize your point, consoomers are gonna consoom, and the optics right now are that if you don’t have consoomers buying your stuff, you won’t last long.
This isn’t an argument about what’s right or wrong in the creative process or the spending habits of people in relation to AI, but merely to drive home the point that any issue AI has now is temporary, because it can learn to be better, and eventually it will be good enough to make commercial creative ventures obsolete. Maybe not completely, but I could see it knocking 99% of creatives out of the market except for the few who have brand recognition (like a Brandon Sanderson). After all, why pay lots of money to someone else to be entertained when I can have AI do it cheap/free and well enough for myself?
Hope that helps with putting things into perspective.
I’m coming at this from a different angle than just function and ability.
All of this has nothing to do with AI specifically but with how art and entertainment is not absorbed but consumed. The reasons for AI are the same as for flash animation, OldPub genres and word limits, and autotune: it makes things easier so the peasants can consume product faster in order to get ready to consume the next content on the beltline. Those who fear AI are so late to the party I wonder if they understand that the reason it’s being used is that every change in the arts since at least the 1990s has been to slosh everything down into a content drip at the expense of ambition and creativity.
If people cared about art instead of consuming it then they wouldn’t care about AI to begin with.
Let me put it to you this way. Whether James Patterson decides to stop hiring ghost writers and start using AI effectively makes no difference to the quality of slop he’s pumping out. The audience who wants that isn’t going to stop, just like in the plagiarism example above. People who consoom already don’t care about art. There is nothing you can do about these people. This is an issue far larger than a couple of creatives on the internet can do anything about.
My audience, though smaller, reads my books because I wrote them, because the words I choose matter and the stories I tell I believe in. The existence of AI writing doesn’t change any of that. My audience is not the same as the above one.
I understand the argument about noise and glut but, again, that was always the problem with having an open gate. Next time get gatekeepers who give a care and perhaps it won’t come to this. Whether I’m competing against plagiarists, AI, or content merchants (all of which are interchangeable) doesn’t change what I have to do. Sure, it’s difficult, but I was never under the illusion that it wouldn’t be.
Until the perception of art changes we deserve everything we’re going to get.
That last sentence is what it ultimately boils down to. Art has been abused and misused for too long. It shows in our culture, which feels hollow. Regardless of what the industry is like as a whole, we on the individual level just have to do our best to make good art and put it out there.
Keep in mind that the modern AI methods have an “out of sight, out of mind” aspect. That is, you type something into ChatGPT and get a quick result, even for prompts you didn’t expect it to be able to answer. They do an update, and the answers are better. It is easy to conclude that the AI is thinking, that the updates are it adapting, and that there is no limit to this process.
But the “out of sight” part of this equation is the training. Consider just how many texts were analyzed as part of ChatGPT’s training, and how many computer hours were spent in that analysis. Even at that point it still would not give good answers until the models were reweighted according to feedback from tons of human reviewers. The output is impressive, but it comes only after an absolutely mammoth amount of meaning from humans has been poured into it. The impressive output is not due to the process being particularly good at finding solutions, but due to having so much data that even a really bad solution process can still find answers.
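To make that two-phase idea concrete, here’s a toy sketch in Python. It is nothing like the real architecture; it just counts which word follows which in a scrap of text (“pretraining”), then reweights those counts with invented human ratings (“feedback”). The corpus and the scores are made up purely for illustration.

```python
# Toy illustration of "learn from a pile of text, then reweight by human
# feedback." Not how ChatGPT works internally; the scores below are invented.
from collections import defaultdict
import random

corpus = "the cat sat on the mat . the dog sat on the rug .".split()

# Phase 1: "pretraining" -- tally which word follows which in the raw text.
bigram_counts = defaultdict(lambda: defaultdict(int))
for prev, nxt in zip(corpus, corpus[1:]):
    bigram_counts[prev][nxt] += 1

# Phase 2: "human feedback" -- reviewers rate continuations, and the raw
# counts get reweighted accordingly (hypothetical scores).
human_scores = {("sat", "on"): 2.0, ("the", "dog"): 0.5}

def next_word(prev):
    followers = bigram_counts[prev]
    words = list(followers)
    weights = [followers[w] * human_scores.get((prev, w), 1.0) for w in words]
    return random.choices(words, weights=weights)[0]

# Generate a short continuation; its quality is entirely a function of
# the human text and human ratings poured in above.
word = "the"
output = [word]
for _ in range(6):
    word = next_word(word)
    output.append(word)
print(" ".join(output))
```

Scale the corpus up to a sizable chunk of the internet and the tallies up to billions of parameters and the output starts to look fluent, but the mechanism is still statistics distilled from human work.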
We’re seeing line go up now because the process is new and there’s such an insane amount of data on the internet to harvest. But when we reach the limits of data harvesting and processing times, line will level off. Data harvesting is also not just a matter of scraping capabilities, it’s also a matter of making sure that you get quality human content. Image making programs have done very well because there were huge amounts of hand drawn images on various art websites. But now most of those websites have been flooded with AI drawn images and even the ones that have tried to crack down still have people sneaking stuff through. This means that new attempts to harvest data will almost certainly be contaminated by AI output, which leads to degeneration if it is used to train new models.
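That degeneration is easy to see even in a toy setting. The sketch below (my own illustration, not anyone’s production pipeline) fits a plain Gaussian to some “human” data, then refits each new generation only on samples drawn from the previous generation’s fit. With no fresh human data entering the loop, the estimated statistics tend to drift further from the original with every pass, because nothing ever pulls them back toward the real data.

```python
# Minimal illustration of model-on-model training: each generation is fit
# only to the previous generation's synthetic output.
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for human-made content: samples from a fixed "true" distribution.
human_data = rng.normal(loc=0.0, scale=1.0, size=200)

mean, std = human_data.mean(), human_data.std()
print(f"generation  0: mean = {mean:+.3f}, std = {std:.3f}")

for generation in range(1, 21):
    synthetic = rng.normal(mean, std, size=200)    # "AI output" from the last model
    mean, std = synthetic.mean(), synthetic.std()  # retrain only on that output
    if generation % 5 == 0:
        print(f"generation {generation:2d}: mean = {mean:+.3f}, std = {std:.3f}")
```

The drift is random, but it only accumulates; the original distribution never comes back on its own.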
Brian, I realize this is off-topic, but I was wondering if you had any thoughts on the implications of the latest Synod. It’s generating a lot of smoke, but it’s hard to make out just where the fire is. The angrier Trads seem to regard it as the final straw, and bishops like Muller and Strickland are speaking out more ferociously than ever before, but the media and the SJW Catholics like James Martin aren’t as triumphal as I would expect if it had effected a major change. To me, it sounds like a lot of bafflegarb that really doesn’t change much but which lays down some cover for the progressive wing. As an ex-lapsed Catholic who’s come back into communion with the Church during this past year, this type of thing always triggers the gut reaction of “I should have stayed away,” which I then have to fight against. You’re one of the only online Catholics I trust to have a balanced view on things like this, so if you have time for a take once the smoke clears, it’d be a blessing to me.
My brother in Christ, if reports of such things rob you of your peace, stop consuming those news sources.
” As an ex-lapsed Catholic who’s come back into communion with the Church during this past year, this type of thing always triggers the gut reaction of “I should have stayed away,” which I then have to fight against. ”
Welcome back home! I came back to the Church too. Best move of my life. Glory be to God.
Anyway, I’m going to double down on that excellent advice. The average medieval Catholic did not care what his pope was doing, what he said, what letters he wrote to his bishops, or anything like that unless it impacted him personally. Theologically, your parish, priest, and bishop are also the Church, so you can safely shut off all the news from the Vatican. Maybe a newsletter from your bishop/priest if you want to stay in the loop?
I have spent the last several years in online discussions advocating for the Church and Pope Francis. I stopped recently because it was bad for my mental health too. What I learned from that experience is that the media has been spinning the truth for over a decade on this papacy. That statement includes nominally Catholic outlets. Finding out the truth requires reading whole documents, whole interviews, etc. Even “Pachamama” was just the usual media sleight of hand, with them being “native” statues of Mary/Elizabeth. The video was, granted, horrid, and it was easily misunderstood in the West.
Anyway, by far the easiest way to deal with the noise is to shut it off entirely. If I hear some disturbing headline that requires 2-3 hours of research to clarify, I’m wasting my time. The solution in almost all cases I encountered was “Yes, Pope Francis said that, but that’s not what he meant.” Usually Pope Francis was addressing a complex issue in a complex way to a specific (not general) audience that could be matched to the catechism.
Speaking to your issue, Catholic synods are not Protestant synods. Nothing that matters to the vast majority of Catholics is going to happen, just like at all the other synods. When nothing happens again after this synod, the same news sources will turn to other topics to induce fear and clicks in Catholics. I personally had forgotten about the synod until I read your post. It’s a much more peaceful world to live in.
AI is beginning to eat its own tail, as it trains on more and more content generated by AI. I was just reading about the AI companies petitioning the Big Four publishers to let them train AI on their copyrighted backlist, and the publishers are cutting deals. The models have already consumed all public domain art, writing, and Google image search. The writing AI was trained on places like fanfiction dot net and it shows. (I always wonder why they don’t let the AI spit out stuff from classics … too based?)
Sure, AI will continue to improve. There are some fantastic fake movie trailers out there, like a Legend of Zelda movie made in the 70s, all done by AI. But eventually it will hit entropy, like everything else in the universe. The neural nets are producing content faster than humans can, and AI will eventually run out of fresh human-made media to consume.
Now, AI in videogames and search engines and other places has been, and always will be, a useful tool. But generative AI has a time limit, and I don’t think creatives need to fear it. (I do have some thoughts about the Beast in Revelation and how the new world order will decide who lives and dies with AI… but we knew that was coming anyway, didn’t we?)
I think the AI-generated vs. AI-assisted distinction is an important one. I have been using ChatGPT to read my stories and give me feedback. It’s not perfect by any means, but I did have fun reading what it has to say about what I’ve written, putting a whole new meaning to “sounding board”. I also fed it my author profile to see how it could be improved. The key is not to take the output at face value.
And I think that’s the mistake that people are making. I heard about how the Indonesian government’s YouTube channel got the name of the President wrong in the description of an *official government video*. The issue was that we just had a new president. So instead of putting in the name of the current President (Prabowo), they put in the former President’s (Jokowi). Later on, I learned that people there use ChatGPT to write these things (or at least, I have it on good authority that this is something that they would do). And I went, “ah, that’s the problem: the robot hadn’t registered that we had a new president, and whoever was in charge of the YouTube channel didn’t bother to check the output”.
So maybe a hot take (though maybe not), but I think “AI-assisted” is the way of the future. People will think of it no differently than writing on a computer as opposed to writing by hand.
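For anyone who wants to try that “sounding board” workflow, here’s a rough sketch using the openai Python package’s chat-completions interface. The model name, prompt wording, and file path are placeholders of my own choosing, not anything specified above; the idea is to ask for critique rather than prose, and to treat whatever comes back as notes to weigh, not instructions to follow.

```python
# Rough sketch of asking a chat model for story feedback. Model name,
# prompts, and file path are illustrative placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

with open("chapter_one.txt", "r", encoding="utf-8") as f:
    manuscript = f.read()

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {
            "role": "system",
            "content": (
                "You are a critique partner. Point out pacing problems, flat "
                "dialogue, and unclear character motivation. Do not rewrite the prose."
            ),
        },
        {"role": "user", "content": manuscript},
    ],
)

# The output is a starting point for revision, not a verdict.
print(response.choices[0].message.content)
```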
One way that AI hurts bad authors is that it gives them the ability to generate not completely horrible text quickly. The first step of any artistic endeavor involves making lots and lots of bad art until you develop the skills to make good art. If you have to do everything yourself, then you just accept that this is something that you have to do to become an author. But if you can get a computer to output text, or to rewrite your text, it’s tempting to do that instead of developing the core skills yourself. This is the biggest pitfall of “AI-assisted writing.” It’s not that it can’t work, but that if a beginner does things that way it limits his development.
But this phenomenon is not new. Calculators have of course been of great assistance to anything math related, but they have also led to generations of students for whom “mathematical reasoning” means “push buttons on a calculator.” Rick Beato has talked about how in the early days of rock, if you wanted a band, you’d all have to learn instruments. Compare having an actual drummer who can learn about the fine points of rhythm as he experiments to having a stock drum loop which sounds good but restricts you to just that one rhythm.