How an AI-written book shows why the tech 'terrifies' creatives
For Christmas I received an intriguing present from a good friend - my very own "best-selling" book.
"Tech-Splaining for Dummies" (excellent title) bears my name and my picture on its cover, and it has glowing reviews.
Yet it was entirely written by AI, with a few simple prompts about me provided by my friend Janet.
It's an interesting read, and hilarious in parts. But it also meanders quite a lot, and sits somewhere between a self-help book and a stream of anecdotes.
It mimics my chatty writing style, but it's also a bit repetitive and very verbose. It may have gone beyond Janet's prompts in gathering data about me.
Several sentences begin "as a leading technology journalist ..." - cringe - which might have been scraped from an online bio.
There's also a strange, repetitive hallucination in the form of my cat (I have no pets). And there's a metaphor on practically every page - some more random than others.
There are lots of companies online offering AI-book writing services. My book was from BookByAnyone.
When I contacted the chief executive Adir Mashiach, based in Israel, he told me he had sold around 150,000 personalised books, mainly in the US, since pivoting from compiling AI-generated travel guides in June 2024.
A paperback copy of your own 240-page best-seller costs £26. The firm uses its own AI tools to generate them, based on an open source large language model.
I'm not asking you to buy my book. In fact you can't - only Janet, who created it, can order any further copies.
There is currently no barrier to anyone creating a book in anyone's name, including celebrities - although Mr Mashiach says there are guardrails around abusive content. Each book contains a printed disclaimer stating that it is fictional, created by AI, and designed "solely to bring humour and delight".
Legally, the copyright belongs to the firm, but Mr Mashiach stresses that the product is intended as a "personalised gag gift", and the books do not get sold on.
He hopes to broaden his range, producing different genres such as sci-fi, and perhaps offering an autobiography service. It's designed to be a light-hearted form of consumer AI - selling AI-generated goods to human customers.
It's also a bit terrifying if, like me, you write for a living. Not least because it probably took less than a minute to generate, and it does, in some parts at least, sound quite like me.
Musicians, authors, artists and actors worldwide have expressed alarm about their work being used to train generative AI tools that then produce similar content based upon it.
"We should be clear, when we are speaking about data here, we in fact imply human developers' life works," says Ed Newton Rex, founder of Fairly Trained, which campaigns for AI companies to respect creators' rights.
"This is books, this is short articles, this is pictures. It's artworks. It's records ... The entire point of AI training is to learn how to do something and then do more like that."
In 2023 a song featuring AI-generated voices of Canadian singers Drake and The Weeknd went viral on social media before being pulled from streaming platforms because it was not their work and they had not consented to it. That didn't stop the track's creator from trying to nominate it for a Grammy award. And even though the artists were fake, it was still hugely popular.
"I don't think the use of generative AI for creative purposes should be banned, but I do think that generative AI for these purposes that is trained on people's work without permission should be banned," Mr Newton Rex adds. "AI can be very powerful but let's build it ethically and fairly."
In the UK some organisations - including the BBC - have chosen to block AI developers from trawling their online content for training purposes. Others have decided to collaborate - the Financial Times has partnered with ChatGPT developer OpenAI, for example.
The UK government is considering an overhaul of the law that would allow AI developers to use creators' content on the internet to help develop their models, unless the rights holders opt out.
Ed Newton Rex explains this as "insanity".
He points out that AI can make advances in areas like defence, healthcare and logistics without trawling the work of authors, journalists and artists.
"All of these things work without going and changing copyright law and destroying the incomes of the nation's creatives," he argues.
Baroness Kidron, a crossbench peer in the House of Lords, is also strongly against removing copyright law for AI.
"Creative industries are wealth creators, 2.4 million jobs and a whole lot of joy," says the Baroness, who is also an adviser to the Institute for Ethics in AI at Oxford University.
"The government is undermining one of its best performing industries on the vague promise of growth."
A government spokesperson said: "No move will be made until we are absolutely confident we have a practical plan that delivers each of our objectives: increased control for rights holders to help them license their content, access to high-quality material to train leading AI models in the UK, and more transparency for rights holders from AI developers."
Under the UK government's new AI plan, a national data library containing public data from a wide range of sources will also be made available to AI researchers.
In the US the future of federal rules to control AI is now up in the air following President Trump's return to the White House.
In 2023 Biden signed an executive order that aimed to boost the safety of AI with, among other things, firms in the sector required to share details of the workings of their systems with the US government before they are released.
But this has now been repealed by Trump. It remains to be seen what Trump will do instead, but he is said to want the AI sector to face less regulation.
This comes as a number of lawsuits against AI firms, and in particular against OpenAI, continue in the US. They have been brought by everyone from the New York Times to authors, music labels, and even a comedian.
They claim that the AI firms broke the law when they took their content from the internet without their consent, and used it to train their systems.
The AI firms argue that their actions fall under "fair use" and are therefore exempt. There are a number of factors which can constitute fair use - it's not a straightforward definition. But the AI sector is under increasing scrutiny over how it gathers training data and whether it should be paying for it.
If all of this wasn't enough to ponder, Chinese AI firm DeepSeek has shaken the sector over the past week. It became the most downloaded free app on Apple's US App Store.
DeepSeek claims that it developed its technology for a fraction of the price of the likes of OpenAI. Its success has raised security concerns in the US, and threatens America's current dominance of the sector.
As for me and my career as an author, I think that for now, if I really want a "bestseller" I'll still have to write it myself. If anything, Tech-Splaining for Dummies highlights the current weakness of generative AI tools for bigger projects. It is full of errors and hallucinations, and it can be quite difficult to read in parts because it's so long-winded.
But given how quickly the tech is evolving, I'm not sure how long I can remain confident that my considerably slower human writing and editing skills are better.