How an AI-written book shows why the tech 'frightens' creatives
For Christmas I received an interesting gift from a friend - my very own "bestselling" book.
"Tech-Splaining for Dummies" (terrific title) bears my name and my picture on its cover, and it has radiant reviews.
Yet it was entirely written by AI, with a few simple prompts about me supplied by my friend Janet.
It's an interesting read, and very funny in parts. But it also meanders rather a lot, and sits somewhere between a self-help book and a stream of anecdotes.
It mimics my chatty style of writing, but it's also rather repetitive and very verbose. It may have gone beyond Janet's prompts in gathering data about me.
Several sentences begin "as a leading technology reporter ..." - cringe - which might have been scraped from an online bio.
There's also a strange, repeated hallucination in the form of my cat (I have no pets). And there's a metaphor on practically every page - some more random than others.
There are dozens of businesses online offering AI book-writing services. My book was from BookByAnyone.
When I spoke to the company's president, Adir Mashiach, who is based in Israel, he told me he had sold around 150,000 personalised books, mainly in the US, since pivoting from producing AI-generated travel guides in June 2024.
A paperback copy of your own 240-page bestseller costs £26. The firm uses its own AI tools to generate them, based on an open-source large language model.
I'm not asking you to buy my book. In fact you can't - only Janet, who created it, can order any further copies.
There is currently nothing to stop anyone creating a book in any person's name, including celebrities - although Mr Mashiach says there are guardrails around violent content. Each book contains a printed disclaimer stating that it is fictional, created by AI, and designed "solely to bring humour and delight".
Legally, the copyright belongs to the firm, but Mr Mashiach stresses that the product is intended as a "personalised gag gift", and the books are not sold on any further.
He hopes to expand his range, creating different genres such as sci-fi, and possibly offering an autobiography service. It's designed to be a light-hearted form of consumer AI - selling AI-generated goods to human customers.
It's also a bit frightening if, like me, you write for a living. Not least because it probably took less than a minute to generate, and it does, certainly in parts, sound a lot like me.
Musicians, authors, artists and actors worldwide have expressed alarm about their work being used to train generative AI tools that then produce similar content based on it.
"We must be clear, when we are speaking about information here, we in fact suggest human creators' life works," says Ed Newton Rex, founder of Fairly Trained, which projects for AI companies to respect developers' rights.
"This is books, this is articles, this is photos. It's works of art. It's records ... The whole point of AI training is to discover how to do something and then do more like that."
In 2023 a song featuring AI-generated voices of Canadian singers Drake and The Weeknd went viral on social media before being pulled from streaming platforms because it was not their work and they had not consented to it. That didn't stop the track's creator trying to nominate it for a Grammy award. And even though the artists were fake, it was still hugely popular.
"I do not think using generative AI for innovative purposes ought to be banned, but I do believe that generative AI for these purposes that is trained on people's work without authorization should be prohibited," Mr Newton Rex adds. "AI can be very effective but let's build it fairly and fairly."
In the UK some organisations - including the BBC - have chosen to block AI developers from trawling their online content for training purposes. Others have decided to collaborate - the Financial Times has partnered with ChatGPT creator OpenAI, for example.
The UK government is considering an overhaul of the law that would allow AI developers to use creators' content on the web to help develop their models, unless the rights holders opt out.
Ed Newton-Rex describes this as "insanity".
He says AI can make advances in areas like defence, healthcare and logistics without trawling the work of authors, journalists and artists.
"All of these things work without going and changing copyright law and ruining the livelihoods of the country's creatives," he argues.
Baroness Kidron, a crossbench peer in the House of Lords, is also strongly against removing copyright law for AI.
"Creative markets are wealth developers, 2.4 million tasks and a great deal of happiness," states the Baroness, who is likewise a consultant to the Institute for Ethics in AI at Oxford University.
"The government is undermining one of its finest carrying out industries on the unclear promise of growth."
A government spokesperson said: "No move will be made until we are absolutely confident we have a practical plan that delivers each of our objectives: increased control for right holders to help them license their content, access to high-quality material to train leading AI models in the UK, and more transparency for right holders from AI developers."
Under the UK government's new AI plan, a national data library containing public data from a wide range of sources will also be made available to AI researchers.
In the US the future of federal rules to regulate AI is now up in the air following President Trump's return to the presidency.
In 2023 President Biden signed an executive order that aimed to improve the safety of AI with, among other things, firms in the sector required to share details of the workings of their systems with the US government before they are released.
But this has now been repealed by Trump. It remains to be seen what Trump will do instead, but he is said to want the AI sector to face less regulation.
This comes as a number of lawsuits against AI firms, and particularly against OpenAI, continue in the US. They have been taken out by everyone from the New York Times to authors, music labels, and even a comedian.
They claim that the AI firms broke the law when they took their content from the web without their consent, and used it to train their systems.
The AI firms argue that their actions fall under "fair use" and are therefore exempt. There are a number of factors which can constitute fair use - it's not a straightforward definition. But the AI sector is under increasing scrutiny over how it gathers training data and whether it should be paying for it.
If all this wasn't enough to think about, Chinese AI firm DeepSeek has shaken the sector over the past week. It became the most downloaded free app on Apple's US App Store.
DeepSeek claims that it developed its technology for a fraction of the price of the likes of OpenAI. Its success has raised security concerns in the US, and threatens America's current dominance of the sector.
As for me and a career as an author, I think that at the moment, if I really want a "bestseller" I'll still have to write it myself. If anything, Tech-Splaining for Dummies highlights the current weakness of generative AI tools for larger projects. It is full of inaccuracies and hallucinations, and it can be quite hard to read in parts because it's so verbose.
But given how quickly the tech is evolving, I'm not sure how long I can remain confident that my considerably slower human writing and editing skills are any better.