The last fortnight has seen a number of news stories relating to the book industry and AI – read our summary below.
In one of many headlines from the UK, the Society of Authors UK has launched human-authored labelling to help readers distinguish between works generated by AI and those written by actual humans. Through this free service, authors can register their books and add a ‘Human Authored’ label to the back cover. The scheme was developed in collaboration with the US Authors Guild, which launched its own Human Authored labelling in 2025.
In their statement, the SoA attributed this scheme to ‘the absence of any measure by the Government to compel tech companies to label AI-generated outputs.’ SoA Chief Executive Anna Ganley commented that while ‘the onus should be on tech companies and online retailers to label AI-generated content,’ in the absence of such action, the SoA wanted to provide a solution to the overwhelming majority of their members who were interested in labelling and providing confidence to consumers and readers in the AI age.
The ASA has been in talks with both its UK and US counterparts about introducing Human Authored labelling – we will update our members with more information in due course.
Almost ten thousand writers have published a mostly empty book, pointedly titled Don’t Steal This Book, to protest copyright theft by AI companies; it was launched at London Book Fair last week. The organiser of the project, ethical AI campaigner Ed Newton-Rex, noted that copyright theft is ‘not a victimless crime.’ Included in the book is a call to the UK government to protect the copyright of authors by refusing to ‘legalise book theft,’ warning that if it fails to do so, ‘empty pages, writers without pay, and readers deprived of the next book they’ll love’ will inevitably follow.
The UK House of Lords Communications and Digital Committee has published a report recommending that tech companies should be required to obtain licences for the use of copyright works to train generative AI models, and that the Government should ‘focus on strengthening licensing, transparency and enforcement.’ The AI, copyright and the creative industries report also referenced the Australian Government’s rejection of a text and data mining exception for AI training, calling on the UK Government to ‘follow the example of the Australian Government and make a clear public statement that it will not bring forward proposals for a new commercial TDM exception with an opt-out-based rights-reservation mechanism.’
Digital writing assistant Grammarly is facing a multimillion dollar lawsuit following the launch (and hasty removal) of their Expert Review tool, which used generative AI to provide feedback ‘inspired by’ the style of esteemed writers and academics without their permission. Lead plaintiff and journalist Julia Angwin told the BBC that the AI edits attributed to her were subpar, highlighting that editing is how she earned a living, and describing the notion of such work being associated with her name without her consent as ‘really appalling.’
The Publishers Association has published Content Superpower: UK publishing and the AI licensing market, detailing how publishers license content for AI training and revealing that ten publishers have agreed to one or more licensing deals, with a further eight publishers expected to enter deals this year. The PA called on the government to protect individual creators and the UK’s standing as a cultural powerhouse by refusing to allow copyright exceptions for AI, and by insisting on transparency from AI developers regarding the data they have used to train their models.
The UK collecting society, Publishers’ Licensing Services, has launched a collective licensing initiative designed to prevent unlawful use of published work by AI companies. Publishers can now opt in to the scheme, which PLS describes as ‘provid[ing] a clear and lawful route for AI companies to license content, helping to ensure publishers and authors are properly rewarded when their work is used.’