4 MIN READ

December 13, 2023

An update on our artificial intelligence advocacy

Discussion of generative artificial intelligence has dominated advocacy this year. By way of update before the year ends, we summarise our survey results on AI, the outcomes of the series of Copyright Roundtables held this year, and the recent agreement reached on the European Union Artificial Intelligence Act. 

In March 2023, the ASA conducted an early survey of our members on their use of, and concerns about, AI. We have now updated our research with another survey conducted in November as part of our annual member survey. It is clear that authors’ views about generative AI are markedly united and consistent. Authors have deep concerns about their works being used to train generative AI systems: only 3% of respondents agree that their books may be used in this way, while 92% either do not agree or are unsure, depending on the conditions of use. 

An overwhelming 96% of respondents believe that authors and illustrators should be compensated for the use of their books in training generative AI. It is understandably less clear how that compensation should be achieved, with 56% of respondents supporting a collective licensing system, 20% unsure, and 17% ambivalent. A cultural levy, which would provide authors with a fee for the use of their work in training AI, is supported by 59% of respondents.

There has been a lot of hype about the time-saving benefits of generative AI for creators but, at this stage, 96% of respondents tell us they don’t use generative AI as part of their writing or illustrating process. 

It is clear the ASA’s calls for an ethical, transparent approach are deeply supported: 97% of respondents believe there should be a code of ethics in publishing relating to AI and 96% think readers should be made aware when generative AI has been used to generate all or portions of a work. 

Most respondents (51%) report that their publishing contracts or platform terms of service don’t include permission to use their work for any AI-related purposes, with a further 48% unsure. 

The ASA encourages every author or illustrator who is negotiating a publishing contract to try to include a clause that makes clear the grant of rights to your publisher does not include permission to use your work, or allow access to your work by any third party, for the training of AI technologies without your prior consent. We believe it is prudent, in these early days of AI development, to reserve these rights to yourself while the industry grapples with possible licensing solutions. 

You may also wish to make it a term of your publishing contract that your publisher not input your work into generative AI software for editing or summarisation purposes because this is another way that generative AI systems ‘learn’. 

As we’ve reported previously, the Attorney-General’s Department has been hosting a series of Copyright Roundtables throughout 2023, one of which focused on the implications of artificial intelligence. The final roundtable, held on 4 December, reflected on the outcomes of previous discussions around five key reforms: 

1. A scheme for the use of orphan works

There is broad support among participants for an orphan works scheme to be established, with further consultation required on the design of the scheme. 

2. Quotation from copyright material

Participants are not in agreement on the need for a new fair dealing exception for quotation. 

3. Use of copyright material in remote learning environments

Participants agreed that the Government could pursue minor amendments to section 28 of the Copyright Act (which permits the performance or communication of a literary, musical or dramatic work in the classroom) to confirm that online or remote classes are covered by this section. 

4. Implications of artificial intelligence for copyright law

There are polarised views on the implications of artificial intelligence. Representatives of the tech sector argue that amendments to the Copyright Act, such as the introduction of a new exception for Text and Data Mining (TDM), are needed to encourage and facilitate the development of AI in Australia. Representatives of the creative industries believe that existing copyright laws support the development of licensing solutions which will both facilitate AI development and fairly compensate rights holders for the use of their creative material.

The tech-led arguments for a copyright framework that supports ‘innovation’ are often made on the assumption that such innovation must be free; that the exploitation of copyright works for AI development doesn’t harm creators and ought to be permissible in order for Australia to keep up with the rest of the world. The ASA firmly believes that this argument is exploitative of creators’ labour and inaccurate: generative AI already harms, and will continue to harm, the market for authors’ work. Copyright licensing is the foundation of creative careers, and if we don’t value and pay for creative content, we will lose professional writers and artists. AI development – particularly by some of the world’s most well-resourced and powerful companies – mustn’t be subsidised by authors and artists, who are among our most vulnerable and under-resourced creators. 

Given such differing views, there was strong support for the establishment of a Copyright and AI Reference Group for ongoing engagement with stakeholders, and the Attorney-General has announced that such a group will be established.

5. Definition of ‘broadcast’ for the purposes of copyright law

Participants agreed that there is no immediate need for the Government to consider delinking the definition of ‘broadcast’ in the Copyright Act from the Broadcasting Services Act 1992 (BSA).

The Government will consider the outcomes of the Roundtables in deciding which reforms to take forward in the remainder of the current Parliament. 

Finally, the EU Artificial Intelligence Act moved closer to becoming the world’s first comprehensive law regulating AI last week, with agreement reached between the European Parliament and EU member states on the new rules. The EU Act takes a risk-based approach, with the highest level of regulation applying to the systems that pose the highest risks. Importantly for authors, the Act includes transparency requirements such that companies building foundation models “will have to draw up technical documentation, comply with EU copyright law and detail the content used for training”.

The European Parliament still needs to vote on the AI Act early next year, but this development means that Europe leads the world in regulating AI. We look forward to seeing the Australian Government’s response to its Inquiry into Safe and Responsible AI in the new year. 

We’d like to thank all of the members who contributed to our survey and have contacted us about their AI concerns. Your feedback is invaluable in informing our advocacy efforts. We will continue to work with partner organisations and international author groups and keep you updated on all developments.