3 min read

July 26, 2023

ASA raises concerns with Government about risks of AI

The ASA has raised concerns about the risks generative AI technology poses to the professional lives of authors and illustrators in a submission to the Department of Industry, Science and Resources Inquiry into Supporting Responsible AI.

AI is a complex and rapidly evolving topic which poses a range of issues, from copyright and consumer protection laws to privacy and data protection. In our submission, we have confined our comments to areas within the ASA’s expertise. We acknowledge that, like any technology, artificial intelligence offers new opportunities and efficiencies, but broadly we are concerned about:

  • The risk of copyright infringement and the degradation of author rights
  • The risk to incentives to create
  • The risk to integrity in publishing

As we have reported previously, we consider the large-scale scraping and exploitation of works without regard to authors’ and illustrators’ rights to be outrageously unfair. Generative AI relies on training datasets to produce its outputs – datasets that include books, journals, essays and articles “ingested” from the internet without permission from, or compensation to, creators. Accordingly, Google, Microsoft, OpenAI and others have developed potentially lucrative software on the back of the intellectual and creative labour of creators without transparency, without acknowledgement, and without any of that profit being returned to the creators whose work enabled their technology in the first place.

We are disturbed by the potential of Generative AI models to produce and perpetuate inauthentic and fake art, appropriating Aboriginal and Torres Strait Islanders’ art, stories and culture without reference to Traditional cultural protocols, at a time when the National Cultural Policy has put ‘First Nations first’ and is working on stand-alone legislation to acknowledge and protect Indigenous Cultural and Intellectual Property (ICIP).

We have highlighted to Government the global protests and organised, large-scale creator objections to this unfair appropriation of their work.

It is already so difficult to earn a living wage from writing or illustrating that even a small disruption from Generative AI, and further dilution of an already crowded market, may mean the loss of many professional writers and illustrators, a contraction of voices and unique Australian perspectives, and a diminished industry. For a diverse publishing industry, we must safeguard the incentives to create.

Our submission also flags the well-documented bias and inaccuracies of AI-generated text, particularly harmful to women and people of colour, with ‘hallucinations’ (false information) remaining an unsolved problem for all Generative AI models.

We are asking Government to:

  • Mandate transparency on both inputs and outputs: AI companies should be transparent about the works included in training datasets, and AI-generated products should be labelled as such
  • Support opt-in licensing: creators should be in a position to prohibit or authorise the exploitation of their works and be compensated
  • Maintain copyright: new exceptions should not be introduced to the Copyright Act which would permit copying or mining of copyright works for AI-training purposes
  • Protect creator livelihoods: consider a new scheme for the remuneration of creators, either by way of a cultural levy on AI products and services to ensure creators are paid for their work, or a universal basic income
  • Mandate human oversight: in AI policy development, embed a requirement for human oversight 
  • Establish a special expert group: the particular vulnerability of creators across the arts requires a sector-specific response. An expert panel should consider copyright issues, appropriate regulation, and the impact of Generative AI on the cultural, social and economic life of Australia, to ensure the vision of Revive is supported and not thwarted by AI
  • Slow down to conduct safety checks: heed the calls of experts urging AI companies to slow the training of AI systems more powerful than GPT-4 until appropriate safeguards can be put in place – just as new drugs cannot be released to the public before their side-effects are tested, new AI tools should not be made publicly available until they are safe.
  • Work internationally: coordinate with as many other countries as possible to limit the ability of OpenAI, Google, Microsoft and others to forum-shop for permissive jurisdictions in which to operate.

Alongside these recommendations, the ASA is actively working on guidelines for authors, a model clause for publishing agreements specifically relating to AI, and an industry Code of Conduct.

We will publish our submission on our website when it becomes publicly available.

To prepare our submission we conducted a survey of our members, and held focus group meetings with creators to understand the ways AI is impacting their careers. We’d like to warmly thank all the creators who gave their time to share their views with us.

We know AI continues to be a major concern for Australian writers and illustrators, and we will be using ongoing member feedback to inform our advocacy on this issue. If you would like to share any feedback or experiences, please contact Lucy Hayward, Marketing & Communications Manager: [email protected].