In this round-up of AI news, we cover our response to the Proposal Paper for Introducing Mandatory Guardrails for AI in High-Risk Settings, announced on 5 September by the Australian Government, and Meta’s appearance before the Senate Select Committee Inquiry into AI Adoption on 11 September.
The ASA welcomes the Proposal Paper for Introducing Mandatory Guardrails for AI in High-Risk Settings announced on 5 September by the Australian Government.
Olivia Lanchester, CEO of the Australian Society of Authors says, “For ethical AI development, mandatory guardrails are key. We know that the foundational AI models have been built off the back of creators’ intellectual property, without consent or remuneration. This has to be remedied and transparency is a necessary first step.
“The Proposal Paper released yesterday by Minister Ed Husic and the Department of Industry, Science and Resources is very welcome and could bring Australia into line with the European Union in requiring AI developers to be transparent about what copyright works they have used to train their models.”
The Paper proposes a definition of ‘high-risk’ which includes consideration of the risk of adverse impacts on groups of individuals, on the collective rights of cultural groups, and on the broader Australian economy, society, environment and rule of law.
“We look forward to commenting on the proposed definition of ‘high-risk’ and will call on the Government to specifically recognise the exploitative impact of generative AI on the creative and cultural industries,” Lanchester says.
“This is a promising start with some very welcome acknowledgements, including that First Nations cultural material has been used without consent and misappropriated by AI, that AI technologies are characterised by opaqueness, including around what data is collected, and that there are enforcement gaps due to a lack of remedies for affected people.
“We’re never going to have fair outcomes for authors and artists, including control over their intellectual property and fair licensing and remuneration, without mandated transparency. We hope that the Australian Government will follow the EU’s lead and ensure that transparency obligations on AI developers doing business in Australia apply irrespective of where the AI training took place.”
The Government has indicated it is considering expanding existing regulatory frameworks or introducing new AI-specific legislation, such as an Australian AI Act.
The imposition of mandatory, rather than voluntary, obligations is particularly important in light of the evidence given by Meta before the Select Committee on 11 September. It became very clear that Meta will not grant Australian users the right to opt out of their Instagram and Facebook posts being used for AI training unless legislation requires it (read about illustrators’ concerns on this issue). It was also clear from Meta’s testimony that the only reason EU users have this protection, and Australian users do not, is that privacy laws in Europe demand a higher standard.
We applaud the work of the Senators on the Committee in questioning Meta about its exploitative use of Australian creators’ work for AI training. Senator Varun Ghosh asked Meta:
“Have you let Australian authors know that you’re commercialising pirated copies of their work? …Prize-winning author Richard Flanagan has called this use of that dataset the biggest act of copyright theft in history. Are you willing to fairly compensate authors or creators whose IP has been used to train your AI models but have not received compensation to this point?”
Meta’s representative declined to answer the question on the basis that it concerns matters which are the subject of litigation.
And in his concluding remarks, the Chair of the Committee, Senator Tony Sheldon, said:
“…it’s obvious that journalists and authors and other creators around the world have not consented to you – I say – stealing and commercialising their work. People around the world are sick of tech giants doing whatever they want, completely ignoring laws and rights as they go…. they do expect governments to do something about it.”
We thank the Select Committee and Chair, Senator Tony Sheldon, for highlighting these matters of deep concern to Australian creators and look forward to the Committee’s report to Government due on 19 September.
The ASA will continue representing authors’ and illustrators’ concerns to government and is committed to working closely with government and industry stakeholders to ensure fair AI regulation and sustainable creative careers.