Australia is taking action to prevent the sharing of child sexual abuse material created using artificial intelligence (AI). The country’s internet regulator, eSafety Commissioner Julie Inman Grant, announced a new code, developed at the government’s request, that requires search engines such as Google and Bing to ensure their results do not surface AI-generated child sexual abuse material. It also mandates that the AI functions built into these search engines cannot produce synthetic versions of such material, commonly referred to as deepfakes.
The rapid growth of generative AI has caught the world off guard, according to Inman Grant, leading to the need for updated regulations. The previous code, drafted by Google and Bing, did not address AI-generated content, prompting the industry giants to return to the drawing board.
This development is a significant example of how the regulatory and legal landscape is evolving in response to technologies that can automatically generate realistic content. As AI continues to advance, addressing its potential misuse is essential, particularly around sensitive issues like child exploitation.
By implementing this code, Australia aims to ensure that search engines play an active role in preventing the dissemination of AI-generated child abuse material. The responsibility falls on industry leaders to align their search functions with these new regulations.
Q: What does the new code in Australia require from search engines?
A: The new code requires search engines to ensure their results do not include AI-generated child sexual abuse material, and it prohibits their built-in AI functions from producing synthetic versions of such content.
Q: Why did the previous code not cover AI-generated content?
A: The previous code was drafted before the rapid growth of generative AI, so it did not address AI-generated child abuse material; the companies had to revise it to close that gap.
Q: Why is it necessary to regulate AI-generated content?
A: As AI technologies continue to advance, it becomes crucial to address potential misuse, especially in sensitive areas like child exploitation. Implementing regulations ensures responsible use of these technologies.