London: The UK has failed to reach consensus on an AI copyright code governing the use of copyrighted materials in AI training. The breakdown comes after nearly a year of discussions involving key stakeholders: the UK government, major AI firms including Microsoft and Google DeepMind, and creative organizations including the BBC and the Financial Times.
The Intellectual Property Office (IPO) had aimed to create a voluntary code of practice on text and data mining to guide the training of AI models on copyrighted content such as books, images, and films. Those efforts have stalled, leaving the matter in the hands of the Department for Science, Innovation and Technology, which is not expected to issue definitive policies soon.
The impasse is a concern for creative professionals, who fear unauthorized use of their work by AI systems that can replicate and disseminate creations without credit or compensation. High-profile lawsuits and public outcry have highlighted the issue, with tech companies accused of using artists' work without permission to train their AI models.
The situation underscores how legal frameworks are lagging behind the rapid adoption of AI across sectors, including entertainment. Equity, the union representing 50,000 performers and creative practitioners, has warned of potential industrial action to demand better protection for artists against exploitation by AI.
Meanwhile, efforts to ensure ethical data sourcing in AI development are underway, with initiatives such as Fairly Trained offering certification to companies that meet those standards.
The government’s response to a review recommending clearer guidelines on intellectual property and generative AI includes a commitment to work towards a code of practice that facilitates data mining licenses and protects rights holders. The approach aims to balance growth in the AI and creative sectors and to promote the UK as a leader in research and AI innovation. However, if consensus on the code cannot be reached or it is not adopted, legislative measures may be considered.