5 Creative AI Legal Issues And Challenges
In May 2023, the UK’s Competition and Markets Authority launched an initial review of artificial intelligence models and will report its findings in September 2023. In the US, the Federal Trade Commission is also watching the development of generative AI closely, and in May issued guidance for businesses on the use of generative AI, warning against unfair or deceptive practices. The US Copyright Office has reiterated that copyright protection depends on the amount of human creativity involved, and that the most popular AI systems likely do not create copyrightable work. From an ethical perspective, it seems clear that when AI companies stand to make enormous profits from this technology, the artists whose talent contributed to that success deserve their cut.
This gets even more complicated when you consider that artificial intelligence can cross international borders very easily. For example, the .ai domain (the country code for Anguilla) is popular with AI businesses purely for the fun of the name. So an AI algorithm may be developed in the US, be used by people all over the world, and operate from Anguilla.
Can AI art be copyrighted in the UK?
For example, it assumes that such regulators have the necessary expertise and resources. All in all, the ethical challenges of creative AI have the potential to pose significant legal challenges, both now and in the future. Purely machine-made art is unlikely to attract protection, reflecting the US Copyright Office’s stance that art crafted by AI alone doesn’t warrant copyright protection. But if there’s human intervention or support in conjunction with AI, copyright could potentially be conferred.
There is also a lack of empirical study on whether copyright protection will incentivise investment in the research and development of generative AI. The degree of originality required for copyright to subsist is original skill and labour by the author in bringing the work into existence. At the time of writing, the CO grants protection not only to work created by a human author, but also to work generated wholly by a computer without any direct human involvement (“Computer-generated Work”).
Generative AI is a form of machine learning trained on vast amounts of data that allows computers to generate content such as written text, music and art from prompts. In fact, the specific question of whether AI-generated work can be registered is currently being litigated. The same applies to synthetic data, which are artificially generated by AI systems rather than collected from real-world sources. For example, who owns the rights to synthetic images if an AI system generates synthetic images based on real-world images? This raises questions about the rights of the creators of the original data, as well as the rights of the creators of the synthetic data. From the intellectual property rights of AI-generated works to the liability of generative AI, several legal issues need to be navigated.
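To make the "trained on data, generates from prompts" idea concrete, here is a deliberately simplified sketch: a word-level Markov chain, a toy stand-in for how a generative model learns statistical patterns from a training corpus and then continues a prompt. This is illustrative only; real systems such as DALL-E or ChatGPT use neural networks trained on vastly larger datasets, and the corpus and function names here are invented for the example.

```python
import random

def train(corpus_words, order=1):
    # Build a mapping from each word to the words that follow it in the corpus.
    # This is the (toy) "training" step: the model is just observed statistics.
    model = {}
    for i in range(len(corpus_words) - order):
        key = tuple(corpus_words[i:i + order])
        model.setdefault(key, []).append(corpus_words[i + order])
    return model

def generate(model, prompt, length=10, seed=0):
    # Starting from the prompt, repeatedly sample a plausible next word.
    rng = random.Random(seed)
    words = list(prompt)
    for _ in range(length):
        key = tuple(words[-1:])
        choices = model.get(key)
        if not choices:
            break
        words.append(rng.choice(choices))
    return " ".join(words)

# Toy corpus standing in for the "vast amounts of data" a real model ingests.
corpus = "the cat sat on the mat and the cat slept".split()
model = train(corpus)
print(generate(model, ["the"], length=5))
```

The legal questions in this article map directly onto the sketch: the model's output is derived entirely from its training data, which is why the provenance and licensing of that data matter so much.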
Competition law could shape the AI landscape
Copyright protects the expression of an idea, not the idea itself (the famous idea/expression dichotomy). It will be difficult in my opinion for an artist to successfully sue for copyright infringement, as their style is not protected, and as mentioned above, it is unlikely that an AI tool will reproduce a work verbatim (can you use verbatim for images? I digress). Between open data, public domain images, and the data mining exceptions, we can assume that the vast majority of training for machine learning is lawful.
As the gaming industry continues to drive innovation and implement AI in 2023 and beyond, it will need to take account of the evolving regulation of AI and the broader regulation of the digital economy. Generative AI offers the potential for even more engaging immersive experiences in virtual worlds and for improving processes for game development and monetisation opportunities. Keep up to speed on legal themes and developments through our curated collections of key content. But of course, money talks – and we can expect the AI providers – such as global tech giants Google, Microsoft and Meta – to strongly oppose any changes that will inhibit their ability to innovate (and make money). Toorent, for example, says she realized the scale of the problem when she visited a gallery displaying AI artwork and was able to identify works that were based on her own style. Accusations of breach of copyright have also been made by Getty Images, which alleges that 12 million of its proprietary photographs were used by Stability AI, without its permission, to train its Stable Diffusion image generation tool.
It is intended to provide a middle ground between giving authors a sufficient incentive to create (by granting them a level of control over how their works can be used), while giving the public the right to build on or use those works in new and interesting ways.
Whoever does the work in “preparing” an AI to create a work is the author and copyright owner. The use of artificial intelligence (AI) tools in the creation of all kinds of artwork – visual, textual, musical – has accelerated over the past year or so. The rise in AI-generated visual artworks has been fuelled by the development of new text-to-image tools, such as DALL-E, Stable Diffusion and Midjourney. Competition authorities have been tracking the development of AI for some time, as part of a wider trend of scrutiny and intervention in digital markets. While it is too early to say whether generative AI tools such as ChatGPT will raise competition concerns, authorities may well road-test a range of competition issues based on how digital markets have evolved to date.