
The NO FAKES Act: Safeguarding voices and faces in AI

There have been several high-profile court decisions relating to AI and IP over the past few years, and many of those decisions point to a need for new legislation to handle the challenges posed by the rise of AI technology. The recent introduction of the NO FAKES bill[1] in the US is a real-world example of this in action, and gives us an insight into how new legislation may be shaped to take AI and related technologies into account.

With a very convenient acronym, US lawmakers have recently proposed a bill that would create a property right in a person’s image, voice and likeness. The NO FAKES Act (Nurture Originals, Foster Art, and Keep Entertainment Safe Act) proposes a new IP right that, whilst similar in principle to existing copyright legislation, includes a few key differences.

The key right established by the proposed act is a person’s exclusive right over their digital replica, being an electronic representation that is “readily identifiable as the voice or visual likeness of an individual…”. This extends to all humans, not just those who make a living from their celebrity, and even includes post-mortem rights that persist for a period after a person’s death.

The bill includes a number of unusual features, including that the digital replica rights are non-assignable (presumably a move to protect creators from assigning their rights in perpetuity). Additionally, whilst the rights may be licensed, there is a proposed time limit on any licence, again likely a move to protect creators’ rights. This in some ways appears reasonable, given that a creator often has little bargaining power compared to a large studio or label, but may change as key stakeholders attempt to shape the bill.

The bill in its current form does not appear to consider any “fair use” provision for a person’s digital replica; however, a carve-out for parody and review is likely to be introduced in line with other existing IP legislation. This omission also highlights the need for careful consideration of free speech and creative expression as the draft bill evolves.

Ultimately, the proposed “readily identifiable” test of the NO FAKES bill provides a passing-off style determination of what could constitute infringement. However, an IP right in one’s “digital replica” seems to be low-hanging fruit, and it is difficult to imagine how new legislation might apply to other forms of IP, particularly those with strong existing legislative frameworks such as patent and design rights. Given that AI systems are now capable of generating content such as music, artwork, and even technological designs and arguably inventions (given the right input), it is easy to see a future where copyright, patent or design infringement could be avoided using a “just different enough” AI design-around tool. Such a tool would in theory be acceptable under current legislation, but may weaken the value of many IP rights.

As the NO FAKES Act moves through the legislative process, it will be important to monitor how it addresses these broader implications for IP in the AI era. The bill's evolution may serve as a template for future legislation aimed at balancing innovation with the protection of individual and corporate IP rights in an increasingly AI-driven world.

[1] https://www.coons.senate.gov/imo/media/doc/no_fakes_act_bill_text.pdf



Tags

NO FAKES Act, artificial intelligence, copyright, digital transformation