Deep Fakes: The Current Status of the Law

This summer the US Copyright Office issued a report entitled “Copyright and Artificial Intelligence: Digital Replicas.” The report addresses current concerns regarding the legal issues surrounding “deepfakes,” which are digital replicas created with technology that falsely depicts a person. The report was the result of nearly a year of information gathering that began in August 2023, and it concluded that current laws are insufficient to protect individuals.

While various state laws provide varying degrees of rights of publicity and privacy, these laws are inconsistent, and some states offer no such rights at all. The only way to protect individuals uniformly is through the adoption of a federal right. So for now, if you are looking to protect your image, you will need to speak with an attorney to determine your rights under state law. In many situations, the result may simply be that you have no protection, and even if you do, enforcing the right is costly with little chance of real-world results.

Congress has also been working on this issue since last year. The draft “Nurture Originals, Foster Art, and Keep Entertainment Safe (NO FAKES) Act” is Congress’s current attempt to establish a federal intellectual property right in a person’s voice and visual likeness. The act incorporates many of the recommendations made by the Copyright Office.

However, the law has not yet passed; the act is currently before the Senate Judiciary Committee. The act was drafted with cooperation from the RIAA, SAG-AFTRA, the MPA, and various AI companies. It is a bipartisan bill and should receive widespread support across party lines once it comes to a vote.

Highlights of the ‘NO FAKES’ Act

Rights defined: The act grants individuals the exclusive right to authorize the use of their voice or visual likeness in digital replicas and defines a digital replica as “a newly-created, computer-generated, highly realistic electronic representation that is readily identifiable as the voice or visual likeness of an individual…”

Rights cannot be assigned: The act provides that digital replication rights cannot be assigned (or granted) to another party during an individual’s lifetime. The purpose of this is to protect artists and creators from being pressured into transferring their rights. Artists often have less bargaining power early in their careers (if any), and the act seeks to protect them from losing control over how their voice and image are used.

Rights may be licensed: The act provides that although the rights may be licensed, the duration cannot exceed 10 years for living individuals.

Rights after death: The act would grant rights after death, ranging from a minimum of 10 years to a maximum of 70 years post-mortem.

Rights of Safe Harbor: Most artists are familiar with the safe harbor provisions of the DMCA. The NO FAKES Act includes a similar safe harbor provision. Artists would have a mechanism to seek the removal of unauthorized digital replicas, and online providers would avoid liability if they remove them promptly upon receiving a takedown notice.

In short, the act seeks to provide a federal right to hold people liable for producing an unauthorized digital replica of an individual, to hold online platforms liable for hosting an unauthorized digital replica, and to exclude certain digital replicas based on recognized First Amendment protections.

If you are suffering from chronic insomnia, you can read the full report and draft act on your own; the links are below. The key takeaway is that everyone is concerned about the legal issues raised by this new technology, and there is at least a consensus that legislation needs to be enacted quickly to provide protection.

The reports from the US Copyright Office can be found here: https://www.copyright.gov/ai/

The draft legislation can be found here:

https://www.congress.gov/bill/118th-congress/senate-bill/4875/text?s=1&r=2&q=%7B%22search%22%3A%22no+fakes%22%7D
