By Rebecca Bocchinfuso – Director of AMICUS
The term ‘deepfake’ refers to the digital copying of a subject's face onto another digital image to create ‘fake media’ that looks extremely realistic.[1] While the technology is impressive in itself, the potential dangers of such an advancement cannot be overstated. For instance, in March 2022, a video surfaced of Ukrainian President Volodymyr Zelenskyy appearing to order soldiers to surrender in the fight against the Russian invasion of Ukraine.[2] This video was later debunked as a deepfake, but the consequences could have been dire.[3] This exposes a major gap in the law which legal systems worldwide must move swiftly to close.
In the context of British civil law, it is now understood from the decision in O’Shea v MGN Ltd[4] that a publisher will not be held liable to a claimant in defamation in cases of mere lookalikes. In O’Shea, the claimant was a woman who closely resembled the woman in a picture posted on a pornographic website. She claimed that, as a result of this physical resemblance, her friends and family were under the mistaken belief that she was involved in the pornography business. The court denied Ms O’Shea’s defamation claim on the grounds that it would impose an impossible burden on publishers to check whether an image resembles anyone else in the world before posting it. This decision is, of course, entirely logical: with billions of people in this world, it is likely that everyone has a doppelganger or two.
Now imagine a similar situation, except that the pornographic content bearing an eerie resemblance to the claimant was created using deepfake AI technology. Would the publisher be liable for defamation? Following O’Shea, it is arguable that, since the image is not actually of the claimant and many people in the world bear a facial resemblance to the person depicted, it would be difficult to say for certain that it is truly the claimant’s face being used in the deepfake. This all sounds like a dystopian horror story, right? Sadly, not. In fact, most deepfakes are currently being used to create pornography.[5] Recently, it emerged that several major social media streamers have become victims of deepfake pornography, with explicit content created using their faces and sold to viewers, leaving them feeling violated and perturbed by such an invasion.[6] The result of this development is that nobody is safe from deepfakes, and anyone with content of themselves posted online runs the risk of falling victim to deepfake pornography.
In a press release in November 2022, the British government promised new laws to protect victims from the abuse of intimate images, following the Law Commission’s recommendation.[7] As such, amendments have been made to the Online Safety Bill to criminalise the sharing of explicit deepfake images or videos designed to look like an individual without their consent, an offence which may carry potential time in custody.[8] The bill is currently at its second reading in the House of Lords and is yet to form part of British law.[9]
With the statutory formulation of the law in this area still underway, we are yet to see how the common law will respond. It is hoped that the civil law will develop to mirror the objectives of the Online Safety Bill, discouraging such violating uses of deepfake technology and holding those responsible liable in the tort of defamation. How that is to work in practice remains to be seen.
[1] Dave Johnson, ‘What is a deepfake? Everything you need to know about the AI-powered fake media’ (Insider, 10 August 2022) <https://www.businessinsider.com/guides/tech/what-is-deepfake?r=US&IR=T> accessed 1 February 2023.
[2] Bob Allyn, ‘Deepfake video of Zelenskyy could be the “tip of the iceberg” in info war, experts warn’ (NPR, 16 March 2022) <https://www.npr.org/2022/03/16/1087062648/deepfake-video-zelenskyy-experts-war-manipulation-ukraine-russia> accessed 1 February 2023.
[3] ibid.
[4] [2001] EWHC QB 425.
[5] Tom Simonite, ‘Most Deepfakes are Porn, and they’re multiplying fast’ (Wired, 7 October 2022) <https://www.wired.com/story/most-deepfakes-porn-multiplying-fast/> accessed 1 February 2023.
[6] Josh Tyler, ‘Pokimane, QTCinderella & Sweet Anita respond after Atrioc deepfake drama’ (Dexerto, 31 January 2023) <https://www.dexerto.com/entertainment/qtcinderella-sweet-anita-respond-after-atrioc-streamer-deepfake-drama-2047529/> accessed 1 February 2023.
[7] Ministry of Justice, ‘New laws to better protect victims from abuse of intimate images’ (Gov.uk, 25 November 2022) <https://www.gov.uk/government/news/new-laws-to-better-protect-victims-from-abuse-of-intimate-images#:~:text=Under%20a%20planned%20amendment%20to,face%20potential%20time%20behind%20bars.> accessed 1 February 2023.
[8] ibid.
[9] UK Parliament, ‘Parliamentary Bills’ (UK Parliament, 26 January 2023) <https://bills.parliament.uk/bills/3137> accessed 1 February 2023.