The Artificial Intelligence (AI) takeover has been looming for a while, and over the last 12 months AI has progressed significantly, particularly with the rise of ChatGPT.
AI relies on the information available to it, typically anything openly accessible on the internet, and uses that information as required. In theory, the use of AI should significantly increase efficiency; however, in its current state, the accuracy of its output is not always 100%.
What are deepfakes?
Unfortunately, AI is not only being used for good purposes, and a pressing concern currently in the media is the use of deepfakes. A deepfake is a false image or video of a person, created using AI, which can appear scarily real.
A recent example of this was seen online in a fake video of Martin Lewis advertising investment opportunities which he did not personally endorse. The real risk here is that deepfakes could be used fraudulently and, in doing so, could harm others, including the reputation of the person of whom the deepfake has been created. This raises the question of whether any legal action is available to the person depicted in the deepfake.
Can I claim on the grounds of defamation?
One potential claim may be one of defamation. Defamation involves a claim brought against any person or entity who has committed libel (written) or slander (spoken) by making false and damaging statements about another person or entity.
To satisfy the elements of defamation, a Claimant must show that a false statement has been made about them, that the statement has been broadcast or published to third parties, and that, as a result, serious harm has been caused to their reputation. In most cases, it is also necessary to show serious financial loss, or at least the potential for it.
Whether the severity of the deepfake content meets the criteria for a defamation claim will likely boil down to whether the victim has been falsely represented. Considering a defamation claim with the Martin Lewis example, if the deepfake had endorsed a fraudulent investment scheme, that may satisfy the criterion of being a false statement. The statement will have been published even if it was shown to only one other individual, and in the digital age such content has the potential to reach millions.
It is very likely that content such as this, if believed, would cause serious harm to the reputation of someone such as Martin Lewis. He is held in very high esteem due to his expertise in advising the public about financial matters, and if he were seen to have been endorsing a fraudulent investment scheme, it would undoubtedly undermine his reputation.
Alternatively, an individual may be able to seek remedies under the UK General Data Protection Regulation, including bringing a claim against the publisher for rectification or for the restriction of processing, as it is very arguable that a “deepfake” involves the use of personal data, meaning any information from which someone may be identified. This can include an individual’s name, image, or voice.
The Court has the power to issue an injunction against any individual or organisation responsible for processing inaccurate data, and can order any organisation that holds data (a “data controller”) to rectify or erase inaccurate data.
Alongside a possible defamation or data protection claim, impersonating someone for the purpose of obtaining money, goods, or services is a criminal offence under the Fraud Act 2006. If you consider you have been the victim of such a crime, you should report it to the police as well as obtaining independent legal advice.
How can Nelsons help?
Stuart Parris is an Associate in our expert Dispute Resolution team.
If you require any advice on the above subjects, please contact Stuart or another member of the team in Derby, Leicester, or Nottingham on 0800 024 1976 or via our online enquiry form.