On September 30th, OpenAI launched Sora 2, the latest version of its generative artificial intelligence (“AI”) video creation tool. The app, which was downloaded over a million times within five days, allows users to create videos up to twenty seconds long from simple text prompts.

The release of Sora, and of analogous platforms such as Google Veo 3 and Runway Gen-4, highlights the speed and scale of recent developments in the digital landscape. Over the past few years, AI-generated photos, videos and documents have become increasingly widespread and realistic. These technologies have been praised for unlocking creative potential and improving workplace efficiency in certain settings. However, they have also made it easier for individuals to doctor documents and create ‘deepfake’ evidence (i.e. images, videos or audio recordings that have been convincingly manipulated to misrepresent someone). As a result, serious concerns have been raised about the risks of inauthentic digital evidence being presented in legal proceedings. This article considers the implications of doctored evidence in family law cases in particular, and the steps legal practitioners can take to prevent, detect and challenge inauthentic documents and recordings.

Digitally falsified evidence and family law

Family law is perhaps one area where the risk of digitally falsified evidence is especially high, given the significant personal stakes in many cases. Indeed, there are already several reported instances of individuals manipulating documents in the family courts. Often, these falsified documents are presented by one party with the intention of misleading their current or former spouse about their financial situation.

In the case of X v Y [2022] EWFC 95, for example, His Honour Judge Edward Hess found that the husband had failed to provide proper financial disclosure and had dishonestly manufactured a bank statement in order to persuade his wife to give up work and relocate to the UK. The judgment was published “to draw wider attention to the ability of dishonest parties to manufacture bank statements (and other documents) which, for all practical purposes, look genuine, but which are in reality not in that category”.[1]

Worryingly, there are also cases of falsified documents and recordings being submitted as evidence in court proceedings, a development that threatens to undermine the integrity of those proceedings and the right to a fair trial. In a 2020 child custody battle, for instance, the child’s mother presented a recording in which the father allegedly made direct and violent threats against her. Analysis by digital forensics experts, however, found that the recording had been heavily doctored and manipulated. Byron James, a partner at Expatriate Law who was involved in the case, explained: “if we hadn’t been able to challenge this piece of evidence, then it would have negatively affected him and portrayed him as a violent and aggressive man”. He went on to emphasise the knock-on effects of the case: “It raises all sorts of questions about what sort of evidence you can rely on”.[2]

Questioning the authenticity of any produced documents – CPR 32.19

Given this context, it is vitally important for family lawyers to understand how, and when, to challenge the authenticity of evidence produced in court proceedings. The framework for doing so is set out in the Civil Procedure Rules (“CPR”).

Notably, Part 32 of the CPR states that a party is deemed to admit the authenticity of a document disclosed under Part 31 (disclosure and inspection of documents) unless they serve notice that they wish the document to be proved at trial. A party wishing to challenge the authenticity of a document should therefore raise the issue early on and serve a notice clearly stating the intention to challenge the document and the reasons for doing so.

Under CPR 32.19, ‘a notice to prove a document must be served –

  (a) by the latest date for serving witness statements; or
  (b) within 7 days of disclosure of the document, whichever is later.’[3]

In such cases, the burden of proving the document’s authenticity lies on the disclosing party. However, the challenging party is typically advised to engage a forensic expert, and may also wish to set out what they consider necessary to prove authenticity. Here, the increasingly sophisticated and realistic outputs of generative AI tools may make suitable forensic expertise more difficult and expensive to find.

The consequences of presenting inauthentic evidence in court are stark. The disclosing party could be found to be in contempt of court, punishable by up to two years in prison, a fine, or both. Presenting falsified evidence may also amount to a criminal offence under the Fraud Act 2006, which carries a maximum penalty of ten years’ imprisonment. In cases where a claimant relies solely on false evidence, the defendant may further seek to have the claim struck out by the court, either in part or in full.

Detecting and preventing digitally falsified evidence

Forgeries are of course nothing new in the family courts but, with the development of generative AI and similar technologies, they are becoming increasingly difficult to spot. It is helpful, therefore, for legal practitioners to stay up to date with developments in the technology and to critically evaluate any evidence presented to them.

In April this year, the Courts and Tribunals Judiciary updated their guidance on the use of AI. In part, the guidance alerts judicial office holders to the growing use of AI chatbots by unrepresented litigants as a source of legal advice, whilst also warning of the inaccuracies that can arise when legal representatives use AI to conduct their research. The guidance also highlights that ‘AI tools are now being used to produce fake material’ and reminds judges to ‘be aware of this new possibility and potential challenges posed by deepfake technology’.[4]

Some key signs to look out for when analysing evidence include incorrect spelling, non-existent dates (e.g. September 31st), inconsistencies within the document itself, and unusual metadata (e.g. the date, time and location at which a file was created or last edited).
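By way of illustration, the short Python sketch below shows one simple way to surface a file’s embedded metadata using the Pillow library. The file name ‘exhibit.jpg’ is hypothetical, and a script of this kind is only a first-pass check on the assumption that the file carries EXIF data; it is not a substitute for proper forensic examination.

```python
from PIL import Image
from PIL.ExifTags import TAGS

def read_exif(path: str) -> dict:
    """Return the EXIF tags embedded in an image file, keyed by tag name."""
    with Image.open(path) as img:
        exif = img.getexif()
        return {TAGS.get(tag_id, tag_id): value for tag_id, value in exif.items()}

# Hypothetical usage: 'exhibit.jpg' stands in for a disputed image.
# An empty result, an editing-software tag, or a modification date that
# post-dates the alleged capture date may all warrant expert review.
metadata = read_exif("exhibit.jpg")
if not metadata:
    print("No EXIF metadata found - absence itself can merit questions.")
for field, value in metadata.items():
    print(f"{field}: {value}")
```

It should be borne in mind that metadata can itself be edited or stripped, so checks of this kind can only ever prompt further enquiry rather than settle a question of authenticity.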

In addition, a range of automated tools has been designed to detect and authenticate evidence. Amped Five, for example, is a forensics software package which claims to produce an ‘accurate, repeatable, reproducible’ report on any image or video. Resistant.ai is another, developed to detect fraud in documents. However, Dr Maura R. Grossman, Research Professor at the University of Waterloo in Ontario, Canada, warns of the challenges of relying on such tools: “we aren’t at the place right now where we can count on the reliability of the automated tools”.[5] Legal practitioners should therefore review the outputs of any such tools carefully, whilst also applying their own critical reasoning to evaluate a document’s authenticity.

Conclusion

As generative AI continues to develop at record pace, so too does the risk of digitally modified evidence finding its way into the courts. In family law, this risk may be especially high, making it particularly important for legal practitioners to be confident in analysing and challenging suspicious material. These technological changes may lead to the introduction of new evidentiary standards or deepfake-specific laws in the future. In the meantime, however, family lawyers should focus on critically evaluating all evidence they are presented with, making sure to raise any questions or concerns early in proceedings.

[1] X v Y [2022] EWFC 95

[2] Deepfake audio evidence used in court to discredit father | CYFOR

[3] PART 32 – EVIDENCE – Civil Procedure Rules – Justice UK

[4] Refreshed AI Guidance – Courts and Tribunals Judiciary

[5] Deepfakes on trial: How judges are navigating AI evidence authentication – Thomson Reuters Institute

 

By Evie Nicholls
