Americans looking to settle a divorce and obtain custody of their children could rack up unforeseen court costs by trying to disprove artificial intelligence (AI)-generated deepfake videos, photographs and documents, according to a leading family law attorney.
Michelle O’Neill, a co-founder of the Dallas-based law firm OWLawyers, told Fox Business that courts are seeing a “real increase” in fake evidence, frequently created with AI. The problem, she said, is becoming more commonplace, and judges are being trained at judicial schools and conferences to remain vigilant.
One type of fraudulent evidence is AI-generated revenge porn, including fake pictures and videos of individuals engaging in intimate acts. O’Neill notes that while deepfakes have primarily made headlines when they target celebrities, the issue also affects ordinary people going through breakups or litigating divorces in family court.
O’Neill’s claim that this type of AI-generated content is “exploding onto the scene” is backed by statistics showing that the number of deepfake videos, not counting photographs, has grown roughly 900% annually since 2019.
“When a client brings me evidence, I’m having to question my own clients more than I ever have about where did you get it? How did you get it? You know, where did it come from?” O’Neill said.
The problem also overwhelmingly affects women. The research company Sensity AI has consistently found that between 90% and 95% of all online deepfakes are nonconsensual porn, and roughly 90% of those depict women.
Despite the staggering number, O’Neill says social media platforms are slow to act.
First Lady Melania Trump spoke on Capitol Hill in early March for the first time since returning to the White House, participating in a roundtable with lawmakers and victims of revenge porn and AI-generated deepfakes.
Congress is currently zeroing in on punishing internet abuse involving nonconsensual, explicit imagery.
The Take It Down Act is a bill introduced in the Senate by Sens. Ted Cruz, R-Texas, and Amy Klobuchar, D-Minn., that would make it a federal crime to publish, or threaten to publish, nonconsensual intimate imagery, including “digital forgeries” crafted by artificial intelligence. The bill passed the Senate unanimously earlier in 2025, and Cruz said Monday he believes the House will pass it as well.
As the government pushes for new laws, O’Neill says AI used to create fraudulent and explicit content remains an “actual threat” to the judicial system.
“The integrity of our very judicial system depends on the integrity of the evidence that you can go in and present. If you can’t even rely on the integrity of evidence that’s being presented to a judge, if a judge can’t even rely on the integrity of the evidence they are receiving — our judicial system may be utterly at peril by the existence of artificial intelligence,” she told Fox Business.
AI, O’Neill notes, also hits economically challenged Americans hardest when they fall prey to fraudulent court evidence. An individual challenging the authenticity of admitted evidence may now have to pay a video forensics expert to examine and verify the footage.
Fraudulent evidence can even extend to videos purporting to show the abuse of a child when two parties are fighting for custody. If a party lacks the financial means to prove that such footage is AI-generated, judges must decide whether to take the word of the person challenging it or believe the footage that has entered the court.
“What happens to people that don’t have the money [to disprove that]? So, not only do we have a threat to the integrity of the judicial system, but we also have an access-to-justice problem,” O’Neill said.
The family law attorney noted that judges primarily see nefarious AI use in creating fake documents, such as falsified bank records or drug tests.
One judge also told O’Neill about a falsified audiotape that cast the other party in a negative light. The recording’s quality was not convincing enough to pass as genuine, so the judge reprimanded the individual and excluded the evidence.
However, as the technology rapidly improves, O’Neill worries that the gap between what is real and what is AI-generated will narrow.
“I think it’s an issue at many levels of our society. And, you know, drawing attention to it is something that is very important,” she said.