A session on the role of emerging technologies in the courtroom was part of last month’s New York State Bar Association Annual Meeting in New York City.
“Emerging Technologies in Litigation” included a panel of local and federal judges as well as an e-discovery researcher and emerging technology attorney. The group discussed the use of artificial intelligence in the courtroom.
The session addressed the role that AI could play in judicial decision making, where algorithms potentially can predict behavior and outcomes resulting from different legal strategies. The rationale is that law is based on precedent — if a case is similar to past cases, then the results shouldn’t be too surprising.
However, given the rise of deepfakes — and the possibility that AI in effect could manufacture evidence — some argued that the technology should be excluded from court proceedings.
Despite such concerns, the global “legaltech” market for AI is expected to grow in the coming years, driven by the trend among major law firms to adopt various legaltech solutions that aim to reduce turnaround time for some legal cases.
AI is used to help with document management systems, e-discovery, e-billing, contract management, and even practice and case management.
AI already has been employed at a lower level in the Los Angeles Superior Court to handle seemingly mundane traffic citations. Visitors to the court’s website can interact with Gina, an AI-powered online avatar, to pay a traffic ticket, register for traffic school, or schedule a court date.
Since being installed in 2016, Gina — which is part of an effort by the LA Superior Court to reduce the backlog of cases — has had more than 200,000 interactions a year, and has reduced traffic court wait times dramatically.
One Step Closer to PreCrime
AI’s predictive algorithms can be used by police departments to strategize about where to send patrols, and facial recognition systems can be used to help identify suspects.
Combined, these approaches sound eerily similar to the Philip K. Dick short story, “The Minority Report,” which became the basis of the Steven Spielberg-directed film Minority Report, in which the police department’s PreCrime unit apprehends criminals based on foreknowledge of criminal activity.
“Courts currently are using AI algorithms to determine the defendant’s ‘risk,’ which can range from the probability that the defendant will commit another crime to whether or not they will appear for their next court date, for bail, sentencing and parole decisions,” explained technology inventor/consultant Lon Safko.
Often AI can be wrong — not only in determining where officers should patrol, but also in recommending how criminals should be sentenced. Here is where the Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) comes into play. It compares a defendant’s answers to questions, as well as personal factors, against a nationwide data group and assigns a score, which is used to determine sentencing.
“Recently in Wisconsin, a defendant was found guilty for his participation in a drive-by shooting,” Safko told TechNewsWorld.
“While being booked, the suspect answered several questions that were entered into the AI system COMPAS,” he continued. “The judge gave this defendant a long sentence partially because he was labeled ‘high risk’ by this assessment tool.”
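COMPAS itself is proprietary, so its actual scoring method is not public. As an illustration only, here is a toy sketch of the general pattern the article describes — comparing a defendant’s questionnaire answers against a body of historical records and deriving a risk score. The scoring rule, field names, and data here are all hypothetical, not COMPAS’s.

```python
# Toy illustration of a COMPAS-style risk score. COMPAS is proprietary;
# this is NOT its algorithm -- just the general pattern described above:
# compare a defendant's answers against historical records and assign a
# score. All data, field names, and the scoring rule are hypothetical.

def risk_score(answers, history, k=3):
    """Score 0-10: share of the k most similar past cases that reoffended."""
    def distance(a, b):
        # Count how many questionnaire answers differ (Hamming distance).
        return sum(1 for x, y in zip(a, b) if x != y)

    # Find the k historical records most similar to this defendant.
    nearest = sorted(history, key=lambda rec: distance(answers, rec["answers"]))[:k]
    reoffend_rate = sum(rec["reoffended"] for rec in nearest) / k
    return round(reoffend_rate * 10)

# Hypothetical historical records: answer vectors plus observed outcome.
history = [
    {"answers": [1, 0, 1, 1], "reoffended": True},
    {"answers": [0, 0, 0, 1], "reoffended": False},
    {"answers": [1, 1, 1, 0], "reoffended": True},
    {"answers": [0, 0, 0, 0], "reoffended": False},
]

print(risk_score([1, 0, 1, 0], history))  # → 7 on this toy data
```

Even in this toy form, the critique in the Wisconsin example is visible: the score is only as good as the historical data and the similarity rule chosen by the programmer.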
AI in the Courts
At the present time it isn’t clear how widespread the use of AI in the courts will be — in part because the courts at all levels have been quite slow to embrace any new technology. This could be changing, however, as AI can help streamline the courts in ways that could benefit all parties.
“We believe the courts are leading digital transformation in the market, and approximately 90 percent of courts have evolved from traditional court reporting to professional digital court reporting,” said Jacques Botbol, vice president of marketing at software firm Verbit.
“Certain applications of AI are often adopted faster than others — particularly those surrounding the automation of routine tasks and workflows,” he told TechNewsWorld.
“It’s interesting to note that AI is also being utilized through more complex applications, such as utilizing AI to make decisions regarding cases,” added Botbol. “These use cases will be adopted more slowly as there are significant concerns about due process, biases, etc.”
AI Court Reporting
Supporters of AI technology in the courts point to how it can help court reporters do their job better.
“Today, most court reporting firms reject work since they don’t have the necessary workforce to handle it all,” explained Botbol.
“AI is helping to fill the gaps that the retiring court reporters and the legacy court reporter market have left,” he noted.
At the same time, “lawyers want to receive materials quickly, and today depositions are getting delayed because of the shortage in the market — with some areas reaching more than 35 percent,” Botbol added.
AI, along with automatic speech recognition (ASR), allows for proceedings to be recorded and processed in a timelier manner.
“There is a backlog of cases that need to be transcribed, yet with AI-based ASR tools these transcripts can be processed at faster turnaround times,” said Botbol. “Instead of relying on court transcriptionists, the courts have multiple court reporting agencies that they can assign the work out to in order to clear their backlog and work more efficiently.”
Judge and/or Jury
No one is expecting that AI will fill the role of judge or jury — at least not in the legal system of the United States. However, AI could help ensure that the accused in criminal cases truly are granted the right to a speedy trial, while also addressing the backlogs in the civil courts.
“In the future, AI will not only serve as an add-on, but will also help to streamline trials by removing delays, which will lead to smarter and faster decisions being made,” said Tony Sirna, legal specialist at Verbit.
“Applications of AI are being studied and piloted for a number of use cases,” he told TechNewsWorld.
These include not only sentencing and risk assessment tools such as COMPAS, but also the settlement of disputes.
“Online Dispute Resolution is another aspect where we may see automated adjudication of small civil cases,” noted Sirna.
AI could help the parties reach an equitable settlement in civil cases.
“Mining extensive amounts of related court cases and decisions will come into play, with parties submitting their cases and using AI combined with data mining for settlement options or fair adjudication,” noted Sirna.
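The case-mining idea Sirna describes can be sketched in miniature: retrieve the past cases most similar to a new dispute and surface their settlements as reference points. This is purely illustrative — real systems would use far richer language models; here similarity is just word overlap (Jaccard) over hypothetical one-line case summaries with made-up settlement figures.

```python
# Toy sketch of mining past cases for settlement options, as described
# above. Similarity here is simple word overlap (Jaccard); all case
# summaries and settlement amounts are hypothetical.

def jaccard(a, b):
    """Similarity of two texts measured as overlap of their word sets."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb)

def similar_settlements(new_case, past_cases, top_n=2):
    """Return settlements from the past cases most similar to new_case."""
    ranked = sorted(past_cases,
                    key=lambda c: jaccard(new_case, c["summary"]),
                    reverse=True)
    return [c["settlement"] for c in ranked[:top_n]]

# Hypothetical mined corpus of decided cases.
past_cases = [
    {"summary": "rear end collision minor injury", "settlement": 12000},
    {"summary": "slip and fall in grocery store", "settlement": 8000},
    {"summary": "rear end collision whiplash injury", "settlement": 15000},
]

print(similar_settlements("rear end collision neck injury", past_cases))
# → [12000, 15000]: the two collision cases, not the slip-and-fall
```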
Another consideration that likely will come up is how AI itself will be treated by the courts. Can AI serve as an “expert witness,” for example? If so, will AI need to be granted some form of rights?
“AI likely won’t need ‘rights,’ but it will need control, and a team that manages the innovation in each court,” said Sirna.
“The aspect of ‘rights’ related to AI poses interesting legal questions: Who is responsible for the AI? Is the AI algorithm fair or biased? At what point does the AI make its own decisions? Who is liable for results or decisions rendered by algorithms — the user, the designer, or the court?” pondered Sirna.
However, many of these questions likely won’t need to be addressed anytime soon — nor will AI have the power to pass judgment.
“Our judicial system is by no means ‘early adopters,’ but for good cause,” said Safko.
“Rendering a just verdict and sentence is paramount, and we have to be sure that the defendants and plaintiffs are properly represented and that their information is protected,” he said. “This is why doctors insist on still using fax machines over email, which can easily be hacked.”
AI could have a place in the courtroom, but perhaps only to aid the human lawyers, judges, court reporters and jurors, not to replace them.
“Once a technology has proven itself to be reliable and shown a time or cost savings, it has been and will be adopted,” suggested Safko.
“AI is not a perfect science — it is still programmed by humans, and not every set of data perfectly matches the predetermined rules programmed into the application,” he warned.
However, with the increasing pressure on court dockets, any time- or cost-saving measures need to be considered. It is important, too, to consider how AI could affect people’s lives.
“Every automated recommendation should be reviewed by a qualified judge to verify the outcome. Then their recommendation needs to be fed back into that system to allow it to become more proficient at rendering appropriate decisions,” said Safko.
“We can’t risk people’s lives on automated apps that save money,” he noted.
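The human-in-the-loop process Safko describes has a simple shape: the system recommends, a judge confirms or corrects, and the verified decision is fed back so future recommendations improve. A minimal sketch, with an entirely hypothetical “model” that is nothing more than counts of judge-verified outcomes per case type:

```python
# Toy sketch of the judge-review feedback loop described above: the
# system recommends, a judge verifies, and the verified decision is
# fed back. The "model" is just per-category counts -- illustrative only.

from collections import Counter

class ReviewedRecommender:
    def __init__(self):
        self.outcomes = Counter()  # (case_type, decision) -> verified count

    def recommend(self, case_type):
        """Recommend the most common judge-verified decision, if any."""
        history = {k: v for k, v in self.outcomes.items() if k[0] == case_type}
        if not history:
            return None            # no verified history: defer to the judge
        return max(history, key=history.get)[1]

    def record_review(self, case_type, judge_decision):
        """Feed the judge's verified decision back into the system."""
        self.outcomes[(case_type, judge_decision)] += 1

r = ReviewedRecommender()
r.record_review("traffic", "fine")
r.record_review("traffic", "fine")
r.record_review("traffic", "traffic school")
print(r.recommend("traffic"))  # → "fine", the most common verified outcome
```

The key property is that no recommendation is issued without prior human-verified outcomes, and every new judgment refines the next recommendation — the loop Safko argues for.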
“Even the Chief Justice of our Supreme Court, John Roberts, is concerned about how AI is affecting the U.S. legal system,” Safko explained. “When asked about AI in our legal system, he said ‘it’s a day that’s here, and it’s putting a significant strain on how the judiciary goes about doing things.’”