Inside LA's $300,000 Gamble: Can an AI "Sous Chef" Fix America's Most Overburdened Court System?

Six Los Angeles County judges are now using artificial intelligence to help write their rulings. The pilot raises urgent questions about efficiency, fairness, and what happens when algorithms enter the chambers of the nation's largest trial court.

The Pilot: What It Is and What It Does

In February 2026, the Superior Court of Los Angeles County quietly launched one of the most consequential experiments in American judicial history. Six civil court judges were given access to Learned Hand, an AI platform purpose-built for the judiciary, capable of rapidly distilling hundreds of pages of legal motions, analyzing a judge's writing style, and drafting tentative rulings [1][2].

The tool takes its name from the legendary federal judge Billings Learned Hand; the company behind it was founded in 2024 by Shlomo Klapper, a former attorney and federal law clerk. The pilot contract is valued at roughly $300,000 and runs through early 2027 [1][3]. Learned Hand is not a general-purpose chatbot; it is engineered specifically for judicial work and includes a proprietary fact-checking system called "Deep Verify," which interrogates every sentence of a generated order to ensure factual claims match the cited case law [1][3].

The pilot is limited to civil matters—primarily motions for summary judgment and class-action settlement approvals, with potential limited future applications for postconviction relief. Criminal cases, both felonies and misdemeanors, and juvenile proceedings are explicitly excluded from the current phase [1][2].

Klapper has described the tool as a "judicial sous chef"—a support system, not a replacement. "There is no reason to fear that any technology company on earth, much less my own, should be making consequential decisions for the public," he told reporters [1].

The Crisis That Created the Opening

The pilot did not emerge in a vacuum. It arrived at a moment of acute institutional stress for California's court system—and for the Los Angeles Superior Court in particular.

California's trial courts processed roughly 5.9 million cases in fiscal year 2018–19. That figure plummeted during the pandemic, dropping to approximately 4.4 million in FY 2021–22, before partially recovering to 4.5 million in FY 2022–23 and surging to over 5.3 million in FY 2024–25 [4][5]. But the filings that stacked up during the pandemic years never went away. They compounded.

The numbers are staggering. In 2021, pending criminal cases across California, New York, Florida, and Michigan alone neared 1.3 million [6]. In Los Angeles, cases fail to proceed on their originally scheduled dates 85 percent of the time—a rate attorneys say is unmatched anywhere else in the country [7]. Meanwhile, the court recently discovered a backlog of approximately 464,000 unreported criminal case dispositions spanning six decades, caused by failures in its legacy case management system [8].

[Chart: California Superior Court Filings by Fiscal Year]

The root causes are structural. The fiscal year 2024–25 state budget required a $97 million reduction to California's trial courts [9]. In LA County, this translated to an estimated $30.3 million cut, plus an additional $3.9 million in reductions to state-funded programs [10]. The court responded with a Voluntary Separation Incentive Program, offering $35,000 to full-time employees with five or more years of service to resign—effectively paying experienced staff to leave [10].

"The cuts to the state's trial courts are concerning and consequential," the Presiding Judge said at the time, warning that reduced staffing would diminish the court's ability to provide "timely and efficient access to justice" [10].

For 2025–26, the Governor's budget allocated $5.5 billion to the judicial branch, including an $82 million ongoing augmentation to trial court operations—partly to restore the prior year's cut [11]. But attorneys who practice in Los Angeles say the damage is already done. Judge shortages persist. "There are never enough judges in L.A. County, but the judges in the county work extremely long and hard hours to accommodate," one court official noted [7].

How LA Compares: A National Backlog Epidemic

Los Angeles is not alone. Court backlogs have become a defining crisis of the American judicial system.

CBS News analyzed data from courts and district attorneys' offices in more than a dozen major cities and found that pending criminal cases in those jurisdictions jumped from 383,879 in 2019 to 546,727 in 2021—a 42 percent increase [6]. In New York City, over 49,000 criminal cases remain pending, and murder cases now take an average of 27 months to resolve—37 percent longer than in 2019 [12]. Over 8,000 convicted individuals in New York were awaiting sentencing at one point [6].

In Cook County (Chicago), as of late 2020, more than 1,000 people on house arrest and over 2,200 in jail had been waiting over a year while still presumed innocent [13]. In San Antonio, a moratorium on in-person criminal jury trials during the pandemic drove a 67 percent increase in pending felony cases, to roughly 9,500 [6].

These delays carry profound human costs. In San Jose, a murder suspect arrested in 2018 endured 20 trial postponements over nearly five years. In San Francisco, a woman spent two and a half years in jail before her case was dismissed for insufficient evidence [6].

[Chart: U.S. State & Local Government Employment (Thousands). Source: Bureau of Labor Statistics, series CES9092000001; data as of Mar 19, 2026]

The AI Landscape: Learned Hand Is Not Alone

Los Angeles's pilot places it within a rapidly expanding national movement. Learned Hand is already operational in 10 states, including a contract with the Michigan Supreme Court, which began using the software in summer 2025 to review applications for permission to appeal in both civil and criminal cases [3][14].

Other jurisdictions are pursuing different AI strategies. A federal judge in the Western District of Texas uses generative AI to summarize cases, identify key players, generate timelines, and prepare questions for attorneys [15]. Florida's Eleventh Judicial Circuit launched a "Drive Legal" program using AI to assist self-represented litigants [15]. New York City's Housing Court Answers deployed AI chatbots to help tenants navigate eviction and housing law questions [15].

Perhaps the most striking research comes from New York City pretrial data: researchers found that machine learning predictions could increase accuracy in identifying high-risk defendants by 25 percent, with one scenario showing a 40 percent reduction in pretrial jailing with no increase in crime rate [15].

Yet the track record is mixed. A federal judge in New Jersey had to reissue an order riddled with AI-generated errors. A judge in Mississippi issued an order containing apparent AI hallucinations that went initially undetected [15]. In California, defense attorneys identified AI-generated "hallucinations"—fabricated case citations—in prosecutor filings from the Nevada County District Attorney's Office, triggering a petition to the California Supreme Court [16].

The Anchoring Problem: When AI Writes First, Who Really Decides?

The most incisive criticism of the LA pilot has come from an unexpected quarter: Los Angeles County District Attorney Nathan Hochman, who warned that AI-generated draft rulings could unconsciously bias judges before they conduct independent analysis [1][2].

This concern—known in behavioral science as anchoring bias—is central to the debate. An anonymous LA County judge articulated it plainly: once an AI recommendation exists, it becomes "a reference point" that shapes all subsequent decision-making [2]. The question is not whether judges will rubber-stamp AI outputs, but whether the mere existence of a machine-generated draft subtly constrains the range of judicial reasoning.

Current safeguards require judges to "review and edit the draft before adopting tentative rulings" [1]. But critically, no current rules require judges to disclose that Learned Hand was used in producing a ruling [2]. A litigant who receives an adverse tentative ruling has no way of knowing whether it originated from the judge's independent analysis, from a law clerk's research, or from an AI system's algorithmic processing of case law.

This transparency gap is particularly troubling given the constitutional stakes. As a Duke Law analysis noted, "judicial legitimacy depends on the public's confidence that a judge's decisions are reasoned, ethical, and explainable" [17]. When AI developers and vendors—who are not subject to judicial canons or disciplinary oversight—shape decisions that affect liberty and rights, accountability becomes diffuse [17].

The Constitutional Minefield

The ACLU has called for a moratorium on algorithmic risk assessment tools in the justice system until independent audits can demonstrate fairness and reliability [18]. Their core argument: AI systems trained on historical criminal justice data inevitably encode the biases present in that data—including the disproportionate policing and prosecution of Black communities.

The concern is not theoretical. ProPublica's landmark 2016 investigation found that the COMPAS risk assessment algorithm incorrectly classified Black defendants as high-risk at nearly twice the rate of white defendants [18]. In 2025, The Washington Post reported that 15 police departments across 12 states were using facial recognition systems to make arrests without direct evidence, raising Fourth and Fourteenth Amendment concerns [19].

For the LA pilot specifically, the civil-only scope mitigates some of the most acute due process risks. Summary judgment motions and class-action settlements do not involve the same liberty interests as criminal proceedings. But the postconviction relief provision—and the stated possibility of future expansion—means these constitutional questions are not hypothetical.

Twenty-two scholars, lawyers, and criminal justice advocates have already filed a supporting brief with the California Supreme Court arguing that AI errors in court filings "represent an existential threat to the due process rights of criminal defendants" [16].

The Uncomfortable Counterargument

Yet there is a case that critics rarely confront directly: if the current system is already failing, how high is the bar for AI to clear?

When cases don't proceed on schedule 85 percent of the time in Los Angeles [7]; when a woman spends 30 months in jail before her case is dropped [6]; when 464,000 criminal dispositions go unreported for decades [8]; when state budgets force courts to pay experienced staff to leave [10]—the status quo is not a neutral baseline. It is a system producing its own form of systemic injustice, one measured in years of pretrial detention, foregone livelihoods, and eroded trust.

A Brooklyn pilot project in 2019 that imposed formal timelines and structured conferences—without any AI—achieved an 11 percent increase in case resolutions [6]. If modest procedural reforms can move the needle, what could well-audited AI tools accomplish?

The New York City pretrial research suggesting a 40 percent reduction in pretrial jailing with no increase in crime offers one data point [15]. But the honest answer is that no one yet knows. The LA pilot, and programs like it, will generate the evidence that either validates or undermines these claims.

What Comes Next

The $300,000 pilot is, by design, modest. Six judges. Civil cases only. A tool that drafts, not decides. But its implications extend far beyond the Stanley Mosk Courthouse.

Over 1,000 AI-related bills were introduced across U.S. states in 2025 alone [19]. The California Judicial Council has issued a Model Policy for Use of Generative Artificial Intelligence, addressing confidentiality, bias, safety, and accountability [20]. The question is no longer whether AI will enter American courtrooms—it already has, in at least 10 states and counting.

The real questions are about governance: Who audits these systems? What error rates are acceptable? Must AI assistance be disclosed to litigants? Can defendants challenge algorithmic reasoning under the Confrontation Clause? And who bears responsibility when an AI-assisted ruling is wrong—the judge, the developer, or the institution that adopted the tool?

Los Angeles, with its crushing caseloads and chronic underfunding, may be exactly the right place to test these questions. But the stakes demand more than a quiet pilot and a $300,000 contract. They demand transparency, rigorous evaluation, and a willingness to shut the experiment down if the evidence warrants it.

The AI sous chef is in the kitchen. The question is whether anyone is checking what it's cooking.

Sources (20)

[1] AI pilot program in L.A. County courts will help judges craft rulings in some cases (edinburgpost.com)
Six L.A. County civil court judges now using Learned Hand AI to summarize motions and draft tentative rulings in a $300,000 pilot program running through early 2027.

[2] L.A. County Courts Test AI Tool to Help Judges Write Rulings (kfiam640.iheart.com)
DA Nathan Hochman warns AI drafts could bias judges; anonymous judge says AI becomes a psychological reference point. No disclosure rules currently exist for AI-assisted rulings.

[3] The Michigan Supreme Court Contracts with Learned Hand for Purpose-Built Judicial AI (natlawreview.com)
Michigan Supreme Court began using Learned Hand in summer 2025 to review applications for permission to appeal in civil and criminal cases.

[4] Court Statistics - Judicial Branch of California (courts.ca.gov)
Official court statistics from the Judicial Council of California showing statewide caseload trends from 2013-14 through 2024-25.

[5] 2026 Court Statistics Report - Statewide Caseload Trends 2015-16 Through 2024-25 (courts.ca.gov)
In FY 2024-25, over 5.3 million cases were filed statewide in California's superior courts, up from 4.5 million in FY 2022-23.

[6] Growing backlog of court cases delays justice for crime victims and the accused (cbsnews.com)
Pending criminal cases jumped from 383,879 in 2019 to 546,727 in 2021 across major US cities. California, New York, Florida, and Michigan combined for nearly 1.3 million pending cases.

[7] Attorneys welcome LA Superior Court reforms but warn backlogs persist (dailyjournal.com)
Cases fail to proceed on their originally scheduled dates 85% of the time in LA County, a rate attorneys say is unmatched elsewhere in the country.

[8] Superior Court of Los Angeles County - Unreported Arrest Disposition Reports (lacourt.org)
LA Superior Court identified a backlog of roughly 464,000 unreported criminal case dispositions spanning 60 years due to legacy case management system failures.

[9] Judicial Council Allocates Funding to Trial Courts With $97 Million Required Cut (newsroom.courts.ca.gov)
The Judicial Council approved funding allocations to trial courts including a $97 million reduction as required by the fiscal year 2024-25 budget.

[10] LA County Superior Courts to Offer $35,000 to Full-Time Workers to Leave Jobs (pasadenanow.com)
LA Superior Court offered $35,000 voluntary separation incentives amid $30.3 million in state funding cuts, with the Presiding Judge calling the cuts "concerning and consequential."

[11] The 2025-26 Budget: Judicial Branch (lao.ca.gov)
Governor's 2025-26 budget includes $5.5 billion for the judicial branch with an $82 million ongoing augmentation to trial court operations funding.

[12] How Long Does a Criminal Case Take In New York? (tsiglerlaw.com)
Murder cases in New York City now take an average of 27 months to resolve, 37% longer than in 2019.

[13] Waiting for Justice: Cook County Criminal Court Backlog (chicagoappleseed.org)
Over 1,000 people on house arrest and over 2,200 in Cook County jail had been waiting more than a year for trial while presumed innocent.

[14] Learned Hand - AI for the Judiciary (learned-hand.ai)
Official website for Learned Hand, the first generative AI platform purpose-built for courts, now operational in 10 states.

[15] Meet the early-adopter judges using AI (technologyreview.com)
Federal judges in Texas and other jurisdictions using AI for case summarization, timeline generation, and attorney question preparation.

[16] California Prosecutors' AI Mistakes Raise Concerns about Due Process Rights (davisvanguard.org)
Defense attorneys identified AI-generated hallucinations in California prosecutor filings; 22 scholars filed brief with state Supreme Court calling AI errors "an existential threat to due process."

[17] Judging AI: How U.S. Judges Can Harness Generative AI Without Compromising Justice (judicature.duke.edu)
Judicial legitimacy depends on public confidence that decisions are reasoned, ethical, and explainable. AI developers are not subject to judicial canons or disciplinary oversight.

[18] Accountability in Artificial Intelligence - ACLU (aclu.org)
ACLU calls for moratorium on algorithmic risk assessment tools until independent audits prove fairness; warns AI trained on biased data embeds discrimination.

[19] AI's Complex Role in Criminal Law: Data, Discretion, and Due Process (americanbar.org)
Over 1,000 AI-related bills introduced across states in 2025. COMPAS algorithm found to incorrectly classify Black defendants as high-risk at nearly twice the rate of white defendants.

[20] California Judicial Council - Model Policy for Use of Generative AI (courts.ca.gov)
California Judicial Council issued Model Policy addressing confidentiality, privacy, bias, safety, and security risks of generative AI in court-related work.