I’m always surprised when I encounter litigators who dismiss litigation analytics as a passing fad.  In fact, as the reprint post below shows, it is a century-long academic enterprise that has produced many hundreds of studies, tens of thousands of pages of analysis, demonstrating the value of data analytics in understanding how appellate decisions are actually made.  Here’s the second in our reprint series, appearing both here and at the California blog:

The application of data analytic techniques to the study of judicial decision making arguably begins with political scientist Charles Grove Haines’ 1922 article in the Illinois Law Review, General Observations on the Effects of Personal, Political, and Economic Influences in the Decisions of Judges. (17 Ill. L. Rev. 96 (1922)). Reviewing the records of New York City magistrate courts, Haines noted that while 17,075 people had been charged with public intoxication in 1916 – 92% of whom had been convicted – one judge discharged just one of 566 cases, another 18%, and still another fully 54%. Haines argued from this data that results in the magistrate courts were reflecting to some degree the “temperament . . . personality . . . education, environment, and personal traits of the magistrates.”

Two decades later, another political scientist, C. Herman Pritchett, published The Roosevelt Court: A Study in Judicial Politics and Values, 1937-1947. Pritchett became interested in the work of the Supreme Court when he noticed that the Justices’ dissent rate had sharply increased in the late 1930s. He argued that the rising dissent rate necessarily weighed against the formalist view that “the law” was an objective reality which appellate judges merely found and declared. In The Roosevelt Court, Pritchett published a series of charts showing how often various combinations of Justices had voted together in different types of cases (a precursor of some of the analysis we’ll publish later this year in the California Supreme Court Review).

Another landmark in the data analytic literature, the U.S. Supreme Court Database, traces its beginnings to the work of Professor Harold J. Spaeth about three decades ago. Professor Spaeth undertook to create a database which classified every vote by a Supreme Court Justice in every argued case for the past five decades. In the years that followed, Spaeth updated and expanded his database, and additional professors joined the groundbreaking effort. Today, thanks to the work of Professors Harold Spaeth, Jeffrey Segal, Lee Epstein and Sarah Benesh, the database contains 247 data points for every decision the U.S. Supreme Court has ever rendered – dating back to August 3, 1791.  The Supreme Court Database is a foundational tool utilized by nearly all empirical studies of U.S. Supreme Court decision making.

Not long after the beginnings of the Supreme Court Database, Professors Spaeth and Segal also wrote one of the landmarks of data-driven empirical research into appellate decision making: The Supreme Court and the Attitudinal Model. There, they proposed a model arguing that a judge’s personal characteristics (ideology, background, gender, and so on) and so-called “panel effects” (the impact of having judges of divergent backgrounds deciding cases together as a single, institutional decision maker) explain a great deal about appellate decision making.

The data analytic approach began to attract widespread notice in the appellate bar in 2013, with the publication of Judge Richard A. Posner and Professors Lee Epstein and William M. Landes’ The Behavior of Federal Judges: A Theoretical & Empirical Study of Rational Choice. Drawing upon arguments developed in Judge Posner’s 2008 book How Judges Think, Posner, Epstein and Landes applied various regression techniques to a theory of judicial decision making with its roots in microeconomic theory, discussing a wide variety of issues from the academic literature.

Today, there is an enormous academic literature studying the work of the U.S. Supreme Court and the Circuits from a data analytic perspective on a variety of different issues, including case selection, opinion assignment, dissent aversion, panel effects, and the impact of ideology, race and gender. That literature has led to two excellent anthologies just in the last few years: The Oxford Handbook of U.S. Judicial Behavior, edited by Lee Epstein and Stefanie A. Lindquist, and The Routledge Handbook of Judicial Behavior, edited by Robert M. Howard and Kirk A. Randazzo.  The state Supreme Courts have attracted somewhat less study than the federal appellate courts, but that has begun changing in recent years, and similar anthologies for the state courts seem like only a matter of time.

Image courtesy of Flickr by Jamison Wieser (no changes).

The Illinois Supreme Court Review recently marked its sixth anniversary.  In April, the California Supreme Court Review turns five.

So I thought it was time for a first: cross-posted reprints from the earliest days of the blogs.  These were my early attempts to provide context for the work and to answer the question I often heard in those days: “Interesting, but what difference does it make?”

So for the next 2-3 weeks, we’ll be reprinting those context posts – with minimal revisions – both here and at the sister California blog.  For readers who follow both blogs, be warned – the two posts reprinted each week will be largely identical (and don’t worry – it’ll be easy to tell when we resume our regularly scheduled programming . . .).  So here we go:

One of the primary reasons appellate lawyering is a specialty is that appellate lawyers must persuade a collective, institutional decision maker. An appellate panel isn’t like a jury. The members of a jury come together for the first time for a particular case and part forever when it’s over. Members of an appellate panel have generally been on the Court for months if not years, and will be there for years after a particular case is over. Members of a jury don’t share anything akin to the “law of the Circuit” or the “law of this Court” as a collective enterprise built over a span of years. And although there has historically been considerable pressure on jurors to reach unanimity (less so in recent years on the civil side), they are almost always trying to reach a binary decision: yes or no, one side wins, one side loses. An appellate panel, on the other hand, is attempting to reach unanimity on a collective, reasoned, written argument. Decision making by panel rather than by individual judges has all kinds of potential effects on the outcome, and therefore on appellate lawyers’ task of persuasion – from making judges more reluctant to dissent from a decision they disagree with, to causing judges to vote in a more (or less) liberal or conservative direction than they otherwise would because of the panel’s deliberations.

Over the past few generations, political scientists, law professors, economists and statisticians have developed a host of tools for better understanding the dynamics of group decision making. These include game theory, organization theory, behavioral microeconomics, opinion mining and data analytics. Some researchers have used game theory to develop important insights about everything from the inner workings of the U.S. Supreme Court[1] to why Federal Circuits follow Supreme Court precedent.[2] Others have used traditional labor theory in an attempt to develop a unified theory of judicial behavior.[3] With the rise of widely available, massive computerized databases of appellate case law, the fastest-growing and most widely varied area of research applies sophisticated statistical and “big data” techniques to understanding the law.

Data analytics is revolutionizing litigation. Several companies now offer such services at the trial level. Lex Machina (acquired in 2015 by LexisNexis), Ravel Law (acquired two years later, also by LexisNexis) and Premonition each offer detailed analytics about trial judges, courts and case types based on databases of millions of pages of case information. ALM has also expanded its judicial profile services to increase its focus on judge analytics.

In 2015, I started this blog to bring rigorous, law-review style empirical research founded on data analytic techniques to the study of appellate decision making. A year later, I expanded the project to the California Supreme Court Review. Both blogs are based on massive databases consisting of 125-150 data points (depending on the year) drawn from every case, civil and criminal, decided by the Illinois and California Supreme Courts, respectively.

Why?  Simple.  Litigators, no matter whether they’re usually in the appellate or trial courts, frequently find themselves predicting the future.  This jurisdiction or this judge tends to be pro-plaintiff or pro-defendant.  Juries in this county tend to return excessive verdicts, or they don’t.  Trial or appellate litigation in this jurisdiction takes . . . this long.  What does it mean that the state Supreme Court just granted review?  Or what does it mean that the Supreme Court asked me way more questions at oral argument than they did my opponent?

Every one of these questions has a data-driven answer.  Not just in Illinois and California, but in every jurisdiction in the country.  Sometimes the data confirms the traditional wisdom – and sometimes it proves that the traditional wisdom is dead wrong.

Want a more high-flown answer?  Try this one from Posner, Epstein and Landes’ The Behavior of Federal Judges:

The better that judges are understood, the more effective lawyers will be both in litigating cases and, as important, in predicting the outcome of cases, thus enabling litigation to be avoided or cases settled at an early stage.

So that’s what we do here.  For everyone who’s been with us for most or all of the six years since we started, thank you.  And for first-time visitors: we hope you’ll join us.

Image courtesy of Flickr by Jim Bowen (no changes).


[1] James R. Rogers, Roy B. Flemming, and Jon R. Bond, Institutional Games and the U.S. Supreme Court (2006).

[2] Jonathan P. Kastellec, “Panel Composition and Judicial Compliance on the U.S. Courts of Appeals,” The Journal of Law, Economics & Organization, 23(2): 421-41.

[3] Richard A. Posner, Lee Epstein, and William M. Landes, The Behavior of Federal Judges: A Theoretical and Empirical Study of Rational Choice (2013).

Yesterday, we showed that Justice Garman has voted with the minority in 6.84% of her civil cases since joining the Court, slightly below Chief Justice Burke’s percentage.  Justice Theis’ percentage is almost identical: she has voted with the minority in 6.83% of her civil cases since joining the Court in 2010.  There are no strong time trends in her data.  She was above baseline in 2012 (7.69%) and 2014 (11.11%), but below in 2013 (6.06%).  She was below in 2016 (3.57%) and 2018 (4.55%), but above it in 2017 (11.54%) and 2019 (11.76%).  Justice Theis was well below her career percentage once again in 2020, voting with the minority in only 3.13% of her civil cases.

Join us back here next week as we examine the data for two more members of the Court.

Image courtesy of Flickr by artistmac (no changes).


Last time, we looked at how often Chief Justice Anne Burke voted with the minority in civil cases – a proxy for how closely in sync with the philosophy of the other Justices she has been throughout her career.  Today, we’re addressing the same number for Justice Garman.

The Chief Justice has been in the minority in 6.91% of civil cases since joining the Court in 2006.  Justice Garman’s overall percentage is nearly identical: 6.84%.  Looking at time trends, she was below that baseline from 2004 through 2011, except in 2006 (11.11%).  Between 2014 and 2017, she was above baseline in three of four years.  After a one-year dip, she was again above baseline in 2019 (8.82%) and 2020 (9.38%).

Join us back here tomorrow as we address the minority percentage for Justice Theis.

Image courtesy of Flickr by William Murphy (no changes).



With this post, we’re addressing a new question in our ongoing review of the Justices’ voting records: how often each Justice is in the minority.  The question serves as an indication of how closely in sync with the majority of the Court an individual Justice is philosophically, and during a Justice’s term as Chief Justice, it offers some indication of how much influence the Justice has over her or his colleagues.

Since joining the Court in 2006, Chief Justice Anne Burke has voted in 463 civil cases.  She has been in the minority in only 32 of those cases – 6.91% of the total.  Before becoming Chief Justice, Chief Justice Burke had voted in the minority in 7.11% of civil cases.  Since taking the center seat, she has been in the minority in only 2 of 41 civil cases – 4.87% of the total.  This is reflected in the year-by-year data below.  She voted with the minority in 15.38% of civil cases during 2017, but only 4.55% in 2018, 0% in 2019 and 6.25% in 2020.
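The minority-rate arithmetic used throughout this series is straightforward. Here is a minimal Python sketch, run on hypothetical vote records rather than the blogs’ actual databases (the sample numbers and field layout are invented for illustration):

```python
# Minimal sketch of the minority-rate calculation, on hypothetical records.
# Each record is (year, in_minority) for one Justice's vote in a civil case;
# the real databases carry 125-150 data points per case.
from collections import defaultdict

votes = [
    (2017, True), (2017, False), (2018, False),
    (2019, False), (2020, False), (2020, True),
]

def minority_rate(records):
    """Percentage of votes cast with the minority."""
    if not records:
        return 0.0
    return 100.0 * sum(1 for _, in_minority in records if in_minority) / len(records)

def yearly_rates(records):
    """Minority percentage by year, for spotting time trends against the baseline."""
    by_year = defaultdict(list)
    for year, in_minority in records:
        by_year[year].append((year, in_minority))
    return {year: round(minority_rate(recs), 2) for year, recs in sorted(by_year.items())}

print(round(minority_rate(votes), 2))  # career rate across all sample votes
print(yearly_rates(votes))             # year-by-year breakdown
```

Comparing each year’s figure against the career rate is how the “above baseline” and “below baseline” observations in these posts are made.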

Join us back here next week as we continue our review of the Justices’ minority percentages in civil cases.

Image courtesy of Flickr by Kate Brady (no changes).

This time, we’re beginning our review of the voting record of Justice Michael Burke, who took his seat on March 1, 2020, replacing the retired Justice Robert R. Thomas.  Previously, Justice Burke had served for twelve years as a Justice of the Second District Appellate Court.

During 2020, Justice Michael Burke voted in 19 civil cases.  He voted to affirm in 10 of those cases (52.63%) and to reverse in 7 (36.84%).  He cast one split vote to affirm in part and reverse in part, and one vote to deny.

Join us back here next time when we discuss how often Chief Justice Anne Burke is in the minority in civil cases.

Image courtesy of Flickr by paul_p! (no changes).

Today, we’re examining the voting record of one of the newer members of the Supreme Court, Associate Justice P. Scott Neville, Jr.  Justice Neville took his seat on June 15, 2018, succeeding Justice Charles Freeman.  Prior to joining the Court, Justice Neville sat on the First District Appellate Court from 2004 to 2018. During his tenure, he served as Presiding Justice of the Second, Third and Fourth Divisions.

Since joining the Court and through the end of 2020, Justice Neville has voted in 68 civil cases.  His votes to affirm and reverse are nearly evenly split.  He has voted to affirm in 27 cases (39.71%) and to reverse in 29 cases (42.65%).  Justice Neville has cast split votes in 9 cases.  Justice Neville’s remaining three votes are evenly split – one each to deny, vacate and for “other.”

Join us back here next week as we continue our review of the individual Justices’ voting records.

Image courtesy of Flickr by Gary Todd (no changes).


Today, we’re beginning our examination of the voting record of Chief Justice Anne M. Burke.  Chief Justice Burke took her seat on July 6, 2006.  Through the end of 2020, she had voted in 463 civil cases.

It’s reasonable to suppose that the distribution of a Justice’s votes between affirmance and reversal might tell us something about what a vote to hear a particular case from that Justice might mean.  Does she see the Court’s function as reining in one or more Appellate Courts?  Does a vote from that Justice to allow a petition for leave to appeal suggest that she is likely to vote to reverse?

Justice Garman’s votes were almost perfectly split: 40.13% to affirm, 40% to reverse.  Justice Theis has been a bit more inclined to reverse: 37.58% to affirm, 42.55% to reverse.

The Chief Justice has been significantly more inclined to reverse than Justice Theis.  She has cast 163 votes to affirm in civil cases – 35.21% of her total – and 204 votes to reverse, or 44.06%.  There are no particular time trends in her voting patterns.  She has cast 70 split votes in civil cases – affirm in part and reverse/modify/vacate in part.  She has cast 11 votes to vacate, 9 to deny, 5 “other” and 1 to grant.

Join us back here tomorrow as we review the voting record of one of the newer Justices, P. Scott Neville.

Image courtesy of Flickr by Dan (no changes).


Today, we’re taking a look at Justice Theis’ voting record since joining the Court in late October 2010.

In her just over ten years on the Court, Justice Theis has voted in 322 civil cases.  As with Justice Garman, the distribution of those votes reflects the Court’s diminishing docket.  She cast 177 votes in her first five years on the Court (2011-2015) and 142 between 2016 and 2020.

Justice Theis has cast 121 votes to affirm and 137 to reverse.  As with Justice Garman’s numbers, there is a time trend.  Between 2011 and 2015, Justice Theis voted to reverse in 16 more cases than to affirm.  Since then, her votes to affirm and reverse have been evenly split.  (The time trend in Justice Garman’s numbers ran in the opposite direction: predominantly affirm in her early years, predominantly reverse more recently.)

Justice Theis has cast 46 split votes to affirm in part and reverse/vacate/modify in part – a slightly higher percentage of her total than Justice Garman’s.  She has also voted to deny in 9 cases, to vacate in 4, and “other” in 5.

Join us back here next week as our review of the Justices’ voting records continues.

Image courtesy of Pixabay by islandworks (no changes).

This week, we’re taking the first steps in a more detailed analysis, one Justice at a time, of the Justices’ voting records.  First up is Justice Garman’s record in civil cases.

From joining the Court in 2001 through the end of 2020, Justice Garman voted in 760 civil cases.  The distribution of those votes reflects the downward trend in the Court’s docket: Justice Garman cast 435 votes in civil cases between 2001 and 2010, and only 325 from 2011 to 2020.

Those votes are almost evenly split between affirmances and reversals – 305 votes to affirm, 304 to reverse.  Curiously, the data shows a clear time trend: between 2001 and the end of 2008, Justice Garman voted to affirm in 33 more cases than to reverse.  From 2009 through 2020, that trend flipped, with votes to reverse predominating.  Justice Garman has cast split votes – affirm in part and reverse/modify/vacate in part – in 106 civil cases.  The rest of her votes are scattered: 19 votes to deny, 17 to vacate, 1 to grant and 8 “other” – predominantly certified questions from the Seventh Circuit.  The yearly data is reported in Table 1703.
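For readers curious about the mechanics, here is a hedged Python sketch of the disposition tallies and period comparisons described in these posts; the records and year windows below are hypothetical stand-ins for the actual database:

```python
# Hedged sketch of the affirm/reverse tallies; the records and year windows
# are hypothetical, and the disposition labels mirror the categories used in
# these posts (affirm, reverse, split, deny, vacate, grant, other).
from collections import Counter

votes = [
    (2003, "affirm"), (2005, "affirm"), (2007, "reverse"),
    (2012, "reverse"), (2015, "reverse"), (2019, "split"),
]

def disposition_counts(records, start=None, end=None):
    """Tally dispositions, optionally restricted to a span of years."""
    return Counter(
        disposition for year, disposition in records
        if (start is None or year >= start) and (end is None or year <= end)
    )

early = disposition_counts(votes, end=2008)    # early-period window
late = disposition_counts(votes, start=2009)   # late-period window
print(early["affirm"] - early["reverse"])      # affirm margin, early period
print(late["affirm"] - late["reverse"])        # affirm margin, late period
```

Comparing the affirm margin across two windows is how a flipped time trend, like the one described above, shows up in the numbers.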

Join us back here next time as we begin our review of the voting record of Justice Mary Jane Theis.

Image courtesy of Pixabay by bk_advtravlr (no changes).