Keeping Up to Data: September 2024


September 2024 / Episode 6 / Under 20 minutes

Crunching the Numbers on the New LSAT Format

Welcome to the Keeping Up to Data℠ podcast, a space in which we discuss, analyze, and contextualize trends and perspectives in the current law school admission cycle by taking a deeper dive into the most up-to-date data and making sense of the complicated world of legal education.

 


SUSAN KRINSKY: Welcome back to Keeping Up to Data℠. I’m Susan Krinsky, interim president and CEO at LSAC®, with my regular update on the application cycle and an interview with Anna Topczewski, LSAC’s director of assessment sciences, who will talk with us about the results of the August LSAT® administration — the first test of the 2024-25 testing year, and the first test without an Analytical Reasoning section. But first, some numbers.

We are in that part of the year where one admission cycle has ended and the next cycle is just getting started. As we close out the 2023-24 admission cycle, the final numbers show strong year-over-year growth in the applicant pool and continued gains in the diversity of that applicant pool. Applicants were up 5.7% over last year, with 64,291 applicants coming through. And applications were up 2.6%, with almost 428,000 applications submitted. Applicants of color represented 48% of this year’s applicant pool, compared to 47.1% last year and 44.1% four years ago. This year’s applicant pool included 57% women, up slightly from 56.6% last year and 54.9% four years ago.

It’s too soon to know how this diversity in the applicant pool will translate into the entering classes, but we’ll know more when the ABA Section of Legal Education and Admissions to the Bar releases the Standard 509 data in mid-December, and at that time, we’ll be prepared to provide context for those numbers.

In the meantime, I commend to you a very recently published report from our strategic research team, LSAC’s 2024 Knowledge Report on 2023-24 Test Takers. This is a very rich analysis of, among other things, when prospective law students first think about attending law school; what motivates them to attend law school; significantly, what may prevent them from pursuing legal education; and how they anticipate they will be viewed and valued in law school — how they view belonging. Our research team will be publishing another Knowledge Report later this year, this one focusing on the 2024 incoming class — important insights into how we can better support individuals in their journeys from applicants to admittees.

Looking ahead, the 2024-25 LSAT testing cycle is off to a strong start. Registrations for this year’s August and September tests were up by 18% over August and September 2023. Similarly, October 2024 registrations are running well ahead of where October 2023 registrations stood at this point last year. So, while it’s too soon to speculate, the early signs point to another robust applicant pool and a competitive admission cycle again this year. As always, you can find the latest applicant trends and numbers on our website, which is updated daily, 365 days a year.
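For readers who want to check the arithmetic behind the applicant and application growth figures quoted earlier in this update, here is a minimal sketch in Python. The prior-year totals are not stated in this episode; they are back-calculated from the rounded percentages, so treat them as rough approximations.

    # Rough sanity check implied by the growth figures quoted above.
    # The prior-year totals are not stated in this episode; they are
    # estimated from the rounded percentages, so treat them as approximate.

    applicants_2024 = 64_291       # final 2023-24 applicant count
    applicant_growth = 0.057       # "up 5.7% over last year"

    applications_2024 = 428_000    # "almost 428,000 applications"
    application_growth = 0.026     # "up 2.6%"

    print(f"Implied 2022-23 applicants:   ~{applicants_2024 / (1 + applicant_growth):,.0f}")
    print(f"Implied 2022-23 applications: ~{applications_2024 / (1 + application_growth):,.0f}")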

And now, I am very happy to welcome Anna Topczewski to Keeping Up to Data. Anna is director of assessment sciences at LSAC, and I’ve asked her to join us today to talk about the results of the August 2024 LSAT and, in particular, how those results compare to results on the prior tests. As we all know, the August 2024 LSAT was the first administration of our new test format without an Analytical Reasoning section. Our team of psychometricians conducted all kinds of analyses to ensure that scores for the new format are valid, reliable, and consistent with scores from the previous format. The team also compared August scores from multiple perspectives: first-time test takers and repeat test takers — that is, those who had taken the LSAT recently with an Analytical Reasoning section, and then again this August without that section.

And our data scientists also looked at test takers across other subgroups, including race and ethnicity, gender, and more. By every measure, the August 2024 scores are very much in line with August scores from recent years. We communicated this important news and context to all of our key stakeholders via email and to all interested parties in an August 28 blog post, which you can find on our website.

Anna, welcome. So, with the changes to the test, I know everyone was wondering and speculating about whether we would see any impact on performance on the test and on scores. You’ve spent the last month crunching all the numbers. What did you find?

 

ANNA TOPCZEWSKI: We were just as interested as everyone else. We knew from the historical analysis we outlined when we announced the format change that scores on the new format were likely to be quite similar to scores on the previous one, but even so, we were eager to get our hands on the August test results. After detailed analysis, the scores from the August 2024 test were very much in line with scores from the August 2023 test — in fact, very much in line with scores for almost all recent August tests. If you look at the previous four August tests, this year’s average score was slightly higher than two of those years and slightly lower than the other two — right in the middle.

 

SUSAN: So, I noticed that you’re focusing on the August-to-August comparison. Why is that?

 

ANNA: The makeup of test takers varies over the course of a testing cycle. We see more first-time test takers in April, June, and August, and a higher proportion of repeat test takers in January and February. We also see consistent cyclical variations in scores over the course of the testing year, with slightly higher average scores in June and August, and slightly lower scores in January and February. Those variations have been consistent over many years, even decades, so in order to compare apples to apples, it’s best to compare administrations from the same time of year to one another.

 

SUSAN: Now, one thing about the data: just because average scores are the same or very similar doesn’t mean that the score distributions are the same. A comparison of average scores could mask differences in the distribution — you could see a tighter distribution or a broader one without any change in the averages. Did you see any significant differences in how scores were distributed?

 

ANNA: That is a great question, and the bottom line is that we did not see any significant differences in score distribution between the August 2024 test and the August 2023 test. If you look at individual score levels, the distributions are again very similar. To put it a different way, let’s say you are interested in what proportion of test takers scored at or above a certain score. This August, about 1.4% of test takers scored 175 or above. Last August, it was about 1.5%. This August, about 6.6% of test takers scored 170 or above. Last August, it was about 7.6%. This August, about 17.6% of test takers scored 165 or above. Last August, it was about 18.5%. And so on. When we look at the score distributions from the last several August administrations, they’re quite similar. If you didn’t know which year was which, you really couldn’t tell them apart.
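To make the kind of threshold comparison Anna describes concrete, here is a minimal Python sketch of how a proportion-at-or-above-a-cutoff figure can be computed from a score distribution. The counts below are invented for illustration and are not LSAC data.

    # Illustrative sketch: given a score distribution (score -> count of test
    # takers), compute the share scoring at or above each cutoff. The counts
    # here are made up for illustration; they are not LSAC data.

    def share_at_or_above(score_counts: dict[int, int], cutoff: int) -> float:
        """Return the fraction of test takers scoring `cutoff` or higher."""
        total = sum(score_counts.values())
        at_or_above = sum(n for score, n in score_counts.items() if score >= cutoff)
        return at_or_above / total

    # Hypothetical distributions for two August administrations (score: count).
    august_2023 = {150: 3000, 160: 2500, 165: 1200, 170: 450, 175: 100}
    august_2024 = {150: 3200, 160: 2600, 165: 1250, 170: 430, 175: 95}

    for cutoff in (175, 170, 165):
        pct_2024 = share_at_or_above(august_2024, cutoff)
        pct_2023 = share_at_or_above(august_2023, cutoff)
        print(f"{cutoff}+: {pct_2024:.1%} in 2024 vs. {pct_2023:.1%} in 2023")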

 

SUSAN: What about different demographic groups? Did you see any changes in how different demographic groups performed compared to previous tests that included an Analytical Reasoning section?

 

ANNA: We looked at the data from so many angles: first-time test takers, repeat test takers, women, men, gender-diverse test takers, racial and ethnic groups, and more. Across all of these analyses, the August 2024 scores were within the normal variation we have seen from administration to administration and year to year.

 

SUSAN: Admission leaders pay particular attention to first-time test takers, as they can be a bellwether for the volume of applicants. What did you see with respect to the number of first-time test takers in the August administration?

 

ANNA: Several things. First, August was a very large administration: nearly 22,500 test takers, which was much higher than August 2023. Second, the proportion of first-time test takers was a couple of percentage points higher than normal: 62.8% of test takers this August were first-timers, compared to 60.6% in August 2023. The combination of a larger test-taker population and a higher proportion of first-timers meant that we had 4,000 more first-time test takers entering the process this August than last August.

Now, some of that surge in first-time test takers might be due to the change in the format. June had a slightly higher proportion of repeat test takers, and the August administration had a higher proportion of first-time test takers. That’s not surprising. If somebody had already taken the test with AR and wanted to get one more attempt in, they would’ve tested in June. And if someone hadn’t taken the LSAT yet, they might have waited for the new format in August.

These sorts of variations are very common when testing programs make a change. In fact, the SAT just went from paper to digital, and there were several news reports on how people were either waiting for the new format or getting one last try in with the old format. Despite June having a lower proportion of first-time test takers, June’s overall volume was so high that the number of June first-time test takers was also very high: about 1,500 more this June than last June. So, we will obviously keep watching that figure, but right now it points to an admission cycle with a very large volume of applicants.
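As a quick back-of-the-envelope check on the August figures Anna cites, here is a short Python sketch assuming the rounded numbers quoted above (nearly 22,500 test takers, 62.8% first-timers, and roughly 4,000 more first-timers than last August). The implied August 2023 count is an inference, not a figure stated in the episode.

    # Back-of-the-envelope arithmetic using the rounded figures quoted above;
    # exact administration totals are not given in this conversation.

    aug_2024_total = 22_500            # "nearly 22,500 test takers"
    aug_2024_first_time_share = 0.628  # 62.8% first-time test takers

    first_timers_2024 = aug_2024_total * aug_2024_first_time_share
    print(f"August 2024 first-time test takers: about {first_timers_2024:,.0f}")  # ~14,100

    # With roughly 4,000 more first-timers than August 2023, that implies on
    # the order of 10,000 first-time test takers last August.
    print(f"Implied August 2023 first-timers: about {first_timers_2024 - 4_000:,.0f}")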

 

SUSAN: And what about the performance of first-time test takers on this August’s test as compared to the performance of first-time test takers on last August’s test, which did include an Analytical Reasoning section?

 

ANNA: There is a very interesting data point related to first-time test takers. If you look at first-time test takers from this August and last August, their average scores are virtually identical: a difference of just four one-hundredths of a point. That makes us feel even more confident about the similarity of the August test scores.

 

SUSAN: Some people have claimed that the only way to get a very high score — say, 170 or higher — is to score perfectly on the Analytical Reasoning section, and therefore, the number of very high scores would drop with the change in the test. What does the data tell us?

 

ANNA: Well, the data doesn’t really support that claim. The data tells us that many test takers who scored 170 and above didn’t score perfectly on AR. Historically, we have seen test takers miss 1, 2, 3, or even 4 points on AR and still get into the 170s. The same is true for LR and RC. For the August administration, we are right in line with historical trends on how many people are scoring 170 and above. For August, 6.6% of test takers scored 170 and above. Historically, for August, the percentage has ranged from 4.7% to 7.6% over the last four years. When you look at subgroups, such as gender, race, and ethnicity, the trends are also very stable.

 

SUSAN: Obviously, this is only one test administration. What will you be looking for over the next few test administrations?

 

ANNA: We will keep doing the same detailed analysis that we did for the August test to make sure the scores continue to be within normal ranges and there isn’t anything unusual. And when I say normal ranges, what I mean is that we always see some variation over the course of the testing year and between years, so what we are looking for is whether the results stay consistent with those historical patterns and normal ranges of variation.

The other point I’d like to make to our schools and pre-law advisors is that we publish the distribution of test takers’ scores for every administration a week or two after score release. We used to do it at the end of each testing cycle, but during COVID, we decided to publish interim reports after every administration to provide as much transparency as possible to admissions officers and pre-law advisors about the population of test takers, many of whom will be applying this year or next. We call these interim reports the Preliminary Informational Guide. And, full disclosure, we often call it the PIG internally. “Got to get the PIG out” is quite fun to say.

The PIG for the August test has already been published on our secure website for schools and pre-law advisors. We will continue to publish updates after every test so that schools and pre-law advisors have the insights they need into how people are doing on the LSAT.

 

SUSAN: Anna, thank you so much for this. Is there anything else you’d like our audience to know about the performance of test takers on this newest version of the LSAT?

 

ANNA: The title of LSAC’s blog post on the August administration said it well: “The more things change, the more they stay the same.” The August 2024 administration results were remarkably consistent with previous years, and we’re expecting this to be the case for future administrations as well.

 

SUSAN: Again, thank you for spending time with us today. I know you’re in the midst of analyzing the data from the September test right now. I look forward to checking in with you again throughout this cycle.

 

ANNA: Thank you, Susan. I’m happy to be here.

 

SUSAN: I want to thank Anna Topczewski again for an incredibly rich conversation. To our listeners, thank you for joining us at Keeping Up to Data. We look forward to your joining our next episode, when we expect to be able to talk about the 2025 application cycle. Until next time, stay well.

 

Thank you for joining us. Keeping Up to Data℠ is a production of LSAC. If you want to learn more about the current law school admission cycle and the latest trends and news, visit us at LSAC.org.
