
Looking at Student Loan Default Rates

Student loan defaults make a lot of news, but there isn't much understanding of what a default actually is, there isn't good, easily accessible data on default rates, and there isn't a lot of good contextual analysis. But this may help a little.

First, the source of the data is here.  You should read it, especially the part about how small numbers of students entering repayment, or small percentages of students taking loans at a college, can skew default rates.  You should also know that a default is defined as being at least 270 days behind on a payment.

This is not the easiest data to work with.  For one thing, the file layout descriptions don't match the file: Financial Aid uses a different ID than IPEDS, and the crosswalk tables that might help you find the IPEDS ID (to get a richer view of context) use a different format than this table does.  In addition, the "Region" field doesn't roll up the states in any way I've seen before, and the "Program Type" field puts colleges in categories that don't always make sense.  For most four-year institutions, try "Traditional" first in the selector box.
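If you try the crosswalk yourself, the format mismatch usually comes down to ID typing: one file treats the ID as an integer (dropping leading zeros), the other as a zero-padded string. Here's a minimal pandas sketch of the fix. The column names, IDs, and rates below are all illustrative placeholders, not the actual field names or values in either file.

```python
import pandas as pd

# Default-rate file: IDs arrive as integers, so leading zeros are lost.
# (Hypothetical columns and values, for illustration only.)
defaults = pd.DataFrame({
    "opeid": [100200, 3038500],
    "default_rate": [4.2, 11.7],
})

# Crosswalk: the same IDs stored as zero-padded 8-character strings,
# mapped to an IPEDS-style unit ID. (Again, hypothetical values.)
crosswalk = pd.DataFrame({
    "opeid8": ["00100200", "03038500"],
    "unitid": [100654, 144740],
})

# Normalize the integer IDs to the crosswalk's zero-padded string format,
# then join to pick up the IPEDS-side context.
defaults["opeid8"] = defaults["opeid"].astype(str).str.zfill(8)
merged = defaults.merge(crosswalk, on="opeid8", how="left")
print(merged[["opeid8", "unitid", "default_rate"]])
```

A left join keeps every school from the default-rate file, so anything the crosswalk misses shows up as a blank `unitid` you can inspect rather than silently disappearing.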

But here it is.

If you want to eliminate the small schools that skew things, use the "Borrowers Entering Repayment 2009–11" filter: type the ranges in the boxes and hit Enter, or use the sliders.  You can also limit to states or regions, in any combination.

A reminder that outputs are sometimes actually inputs.  If you enroll high-ability, wealthy students and are very selective in admissions, your default rates are going to be lower than those of institutions that take more chances on students from low-income or less-prepared backgrounds.  It would be great if there were a way to recognize the institutions that keep default rates low while taking more of those risks.

What jumps out at you?


