
Six-year graduation rates at four-year colleges and universities

Graduation rates are always a hot topic in higher education, but often for the wrong reasons.  To demonstrate, I offer my parents.  Here is a portrait of Agnes and Mark, married May 4, 1946.


One night while I was talking to my brother, he asked, "Do you think mom was the way she was because dad was the way he was, or do you think dad was the way he was because mom was the way she was?"  To which I replied, "yes."  My point, of course, is that in complex relationships, it's always difficult--impossible, actually--to disentangle cause and effect.

And, despite the Student Affairs perspective that graduation rates are a treatment effect, I maintain that they are actually a selection effect.  As I've written about before, it's pretty easy to predict a college's six-year graduation rate if you know one data point: The mean SAT score of the incoming class.  That's because the SAT rolls a lot of predictive factors into one index number.  These include academic preparation, parental attainment, ethnicity, and wealth, on the student side, and selectivity, on the college side.

When a college doesn't have to--or chooses not to--take many risks in the admissions process, it tends to select those students who are more likely to graduate.  That skews the incoming class wealthier (Asian and Caucasian populations have the highest income levels in America), higher ability (the SAT is a good proxy for some measure of academic achievement, and often measures academic opportunity), and second generation.  And when you combine all those things--or you select so few poor students you can afford to fund them fully--guess what?  Graduation rates go up.

If this doesn't make any sense, read the Blueberry Speech.  Or ask yourself this question: If 100 MIT students enrolled at your local community college, what percentage would graduate? 

But graduation rates are still interesting to look at, once you have that context.  The visualization below contains three views, using the tabs across the top.  You'll have to make a few clicks to get the information you need.

The first view (Single Group) starts with a randomly selected institution, Oklahoma State.  Choose an institution by clicking on the box, typing any part of its name, and selecting it from the list.

The yellow bars show the entering cohorts, and the blue bars show the number of graduating students.  Note: The blue bars show graduates in the year shown (4,755 in 2019, which you can see by hovering over the bar), while the yellow bars show the entering class from six years prior (7,406 students who entered in 2013).
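If you want to check the arithmetic yourself, the calculation behind each bar pairing is just the graduating count divided by the entering cohort from six years earlier.  A quick sketch, using the Oklahoma State figures above:

```python
# Six-year graduation rate: graduates in a given year divided by
# the entering cohort from six years earlier.
entered_2013 = 7406    # yellow bar: entering class, fall 2013
graduated_2019 = 4755  # blue bar: graduates as of 2019

rate = graduated_2019 / entered_2013
print(f"Six-year graduation rate: {rate:.1%}")  # about 64.2%
```

Roughly 64%, which is the kind of number you can sanity-check against the percentage views in the visualization.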

The top row shows graduation rates at all institutions nationally, and the second row shows percentages for the selected institution.  You can choose any single ethnicity at the top left, using the filter.

The second view (Single Institution) shows all ethnicities at a single institution.  The randomly selected demonstration institution is Gustavus Adolphus College in Minnesota, but of course you can choose any institution in the data set.  Highlight a single ethnic group using the highlight function (I know some people are frightened of interacting with these visualizations, but you can't break anything).

Note: For the sake of clarity, I start with a minimum of 10 students in each year's cohort.  Small schools in the Northeast, for instance, might enroll one Asian/Pacific Islander student in their incoming class each year, so the graduation rate could swing wildly from 0% to 100%.  You can change this if you want to live dangerously, by pulling the slider downward.
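To see why tiny cohorts produce those wild swings, here's a small sketch with hypothetical counts: with one student entering per year, the only possible rates are 0% and 100%.

```python
# Hypothetical (entered, graduated) counts for four successive cohorts
# at a school that enrolls one or two students from a group each year.
cohorts = [(1, 1), (1, 0), (1, 1), (2, 1)]

rates = [graduated / entered for entered, graduated in cohorts]
print(rates)  # [1.0, 0.0, 1.0, 0.5]
```

With denominators that small, a single student's outcome moves the rate by 50 or 100 points, which is exactly why the minimum-cohort filter defaults to 10.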

The final view (Sectors) shows aggregates of institutional types.  It starts with graduation rates for Hispanic/Latino students, but you can change it to any group you want.

Have fun learning about graduation rates.  Just don't assume they are mostly driven by what happens at the institution once the admissions office has its say.
