
On Rankings, 1911, and Economic Mobility

If you're alive today, you have lived your whole life with college rankings.  Yes, even you.  You may not have known you were living in the time of college rankings, but indeed, you have been, unless you were born before 1911 (or maybe earlier).  If you're interested, you can read this Twitter thread from 2020 where I discuss them and include snippets of those 1911 rankings as well as those from 1957, written by Chesley Manly.

You can read for yourself, or you can trust me, that in fact the rankings as we know them have been surprisingly consistent over time, and most people would have only minor quibbles with the ratings from 1911.  Perhaps that's because they have always tended to measure the same thing.

But what if we did different rankings?  No, not like the Princeton Review where they make an attempt to measure best party school, or best cafeteria food, or worst social life.  Something more quantifiable and concrete, although still, admittedly, a hard thing to get right: An economic mobility index.

Enter Michael Itzkowitz, the former director of the College Scorecard.  He's taken loads of data and attempted to create that index, essentially to rank colleges by several important criteria:

  • How many low-income students they enroll and graduate
  • How affordable the college is (which is a combination of low cost and income, equating to "time to pay back" the investment); a rough sketch of that idea follows below
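
If you're curious what a figure like that might look like under the hood, here's a toy sketch in Python.  To be clear, this is not Itzkowitz's actual methodology; every field name, weight, and the $25,000 earnings baseline below are my own made-up assumptions, just to show how enrollment, graduation, cost, and earnings could combine into a single number.

```python
# Toy illustration only: NOT the actual economic mobility index formula.
# All field names, weights, and the baseline figure are hypothetical.

from dataclasses import dataclass

@dataclass
class College:
    name: str
    pct_pell: float               # share of undergrads receiving Pell grants (low-income enrollment proxy)
    pell_grad_rate: float         # graduation rate for Pell recipients
    net_price_low_income: float   # average annual net price for low-income students, dollars
    median_earnings_10yr: float   # median earnings ten years after entry, dollars

def years_to_pay_back(c: College, baseline_earnings: float = 25_000) -> float:
    """Rough 'time to pay back': four years of net price divided by the
    earnings premium over an assumed high-school baseline."""
    premium = max(c.median_earnings_10yr - baseline_earnings, 1)
    return (c.net_price_low_income * 4) / premium

def toy_mobility_score(c: College) -> float:
    """Naive score: access and success for low-income students, discounted
    by how long the degree takes to pay for itself.  Weights are made up."""
    return (c.pct_pell * c.pell_grad_rate) / years_to_pay_back(c)

example = College("Example State U", pct_pell=0.42, pell_grad_rate=0.61,
                  net_price_low_income=9_500, median_earnings_10yr=52_000)
print(round(toy_mobility_score(example), 3))
```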
Like any ranking system, this is not perfect, nor is it precise.  And some might argue that the real benefit of college is not in money, even if you acknowledge that it's more important today than it ever was.  Further, much of it may be structural: Some states have low tuition, larger income disparity, and higher median incomes.  So, if nothing else, it might only be fair to compare colleges within a single state.  Still, this is intended to call out the ones that do it well, not, I think, to look down your nose at the privates who, it might be argued, are free to not consider social mobility as part of their mission.

If you don't like it, you're perfectly free to create your own rankings, of course.

Mr. Itzkowitz has graciously explained the system and made the data available for download here.  And he gave me permission to use it on the blog.

It's pretty simple: Dashboard 1 plots low-income performance against the economic mobility index (there is a strong correlation here because one factors into the calculation of the other).  Marks are colored by control, and the data are arrayed in quadrants, with the top right containing the highest-ranking institutions.

As always, you can filter the data to show smaller groups; I've kept the axes and the quadrants fixed to give you some sense of the actual, rather than relative, positions.
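
If you'd rather work with the downloaded file directly than with the dashboard, a minimal sketch of the same quadrant layout might look like this.  The actual visualization is built in Tableau, and the column names here are my guesses, not the real field names in the file.

```python
# Minimal quadrant scatter sketch; filename, column names, and the 0-1 axis
# ranges and 0.5 quadrant cutoffs are all assumptions for illustration.

import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("economic_mobility_index.csv")  # hypothetical filename

fig, ax = plt.subplots(figsize=(8, 6))
for control, group in df.groupby("control"):     # e.g., public / private nonprofit
    ax.scatter(group["low_income_performance"], group["mobility_index"],
               label=control, alpha=0.6, s=20)

# Fixed axes and quadrant lines so positions read as absolute, not relative.
ax.set_xlim(0, 1)
ax.set_ylim(0, 1)
ax.axvline(0.5, color="gray", linewidth=1)
ax.axhline(0.5, color="gray", linewidth=1)

ax.set_xlabel("Low-income performance")
ax.set_ylabel("Economic mobility index")
ax.legend(title="Control")
plt.tight_layout()
plt.show()
```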

The second view shows a bar chart with the mobility index, arrayed by region and state.

As always, let me know what you see here.
