On Rankings, 1911, and Economic Mobility

If you're alive today, you have lived your whole life with college rankings.  Yes, even you.  You may not have known you were living in the time of college rankings, but indeed, you have been, unless you were born before 1911 (or maybe earlier).  If you're interested, you can read this Twitter thread from 2020 where I discuss them and include snippets of those 1911 rankings as well as those from 1957, written by Chesley Manly.

You can read for yourself, or you can trust me, that in fact the rankings as we know them have been surprisingly consistent over time, and most people would have only minor quibbles with the ratings from 1911.  Perhaps that's because they have always tended to measure the same thing.

But what if we did different rankings?  No, not like the Princeton Review where they make an attempt to measure best party school, or best cafeteria food, or worst social life.  Something more quantifiable and concrete, although still, admittedly, a hard thing to get right: An economic mobility index.

Enter Michael Itzkowitz, the former director of the College Scorecard.  He's taken loads of data and attempted to create that index, essentially ranking colleges by several important criteria:

  • How many low-income students they enroll and graduate
  • How affordable the college is (a combination of low cost and post-graduation income, equating to "time to pay back" the investment)

Like any ranking system, this is not perfect, nor is it precise.  And some might argue that the real benefit of college is not monetary, even if you acknowledge that money matters more today than it ever has.  Further, much of the result may be structural: Some states have low tuition, larger income disparity, and higher median incomes.  So, if nothing else, it might only be fair to compare colleges within a single state.  Still, this is intended to call out the colleges that do it well, not, I think, to look down your nose at the privates who, it might be argued, are free to not consider social mobility as part of their mission.
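To make the "time to pay back" idea concrete, here is a toy sketch of that kind of calculation.  This is my own simplified illustration, not Mr. Itzkowitz's actual formula; the function names, the baseline-earnings comparison, and the way affordability is combined with low-income enrollment are all assumptions for the sake of the example.

```python
def years_to_recoup(net_cost_per_year, years_enrolled,
                    grad_earnings, baseline_earnings):
    """Estimate years needed to pay back the cost of a degree
    out of the annual earnings premium it confers."""
    total_cost = net_cost_per_year * years_enrolled
    premium = grad_earnings - baseline_earnings  # annual earnings boost
    if premium <= 0:
        return float("inf")  # the degree never pays for itself
    return total_cost / premium

def mobility_score(payback_years, low_income_share):
    """Hypothetical combination: shorter payback is better, and a
    larger share of low-income students enrolled is better."""
    affordability = 1.0 / (1.0 + payback_years)
    return affordability * low_income_share

# Example: $12,000/year net cost over 4 years, graduates earning
# $52,000 against a $32,000 baseline, 35% low-income enrollment.
payback = years_to_recoup(12_000, 4, 52_000, 32_000)
print(payback)                          # 2.4 years
print(round(mobility_score(payback, 0.35), 3))
```

The point of the sketch is only the shape of the idea: a college scores well when its net price is quickly recovered through the earnings boost *and* when it actually enrolls the low-income students for whom that boost matters most.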

If you don't like it, you're perfectly free to create your own rankings, of course.

Mr. Itzkowitz has graciously explained the system and made the data available for download here.  And he gave me permission to use it on the blog.

It's pretty simple: Dashboard 1 plots low-income performance against the economic mobility index (there is a strong correlation here because one factors into the calculation of the other).  Marks are colored by control, and the data are arrayed in quadrants, with the top right holding the highest-ranking institutions.

As always, you can filter the data to show smaller groups; I've kept the axes and the quadrants fixed to give you some sense of the actual, rather than relative, positions.

The second view shows a bar chart with the mobility index, arrayed by region and state.

As always, let me know what you see here.
