
Posts

Yes, Enrollment is Going Down. Also up.

When designing a data visualization, the first thing to ask is, "What does the viewer want to see, or need to know?"  If you're designing a dashboard for a CFO or a CEO or a VP for Marketing, those things are pretty straightforward: You're designing for one person, and you have a pretty good idea what that person wants. But in higher education, we want to look at segments of the industry, and trends that are specific to our sector.  And there are thousands of you (if this blog post is average, that is).  So I can't know. This visualization of enrollment data measures only one thing: Enrollment.  But it measures several different types of enrollment (full-time, part-time, graduate, and undergraduate, in combination) at many different types of institutions (doctoral, baccalaureate, public, private, etc.)  And the best thing is that you can make it yours with a few clicks. The top chart shows total headcount, and the bottom shows percentage change since the f...

Medical School Admissions Data

This is pretty interesting, I think, mostly for the patterns you don't see. This is data on medical school admissions in the US; some of it is compiled for a single year, and some for two years (which is OK because this data appears to be pretty stable over time.) Tab 1 is not interactive, but does show applications, admits, and admit rates on grids defined by GPA and MCAT scores.  Darker colors show higher numbers (that is, more counts, or higher admit rates.)  While we cannot get a sense of all takers like we do with other standardized tests, this does perhaps show a strong correlation between college GPA and MCAT scores (of course, another explanation may be that students self-select out, which then makes me wonder about that one student with less than a 2.0 GPA and less than a 486 Total MCAT score who applied, was admitted, and then enrolled.) The second and third tabs show applicants by undergraduate major, and ethnicity, respectively.  Choose a value at upper ...

2017 Admissions Data: First Look

IPEDS just released Fall, 2017 Admissions data, and I downloaded it and took a quick look at it. If you've been here before, most of this should be self-explanatory. Three tabs here: The first is to take a look at a single institution.  Use the control at top to select the college or university you're looking for. (Hint: type a few letters of the name to make scrolling quicker.) The second tab allows you to compare ten institutions (you can do more, but it gets messy).  I started with ten that most people want to see, but you can remove any of them by scrolling to their check box in the drop-down, unchecking it, and clicking apply.  Add institutions by checking the box by their name. The final tab shows the relationship between test scores and Pell, which I've done before, but never get tired of. Choose SAT or ACT calculated means for the x-axis, then limit by region and/or control if you so desire. Notes: 1) Some of the admissions data for 2017 is tentative, so anomalies are probabl...

The Death of History?

The last several days have seen a couple of articles about the decline of history majors in America.  How big is the problem?  And is it isolated, or across the proverbial board? This will let you see the macro trend, and drill down all the way to a single institution, if you'd like. The four charts, clockwise from top left, are: Raw numbers of bachelor's degrees awarded from 2011-2016 (AY); percentage of total (which only makes sense when you color the bars) to show the origins of those degrees; percentage change since the first year selected; and numeric change since the first year selected. You can color the bars by anything in the top box at right (the blue one) or just leave totals; and you can filter the results to any region, or group of years, or major group (for instance, history, or physical sciences), or even any specific institution.  And of course you can combine filters to look at Business majors in the Southeast, if you wish. That's it.  Pretty ...
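If you want to reproduce those four measures outside Tableau, a minimal pandas sketch might look like this. The year, region, and count columns are hypothetical stand-ins, not the actual IPEDS completions fields.

```python
# A minimal sketch of the four measures described above, using hypothetical
# degree counts by year and region (not actual IPEDS completions data).
import pandas as pd

df = pd.DataFrame({
    "year":   [2011, 2011, 2016, 2016],
    "region": ["Southeast", "Mid East", "Southeast", "Mid East"],
    "count":  [9000, 12000, 7600, 10100],          # made-up numbers
})

# percentage of each year's total, i.e., the "origins" of that year's degrees
df["pct_of_total"] = df["count"] / df.groupby("year")["count"].transform("sum")

# percentage and numeric change since the first year selected, within region
base = df.sort_values("year").groupby("region")["count"].transform("first")
df["pct_change"] = (df["count"] - base) / base
df["num_change"] = df["count"] - base

print(df)
```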

Your daily dose of "No Kidding"

As a young admissions officer in 1985, I went to my first professional conference, AACRAO, in Cincinnati. I don't remember much about it, but one session is still clear to me. I had chosen a session almost by accident, probably, because it was admissions-focused at a conference that was mostly registrars. And fate stepped in. There was a last-minute substitution, and Fred Hargadon filled in for some person whose name is lost to history. At the time, I didn't think I'd stay in admissions long; my personality type is atypical for the profession, and I didn't find a lot to excite me.  But in this session I found someone who could approach the profession, well, professionally; someone who could view admissions in a much larger context than I was used to seeing.  Someone who was more intellectual and conceptual than friendly (although he was both). I remember a lot of that session, but one thing has stuck with me through all this time.  He said, "In all my years ...

2018 AP Scores by State and Ethnicity

The College Board data on AP scores is now available for 2018, but it's hard to make sense of at a macro level.  The data are in 51 different workbooks, and, depending on how you want to slice and dice the data, as many as eight worksheets per workbook.  What's more is the data structure: the workbooks are designed to print on paper, for those who want to dive into one little piece of the big picture at a time. So before going any further, I'd like us all to challenge the College Board and ACT to put out their data in formats that make exploring data easier for everyone. Unless, of course, they really don't want to do that. I downloaded all 51 workbooks and extracted the actual data using EasyMorph, then pulled it into Tableau for visualization and publication. There are four views here. The first tab is a simple scattergram, which may be enough: The relationship between a state's median income and the average AP exam score.  While blunt, it points out once again ...
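For anyone who would rather script that extraction than use EasyMorph, a rough pandas equivalent of the stacking step might look like the sketch below; the folder, sheet name, and header row are assumptions, not the College Board's actual layout.

```python
# A rough scripted alternative to the EasyMorph step: stack the 51 state
# workbooks into one table. The folder, sheet name, and header row below are
# assumptions -- adjust them to the College Board's actual file layout.
from pathlib import Path
import pandas as pd

frames = []
for path in sorted(Path("ap_2018").glob("*.xlsx")):          # one workbook per state
    df = pd.read_excel(path, sheet_name="Scores", header=4)  # assumed sheet and header row
    df["state"] = path.stem                                  # keep the state name
    frames.append(df)

pd.concat(frames, ignore_index=True).to_csv("ap_2018_combined.csv", index=False)
```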

Story Telling With Data Challenge

I've often seen the challenges issued by Cole Knaflic on the Story Telling With Data website, and found the most recent one, creation of a scatterplot, to be too tempting to pass up. I used Tableau to create it, and yes, I've written about this before. This is IPEDS data, from Fall of 2015 (the most recent complete set available).  It shows the strong correlation between standardized test scores and income.  And I think it shows something else, too. On the x-axis, choose SAT or ACT scores (depending on your comfort) to see how higher scores translate into fewer lower-income students (as measured by eligibility for Pell Grants).  The bubbles are color-coded by control, and sized by selectivity (that is, the percentage of freshman applications accepted.)  Highly selective institutions are coded as larger bubbles, and less selective as smaller bubbles. Note the cluster of private, highly selective institutions at the lower right: Most of these institutions are ...
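The published chart is built in Tableau, but the same idea in a bare-bones matplotlib sketch would look something like this; the file and column names are hypothetical stand-ins for the IPEDS fields described above.

```python
# A bare-bones, non-interactive version of the same chart; column names are
# hypothetical stand-ins for the IPEDS fields described above.
import pandas as pd
import matplotlib.pyplot as plt

inst = pd.read_csv("ipeds_fall_2015.csv")    # assumed extract with these columns

# highly selective (low admit rate) -> larger bubble
sizes = (1 - inst["admit_rate"]) * 300 + 20
colors = inst["control"].map({"Public": "tab:blue", "Private": "tab:orange"}).fillna("tab:gray")

plt.scatter(inst["sat_mean"], inst["pct_pell"], s=sizes, c=colors, alpha=0.6)
plt.xlabel("Mean SAT score")
plt.ylabel("Percent of freshmen receiving Pell Grants")
plt.title("Test scores and Pell eligibility, Fall 2015")
plt.show()
```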

An Interactive Retention Visualization

As I've written before, I think graduation rates are mostly an input, rather than an output.  The quality of the freshman class (as measured by a single blunt variable, average test scores) predicts with pretty high certainty where your graduation rate will end up.  (Note: Remember, the reason test-optional admissions practices work is that test scores and GPA are strongly correlated.  If you didn't have a high school transcript, you could use test scores by themselves, but they would not be as good; sort of like using a screwdriver as a chisel.  And the reason why mean test scores work in this instance is essentially the same reason your stock portfolio should have 25 stocks in it to reduce non-systematic risk.) Further, choosing students with high standardized test scores means you're likely to have taken very few risks in the admissions process, as high scores signal wealth, more accumulated educational opportunity, and college-educated parents. That essentia...

All the 2015 Freshman Full-pays

There is no problem so great that it can't be solved by enrolling more full-pay students, it seems.  And in the minds of some, there is no solution so frequently tossed out there.  I've heard several presidents say, "We're doing this to attract more full-pay students." Before we dive too deeply into this, a definition: A "Full-pay" student is not one who receives no aid; rather it's one who receives no institutional aid. Often these overlap considerably, but a student who receives a full Pell and/or state grant, and then takes out a PLUS loan is a full-pay; all the revenue to the college comes in cash, from another source, rather than its own financial aid funds.  The source of that cash matters not to the people who collect the tuition.  Got it? This is a fairly deep dive into the IPEDS 2015 Fall Freshman data (there is 2016 admissions data, but financial aid data is only available for 2015-2016, so I used that admissions data to line things up....
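In code, that definition reduces to a single test on institutional aid. Here's a tiny sketch with made-up values and a hypothetical institutional_aid column, just to make the point concrete.

```python
# The definition above as a single test: "full-pay" means no institutional aid,
# regardless of Pell, state grants, or PLUS loans. Values are made up and the
# column names are hypothetical.
import pandas as pd

freshmen = pd.DataFrame({
    "student":           ["A", "B", "C"],
    "institutional_aid": [0, 0, 12000],
    "pell_grant":        [5800, 0, 0],
})

freshmen["full_pay"] = freshmen["institutional_aid"] == 0
print(freshmen)   # A and B are full-pays, even though A receives a Pell Grant
```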

Measuring Internationalism in American Colleges

How international is a college?  And how do you measure it?  There are certainly a lot of ways to think about it: Location in an international city like New York, Chicago, or Los Angeles, for instance.  The extent to which the curriculum takes into account different perspectives and cultures, for another. And, of course, there is some data, this time from the IIE Open Doors Project.  I did a simple calculation, taking the number of international students enrolled plus the number of enrolled students studying abroad, and dividing that sum by total enrollment to come up with an international index of sorts. No, it's not precise, and yes, I know the two groups are not discrete, but this--like all the data on this blog--is designed to throw a little light on a question, not to answer it definitively. You'll find data on all the colleges that participate in the IIE survey, displayed in four columns:  Total enrollment (on the left), International enrollment, Overseas study ...
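Spelled out, the index is a one-line calculation. The figures below are made up, and using total enrollment as the denominator is an assumption consistent with the columns listed above.

```python
# The index described above; the enrollment figures are made up, and the
# denominator (total enrollment) is assumed from the columns listed in the text.
def international_index(intl_enrolled: int, study_abroad: int, total_enrollment: int) -> float:
    """(international students + students studying abroad) / total enrollment."""
    return (intl_enrolled + study_abroad) / total_enrollment

print(international_index(2400, 1100, 21000))   # roughly 0.17, or 17 percent
```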

Looking at Transfers

It's official: Princeton has broken its streak of not considering transfer students for admission, and has admitted 13 of the 1,429 who applied for the Fall, 2018 term, for an astonishing how-low-can-you-go admit rate of 0.9%.  Of course, we'll have to wait until sometime in the future to see how many--if any--of them actually enroll. I thought it might be interesting to take a look at transfers, so I did just that, using an IPEDS file I had on my desktop.  There are four views here, and they're pretty straightforward: The first tab shows the number of transfers enrolled by institution in Fall, 2016 (left-hand column) and the transfer ratio.  The ratio simply indicates how many new transfer students for Fall, 2016 you'd meet if you were to go onto that college campus in Fall, 2016 and choose 100 students at random.  A higher number suggests a relatively more transfer-friendly institution. You can choose any combination of region, control and broad Carne...
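The ratio described above works out to transfers per 100 enrolled students; here's a tiny sketch with made-up enrollment figures.

```python
# The transfer ratio as described: new Fall 2016 transfers per 100 enrolled
# students. The figures are made up.
def transfer_ratio(new_transfers: int, total_enrollment: int) -> float:
    """New transfer students you'd expect among 100 randomly chosen students."""
    return 100 * new_transfers / total_enrollment

print(transfer_ratio(650, 9800))   # about 6.6 transfers per 100 students
```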

Want to increase graduation rates? Enroll more students from wealthier families.

OK. Maybe the headline is misleading.  A bit. I've written about this before: The interconnectedness of indicators of college success.  This is more of the same with fresher data to see if anything has changed. Spoiler alert: Not much. What's new this time is the IPEDS publication of graduation rates for students who receive Pell and those who don't, along with overall graduation rates.  While the data are useful in aggregate to point out the trends, at the institutional level, they are not. First, some points about the data:  I've included here colleges with at least 20 Pell-eligible freshmen in 2015, just to eliminate a lot of noise.  Colleges with small enrollments don't always have the IR staff to deliver the best data to IPEDS, and they make the reports a bit odd.  And even without these institutions, you see some issues. Second, colleges that do not require tests for admission are not allowed to report tests in IPEDS.  Once you check "n...