
Enrollment at Women's Colleges, 2005 to 2013

Note: I got an email from Dean Kilgore at Mount Saint Mary's in California, who pointed out that I'd downloaded data for the wrong Mount Saint Mary College: in this case, the one in New York.  I had to create the list manually, and it was just a slip on my part.

Sorry about that.  I've removed that institution from the analysis but, unfortunately, can't add the correct one at this time without a considerable amount of work.



Sweet Briar College in Virginia recently announced, to the shock of many in higher education, that it would be closing at the end of the spring 2015 term.  As often happens when a college decides to close, those who are or were close to it rally the troops and wage a fierce campaign to try to keep it open.  Sometimes it works; other times it doesn't.

The scene playing out is not unusual: Allegations of secret deals, incompetence, and blindness to all that is and was good at Sweet Briar.  This is what happens when you decide to close a college.  And although I'm not taking sides, I did write before that the closing does seem curious in light of what little publicly available financial data there is: If you had to pick a college from this list that was going to close, it probably wouldn't be Sweet Briar.  Even the federal financial responsibility ratings gave Sweet Briar a 3, a score higher than Harvard's, which may only point out how absurd those ratings are in the first place.

A while ago, I downloaded a pretty extensive data set, using the members of the Women's College Coalition as my base.  Not all colleges have data available in IPEDS, however, so I did the best I could (for instance, the Women's College at Rutgers is not in IPEDS as a separate institution, or if it is, I couldn't find it.  And I took out Saint Mary-of-the-Woods, as it just announced it's going co-ed).  Also, since there is no IPEDS data field that tells you whether a college is a women's college, I couldn't go back and find out how many were labeled as such 20 years ago.  That might have been interesting.

Overall, though, the data were pretty uninteresting, so I gave up on visualizing them.  There were trends, of course, but nothing dramatic.

So, when I saw this article, by one of the people leading the charge on the Save Sweet Briar campaign, one sentence jumped out at me:

Enrollment: There is no evidence that enrollment is declining, either at Sweet Briar or at women’s or liberal arts colleges. This claim is simply false. Numbers people, please check for yourself: The data are publicly available.

The data are available, and the link goes to the IPEDS site I use all the time.  So, take a look here.  There are five views of the data, using the tabs across the top.  The first shows changes in freshman, total, undergraduate, and graduate enrollment over time; the changes on the right are shown in relation to the prior year.  The second shows the same data, but the change is cumulative since 2005: As you can see, total undergraduate enrollment is down almost 6% during a period when enrollment increased nationally.  The third shows admissions activity; the fourth breaks it out, showing Sweet Briar and all the other women's colleges in aggregate.  And the fifth shows total undergraduate enrollment in 2005 and 2013 (on the left) and the change (on the right).  As you can see, there are some big winners, some big losers, and a lot of small changes.
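If you want to check the arithmetic yourself, the first two views boil down to a prior-year percent change and a cumulative percent change from the 2005 base year.  Here is a minimal sketch in Python/pandas of that kind of calculation, assuming you've already exported enrollment by year from IPEDS; the numbers below are placeholders, not actual figures for any institution.

```python
# A minimal sketch of year-over-year and cumulative-change calculations,
# assuming enrollment by year has already been pulled from IPEDS.
# The values below are placeholders, not real IPEDS data.
import pandas as pd

enrollment = pd.DataFrame(
    {"year": range(2005, 2014),
     "undergrad": [760, 744, 751, 730, 712, 705, 690, 681, 715]}  # placeholder values
).set_index("year")

# Change versus the prior year (the first view).
enrollment["yoy_pct"] = enrollment["undergrad"].pct_change() * 100

# Cumulative change since the 2005 base year (the second view).
base = enrollment.loc[2005, "undergrad"]
enrollment["cum_pct_since_2005"] = (enrollment["undergrad"] / base - 1) * 100

print(enrollment.round(1))
```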

Decide for yourself.  And tell me what you see:




