Looking at Discount, 2016

If you want to strike fear into the hearts of enrollment managers everywhere, just say, "The trustees want to talk about the discount rate."

If you don't know, the discount rate is a simple calculation: Institutional financial aid as a percentage of tuition (or tuition and fees) revenue.  If your university billed $100 million in tuition and fees, and awarded $45 million in aid, your discount is 45%.  In that instance, you'd have $55 million in hard cash to run the organization.
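In code, the calculation is just a ratio. A minimal sketch using the figures above (the function name is mine, not an official formula anywhere):

```python
def discount_rate(gross_tuition, institutional_aid):
    """Institutional aid as a share of gross tuition (and fees) revenue."""
    return institutional_aid / gross_tuition

gross = 100_000_000  # tuition and fees billed
aid = 45_000_000     # institutional financial aid awarded

print(f"Discount: {discount_rate(gross, aid):.0%}")   # Discount: 45%
print(f"Net revenue: ${gross - aid:,}")               # Net revenue: $55,000,000
```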

Discount used to be a reporting function, something you would look at when the year was over to see where you stood.  Now, it's become a management target. And that's a problem.  If you want to know why, read this quick little explanation of Campbell's Law. The short explanation is this: If you want to lower discount--if that's really the thing you are after--you can do it very easily.  Just shrink your enrollment.  Or lower your quality, as measured by things like GPA and test scores. Easy.

Of course, this is generally not what people mean when they say they want to decrease the discount rate.  They usually mean "decrease the discount and keep everything else the same, or better yet, improve those measures."  That's not so easy.  The simple reason is that decreasing your discount means you're raising price.  And we all know what happens when you raise price, unless you turn your college into a Giffen good, which you can't do, of course.

What people really want is more net revenue: that $55 million in the example above.  You'd probably like to have it be $57 million, which would mean you lower your discount rate to 43%.  That happens because you either charge students more, or enroll more students who bring external aid, like Pell or state grants.  You don't care, really.  Cash is cash.
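Working backward from a net-revenue target makes the arithmetic plain. Here's the same example inverted (again, my own illustrative function names):

```python
def implied_discount(gross_tuition, target_net_revenue):
    """Discount rate implied by a net-revenue target on fixed gross billing."""
    return 1 - target_net_revenue / gross_tuition

gross = 100_000_000
print(f"{implied_discount(gross, 55_000_000):.0%}")  # 45% -- where you are
print(f"{implied_discount(gross, 57_000_000):.0%}")  # 43% -- where you'd like to be
```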

The absurdity of discount was demonstrated to me by a finance professor friend, who said back in the late '90s, "If we generate $12,000 in average net revenue on an $18,000 tuition (a 33% discount), let's propose raising tuition to $100,000 and the discount to 80%."  Yes, believe it or not, the denominator is important when calculating percentages, which is why it's hard to compare discounts in a meaningful way across competitors who charge more.
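The professor's joke, in numbers: the "worse" discount rate actually produces more cash per student, because the denominator changed.

```python
def net_revenue_per_student(tuition, discount):
    """Cash collected per student after institutional aid."""
    return tuition * (1 - discount)

# Late-'90s scenario: $18,000 tuition at a 33% discount
print(net_revenue_per_student(18_000, 1 / 3))    # ~$12,000 per student

# The proposal: $100,000 tuition at an 80% discount
print(net_revenue_per_student(100_000, 0.80))    # $20,000 per student
```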

If you're interested, here's a little presentation I did on why colleges have tended to increase discount and net revenue at the same time.  This exercise is probably close to the breaking point, however.

Now that you understand a little more about discount, on to the data. This is from the IPEDS data for Fall, 2016, the most recent available showing both aid and admissions data.  There are four views, using the tabs across the top.

View 1: Discount overview

No interactivity: Just average discount rates by Carnegie type, Region, and Urbanicity.  I think the bottom one is the most fascinating discovery I've come across yet, just by playing with the data.

View 2: Discount by Market Category

This one combines the three categories above (Carnegie, Region, and Urbanicity) into a single category to see how discounts play out.  In order to be included, there had to be at least ten colleges in the category.  You can see that the highest discount, on average, is at Baccalaureate institutions in distant towns in the South Central region of the US.  You can color this by any of the three individual categories using the little control at the top right.

View 3: Individual Colleges

This lists all the private colleges for which I could calculate a freshman discount rate and net revenue per freshman.  The controls at the top allow you to look at schools like yours, if you want.  Note the slider at top right: I started showing freshman classes of at least 200, as some small college data gets a bit funky.  You can expand or narrow that by pulling the sliders to your heart's content.

Sort by any column by hovering over the little icon in the x-axis label.  If you get in trouble, you can always reset using the arrow control at lower right.

View 4: Multidimensional

Each college in this view is a bubble, arrayed on the chart in two dimensions: freshman discount and average net revenue per freshman.  The size of the bubble shows freshman selectivity (bigger is more selective).  The color of the bubble shows the percentage of freshmen with institutional aid.  Note that the highest net revenue institutions are also the most selective, suggesting people will pay for prestige (or prestige and wealth pave the way to admissions). And the lowest net revenue institutions are dark blue, showing almost everyone getting institutional aid (either "merit" or "need-based," although those distinctions are silly).

Use the filters to limit the colleges on the view, and use the highlight function (just start typing) to highlight the institution of your choice.  Note especially what happens when you limit the view to colleges with higher tuition.  Go ahead.  You won't break anything.

As always, let me know what you see.

