COVID and AP Scores

Every year, The College Board releases summaries of the prior year's AP program.  While I've visualized these before (here and here), I've been unwilling to update the visualizations or do longitudinal analysis, for a couple of reasons.  First, the data are spread across multiple tables in multiple spreadsheets, and they are so heavily formatted for printing that scraping the data out of them is quite a burden.  Second, of course, the scores don't change much from one year to the next.

That is, until 2020, when COVID completely turned the world of higher education upside down.  I was interested in seeing how much scores changed from prior years.  As you can see, the changes are interesting, if not completely surprising.

By the way, if you enjoy Higher Ed Data Stories and use it in your work, you can support the web hosting and other costs associated with producing the content by Buying Me A Coffee, here.  If you're a high school teacher or counselor, just ignore and read on for free.

In the past I've been critical of the College Board, and I thought they should have cancelled the tests in 2020, notwithstanding the loss of opportunity it would have meant for thousands of students. And the test results seem to suggest the product in 2020 was not up to snuff.  And, to no one's surprise, the students we thought would be most affected by the pandemic were, in fact, apparently most affected by the pandemic.

A couple of notes about the data: The breakouts of public and private school students are estimates, based on subtracting the public school data from the overall data (the assumption being all schools are either public or private, not accounting, of course, for home-schooled students, who thus end up in the latter category).  I've also grouped a few subject exams together (Spanish Literature with Spanish Literature and Culture, and others where there was an apparent name change that may have included some content changes).  I used the ethnicity labels College Board supplied, and I kept the awkward ALL CAPS labels on the exams, because well, if it's good enough for them, it's good enough for me.
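For the curious, the public/private estimate described above is a simple subtraction.  A minimal sketch in pandas, with made-up counts and hypothetical column names (College Board's actual spreadsheets are laid out quite differently):

```python
import pandas as pd

# Illustrative exam counts: overall takers and public-school takers.
# Figures and column names are made up for this sketch.
ap = pd.DataFrame({
    "exam": ["BIOLOGY", "CALCULUS AB", "SPANISH LITERATURE"],
    "overall_takers": [250_000, 300_000, 28_000],
    "public_takers": [190_000, 230_000, 24_000],
})

# Estimate private-school takers by subtraction.  Note this bucket
# also absorbs home-schooled students, as discussed above.
ap["private_takers_est"] = ap["overall_takers"] - ap["public_takers"]
print(ap)
```

The same caveat applies in code as in prose: anything that is neither public nor private lands in the estimated column.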

And finally, let's remember that a lot of things not measured in these data travel with race/ethnicity and school type: income, parental attainment, opportunity, and student investment.  Don't jump to easy conclusions about what you see here. (And send a note to College Board and ask them to provide this data in more granular and detailed formats, so we can show that, too.)

These views are all very straightforward, and don't require a lot of explanation.  The last view, however, does break scores into two chunks: a weighted average of 2017–2019, and 2020 alone, so you can compare performance before and during COVID.
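A note on that baseline: it's a weighted average, not a simple mean of three yearly means, so a large 2019 cohort isn't diluted by a smaller 2017 one.  A minimal sketch with made-up numbers:

```python
# Takers-weighted average of mean scores across 2017-2019,
# compared against 2020.  All figures below are illustrative.
years = {
    2017: {"takers": 240_000, "mean_score": 2.90},
    2018: {"takers": 255_000, "mean_score": 2.92},
    2019: {"takers": 270_000, "mean_score": 2.95},
}

total_takers = sum(y["takers"] for y in years.values())
baseline = sum(y["takers"] * y["mean_score"] for y in years.values()) / total_takers

mean_2020 = 2.84  # also made up
print(f"2017-2019 weighted baseline: {baseline:.3f}")
print(f"2020 vs baseline: {mean_2020 - baseline:+.3f}")
```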

As always, let me know what you see here.
