
When Infographics Fail

There are a lot of bad infographics floating around the Internet.  When they concern things like the difference between cats and dogs, or how many hot dogs and hamburgers Americans eat over the 4th of July, it's no big deal.

But this blog is about higher education data, and when I see bad infographics on that topic, I feel compelled to respond.  This one is so bad it's almost in the "I can't even" category.  It takes very interesting and compelling data--the graduation rates of Black male athletes--and compares it to overall graduation rates at several big football schools in the nation.  Here it is:


For starters, this chart appears to stack bars that shouldn't be stacked: a graduation rate of 40% for one group and 40% for another group shouldn't add up to 80%.  The effect is to distort the comparisons your brain tries to make.  For instance, look at the overall rates (the longer bars) for Georgia Tech and Pittsburgh: Georgia Tech's bar at 79% is shorter than Pittsburgh's at 77%, because the two bars start at different points.

But wait: they can't really be stacked, either; Louisville's 44% + 47% is way longer than Notre Dame's 81%. Stacked bars on dual axes?

These also look at first like they could be two sets of bars, with one (the overall graduation rate, which is always higher) behind the Black male graduation rate.  But that can't be, either.  The effect is that you look at Notre Dame and see a very long gap between 81% and 96% (a 15-point spread) that appears to be longer than the 37-point spread at Virginia.
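If you want to see why stacking is the wrong call here, here's a minimal sketch--Python and matplotlib, purely for illustration, since I have no idea what tool produced the original--using just the two schools whose rates appear above.  In the grouped version every bar starts at zero and the lengths mean something; in the stacked version the combined height is a number with no interpretation.

```python
import matplotlib.pyplot as plt
import numpy as np

# Only the rates cited above -- the rest of the infographic's numbers
# aren't recoverable from the image.
schools = ["Notre Dame", "Louisville"]
black_male_athletes = [81, 44]   # graduation rate, percent
overall = [96, 47]               # graduation rate, percent

x = np.arange(len(schools))
width = 0.35

fig, (ax_grouped, ax_stacked) = plt.subplots(1, 2, figsize=(9, 4), sharey=True)

# Grouped: every bar starts at zero, so lengths can be compared directly.
ax_grouped.bar(x - width / 2, black_male_athletes, width, label="Black male athletes")
ax_grouped.bar(x + width / 2, overall, width, label="Overall")
ax_grouped.set_title("Grouped (comparable)")

# Stacked: the second rate is piled on top of the first, so the total
# height (44 + 47 = 91 for Louisville) is a number that means nothing.
ax_stacked.bar(x, black_male_athletes, width, label="Black male athletes")
ax_stacked.bar(x, overall, width, bottom=black_male_athletes, label="Overall")
ax_stacked.set_title("Stacked (misleading)")

for ax in (ax_grouped, ax_stacked):
    ax.set_xticks(x)
    ax.set_xticklabels(schools)
    ax.set_ylabel("Graduation rate (%)")
    ax.legend()

plt.tight_layout()
plt.show()
```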

In short, I cannot tell you how this chart was made, or what the assumptions are, let alone what the story really is.

And the background image behind the chart makes things even worse; it makes the data hard to see.

Finally, a third element might have been interesting here: the graduation rate of Black males who are not athletes.  It might shed more light on the problem, although if the same designer did it, I wouldn't be confident in the result.

Here's the data presented three ways; each tells the story a little differently, and each is better than the original in at least one way. This was literally 15 minutes of work.
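(As one example of the kind of redesign I mean--not a reproduction of the three versions above--here's a quick gap-plot sketch in Python and matplotlib, again using only the two schools whose numbers appear in this post.  The two dots share a zero-based axis, and the line between them makes the spread the first thing you see.)

```python
import matplotlib.pyplot as plt

# Only the values cited earlier in this post; the full data set isn't shown.
data = {
    "Notre Dame": (81, 96),   # (Black male athletes, overall), in percent
    "Louisville": (44, 47),
}

fig, ax = plt.subplots(figsize=(7, 2.5))
for i, (school, (athlete_rate, overall_rate)) in enumerate(data.items()):
    # The connecting line makes the size of the gap the focus of the chart.
    ax.plot([athlete_rate, overall_rate], [i, i], color="gray", zorder=1)
    ax.scatter(athlete_rate, i, color="tab:orange", zorder=2,
               label="Black male athletes" if i == 0 else None)
    ax.scatter(overall_rate, i, color="tab:blue", zorder=2,
               label="Overall" if i == 0 else None)

ax.set_yticks(range(len(data)))
ax.set_yticklabels(list(data.keys()))
ax.set_xlim(0, 100)            # a shared, zero-based axis keeps spreads comparable
ax.set_xlabel("Graduation rate (%)")
ax.legend(loc="lower right")
plt.tight_layout()
plt.show()
```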

What do you think?





