
Looking at "Discrepant Scores"

Several years ago, The College Board published a study of "discrepant performance," based on about 150,000 students and their freshman-year grades in college.  If you want to see the study, you can get a pdf of it here.  The title, of course, is interesting. (And before we get too deep, it's important to note that these are old SAT scores, in case you think the new test is better and want to argue that point, the tight concordance between the old and new tests notwithstanding.)

Discrepant performance is defined as standardized test scores that are inconsistent with a student's academic performance in high school.  The distributions of scores and grades were normalized, and then each student's z-score on tests was compared to their z-score on grades.  Just under two-thirds of students had scores and grades that were consistent, and we don't need to talk too much more about them.
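The method described above can be sketched in a few lines of code.  This is my reconstruction for illustration, not The College Board's actual procedure: the one-standard-deviation cutoff and the toy data are assumptions, and the study's exact threshold may differ.

```python
# Sketch of the discrepant-performance method: standardize SAT and GPA to
# z-scores, then label a student discrepant when the two z-scores differ by
# more than a threshold (one SD here, an assumption for illustration).
import statistics

def z_scores(values):
    """Standardize a list of values to mean 0, standard deviation 1."""
    mean = statistics.mean(values)
    sd = statistics.stdev(values)
    return [(v - mean) / sd for v in values]

def classify(sat_scores, gpas, threshold=1.0):
    """Label each student 'Consistent', 'High SAT', or 'High GPA'."""
    labels = []
    for zs, zg in zip(z_scores(sat_scores), z_scores(gpas)):
        if zs - zg > threshold:
            labels.append("High SAT")   # tests well above grades
        elif zg - zs > threshold:
            labels.append("High GPA")   # grades well above tests
        else:
            labels.append("Consistent")
    return labels

# Toy data: the third student tests far above their grades,
# the first student earns grades far above their test score.
print(classify([1000, 1100, 1450, 1050], [3.5, 3.6, 2.4, 3.4]))
```

Note that because both distributions are standardized first, the two discrepant groups are defined relative to the whole sample, which is why they come out roughly equal in size.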

The most interesting group is the other one-third of students: those whose grades were higher than their tests (High GPA) and those whose tests were higher than their grades (High SAT).  Each group was about one-sixth (17.5%) of the sample.

Back to The College Board presentation for a moment.  It suggests that high testers with low grades are a better risk than low testers with high grades, but it also says grades are a better predictor by themselves than tests.  That's not statistically impossible, of course, but it does seem curious.  And it does fly in the face of both my experience and conventional wisdom regarding the best way to make admissions decisions; I think most admissions officers believe high tests/low grades are the weaker bet of the two extremes.

But let's go with that for a minute and ask why it might be true.  First, though, let's argue with what little methodology is presented in this study.  A lot of the conceptual problem in predicting human performance, of course, comes from our own arrogance: In this case, the belief that a limited number of pre-college factors represent the sum total of factors affecting freshman grades.  How limited, in this case?  Two.

If you really wanted to get a good model to predict freshman performance, you'd look at a lot of factors: Family income, parental attainment, ethnicity of the student vis-à-vis the student body and vis-à-vis the high school they came from, just to name a few.  All of those factors are important, and what we find is that students from lower-income families, whose parents didn't go to college, and who feel out of place in the college they've enrolled in, tend to struggle.  I don't see any of these factors controlled for in this analysis (if I'm wrong, I'll be happy to correct it).

You can see the table of how discrepant performance breaks out, but can you really see it?  Let me draw you a picture.  On this chart (which shows only the students with discrepant performance), the light blue bar on the left chart shows the number of students with high tests and lower GPA (High SAT); the orange bar on the left chart shows the number of students with low tests and high GPA (High GPA).  Hover over the bars to see how many there are.  On the right chart, the mauve-colored bar shows what percentage of each group had high SAT (the ones The College Board says you should give the breaks to).



Surprise: Guess who tends to have higher grades and lower scores?  Women (who get better grades than men at every level of education, by the way), poorer students, students from underrepresented ethnic groups, and students whose parents have less education.  This narrative plays smoothly into the prevailing wisdom of 1930, which suggested they just were not suited for higher education, and which some people still seem to believe.

Who has higher scores and lower grades? Men, white and Asian students, wealthier students, and children of well-educated parents.  And The College Board statistics tell you these are the students you should give a break to in the admissions process because they did better on a three-hour test.  You see, in the simple approach, only SAT scores and GPA determine your college performance, and it's not at all affected by how much you have to work, or worry about money, or spend time figuring out how college operates, or whether you belong there.  So keep giving the white guys the break.

Two final points: First, if I took the labels off the bars and told you "This is the gender chart," or "This is the income chart," you could probably put the labels back on in the correct order pretty easily.  Second (and this could be a whole other blog post altogether), even the students with the lowest high school grades (a B- average) and the lowest test scores ended the first year with an average GPA of 2.0, and the differences between and among the groups are exaggerated by a truncated y-axis on the chart in the presentation.
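A quick bit of arithmetic shows how much a truncated y-axis can inflate apparent differences.  The GPAs below are illustrative, not taken from the study: when the axis starts above zero, the visible bar height is the value minus the axis start, so shrinking the range inflates the apparent ratio between groups.

```python
# How a truncated y-axis exaggerates differences: the visible height of a
# bar is (value - axis_start), so the ratio of bar heights grows as the
# axis start rises.  GPA values here are hypothetical, not from the study.

def apparent_ratio(high, low, axis_start=0.0):
    """Ratio of visible bar heights when the y-axis starts at axis_start."""
    return (high - axis_start) / (low - axis_start)

high_group, low_group = 3.0, 2.0   # hypothetical freshman GPAs

print(round(apparent_ratio(high_group, low_group), 2))        # full axis: 1.5
print(round(apparent_ratio(high_group, low_group, 1.8), 2))   # truncated: 6.0
```

With a full axis, a 3.0 GPA bar is 1.5 times as tall as a 2.0 GPA bar; start the axis at 1.8 and the same pair of bars looks like a six-to-one difference.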

As always, let me know what you think.

Reminder: I appreciate support for webhosting and other costs associated with creating Higher Ed Data Stories.  You can support these efforts here.

