Friday, July 26, 2013

FactCheck: Does Your Entrance Score Strongly Correlate With Your Success at University?

by Gavin Moodie, RMIT University
“The fact remains there is a very strong correlation between people’s entry score at university and their success rate” - Higher Education Minister Senator Kim Carr, ABC’s 7.30, 17 July.
Correlating entry scores and success rates requires analysis of big data files from the Department of Tertiary Education and correction for other factors, such as the secondary school attended and subjects studied at university.

You also have to take account of the different distributions of entry scores and university grades. Journalist Tim Dodd explains this in his Australian Financial Review article.

He notes that entry scores are allocated on a bell curve, which means that there is a big difference between the ability of a student with an entry score of 95 and a student with 85, but a smaller difference between the ability of a student with 85 and one with 75, and a much smaller difference between students with 75 and 65.
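
To see why, note that an ATAR is a percentile rank. If underlying ability is assumed to be roughly normally distributed (an assumption made here purely for illustration), equal ten-point gaps in ATAR translate into shrinking gaps in ability. A minimal Python sketch:

    # Why equal ATAR gaps can mean unequal ability gaps.
    # Assumption for illustration only: underlying ability is normally
    # distributed, so a percentile rank can be mapped onto a z-score.
    from scipy.stats import norm

    atars = [95, 85, 75, 65]
    z = {a: norm.ppf(a / 100) for a in atars}  # percentile rank -> z-score

    for upper, lower in zip(atars, atars[1:]):
        gap = z[upper] - z[lower]
        print(f"ATAR {upper} vs {lower}: about {gap:.2f} standard deviations apart")

    # Prints roughly 0.61, 0.36 and 0.29: the same ten-point ATAR gap
    # corresponds to a progressively smaller gap in assumed ability.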

Fortunately we don’t have to make sense of the several studies that have used different methods and found different results, for Andrew Norton from the Grattan Institute recently published a useful graph of entry scores and completion rates in The Conversation.

Norton’s chart is below. It shows successful program completions by entry score for students who started higher education in 2005. The available data stops at 2011 and Norton notes that final completion rates will be a little higher because some students who began in 2005 are still enrolled.

[Chart] The rate of degree completion by ATAR. Source: A. Norton, "Should higher education student numbers be capped?", The Conversation, 2 July 2013.

There is clearly a relationship between students’ entry score and their completing their program, but whether it is “very strong” depends on one’s judgement.

If the line in the graph were horizontal there would be no relationship at all; the more steeply it rises, the stronger the relationship. Norton’s interpretation, referring to Australian Tertiary Admission Ranks (ATAR, or entry scores), is:
“The chart tells a complicated story. There is a reasonably strong relationship between ATAR and completion. Ninety per cent of students who began their degrees in 2005 with ATARs of 95 or more, completed a degree by 2011. By contrast, for students with ATARs below 70 completion rates are generally clustered in a few percentage points either side of two-thirds.”
The chart also reflects findings of several other studies that the correlation between entry scores and university performance is different for different score bands.

Generally, the correlation between entry scores and university performance in these studies is very strong for students with entry scores above 80, very weak to non-existent for scores between 40 and 80, and stronger but highly variable for scores below 40.
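
To make that concrete, the sketch below shows how a correlation could be computed separately within each entry-score band. The numbers in the data frame are made-up placeholders, not figures from any of the studies cited; only the banding approach is being illustrated.

    # Band-wise correlation between entry score and a university result.
    # The values below are placeholder data for illustration only.
    import pandas as pd

    df = pd.DataFrame({
        "entry_score": [96, 92, 88, 83, 78, 72, 66, 58, 45, 38, 30, 25],
        "uni_mark":    [84, 78, 76, 74, 68, 70, 66, 65, 61, 58, 50, 47],
    })

    # Cut the entry scores into the three bands discussed above.
    bands = pd.cut(df["entry_score"], bins=[0, 40, 80, 100],
                   labels=["below 40", "40 to 80", "above 80"])

    # Pearson correlation computed separately within each band.
    by_band = df.groupby(bands, observed=True).apply(
        lambda g: g["entry_score"].corr(g["uni_mark"]))
    print(by_band)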

Verdict

The statement is somewhat true, but misleading in overstating the importance of entry scores in the middle band.

Review

I agree with the central finding of this fact check: while there is a correlation between a person’s entry score at university and his or her success rate, Senator Carr’s statement needs more nuance.

The fact check draws mostly upon data from one study for one year (Norton, 2013); however, the findings are consistent with those of many other studies.

Extensive studies by Birch and Miller in 2005 and 2006 confirmed that students’ success during their first year at university is largely influenced by their university entrance score.

A 2008 report by the Centre for the Study of Higher Education at the University of Melbourne (by Nigel Palmer, Emmaline Bexley and Richard James) also references studies with complementary findings.

At the same time, the following are also factors: the size of the school the student comes from; the weighting applied to the final exam mark versus the continuing assessment; the student-teacher ratio; and gender.

Furthermore, other studies have found that the relationship between achievement at school and university can vary by subject area and institution (see, for example, Evans & Farley, 1998) - Tim Pitman.

The Conversation is fact checking political statements in the lead-up to this year’s federal election. Statements are checked by an academic with expertise in the area. A second academic expert reviews an anonymous copy of the article.

Request a check at checkit@theconversation.edu.au. Please include the statement you would like us to check, the date it was made, and a link if possible.

The authors do not work for, consult to, own shares in or receive funding from any company or organisation that would benefit from this article. They also have no relevant affiliations.

This article was originally published at The Conversation. Read the original article.
