Monday 16 February 2015

University Entrance: Part VII - Conclusions

This post is part of a multi-part series on University Entrance and whether it is set at the right standard. For the previous part, click here - University Grading and Outcomes.

In this blog series, we’ve covered the following questions:
-          What is standards based testing, and how is that related to University Entrance?
-          Why was University Entrance changed?
-          Why does the government care where the UE standard is?
-          Why do secondary school students, parents and teachers care where the UE standard is?
-          Why do universities and university students care where the UE standard is?
-          Does UE accurately represent preparedness for degree-level study?

Through this discussion, we’ve uncovered some underlying themes. Perspectives towards University Entrance are informed by:
-          Accurate assessment and communication of standards
-          Macroscopic benefits to society
-          Individualistic benefits for life and the value of education
-          Resource limitations when investing in education
-          Changing cultures towards university education

Ultimately, we made some sweeping generalisations of the stakeholder groups to figure out what they think about the new University Entrance standard:
-          The government probably wants it a bit lower, but not too much lower
-          Secondary school students, parents, and teachers probably want it lower
-          Universities and university students probably want it (much) higher

At times like this, there’s an argument that pops to mind. If some people are telling you that it’s too low, and some people are telling you that it’s too high, then you’ve probably got it right. It’s the median voter model that political strategists love to hate. As a result of this positional negotiation approach, nobody is really entirely happy. But maybe they’re happy enough.

Maybe the characterisation of the UE standard as existing on a single continuum is inaccurate, when there are separate elements targeted at resolving different problems. Maybe increasing the number of graduates or the number of students achieving NCEA is the wrong goal when the standards by which we assess students can move and fluctuate. Maybe preparedness for university is not the dominating factor that dictates whether a student is successful or not at degree-level study when there are many other factors at play. There are a lot of unknowns that we can’t truly answer, which I guess is why it becomes such a political topic; if there were an objectively correct answer there would be no argument (although that doesn’t always stop people from trying anyway).

Underlying the broad UE discussion is one about whether access to university education is a right or a privilege. There are advocates who argue that all people should have the right to try and succeed, and that locking people out of the system is inequitable and furthers systemic disadvantage. There are others who argue that university education should only be afforded to the best students, who are best equipped to make use of it, making the most efficient use of resources. In this equality vs. elitism battle, left-right groups form and ideologies dictate the flow of discussion.

I’ll end this series with the philosophical consensus that I reached with columnist Verity when we discussed this a few weeks ago. University should have higher standards and only the most capable of students should be able to enter. But this is only okay if all students have the same opportunities to learn, progress, and excel beforehand. Our society is still too unequal, too inequitable, and too unfair for us to lock people out and consign individuals to a set life path as soon as they finish high school. In a world with severe socioeconomic disadvantage, where some children start the game of life with more points than others, a high entry bar only serves the interests of the elite. So perhaps solving the University Entrance problem is much trickier than setting the standard at the right difficulty; perhaps there’s a broader societal problem to be resolved first.

University Entrance: Part VI - University Grading and Outcomes

This post is part of a multi-part series on University Entrance and whether it is set at the right standard. For the previous part, click here - Universities and University Students.

Are students with UE actually prepared enough for university?
In theory, every student who enters university should be sufficiently capable to complete a degree. This is important to understand, because if a significant proportion of students are failing university, then perhaps the university entrance standard is in the wrong place. There are plenty of reasons why a student might fail, so we shouldn’t expect a 100% pass rate, but similarly a 10% pass rate would be worrying. Unfortunately, the data that we need to answer this question is not easily available.

Luckily, last year I asked all of the universities under the Official Information Act (OIA) for pass rate data for all of their papers, and now have a(n incomplete) dataset to work with. I’ll document the OIA request and processes in an appendix post (here). I’ll note here that I did receive data from University of Otago, but it did not include class sizes so I can’t use it for the below analysis. In all of the following tables, courses with fewer than 10 students are excluded. So with data from three out of eight universities, we can have a look at the average pass rate across all papers (weighted for number of students per paper) from 2011 to 2013:
Weighted average pass rate             2011             2012             2013
                                  Pass %  Courses  Pass %  Courses  Pass %  Courses
Lincoln University                  81.5      395    83.5      402    83.9      393
University of Auckland                88    2,699      89    2,721      88    2,692
Victoria University of Wellington     88    1,372      88    1,359      86    1,397
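
As a concrete illustration of how these averages are weighted, here is a minimal sketch of the calculation. The numbers and field names are illustrative, not taken from the OIA data:

```python
# Weighted average pass rate: each paper's pass rate counts in
# proportion to its enrolment, so large papers dominate the average.

papers = [
    {"students": 500, "pass_rate": 90.0},  # a large paper
    {"students": 100, "pass_rate": 60.0},  # a small paper with a low pass rate
]

total_students = sum(p["students"] for p in papers)
weighted_pass = sum(p["students"] * p["pass_rate"] for p in papers) / total_students

# (500 * 90 + 100 * 60) / 600 = 85.0, versus an unweighted mean of 75.0
```

An unweighted mean of paper pass rates would understate the experience of the typical student, which is why the tables weight by enrolment.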

Around 10-15% of students failing is probably reasonable and to be expected. There are many reasons to fail papers beyond preparedness for university. However, these statistics are across all papers at the university, including in some cases Foundational or Honours papers – the picture looks different when we only consider 100-level (commonly, although not exclusively, 1st year) papers:
Weighted average pass rate             2011             2012             2013
(100-level papers only)           Pass %  Courses  Pass %  Courses  Pass %  Courses
Lincoln University                  75.2       49    76.3       51    75.4       52
University of Auckland                83      378      84      372      82      367
Victoria University of Wellington   78.3      205      80      200    79.6      202

An additional 5-10% of students fail when we look at 100-level papers only. So it would be fair to say that the average first-year student is more likely to fail than the average student at a university, which is probably not a ground-breaking conclusion; the question is whether that failure rate is too high, too low, or just right. However, rather than looking at averages, let’s look at the data another way. In each of the following graphs, only 100-level papers with at least 10 students enrolled are shown:

Note: The 0% pass rate papers at the bottom-left of the VUW graph are papers that students going on exchange are enrolled in for administrative purposes.

Beyond averages, it’s important to look at the spread and see that there is broad variation from the mean. In particular, there are some courses with quite low pass rates - as low as 40%. Generally, failing one paper is enough to derail a degree and force students to take an extra semester (or more) to finish their course. Additionally, it’s important to recognise the sizes of the classes and appreciate the scope of the issue.

For example, let’s take the right-most point on the UoA graph. This course had 287.69 EFTS enrolments, which equates to roughly 2,300 students (assuming that it’s a 15-point course), and a pass rate of 75%. One-quarter of the students who take this 100-level paper fail. That’s roughly 575 students who failed that course in 2012, and who will either repeat the course, try to do another paper instead if they’re allowed to (this one happens to be a pre-requisite for a number of majors), switch degrees, or drop out of university.
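
For readers wanting to reproduce that arithmetic, here is a sketch of the EFTS-to-head-count conversion, assuming the standard New Zealand convention of 120 points per EFTS (the 15-point course assumption comes from the text above):

```python
# One EFTS (Equivalent Full-Time Student) is 120 points, so a 15-point
# course contributes 15/120 = 0.125 EFTS per enrolled student.

def efts_to_students(efts: float, course_points: int = 15) -> int:
    """Approximate head count for a course whose enrolments are reported in EFTS."""
    points_per_efts = 120
    return round(efts * points_per_efts / course_points)

students = efts_to_students(287.69)      # roughly 2,300 students
failures = round(students * (1 - 0.75))  # roughly 575 failures at a 75% pass rate
```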

Now, with the caveat that students who fail one paper are likely to have failed other papers as well and therefore be double counted, we can look at the total number of course failures for 100-level courses at each university:
Total Number of Course Failures       2011      2012      2013
(100-level papers only)
Lincoln University                   1,534     1,266     1,293
University of Auckland              14,598    14,066    15,218
Victoria University of Wellington    8,866     8,139     8,182
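
The per-university totals come from summing estimated failures across individual papers. A minimal sketch of that aggregation, with invented course records (the real OIA spreadsheets differ in format between universities):

```python
# Estimate total course failures by summing, over every 100-level paper,
# the head count multiplied by the failure rate. Course codes and
# numbers here are made up for illustration.

courses = [
    {"code": "ACCT 101", "students": 320, "pass_rate": 0.82},
    {"code": "MATH 104", "students": 450, "pass_rate": 0.70},
    {"code": "BIOL 111", "students": 280, "pass_rate": 0.88},
]

total_failures = sum(
    round(c["students"] * (1 - c["pass_rate"])) for c in courses
)
# sums the failures (58 + 135 + 34) across the three papers
```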

Making the critical assumption that the universities did a perfect job in teaching their students and preparing them for their exams as best they could and that the assessment standards are at the right places [cough cough], this table shows a worryingly high number of course failures. Taking a somewhat educated stab in the dark, we might see roughly 50,000 course failures across all eight universities per year. The question then becomes this – is this number acceptable?

Why are so many students failing?
There are two main reasons why a grade distribution might look the way it does: either the students are assessed against a standard and the distribution reflects their ability, or the assessors have an expected distribution and scale the marks to meet it. As described in the previous part of this series, at university it’s likely a combination of both. As several of the universities reminded me in their responses to OIA requests, there are many factors that influence pass rates, and similarly there are many factors that influence failure rates, some of which are difficult to pin down. However, one of those factors is that an increasing number of students are insufficiently prepared for university.
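
To make the second mechanism concrete, here is a toy sketch of scaling marks toward a target mean. Actual university scaling policies vary widely and are rarely this simple, so treat this purely as an illustration:

```python
# Linearly shift raw marks so the class mean lands on a target value,
# clamping each mark to the 0-100 range. A crude stand-in for real
# scaling policy, which differs between institutions and departments.

def scale_to_target_mean(marks: list[float], target_mean: float) -> list[float]:
    shift = target_mean - sum(marks) / len(marks)
    return [min(100.0, max(0.0, m + shift)) for m in marks]

scale_to_target_mean([40, 55, 70], 65.0)  # shifts every mark up by 10
```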

A report by the Tertiary Education Commission in 2013 (obtained by the Listener under the OIA) stated that most students who got an “achieved” grade at university would have had a “fail” grade under the School Certificate/Bursary system. Universities said that students are reaching them under-prepared and with a poor work ethic. Dale Carnegie at Victoria University said that students were able to “game” NCEA in a way that they couldn’t at university, giving the perception that they are more capable and prepared than they actually are. This suggests that if we continue to push more and more students into university who have reached the stated minimum entry requirement but are ultimately unprepared, we can only expect the number of students who fail papers to increase.

When an unprepared student enters university, fails one or more papers, and drops out of university, we end up with a lose-lose-lose situation. The government wastes money funding education and student loans, the university wastes time and resources catering to their needs, and worst of all the student wastes time that they could have spent working and wastes money that they could have spent on more useful things.

At the end of the day it’s about whether or not young people are in the right places for them and their future. University simply may not be the right place for everyone, and pushing students towards university when it’s not the right path for them has negative consequences. This effect has been seen in the United Kingdom, where university drop-out rates have soared to the 20% range; the University and College Union secretary suggested that the source of the problem is rising fees forcing students into courses that are cheaper but “do not suit their abilities”. In the United States, 33% of freshmen (first-year) students don’t make it to second year. A lack of self-directed and disciplined learning, an inability to move from rote learning to conceptual thinking, and being motivated to attend university for the wrong reasons (such as just to please friends and family) are identified as key factors that drive students away from completing their first year of studies. When a young person is pigeonholed into a system like a university where they really should not be, maybe we are setting them up to fail.

What does this mean for University Entrance?
The reasonably high failure rates indicate that University Entrance does not accurately reflect adequate preparedness for degree-level study. So assuming that the universities do not strictly stick to their existing expected distributions, it will be interesting to see what happens over the next couple of years with a raised UE standard. In theory, the number of failing students should decrease, and the pass rates should rise. Ultimately, this is a good thing. At the exact moment when a student finds out that they missed out on getting into university, it can be a real struggle to appreciate that. But maybe they’ve avoided wasting time and money on struggling through a university education. And of course, if they really want to get into university, there are bridging and foundation courses designed to ensure that students really are prepared for university, giving them a much higher chance of success at degree-level study.

Ultimately this all ties into broader ideological arguments about whether or not every person should have access to university (or tertiary) education. It leads to arguments about equality vs. elitism, which is not necessarily an argument that can be won. There are strong advocates on both sides, and I can see the merits of both sides, and I think the answer is probably a balance somewhere in between. So I’m going to wimp out and let believers from both sides duke it out in the comments below if they want to.

In this section, we looked at what happens after students obtain University Entrance and actually get into university, pass rates and their associated failure rates, discussed why failure rates are so high, and touched on how UE can affect these failure rates. In the next section, we’ll try to wrap everything up.

As an aside, for an interesting read of how the “failure” problem has continued over decades, this article published in Salient (the Victoria University student magazine) in 1966 discusses why students fail university courses and how that failure can be anticipated and avoided. One line near the beginning sticks out: “students who do well at school are not necessarily as successful at university”.

Part VII of this series - Conclusions, is available here.

University Entrance: Appendix - Chronicling an OIA journey

A while ago, I was wondering about the pass rates of courses at different universities, and whether conclusions could be drawn about any given student's chances of success at a university once they have been accepted. Obviously there are a multitude of variables that affect pass rates, but there may be some overall patterns at play. There are many rumours about whether some courses are easier or harder to pass than others, whether universities have (un)official policies about pass rates, and whether teaching staff are "encouraged" to "massage" the numbers to get students to pass.

So on March 13th 2014, I sent the following e-mail to each of the eight universities in New Zealand:

To the Registrar [or other OIA designee],
Under the Official Information Act, I would like to request a list of the:
1. Course Code
2. Course Title
3. Number of students enrolled
4. Completion Rate
5. Pass Rate
6. Mean, median, and range of the marks (in %)
7. Whether the final scores were scaled from the raw scores
8. Whether the course is externally moderated
for every taught paper delivered by the University in the past three years.

I would strongly prefer this in a digital format (such as an Excel spreadsheet) if possible. Please let me know when this request has been received. Please let me know in advance if there is a cost associated with this request.

Below are the responses that I received, in chronological order:

University of Waikato (17/03/14)
Please be advised that you can find this information in the Annual Report released yearly. Please visit http://www.waikato.ac.nz/annualreport/annualreport.pdf for more information. The 2013 Report will be released soon. You can also visit http://calendar.waikato.ac.nz/

This unfortunately did not satisfy my request. The Annual Report (which all universities produce) does have the total number of students enrolled, the completion rate, and pass rate, but I was interested in the data for each individual taught paper. I replied to let them know, but received no response after that.

University of Auckland (17/03/14)
I was called by the General Counsel of the University, who informed me that while items 1-5 of my request would be relatively easy to source, 6 and 7 would be almost impossible, as that information would be managed at a Faculty level and it was unlikely that it was even stored (which is reasonable). With regards to 8, most courses are not externally moderated, as there are 5,000 courses and it would be impossible to moderate them all. I was advised to send in an amended request so that the request wouldn't be refused overall (as per section 18B of the OIA). I sent a new request for just 1-5 and a list of externally moderated courses on the same day.

University of Otago (19/03/14)
I received a response that my request had been passed on by the Registrar to the Manager, Policy and Compliance in the Academic Services Department for co-ordination. I replied to thank them for responding.

University of Auckland (01/04/14)
I got another call from the General Counsel, who said that they had the data for 1-5 ready and was able to send it to me. Unfortunately she had to refuse the request with regards to information about externally moderated courses - it wasn't an issue about funding, it was simply a time and resources issue because it would be difficult to co-ordinate the collection of that information. The e-mail I then received read as follows:

Please find attached the information you requested in points 1 to 5 in your amended request for information as stated in your email dated 17 March 2014. The information is provided in both your preferred format, and excel spreadsheet, and in a  pdf.

All courses taught in 2011 to 2013 are included in the list. Please note that the student enrolments are provided in EFTS as that is how the University reports on pass rates and Course Completions.  

As I mentioned to you during our telephone conversation this afternoon, we do not have a central repository of information relating to the papers that are externally moderated, and a list of such papers cannot be made available without substantial collation and research.

Accordingly, as I have already advised, your request for this latter information is refused under s18(f) of the Official Information Act. I am obliged to advise you have the right, by way of a complaint to the Ombudsman under s28(3) of the Official Information Act, to seek an investigation and review of this refusal.

University of Otago (02/04/14)
I received the following e-mail from the Registrar:
I refer to your request under the Official Information Act for information relating to student grades by course subject code.  Unfortunately, the statistical data in the particular format you describe are not readily available, as the University does not standardly report to the level of individual courses.  However, we have now had the opportunity to identify existing reports we have that could provide you with meaningful data in response to your request.

In the University of Otago’s Annual Report, we publish a table entitled Examination Pass Rates -  http://www.otago.ac.nz/about/otago045429.pdf (see page 102).  We have been able to break this down to a greater level of detail into pass rates by subject for the five years 2009 – 2013.  This report is attached for you.

We must emphasize that these data do not take into account the nature of student cohorts in the various subject areas, variations in course content, variations in the nature of course delivery, differences between professional programmes and general areas of study, and a range of other variables all affecting pass rates.  

However, bearing all these limitations in mind, we trust that it is of interest to you.

The data I received was a PDF, in a similar format to that seen on Kiwiblog here. Given that it is unfortunately quite different to the University of Auckland data, it will be difficult to do direct comparisons. In particular, because the data was given for courses/subjects rather than individual papers, with no breakdown by number of students enrolled or level of study, drawing conclusions may not be possible.

Victoria University of Wellington (11/04/14)
I received an e-mail from the Assistant In-house Solicitor, which included an official response (printed on the letterhead and then scanned):

I refer to your request of 13 March 2014 to Victoria University of Wellington ("Victoria") under the Official Information Act 1982 ("the Act") for the following information:

[The OIA request as stated above]

In response to parts 1) to 5) of your request, a spread sheet containing all courses for the 2011 - 2013 academic years, including title, code and the number of students enrolled is attached. The pass rate is the percentage of students who passed the course. The completion rate is the percentage of students who completed the requirements of the course, this may include some students who did not necessarily pass the course.

Where a course had 10 or fewer students I have withheld the information regarding pass and completion rates under section 9(2)(a) of the Act on the basis that it is necessary to protect the privacy of natural persons (the students) and that this is not outweighed by any other consideration that would make it desirable, in the public interest to release this information.

The information regarding 6) to 8) of your request is not held centrally and would require collating the information across all 29 schools at Victoria. Further, providing the information in part 6) of your request would, in some cases, involve staff undertaking calculations. As the requested information covers 6112 courses over three academic years, collating the requested information would take a significant amount of time and involve multiple staff members.

I have considered whether extending the timeframe or fixing a charge to provide the information would enable your request to be granted. However, I estimate that Victoria would need to extend the timeframe for response by 1-2 months and any charge fixed would be substantial due to the number of staff hours required, therefore, I do not consider these to be viable. As such, I am refusing these parts of your request under section 18(f) of the Act on the basis that this information could not be made available without substantial collation and research.

You have a right, under section 28(3) of the Act to seek review by an Ombudsman of my decision to refuse your request.

I replied with a gracious thank you, accepting the reasons for the refusal of the request. However the data was provided in PDFs, which is a little unwieldy to work with, particularly when there are thousands of courses, so I asked for an Excel spreadsheet. This was denied to "ensure that the data I had withheld was not inadvertently provided within the excel spreadsheets". I ultimately converted the PDFs into spreadsheets myself (it was a bit more than just a copy-paste job).
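
For anyone attempting the same conversion, the fiddly part was less the text extraction than normalising what came out. A sketch of the sort of row clean-up involved (the row format here is illustrative, not VUW's exact layout):

```python
# Turn one row of extracted PDF text into typed values: strip stray
# whitespace, remove thousands separators, and convert "88%" to 0.88.

def clean_row(row: list[str]) -> tuple[str, int, float]:
    """Parse one extracted row into (course code, enrolments, pass rate)."""
    code, enrolled, pass_rate = row
    return (
        code.strip(),
        int(enrolled.replace(",", "")),
        float(pass_rate.rstrip("%")) / 100,
    )

clean_row(["MATH 101 ", "1,372", "88%"])  # -> ("MATH 101", 1372, 0.88)
```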

Lincoln University (14/05/14)
About two months after the original request, I received an e-mail from the Director, Governance. It included a letter that had been printed, signed by the Vice-Chancellor Dr Andrew West, and scanned back in. I hadn't really anticipated that the question would be escalated so highly.

Thank you for your request under the Official Information Act 1982, received by Lincoln University on 14 March 2014. Apologies for the delay in responding to the request.

You requested [the OIA request as above]. 

The data is provided in the attached Excel spreadsheet. There is a data page and a notes page with the applicable definitions or assumptions. The data does not include courses with four or fewer enrolled students (by head count) in order to protect the privacy of individual students who could potentially be identified. The number of such courses is as follows:

2011 - 148
2012 - 146
2013 - 159

For each taught course, you also sought information on any applicable scaling and/or external moderation. There are very few cases of scaling being applied at the University. The Academic Administration Committee scrutinizes examinations reports from faculties in which scaling would be identified if it was carried out, but scaling is very rarely applied by examiners and would most likely only have been applied when results fell outside the range of historical norms, and there was a specific reason for this. The small number of cases of scaling have therefore not been identified as it would be a substantial task to do so across the three year period.

Taught courses are not currently externally moderated, although theses and dissertations are externally examined. Certain courses will be moderated by professional bodies as part of accreditation activities.

Please note that you are entitled, under section 28 of the Official Information Act 1982, to have this response reviewed by the Office of the Ombudsmen.

I received no response from Massey University, Auckland University of Technology, or Canterbury University, and as indicated above an unsatisfactory response from Waikato University. I could go to the Ombudsman about this, but at this point it's been far too long and I just can't be bothered anymore. I'm happy to just work with the data that I got and try to go from there.

I am choosing not to attach the raw datasets and release them - I believe that the information would likely not be used appropriately by students and lead to consequences that I don't really want to deal with in the long-term. In some cases the class sizes are small enough that individuals would be identifiable (and then people would know if they passed or failed a particular paper). Also, teaching staff are people too, and I don't want to embarrass or cause undue stress to any staff members by releasing the pass rates of courses they teach. If anyone is motivated enough to want to find out the pass rates, they are welcome to submit an OIA request to the appropriate university themselves.