Sunday 9 September 2012

The Sorry Truth about A-level Re-Marks

I hadn't spotted it in the paperwork, and I have yet to meet anyone who did, but OCR have changed the way in which they handle re-marks.

Re-mark v. Review
Previously, when a candidate or centre believed that the original examiner (Examiner1) had awarded the wrong mark, the script was re-marked by a different examiner (Examiner2); this new mark took precedence and overall grades were adjusted accordingly. This no longer happens in every case.
The process now applied by OCR is not actually to re-mark the script; rather, it is to review whether the original mark falls within the correct band. So long as Examiner1 put the essay into the right band, the mark remains unchanged. Only in those cases where Examiner2 places the script into a different band will the mark change.
This is borne out by Berkhamsted School's A2 OCR re-mark statistics:
  • 26/35 re-marks have seen no change
  • 3/35 have seen a change of ≤3 marks
  • 1/35 has seen a change of 4-6 marks
  • 5/35 have seen a change of >6 marks
The inherent injustice of the OCR Review system:
In cases where Examiner1 gives a mark that is at the bottom of the band, it is quite possible that Examiner2 would place the pupil in the same band but with a mark up to 4 marks higher on each question (depending on the band). However, so long as the mark given by Examiner2 is in the same band, it is the Board's policy that the mark will remain unchanged.
Thus it is quite possible for a candidate who has been placed at the bottom of a band on two essays to be judged by Examiner2 to be 8 or more marks better than Examiner1's evaluation, and yet to have the re-mark returned with no change in the mark because it falls within the same band. For those candidates who fall just short of a grade boundary, eight marks can be the difference between going up a grade and gaining a place at their chosen university.
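To make the arithmetic concrete, here is a minimal sketch in Python of the review policy as described above. The band boundaries and the marks are invented purely for illustration; they are not OCR's actual mark scheme.

BANDS = [(0, 7), (8, 12), (13, 17), (18, 22), (23, 30)]  # hypothetical bands

def band_of(mark):
    """Return the index of the band containing a given mark."""
    for i, (low, high) in enumerate(BANDS):
        if low <= mark <= high:
            return i
    raise ValueError(f"mark {mark} is outside the mark scheme")

def reviewed_mark(original, reviewer):
    """Under a review, the original mark stands unless the reviewing
    examiner places the essay in a different band."""
    return reviewer if band_of(reviewer) != band_of(original) else original

# Two essays marked at the bottom of the 13-17 band by Examiner1,
# but judged worth the top of that band by Examiner2.
examiner1 = [13, 13]
examiner2 = [17, 17]

review_total = sum(reviewed_mark(o, r) for o, r in zip(examiner1, examiner2))
remark_total = sum(examiner2)  # what a genuine re-mark would award

print(f"Review total:  {review_total}")   # 26 - marks unchanged
print(f"Re-mark total: {remark_total}")   # 34 - 8 marks higher, same bands

Under the review the candidate keeps 26 marks; under a genuine re-mark they would have received 34, a difference of 8 marks even though both examiners agree on the band.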

Marking Low: A case study
Sadly, those of us who have worked within the British examination system for a number of years are only too aware that there is a considerable range in the quality of marking at both GCSE and A-level. If Examiner1 consistently places candidates at the bottom of the correct band, then those candidates and that centre will be penalised, and they have now lost the mechanism to receive an uplift within the band.
Because centres can now request to see A-level scripts, we have a growing body of evidence that, here at Berkhamsted, we have suffered from at least one examiner disproportionately placing pupils at the bottom of a band. (He placed 7 out of 10 scripts at the bottom of bands.) Other schools are reporting the same phenomenon.
We have long known that some examiners are more generous than others. Requesting that a script be re-marked gives a candidate and centre the opportunity to ensure that candidates are not penalised by an excessively harsh marker. The OCR review system undermines this.
OCR: Recognising Achievement - I don't think so! #bringbackremarks

3 comments:

  1. This is a really poor post which indicates a profound ignorance of how marking is actually done. I am a bit peeved this year to hear heads moan about marking when the markers have (at least apparently) been less generous. There were few complaints when markers were (apparently) more generous, which of course worked in headteachers' favour.
    In any case, there are few subjects in which precise marking is possible; Mathematics and the sciences are the only ones I can think of. All the rest are necessarily, to some extent, subjective. Hence, banding.
    The senior examiner's role is to ensure that the assistant examiners follow the mark scheme and, in effect, this means that they award candidates the marks in the band which is most appropriate. This can be difficult and, even here, nuance and opinion can be significant. However, the position in the band is a judgment of questions like "how clear is this?", "how good is the English?", "how well are they using examples?". These are judgments and we generally trust our examiners' competence. To do otherwise is purely second-guessing.
    During the re-marking process, the board which employs me selects different examiners to re-mark scripts, all senior examiners (that is, the most experienced - can you do better?). Our job is to catch any that have fallen through the net and have been badly marked. It happens; we are not perfect. We ensure that students' work is marked in the correct band (as best we can) and, on occasion, change marks within a band, but this is a very dodgy process so we are cautious about doing it.
    I am a senior examiner (not for OCR, who are of course hated rivals) and a teacher. I promise you that we do our best. If you can do any better, let's see what you can do before making smartarse, ignorant criticisms.

  2. Dear anonymous,
    The point of my post is not to question the integrity or industry of markers. (We all have sympathy with markers who face deadlines and who are paid a pittance.)
    Rather, my point is to question the fairness of a system that does not allow for re-marks to redress the mistakes that will inevitably happen.
    There needs to be an appeal process that is fair and just and that recognises pupil achievement. I believe that the Review process being employed by OCR is unjust and unfair on candidates.
    MSS

  3. This post is full of inaccuracies which do little to enhance the debate about the British examination system. Procedures for reviewing marks awarded have been in place for over a decade and have not been recently changed as you claim. Neither are they the procedures of an individual examination board: they are a regulatory requirement. The myth of a re-mark is perpetuated by schools who encourage parents who can afford to do so to challenge results. It is, of course, right that this mechanism should exist, and it still does. But the service provided by exam boards is a check of the accuracy of marking, and this is subtly different from a re-mark.
    Assessment is not an exact science and, as a result, tolerances are applied. In the 'good old days' a marker out of tolerance may have been scaled. This was a blunt and, in all likelihood, 'unfair' practice, but it was aimed at redressing a perceived lack of adherence to the standard set by senior examiners. Very few scripts were sampled, and a rogue examiner could go undiscovered as they would mark their samples diligently. Sampling is now 'live' and it is commonplace to halt aberrant examiners, any marked scripts returning to the pool.
    It is sloppy logic to equate a lack of changes in marks at 're-mark' stage with a procedure which effectively forbids such changes. It is equally likely that the original mark was correctly awarded. This is not to deny that the quality of marking is indeed variable. There are good markers and there are bad. 'Twas ever thus. In my experience, however, the bad don't survive a system of online marking as easily, or do not take kindly to the checking of their marking which is now continuous. Rogue marks will get through any system, however. They should be corrected in a review (or, if you insist, a re-mark - the distinction is a little semantic) and they are.
    In many mark schemes there are judgements to be made which distinguish between marks both between and within levels. The reviewing examiner, a senior examiner, will ask whether the judgement originally made was fair and accurate. If it was not, the mark will be changed. To suggest otherwise does, I am afraid, question the professional integrity of those who are involved in the assessment of candidates' work - the vast majority of whom are teachers themselves. I have seen both generous marking and harsh marking. Thankfully I have witnessed few unprofessional markers who are not willing to adjust their standard to that set at the start of the process. Neither have I known a senior colleague who has been reluctant to change an incorrect mark at the Result Enquiry stage.
    We have, probably, one of the most open and transparent examination systems in the world. The really sad reflection is that schools have fostered a lack of trust in professional integrity, driven as they are by a results-oriented approach to education rather than a focus on the learning process itself. Your sample size is simply too small to sustain the judgements you make; I have to say that the majority of challenged marks have actually been correctly and fairly awarded.
