I welcome everyone to the Education, Culture and Sport Committee, especially our witnesses. Our first witnesses are the young people of South Lanarkshire. You are a politician's dream—we have never had this kind of coverage before. Indeed, this is not our usual everyday event. You will have to come to more of our meetings.
Yes.
I believe that the other witnesses were sitting highers. I could be really awful and pick on someone to start. Who is feeling particularly brave and wants to go first?
When 10 August came, my results did not. I waited but they had still not come by the late post. I was concerned, so I phoned the school to find out whether it had received notification. However, it did not know what had happened, so I phoned the SQA helpline, which I had seen mentioned on television. I was told to phone the school again and then phone the helpline back. The school still did not know anything; it was in the same position as me.
When did you finally receive confirmation of your results?
I have yet to receive a complete certificate. I received a letter from the SQA, which said that there had been a complication and that I was due an A for higher physics. However, I am still waiting for my full certificate.
Thank you. I will ask each of you to say a few words before I ask members of the committee whether they have any questions.
I was in a similar situation to Lewis MacKinnon's. I did not receive a certificate on the day that the results were due out. That was worrying. We phoned the school to see whether it had my results but it did not have anything, so my mum phoned the SQA helpline. The SQA had my results but at that point it was not able to give them to us over the phone. We were told to phone the school again to find out whether it had them. We phoned the school again, but it did not have them and said that it was unlikely to have them until the next day.
What did you hope to go on to study?
Biomolecular and medicinal chemistry at the University of Strathclyde.
So physics was an important result for you.
Physics was an important part of my results. When clearing took place, I could not have gone to university even if I had wanted to. I was left with no choice; I had to go back to school.
I sat three highers and one intermediate 2. I got my results, but my higher modern studies was not mentioned. I was shocked by that, as I had done well in the prelim.
When did you finally hear what had happened?
We tried to get through to the helpline, but it was engaged. My dad eventually got through from work and the helpline people confirmed that I had failed modern studies.
What is the situation now?
I am still waiting for my appeal.
Thank you, Jennifer. Tell us what happened to you, Namita.
I sat five highers last year, but when my results came out on 10 August I had been given only three of them: maths, English and history. For physics, I was given only two of the units and the external exam was mentioned under external assessments. For chemistry, I was given only two of the units and there was no mention of the external exam. I went down to the school, but it did not have the results either.
What course do you want to apply for?
I am interested in doing dentistry.
Christina, I understand that you were sitting your standard grades and also had problems. Will you tell us about them?
When my results came out, there was no mention of my accounting and finance, but I was not worried to begin with as I had heard that there were a few problems. However, about a week later, I received a second set of results stating that further information had become available and that any improvement in my grades would be shown. I thought that I had failed, as the second set of results was the same as the first and still did not show accounting and finance. However, when I went back to school, I was told that I was recorded as not having sat the exam, as the SQA had misplaced a small group of accounting and finance exam papers from my school.
Have you heard any more since then?
No. I am still waiting for my result.
Have you had no indication whatsoever?
No.
Has that affected your choice of subjects for this year?
It would have affected me if the school had not allowed me to take accounting and finance as a higher, but fortunately it has allowed me to do that.
That is good. Thank you all for explaining what happened with your results. I shall now allow members of the committee to ask you questions.
I was interested in people's experiences of phoning the helpline. I had to phone it on behalf of some young people in my constituency. I would like you to say a bit more about your experiences, so that I can judge whether my experience was typical. What sort of information did you get or fail to get? How many times did you have to call?
I did not ring the helpline; my mum took the responsibility for calling. Our post usually arrives at 8.30 am; when there were no results in the post she was on the phone straight away. She was able to get through with almost no trouble, but at that point the helpline people could not give out my results. They had them and the bloke on the end of the phone actually said, "I've got Alan's results here on a computer screen in front of me, but I'm not permitted to give them out at this time."
How did it feel to know that the result was there but you could not get it?
It was very frustrating. The person sitting at the end of the phone has the results in front of him and knows what they are, yet I am the one whose results they are and I am sitting at home really worried. It was really stressful. It was not a nice experience.
When it became obvious that the school did not have the results, what did the school say? Did it suggest that it would have them any minute? How long did it take for the school to get the results?
Perhaps Jennifer Irvine would answer that. Did you contact your school, Jennifer?
My mum tried to get through to the helpline, but failed, so I went up to the school to speak to Mr Sherry. He explained to me that, if the result was not on the certificate, I might have failed. The school did not have the results. It took a few days for the results to come through.
Did the helpline suggest that the results would be through any minute?
Yes.
Was it the same for you, Alan?
We phoned the helpline first and were told to phone the school; the helpline led us to believe that the school would have the results. The school was helpful, but it did not have anything. The people there did not know what was going on. I was told to phone back. Eventually, the school said that it probably would not have the results until the next day and that I should phone back then.
I am interested in finding out whether the guy on the helpline believed that the school would have the results at any moment. That is something that we need to ask the people at the SQA. Thank you.
Perhaps Namita can tell us whether it was the same at her school. Was someone identified whom you should contact?
I went down to the school on the day that I got my certificate. I spoke to Mrs O'Neill, the assistant head teacher. She took down the details of what had happened and I showed her my certificate. The school did not have the results, so there was no way that she could tell me what they were. She said that, because my chemistry result was not mentioned but my physics result was, it was possible that I might have failed chemistry. She was quite surprised. I was too, because I knew that I had worked hard, as I needed the result for the course that I want to apply to.
I have two quick questions. Namita, you said that you were surprised at the result, because it was six bands below what you expected. What did you get in your prelim?
I got a B in my prelim.
Was the exam that you sat harder than you expected?
I do not know. I can never tell how I have done when I come out of exams.
Was your teacher surprised?
Yes, the teachers were quite surprised, because a few of us, all at the bottom of the register, had failed, which made them suspicious. It was not just me; there were others who had been predicted to get an A or a B but who got bands 8 or 9 when the results came back.
Were those people at the bottom of the register alphabetically?
Yes.
So your teacher was suspicious that it might be something to do with that fact.
No, I felt quite comfortable with it.
What did you get in your prelim?
I got a band 1.
As far as you were concerned then, the course had gone quite well.
Yes.
What was your teacher's reaction to the fact that you had failed?
The teacher found out that a small group of people were showing up as not having sat the exam. He thought that it was impossible that none of us had turned up.
You were definitely there.
Yes.
I am sure that you remember it well. So, the situation is that your paper has been lost and nobody knows what is happening.
I have been told that my result will have to be based on an estimate from my teacher, which I will get around October or November.
How do you feel, having gone through all the effort of sitting the exam? How long was the exam?
I think it was around two hours.
Now you discover that your paper no longer exists. How do you feel about that?
I felt really disappointed, because I knew that, although I had put the work in and the teacher had put the work into teaching us, the SQA had mixed it up.
One of the things that we have to do is to make sure that this never happens again. Although this may seem a blindingly obvious question, I would be interested to hear your thoughts on what should be done. From the sharp end, what would you say should be done to prevent this from happening again?
That is a difficult question to answer.
Jamie is well known for asking difficult questions.
It is important, because we have to sort this out.
I do not know the whole situation; I know only what my situation has been. Certainly there has been some form of administrative error, whether it has been not enough markers or problems with the post. Surely there should have been some indication that that would be the case, at least with regard to people not receiving results. I had heard that there would be problems, but it was not fair that people were not notified that their results were not going to come. That was the most painful part of the experience. Everyone else had their results, but I did not know where I stood.
I feel the same as Lewis, because I received no results. There were some indications on the news the night before that a few people would not receive their results, but I felt that if that was going to be the case there should have been notification beforehand to prepare people. In the weeks before your results come out, the tension builds up and you get more nervous. On the day, you just want to get your results and get it over with; afterwards, you can relax, but I could not because my results did not come. I am still highly strung and worried about what is going to happen.
After the work that you have put in, how do you feel about the SQA?
Can we move on from how the pupils feel about the SQA?
In a sense, this is still about how they feel about the SQA. Do you think that the helpline could have given clearer information? Alan, you said that the first time you phoned the helpline, the people there could not give you information.
It could have been more helpful and better organised. As I said, the person on the helpline knew my results but was not able to give them to me. That was the line that the helpline took at that time but, later on, after phone calls from other people, people there must have decided that they should give out the results, because that is what happened when my dad phoned in the afternoon—there had been a change of tack.
Was it clear to you that the helpline did not know that the schools did not have the information?
Yes, because the first time I called I was told, "Phone the school. It will have your results because we have sent your results to it." However, when we phoned the school, we were told that it did not have anything and did not know what was going on. It was clear that the helpline thought that the school had received the results and that the problem was just that my results had not arrived in the post.
How did people feel when they went to school and found that the school did not have the information that they were looking for?
It was a further blow. You thought, "If the school does not know, and I do not know, who does know? Have the results been lost?"
Was that another fear—that the results had been lost?
Yes, especially as the helpline was saying that the school should have the results. It was not as though someone had the results; they were lost.
Was the school supportive? Did people in school say, "Don't worry; a couple of days will sort this out and it will not make a difference," or were they panicking a bit as well?
The school reassured me that it would do everything in its power to help me to get my results. The school knew my academic ability and what I should get, and was intent on getting it for me.
I want to take you back a bit. Alan Burns mentioned the press reports just before 10 August, which began to alert you, especially the higher pupils, that something was going wrong. Did those of you who were sitting your highers last year ever get a sense that everything was not going to be completely normal? That could have come from what your teachers were saying or during the preparation for the exam. Did it all come as a big shock just the week before the results?
It was a brand new course and it was obvious during the term that there were some teething problems. However, the general opinion was that we were getting through it and that everything was going fine. It was only the week before the results came out, when things started coming out in the news, that I felt there might be a problem. It was said that only a few people would be affected. I thought that it would not affect me and that I would be all right. When it happens, it is a really big shock and people do not know what to do.
I think that Jennifer Irvine talked about the day when you opened your results and found that modern studies did not appear on the certificate. To what extent were you aware that that was happening to other people? To what extent did you think it was just you?
I thought it was just me. I was not aware of the problems the media had been talking about, as I had been on holiday and had got home the night before.
Do you think it would have been helpful to have got a letter alerting you to the fact that the SQA was having problems this year? You all said that you went to the school when you became aware of the problems. I do not think that, 30-odd years ago, I would have had the sense to go to my school. How did you cope with working out what to do next, having realised that you had a problem?
It was almost a gut reaction to phone the school. It is the link—people at the school were the other people, apart from the SQA, who should have known what was happening. We would phone them to see what they had to say. They did not really have a clue what was going on and they did not have my results. That left me in a bit of a situation. I did not know what to do next; the SQA helpline could not give me my results at that point. I was running about all over the place, sick with worry. I did not have my results, whereas everybody round about me did. At that point, my friends were coming to the door, asking me what results I had got, but I could not tell them. It was not nice.
Would it have helped to have had a very clear explanation, whether by letter—I think the SQA considered sending letters to you all a week before the results came out—or by press statement, that there might be a problem, not to panic and that it would be sorted out?
That would certainly have helped prepare me for not getting the result on time. As it turned out, it was totally unexpected.
You are still missing information, even now. Have you had a letter of explanation or some form of apology from the SQA?
Yes. I received a letter on 5 September, confirming that I had got my pass in higher physics. The SQA said that that letter could serve as my certificate, for the purposes of getting the correct codes. It said that it would send me a completed certificate in due course. I have still not received it.
I saw a few surprised looks from the other witnesses there. Was it not the same for you, Jennifer?
No, it was not. I did not get such a letter.
What about you, Namita?
I was sent a letter, saying that the SQA would investigate the courses for which my results were incomplete and about which the school had phoned.
Christina, you were sitting your standard grades, your first national exams. Was your reaction also to contact the school?
I felt that there would be no point phoning the SQA helpline, as I thought that it was prioritising the higher candidates. I therefore thought that the best people to contact would be at school, and that they might have information about what was happening. I was fortunate, as they had the information that the SQA had misplaced my result.
My question is on similar lines. You said earlier, Lewis, that the form that you received this year was very complicated. Were you shown the form before? Did the schoolteachers give you a dummy form to prepare you to interpret the real form correctly?
No. My only idea of what the form would be like was based on the form for standard grades in previous years—it had just the names of the core standard grade qualifications and the grade. I thought that there would be a similar, simple format for higher, but in fact the form dragged on for about six pages.
It was obviously a complicated form but, when you read it, were you able to interpret it, know how you had done and know what to do if what you were expecting to see was not there?
I had a good look at it and I could tell that I had got an A for the exam that I sat in May. I could also see from the breakdown of the units that a unit test was missing. I knew that I had sat that and passed it, and that it had been sent, so I knew that that was a problem.
You have sat the highers and you have a good understanding of those exams, but was everybody aware of the differences between them and your higher still exams this year? Was the importance of the unit assessments made clear to you? Did you know that if one was missing you could not get the exam?
The school had done its best to make it clear that if you did not pass the unit assessments you could not sit the final exam and you could not, therefore, get an overall pass. All our teachers in all our subjects had made it clear that we had to pass the unit assessments to get a final grade. It was because I knew I had sat them all and passed them all that I could not figure out why I did not get physics.
It sounds as if you were all well prepared in the sense that you knew what was expected of you. Jennifer, you said that you are still waiting for a result. You are now in sixth year. Are you studying the courses that would have been your first choice if you had got that result, or are you studying different courses?
I am waiting for my modern studies appeal to come through, but I am studying my first choice subjects—the ones that I wanted to study.
If it comes through and you have failed, will that affect your choices?
Possibly, yes. I would probably resit it.
Christina, I think you said that your school had advised you that you would probably be all right and that you should proceed on the assumption that you had got the result that you were waiting for.
That is right—I was allowed to continue. My estimate from my prelim was a 1, so they estimated that I would get the same for my standard grade final exam.
Is anybody studying a course this year that they would not have chosen if the exam results had been different? No.
Am I right to say that Alan would not have been at school at all if he had got his results through in time? I think you suggested that you would have had the option of going to university.
It was a possibility. I would probably have gone back to do sixth year anyway, but not receiving the award for physics meant that I had no choice. Had I got the result, I might have decided that, yes, I wanted to go straight to university. As it was, I had to wait until the result was confirmed, and by that time it was too late.
I would like to ask about the advice that was given once the results had come out. It is clear that there were flaws in the helpline system, in that it was no help whatsoever as far as I can see. What would have been your reaction if you had been told that the results would be delayed? Would that have caused less anxiety than what really happened, with a number of people getting flawed results and a helpline being set up? The SQA could have decided not to put the results out on a particular day; it could have delayed them.
I think that it maybe should have done that, to ensure that all the certificates were right. It could have delayed them for a week or so, just to make sure.
All of you have told us how you received incomplete certificates, or certificates that claimed that you had not sat an exam at all. Are any of your friends who appear to have got a complete set of results in time anxious about whether their results are accurate?
When people heard on television that the SQA would be checking exam results, they were uncertain about their results. They would say, "Okay, I've got an A or a B, but did I really deserve that? Did I really fail that exam? Has there been a glitch?" I know about a few people who were in that situation.
Have those people been reassured now that the SQA has said that it has been through that checking procedure?
Eventually. The SQA said that if changes were made to results, only upgrades would be possible—marks would not be taken away from candidates.
I will quickly follow up a comment made by Alan Burns.
In science-based subjects, such as chemistry and physics, candidates study learning outcome 3, which is an experiment that must be conducted and written up. Candidates must pass learning outcome 3 as part of their internal assessment. To begin with, I do not think that teachers were quite sure how to go about that part of the course: how to set it up, how much help to give pupils and whether they should let pupils get on with the work themselves. As that part of the course had to be passed, teachers gave us quite a lot of help but, as I said before, they were not sure how to approach it.
Did anyone else have concerns about teething problems with courses?
The layout of the course means that a lot more pressure is put on pupils' time throughout the year. I certainly felt that. If I was ill and off school for a day, I felt that I would miss a lot of work and that it could be difficult to catch up. The pace was rapid throughout the year and pupils had to put in a lot of effort to keep up. It was probably inadvisable to miss days, whether through excursions or illness.
It has been suggested in some of the evidence that has been submitted to us that as the exam timetable was more compressed this year, some people had to sit a couple of big exams on the same day. Did that happen to anyone here? Could the timetable have been laid out differently to benefit candidates?
That did not happen to me, but a lot of my exams were closer together than I would have liked. I had chemistry and geography exams within a day of each other, which did not give me much time to prepare for the first exam and then relax for a bit, as I had to go straight into the next exam.
I have a few questions about exam preparation.
I do not think that the papers for our prelims were issued by the SQA. I think we got our prelim papers from an independent body, although they were supposed to be based on what the higher still exam would be like.
Exactly. So you sat a prelim that was meant to be like the exam. Were the higher still exams like the prelims? Were the prelims a good preparation for sitting higher still exams?
I will step in at this point.
I think that there was a specific problem with maths this year.
We sit them in February or March.
I have a problem trying to cast my mind back to my school days.
They were a long time ago.
Exactly. You say that there were no past papers, but were you issued with model exam papers in February and March to take home and look at?
From what I can remember, we had papers for some subjects, but they were previous higher papers or old prelim papers, not model prelim papers.
For many subjects, we were using previous papers, because the school was not sure what type of questions to expect.
Were you warned that the exam might not be the same as the prelim?
Yes, because we were not sure what the exam would be like. However, we were told that the questions might be along similar lines.
We will have a couple more questions and then try to wind up the session. I can guarantee that, as soon as I say that, members will suddenly have more questions. Please bear with us for a few more minutes.
If you were not going to get an award for an exam, would it have helped if the report form had said "no award"? Did the form just have an empty space for a "no award"?
Yes.
And there was nothing at all about your modern studies grade on the form.
That is right, although the unit studies part of the form said that I had passed all the units.
So in future would it be more helpful if the form did not make you think that a subject had been missed out altogether?
Yes.
Did the core skills aspect make much sense to people? Did they expect it?
It was not totally unexpected. At the beginning of fifth year, we received a slip containing our current core skills. However, although we knew they existed, we did not really know what they were for. They just appeared on the certificate.
So what you want is a certificate that tells you what highers you have passed.
Yes.
Although it is important to have information about individual units, students probably do not need to know that on the day they receive their results; they want to know what exams they have passed. The other information can come later and will, perhaps, be easier to understand.
That is right.
I want to ask about your experience of the courses and of the exam pressure arising from the national assessment bank tests and unit tests. When higher still was first established, it was probably thought that the tests would be sat throughout the term. Did any of you experience any slippage?
The new higher still geography course, for example, has many assessments—I think about 13 units—but you need to pass only roughly half of them. A few people in my class failed some tests and needed to catch up. That meant that, near the end of term, they had to sit the unit tests at lunch time. They might have originally sat the test in October, but they did not have the resit until April.
So the structure meant that it was possible that pupils might have half a dozen resits in a short period of time.
Yes.
I have a final factual question. Did any of you do higher still English?
No.
You will understand Ian Jenkins's deep questioning on this issue if I explain that he used to be a teacher.
Presumably you will all have to take examinations in future, as you are locked into the system. What comes to your minds when you think about that after your experience this year?
Never again.
Oh no, not this again. Will there be more problems?
I do not feel that confident in the system.
It is a question of trust. Can we trust what we are going to be given when we sit exams?
All of the above.
Namita, I am not sure what you plan to do. You had hoped to go to Glasgow University. What are your plans now?
My UCAS form has to be sent in because I am hoping to do dentistry at Glasgow University. I have spoken to the department a lot as I am interested in the course. I was told that C passes were discounted. That means that only my three A passes and one B pass will be taken into account. The university wants me to get a B in a chemistry certificate of sixth-year studies this year. That will make sixth year another hard year.
But you will have a bash and see if you can pick it up.
Yes, because I really want to do dentistry. I will have to work hard at it.
I thank each of our witnesses for giving up their time and coming in this morning. It has been useful to hear what they have experienced. That will add a lot to our inquiry. We hope that, by the end of the inquiry, we will be quite clear about what went wrong and that we will be able to set about ensuring that it does not happen again—either to our witnesses or to members of their families who will be moving on to sit the national exams. I hope that we will speak again at lunch time.
As the convener said, I am chair of the school board. We had discussions in the board prior to the examinations, because we heard that there were problems with their administration. I have heard anecdotal evidence, as everybody else has, from other parents, but this weekend I have been approached by several parents who wanted me to put forward their cases. Two of those cases really must be heard.
Would you like to tell us about the two parents who spoke to you?
The first was a father who phoned me the other night to say that he and his family have been going through a living nightmare since the day that the results came in. He is concerned that his daughter's confidence has been eroded. He thinks that her confidence has gradually ebbed away with the pressure of work during the year and he also feels that she has been let down and that that feeling will never go away.
Thank you. You raised a number of points that I am sure we will come back to during questions.
I welcome the opportunity to do so. My wife is sitting behind me and what I have to say concerns our son, Stephen. My evidence is therefore quite personal, but I should also like to give some views from my privileged position as a member and vice-chair of the school board of Strathaven Academy.
I thank both witnesses. I was interested to learn that you are chair and vice-chair of your school boards. What official representations were made to the SQA by your school boards, your head teachers or any other organisation of which you are aware? You have both talked about the run-up period and Mr Anderson mentioned his involvement as a parent. Was there any correspondence with HM Inspectors of Schools or with the SQA, perhaps via the rector?
We wrote to express our concerns to the chief executive of the Scottish School Board Association.
When did you do that?
That was before the summer recess—I do not recall the date.
I have an extract from minutes showing that it was reported to us in April that the SQA arrangements were very time consuming, and that the amount of assessment in the new system was engendering anxiety among pupils. The organisation of the school was coping; we congratulated our SQA co-ordinator and our office staff because they coped with a tremendous amount of extra work.
Did either school board address the role of Her Majesty's inspectors of schools in the run-up period?
No.
No.
I would like to put the same question to both witnesses. Janette Moore said that two of the problems are that pupils sit exams just for the purpose of passing tests and that there is a lack of trust in the system. I do not know whether you have children who have still to come through the system, but given your experience this year—as members of school boards and as parents—how do you feel when you think about the next few years? When I asked the young people that question, I received very short replies. Will you expand on any fears or hopes you might have for the future?
Two of my sons have been through the system and my youngest son did his standard grades this year. His approach to his higher year is totally different from the approaches of the other two. For them, it was just a normal part of life—there was no problem. They trusted the system and looked forward to gaining the proper qualifications to go to university. However, the son who sat standard grades this year sees no value in the two identical certificates that he has received for his standard grades. One of his subjects is still missing from both of them, although he has been assured that he passed it. The problems have devalued what he did. He should be looking forward to a successful year—he is doing five highers—but he is questioning everything about his course.
I question the value—not only as a parent, but as a person in industry—of teaching pupils just to pass exams. In other words, I question the value of saying to pupils that all that is important is that they pass this or that assessment and that once they have passed it, they must pass the next. They are being taught narrowly, just to get them through assessments towards a final examination. That was what went on this year.
Given those profound concerns—which we have heard from a range of people from inside and outwith education—what do you hope for from the committee and its inquiry, both in relation to the SQA, which is the immediate issue, and in terms of the issues that you have raised?
The committee must find out what went wrong—nothing can be fixed until that is understood. I hope that the inquiry can pull out all the information because a get-well plan cannot be put in place until all aspects are understood.
We need to find out what went wrong with the administration of the exams—that needs to be corrected. Also, the exams and the internal assessments must be re-examined—I have heard time and again about levels of stress. Parents seem to be hearing about it—I suppose that that is because they experience the problems in their home. Perhaps the pupils are not aware that they are going through something that they have not been through before and that is not normal.
I also have kids who have been through the system. I remember the stress that accompanied waiting for the envelope, but I do not remember the stressed fifth and sixth-year pupils who have been described today. Witnesses are telling us that the kids are stressed throughout the year and I think that that is news to the committee. What can be done to change that?
Something has to be done about the internal assessments. There is a big build-up of pressure because pupils feel that they are being examined on several occasions throughout the year. As Ken Anderson said, some pupils are doing five highers and are assessed many times for each higher. For a pupil who takes his or her work seriously, that creates an awful lot of anxiety.
Higher still was meant to get away from the stress that results when everything depends on one exam.
Yes, but whenever somebody sits a test or an exam—whatever the school has chosen to call it—they experience anxiety.
I asked the SQA helpline whether, given the fact that the assessments are set by schools, it was possible that pass levels were different in different schools. I also asked whether the degree of difficulty of assessments differed from school to school. In other words, is there a level playing field, and is that checked by the SQA, as the accreditation authority? After a lot of going round in circles and talking to different people—this was during one phone call, but the person to whom I spoke first could not answer without asking colleagues—it was admitted that it was possible for a school in Strathaven and a school in Hamilton to have different pass marks for the same subject. I found that appalling. So, that went into the melting pot along with the assessments.
I am interested to know how we can move on. How can we give kids confidence in next year's exams? We have heard stories of kids who are sitting standard grades—looking forward to their highers—who are not confident that their papers will even come back.
That is the major problem. Time is running out for those pupils. Schools have recently been told that the exams have been brought forward by a week, which has caused panic among some pupils. First there were all the problems getting last year's results right, with the lack of confidence that that caused; now pupils are being told that they have one week less to prepare for the next lot of exams. That was not a good step if we are trying to make things right.
A good start would be to say, "These are the things that were wrong." That would be refreshing. I do not mean pointing the finger, but telling people exactly what went wrong. We need someone to say, "These are the positive steps that we are taking to start to repair the damage." The damage will not be repaired overnight, nor will confidence be restored overnight. We must get away from the blame culture and people scoring points. That is not what parents want to hear. They want to hear what constructive steps are being put in place to fix the situation. They have not yet been told that. This inquiry is a good step forward, but people need to be told quickly what will be put in place. You must start to make some inroads into repairing the damage. If the issue is fudged—if there is a whitewash—you will not instil the confidence that must be put in place.
I am conscious of the time, but I think that Brian Monteith had a question.
Given that you perceive there to be an unlevel playing field between schools for marking and internal assessments, and given that the assessments are not so much assessments as hurdles—tests for entry to the final part of the course, the exam—would you be more satisfied if the internal assessments were no longer a hurdle, but a means of advising the pupil and teachers how well a pupil is doing in a course? Would you be more satisfied if the assessments took place but were merely for information, rather than a door into the exam?
That would be a very good step, but the idea that the unit assessments could act on their own for some pupils would then be lost. I am not sure how that could be overcome. It would, however, be a good idea. A test should provide an indication of a pupil's level and of the amount of work they need to do. It should not, in itself, pose a problem to the individual pupil.
I would give anything to get away from the concept of just getting pupils over the hurdle towards one of assessing pupils' positions, what they have retained and what they need to do to improve. That would be applauded. There is perhaps a halfway house for fixing the problem.
As parents, how did you react to what the certificates were like? You may want to consider the question from another angle—Ken Anderson may wish to do so from an industry point of view, for example.
That would be a very good idea. Several parents have told me that they found the new certificate for standard grade and highers far too complex and that all they and employers want to know is which subjects have been passed and at what grade. Pupils would be pleased to get a run-down of how they have done in each part, but I do not think that that need concern everyone else, or that it is necessary to include that information in the final certificate.
I believe in the KIS—keep it simple—system. The exam certificate did not look simple: it was confusing and it was not helpful to parents, to pupils or to industry. We can understand, from looking at the certificate, why the SQA got into such a mess. It bit off far more than it could chew. It was, I think, a fatal error to put all of a student's history—probably leaving out only their inside leg measurement—on the certificate.
You will be relieved to know that you have now come to the end of your ordeal. We are very grateful to Janette Moore and Ken Anderson for giving their time and answering our questions this morning. As I said to the young people earlier, it is important to hear from such people as you, who were so closely affected, about the exact situation. The suggestions that you have made will also be taken on board.
Meeting adjourned.
On resuming—
Good afternoon, as it now is, to all of you. I thank you, as teachers, for joining us. I know that you have sat through our previous two sessions, so you will have heard all the information—not for the first time, I am sure.
I am the head teacher of Strathaven Academy in South Lanarkshire.
I am assistant head teacher at Earnock High in Hamilton.
I teach English and am the assistant head teacher at Uddingston Grammar School.
I am principal teacher of maths at Holy Cross High School in Hamilton.
I am assistant head teacher at Hamilton Grammar School.
Thank you.
I am a co-ordinator.
So am I.
So am I.
That is fine. Some of the questions will be directed towards you.
Yes, I marked after the Easter diet.
I was also a marker.
It is helpful for us to know that when we are asking questions.
When the parents gave evidence, they spoke about the weight of internal assessment. Will you give us your views on the place that that took in the first year of higher still? Will you talk about the relationship of intermediate 2 to the modules, in respect of the unit assessments as they were introduced into higher still, and how that affected the courses? We will go on to the higher still development unit later.
All secondary school pupils in Scotland are used to assessments. In the course of study in their higher year, all pupils are used to end-of-topic check tests and assessment. They are for formative, as opposed to summative, judgment. That was the difference this year. Although the end-of-topic check tests were pursued, the unit tests were formal. That placed an additional burden on our candidates.
We knew well in advance what the internal demands of the new qualifications were going to be. The new qualifications are a good thing; if they can gain a currency in the market place, the unit tests will be of value. Attainment at unit level is of value.
I agree with Mark Sherry. That academic rigour may put the pressure back on students. They recognised early on that the tests were official and that they were being monitored much more closely than had been the case under the formative assessments that we undertook previously. The number of tests to which they were being subjected became quite difficult to manage. Because of the number of students involved, it became impossible to manage the timetabling and to ensure that students who failed tests were not resitting three internal unit tests on one day or in one week.
Perhaps a bit of undue pressure was put on pupils. The unit tests in maths are set at a minimum level of competency. We found that the majority of our pupils passed first time and that a pass in the unit test did not tell teachers or students much. Perhaps pressure was put on them to pass unit tests that did not really prepare them for the final exam. I do not know how useful those tests were for teachers, parents and pupils.
We discovered a problem in that some pupils, particularly those who may have been just inside the category of those sitting intermediate 2 or higher, struggled with some of the unit tests, and that tended to build up. Those pupils might have been under the most severe pressure to pass the unit tests as they had a huge number of reassessments to undertake at the same time. That became quite discouraging for them.
You asked us about the currency of intermediate 2 in relation to the Scottish Vocational Education Council. My students who are sitting intermediate 2 see it as valid currency. They value the concept of intermediate 2 and its structure, through which their progress is indicated to them. They recognise that passing intermediate 2 gives them an access route to further qualifications that is much more valid than the previous system of modules.
Could you address the conflict of cultures between SCOTVEC and the Scottish Examination Board, which some people have talked about? As a teacher of English, you might be aware of those arguments.
The SCOTVEC culture was that internal assessments were verified at the end of the year. Teachers made up their assessments and had their courses validated. The assessment element was internal and, as one of the parents suggested, could vary from school to school.
Could you say something about the relationship that existed between the higher still development unit and the teachers? We have heard unions talking about a dismissive attitude to teachers' worries, worries that seem to a degree to have been borne out. Could you talk about the higher still development unit and HMI in the consultation and development processes, and how teachers felt about that?
There were times when we were made to feel that we were whining and being obstructive for the sake of it, whereas we were being constructively critical when we felt that there were areas to be addressed. There was the implication that we were trying to slow down the process. The process was not the problem; the timing was. Perhaps some of our observations were misheard, if not ignored.
I agree with Mr Browning. When we went to meetings held by the development unit with members of HMI or the SQA, there was an assumption that what would happen would happen. We were not being listened to. A number of concerns were raised on many occasions. Many people felt that they were being dismissed out of hand. That led to a lack of confidence in the system. There was no listening ear.
So although there were small changes to the length of a unit or the timing, the people you met did not get engaged in an argument about the bigger issues?
I mentioned English previously. There was little listening at first. If the teaching unions had not made a strong case, we might have had everybody on an inappropriate higher still English course. Even now there is lots of tinkering to be done with that course before it will be fully satisfactory.
Is it your view that it would have been better to introduce intermediate 2 and build up?
Yes. We have done that in our school. We have introduced intermediate 2 this year, because we have that option. Our students at intermediate 2 are going through that course and higher will be introduced when it is ready.
But you are not doing higher English yet?
We are not; we are still doing the revised higher.
On the introduction of intermediate 2, schools liaised with the SQA on how we would plan the introduction of courses at different levels. There was a lot of support from the SQA on that. This session, all schools went with a majority of highers and the intermediate 2s that they felt they were able to do, and they have a plan over the next two or three years gradually to introduce intermediate 2s.
Every school in South Lanarkshire agreed a phased implementation plan, which was helpful. Many discussions took place.
We have real live markers in front of us. I have never had that before, so forgive me, but I have some questions that I would like to ask. First, if you have marked in previous years, what are the differences between this year and previous years?
I have marked for 20-something years.
Wonderful. You will have lots of answers then.
The biggest difference was the initial contact. Normally, I am contacted in January. I get the opportunity to decide which papers I will mark, so I can plan my year and move forward accordingly. This year, I received a letter in March inviting me to start marking during the Easter holiday. Given that I had had no previous contact with the SQA, had I planned a holiday I would have been unable to do the marking. I was able to do that marking, but I was unable to do the summer diet marking because I had not arranged for time off. I would have had to attend a markers' meeting. I had not made that arrangement and I had filled my diary for June, so there was no possibility of taking on that marking. Despite that, I got phone calls from the SQA three and four days after the examinations took place, asking me to consider marking at intermediate 2—a course I had never taught. It was clear then that there was pressure because of the number of markers. I declined the invitation.
Were you a marker as well, Mr Goring?
Yes. I have been a standard grade geography marker for some 15 years. With the exception of the late invitation to mark, there was no difference at all this year, I am afraid. Some teachers at my school were in the position of being asked to mark their own pupils' papers, which obviously raised a few eyebrows. In addition, some people were invited to mark but not to attend markers' meetings. However, that was not my experience. The marking process this year was very much the same as usual, except that we had a reduced period of time in which to do the marking, which created a fair amount of pressure.
I sought some anecdotal advice from my colleagues about the marking situation, and heard a couple of what I would describe as horror stories. There was a marker who, on the last day of her diet of marking, was phoned up and asked if she would accept extra papers. She was due to fly off on holiday, so she declined the invitation and went away, only to find a bag of papers waiting for her on her return. That is just one example.
We have heard reports of unsolicited scripts arriving when people were off on holiday.
I can answer only for English, but prior to a markers' meeting, we would be sent a set of specimen scripts. We would already have received candidates' actual papers. We would be expected to study the specimen scripts and have a look at the standard of responses in the various scripts selected from one or two schools. On the day of the meeting, there would be a long discussion. Members of the markers committee would already have decided on the marking grades that they would like; the discussion would take place to standardise the quality of the marking.
Do you think that it is important for anyone who is involved in marking to attend a markers' meeting?
They must attend a meeting.
We have heard that there were occasions when people did not attend meetings.
I would be very distressed about that. For one paper—the internal English folio—one is invited to mark on the basis of sample scripts and a written report, once one has several years' experience. However, that is the case only for those markers with knowledge and experience of marking. For anyone who is in their first three years as a marker, it is a condition of being a marker that they must attend the meeting.
We are hearing that what happened this year is putting teachers off marking papers in future, and that there is an impending crisis for next year because the new diet is under way and new markers need to be in place. What messages can be given to markers and teachers about the importance of their continuing to be involved? Will there be a crisis in recruiting markers this time round?
It is important that people mark. If they do not, the system cannot work. It is probably one of the committee's tasks to find a way of ensuring that marking can take place under reasonable conditions, rather than expecting people to mark hundreds of scripts in a condensed period of time. We must ensure that markers are remunerated appropriately for that work and that there are enough quality people out there to do it.
I echo what many of my colleagues have said. The majority of teachers mark because the exams have to be marked, and because it is good professional development. When one has done marking, it informs one's teaching in future years. Many teachers mark for that reason, rather than simply for financial gain.
We all agree that forward planning was a major issue. I have a number of colleagues who are markers. I know of two cases of colleagues being contacted the day before the markers' meeting to ask whether they would mark. They are extremely experienced teachers, and I would have no qualms about their involvement, but the timing was an issue. We had moved on to the new timetable—like a number of schools in Scotland, we change timetable in the middle of June—and they felt that their pupils would suffer if they offered support. That issue should be considered in future. With advance notice, people might take on marking.
I want to ask about your experience of data handling and data processing in the run-up to the diet. The questions are of particular relevance to the co-ordinators, but other witnesses may have views. We have heard a great deal of evidence about the difficulties that individual schools have had. The submission from South Lanarkshire Council indicates that some schools were submitting information six or seven times—on one occasion it was 15 times. What was your experience? Did it ring sufficient alarm bells to make you do something about it?
I am an SQA co-ordinator. This was the first year in which problems had been anticipated. Students are supposed to be entered for a course by 31 October. That deadline had to be extended into November. Schools send data to the Strathclyde educational establishment management information system (SEEMIS), which collects data from South Lanarkshire and other authorities and sends it on to the SQA. On 8 March, SEEMIS wrote:
Were you asked to do that directly by the SQA?
Yes. In April, we were asked to recheck all the data that had been submitted. That included all the initial entries and all the unit results that had been put into the system by that date.
Can you tell us about that? It is important.
On 20 June, the situation became a little bit more concerning. Our initial concerns were that the data for the national qualifications would cause a problem. We wanted to ensure that the situation would not arise in our school—as it did, unfortunately, for some of the students who were here this morning—where students passed an external exam, but their record showed that they had failed internal assessments.
I would like to talk about the issues that arose in Strathaven Academy throughout the summer. At about the time that the SQA requested that a senior member of staff be available throughout the summer, we started to receive calls. From 29 June, the SQA made 60 checks with Strathaven Academy. Sometimes, a request would be for confirmation of one internal assessment grade for one candidate; sometimes we would be asked to check internal data for several candidates. In all, data for 500 individual presentations were requested and returned.
Given those difficulties, which I am sure were mirrored elsewhere, were you surprised by what happened on 9 August and 10 August? Did you fear that this might be going terribly wrong?
Eventually, we were requested to have someone available during the summer. Before the end of the summer, we decided to send a paper copy of all our internal results to the SQA's Glasgow and Edinburgh offices, to try to ensure that, if the SQA had to confirm units, it would have those copies even if there were problems with electronic copies.
You said that you did not receive your statement of results, and that you had no notification that you would not receive that statement. That is confirmed by a letter from David Miller, the chairman of the SQA, which we have received in evidence. He wrote to a fellow board member on 11 August and said:
Yes. I feel that the SQA should have announced that those results would not be available in school. Many children, having opened their certificates and queried them, thought that the school would provide a solution. Children such as Jennifer Irvine, whom you spoke to this morning, came into school, and we could not even confirm for her what her results were. I found that difficult.
I wanted to ask you about Jennifer Irvine, because she referred to you by name in her evidence. She came to you and said that she had not got her higher modern studies and that there must have been a problem. When she said that, and lots of other people said that, did your mind go back to all the difficulties with data collection, and did you say to yourself—I will put this question to others too—that there must have been a major problem?
This is only my impression—I have no evidence to back it up—but I believe that Jennifer Irvine's difficulties were to do with marking and the quality assurance of marking. Jennifer was an estimated grade 3 in modern studies, yet she went to grade 8; she went from the high 60s or 70s to the low 40s. In modern studies, we had to appeal for 10 of our 26 candidates. We appealed for seven of them at stage 1, and the appeals were all granted. Unfortunately, Jennifer is in stage 2, which will not be complete until the end of October. Over the past three years in modern studies, we had no appeals in 1997, no appeals in 1998, and three appeals in 1999. This year we jumped to 10, seven of which have been granted at stage 1. It is only my opinion, but I feel that there were problems with the marking of that exam.
I want to ask Jim Browning a specific question. Alan Burns and Lewis MacKinnon from your school gave evidence. When you discovered that they had not got their results, did your mind go back to the data entry problems? Did you immediately make that connection?
They were able to tell me what the SQA had told them; I could not get through to the SQA on 10 August.
The SQA did not answer the phone?
I got through to the switchboard eventually and was told that somebody would contact me. I was contacted at twenty to five the following day. For a school the size of mine, that was nonsense. I managed to get through to someone myself the following morning to get some information.
My final question is for Richard Goring. We heard Christina Fotheringham's extraordinary story about a paper that no longer exists or is somewhere in the ether. As a teacher of long standing, how do you feel about a pupil who has gone through the course and sat the paper with every expectation of passing, and whose paper disappears? Presumably you regard that as unforgivable, but how could it happen?
There are only two ways that that could happen: either papers have been lost or a batch of pupils who were expected to get credit did not turn up for the exam, which was clearly not the case. Ten pupils are in this category—
Are the papers of those 10 pupils lost?
They are the only 10 pupils in that grouping who were estimated to get credit passes or good general passes. As a result, they sat the general and credit papers instead of the foundation and general papers that were sat by the rest of the group. However, those 10 candidates all received code 99 for their exam.
Please explain that to us.
It means that the student did not turn up for the exam.
But they did turn up for the exam.
Yes.
So we have an exam system in which pupils can sit an exam and the paper disappears. What does that tell us about the system and what can we do to prevent it happening again? Surely that devalues everything that you have been working for.
To my knowledge, it has never happened before. My reaction on 10 August was absolute disbelief; my tremendous faith in the Scottish examination system has been built up over many years. In July, we submitted 105 sheets of paper containing confirmed results; principal teachers had to come out to the school in the middle of their holidays and so on. All that extra work was done to ensure that the results would be issued on time. I was absolutely flabbergasted when I arrived at the school on 10 August to discover that 134 higher candidates—almost 50 per cent—did not get complete certificates.
I do not want to cut into your answering time, but I am aware that time is moving on. We have sessions this afternoon.
I will be brief. It has been said that the schools were not informed at any stage that the results would not be coming. Were you actively informed that they would be coming?
No. On the day that the exam results come out, I would usually go to the school, study the results, process them and have them ready for examination by colleagues. I would use that data to inform myself about the option process, as pupils who have failed or passed exams often need to change their options. Over the next three days, I would conduct 40 to 50 interviews with pupils about changing their options.
It is important that the results go to the school. If, at any stage, it had been suggested to you that the results might not arrive on that day, I assume that you would have explained why it was so important that they should.
I speak only for maths, so my situation may be a little different. However, when the results were finally published this year, I found that I would need to submit 31 appeals at higher level. To put that into context, we had 125 presentations at higher level last year, and I would normally submit six or seven appeals. Many pupils felt that they had done badly simply because the exam was not fair. We were reassured that that would be taken into account at markers' meetings. Two students who had been our top students since first year had scored an A in their prelims. One had gained 88 per cent in the prelim and the other had gained 84 per cent. They were within the top 12 of our year group of 125, yet they both ended up being awarded a C.
The problems that we have experienced with the certification of the new courses may have put more focus on the internal demands. As Mr Banks said earlier, children have always had regular end-of-unit tests. I know that those tests are more important now because they are certificated, but such tests were always in evidence. Perhaps the SQA needs to do more to ensure that the national assessment bank tests better measure a candidate's ability beyond bare competence. At the moment there is duplication because, in addition to those tests, we have to give an additional test that measures ability in a similar way to the external exam. It would help if those tests could be combined.
Was this year any different from an ordinary year, in terms of the number of youngsters who were saying, "I can't do this course because I can't cope with the stress"?
It was a normal year in our school in that regard.
One or two of our students asked to drop down a level, or to drop out of one of their groups of subjects because they felt under pressure to get all four. I would call that a realistic appraisal of the situation, rather than stress. However, I have two students who have already requested stress counselling for this year.
Mr Goring, you mentioned that some markers had to mark their own papers. What do you mean by "own papers"?
Papers that had been written by pupils that those markers had taught. That happened in two cases.
Is that unusual?
It is unheard of. It is a condition of marking that you declare that you will not mark papers from your own centre. You are not sent papers from that centre.
So you would not expect markers to receive papers from pupils that they had taught.
That is correct.
A number of factors have been suggested—singly or in combination—for the increase in the number of appeals: the quality of the marking and the processing of papers; the employment of markers; the fact that the nature of higher still might make it more likely that some pupils will fail; and the fact that some parents pushed for appeals, almost for the sake of it, in the hope of obtaining better grades. What are your observations on those, or other, factors?
I am not sure that the nature of the courses has given rise to more appeals. However, our school submitted double the number of appeals this year—we had 125 appeals at higher level, whereas the most in the past three years was 65. Only two of those appeals were as a result of parental requests. We wrote to all the parents to say that there had been problems with certification this year and that they could contact the school if they felt that their child had been disadvantaged. Both responses that we received concerned higher maths. The problems do not lie with the courses; it is the marking system that may have caused the problems.
We submitted 131 appeals at higher grade, which is more than twice the norm for us. In each case, there was a substantial discrepancy between the estimated grade and the results. That did not happen across the board; it happened in eight specific subjects.
In its written evidence, South Lanarkshire Council said that the moderation process was almost wholly discredited and that little moderation was carried out. What impact would that have on the examination process, if it were true?
Moderation is and always has been an important part of the quality assurance process in Scottish education. Staff recognise that and, although they always think that a request for moderation materials will cause additional work for them, they appreciate that it is part and parcel of the quality assurance process. This year, however, our moderation materials were simply returned to the school.
When was that?
It was in the third week in June when the sack of moderation materials arrived back in school.
I have been promised that the final two questions for this part of the meeting will be short. I will hold members to that.
What are the experiences of the SQA co-ordinators in the current academic year? It is very early, but I would like to know whether they have started the process of submitting pupils for registration. Are they confident that the software being used by the SQA is receiving the data that they are transmitting to it? Are they confident about the situation as it stands?
We cannot enter students for courses at this stage. There are two programmes in the school. As soon as we are told that we can use the SQA programme, we can move all the data across from our other computer programme. We should be able to do that before 31 October. As yet, we have been asked not to enter data in the SQA programme. We do not have a firm date for when we will be able to access the programme.
I endorse those points. The current situation is rather redolent of last year, when time scales were slipping. There is some anxiety in schools that, unless we get things registered as soon as possible, we will have difficulties.
Why did only about 20 per cent of English departments in the school system go for higher still this year? Does that suggest that the whole thing was started too early?
I cannot answer for those departments that went for higher still; I can answer only for those that did not go for it. We did not go for it because, initially, we would have had to carry out about 30 assessments. English cannot be assessed on a timed basis; it is not a subject for which there is an easily quantifiable learning outcome. We would have been assessing but doing no teaching.
That goes along with my experience. Does that suggest that a decision was taken a number of years ago to put the higher still programme into operation before those dealing with English—and therefore the largest number of pupils—were ready, so that the system looks a wee bit dodgy now?
I cannot help feeling that I am being drawn into giving a political answer. I would rather not go down such a speculative path.
I am quite happy for you to sidestep that one, Jim. I think that Ian Jenkins has already made up his mind on that.
Meeting adjourned.
Meeting resumed in private.
Meeting continued in public.
Before we invite the witnesses to join us at the table, we will deal with an item of unresolved business. Members will have received copies of the letter from the Minister for Children and Education on the disclosure of information. I hope that they have had a chance to read it.
I appreciate the work that you have done on the proposal and I thank you for consulting each member of the committee. I object to the proposal on two bases. First, a motion under section 23 of the Scotland Act 1998 has been lodged, seeking the disclosure of all information. Members will be aware that section 23 overcomes any ministerial or civil service codes, which are subordinate to that legislation.
Three points come to mind on reading Sam Galbraith's letter. The first is that whatever we do must be co-ordinated with what the Enterprise and Lifelong Learning Committee does. It is not clever for one committee to do one thing and another something slightly different.
There is a logical train of proposals in the letter. We must ask ourselves whether all the information will be available to us. It would appear that a list of items of information will be provided, covering the appropriate period. In response to the questions that we ask after we have seen the list, a memorandum, which will presumably be based on the minutes of the meetings at which advice was given, will provide the information that we need. That can then be verified by the conveners.
I am fairly new to the Education, Culture and Sport Committee and have the benefit of having worked on a number of inquiries before I became a full-time politician.
If I were a cynic, I might think that the committee's inquiry might not produce a result and that another way of getting a result would be to discredit the inquiry. I am disappointed that someone lodged that motion, although I do not know who did. The motion was lodged, not as a reaction to what this committee or the Enterprise and Lifelong Learning Committee was asking for, but because someone simply decided to do so. That step may have politicised the situation unhelpfully. The danger of that approach is that it gives out the unfortunate message, particularly to the young people from whom we heard evidence this morning, that some members of the committee want to collude with a cover-up.
I will be brief, as I do not wish to hold up proceedings.
In Sam Galbraith's letter, the paragraph at the bottom of the first page says that
Yes.
No.
There will be a division.
For
The result of the division is: For 7, Against 2, Abstentions 0. We have agreed to accept the report's recommendations.
Can you explain the relationship between the inspectorate and the higher still development unit and tell us how that relationship was carried down to the level of subject training days?
To answer that question, I introduce my colleague, Philip Banks, the chief inspector with particular responsibility for co-ordinating the inspectorate's work on higher still. He is responsible for channelling our advice to our policy colleagues in the department.
As Douglas Osler said, I line-manage the chief development officer. I should make it absolutely clear that the remit of the chief development officer is to manage the full force of field officers employed in the unit, as well as the very large army of development officers and curriculum writers. At busy times of the programme, the size of that army, believe it or not, can run into four figures.
I am not sure that too many people would be wildly excited by that description, but there you go. I am interested in the subject reference groups. When I attended in-service training in the English group, there was always an inspector as well as a member of the higher still development unit. We have heard evidence from the unions and from individual teachers that people felt that their arguments were treated dismissively at those meetings. Anxieties and worries were expressed consistently at those meetings. How were those concerns and the perceived lack of flexibility in the HSDU transmitted upwards?
That is relevant to the comments that HMI received from a range of people in schools, in the forum that you mentioned and in the various committees that managed the implementation of higher still. In all cases, we ensured that the information that was being received was fielded to the appropriate body. After all the occasions to which you refer, we listed the issues that were raised, took action to ensure that they were passed to the appropriate body and followed up any action that was taken thereafter.
I understand what you say about a lack of unanimity, but on occasions whole groups of professionals left the meetings feeling that their views had not been recognised.
On every occasion, the HSDU had evaluation forms completed. The vast majority of the responses that we received were exceedingly positive. We could talk about the same occasions on which English teachers went away feeling very discontented; however, many changes took place in English and that is testimony to the fact that issues were raised and addressed. Changes were made to assessment procedures and the implementation of higher still English was delayed further.
On reflection, would you accept that it might have been useful to have phased in higher still—as we did with intermediate 2—given the substantial objections that were made to some elements? At the meetings, people agreed with the rationale of higher still but had real difficulties about aspects of implementation and were worried about the integrity and validity of the testing. Do you think that, with hindsight, the introduction of this subject should have been approached differently?
Of this subject, or the whole of higher still?
I am sorry, I meant the whole higher still.
I do not want to join the cohort of retrospective prophets—we can all be wise with hindsight. I would not have given advice that things should be done differently. As I said, higher still arose from the work of the Howie committee. I was around at the time of those discussions and I know that the education minister was not too keen to set up the committee, because he believed that it would involve extensive change to the system. However, he was urged to do so following strong representations from the teaching profession, the Scottish Examination Board and other bodies. There has been far more consultation on aspects of higher still than there was on, for example, standard grades or five to 14. Everyone in the profession and associated bodies has had many opportunities to consult. In the course of higher still, there have been more than 240 different consultations.
I find that your account of the past does not equate with the one that we have been given by other people. I must say that I find your account to be astonishingly manipulative. All the evidence that we have received, from a wide variety of people in the teaching profession, tells us that there was substantial disquiet over a lengthy period and that people believe that the inspectors—for whom you are responsible—drove through the implementation process, listening only to voices that they wanted to hear and advising ministers in those terms. The submission from the Scottish Secondary Teachers Association, which my colleague Mr Jenkins quoted at our previous meeting, puts it quite nicely:
No. I readily accept the suggestion that many representations were made to us about various aspects of higher still. That is what I would expect, given the fact that higher still is a large, necessarily complex and ambitious programme.
Why is it necessarily complex?
It is a complex matter to provide important certification not only for all the upper secondary school pupils at the end of 13 or so years of education, but for a broad range of adults in further education contexts. The programme spans all the subjects from the former academic and vocational courses, at several levels, to ensure that all potential candidates have courses at the appropriate level to meet their needs. The programme is bound to be complex as a national strategy, although it does not appear complex to the individual within the system, who sees only the highers and intermediate courses that they need in order to progress.
But it became very complex for those within the system, particularly those who operated it, as month followed month. You have used the interesting phrase "necessarily complex". Do you accept that the system was unnecessarily complex by August this year?
By August, the problem was not complexity. At that stage, individual teachers and parents of pupils must have been able to see their own particular routes through the system, so they did not need to be aware of the complexity of the whole national provision.
You must have a huge vested interest.
This programme arose out of reports that were initiated by ministers, consulted on and then implemented.
Were you advising ministers?
I was one of the people who—
Right. Well, hang on a minute—
Excuse me, Mr Russell—
Let Mr Osler answer the question, Mike.
There is a closed circle here that we must get to.
Mike, I know that you have a lot of questions, but you must let Mr Osler answer.
I do not think that we will get any answers.
I am not going to stop you asking questions, Mike.
My role of giving advice to ministers is sufficiently complex to require me to answer your point. On these matters, I and my colleagues gather a range of comment from people across the system; indeed, it would be very odd not to receive such information, given our statutory responsibilities within the system. As policy advisers within the education department, we make policy advice available to our policy colleagues, who have also appeared before the committee. Those colleagues then incorporate that advice within the advice that they give to ministers. We are only one group of people—a very influential group, I hope—who make professional information available within the department, which can then be included within advice to ministers. In that respect, we operate as advisers within the department, which is quite different from our more independent role of inspecting and reporting on schools and other institutions.
Is not this a closed circle, Mr Osler? You have essentially said that you take in a range of comments, which you then sift. You take the sifted comments to the minister, who then asks you to do something that might include taking in a range of comments. That puts you back to where you started. In such a closed circle, it is possible that you are simply hearing what you want to hear—which is a phrase that several witnesses have used—and then passing that on to the minister, who then asks you to do what you want to do. At last week's committee meeting, John Kelly said that his union
I should point out that there are two separate issues to consider: first, the role of HMI and other bodies in higher still; and, secondly, what happened in the SQA during the summer. Those issues should not necessarily be linked.
This is my last question for the time being, because I know that my colleagues want to come in. Furthermore, we all want to move on to what happened with the SQA in August. There are links between that situation and the HMI, and Mr Banks is one of them.
You are still asserting that we generate policy. I disagree with that statement; all the facts show that we do not do so. It would be unthinkable for the inspectorate—which has statutory responsibilities for inspecting in pre-school, school, further education and teacher education and which most recently has been charged by the Scottish Parliament to examine the educational activities of local authorities—not to have amassed a body of evidence to which ministers could turn for advice. Furthermore, it would seem odd if the largest stock of information about the system, openly evaluated and reported on, were not taken seriously by ministers. That is not generating policy, but giving influential and important advice on policy.
Such advice should be taken seriously, but not within the current, fatally flawed structures.
I want to ask about higher still development. When higher still was implemented, was any thought given to the SQA's capacity to handle the new exam in terms of information technology and so on?
In partly answering that question, I should really clear up some of the recent misconceptions about HMI's role within the SQA. The inspectorate's remit does not run within the walls of the SQA at all. HMI does not sit on any SQA policy or operational body that has anything to do with examination arrangements or a particular diet. That was not the case with the two predecessor organisations, where we sat on the council, the board and the subject committees as observers. However, it was decided that HMI would not have such an involvement when the SQA was established.
I am talking about higher still development. If any agency was taking over the development of my project, I would want to ensure that it had the capacity to deliver it. Did anyone consider whether the SQA had the capacity to deliver what was expected of it?
Much of the evidence given by my colleague John Elvidge and the audit trail of the inspectorate's knowledge of what was happening in the SQA—which I am happy to give the committee—show that there was no suggestion that the SQA did not have the data processing capacity to cope with higher still. It might be that the data processing capacity was not properly used, but I do not think that anyone has ever suggested that the technology does not exist to cope with the information that comes from schools on our proposed examination system. Some organisations deal daily with technology that supports far more coming and going for far more clients. It was never suggested that the technology was not available; as other sources have told the committee, we were constantly assured that things were on course. I am happy to take you through the audit trail that shows when we were told about problems and how we dealt with them over the previous two years or so.
That does not answer my question—perhaps we can come back to it. We have taken evidence from many people who said that HMI must have known that there were problems. In schools, teachers were saying that their information was dismissed. Trade union representatives and others have also said that HMI must have been aware of the impending crisis in the examinations system. However, the impression given is that you did not take on board the fact that there was an impending crisis and that you did little about it.
I would like to take you through what we were told and what we did about it, as this is the first opportunity that I have had to do that. Like other bodies within the system, we were aware that there were difficulties that led back to the SQA. I am happy to let you know how we handled those difficulties, but I emphasise that on no occasion was there any intimation of cataclysm. There were certainly problems, but the experts were satisfied on every occasion on which they were consulted by or involved with the SQA. Although we raised a number of issues over two years or so, at no time were we aware that things would turn out as they did in August—I do not believe that anybody else was aware of what would happen.
I do not know who the experts are, but the people who have given evidence to us had concerns for more than a year and continued to make those concerns known in the spring and summer. However, you are saying that the SQA satisfied the experts. That is not an answer—
No.
You have a list of dates and so on in front of you—that will not give the answers that we need.
The list of dates shows that we were aware and we were acting—
It does not tell us anything; it tells us that we have been ignored.
I think that it tells you a bit more than not anything. It tells you the action that we took about particular advice given to us at particular times. I said that earlier this year the information that we were receiving became part of the advice that went to the education department, as John Elvidge described to you. We were part of the subsequent process of engagement between the education department and the SQA in the period before summer.
Would you agree that you have failed in your duty in reporting the information to the Executive? Inspectors must have been hearing what people in schools and others were saying, yet you seem to be dismissing that. Do you think that you should have taken that further?
I do not understand how you can say that I am dismissing it. I have explained in some detail the occasions on which we were aware of the information, where it was coming from and our timetable of communicating it to the SQA. I have explained that we involved the HSDU, that we made further information available to schools, that we made sure that our colleagues in the education department were aware of the information that we had and that we associated ourselves with them in the events that John Elvidge described to you. We were part of all that—we were very much aware of it and we were feeding information to the correct people.
With the same outcome, unfortunately.
Evidence from teacher union representatives is that there were fundamental problems. On 4 October, David Eaglesham of the Scottish Secondary Teachers Association said:
Are you talking about fundamental problems to do with the SQA's data management or to do with higher still? They are not the same thing.
The point was that there were fundamental problems in the schools.
To do with higher still?
The evidence from teacher union members was that fundamental problems were being experienced in schools and that those problems were being put to ministers and the wider public. Would you agree that there were fundamental problems?
I do not believe that there were fundamental problems in the implementation of higher still in its first year. There were fundamental problems in the management of data in the SQA.
In listening to your answer, I presume that you refute the evidence that we heard last Wednesday, during which a number of representatives of teachers unions spoke. In response to a question from Mr Stone, David Eaglesham stated:
Things were being achieved in schools—many higher still courses have been delivered successfully in schools. That is a fact. We have inspected about 55 schools, on which we reported in June. Those reports show a high level of learning and teaching in the classrooms where higher still was the new course. There is no doubt that teachers have delivered exceedingly well. There is also no doubt that, in the course of doing so, they have found a number of hiccups and problems with higher still. That is what the Scottish Executive's review relates to.
I understand the difference that you have emphasised between matters in schools and matters in the SQA. Some of the evidence that has been submitted to the committee has been discussed this morning and I wish to draw your attention to comments that were made by South Lanarkshire Council. A written submission from the council stated:
Those issues were raised, as I think I have mentioned. The great majority of the issues that were raised with us were rightly about the operation of courses in classrooms, the availability of resources, the adequacy of materials and learning and teaching issues—far more were raised about those than about the issues that Mr Monteith mentions. Issues that schools raised with us about difficulties that they were having with the SQA in handling material were very few and far between.
I want to clarify what the unions were saying to HMI. There might not be a direct link—or audit trail, as you put it—between this and the exam difficulties, but several unions made this point. The SSTA said that it
If that were just an implication, I would be inclined to deny it. I would like the people who implied that to give chapter and verse on the occasions on which anybody accused the SSTA of "over-reacting and misreporting". Of course I do not accept that, but I accept that such points were being raised. There is no doubt that the SSTA and other groups, in their regular attendance at meetings, raised issues about the late supply of materials and about the late issue of the national assessment bank from the SQA—they were right to do so. After all, we were co-ordinating the programme and we were the right people with whom to raise such matters.
I want to return to a matter—which Mike Russell raised earlier—about the complexity of higher still. You used the expression "necessarily complex", whereas I believe the National Association of Schoolmasters and Union of Women Teachers said that things were unnecessarily complex. Do you accept that the complexity of higher still was a factor in this year's difficulties?
If I said no, I would be going back over—or contradicting—what I said about not having a role within the walls of the SQA. I know that higher still was successful up to the point at which young people completed their examination scripts. After that, something went wrong. As I do not have a role in the SQA, it would not be right for me to speculate about the nature of the complexity—that is for other people to do. In previous years, the SQA issued results satisfactorily, but the difference this year must be that the system was changed. Whether the technological hurdle was impossible to clear—which I do not believe—is for others to judge. However, things were different this year and it would be pointless to deny that.
Did HMI take the decision on the exam timetable? Was it on your advice that the exam timetable was shortened?
We do not take decisions on such issues—they are matters for the SQA. The exam timetable was consulted on and, as I understand it, by and large those who responded to the consultation wanted the exam timetable to be left as it was. HMI was not involved. On the other hand, had nobody consulted on the exam system, we would now be asking why not. However, the exam timetable is not an issue for us. If I had taken decisions about higher still, it might look different in some respects. Decisions about higher still are taken by ministers as a result of consultation, not by my colleagues or me.
I would like to double-check that. Are you saying that the advice to shorten the exam timetable came from the SQA and not from HMI?
Indeed. Philip Banks will confirm that; he was closer to that part of the process.
We should recognise that the SQA had a difficult task on its hands in running the exam timetable, because of the increase in the number of exams and the continued existence of previous exams. It was running a lot of exams at the same time. The SQA handled the matter by putting proposals out for public consultation. The running of the exam diet is the SQA's property and responsibility. As Douglas Osler said, if the SQA had not consulted properly on that issue, HMI would have pointed that out.
One of the pieces of evidence that has been submitted to the committee refers to the unrealistic deadlines that were set outwith the SQA. Did you impose deadlines on the SQA without consulting it?
Members will expect me to repeat that HMI would not impose deadlines on the SQA, and that we did not. The deadlines to which the SQA had to work were the result of ministers' decisions about when courses would be required to be offered and which courses would be subject to phasing. The SQA also had to meet deadlines relating to the point at which schools return information. Those deadlines are entirely a matter for the SQA and schools. We would not want to be involved in setting them and it would not be appropriate for us to be involved. However, if schools told us that deadlines were unrealistic, I would want to pass that on to the SQA.
You say that HMI's remit does not extend to the SQA and that HMI was not represented officially on any body that dealt with the running of the SQA. When teachers and other people brought their worries about the implementation of higher still to you, did you report those worries directly to the SQA in any form?
We did that in a variety of ways. If the source of concerns was an education authority, we often advised the director of education—because of the influence that he could bring to bear—to raise the concerns directly with the SQA. It was important that the SQA heard about concerns from other people and not just from us. When concerns were raised by individual schools, we collated them. For example, one of my colleagues held a meeting of the national assessment steering group that resulted in an approach to the SQA. When the concerns that were expressed were of greater magnitude, we ensured that our colleagues in the education department knew about them. We made absolutely sure that concerns were brought to the attention of the right people and that they were followed up.
I was interested in your earlier comments about the successes of higher still. It struck me that that was similar to saying that, if we put to one side the issue of the iceberg, the Titanic provided a very nice travel experience. Is your position that higher still was doable, but the problem lay in the SQA?
I believe that higher still is entirely deliverable. I also believe that the evidence that the committee has received so far shows that problems in the management of the information that came to the SQA are the main reason that we are sitting here today talking about higher still. I believe that, without those problems, there would have been progress towards the Scottish Executive's review of the first year of higher still, that a number of necessary refinements and adjustments would have been made—which is necessary in any exam system—and that the profession would have been congratulated on having taken us this far. Nobody has ever suggested that the problem happened before pupils reached the examination halls. It happened after the scripts left the schools.
Some of our evidence suggests something different. I will return in a moment to the question—
What I have just stated is not simply my personal view. It is based on the evidence that comes from inspection, on the results of the consultation process and on the discussions that took place in the groups involving all the main stakeholders, all of whom subscribed to the principles of higher still and the current programme for higher still when the liaison group met in December 1998.
I am interested in exploring how our education system came to be in hock to an organisation that was unable to deliver. Was there a stage at which something different could have been done? We have been told in earlier evidence that there was going to be a completely new IT system and that it was not possible to know whether it would work. We have also been told that, once it was decided to set up the SQA, there could be no safety net, because there is no substitute SQA and the Scottish Executive education department could not intervene.
We must remember that the SQA was formed from two organisations, one of which had very long experience and the other of which had fairly long experience of delivering examination systems. In any country that one visits, one will find that responsibility for processing examination results and qualifications is vested in an examinations body of some kind. It is difficult to envisage a different way of bringing together expertise in that area. The expertise in assessment arrangements that exists in Scotland is vested in the SQA. I have made it clear that, by choice, HMI was not part of the policy or operational bodies of the SQA. If it were thought appropriate, HMI could have a role in evaluating what happens within the examinations body, provided that it was a meaningful role. I do not think that sitting as an observer or assessor is meaningful.
If HMI had expressed anxiety about handing over the huge responsibility for the integrity of the process to another body, and said that there was a high risk in doing that, it would not have made policy. It would certainly, however, have influenced decisions that were made. HMI could have said that this was a risk too far, that it needed to hold on to some of the responsibility and that there needed to be a safety net in case the whole system went pear-shaped.
Johann Lamont is taking me beyond my area of responsibility. It is not for HMI to advise ministers about the nature of an assessment body. However, I know, because I was in the department at the time, that two bodies—both of which had credibility in delivering qualifications—came together for a period of time to deliver jointly, as the SQA, the kind of qualifications that they had delivered separately. The SQA was then given the responsibility of preparing for higher still. You would have to ask the SQA how it went about doing that.
As the crisis began to unfurl and anxieties were expressed—
I am sorry—I am afraid that I missed the beginning of your question.
When you began to hear about problems that were emerging—when things began to collapse might be a better description—was there any stage at which HMI could have intervened? If HMI had picked up earlier the concerns about markers and things not being delivered on time and so on—the kind of thing to which you referred—would HMI have been able to intervene and say, "This has gone too far"?
Given what I knew—as I have described it to you—and what my colleagues in the department knew, which John Elvidge described to you, the answer would have to be no. We did all the right things based on the knowledge that we had at the time. We have no remit within the walls of the SQA, so it was not possible for us to go knocking on its door. If I know that something is going wrong in a school or a local authority, I have statutory powers to ask questions, but I do not have those powers over the SQA. I remind members of John Elvidge's phrase. He said that we were at "one minute to midnight". If SQA officials did not know, I do not believe that HMI could have known.
I have a question for Mr Banks, who is on the recipient list for a range of memos and letters about which we have heard in evidence. Mr Banks, do you have regular contact with the SQA and its officials?
Yes, I do.
In that case, you might have seen Mr Morton's report. In the couple of months that he has been at the SQA, he has found an organisation whose job had not been properly scoped, where planning and preparation were poor, where there was limited risk assessment, where there was no adequate contingency planning, where there was poor project management and where there was poor management information. The list goes on and on.
Yes. The situation was clear. The SQA was under the management of a chief executive, a board of management and a set of directors. I was a professional adviser to policy colleagues as part of the normal liaison meetings that we had with the SQA. At those meetings, the reports that we received—which we had to take at the value that was accorded to them by the status of the organisation—were entirely convincing. Our concerns, which led to the visit by Paul Gray to the SQA, resulted in a top-level discussion with SQA officials. We went through matters point by point and SQA officials delivered assurances on those points.
So as far as you are concerned, the letter of 17 April that gave the SQA a clean bill of health is significant. We have discussed that letter. I will use Johann Lamont's analogy; it seems that the SQA management was putting on an Oscar-winning performance of competence and ability to deliver while standing on the deck of the Titanic as it headed for the iceberg, and that HMI was quite convinced by it.
You have discussed the matter with Ron Tuck, the chief executive.
I ask whether you were quite convinced.
I am not responsible for the management of the SQA.
Were you quite convinced?
I was convinced by the views of experts in our service who examined the situation and reported to colleagues. I am not an IT expert, nor is it my responsibility to be one. The evidence that was submitted satisfied the experts—
Those are not IT issues.
Mike, do not interrupt the witnesses.
Mr Russell, if you expect the inspectorate to be able to obtain that kind of information, we have to be part of the board or have a role within the SQA. We did not have that.
You had daily dealings with the SQA, yet you suspected nothing. Indeed, on 17 April, HMI gave the SQA a clean bill of health. Either Mr Morton's paper is inaccurate or the organisation was in crisis and HMI gave it a clean bill of health.
I do not think that anybody from the inspectorate gave the SQA a clean bill of health on any occasion.
My question follows logically from Mike Russell's questions. Philip Banks and Douglas Osler have both said that they are not IT experts, nor would anyone necessarily expect them to perform that role. However, we have a situation in which two organisations came together, with their separate cultures. One might have anticipated that that would be a risk. There was a new exam system, in which people had varying degrees of confidence—some say that they had total confidence in the system; others say that they had some concerns—which might also be said to have been a risk. There was also the roll-out of a new software system, which has been described in evidence to the committee as one of the biggest roll-outs of a new software and IT programme that has been seen in the United Kingdom. Given that, is not it the case that HMI would have wanted some practical assurance that the system was capable and that all the elements would come together, notwithstanding the fact that the witnesses are not IT experts?
If I had had a role to play in the SQA, I am sure that the answer would be yes. It would be quite inappropriate for the inspectorate to dig around in the SQA's IT arrangements. That is a matter for the SQA.
Yes, but would not you be digging around in schools to ask them what their experience of the IT system was?
When schools raised problems with us, we acted on them.
Would not you have asked the schools? Would not that have been pro-active, given that there was a new IT system and a new exams system, with a huge amount of data? Would you have gone into the schools and asked, "Is everything working? Are there any problems? Are there any issues?" as part of your normal work?
Indeed—that is where the information that we have put before the committee today came from. It was in answers to questions that were put to schools that we got answers that led us to realise that a number of schools were having problems.
When did that emerge first? You said that it was at "one minute to midnight". Some of the information in the evidence suggests that it was a good 24 hours before that and that there was perhaps time to raise those issues.
Members would have to go back to the evidence that they took from John Elvidge. At that point, we were part of the process that he described. I cannot say what I might have done in an area in which it was not appropriate for me to do something. Any advice or comment that we had was fed into the SQA, which handled it in the way that has been described to the committee.
I am struggling with the fact that there were all those different strands running together, but that ultimately—in a system such as this, which must deliver for young people—no one was overseeing it. Who, in your understanding, was ultimately responsible for ensuring that all the strands were pulled together and tied in a knot that would not unravel?
I am not sure that I am the right person to whom to put that question.
I ask only for your opinion.
From where I sit, I would point to the senior management of the SQA and the board. I know also that the committee has had described to it the nature of the statutory relationship between the Scottish Executive and the board. Somewhere therein lie the answers. Perhaps there is a gap somewhere—I know that there has been discussion about that. Perhaps the gap could have been filled by involving people such as HMI, who are close to the system, in the management of the SQA. I do not know. However, the situation was not as Cathy Jamieson said it should be.
That leads me to my other question. With the benefit of hindsight—always an exact science—what would you recommend as the way forward to ensure that this does not happen again and that confidence in the system is restored?
Do you mean in the SQA?
I mean in general. I am considering the bigger picture—all the strands.
I am not sure that I am in a position to answer that question. I am not familiar with the possible governance arrangements. As I said, if it was considered appropriate for HMI to have a role of any kind, I would want that role to be meaningful. That is not impossible. I was never happy with the assessor/observer role, because it means that we are there, but that we are not influential. There has been talk of independent scrutiny and the like. That is a matter for discussion with other people in other places. I am not the expert on that.
Thank you.
I must say at the outset, Mr Osler, that I find your answers difficult. You say that HMI's role is limited. You have said repeatedly that it is up to the SQA and others to think about matters. However, you are Her Majesty's chief inspector of schools and you are responsible in the first instance—correct me if I am wrong—to the Scottish Executive, on whose behalf you advise and act. One of your roles was to ensure—or to oversee—and report back on the successful implementation of higher still. Has the failure of the SQA—no matter what we say, it is a shambles—impaired that implementation?
Indeed.
Has that failure cast a cloud over the work of HMI?
That is the question.
Do you agree?
I cannot accept that because, as I said, HMI has a responsibility to co-ordinate the implementation of higher still—successive ministers have asked us to do so. We have statutory responsibilities to inspect and report on the quality of learning and teaching in classrooms. At the end of the certification process, we have a role to play in ensuring that the SQA maintains standards from year to year. However, we do not have a role to play between the point at which examination scripts leave the schools and when the results end up on certificates. We are not involved in that.
So you are saying that HMI has no responsibility in this fiasco. You are, effectively, blameless.
No, I am not. I would not want those words to be put in my mouth.
That is a fair answer.
We could go round in circles on that, Mr Stone. I have told you that we picked up the signals that were coming from the chalkface. I have explained what we did with those signals. Unfortunately, those signals did not alert us—or anybody else, including the SQA's senior officials and board—to the fact that the event would be cataclysmic. We had concerns, which we relayed. None of us in the system can hold our hands up and say that we were the prophets who saw the way in which this would turn out.
When did you first warn ministers that a problem could be coming up the tracks? What did you say?
You are aware that I am not at liberty to comment on how advice was given to ministers. I can say that, on such matters, we feed our professional advice to our colleagues in the department's policy divisions. Advice on such issues would be given to ministers through that route. The inspectorate would take that route. John Elvidge has replied on behalf of the policy divisions as to the nature of that advice.
Are you saying that you never spoke directly to the minister about this?
About what?
You said that you fed your information to John Elvidge and that the department would advise the minister. Are you saying that you have never spoken to the minister about this?
No, I am not saying that.
I want to follow up Jamie Stone's point. You are saying that you were aware that there were difficulties, and that you passed that on. You do not strike me as someone who would want to go to that trouble, then to find that nothing had been done about it. What should be done to ensure that, if difficulties were to arise in future, action would be taken to ensure that we would not get into this situation again?
You would have to predicate all that on an assumption that the SQA's arrangements would be put right to the extent that the essentially irritating things that happened to schools—that is how schools described those things to us—such as being asked twice for results, would not happen again. If—and it is for others to say whether this is the case—what happened concerned management or information technology, experts in management or information technology must police the system, not HMI. If what happened was about learning and teaching issues related to examination arrangements, it is entirely reasonable to invite us to play a part. We are not experts on the management of data in an examination body. It is about getting the right people to examine those matters. As we are currently constituted, we would not be the right people.
No, but you have your role and the SQA has its role. How can those roles be joined together to ensure that when you pass on concerns, they are acted on?
I do not think that the concerns were not acted on. We asked for feedback every time we raised the issues that I have discussed with the committee this afternoon. We were always careful to do so, for the reasons that you have suggested, so that the issues would not disappear into a black hole. We received assurances via independent experts, at committees and from the chief executive, that various steps had been taken. There is also evidence that things were being done. Further advice was given to schools as a result of those issues being raised. The trouble is that they turned out to be the wrong issues; they were low-level issues in terms of what eventually went wrong.
Did part of the problem arise with the uniting of the two bodies? As I have said before, I worry about the culture of SCOTVEC; I think that it was short-sighted, a wee bit pedestrian and utilitarian in its tradition. That is perhaps cruel, but that is the drift.
It is certainly interesting speculation, but again you are asking me to comment on a body that I was not part of. I do not know what the culture was inside the walls of the SQA.
But you know the culture of its testing.
SCOTVEC worked for a great many people; the SEB clearly worked for a great many people. I knew the SEB much more intimately than SCOTVEC.
I am talking about the nature of the testing.
In all aspects, there was a point at which the programme ceased to be a development and had to be handed over to the statutory body—the SQA—that had responsibility for it. The point at which it was handed over was a matter of judgment; much of the discussion in the committee focused on that. I am sure that Philip Banks will want to add to this, but often the HSDU was still producing material after the point at which we might have expected it to have stopped because it was reacting to issues that had been brought to our attention. We were using the HSDU to help the SQA to keep up to date on some of its commitments.
The document to which Ian Jenkins referred was the result of feedback from senior management team seminars in autumn 1999, at which directors from the SQA made presentations. Senior management team members at all the national seminars agreed that they would welcome advice from the HSDU produced in consultation with the SQA. That document was the easiest way to make the advice available.
But do you acknowledge that January 2000 was a bit late to be telling people when to jump which hurdles in the exams—
The problem was that in autumn 1999, schools experienced difficulty in meeting the SQA deadline to finish unit 1. As a result, the SQA introduced more flexible arrangements. In light of that, issues arose to do with reassessment. The HSDU produced that genuine advice on the programme, because it was thought that that would be helpful at that point.
Some of those problems had been raised at in-service days a year and a half earlier. The HSDU stayed in touch with the inspectorate, which was advising schools directly. I am not saying that it should not have done so, just that it was not a case of a cut-off, then "Cheerio, it's the SQA now."
You could make too much of that. With any such programme, there is a ragged handover of responsibility. We wanted to ensure that somebody did what needed to be done, quickly and effectively. That was what we were aiming at, rather than constitutional niceties.
On the future of higher still, it has been mentioned in evidence to us that there is some confusion among parents and pupils, and possibly employers, about the certificate structure, which may have contributed to the data processing problems. Although the committee has heard about that before, I like to get the opinions of people who are giving evidence.
You seem to have concluded that it is confusing. When five to 14 was introduced, there were many debates about the fact that it seemed to be standing on its head in its relationship with standard grade. That issue would be well worth considering and trying to resolve in some way.
I thank Mr Osler and Mr Banks for answering our questions this afternoon.
Meeting adjourned.
On resuming—
Good afternoon, Mr Morton. I ask you to introduce your team.
Good afternoon. On my right is Dr Dennis Gunning, the SQA's director of development, and on my left is Jean Blair, a member of our staff who has helped me project manage my internal operational review.
Do you want to say a few words about the situation at the moment or would you prefer to take questions?
I am quite happy to take questions, convener.
Thank you. I believe that Michael Russell has a question on the missing data and software issues.
We have received evidence from Ron Tuck and others about the search for what they call the golden bullet. The head teacher of Strathaven talked about the 60 occasions on which they were asked for missing information. What is the situation today? Do you understand what went wrong with the computer system, the software and the data management? What actions are being taken to correct what went wrong? This is the difficult question: can you give us a categorical assurance that that is now behind the organisation and that it will not affect a future diet?
I made clear that the data management problem is essentially a behavioural thing. The SQA did not get that right. It is not the same as there being problems with the technology or the hardware.
Mr Tuck listed five possible causes for the problems with data: the failure of electronic transfer of data between centres and the SQA; the existence of duplicate entries creating a false impression of missing results; the submission of data out of sequence; paper reports being filed without being processed; and errors by data punch bureaux going undetected by the SQA's staff. He said that, although the possible causes had all been investigated, none of them provided the golden bullet.
There is evidence to suggest that each of those took place and contributed to the problem. From my review, it is clear that poor data management or information handling is at the crux of the problem. However, as you know from the evidence that I have submitted, there is a series of other contributory factors, such as inadequate scoping of the task that the SQA took on when it was commissioned, inadequate planning and preparation, and poor risk assessment. In some areas, such as examination paper preparation, assessment moderation and some of the work of our development division, contingency plans were triggered effectively. Essentially, however, I have discovered a number of issues that needed to be fixed, and behaviours that suggested that some more fundamental change was required.
Does the process include an examination of the qualifications and experience of those who head up information technology in the SQA? The committee has discussed the fact that IT did not seem to be given its proper place in the management structure and that there were people in charge of IT who had limited experience of writing programs. Will that be changed as part of the solution?
We will examine that, but I have not encountered any direct evidence that would suggest that the in-house capability was failing in the sense that you imply. Clearly there were instances of poor project management in the sense that user specifications for the software were being thought up at the same time as the software was being created. There is clear evidence that the software was not adequately tested. We need to think about the software as part of the planning for next year. If we believe that we need a capacity and capability that we do not have, we will procure it.
Since you were not there at the time, Mr Morton, perhaps Dr Gunning is better placed to answer my next question. I understand from previous evidence that when the final tape was run to print the certificates around 1 August, a check would automatically be run to identify missing data. However, it took three to six weeks for you to confirm the total amount of missing data. At that stage, the SQA was still saying that 1,500 candidates were affected, but the number turned out to be much larger. Why were the missing data not picked up at that time?
I am not trying to absolve myself of any responsibility, because the responsibility is mine from here on in, but, as you say, I was not a member of the SQA at that time, so I will pass this question to Dennis, who will be able to fill in some of the detail.
The program that you are referring to is the one that converts the information that we have in the system into the certificate that is issued. It takes the data that are in the system for individual candidates and transforms them into a certificate. It is not specifically looking for gaps; it is taking the marks and the unit results that exist and reporting on them. A separate process would be required to identify gaps.
I understood that that process checked for gaps.
It did. It was being run regularly until 1 August.
However, when it was running, the number of gaps that it identified was far lower than the number that actually existed. Why was that?
I do not know.
Mr Morton, in addition to issues of data handling and so on, your report contains a catalogue of damning findings about the organisation. I quoted some of them to Mr Banks and Mr Osler. I do not think that a reasonable observer would say that the problems you have listed could have happened overnight.
Any answer to that would be speculation on my part. I reiterate what has been said many times in this committee and in others: retrospective wisdom is a luxury not afforded to the people on the ground. I suggest that the accumulated effect of those factors became most apparent when the organisation, the structure, the processes and the behaviours within the SQA were under stress. That was the case with the greater volume of work that was associated with the diet in 2000. Having said that, I want to make it clear that, had the task been scoped as adequately as it should have been, that would not have been an issue.
Like other committee members, I probably have 100 questions to ask, but I will try to restrict the number.
Thank you.
I would like to pick up on Mike Russell's last point. You have talked about behaviours and organisational structures. Reading through some of the evidence that we have taken, I can agree with some of the things that you are saying. Ron Tuck indicated that the SQA nearly made it—it was only one wee thing that went wrong, and it was just one guy's fault. Are you saying that that is not the case?
You would have to ask Ron about his precise meaning. Serious issues have to be addressed. Things that are cracked and broken must be fixed or replaced but, in my review, I have not yet come across anything that was not capable of being remedied. I would also make the point that not everything is affected by the problems that were experienced this year. Many of the SQA's activities are still running normally, which is important for the customers and stakeholders involved.
You will be aware that we took evidence from young people this morning. They had some interesting points to make. One of the young women is still waiting for information because her papers got lost. Have you found the missing data? Do you know what went wrong? What is the situation for any young person who is still waiting for information about his or her exams?
When you consider this from the SQA's perspective, all the data that are required have been obtained and the cases have been resolved. However, I would not say that that was the end of the story. It is only when the schools and colleges—and especially the candidates themselves—feel that the results are robust and that they can be confident in them that we can consider the case resolved.
Evidence that we have heard has raised a catalogue of issues about markers. We have heard about the late appointment of markers, about people not attending marking meetings and about people receiving unsolicited papers. What can we do about this year? Preparations for the 2001 diet have already begun and markers will need to be in place, but many teachers are saying that they will never mark again, that the situation has been handled badly and that they do not want to be involved. How can you overcome that?
Many of the anecdotal concerns about markers have passed into the mists of urban myth. I will openly and candidly concede that the administration of the marking system this year was extremely poor. It was handicapped right from the outset when—and this was the SQA's problem and nothing to do with the information that was transferred from the schools and colleges—the SQA was unable to verify that the base data for registrations and entries were complete and accurate. Unfortunately, that had a knock-on effect on assessment moderation and on recruiting the right number of markers.
Have you put systems in place for finding markers for next year?
We are considering the planning of all aspects of certification in 2001. As you will appreciate, that is an urgent task. It has to be handled pragmatically and diligently. At the moment, we are trying to ensure that the registration and entry process for schools and colleges is simplified—and simplified in a way that gives the schools and colleges that originate the data the right to verify those data as complete and accurate, and the responsibility for doing so. The concern has been expressed that that has given rise to a small measure of slippage. That is true, but we will go out to the schools and colleges and carefully explain what the improvements are. People may say that time is being lost at the moment, but I would prefer to say that it is being invested to allow us to get things right. I hope that people will agree that the simplification involved in those improvements will allow us the time to catch up.
You have conducted an initial review of the marking procedure. Will you give the committee more detail of what the review revealed? What tasks did you undertake in the review?
We considered the administration and wondered what lessons could be learned. As I have already indicated, a significant volume of improvement and change is required and that has been built into the planning process. We considered whether the quality assurance checks that normally run with the marking process did in fact run. By and large, they did.
The young people whom we heard from this morning outlined the problems of data management. The continuing problem is that people are still unsure about the quality assurance of this year's exam results. We are looking for hard facts about the standard of this year's marking, to ensure that everybody's results are valid.
That is a question for my colleague, Dr Gunning, as I am not an educationist. No evidence has been brought to my attention of any inconsistency in the internal assessment of units.
Nor has that been brought to my attention. Part of the moderation process is to ensure that there are no inconsistencies. The combination of the moderation process and, for most centres, the use of nationally standardised tests—and now there is the national assessment bank—is designed to remove any inconsistency.
Perhaps you can clarify the moderation process, as that seems to be the key to this. Are you saying that some or all of the unit assessments were moderated?
Moderation is always conducted on a sample basis, but the units are moderated.
However, the marking of the final exams was not moderated.
The marking of the final exams is moderated in a completely different way. All the marking that is carried out by individual markers is quality assured by the examining team. All marking is quality assured by a member of that team and any differences in standards are ironed out during that process. It is a very careful process indeed.
In a standard letter that is dated July 2000, David Elliot says:
I am puzzled by that statement. That is an area for which David Elliot was responsible. We undertook less moderation this year than we wanted to, but I would need to know the subject to which the letter refers to comment further.
This morning, a head teacher told us that the sample that they had expected to send in was returned to the centre.
If it would help the committee, I would be happy to address individual cases and return with supplementary information and evidence as required.
We are discovering that there are a number of systems for maintaining standards and ensuring that this year's candidates are treated fairly, one of which is moderation. Are you saying that all the unit assessments were moderated?
A sample was moderated.
On a sample basis, every unit assessment was moderated. However, only some of the final exams were moderated. Is that correct? Could we discover that some schools were not moderated? I am not sure how that system of moderation works.
Let us start again. Individual unit assessments are moderated by moderators—that is part of the process of quality assuring internally assessed units. That moderation is conducted on a sample basis, whereby some centres and some units are sampled. Last year, the units that were completed by centres early in the session were unlikely to have been moderated because at that stage we did not have entry data to indicate who was being entered for which unit.
So how could you ensure that the same exam was not easier in some schools than in others?
The assessment material is nationally standardised: national assessment bank items are used, to which a marking scheme is attached, and a pass mark is agreed nationally for examinations. Therefore, as well as the moderation process, the design of the internal assessment allows standardisation.
I am not sure that that process is entirely clear, but let us discuss the exams themselves. I understand that you are able to examine all the results and check whether this year's results in a subject are comparable with those of previous years. Do you undertake that comparison?
Yes. In analysing the trend, one would have to take account of the fact that new subjects and exams were introduced this year. However, I understand that those statistics are produced.
One of the things that has worried many of us is the volume of appeals. Say that, instead of 10 per cent of the pupil population appealing, nearer 50 per cent are appealing—I am not sure of the actual figure. If the same proportion of appeals are successful, that does not reflect well on the exams, because what matters is the absolute number of scripts that were inaccurately marked, not the proportion. Half of 10,000 appeals would mean 5,000 appeals being granted, whereas half of 50,000 appeals would represent a significant number of badly marked exams. How can you assure us that the quality of marking was maintained throughout?
There is no standardised trend that can be used to identify a pattern of behaviour in appeals. The higher volume of appeals this year was expected and has nothing to do with the complexity of the higher still examinations. The data management problem that gave rise to inaccurate and incomplete results has caused concern about the credibility of many of the exams, and that concern has prompted the greater volume of appeals.
I agree that it is too early to judge, but I hope that the SQA will consider this issue in detail.
Yes, indeed.
I agree that one reason for the number of appeals might be the alarm that has been caused by a lack of faith in the SQA. However, if the marking has been consistent, the number of appeals should not be any greater—at least, not massively greater—than in any other year. If there were a huge increase in the number of appeals granted, that would perhaps tell us something about the lack of moderation of the marking.
That may be true if you ignore the fact that the course marking is a combination of internal and external examination or assessment. The appeals process deals with both aspects. If there were enough evidence, based on internal assessment, to grant an appeal, an appeal would be granted. If there were any doubt at the end of that stage, both internal and external assessments would be considered and a judgment would be made on the outcome of the appeal.
I gather that you might not be able to answer all these questions at this stage. The point was made that the final exam is much shorter for most subjects than it has been in the past, and that that might have created anomalies—there might have been greater statistical variation because students did not have enough time to demonstrate their abilities. You might not be able to say whether that factor had an effect this year, but I hope that we will be able to answer that in the fullness of time.
We will consider all the lessons that are learned and all the intelligence that is gathered. We will make that information public, as there is a wide constituency that can use it to make improvements in the next year.
You said that many stories that have circulated had the status of urban myth. If I give you a couple of examples, you might be able to tell me whether they are true. Were many inexperienced markers used this year?
Eight markers out of 7,000 were inexperienced. If my memory serves me correctly, I think that the vast majority of those inexperienced markers attained the classification A in the assessment of the quality of their marking: six were awarded an A, and the other two were given a B.
I do not have my notes of all the myths that have circulated, but perhaps my colleagues can help me.
I might be able to help on that point. Are those eight markers the probationer teachers to whom the minister referred in his statement on 6 September?
Yes, the inexperienced teachers to whom I referred were probationers.
In asking about inexperienced markers, Ken Macintosh may have had in mind not probationers but teachers or lecturers who might be qualified but did not have much experience of marking. In that sense, can you address the question whether the use of inexperienced markers was a myth?
We will be happy to publish information when all the reviews have been completed and we have all the facts. There were new subjects in the exam diet this year, so one would expect that new teachers, who had experience of those subjects, would be required. It should be borne in mind that marking is under the supervision of principal assessors, who are experienced teachers in their subjects.
Another story that I heard was that markers did not attend markers' meetings. Did that happen?
I do not think that there is evidence to suggest that that was a major concern. There is a misconception that markers' meetings took place over, say, a couple of hours. The standards and processes for marking were agreed at those meetings, and where markers were unable to attend them, by and large the principal assessors briefed markers and marker teams in parallel.
Is it compulsory for markers to attend markers' meetings? How many people did not attend them?
I could not answer off the cuff, as I do not yet have the analysis.
Are we talking about tens or hundreds?
I do not want to speculate until I know the outcome of the review.
Will that be clarified in your internal review?
It will not be clarified as such in great detail. However, we are conducting a review of marking in the appeals process and the Deloitte & Touche exercise is examining the matter in detail. That information will not only be made available to the committee but will be published at the end of this month.
From the evidence that we took from the teaching unions, it is clear that many people went out of their way and took on extra work to get marking done. However, a representative of one of the teaching unions said:
Not being an educationist, I am mildly surprised that there is a reliance on the voluntary contribution of the time and expertise of the teaching profession in the education system, which is a key aspect of Scottish community life. There is a perennial dilemma because, if the SQA were to pay the markers better, there would be a knock-on effect on entry charges. Whether that is desirable or practical would have to be the subject of further consultation. It is clear that we will have to produce a more attractive proposition to overcome some of the natural reservations that the profession has about participating in marking in future. We are considering that matter and will present proposals to address it constructively and quickly.
I wish to change tack slightly. Is it deliberate that, under "Structure", in your submission, you do not mention the board?
My submission mentions the board.
I am sorry. Where does it mention the board?
I talk about the board in relation to assurances that were given. On the last page of the submission, under "Behaviours", it says:
I stand corrected. On the issue of structure, have you given thought to the future direction and manner of conduct of the board? Some members have perceived a blurring and lack of clarity between the board's strategic role and its overseeing of the chief executive's reporting function. Equally, have you thought about the role of members of the board? Although members are on the board in their own right, some are also members of other bodies, such as the Scottish School Board Association. I realise that you may say that decisions on those issues lie with the Scottish Executive, but I should be interested in the advice that you would give on them to the Scottish Executive.
The decisions lie elsewhere, but I think that the board, as it is currently constituted, represents very well the various stakeholders in education. I have met the board on only two occasions. I am due to meet it in just under a week to present the findings of the operational review.
My second-last question is a quick one. Douglas Osler referred repeatedly to the SQA and what HMI might or might not have said to the SQA at various times, about how the SQA's problems could be dealt with. Do you think that the relationship between HMI and the SQA should be strengthened, changed or radically changed in some way? We have heard Mr Osler's side of the story. I know that you have only recently started the job, but your impressions of the other side of the story might be useful.
It is difficult to comment on that but, like any good chief executive, I will not be deterred.
I am glad to hear it.
The problems that I have ascertained to be besetting the SQA, and that might constrain its positive progress in future, are essentially internal organisational and management issues. They concern the structure, the processes and the behaviours of the SQA. If I remember correctly what Douglas Osler said, he focused on whether there was any contribution to be made in the realms of teaching and learning. HMI may well have a role to play in those areas.
HMI was responsible for overseeing and ensuring the successful implementation of higher still. Given those responsibilities, do you think that the communication channel with the SQA was as strong as it should have been?
I do not have the authoritative knowledge to comment on the past. The task of delivering higher still was clearly the SQA's responsibility, and the organisation should have scoped and prepared for it better. I agree that it was a feasible proposition and a commission that could and should have been delivered. The fact that it was not delivered to the standards that the SQA had achieved in the past is a matter of significant regret.
Your written evidence was helpful in identifying a number of issues. One of the problems seems to have involved training and development opportunities either not being available or not being taken up. From other evidence, I have formed a picture of lots of staff working very long hours and really trying their best to deliver, but without there being an absolute focus. I am concerned about the resource implications of trying to get the correct training in place and dealing with the cultures of the different organisations at the same time as managing a process that is already beginning to slip behind schedule for this year because of the time involved in all these inquiries. What will that mean in practice, and what will be the knock-on financial effects of doing that job in the proper time scale?
Big issues are involved. I have been genuinely impressed by the dedication, commitment and professionalism of the staff of the SQA. They have been badly shaken by the experience that the organisation has corporately visited on the candidates and centres this year. There were instances in which training and development opportunities were available but were not taken up. We will have to look more assertively at the training that is available, so that we can get some of the key capacities and capabilities in place where required and make them operate effectively.
You also identified a lack of contingency planning. To what extent will it be possible to have a contingency plan in place for 2001, in case anything goes wrong this time?
The contingency plan stems from having a good identification and assessment of risk. In the recent past, the risk assessment seems to have focused on the wrong areas. The view was that the risks clearly lay with software development or with the processing of results. I do not think that anyone felt that the risk would materialise where past practice had suggested that an excellent outcome could be expected. An element of blind faith was perhaps involved in managing that.
I am going to take a risk now and say that Brian Monteith's question could be the last one of the afternoon.
That was a risk on a grand scale, convener. First, Mr Morton, I would like to clarify a couple of points from your written evidence. Under section 2, headed "Since 10 August", the first list of bullet points, "Data checks and clear-up", includes the statement:
No. By 17 August, we had looked specifically at the areas of greatest priority: the candidates whose results were incomplete or inaccurate and whose entry to college or university was at stake. That was the initial focus. Gradually, as we resolved those issues, we were able to examine each successive component of certification this year. At the end of that process, we were in a position to conclude and to make it public that of the courses taken by 147,000 candidates this year, 2.7 per cent were impeded by the problems that we visited upon centres and candidates. In fact, the number of candidates who were affected was 16,748.
Was that concluded by 29 September, the point by which you had achieved clarification of the standard grades? I am trying to put a date on when that was concluded.
Those data were obtained for the production of the submission; as of last Thursday, that was the up-to-date position. I would regard that as concluded.
Fine. Your submission states:
It was a moveable feast—it was an iterative process, as would be any clear-up activity. The information that was given to the minister was correct at the time. There might have been other clarification issues that we considered important for the purposes of the SQA.
Your submission says:
I shall answer your questions in reverse. I had no basis on which to judge whether the number of directors was relevant to the situation. Personally, I believe that it is the role and responsibility of the chief executive and the senior management team to scope the exercise that the organisation takes on. However, I say that with the benefit of hindsight. To a degree, the failing was a corporate one.
Your submission says that the
We are planning to ensure that we have the data in a complete and accurate condition. It is a balance of risk. Do we spend two weeks simplifying the process in order to guarantee that it works or do we become concerned about the slippage and the bigger risk that we might find ourselves in a similar dilemma next year? At the moment, we are simplifying the data-capturing exercise—registrations and entries—on the basis of listening to what the centres have told us about what they want. They want us to give centres the right and responsibility to originate and verify the data. That facility was previously available to colleges and was withdrawn last year. We want to reach a position where the centres are comfortable that the data that we hold are complete and accurate. When we reach that position, all the processes that flow from that will be less prone to the risks that arose this year.
The evidence that we have received from the chairman of the board, Ron Tuck, and David Elliot suggests that they thought that there was a problem of incomplete or incorrect course grades—that would end up on certificates—which had started at a high figure and were being reduced. There was some surprise, even as late as 9 and 10 August, because the problem was larger than they had been led to believe. They portray that as a problem of having been misled about the information that was available and have pointed to one person in particular, Jack Greig, from whom we hope to take evidence later. He was on sick leave for much of June and when he came back at the beginning of July, Bill Arundel had replaced him.
That was your final question, was it not?
No; I have another small one.
Those who have given evidence before me were there at the time. You have referred to their conclusions that they were misled. I cannot comment on that; I cannot contradict or confirm what they said. However, one needs effective management information to be able to manage an organisation. In the case of the SQA, that has been identified as inadequate. I can only presume that the information that was made available was sufficiently credible for people to conclude, in advance of more detailed knowledge of what happened, that the problem related to 1 per cent of candidates and might affect somewhere in the range of 1,000 to 1,400 people. Clearly, that was not the case.
At last week's meeting of the Enterprise and Lifelong Learning Committee, the Minister for Enterprise and Lifelong Learning said that you were taking a number of big sticks to the organisation. In your evidence, you suggest that there was concern about bullying in the organisation and that
I can think of a parliamentarian who described me as a slippery haddock, and I do not recognise myself either in that description or in the statement to which Mr Monteith just referred. I am a chief executive with 13 to 14 years' experience, and I will address the issue of what needs to be fixed or replaced because it is cracked or broken in a positive, constructive way. What happened is regrettable, but most of the contributory factors related to poor data management. Those problems can be put right. I am also encouraged by the capability and capacity of the staff of the SQA to ensure that we do that very quickly.
I have one very quick question about markers. We look forward to seeing the Deloitte & Touche report, but can you tell us what the percentage of unsatisfactory markers was this year compared with other years?
Markers are classified as A, B or C according to quality. The initial indications that I saw over the weekend suggest that this year there are slightly more As than in the previous two years and slightly fewer Bs. Offhand, I cannot remember how many Cs there were. The general impression is that there is not much of a difference in markers' scores between this year and the previous two years.
Most of your comments today and suggestions for changes relate to data management, and I understand very well why that is. If, however, there are changes to be made to higher still and the assessment process, who would make those and what would be the procedure? Mr Osler has said that he would not drive it. Who will consider the effects that the introduction of higher still has had on the examination system?
I will pass this question to Dennis Gunning. However, I have recently had contact with the stakeholders involved in education about higher still, and the SQA has proposed simplifying the way in which data are captured. I take issue with Ian Jenkins when he says that my suggestions are all about improvements and changes in data management at the SQA. My evidence indicates that there need to be corporate changes to the organisation as a whole. There has also been discussion about how the natural process of refinement may lead to simplification of internal assessment. However, that does not put in question the fundamental importance of internal assessment in higher still or the linkages between educational and vocational learning as part of a lifelong process.
I would like to say something about the arrangements that we make—the syllabus, the composition of the units in higher still and the assessment. Douglas Osler referred to the point at which those arrangements were handed over to the SQA. That was the point at which development was finished and implementation was in progress. We have a committee called the national qualifications committee, which is responsible for overseeing this family of qualifications, which includes all the higher still arrangements. There is also a committee called the national assessment steering group, which is chaired by an inspector and has membership from the SQA and the higher still development unit. Normally we would discuss proposed changes in assessments with that group, to ensure that all the key stakeholders are signed up to them. However, ultimately arrangements for the syllabus and assessments in higher still are the responsibility of our national qualifications committee, as these qualifications have now become operational.
The minister set up a review before all this happened.
Indeed.
Is the national qualifications committee the body to which he will report?
No. We are running higher still within the policy that was set at national level. It is not the job of the national qualifications committee to challenge the policy of higher still. That would be done at a higher level than the SQA.
I have one last question, which might be the final question from the committee. One of the problems that we have had to contend with—and which others have had to contend with all along the line—is that of assurances that turned out to be false. I am not accusing you of anything, but before today—presumably on bad advice—you said that early on there was no reason to doubt that the quality control mechanisms were in place, and it turned out that they were not. Today we have heard that the minister was given an assurance—presumably by you—which he repeated in his statement on 6 September and which turns out to be misleading.
I gave an answer to a different question, and that answer was then used, after the fact, in the context of the question about whether the quality assurance checks had been run. Concordancy is simply a way of validating the relationship between school estimates and outcomes. I understand that concordancy checks are run where there is a statistically competent track record to make them a meaningful exercise, such as in standard grade and the old higher. The reason why the check was not run for the new higher is that, in its first year, there is no statistically competent track record from which to establish a reliable trend.
I am not questioning that. However, given the circumstances in which the organisation now finds itself—and I do not doubt your word in any way—do you not think that some external independent reassurance needs to be given, perhaps over the next year or couple of years, which will make people feel that they are getting the truth? I am sure that they would get the truth from you, but do you not agree that, given the difficulties of the past year, extra reassurance would be helpful?
I do. I was trying to answer your question rather obliquely by saying that primacy of responsibility and accountability rests with me, as the chief executive. However, I would support greater openness and transparency. I am not in a position to make a decision on that or to judge how best that can be achieved. The committee may offer some guidance on that.
I will indulge myself and ask one final question. You have submitted a written report to us that I suspect is part of your internal review. Is it part of the review or is it the whole? If it is not the whole, when can we expect the review to be complete?
The report covers the whole review. This was the right opportunity to make public my findings from the operational review. Members will see from the completeness and candour of the report that nothing has been left out.
Thank you for giving evidence to us this afternoon. I am sure that you will be very interested in our on-going proceedings, particularly as they relate to the last point that Mike Russell made, which I am sure we will consider in the future.
Is it at 1.30 pm?
Yes.
Meeting closed at 17:26.