
Education, Culture and Sport Committee, 09 Oct 2000

Meeting date: Monday, October 9, 2000




School Exams

The Convener (Mrs Mary Mulligan):

I welcome everyone to the Education, Culture and Sport Committee, especially our witnesses. Our first witnesses are the young people of South Lanarkshire. You are a politician's dream—we have never had this kind of coverage before. Indeed, this is not our usual everyday event. You will have to come to more of our meetings.

Witnesses will need to bear with us for two minutes more, as we have a couple of housekeeping issues to address. First, I want to make the committee aware of the fire evacuation procedures. On the sounding of the alarm, please follow the instructions of South Lanarkshire Council staff and evacuate the floor by the fire exit doors from the reception point at the stairs opposite or by the internal stairway to the ground floor. The muster point is at the staff car park 400 yards to the rear of the building. Now that you have heard that, you are all safe.

I thank South Lanarkshire Council for providing wonderful facilities. This is luxury for MSPs, and we are thinking of meeting here more often. I also thank the council for its assistance in contacting our witnesses and inviting them to give evidence. Furthermore, I thank the Scottish School Board Association and the Scottish Parent Teacher Council for their help, particularly in identifying parents. Finally, I welcome the parents and teachers in the audience.

Our intention is to hear about the Scottish Qualifications Authority situation from the people who were most closely affected by it. I have said to a number of the young people that we want their experiences of the situation, not just from 10 August—when their results either did not arrive or were found to be incorrect—but in the period leading up to 10 August. Christina, were you sitting standard grades?

Christina Fotheringham (Hamilton Grammar):

Yes.

I believe that the other witnesses were sitting highers. I could be really awful and pick on someone to start. Who is feeling particularly brave and wants to go first?

Lewis MacKinnon (Uddingston Grammar):

When 10 August came, my results did not. I waited but they had still not come by the late post. I was concerned, so I phoned the school to find out whether it had received notification. However, it did not know what had happened, so I phoned the SQA helpline, which I had seen mentioned on television. I was told to phone the school again and then phone the helpline back. The school still did not know anything; it was in the same position as me.

I therefore phoned the SQA helpline again. The people there reluctantly said that they could give me the results via the phone, but would prefer it if I waited until my results came. I was anxious, as members will understand, so I asked them to tell me the results. They went through each result, but omitted my physics one. I told them that I had sat higher physics and asked what had happened with that. They said that they did not have any record of it. I was concerned, because I had done quite well in physics and hoped to study it further in sixth year. I let the school know and waited.

The next day the results finally came in the post. The format of the results is that your overall higher awards are on the front; a more complicated form gives you a breakdown of how you were awarded them, including the unit tests. When I read through it, it turned out that only two of the three unit tests that I had passed had been recorded. I had got an A in my external assessment—the exam that you sit in May or June. I had passed all my unit tests and got an A for the exam, so I expected an A. However, there was no record of me sitting the third unit test. It had been sent away with the rest of the information. I let the school know and received notification in September that there had been a problem and I would get the result eventually.

When did you finally receive confirmation of your results?

Lewis MacKinnon:

I have yet to receive a complete certificate. I received a letter from the SQA, which said that there had been a complication and that I was due an A for higher physics. However, I am still waiting for my full certificate.

Thank you. I will ask each of you to say a few words before I ask members of the committee whether they have any questions.

Alan Burns (Uddingston Grammar):

I was in a similar situation to Lewis MacKinnon's. I did not receive a certificate on the day that the results were due out. That was worrying. We phoned the school to see whether it had my results but it did not have anything, so my mum phoned the SQA helpline. The SQA had my results but at that point it was not able to give them to us over the phone. We were told to phone the school again to find out whether it had them. We phoned the school again, but it did not have them and said that it was unlikely to have them until the next day.

This was on the day that the higher results had come out, but I did not have my results; all my friends had their results, but I was sitting with nothing. It was worrying. In the afternoon, my dad phoned the SQA helpline from his work. He got the results from them. I had got four As but, like Lewis, nothing for physics—a unit assessment had been missed out. When my results came the next day—Friday—the certificate was incomplete. I had four As, but one unit from physics was missing. At that point, I had a fail for higher physics and four As from my other exams.

What did you hope to go on to study?

Alan Burns:

Biomolecular and medicinal chemistry at the University of Strathclyde.

So physics was an important result for you.

Alan Burns:

Physics was an important part of my results. When clearing took place, I could not have gone to university even if I had wanted to. I was left with no choice; I had to go back to school.

Jennifer Irvine (Earnock High School):

I sat three highers and one intermediate 2. I got my results, but my higher modern studies was not mentioned. I was shocked by that, as I had done well in the prelim.

When did you finally hear what had happened?

Jennifer Irvine:

We tried to get through to the helpline, but it was engaged. My dad eventually got through from work and the helpline people confirmed that I had failed modern studies.

What is the situation now?

Jennifer Irvine:

I am still waiting for my appeal.

Thank you, Jennifer. Tell us what happened to you, Namita.

Namita Veer Nayyar (Hamilton Grammar):

I sat five highers last year, but when my results came out on 10 August I had been given only three of them: maths, English and history. For physics, I was given only two of the units and the external exam was mentioned under external assessments. For chemistry, I was given only two of the units and there was no mention of the external exam. I went down to the school, but it did not have the results either.

The next day, I called the school again and was told that I had got a band 9 for chemistry, which was six bands below my predicted grade, so the school was quite surprised. I called the SQA many times, but it was not until the first week in September that I got confirmation that my physics result was a B and that I would eventually be given that grade.

More than a week later, four weeks after I should have got my results, I was told that my chemistry result was a C. The school then put in an appeal for my chemistry result, but I was told last week that there would be no change to that result and that it would stay as a C. I was quite disappointed about that, because my Universities and Colleges Admissions Service form has to be submitted to the school by tomorrow and I will have to write a C on it. For the course that I want to apply to, the University of Glasgow discounts Cs, and chemistry is the most important subject for that course. I feel that what has happened will affect my chances of getting a place on the course that I want to do.

What course do you want to apply for?

Namita Veer Nayyar:

I am interested in doing dentistry.

Christina, I understand that you were sitting your standard grades and also had problems. Will you tell us about them?

Christina Fotheringham:

When my results came out, there was no mention of my accounting and finance, but I was not worried to begin with as I had heard that there were a few problems. However, about a week later, I received a second set of results stating that further information had become available and that any improvement in my grades would be shown. I thought that I had failed, as the second set of results was the same as the first and still did not show accounting and finance. However, when I went back to school, I was told that I was recorded as not having sat the exam, as the SQA had misplaced a small group of accounting and finance exam papers from my school.

Have you heard any more since then?

Christina Fotheringham:

No. I am still waiting for my result.

Have you had no indication whatsoever?

Christina Fotheringham:

No.

Has that affected your choice of subjects for this year?

Christina Fotheringham:

It would have affected me if the school had not allowed me to take accounting and finance as a higher, but fortunately it has allowed me to do that.

That is good. Thank you all for explaining what happened with your results. I shall now allow members of the committee to ask you questions.

Cathy Jamieson (Carrick, Cumnock and Doon Valley) (Lab):

I was interested in people's experiences of phoning the helpline. I had to phone it on behalf of some young people in my constituency. I would like you to say a bit more about your experiences, so that I can judge whether my experience was typical. What sort of information did you get or fail to get? How many times did you have to call?

Alan Burns:

I did not ring the helpline; my mum took the responsibility for calling. Our post usually arrives at 8.30 am; when there were no results in the post she was on the phone straight away. She was able to get through with almost no trouble, but at that point the helpline people could not tell me my results. They had them and the bloke on the end of the phone actually said, "I've got Alan's results here on a computer screen in front of me, but I'm not permitted to give them out at this time."

We accepted that and they told us to phone the school, but when we phoned the school it did not have any results, so my mum then tried the helpline again. By that time, it was obvious that there were lots of problems and she could not get through. She tried for hours and hours. By then it was after lunch and she still could not get through. Eventually, my dad managed to get through from his work and got my results. There should have been more people on the phones.

How did it feel to know that the result was there but you could not get it?

Alan Burns:

It was very frustrating. The person sitting at the end of the phone has the results in front of him and knows what they are, yet I am the one whose results they are and I am sitting at home really worried. It was really stressful. It was not a nice experience.

When it became obvious that the school did not have the results, what did the school say? Did it suggest that it would have them any minute? How long did it take for the school to get the results?

Perhaps Jennifer Irvine would answer that. Did you contact your school, Jennifer?

Jennifer Irvine:

My mum tried to get through to the helpline, but failed, so I went up to the school to speak to Mr Sherry. He explained to me that, if the result was not on the certificate, I might have failed. The school did not have the results. It took a few days for the results to come through.

Did the helpline suggest that the results would be through any minute?

Jennifer Irvine:

Yes.

Was it the same for you, Alan?

Alan Burns:

We phoned the helpline first and were told to phone the school; the helpline led us to believe that the school would have the results. The school was helpful, but it did not have anything. The people there did not know what was going on. I was told to phone back. Eventually, the school said that it probably would not have the results until the next day and that I should phone back then.

I am interested in finding out whether the guy on the helpline believed that the school would have the results at any moment. That is something that we need to ask the people at the SQA. Thank you.

Perhaps Namita can tell us whether it was the same at her school. Was someone identified whom you should contact?

Namita Veer Nayyar:

I went down to the school on the day that I got my certificate. I spoke to Mrs O'Neill, the assistant head teacher. She took down the details of what had happened and I showed her my certificate. The school did not have the results, so there was no way that she could tell me what they were. She said that, because my chemistry result was not mentioned but my physics result was, it was possible that I might have failed chemistry. She was quite surprised. I was too, because I knew that I had worked hard, as I needed the result for the course that I want to apply to.

I have two quick questions. Namita, you said that you were surprised at the result, because it was six bands below what you expected. What did you get in your prelim?

Namita Veer Nayyar:

I got a B in my prelim.

Was the exam that you sat harder than you expected?

Namita Veer Nayyar:

I do not know. I can never tell how I have done when I come out of exams.

Was your teacher surprised?

Namita Veer Nayyar:

Yes, the teachers were quite surprised, because a few of us, all at the bottom of the register, had failed, which made them suspicious. It was not just me; there were others who had been predicted to get an A or a B but who got bands 8 or 9 when the results came back.

Were those people at the bottom of the register alphabetically?

Namita Veer Nayyar:

Yes.

So your teacher was suspicious that it might be something to do with that fact.

Christina, did you find the exam harder than you had expected?

Christina Fotheringham:

No, I felt quite comfortable with it.

What did you get in your prelim?

Christina Fotheringham:

I got a band 1.

As far as you were concerned then, the course had gone quite well.

Christina Fotheringham:

Yes.

What was your teacher's reaction to the fact that you had failed?

Christina Fotheringham:

The teacher found out that a small group of people were showing up as not having sat the exam. He thought that it was impossible that none of us had turned up.

You were definitely there.

Christina Fotheringham:

Yes.

I am sure that you remember it well. So, the situation is that your paper has been lost and nobody knows what is happening.

Christina Fotheringham:

I have been told that my result will have to be based on an estimate from my teacher, which I will get around October or November.

How do you feel, having gone through all the effort of sitting the exam? How long was the exam?

Christina Fotheringham:

I think it was around two hours.

Now you discover that your paper no longer exists. How do you feel about that?

Christina Fotheringham:

I felt really disappointed, because I knew that, although I had put the work in and the teacher had put the work into teaching us, the SQA had mixed it up.

Mr Jamie Stone (Caithness, Sutherland and Easter Ross) (LD):

One of the things that we have to do is to make sure that this never happens again. Although this may seem a blindingly obvious question, I would be interested to hear your thoughts on what should be done. From the sharp end, what would you say should be done to prevent this from happening again?

Lewis MacKinnon:

That is a difficult question to answer.

Jamie is well known for asking difficult questions.

It is important, because we have to sort this out.

Lewis MacKinnon:

I do not know the whole situation; I know only what my situation has been. Certainly there has been some form of administrative error, whether it has been not enough markers or problems with the post. Surely there should have been some indication that that would be the case, at least with regard to people not receiving results. I had heard that there would be problems, but it was not fair that people were not notified that their results were not going to come. That was the most painful part of the experience. Everyone else had their results, but I did not know where I stood.

Alan Burns:

I feel the same as Lewis, because I received no results. There were some indications on the news the night before that a few people would not receive their results, but I felt that if that was going to be the case there should have been notification beforehand to prepare people. In the weeks before your results come out, the tension builds up and you get more nervous. On the day, you just want to get your results and get it over with; afterwards, you can relax, but I could not because my results did not come. I am still highly strung and worried about what is going to happen.

After the work that you have put in, how do you feel about the SQA?

Can we move on from how the pupils feel about the SQA?

In a sense, this is still about how they feel about the SQA. Do you think that the helpline could have given clearer information? Alan, you said that the first time you phoned the helpline, the people there could not give you information.

Alan Burns:

It could have been more helpful and better organised. As I said, the person on the helpline knew my results but was not able to give them to me. That was the line that the helpline took at that time but, later on, after phone calls from other people, people there must have decided that they should give out the results, because that is what happened when my dad phoned in the afternoon—there had been a change of tack.

Was it clear to you that the helpline did not know that the schools did not have the information?

Alan Burns:

Yes, because the first time I called I was told, "Phone the school. It will have your results because we have sent your results to it." However, when we phoned the school, we were told that it did not have anything and did not know what was going on. It was clear that the helpline thought that the school had received the results and that the problem was just that my results had not arrived in the post.

How did people feel when they went to school and found that the school did not have the information that they were looking for?

Lewis MacKinnon:

It was a further blow. You thought, "If the school does not know, and I do not know, who does know? Have the results been lost?"

Was that another fear—that the results had been lost?

Lewis MacKinnon:

Yes, especially as the helpline was saying that the school should have the results. It was not as though someone had the results; they were lost.

Was the school supportive? Did people in school say, "Don't worry; a couple of days will sort this out and it will not make a difference," or were they panicking a bit as well?

Lewis MacKinnon:

The school reassured me that it would do everything in its power to help me to get my results. The school knew my academic ability and what I should get, and was intent on getting it for me.

Fiona McLeod (West of Scotland) (SNP):

I want to take you back a bit. Alan Burns mentioned the press reports just before 10 August, which began to alert you, especially the higher pupils, that something was going wrong. Did those of you who were sitting your highers last year ever get a sense that everything was not going to be completely normal? That could have come from what your teachers were saying or during the preparation for the exam. Did it all come as a big shock just the week before the results?

Alan Burns:

It was a brand new course and it was obvious during the term that there were some teething problems. However, the general opinion was that we were getting through it and that everything was going fine. It was only the week before the results came out, when things started coming out in the news, that I felt there might be a problem. It was said that only a few people would be affected. I thought that it would not affect me and that I would be all right. When it happens, it is a really big shock and people do not know what to do.

Fiona McLeod:

I think that Jennifer Irvine talked about the day when you opened your results and found that modern studies did not appear on the certificate. To what extent were you aware that that was happening to other people? To what extent did you think it was just you?

Jennifer Irvine:

I thought it was just me. I was not aware of the problems the media had been talking about, as I had been on holiday and had got home the night before.

Fiona McLeod:

Do you think it would have been helpful to have got a letter alerting you to the fact that the SQA was having problems this year? You all said that you went to the school when you became aware of the problems. I do not think that, 30-odd years ago, I would have had the sense to go to my school. How did you cope with working out what to do next, having realised that you had a problem?

Alan Burns:

It was almost a gut reaction to phone the school. It is the link—people at the school were the other people, apart from the SQA, who should have known what was happening. We would phone them to see what they had to say. They did not really have a clue what was going on and they did not have my results. That left me in a bit of a situation. I did not know what to do next; the SQA helpline could not give me my results at that point. I was running about all over the place, sick with worry. I did not have my results, whereas everybody round about me did. At that point, my friends were coming to the door, asking me what results I had got, but I could not tell them. It was not nice.

Fiona McLeod:

Would it have helped to have had a very clear explanation, whether by letter—I think the SQA considered sending letters to you all a week before the results came out—or by press statement, that there might be a problem, not to panic and that it would be sorted out?

Alan Burns:

That would certainly have helped prepare me for not getting the result on time. As it turned out, it was totally unexpected.

You are still missing information, even now. Have you had a letter of explanation or some form of apology from the SQA?

Alan Burns:

Yes. I received a letter on 5 September, confirming that I had got my pass in higher physics. The SQA said that that letter could serve as my certificate, for the purposes of getting the correct codes. It said that it would send me a completed certificate in due course. I have still not received it.

I saw a few surprised looks from the other witnesses there. Was it not the same for you, Jennifer?

Jennifer Irvine:

No, it was not. I did not get such a letter.

What about you, Namita?

Namita Veer Nayyar:

I was sent a letter saying that the SQA would investigate the courses for which the results were not complete and about which the school had phoned.

Christina, you were sitting your standard grades, your first national exams. Was your reaction also to contact the school?

Christina Fotheringham:

I felt that there would be no point phoning the SQA helpline, as I thought that it was prioritising the higher candidates. I therefore thought that the best people to contact would be at school, and that they might have information about what was happening. I was fortunate, as they had the information that the SQA had misplaced my result.

Mr Kenneth Macintosh (Eastwood) (Lab):

My question is on similar lines. You said earlier, Lewis, that the form that you received this year was very complicated. Were you shown the form before? Did the schoolteachers give you a dummy form to prepare you to interpret the real form correctly?

Lewis MacKinnon:

No. My only idea of what the form would be like was based on the form for standard grades in previous years—it had just the names of the core standard grade qualifications and the grade. I thought that there would be a similar, simple format for higher, but in fact the form dragged on for about six pages.

It was obviously a complicated form but, when you read it, were you able to interpret it, know how you had done and know what to do if what you were expecting to see was not there?

Lewis MacKinnon:

I had a good look at it and I could tell that I had got an A for the exam that I sat in May. I could also see from the breakdown of the units that a unit test was missing. I knew that I had sat that and passed it, and that it had been sent, so I knew that that was a problem.

Mr Macintosh:

You have sat the highers and you have a good understanding of those exams, but was everybody aware of the differences between them and your higher still exams this year? Was the importance of the unit assessments made clear to you? Did you know that if one was missing you could not get the exam?

Namita Veer Nayyar:

The school had done its best to make it clear that if you did not pass the unit assessments you could not sit the final exam and you could not, therefore, get an overall pass. All our teachers in all our subjects had made it clear that we had to pass the unit assessments to get a final grade. It was because I knew I had sat them all and passed them all that I could not figure out why I did not get physics.

Mr Macintosh:

It sounds as if you were all well prepared in the sense that you knew what was expected of you. Jennifer, you said that you are still waiting for a result. You are now in sixth year. Are you studying the courses that would have been your first choice if you had got that result, or are you studying different courses?

Jennifer Irvine:

I am waiting for my modern studies appeal to come through, but I am studying my first choice subjects—the ones that I wanted to study.

If it comes through and you have failed, will that affect your choices?

Jennifer Irvine:

Possibly, yes. I would probably resit it.

Christina, I think you said that your school had advised you that you would probably be all right and that you should proceed on the assumption that you had got the result that you were waiting for.

Christina Fotheringham:

That is right—I was allowed to continue. My estimate from my prelim was a 1, so they estimated that I would get the same for my standard grade final exam.

Is anybody studying a course this year that they would not have chosen if the exam results had been different? No.

Am I right to say that Alan would not have been at school at all if he had got his results through in time? I think you suggested that you would have had the option of going to university.

Alan Burns:

It was a possibility. I would probably have gone back to do sixth year anyway, but not receiving the award for physics meant that I had no choice. Had I got the result, I might have decided that, yes, I wanted to go straight to university. As it was I had to wait until the result was confirmed, and by that time it was too late.

Johann Lamont:

I would like to ask about the advice that was given once the results had come out. It is clear that there were flaws in the helpline system, in that it was no help whatsoever as far as I can see. What would have been your reaction if you had been told that the results would be delayed? Would that have caused less anxiety than what really happened, with a number of people getting flawed results and a helpline being set up? The SQA could have decided not to put the results out on a particular day; it could have delayed them.

Alan Burns:

I think that it maybe should have done that, to ensure that all the certificates were right. It could have delayed them for a week or so, just to make sure.

Johann Lamont:

All of you have told us how you received incomplete certificates, or certificates that claimed that you had not sat an exam at all. Are any of your friends who appear to have got a complete set of results in time anxious about whether their results are accurate?

Alan Burns:

When people heard on television that the SQA would be checking exam results, they were uncertain about their results. They would say, "Okay, I've got an A or a B, but did I really deserve that? Did I really fail that exam? Has there been a glitch?" I know about a few people who were in that situation.

Have those people been reassured now that the SQA has said that it has been through that checking procedure?

Lewis MacKinnon:

Eventually. The SQA said that if changes were made to results, only upgrades would be possible—marks would not be taken away from candidates.

Cathy Jamieson:

I will follow up quickly a comment made by Alan Burns.

You said that the course was new and that while a lot of preparation had been done, there were some teething problems. What were those teething problems in your experience? Did anyone else experience teething problems?

Alan Burns:

In science-based subjects, such as chemistry and physics, candidates study learning outcome 3, which is an experiment that must be conducted and written up. Candidates must pass learning outcome 3 as part of their internal assessment. To begin with, I do not think that teachers were quite sure how to go about that part of the course: how to set it up, how much help to give pupils and whether they should let pupils get on with the work themselves. As that part of the course had to be passed, teachers gave us quite a lot of help but, as I said before, they were not sure how to approach it.

Did anyone else have concerns about teething problems with courses?

Lewis MacKinnon:

The layout of the course means that a lot more pressure is put on time throughout the year. I certainly felt that. If I was going to be sick and off school for a day, I felt that I would miss a lot of work and that it could be difficult to catch up. The pace was rapid throughout the year and pupils had to put in a lot of effort to keep up with the pace. It was probably inadvisable to miss days by going on excursions or being ill.

Cathy Jamieson:

It has been suggested in some of the evidence that has been submitted to us that as the exam timetable was more compressed this year, some people had to sit a couple of big exams on the same day. Did that happen to anyone here? Could the timetable have been laid out differently to benefit candidates?

Alan Burns:

That did not happen to me, but a lot of my exams were closer together than I would have liked. I had chemistry and geography exams within a day of each other, which did not give me much time to prepare for the first exam and then relax for a bit, as I had to go straight into the next exam.

I sat more exams the previous year, when I sat standard grades, but I had more time in between exams to prepare and relax, as they were spaced out further.

I have a few questions about exam preparation.

You may not be able to answer this question, Alan, but did you sit any prelims for higher still exams? Did you sit an exam paper that was issued by the SQA before the final exam?

Alan Burns:

I do not think that the papers for our prelims were issued by the SQA. I think we got our prelim papers from an independent body, although they were supposed to be based on what the higher still exam would be like.

Exactly. So you sat an exam that was like a prelim. Were the higher still exams like the prelims? Were the prelims a good preparation for sitting higher still exams?

Lewis MacKinnon:

I will step in at this point.

There was quite a big issue about the maths course. The prelim that I sat was straightforward and what I had been expecting. There were no past papers for higher still, because it is a brand new course, but there were model papers that had been passed as being a good representation of what the higher still exam would be like.

I got 88 per cent for my prelim, which is quite a good grade and well above the 70 per cent required for an A. However, the higher still maths exam was nothing like what I expected and I know that a lot of people were in the same situation. I ended up dropping 20 to 25 per cent between the prelim and the final exam—I ended up with a B overall.

I think that there was a specific problem with maths this year.

Do pupils tend to sit prelims about two months before the final exams?

Alan Burns:

We sit them in February or March.

I have a problem trying to cast my mind back to my school days.

They were a long time ago.

Exactly. You say that there were no past papers, but were you issued with model exam papers in February and March to take home and look at?

Alan Burns:

From what I can remember, we had papers for some subjects, but they were previous higher papers or old prelim papers, not model prelim papers.

Namita Veer Nayyar:

For many subjects, we were using previous papers, because the school was not sure what type of questions to expect.

Were you warned that the exam might not be the same as the prelim?

Namita Veer Nayyar:

Yes, because we were not sure what the exam would be like. However, we were told that the questions might be along similar lines.

We will have a couple more questions and then try to wind up the session. I can guarantee that, as soon as I say that, members will suddenly have more questions. Please bear with us for a few more minutes.

If you were not going to get an award for an exam, would it have helped if the report form had said "no award"? Did the form just have an empty space for a "no award"?

Jennifer Irvine:

Yes.

And there was nothing at all about your modern studies grade on the form.

Jennifer Irvine:

That is right, although the unit studies part of the form said that I had passed all the units.

So in future would it be more helpful if the form did not make you think that a subject had been missed out altogether?

Jennifer Irvine:

Yes.

Did the core skills aspect make much sense to people? Did they expect it?

Alan Burns:

It was not totally unexpected. At the beginning of fifth year, we received a slip containing our current core skills. However, although we knew they existed, we did not really know what they were for. They just appeared on the certificate.

So what you want is a certificate that tells you what highers you have passed.

Alan Burns:

Yes.

Ian Jenkins:

Although it is important to have information about individual units, students probably do not need to know that on the day they receive their results; they want to know what exams they have passed. The other information can come later and will, perhaps, be easier to understand.

Alan Burns:

That is right.

Ian Jenkins:

I want to ask about your experience of the courses and exam pressure from the national assessment bank aspect and unit tests. When higher still was first established, it was probably thought that the tests would be sat through the term. Did any of you experience any slippage?

I know that some of you probably got As and did not have any resits, but from your colleagues' experience, if someone failed an assessment test in October, was resitting the test a problem for the class and the teachers? Did that create any pressure towards the end of the term for some of your colleagues, if not for yourselves?

Alan Burns:

The new higher still geography course, for example, has many assessments—I think about 13 units—but you need to pass only roughly half of them. A few people in my class failed some tests and needed to catch up. That meant that, near the end of term, they had to sit the unit tests at lunch time. They might have originally sat the test in October, but they did not have the resit until April.

So the structure meant that it was possible that pupils might have half a dozen resits in a short period of time.

Alan Burns:

Yes.

I have a final factual question. Did any of you do higher still English?

Alan Burns:

No.

You will understand Ian Jenkins's deep questioning on this issue if I explain that he used to be a teacher.

Presumably you will all have to take examinations in future, as you are locked into the system. What comes to your minds when you think about that after your experience this year?

Lewis MacKinnon:

Never again.

Alan Burns:

Oh no, not this again. Will there be more problems?

Jennifer Irvine:

I do not feel that confident in the system.

Namita Veer Nayyar:

It is a question of trust. Can we trust what we are going to be given when we sit exams?

Christina Fotheringham:

All of the above.

Namita, I am not sure what you plan to do. You had hoped to go to Glasgow University. What are your plans now?

Namita Veer Nayyar:

My UCAS form has to be sent in because I am hoping to do dentistry at Glasgow University. I have spoken to the department a lot as I am interested in the course. I was told that C passes were discounted. That means that only my three A passes and one B pass will be taken into account. The university wants me to get a B in a chemistry certificate of sixth-year studies this year. That will make sixth year another hard year.

But you will have a bash and see if you can pick it up.

Namita Veer Nayyar:

Yes, because I really want to do dentistry. I will have to work hard at it.

The Convener:

I thank each of our witnesses for giving up their time and coming in this morning. It has been useful to hear what they have experienced. That will add a lot to our inquiry. We hope that, by the end of the inquiry, we will be quite clear about what went wrong and that we will be able to set about ensuring that it does not happen again—either to our witnesses or to members of their families who will be moving on to sit the national exams. I hope that we will speak again at lunch time.

We will now speak to some of the parents who have been through this situation. I welcome the parents who will give evidence to the committee this morning. As they were here while the young people were giving evidence, they will be aware of our procedures. We would like to know how the situation affected them or other parents whom they know. I am sure that cases will have been passed to Mrs Moore, who is the chair of a school board, as they have been to committee members. We know that the exam situation affected the students, but I am sure that there was also an impact on the lives of their families. We are anxious to know how parents were affected, how we can put the situation right and how we can deal with such issues in the future.

Mrs Moore and Mr Anderson will say a few words and then members of the committee will ask questions.

Janette Moore (Uddingston Grammar):

As the convener said, I am chair of the school board. We had discussions in the board prior to the examinations, because we heard that there were problems with their administration. I have heard anecdotal evidence, as everybody else has, from other parents, but this weekend I have been approached by several parents who wanted me to put forward their cases. Two of those cases really must be heard.

I have a son who was doing standard grades and the problems affected him, although not as badly as some other students have been affected. However, the effect on him and on many other pupils who were doing standard grade exams and have gone on to do highers is one of the key points that must be considered.

Would you like to tell us about the two parents who spoke to you?

Janette Moore:

The first was a father who phoned me the other night to say that he and his family have been going through a living nightmare since the day that the results came in. He is concerned that his daughter's confidence has been eroded. He thinks that her confidence has gradually ebbed away with the pressure of work during the year and he also feels that she has been let down and that that feeling will never go away.

When she started the year, her expectation and that of her parents was that she would achieve five B grades. She dropped one subject to make sure of that, but when she opened her envelope she had one B and one C. The parents would very much like her papers to be re-marked. They are worried that those who marked the papers were not properly trained—they have lost all confidence in the system. The problem has been exacerbated by the fact that one of the subjects that the girl is noted as having failed cannot easily be retaken, as the course is now full. Next year's pupils had taken up the places before her situation became clear.

Another parent, whose son did not receive a certificate on the same day as other students, has a vivid memory of chasing the postman along the street. Two parents said the same thing to me, so I imagine that that must have been a common experience throughout Scotland. She was concerned by the length of time that it took to contact the SQA and by the fact that, however courteous they may have been, the staff were unable to be of any help, at least at first. As we have heard from the pupils, the helpline staff's only suggestion was that parents should contact schools. She felt that the SQA staff seemed to be totally unaware of the extent of the problem. Repeated phone calls were of no use, and it took until the afternoon before someone was prepared to tell her what results her son had achieved.

What that parent is most concerned about is that the pleasure and enjoyment of something that was unique in her son's life was taken away from him. He was never able to celebrate his success—he achieved five A passes—so something that should have been pleasant for the family turned out to be a time of tension. That is what she wanted me to put across to the committee.

Other points that parents raised with me concerned the number of internal assessments and the stress that they occasioned. Parents also cited the fact that the unit tests did not always bear any relation to the end examination and mentioned the apparent increase in work required in comparison with the old highers. Some pupils feel that what they are doing now is sitting courses for the sake of passing exams and that there is no enjoyment in studying subjects that they want to learn about. They are working only to pass exams and tests.

The main point that comes across is that parents and their children are experiencing a lack of trust in the examination system. Those who sat standard grade last year are questioning what they are doing now. They are not going ahead as if highers are a normal part of their education; they are questioning whether the course is right and why they have to do all the assessments. They wonder whether, in the end, their exam result will be a true reflection of their capabilities and potential. That is most worrying. We must ensure that this year's pupils are not disadvantaged as they go into the system.

Thank you. You raised a number of points that I am sure we will come back to during questions.

Mr Anderson, would you like to tell us about your experience?

Ken Anderson (Strathaven Academy):

I welcome the opportunity to do so. My wife is sitting behind me and what I have to say concerns our son, Stephen. My evidence is therefore quite personal, but I should also like to give some views from my privileged position as a member and vice-chair of the school board of Strathaven Academy.

On 10 August, we arrived home early from Brittany so that we could be there when the results came through the door and so that our son would have his parents with him, only to find that there were no results. We tried to phone the SQA helpline, but the phone lines were engaged all day. We contacted the school, which was very helpful, but it had not received the results either and so could not pass them on.

It was not until well into the next day that Stephen was able to talk, via the helpline, to the SQA to get his results by phone. He sat three exams—geography, maths and chemistry. He got a B for geography, and failed maths and chemistry. The school contacted Stephen and confirmed the results, as they had been transferred electronically to the school on the same day. Some two weeks later, on 22 August, his certificate arrived.

Stephen was told on the Friday—the day after he should have received his exam results—that the certificate had been sent to our old address. We lived in Nairn in the north of Scotland, but that was some three years ago. Stephen had sat his standard grades at Strathaven Academy as well as two highers the previous year, and the results for those exams came to our Strathaven address. However, for some unknown reason, this year's results were supposedly sent to an address that we left three years ago.

I should also say that Stephen was lucky enough to get a second set of certificates. A week to 10 days after getting his first certificate—which was two weeks late—he got another certificate. The SQA was obviously trying to belt and brace everything. During that time, he was waiting to find out whether he could go to Warrington Collegiate Institute to take up a university course in music publishing and business studies—something that he cared passionately about and had been working towards for a long time. However, because he did not have a certificate in his hand and had been told over the phone that he had failed two subjects, the stress in the family, as members can imagine, was unbelievable.

My wife made a number of phone calls to Warrington. To cut a long story short, we were thankful that Warrington took the view that they would allow my son in because of his previous highers: one secured higher and two that he had been predicted to pass. I cannot stress too much the feeling that we had, which was not elation—we almost got on our knees and said, "Thank you."

During that time our son was very stressed out. He had the opportunity to go to Loughborough with some friends, so we sent him down there for a week, even though the situation was all up in the air. We felt that he had to get away from it. We said that we would try to work round the situation. Thankfully, he is now in Warrington, he is happy and he is getting on with his life. However, he has made two appeals. I cannot tell the committee whether Stephen passed the exams or not. He might have failed or he might have passed, but given the quality of information from the SQA, I do not know either way.

The stress that we, as parents, went through during that time was the climax to a year that was stressful, because of the internal assessments. We felt that there was increased stress for teachers, pupils and parents. It is appalling that we put our children through such high stress. It is normally a stressful time for them anyway, but the stress then was unacceptable.

When I joined the school board this year, I took the trouble to phone the SQA's helpline with some questions I wanted to ask so that I could become more informed about higher still, particularly with regard to internal assessment. The SQA failed miserably to answer my questions to my satisfaction, or even to come up with any information. The helpline just did not know the answers. As a parent and a member of a school board, it was apparent at the beginning of the year that data were going missing, which put stress not only on the board, but on teachers.

As parents, we wrote to the SQA—as we were invited to do by Mr Ron Tuck—regarding the maths higher. We wrote as Mr and Mrs Anderson, but in true SQA fashion the replies came back to Mrs Anderson. The last letter came back without a date on it. Perhaps that is indicative of the organisation. We wrote to say that we were concerned by the standard of the maths paper, in particular about the awarding of grades. We expected an accreditation authority to ensure that students were given a fair examination and that they were not subjected to unnecessary stress. We told the SQA that the school board was aware of stress among the school's pupils. We said that we should give our students the opportunity to succeed, and that we should maintain and increase standards. I will not go through its replies.

Recently, we received a copy of the report on the maths higher and my view is that things went wrong. I do not think that the pupils who opened that exam paper were offered a fair opportunity to give of their best. Although the SQA states that it has a quality assurance procedure, I do not believe that, given what has happened. That is a parent's view of what happened to his son.

Mr Stone:

I thank both witnesses. I was interested to learn that you are chair and vice-chair of your school boards. What official representations were made to the SQA by your school boards, your head teachers or any other organisation of which you are aware? You have both talked about the run-up period and Mr Anderson mentioned his involvement as a parent. Was there any correspondence with HM Inspectors of Schools or with the SQA, perhaps via the rector?

Ken Anderson:

We wrote to express our concerns to the chief executive of the Scottish School Board Association.

When did you do that?

Ken Anderson:

That was before the summer recess—I do not recall the date.

We wrote about the high level of stress that pupils were experiencing—concern about that had been expressed by parents, pupils and teachers. We were also concerned that the SQA did not seem to be retaining data. The head teacher of Strathaven Academy—who is sitting in the public gallery—will confirm that repeated requests for data were made. I understand that she will give evidence to the committee. There were many cases in which data went missing.

Janette Moore:

I have an extract from minutes showing that it was reported to us in April that the SQA arrangements were very time consuming, and that the amount of assessment in the new system was engendering anxiety among pupils. The organisation of the school was coping; we congratulated our SQA co-ordinator and our office staff because they coped with a tremendous amount of extra work.

In June, it was reported to us that the exams had taken place without any problems in the school—which was due to the efficiency of the staff. There were concerns among teachers and management that teachers had been asked to take marking home and to be absent from school to do that marking. The school board thought that that was a bad step; the school had started the next year's work and pupils would not be able to progress so well under supply teachers, even if supply teachers could be found.

At the start of the new session, I asked that the board be provided with an up-to-date report of the situation in the school for the board's first meeting at the end of August and a comprehensive report was provided. As a result of the discussions that took place among members of the board, we decided to write to Mr Bill Morton to express our concerns and to ask him to answer various questions once the inquiry had taken place. We copied the letter to Mr Sam Galbraith and South Lanarkshire Council. To date, I have received no reply, but I did not want one until it became clear what had happened, what assurances the SQA could give parents and what measures could be taken to ensure that such a situation could not recur.

Did either school board address the role of Her Majesty's inspectors of schools in the run-up period?

Janette Moore:

No.

Ken Anderson:

No.

Michael Russell:

I would like to put the same question to both witnesses. Janette Moore said that two of the problems are that pupils sit exams just for the purpose of passing tests and that there is a lack of trust in the system. I do not know whether you have children who have still to come through the system, but given your experience this year—as members of school boards and as parents—how do you feel when you think about the next few years? When I asked the young people that question, I received very short replies. Will you expand on any fears or hopes you might have for the future?

Janette Moore:

Two of my sons have been through the system and my youngest son did his standard grades this year. His approach to his higher year is totally different from the approaches of the other two. For them, it was just a normal part of life—there was no problem. They trusted the system and looked forward to gaining the proper qualifications to go to university. However, the son who sat standard grades last year sees no value in the two identical certificates that he has received for his standard grades. One of his subjects is still missing from both of them, although he has been assured that he passed it. The problems have devalued what he did. He should be looking forward to a successful year—he is doing five highers—but he is questioning everything about his course.

My son was depressed during the summer, as he entered his higher year—I thought that I should ask for his permission to tell the committee that. He was unsure whether what he was doing would be as valid as what his brothers did before him. He does not expect to have the smooth transition that they had. He has asked them whether they had to do such-and-such a subject and said, "You did not have as much homework," and, "You did not have to do all these tests." He should feel that he will have enough qualifications this year to gain entry to university, but he does not have that trust or that confidence in the system. What has happened has changed his outlook; that is all the more clear to me because I have seen how different it was for his brothers.

Ken Anderson:

I question the value—not only as a parent, but as a person in industry—of teaching pupils just to pass exams. In other words, I question the value of saying to pupils that all that is important is that they pass this or that assessment and that once they have passed it, they must pass the next. They are being taught narrowly, just to get them through assessments towards a final examination. That was what went on this year.

If we consider the number of examinations that pupils take, it could be said that all they are doing at school is cramming for their next exam. Value and quality in the coursework do not exist. That is disappointing, but it is the way we seem to be going. I hope that we can get away from that—instead of waiting until the end of a course, only to find that they have failed, pupils could be continuously assessed. I do not have a problem with that. I go back to stress levels—I think that it is wrong that pupils must pass internal assessments before being given the opportunity to sit an external exam.

Michael Russell:

Given those profound concerns—which we have heard from a range of people from inside and outwith education—what do you hope for from the committee and its inquiry, both in relation to the SQA, which is the immediate issue, and in terms of the issues that you have raised?

Ken Anderson:

The committee must find out what went wrong—nothing can be fixed until that is understood. I hope that the inquiry can pull out all the information because a get-well plan cannot be put in place until all aspects are understood.

We have said that the situation must never be repeated. However, although the committee is conducting an inquiry—which is a tremendous thing—pupils have already started the courses for their next set of exams. We are in danger of building on problems that will still exist when the system is fixed. I hope that something can be done quickly and that the problems are not compounded this year.

I hope that the committee also considers the issues around higher still—not only issues related to the SQA, such as data input, but the problems to do with quality checks and ensuring that fair examinations were set. The committee should consider whether it might be better to delay implementation of higher still for a year, rather than introduce something that is half baked. The bottom line is that we, as parents, feel that it is unacceptable that our kids should be treated as guinea pigs at a vital time of their lives.

Janette Moore:

We need to find out what went wrong with the administration of the exams—that needs to be corrected. Also, the exams and the internal assessments must be re-examined—I have heard time and again about levels of stress. Parents seem to be hearing about it—I suppose that that is because they experience the problems in their home. Perhaps the pupils are not aware that they are going through something that they have not been through before and that is not normal.

Cathy Peattie:

I also have kids who have been through the system. I remember the stress that accompanied waiting for the envelope, but I do not remember the stressed fifth and sixth-year pupils who have been described today. Witnesses are telling us that the kids are stressed throughout the year and I think that that is news to the committee. What can be done to change that?

Janette Moore:

Something has to be done about the internal assessments. There is a big build-up of pressure because pupils feel that they are being examined on several occasions throughout the year. As Ken Anderson said, some pupils are doing five highers and are assessed many times for each higher. For a pupil who takes his or her work seriously, that creates an awful lot of anxiety.

Higher still was meant to get away from the stress that results when everything depends on one exam.

Janette Moore:

Yes, but whenever somebody sits a test or an exam—whatever the school has chosen to call it—they experience anxiety.

Ken Anderson:

I asked the SQA helpline whether, given the fact that the assessments are set by schools, it was possible that the pass levels are different in different schools. I also asked whether the degree of difficulty of assessments differed from school to school. In other words, is there a level playing field, and is that checked by the SQA, as the accreditation authority? After a lot of going round in circles and talking to different people—this was during one phone call, but the person to whom I spoke first could not answer without asking colleagues—it was admitted that it was possible for a school in Strathaven and a school in Hamilton to have different pass marks for the same subject. I found that appalling. So, that went into the melting pot along with the assessments.

There is also no level playing field for assessments. Members should bear it in mind that pupils have to get through the assessments before they have the right to sit the final external exam. The results of internal assessments are not just kept in-house; they are an important element of the build-up to passing the exam.

Cathy Peattie:

I am interested to know how we can move on. How can we give kids confidence in next year's exams? We have heard stories of kids who are sitting standard grades—looking forward to their highers—who are not confident that their papers will even come back.

Janette Moore:

That is the major problem. Time is running out for those pupils. Schools have recently been told that the exams have been brought forward by a week, which has caused panic among some pupils. First there were all the problems getting last year's results right, with the lack of confidence that that caused; now pupils are being told that they have one week less to prepare for the next lot of exams. That was not a good step if we are trying to make things right.

I do not have a clue how we go forward. I do not know what can be done. Something has to be put in place to protect this year's cohort. They are uniquely disadvantaged. They have heard what happened last year and do not know what will be put in place in future. Something has to be done quickly.

Ken Anderson:

A good start would be to say, "These are the things that were wrong." That would be refreshing. I do not mean pointing the finger, but telling people exactly what went wrong. We need someone to say, "These are the positive steps that we are taking to start to repair the damage." The damage will not be repaired overnight, nor will confidence be restored overnight. We must get away from the blame culture and people scoring points. That is not what parents want to hear. They want to hear what constructive steps are being put in place to fix the situation. They have not yet been told that. This inquiry is a good step forward, but people need to be told quickly what will be put in place. You must start to make some inroads into repairing the damage. If the issue is fudged—if there is a whitewash—you will not instil the confidence that must be put in place.

I am conscious of the time, but I think that Brian Monteith had a question.

Mr Brian Monteith (Mid Scotland and Fife) (Con):

Given that you perceive there to be an unlevel playing field between schools for marking and internal assessments, and given that the assessments are not so much assessments as hurdles—tests for entry to the final part of the course, the exam—would you be more satisfied if the internal assessments were no longer a hurdle, but a means of advising the pupil and teachers how well a pupil is doing in a course? Would you be more satisfied if the assessments took place but were merely for information, rather than a door into the exam?

Janette Moore:

That would be a very good step, but the value of the unit assessments standing on their own as qualifications for some pupils would then be lost. I am not sure how that could be overcome. It would, however, be a good idea. A test should provide an indication of a pupil's level and of the amount of work they need to do. It should not, in itself, pose a problem to the individual pupil.

I am also worried about the intermediate pupils. They were the ones for whom the whole system was set up. The intention was for everyone to have somewhere to go after fourth year.

I am very worried that we are always talking about the higher candidates but not about the intermediate students. I wonder whether they have been lost in the process. I wonder what their view is of what has happened. They were told that their results would have to wait until the higher results were seen to. I do not know what message that gives to those students.

Ken Anderson:

I would give anything to get away from the concept of just getting pupils over the hurdle towards one of assessing pupils' positions, what they have retained and what they need to do to improve. That would be applauded. There is perhaps a halfway house for fixing the problem.

I have no evidence to say that one school is marking more leniently than others, but any such charge would not be denied by the SQA because there are no checks and balances. I am not saying that that happened, but it could happen. Teachers are trying to get their students over the hurdles. They are not trying to devalue anything, but they have to deliver as well. They are in a very awkward position.

Mr Monteith:

As parents, how did you react to the certificates themselves? You may want to consider the question from another angle—Ken Anderson may wish to do so from an industry point of view, for example.

At primary school, the levels for five to 14-year-olds go from A to F, with A being the lowest and F being the highest. When pupils move into standard grade, reports are given in terms of levels 1 to 7, with 1 being the highest. Under the higher still arrangements, access 1 is the lowest level. Within the various levels—access, intermediate, higher and advanced higher—A is the highest grade and C is the lowest. As parents, do you find that confusing? Is that something that we should be looking at with a view to simplifying the system?

Janette Moore:

That would be a very good idea. Several parents have told me that they found the new certificate for standard grade and highers far too complex and that all they and employers want to know is which subjects have been passed and at what grade. Pupils would be pleased to get a run-down of how they have done in each part, but I do not think that that need concern everyone else, or that it is necessary to include that information in the final certificate.

I take on board what Mr Monteith says about the five-to-14 age group. There is a problem in that parents do not understand the grades. To them, an A is the highest mark but, as you pointed out, Mr Monteith, A is the lowest mark for five to 14-year-olds.

Ken Anderson:

I believe in the KIS—keep it simple—system. The exam certificate did not look simple: it was confusing and it was not helpful to parents, to pupils or to industry. We can understand, from looking at the certificate, why the SQA got into such a mess. It bit off far more than it could chew. It was, I think, a fatal error to put all of a student's history—probably leaving out only their inside leg measurement—on the certificate.

The Convener:

You will be relieved to know that you have now come to the end of your ordeal. We are very grateful to Janette Moore and Ken Anderson for giving their time and answering our questions this morning. As I said to the young people earlier, it is important to hear from such people as you, who were so closely affected, about the exact situation. The suggestions that you have made will also be taken on board.

I thank you and hope that you will stay with us for the rest of this morning's session and that we will have a further opportunity to speak at lunch time.

I suggest that we have a five-minute break while we change witnesses—but I mean five minutes.

Meeting adjourned.

On resuming—

The Convener:

Good afternoon, as it now is, to all of you. I thank you, as teachers, for joining us. I know that you have sat through our previous two sessions, so you will have heard all the information—not for the first time, I am sure.

My intention is to proceed in roughly the same way. I will give you a few minutes at the beginning to say who you are and where you are from. We will then open it up to questions, which is probably the easiest way of handling the discussion. I will start with Elspeth Banks. If you tell us which school you are from and your position we will move on to questions.

Elspeth Banks (Strathaven Academy):

I am the head teacher of Strathaven Academy in South Lanarkshire.

Mark Sherry (Earnock High School):

I am assistant head teacher at Earnock High in Hamilton.

Jim Browning (Uddingston Grammar):

I teach English and am the assistant head teacher at Uddingston Grammar School.

Catherine MacKichan (Holy Cross High School):

I am principal teacher of maths at Holy Cross High School in Hamilton.

Richard Goring (Hamilton Grammar):

I am assistant head teacher at Hamilton Grammar School.

Thank you.

I believe that some of you were SQA co-ordinators within your schools. Can you identify who that was?

Mark Sherry:

I am a co-ordinator.

Jim Browning:

So am I.

Richard Goring:

So am I.

That is fine. Some of the questions will be directed towards you.

Was there a marker among you?

Jim Browning:

Yes, I marked after the Easter diet.

Richard Goring:

I was also a marker.

It is helpful for us to know that when we are asking questions.

Ian Jenkins:

When the parents gave evidence, they spoke about the weight of internal assessment. Will you give us your views on the place that it took in the first year of higher still? Will you talk about the relationship of intermediate 2 to the modules in respect of the unit assessments as they were introduced into higher still, and how that affected the courses? We will come to the higher still development unit later.

Will you tell me about the burden of internal assessment? Did it mean that people were tested too much?

Elspeth Banks:

All secondary school pupils in Scotland are used to assessments. In the course of study in their higher year, all pupils are used to end-of-topic check tests and assessment. Those are for formative, as opposed to summative, judgment. That was the difference this year: although the end-of-topic check tests continued, the unit tests were formal. That placed an additional burden on our candidates.

We started the year with some trepidation. All schools knew that we had a big job ahead of us. I am delighted that staff in Strathaven Academy—along with staff in all other schools in South Lanarkshire—rose to the occasion, met the challenge and delivered the goods.

We were mindful of stress levels, not just for staff but for our candidates. A feature that began to creep in at an early stage was the absence of a senior pupil the day before a unit assessment. It was not a major issue, but it began to happen. Certain pupils who were not usually off school were off the day before the assessment. They felt that they were not coping with stress levels. Being absent from school for a day had an adverse effect on work in their other subjects. We offered support to the pupils concerned. I made it clear that they had to continue and that if on the day of an assessment it did not go well and they failed, there would be another opportunity for reassessment. That was a significant feature of term one, which we had resolved by term two.

Mark Sherry:

We knew well in advance what the internal demands of the new qualifications were going to be. The new qualifications are a good thing; if they can gain a currency in the market place, the unit tests will be of value. Attainment at unit level is of value.

As Elspeth Banks said, all schools worked with our children to prepare them for the tests, ease the burden and reassure them that if they experienced failure there would be a subsequent opportunity. When that would be was a teaching decision—whether it would be better to deliver it quickly or to wait until later in the year.

In our school, we and the students coped well with internal assessment. We used the assessment tests that were sent to us by the SQA, which had marking schemes. All schools stuck rigidly to those. I do not believe that there was a discrepancy between schools in the marking schemes: I do not think that children in one school passed tests that they would have failed in others. There was rigour in those tests, and I believe that all schools carried them out rigorously.

Jim Browning:

I agree with Mark Sherry. That academic rigour may put the pressure back on students. They recognised early on that the tests were official and that they were being monitored much more closely than had been the case under the formative assessments that we undertook previously. The number of tests to which they were being subjected became quite difficult to manage. Because of the number of students involved, it became impossible to manage the timetabling and to ensure that students who failed tests were not sitting three internal units on one day or in one week.

Our experience means that we will be better at timetabling this year, but the compression of unit assessments was a problem for students, as the tests tended to fall in November. That could not be avoided.

Catherine MacKichan:

Perhaps a bit of undue pressure was put on pupils. The unit tests in maths are set at a minimum level of competency. We found that the majority of our pupils passed first time and that a pass in the unit test did not tell teachers or students much. Perhaps pressure was put on them to pass unit tests that did not really prepare them for the final exam. I do not know how useful those tests were for teachers, parents and pupils.

Richard Goring:

We discovered a problem in that some pupils, particularly those who may have been just inside the category of those sitting intermediate 2 or higher, struggled with some of the unit tests, and that tended to build up. Those pupils might have been under the most severe pressure to pass the unit tests as they had a huge number of reassessments to undertake at the same time. That became quite discouraging for them.

On the other hand, many pupils who were not capable of sitting a final examination still picked up worthwhile units throughout the session. A large number of pupils who would previously have failed the exam or opted out of sitting it have something to show for the session.

Jim Browning:

You asked us about the currency of intermediate 2 in relation to the Scottish Vocational Education Council. My students who are sitting intermediate 2 see it as valid currency. They value the concept of intermediate 2 and its structure, through which their progress is indicated to them. They recognise that passing intermediate 2 gives them an access route to further qualifications that is much more valid than the previous system of modules.

I will pick up on Mr Goring's comments on unit currency, which one of the parents also mentioned. Students are filling in their UCAS forms. Those who passed internal units but did not succeed in the external exam are in a bit of a quandary as to what to record on their forms. I was in contact with UCAS last week and it advised that it does not know what to do with that information either. That sent students the message that the promised currency of internal units does not exist. That is another glitch that will need to be ironed out.

Could you address the conflict of cultures between SCOTVEC and the Scottish Examination Board, which some people have talked about? As a teacher of English, you might be aware of those arguments.

Jim Browning:

The SCOTVEC culture was that internal assessments were verified at the end of the year. Teachers made up their assessments and had their courses validated. The assessment element was internal and, as one of the parents suggested, could vary from school to school.

The difference between that and the new system is that the new system creates a national validity. When one teaches a course, the material seems to have more credibility. It is the same material—we are teaching much the same subjects—but there is credibility when pupils sit an assessment or a test.

Pupils such as those mentioned by Mr Goring are borderline when they head towards higher—they do not have the ability to go on to higher courses immediately. They are given a sense of, "Here is something of worth that tests me. Here is something that I am really being taught and that will build a step for me."

I am not sure that SCOTVEC's system encouraged pupils to feel that. In my opinion, intermediate 2 brings out the rigour that was brought out by the SEB.

Ian Jenkins:

Could you say something about the relationship that existed between the higher still development unit and the teachers? We have heard unions talking about a dismissive attitude to teachers' worries—worries that seem, to a degree, to have been borne out. Could you talk about the higher still development unit and HMI in the consultation and development processes, and how teachers felt about that?

Jim Browning:

There were times when we were made to feel that we were whining and being obstructive for the sake of it, whereas we were being constructively critical when we felt that there were areas to be addressed. There was the implication that we were trying to slow down the process. The process was not the problem; the timing was. Perhaps some of our observations were misheard, if not ignored.

Richard Goring:

I agree with Mr Browning. When we went to meetings held by the development unit with members of HMI or the SQA, there was an assumption that what was going to happen would happen. We were not being listened to. A number of concerns were raised on many occasions. Many people felt that they were being dismissed out of hand. That led to a lack of confidence in the system. There was no listening ear.

So although there were small changes to the length of a unit or the timing, the people you met did not get engaged in an argument about the bigger issues?

Jim Browning:

I mentioned English previously. There was little listening at first. If the teaching unions had not made a strong case, we might have had everybody on an inappropriate higher still English course. Even now there is lots of tinkering to be done with that course before it will be fully satisfactory.

Is it your view that it would have been better to introduce intermediate 2 and build up?

Jim Browning:

Yes. We have done that in our school. We have introduced intermediate 2 this year, because we have that option. Our students at intermediate 2 are going through that course and higher will be introduced when it is ready.

But you are not doing higher English yet?

Jim Browning:

We are not; we are still doing the revised higher.

Mark Sherry:

On the introduction of intermediate 2, schools liaised with the SQA on how we would plan the introduction of courses at different levels. There was a lot of support from the SQA on that. This session, all schools went with a majority of highers and the intermediate 2s that they felt they were able to do, and they have a plan over the next two or three years gradually to introduce intermediate 2s.

Elspeth Banks:

Every school in South Lanarkshire agreed a phased implementation plan, which was helpful. Many discussions took place.

Cathy Peattie:

We have real live markers in front of us. I have never had that before, so forgive me, but I have some questions that I would like to ask. First, if you have marked in previous years, what are the differences between this year and previous years?

Jim Browning:

I have marked for 20-something years.

Wonderful. You will have lots of answers then.

Jim Browning:

The biggest difference was the initial contact. Normally, I am contacted in January. I get the opportunity to decide which papers I will mark, so I can plan my year and move forward accordingly. This year, I received a letter in March inviting me to start marking during the Easter holiday. Given that I had had no previous contact with the SQA, had I planned a holiday I would have been unable to do the marking. I was able to do that marking, but I was unable to do the summer diet marking because I had not arranged for time off. I would have had to attend a markers' meeting. I had not made that arrangement and I had filled my diary for June, so there was no possibility of taking on that marking. Despite that, I got phone calls from the SQA three and four days after the examinations took place, asking me to consider marking at intermediate 2—a course I had never taught. It was clear then that there was pressure because of the number of markers. I declined the invitation.

The material that came through for the higher English folio was of the same quality—I was still doing the old higher. I felt that there was no problem with the quality of marking at that stage.

Were you a marker as well, Mr Goring?

Richard Goring:

Yes. I have been a standard grade geography marker for some 15 years. With the exception of the late invitation to mark, there was no difference at all this year, I am afraid. Some teachers at my school were in the position of being asked to mark their own pupils' papers, which obviously raised a few eyebrows. In addition, some people were invited to mark but not to attend markers' meetings. However, that was not my experience. The marking process this year was very much the same as usual, except that we had a reduced period of time in which to do the marking, which created a fair amount of pressure.

Jim Browning:

I sought some anecdotal advice from my colleagues about the marking situation, and heard a couple of what I would describe as horror stories. There was a marker who, on the last day of her diet of marking, was phoned up and asked if she would accept extra papers. She was due to fly off on holiday, so she declined the invitation and went away, only to find a bag of papers waiting for her on her return. That is just one example.

We have heard reports of unsolicited scripts arriving when people were off on holiday.

People have mentioned inappropriate markers and lack of training. Could you tell us what happens at a markers' meeting?

Jim Browning:

I can answer only for English, but prior to a markers' meeting, we would be sent a set of specimen scripts. We would already have received candidates' actual papers. We would be expected to study the specimen scripts and have a look at the standard of responses in the various scripts selected from one or two schools. On the day of the meeting, there would be a long discussion. Members of the markers committee would already have decided on the marking grades that they would like; the discussion would take place to standardise the quality of the marking.

By the end of the discussion, each marker would have been made clearly aware of the standard to which they should work. Thereafter, we would go away and begin our marking. Shortly after the marking process, we would send some specimen scripts back. The SQA would sample-check those scripts to ensure that we were marking at a reasonable grade. There would also be higher English standardisation papers, which everybody would mark so that we could see that we were still grading properly. I understand that a computer-calculated adjustment factor of plus or minus is then applied to our marking.

That standardisation process is vital to ensuring the academic rigour of the marking. We are there to be trained. At the first meeting, we are nervous, but the second one is not so bad. After that, it is all pretty straightforward.

Do you think that it is important for anyone who is involved in marking to attend a markers' meeting?

Jim Browning:

They must attend a meeting.

We have heard that there were occasions when people did not attend meetings.

Jim Browning:

I would be very distressed about that. For one paper—the internal English folio—one is invited to mark on the basis of sample scripts and a written report, once one has several years' experience. However, that is the case only for those markers with knowledge and experience of marking. For anyone who is in their first three years as a marker, it is a condition of being a marker that they must attend the meeting.

Cathy Peattie:

We are hearing that what happened this year is putting teachers off marking papers in future, and that there is an impending crisis for next year because the new diet is under way and new markers need to be in place. What messages can be given to markers and teachers about the importance of their continuing to be involved? Will there be a crisis in recruiting markers this time round?

Mark Sherry:

It is important that people mark. If they do not, the system cannot work. It is probably one of the committee's tasks to find a way of ensuring that marking can take place under reasonable conditions, rather than expecting people to mark hundreds of scripts in a condensed period of time. We must ensure that markers are remunerated appropriately for that work and that there are enough quality people out there to do it.

Jim Browning:

I echo what many of my colleagues have said. The majority of teachers mark because the exams have to be marked, and because it is good professional development. When one has done marking, it informs one's teaching in future years. Many teachers mark for that reason, rather than simply for financial gain.

Teachers have to see that there is value in what they do. This year, I do not think that they were valued in the public eye—many of the complaints that one hears in the media seem to reflect on the markers, not on the system that employed the markers. Many markers do not want to get involved next year, as a result of the sense of condemnation, which blights their lives too.

Elspeth Banks:

We all agree that forward planning was a major issue. I have a number of colleagues who are markers. I know of two cases of colleagues being contacted the day before the markers' meeting to ask whether they would mark. They are extremely experienced teachers, and I would have no qualms about their involvement, but the timing was an issue. We had moved on to the new timetable—like a number of schools in Scotland, we change timetable in the middle of June—and they felt that their pupils would suffer if they took on the marking. That issue should be considered in future. With advance notice, people might take on marking.

One of my colleagues—a principal teacher who is an experienced marker—marked several batches in July and was also sent scripts from our centre. We checked with the SQA and my colleague marked the scripts and returned them with a covering letter. He was concerned that the scripts should be marked. He ensured that the quality assurance process was in place and informed the SQA of that. I believe that that has been the case in a number of centres.

Michael Russell:

I want to ask about your experience of data handling and data processing in the run-up to the diet. The questions are of particular relevance to the co-ordinators, but other witnesses may have views. We have heard a great deal of evidence about the difficulties that individual schools have had. The submission from South Lanarkshire Council indicates that some schools were submitting information six or seven times—on one occasion it was 15 times. What was your experience? Did it ring sufficient alarm bells to make you do something about it?

Mark Sherry:

I am an SQA co-ordinator. It was the first time that problems had been anticipated. Students are supposed to be entered for a course by 31 October. That deadline had to be extended into November. Schools send data to the Strathclyde educational establishment management information system, which collects data from South Lanarkshire and other authorities and sends it on to the SQA. On 8 March, SEEMIS wrote:

"As you are aware the SQA have been taking longer than expected to process data submitted by schools. Last week, the SQA completed processing the data submitted by yourselves up to Christmas"—

this was on 8 March—

"and they then agreed to take the data that had been accumulating since the beginning of January."

That rang alarms. We were submitting data to SEEMIS, which were being sent on, yet there we were in March and there were problems.

There was other information like that. On 4 April, we were asked to do a complete check of all the data that we had submitted.

Were you asked to do that directly by the SQA?

Mark Sherry:

Yes. In April, we were asked to recheck all the data that had been submitted. That included all the initial entries and all the unit results that had been put into the system by that date.

We did that, because the system needed it to be done, but we were asking principal teachers to check work that they had already checked to ensure accuracy before data left the school. We then had to redo it. Teachers took on that work, because they felt that they had to ensure that the data were as correct as they could be. To some extent, we tried to ensure that the pupils were not involved in those concerns about data. It was a new year and they were anxious about new courses. We felt that it would be counterproductive to inform children that there were potential data difficulties.

In the summer term, our concerns increased. I have a list of when we were asked for various data.

Can you tell us about that? It is important.

Mark Sherry:

On 20 June, the situation became a little bit more concerning. Our initial concerns were that the data for the national qualifications would cause a problem. We wanted to ensure that the situation would not arise in our school—as it did, unfortunately, for some of the students who were here this morning—where students passed an external exam, but their record showed that they had failed internal assessments.

On 20 June, it was requested that we resubmit data about English standard grade. Until then, we had no concerns about the quality of the standard grade data that we had sent to the SQA.

On 12 July, we were asked to do a major check, involving 13 courses or elements of courses. Fortunately for me, I was on holiday, but other people carried out that work. The depute came into school to sign that the information that we provided was correct. When we phoned the SQA to query one aspect of the check, we were told—after all the work had been carried out—that it was not required.

On 14 July, we received a request to resubmit information—for the third time, I think—on health and food technology, and on lifestyle and consumer technology. On 21 July, we were asked to resubmit grades for standard grade science practical abilities. Many of the grades for which data were being requested in June and July were internal estimates of work that had been sent in March. Obviously, that was a concern.

Elspeth Banks:

I would like to talk about the issues that arose in Strathaven Academy throughout the summer. At about the time that the SQA requested that a senior member of staff be available throughout the summer, we started to receive calls. From 29 June, the SQA made 60 checks with Strathaven Academy. Sometimes, a request would be for confirmation of one internal assessment grade for one candidate; sometimes we would be asked to check internal data for several candidates. In all, data for 500 individual presentations were requested and returned.

The first check was about a folio that appeared to be missing for home economics. We were quickly able to ascertain that that had been submitted.

Things began to hot up while I was on holiday for two weeks. A senior colleague who was in school at that time informed me that on 17 July the SQA had written to inform the school that a number of cases had been identified in which results had not been processed for national units. Copies of a number of internal assessment forms by group were supplied with that letter. On 19 July, the SQA requested that we hold fire on updating the forms, as there appeared to be an anomaly in the computer system. The SQA felt that it had most of the information that it required to allow certification to take place. My colleague asked how long it would be before we had confirmation of that fact and was told that if we had not been informed by the end of the following week—28 July—we should assume that everything was okay.

On 20 July, we contacted the SQA and were informed that the difficulties had not yet been resolved. In anticipation of a request for information by the SQA, we checked all the internal assessment forms. Several members of staff were asked to come into school. As Mr Sherry said, it was very important to check with members of staff to make absolutely sure, for the sake of the pupils, that the information on forms was entirely accurate. On 24 July, we returned 39 reports, and by 28 July all outstanding reports had been posted. That was the most significant period, but we continued to receive more minor checks thereafter.

Given those difficulties, which I am sure were mirrored elsewhere, were you surprised by what happened on 9 August and 10 August? Did you fear that this might be going terribly wrong?

Mark Sherry:

Eventually, we were requested to have someone available during the summer. Before the end of the summer, we decided to send a paper copy of all our internal results to the SQA's Glasgow and Edinburgh offices, to try to ensure that, if the SQA had to confirm units, it would have those copies even if there were problems with electronic copies.

I must be honest and say that I was disappointed, when I arrived at school on 10 August with other members of staff, ready to see the school results, to find that they were not there. At half-past eight, we decided to phone the SQA because we were not sure whether the results would come from the Post Office or from a courier. We phoned the SQA and were told, "Sorry, you won't get your results today." The SQA said that it had written to the local authority on the Tuesday of that week to say that the results would not be in schools on time.

Our first task was to phone the numerous teachers who were coming in to look at the results to tell them not to bother. From half-past eight onwards, the phone was ringing as parents tried to find out some information, and children were coming in. We could not help them.

Michael Russell:

You said that you did not receive your statement of results, and that you had no notification that you would not receive that statement. That is confirmed by a letter from David Miller, the chairman of the SQA, which we have received in evidence. He wrote to a fellow board member on 11 August and said:

"Schools are particularly incensed because they did not receive yesterday (and it was not our intention to do it then—something we did not announce) a Statement of Results".

Should the SQA have told you directly?

Mark Sherry:

Yes. I feel that the SQA should have announced that those results would not be available in school. Many children, having opened their certificates and queried them, thought that the school would provide a solution. Children such as Jennifer Irvine, whom you spoke to this morning, came into school, and we could not even confirm for her what her results were. I found that difficult.

Michael Russell:

I wanted to ask you about Jennifer Irvine, because she referred to you by name in her evidence. She came to you and said that she had not got her higher modern studies and that there must have been a problem. When she said that, and lots of other people said that, did your mind go back to all the difficulties with data collection, and did you say to yourself—I will put this question to others too—that there must have been a major problem?

Mark Sherry:

This is only my impression—I have no evidence to back it up—but I believe that Jennifer Irvine's difficulties were to do with marking and the quality assurance of marking. Jennifer was an estimated grade 3 in modern studies, yet she went to grade 8; she went from the high 60s or 70s to the low 40s. In modern studies, we had to appeal for 10 of our 26 candidates. We appealed for seven of them at stage 1, and the appeals were all granted. Unfortunately, Jennifer is in stage 2, which will not be complete until the end of October. Over the past three years in modern studies, we had no appeals in 1997, no appeals in 1998, and three appeals in 1999. This year we jumped to 10, seven of which have been granted at stage 1. It is only my opinion, but I feel that there were problems with the marking of that exam.

Michael Russell:

I want to ask Jim Browning a specific question. Alan Burns and Lewis MacKinnon from your school gave evidence. When you discovered that they had not got their results, did your mind go back to the data entry problems? Did you immediately make that connection?

Jim Browning:

They were able to tell me what the SQA had told them; I could not get through to the SQA on 10 August.

The SQA did not answer the phone?

Jim Browning:

I got through to the switchboard eventually and was told that somebody would contact me. I was contacted at twenty to five the following day. For a school the size of mine, that was nonsense. I managed to get through to someone myself the following morning to get some information.

I was very grateful to our office staff: once SEEMIS managed to download the information, the staff managed to put it into a format that I could use. They spent the day doing that. Similarly, in the summer, they spent one day working until late at night, faxing information to the SQA. We had to do it that day, because next day SEEMIS was going off-line as it turned round for the new session.

We were able to reassure Lewis MacKinnon and Alan Burns, as far as we could, that this was an internal component. As we had the record of their results, we could tell them that we could resolve the situation. Lewis has also been waiting for a maths appeal. As he said, he sat the exam with an 88 per cent mark in his prelims and managed to get through it despite breaking a leg two days before. Although he survived that whole process and got four As and a B, as far as we know, his appeal will not be granted. At this point, I just want to compliment Lewis.

Michael Russell:

My final question is for Richard Goring. We heard Christina Fotheringham's extraordinary story about a paper that no longer exists or is somewhere in the ether. As a teacher of long standing, how do you feel about a pupil who has gone through the course and sat the paper with every expectation of passing, and whose paper disappears? Presumably you regard that as unforgivable, but how could it happen?

Richard Goring:

There are only two ways that that could happen: either papers have been lost or a batch of pupils who were expected to get credit did not turn up for the exam, which was clearly not the case. Ten pupils are in this category—

Are the papers of those 10 pupils lost?

Richard Goring:

They are the only 10 pupils in that grouping who were estimated to get credit passes or good general passes. As a result, they sat the general and credit papers instead of the foundation and general papers that were sat by the rest of the group. However, those 10 candidates all received code 99 for their exam.

Please explain that to us.

Richard Goring:

It means that the student did not turn up for the exam.

But they did turn up for the exam.

Richard Goring:

Yes.

So we have an exam system in which pupils can sit an exam and the paper disappears. What does that tell us about the system and what can we do to prevent it happening again? Surely that devalues everything that you have been working for.

Richard Goring:

To my knowledge, it has never happened before. My reaction on 10 August was absolute disbelief; my tremendous faith in the Scottish examination system has been built up over many years. In July, we submitted 105 sheets of paper containing confirmed results; principal teachers had to come out to the school in the middle of their holidays and so on. All that extra work was done to ensure that the results would be issued on time. I was absolutely flabbergasted when I arrived at the school on 10 August to discover that 134 higher candidates—almost 50 per cent—did not get complete certificates.

I do not want to cut into your answering time, but I am aware that time is moving on. We have sessions this afternoon.

I will be brief. It has been said that the schools were not informed at any stage that the results would not be coming. Were you actively informed that they would be coming?

Jim Browning:

No. On the day that the exam results come out, I would usually go to the school, study the results, process them and have them ready for examination by colleagues. I would use that data to inform myself about the option process, as pupils who have failed or passed exams often need to change their options. Over the next three days, I would conduct 40 to 50 interviews with pupils about changing their options.

This year, when I was summoned to the school on 10 August when the meltdown had happened—I had hoped to stay home and see my daughter's results—there was nothing to refer to, nor was there any way that I could help the pupils. It was quite agonising for the office staff and myself to try to deal with parents' anxieties.

We have heard about pupils' anxieties on matters such as missing results. However, a simple cause for anxiety was how to read the certificate. Although the boys from my school said that they were not really sure whether they had been told how to do that, we had held briefing sessions on the subject. However, the form is very complex and, even though you might see one on an overhead projector, it means nothing until you get your own; even when you see your own form, it still means very little. It is like an electricity bill.

Johann Lamont:

It is important that the results go to the school. If, at any stage, it had been suggested to you that the results might not arrive on that day, I assume that you would have explained why it was so important that they should.

Mark Sherry said that his school phoned the SQA at half-past eight in the morning and was told not to expect the results because they had not been sent. At half-past eight the same morning, one of the students was told that they should phone the school because the results were there. At a simple level, there is a problem with that.

There is an issue about the stress that was caused by the SQA's inability to deliver or to tell schools what was going on. The stress of the exam system has also been commented on. You have experience of the old exam system and the current one. Is the stress that is caused by the end-of-unit assessments a different matter? Are there figures to prove that the stress had increased? Was the drop-out rate higher? Did more kids than in the past say that they could not face this year? Could you compare the level of stress among youngsters taking a win-or-lose final exam with that of pupils on higher still courses?

Catherine MacKichan:

I speak only for maths, so my situation may be a little different. However, when the results were finally published this year, I found that I would need to submit 31 appeals at higher level. To put that into context, we had 125 presentations at higher level last year, and I would normally submit six or seven appeals. Many pupils felt that they had done badly in the exam, simply because it was not fair. We were reassured that that would be taken into account at markers' meetings. Two students who had been our top students since first year had scored an A in their prelims. One had gained 88 per cent in the prelim and the other had gained 84 per cent. They were within the top 12 of our year group of 125, yet they both ended up being awarded a C.

I have advised those children to go on to an advanced higher course this year, but they were reluctant to do so. Their confidence has taken a real knock. They have lost confidence in mathematics. Their teachers have constantly to reassure them. They are capable students who should have been given As—I hope that they will get As on appeal—but they do not want to continue with maths. We have to battle against that constantly.

Mark Sherry:

The problems that we have experienced with the certification of the new courses may have put more focus on the internal demands. As Elspeth Banks said earlier, children have always had regular end-of-unit tests. I know that those tests are more important now because they are certificated, but such tests were always in evidence. Perhaps the SQA needs to do more to ensure that the national assessment bank test better measures someone's ability beyond their bare competence. At the moment there is duplication because, in addition to that test, we have to give an additional test that measures ability in a similar way to the external exam. It would help if those tests could be combined.

Was this year any different from an ordinary year, in terms of the number of youngsters who were saying, "I can't do this course because I can't cope with the stress"?

Mark Sherry:

It was a normal year in our school in that regard.

Jim Browning:

One or two of our students asked to drop down a level, or to drop out of one of their groups of subjects because they felt under pressure to get all four. I would call that a realistic appraisal of the situation, rather than stress. However, I have two students who have already requested stress counselling for this year.

Mr Goring, you mentioned that some markers had to mark their own papers. What do you mean by "own papers"?

Richard Goring:

Papers that had been written by pupils that those markers had taught. That happened in two cases.

Is that unusual?

Richard Goring:

It is unheard of. It is a condition of marking that you declare that you will not mark papers from your own centre. You are not sent papers from that centre.

So you would not expect markers to receive papers from pupils that they had taught.

Richard Goring:

That is correct.

Mr Monteith:

A number of factors have been suggested—singly or in combination—for the increase in the number of appeals: the quality of the marking and the processing of papers; the employment of markers; the fact that the nature of higher still might make it more likely that some pupils will fail; and the fact that some parents pushed for appeals, almost for the sake of it, in the hope of obtaining better grades. What are your observations on those, or other, factors?

Mark Sherry:

I am not sure that the nature of the courses has given rise to more appeals. However, our school submitted double the number of appeals this year—we had 125 appeals at higher level, whereas the most in the past three years was 65. Only two of those appeals were as a result of parental requests. We wrote to all the parents to say that there had been problems with certification this year and that they could contact the school if they felt that their child had been disadvantaged. Both responses that we received concerned higher maths. The problems do not lie with the courses; it is the marking system that may have caused the problems.

Elspeth Banks:

We submitted 131 appeals at higher grade, which is more than twice the norm for us. In each case, there was a substantial discrepancy between the estimated grade and the results. That did not happen across the board; it happened in eight specific subjects.

Only on a few occasions have we submitted appeals or requests for review on behalf of parents who felt confused about the whole issue and wanted to know for sure that their children's grades were what they should be. In those cases, I appended a letter stating that the request for review was from a head teacher. In addition, we needed to have confirmation that other issues had been taken into account, so where we had medical certificates for pupils or there were extenuating circumstances such as bereavement or illness, we resubmitted the certificates. If I wrote to the SQA during the session as a result of parent or teacher concern relating to such factors as the non-arrival of teaching materials, for example, I attached a copy of that letter to the appeal. We are giving the candidates every opportunity to have a review of their results.

In its written evidence, South Lanarkshire Council said that the moderation process was almost wholly discredited and that little moderation was carried out. What impact would that have on the examination process, if it were true?

Elspeth Banks:

Moderation is and always has been an important part of the quality assurance process in Scottish education. Staff recognise that and, although they always think that a request for moderation materials will cause additional work for them, they appreciate that it is part and parcel of the quality assurance process.

It was therefore somewhat frustrating when a sackload of moderation materials was returned to us at the end of June. We thought that the sack had been intercepted on the way to the SQA, because it was open and there were tapes and papers at the bottom. We felt that our candidates were going to be disadvantaged because the material had not arrived at the SQA. However, it turned out that the sack was being returned from the SQA. I received a letter stating that, due to a variety of difficulties that had been experienced this year, the SQA had been unable to undertake the moderation of materials and that, on this occasion, our internal assessment grades had been accepted.

There was a feeling of frustration that, if the SQA had been aware earlier that the moderation process was not going to be feasible this year, it would have been better to have said so—to have said that internal assessment grades would be accepted and that the SQA would return to the rigour of the process next year. The fact that the materials had been returned in such a bad state left a bitter taste in some people's mouths. The letter that we received at that time was perhaps a signal that all was not well.

When was that?

Elspeth Banks:

It was in the third week in June when the sack of moderation materials arrived back in school.

I have been promised that the final two questions for this part of the meeting will be short. I will hold members to that.

Mr Macintosh:

What are the experiences of the SQA co-ordinators in the current academic year? It is very early, but I would like to know whether they have started the process of submitting pupils for registration. Are they confident that the software being used by the SQA is receiving the data that they are transmitting to it? Are they confident with the situation as it is?

Mark Sherry:

We cannot enter students for courses at this stage. There are two programmes in the school. As soon as we are told that we can use the SQA programme, we can move all the data across from our other computer programme. We should be able to do that before 31 October. As yet, we have been asked not to enter data in the SQA programme. We do not have a firm date for when we will be able to access the programme.

Richard Goring:

I endorse those points. The current situation is rather redolent of last year, when time scales were slipping. There is some anxiety in schools that, unless we get things registered as soon as possible, we will have difficulties.

Why did only about 20 per cent of English departments in the school system go for higher still this year? Does that suggest that the whole thing was started too early?

Jim Browning:

I cannot answer for those departments that went for higher still; I can answer only for those that did not go for it. We did not go for it because, initially, we would have had to carry out about 30 assessments. English cannot be assessed on a timed basis; it is not a subject for which there is an easily quantifiable learning outcome. We would have been assessing but doing no teaching.

Those of us who were given the opportunity to opt for higher still delayed, principally because of the number of assessments. The programme carried on for a year. It was refined a bit and some aspects were taken out of the course, which will reduce the amount of assessment. However, that has changed the nature of the course and we have therefore been given an opportunity for further delay, which we have taken. We do not feel confident in the results. This is personal and anecdotal, but I understand that the English results of schools that went for higher still do not live up to the results of those that did not.

Ian Jenkins:

That goes along with my experience. Does that suggest that a decision was taken a number of years ago to put the higher still programme into operation before those dealing with English—and therefore the largest number of pupils—were ready, so that the system looks a wee bit dodgy now?

Jim Browning:

I cannot help feeling that I am being drawn into giving a political answer. I would rather not go down such a speculative path.

The Convener:

I am quite happy for you to sidestep that one, Jim. I think that Ian Jenkins has already made up his mind on that.

I thank each of you for answering our questions so clearly. I invite you, along with the other groups who gave evidence, to stay with us for lunch, when I hope that we will get a chance to speak further. Thank you for attending this morning.

I remind committee members that, at the beginning of this morning's proceedings, I mentioned a letter. I will make copies of it available to all members and we will discuss it immediately after lunch in private session. I ask that members return from lunch promptly. We will begin at 2 pm.

Meeting adjourned.

Meeting resumed in private.

Meeting continued in public.

The Convener:

Before we invite the witnesses to join us at the table, we will deal with an item of unresolved business. Members will have received copies of the letter from the Minister for Children and Education on the disclosure of information. I hope that they have had a chance to read it.

If there are no questions, I will outline what is proposed in the letter. As the committee decided, Alex Neil—the convener of the Enterprise and Lifelong Learning Committee—and I met the minister last Thursday to discuss how we could ensure that there are no gaps in our report and that we receive the information that we feel is missing. We have interviewed the civil servants, and at the end of these oral evidence-taking sessions we will have the opportunity to interview the minister. However, the feeling persists that we do not have information that we need.

The minister's letter proposes that he provides the conveners of both committees with a list of items of written advice that have been offered him by civil servants. The two committees would then discuss the list and have the opportunity to raise questions about it. Those questions would be passed to the Executive, which would respond in a memorandum. The two conveners would be given access to the documents providing the answers to those questions, so that they could verify the memorandum. Once verified, the committee could consider the memorandum as an open document, which could be appended to our report and could form the basis for comments in our report.

I know that several members initially wished to see every piece of information that is available. I also recognise that the civil servants who gave advice did so on the understanding that it would not be made public—that must be taken into account. The committee must produce a report that will answer the questions of the parents and students who addressed the committee this morning. From the oral and written evidence that we have received so far, I believe that the proposals offer a way in which we can reach an accommodation with the Executive to ensure that we produce such a report. I suggest that the committee accepts the offer that has been made. If we do, we will ensure that our report is fully informed. Is there any opposition to that suggestion?

Michael Russell:

I appreciate the work that you have done on the proposal and I thank you for consulting each member of the committee. I object to the proposal on two bases. First, a motion under section 23 of the Scotland Act 1998 has been lodged, seeking the disclosure of all information. Members will be aware that section 23 overcomes any ministerial or civil service codes, which are subordinate to that legislation.

The Scottish Parliament has the right to seek information. This is perhaps the most important inquiry that the Parliament has undertaken. Therefore, we should hold out until we receive all the information that we seek. We should ask for all documentation, including advice.

Secondly, although your point that civil servants gave advice in the expectation that it would remain confidential is a serious one, that argument becomes circular. At some stage, the Scottish Parliament must break that cycle and state that advice should not be given in that way. That is a Westminster convention, which is not in accord with the consultative steering group principles of openness and accountability. I think that this is the stage at which we should insist that we receive the documentation that we seek.

We heard this morning, from the children who gave evidence, the great damage that has been done. In the statement that he made on 6 September, the minister said that all information would be provided—he did not qualify that assertion in any way. Now we have an opportunity to follow up the motion on section 23 by making a formal request for information, in accordance with the law in Scotland.

I appreciate that the proposal is a compromise that you have worked hard to achieve, convener. I am grateful to you for doing that, but I think that a compromise is not nearly as good as establishing now, on behalf of the Scottish Parliament, the fact that we do not operate as Westminster does but work in the full light of day. The information that we need to draw our conclusions exists in the education department and should be provided in full.

Mr Stone:

Three points come to mind on reading Sam Galbraith's letter. The first is that whatever we do must be co-ordinated with what the Enterprise and Lifelong Learning Committee does. It is not clever for one committee to do one thing and another something slightly different.

Secondly, the minister says that he will give a list of correspondence. The snag is that, if that is only a list of memos and letters and their dates, we will not know what documents to ask for. Will there be a fuller description of each item?

Thirdly, I realise that you have worked hard on the proposals, convener, but, as I suggested to Johann Lamont, I wonder whether a way out would be to opt for four party spokesmen—Mike Russell, Brian Monteith, you and me. Would that be acceptable?

Mr Monteith:

There is a logical train of proposals in the letter. We must ask ourselves whether all the information will be available to us. It would appear that a list of items of information will be provided, covering the appropriate period. In response to the questions that we ask after we have seen the list, a memorandum, which will presumably be based on the minutes of the meetings at which advice was given, will provide the information that we need. That can then be verified by the conveners.

The problem with that train of proposals is that the list of items is not cross-checked or verified, as Jamie Stone pointed out. While I reserve my comments on Jamie's suggestion, it is important that, at the very least, we require the conveners to be able to cross-refer or verify the headings of the minutes of meetings at which advice was given with the list that we are to be given. In that way, we will know that we are being offered all the information. Later, when we ask to see that information, it can be cross-checked and we will know that all the information under those headings has been provided.

If we are given those assurances, the minister's proposals might meet the requirement of making available to us the advice to which we wish to have access. The minister's proposals might also enable us to come to a conclusion about the advice given to him and its interpretation, as we will be able to ask him questions about it. We would be able to do that while observing the important point made by the Executive on preserving the confidentiality, not of the advice itself, but of the civil servants who provided that advice. Any party that aspires to government and that wishes to obtain frank advice should bear that point in mind.

The minister has a point: it is understandable that civil servants will seek to give advice in confidence. If we were to accept the proposals, we would reveal that advice, but we would not seek to reveal the official. My personal view, with which others may differ, is that it is not fundamental for us to know the name of the official who is attached to the advice. I want to establish what was in the advice and how the minister acted on it. I am minded to accept those proposals if clarification is given on the creation of the list.

As Mike Russell said, a motion has been lodged on section 23 of the Scotland Act 1998. I was disappointed with proceedings during First Minister's question time on Thursday. Until then, a cross-party procedure had been in place, with all parties in the committees in negotiations with the minister through both of the committee conveners. Had the offer that was on the table been found to be inappropriate, it would have been open to Mike or to any member from any other party to lodge a motion for the Parliament to discuss.

Unfortunately, the new leader of the SNP, who has adopted a party political posture, has created a cause célèbre. He is trying to eliminate the cross-party approach that existed; I would like to return to it by supporting the proposals that are before us, as refined by my suggestion.

Cathy Jamieson:

I am fairly new to the Education, Culture and Sport Committee and have the benefit of having worked on a number of inquiries before I became a full-time politician.

I came to this inquiry looking for objective evidence that would allow us to find out what went wrong and to consider how to put things right, or how to assist others to put things right. Quite simply, I am not interested in members of the committee making party political points. I was happy that the convener of the Enterprise and Lifelong Learning Committee and the convener of the Education, Culture and Sport Committee appeared to have found a way of allowing us to get on with hearing evidence from witnesses. I do not want to spend a lot of time this afternoon discussing disclosure of information—I think that we should move on.

I am concerned about the issue of confidentiality and about how we should pursue it in another forum in future. I am sure that some of the work on freedom of information that the Parliament will undertake will cover that point. However, the motion could act as a block and could prevent us from accessing the information that we want. I am happy to move forward on the basis of the assurances that we have been given. If we believe that information is being kept from us at any point, we will revisit the situation, but I am happy to go forward on the proposals that are before us this afternoon.

Johann Lamont:

If I were a cynic, I might think that the committee's inquiry might not produce a result and that another way of getting a result would be to discredit the inquiry. I am disappointed that someone lodged that motion, although I do not know who did. The motion was lodged, not as a reaction to what this committee or the Enterprise and Lifelong Learning Committee was asking for, but because someone simply decided to do so. That step may have politicised the situation unhelpfully. The danger of that approach is that it gives out the unfortunate message, particularly to the young people from whom we heard evidence this morning, that some members of the committee want to collude with a cover-up.

I want to revisit the subject of the disclosure of information at each stage of our inquiry in order to check that we are getting the information that we want and that that information has been made available to the convener. The suggestion that I discussed earlier with Jamie Stone on the representation of various party spokespersons would reduce the size of the committee to a manageable number in order to consider evidence. It is a matter of trust; the significant difference is that elected members would have sight of evidence rather than consultants—that would make a difference to me and I would be comfortable for the conveners to have sight of that evidence.

Cathy Jamieson made a good point about freedom of information, which has been echoed in various other places. I hope that the committee's experience will inform future debates on freedom of information. It strikes me that, in relation to issues of confidentiality and the protection of officials who give advice, there is a difference between the Scottish and Westminster levels and that of local government. I would be happy for movement to take place on freedom of information, but that is a separate political argument. Our experience will illuminate that discussion, but I do not think that it is germane to our decision on the minister's proposals.

I hope that we accept the proposals and that we revisit them, keeping a close eye on the information that we receive. The proposals are the way forward and will allow us to get to the nub of the situation and to look for the solutions that were discussed this morning.

Mr Macintosh:

I will be brief, as I do not wish to hold up proceedings.

I want to state for the record that I, too, am uneasy with some of the ideas that have been raised. I am very much in favour of open government and uneasy with the practice of the Official Secrets Act 1989, which seems to be affecting our new transparent and devolved government in Scotland. Having said that, it is the committee's duty to seek a practical way forward. A possible solution and a couple of helpful suggestions are on the table and, to be frank, we will have to take up one of those suggestions.

Ian Jenkins:

In Sam Galbraith's letter, the paragraph at the bottom of the first page says that

"the Conveners of the Committees will have access to the documentation".

It goes on to say that the conveners will be allowed

"to verify the memorandum".

I presume, convener, that if you felt that something had been omitted from the memorandum, you would have an opportunity to address that. Because two conveners will be present, we can expect the verification to be done in an unbiased way.

The Convener:

Yes.

As there are no further questions or contributions, I will answer a few of the points that have been raised.

Let no one doubt that this is probably one of the most serious inquiries that the Scottish Parliament has been called upon to hold since its inception. Committee members feel keenly the expectation that is placed upon them. It is important that we have access to all the information that will enable us to answer the questions that have been asked about what went wrong and to consider how we can put right the situation for the future.

It is essential that no one is able to pick up the committee's report and say that there are gaps in it. For that reason, it is important that we ask for as much information as possible to be made available.

However, we do so with hindsight in a new Parliament. Many of us do not think that the committee is the only forum in which the issue of disclosure of information will be raised, and the Parliament may wish to consider how similar situations should be handled in future. However, my main aim today is to ensure that the report that the committee will produce at the end of its inquiry is the most thorough report possible. The report must address what happened at the SQA—what went wrong and what the problems were—and advise us how that situation can be put right.

Today we have an opportunity to move in that direction. I hear what members are saying and if we find that we continue to have difficulties, we will reconsider the situation. I do not think that any one of us wants to produce a less than thorough report.

I take on board Brian Monteith's point about the conveners examining the original list that is to be provided. I could accept that suggestion as an addendum to the minister's proposals. I am not so happy with Jamie Stone's suggestion about the option of using spokespeople from each party, purely for practical reasons. In many ways, the conveners of the committees are seen to be less party political than other members. Given that two conveners will consider the information from the Executive, a balance has been struck. Sam Galbraith was able to agree to that on Thursday when the conveners met him, and I hope that members will accept that, too. Partly because of the time scale, I do not want to have to go back and try to renegotiate that. I know that Alex Neil is away on holiday for the next two weeks, so we would have to wait until he came back to discuss the matter further.

If we agree the recommendation today, we must still seek the agreement of the Enterprise and Lifelong Learning Committee. Only with that committee's agreement can we go ahead with the proposals that are on the table. However, I think that if we can agree the proposal today, we can start to make progress and ensure that the inquiry does not go on for longer than it needs to.

I move that we accept the recommendations in the report. Are we agreed?

No.

There will be a division.

For

Jamieson, Cathy (Carrick, Cumnock and Doon Valley) (Lab)
Jenkins, Ian (Tweeddale, Ettrick and Lauderdale) (LD)
Lamont, Johann (Glasgow Pollock) (Lab)
Macintosh, Mr Kenneth (Eastwood) (Lab)
Monteith, Mr Brian (Mid Scotland and Fife) (Con)
Mulligan, Mrs Mary (Linlithgow) (Lab)
Peattie, Cathy (Falkirk East) (Lab)

Against

Russell, Michael (South of Scotland) (SNP)
Stone, Mr Jamie (Caithness, Sutherland and Easter Ross) (LD)

The Convener:

The result of the division is: For 7, Against 2, Abstentions 0. We have agreed to accept the report's recommendations.

I invite our next witnesses, Douglas Osler and Philip Banks, to join us at the table. I understand, gentlemen, that the press would like to take some photographs of you before you begin your evidence.

Thank you for your forbearance, for listening to the committee and for allowing the photographers to take pictures. As you know, we have been taking evidence for some time and a number of issues have arisen, some of which we will want to ask further questions about. The first question will be from Ian Jenkins, who wants to raise the issue of the higher still development unit.

Ian Jenkins:

Can you explain the relationship between the inspectorate and the higher still development unit and tell us how that relationship was carried down to the level of subject training days?

Douglas Osler (HMI Senior Chief Inspector of Schools):

To answer that question, I introduce my colleague, Philip Banks, the chief inspector with particular responsibility for co-ordinating the inspectorate's work on higher still. He is responsible for channelling our advice to our policy colleagues in the department.

The higher still development unit was conceived in this development programme to take over much of the work that, in earlier developments and initiatives, had been undertaken by the inspectorate. The higher still development unit was, as you know, separately formed and staffed. The head of the higher still development unit works to Philip Banks in pure line management terms. He is responsible for agreeing with her the targets that she will meet and for discussing with her the delivery of those targets. The higher still development unit is managed by a body called the development unit advisory group, which is chaired by a colleague from within the department, Eleanor Emberson, whom the committee has already met.

I shall now ask Philip to say a little more about the way in which HMI works within the development unit.

Philip Banks (HMI Chief Inspector of Schools):

As Douglas Osler said, I line-manage the chief development officer. I should make it absolutely clear that the remit of the chief development officer is to manage the full force of field officers employed in the unit, as well as the very large army of development officers and curriculum writers. At busy times of the programme, the size of that army, believe it or not, can run into four figures.

The work programme that I agree with the chief development officer reflects the programme targets that are decided by the various policy monitoring groups, most notably the implementation group. In that sense, the scope of the higher still development unit's work is the property of the wide range of stakeholders who participate in the key groups that discuss the progress of the programme. That is an important point, as it clearly defines my specific responsibility in relation to the work programme of the higher still development unit. The chief development officer is responsible for ensuring that the unit delivers a set of programme targets that are in line with the agreed programme targets as set out more generally by the implementation group.

There are other levels of HMI involvement with the higher still development unit. I should mention that we have a set of key working groups containing senior members from the stakeholder agencies: the higher still development unit, the SQA, local authorities and Learning and Teaching Scotland. HMI staff have a role to play in all those groups. In some cases, they chair groups, and in others, they sit as observers.

The committee should also be aware that a great deal of the work of the higher still development unit is subject to the scrutiny of a large range of subject reference groups. That is the point on which Ian Jenkins wanted me to comment. The subject reference groups cover all the subjects that have been taken forward in the programme, with one or two exceptions, and with the general exception that late-arriving developments may well have been taken on specifically by the SQA.

It is important to note that those subject reference groups are chaired by professionals drawn from the teaching field. They are serviced by field officers from the HSDU and from the SQA; HMI operates in an advisory role, drawing on the officers' professional expertise. It is at the level of the subject reference group that real discussion takes place about how a particular subject should be developed in the early years of the programme or about how specific issues should be addressed.

Ian Jenkins:

I am not sure that too many people would be wildly excited by that description, but there you go. I am interested in the subject reference groups. When I attended in-service training in the English group, there was always an inspector as well as a member of the higher still development unit. We have heard evidence from the unions and from individual teachers that people felt that their arguments were treated dismissively at those meetings. Anxieties and worries were expressed consistently at those meetings. How were those concerns and the perceived lack of flexibility in the HSDU transmitted upwards?

Douglas Osler:

That is relevant to the comments that HMI received from a range of people in schools, in the forum that you mentioned and in the various committees that managed the implementation of higher still. In all cases, we ensured that the information that was being received was relayed to the appropriate body. After all the occasions to which you refer, we listed the issues that were raised, took action to ensure that they were passed to the appropriate body and followed up any action that was taken thereafter.

We must recognise that, in the case of English, there were deep ideological differences about what the subject comprises and how it should be taught. It was impossible to satisfy everyone. Things were slightly easier in many of the other groups. English was a special case, although an important one.

We passed all the issues that were being raised to the higher still implementation group or the liaison group, both of which I chair. As appropriate, we passed on issues to the SQA or the higher still development unit. When individual departments in schools raised issues, we would pass those on to the head teacher because the solution lay within the school. Sometimes we took issues back to the education authority or to colleagues within the Scottish Executive education department to ensure that they were followed up. Several papers were issued as a result of that follow-up work, such as further advice from the HSDU or the SQA.

Many issues were raised with us. Where they related clearly to the smooth implementation of the programme, we ensured that they were taken up by the appropriate body. When there were differences of opinion, following wide consultation—for example, over what should constitute higher English—it was rather more difficult to represent the opposing voices. All the courses were consulted on widely with the profession. A form of consensus had been reached and ministers had taken decisions, which had been passed for action to the SQA. Lingering—or more than lingering—discontent remained, but it was difficult to follow that up all the time. However, we made sure that we followed up any issues that related to smoothing the implementation or making the implications of courses clearer. We had a duty to do that.

I understand what you say about a lack of unanimity, but on occasions whole groups of professionals left the meetings feeling that their views had not been recognised.

Douglas Osler:

On every occasion, the HSDU had evaluation forms completed. The vast majority of the responses that we received were exceedingly positive. We could talk about the same occasions on which English teachers went away feeling very discontented; however, many changes took place in English and that is testimony to the fact that issues were raised and addressed. Changes were made to assessment procedures and the implementation of higher still English was delayed further.

Ian Jenkins:

On reflection, would you accept that it might have been useful to have phased in higher still—as we did with intermediate 2—given the substantial objections that were made to some elements? At the meetings, people agreed with the rationale of higher still but had real difficulties about aspects of implementation and were worried about the integrity and validity of the testing. Do you think that, with hindsight, the introduction of this subject should have been approached differently?

Douglas Osler:

Of this subject, or the whole of higher still?

I am sorry, I meant the whole higher still.

Douglas Osler:

I do not want to join the cohort of retrospective prophets—we can all be wise with hindsight. I would not have given advice that things should be done differently. As I said, higher still arose from the work of the Howie committee. I was around at the time of those discussions and I know that the education minister was not too keen to set up the committee, because he believed that it would involve extensive change to the system. However, he was urged to do so following strong representations from the teaching profession, the Scottish Examination Board and other bodies. There has been far more consultation on aspects of higher still than there was on, for example, standard grades or five to 14. Everyone in the profession and associated bodies has had many opportunities to consult. In the course of higher still, there have been more than 240 different consultations.

All the decisions that were taken on smoothing the implementation of higher still were made by bodies such as the implementation group and the liaison group, where all the main stakeholders had a chance to voice their thoughts. Ministers took decisions based on the advice that they had received. That was a fairly exhaustive attempt to meet the reasonable aspirations of the teaching profession to ensure that the courses were the right ones for young people in the 21st century.

Michael Russell:

I find that your account of the past does not equate with the one that we have been given by other people. I must say that I find your account to be astonishingly manipulative. All the evidence that we have received, from a wide variety of people in the teaching profession, tells us that there was substantial disquiet over a lengthy period and that people believe that the inspectors—for whom you are responsible—drove through the implementation process, listening only to voices that they wanted to hear and advising ministers in those terms. The submission from the Scottish Secondary Teachers Association, which my colleague Mr Jenkins quoted at our previous meeting, puts it quite nicely:

"Principles dominated practicality throughout the process".

I put it to you that you and your colleagues wanted the higher still programme to be implemented and were not prepared to listen to those people who were telling you about the substantial, practical difficulties that they were experiencing. Are you telling us that the many people from whom we have heard are giving us that evidence because there is a problem now and that they never said it to you at the time?

Douglas Osler:

No. I readily accept the suggestion that many representations were made to us about various aspects of higher still. That is what I would expect, given the fact that higher still is a large, necessarily complex and ambitious programme.

Why is it necessarily complex?

Douglas Osler:

It is a complex matter to provide important certification not only for all the upper secondary school pupils at the end of 13 or so years of education, but for a broad range of adults in further education contexts. The issue spans all the subjects from the former academic and vocational courses, at several levels, in order to ensure that all potential candidates have courses at the appropriate level to meet their needs. The programme is bound to be complex as a national strategy, although it does not appear complex to the individual within the system, who sees only the highers and intermediate courses that they need in order to progress.

Michael Russell:

But it became very complex for those within the system, particularly those who operated it, as month followed month. You have used the interesting phrase "necessarily complex". Do you accept that the system was unnecessarily complex by August this year?

Douglas Osler:

By August, the problem was not complexity. At that stage, individual teachers and parents of pupils must have been able to see their own particular routes through the system, so they did not need to be aware of the complexity of the whole national provision.

I want to return to the comment that HMI was determined to force through higher still. HMI has no reason at all to pick a programme such as higher still and decide that its role is to push it through. Successive education ministers asked the inspectorate to co-ordinate the implementation of higher still. That was carried out through the strategy group, the implementation group and eventually the liaison group, all of which included representatives from all the main stakeholders. We oversaw the implementation of higher still on behalf of ministers; HMI has no vested interest in pursuing the programme.

You must have a huge vested interest.

Douglas Osler:

This programme arose out of reports that were initiated by ministers, consulted on and then implemented.

Were you advising ministers?

Douglas Osler:

I was one of the people who—

Right. Well, hang on a minute—

Douglas Osler:

Excuse me, Mr Russell—

Let Mr Osler answer the question, Mike.

There is a closed circle here that we must get to.

Mike, I know that you have a lot of questions, but you must let Mr Osler answer.

I do not think that we will get any answers.

I am not going to stop you asking questions, Mike.

Douglas Osler:

My role of giving advice to ministers is sufficiently complex to require me to answer your point. On these matters, I and my colleagues gather a range of comment from people across the system; indeed, it would be very odd not to receive such information, given our statutory responsibilities within the system. As policy advisers within the education department, we make policy advice available to our policy colleagues, who have also appeared before the committee. Those colleagues then incorporate that advice within the advice that they give to ministers. We are only one group of people—a very influential group, I hope—who make professional information available within the department, which can then be included within advice to ministers. In that respect, we operate as advisers within the department, which is quite different from our more independent role of inspecting and reporting on schools and other institutions.

Michael Russell:

Is not this a closed circle, Mr Osler? You have essentially said that you take in a range of comments, which you then sift. You take the sifted comments to the minister, who then asks you to do something that might include taking in a range of comments. That puts you back to where you started. In such a closed circle, it is possible that you are simply hearing what you want to hear—which is a phrase that several witnesses have used—and then passing that on to the minister, who then asks you to do what you want to do. At last week's committee meeting, John Kelly said that his union

"pointed out that the inspectorate is now both the generator of policy and the policeman of policy, which cannot be right. If the inspectorate is pushing higher still—and it could be something else tomorrow—is it the best-suited body to listen to and represent the problems that might occur . . . ?"
—[Official Report, Education, Culture and Sport Committee, 4 October 2000; c 1542.]

When the history of this situation is written—part of which will no doubt be written by this committee—that closed circle and conflict of interest within the inspectorate will prove to be a major factor in what happened this year.

Douglas Osler:

I should point out that there are two separate issues to consider: first, the role of HMI and other bodies in higher still; and, secondly, what happened in the SQA during the summer. Those issues should not necessarily be linked.

I want to return to the comment about HMI being the "generator of policy". I am responsible for channelling to the department and then to ministers—or indeed, on inspection issues, directly to ministers—information turned into advice that I have obtained from my colleagues and the system. The people who decide on policy matters are ministers. Since I took up a senior position within the inspectorate, I have worked with seven education ministers, four secretaries of state and one First Minister. In that time, I have never known any member of the inspectorate to be allowed to make policy decisions. That would be entirely inappropriate, because ministers take such decisions. We are one of the groups of people who give advice. Sometimes that advice is accepted and sometimes it is rejected. Sometimes the response is varied. However, we only ever give advice.

Michael Russell:

This is my last question for the time being, because I know that my colleagues want to come in. Furthermore, we all want to move on to what happened with the SQA in August. There are links between that situation and the HMI, and Mr Banks is one of them.

However, is it not time, as many commentators have pointed out, to break the byzantine complexity of HMI's role in generating and policing policy, as Mr Kelly highlighted? Have the summer's events not shown how damaging that role has been?

Douglas Osler:

You are still asserting that we generate policy. I disagree with that statement; all the facts show that we do not do so. It would be unthinkable for the inspectorate—which has statutory responsibilities for inspecting in pre-school, school, further education and teacher education and which most recently has been charged by the Scottish Parliament to examine the educational activities of local authorities—not to have amassed a body of evidence to which ministers could turn for advice. Furthermore, it would seem odd if the largest stock of information about the system after it has been openly evaluated and reported on were not taken seriously by ministers. That is not generating policy, but giving influential and important advice on policy.

Such advice should be taken seriously, but not within the current, fatally flawed structures.

Cathy Peattie:

I want to ask about higher still development. When higher still was implemented, was any thought given to the SQA's capacity to handle the new exam in terms of information technology and so on?

Douglas Osler:

In partly answering that question, I should really clear up some of the recent misconceptions about HMI's role within the SQA. The inspectorate's remit does not run within the walls of the SQA at all. HMI does not sit on any SQA policy or operational body that has anything to do with examination arrangements or a particular diet. That was not the case with the two predecessor organisations, where we sat on the council, the board and the subject committees as observers. However, it was decided that HMI would not have such an involvement when the SQA was established.

Cathy Peattie:

I am talking about higher still development. If any agency was taking over the development of my project, I would want to ensure that it had the capacity to deliver it. Did anyone consider whether the SQA had the capacity to deliver what was expected of it?

Douglas Osler:

Much of the evidence given by my colleague John Elvidge and the audit trail of the inspectorate's knowledge of what was happening in the SQA—which I am happy to give the committee—show that there was no suggestion that the SQA did not have the data processing capacity to cope with higher still. It might be that the data processing capacity was not properly used, but I do not think that anyone has ever suggested that the technology does not exist to cope with the information that comes from schools on our proposed examination system. Some organisations deal daily with technology that supports far more coming and going for far more clients. It was never suggested that the technology was not available; as other sources have told the committee, we were constantly assured that things were on course. I am happy to take you through the audit trail that shows when we were told about problems and how we dealt with them over the previous two years or so.

Cathy Peattie:

That does not answer my question—perhaps we can come back to it. We have taken evidence from many people who said that HMI must have known that there were problems. In schools, teachers were saying that their information was dismissed. Trade union representatives and others have also said that HMI must have been aware of the impending crisis in the examinations system. However, the impression given is that you did not take on board the fact that there was an impending crisis and that you did little about it.

Douglas Osler:

I would like to take you through what we were told and what we did about it, as this is the first opportunity that I have had to do that. Like other bodies within the system, we were aware that there were difficulties that led back to the SQA. I am happy to let you know how we handled those difficulties, but I emphasise that on no occasion was there any intimation of cataclysm. There were certainly problems, but the experts were satisfied on every occasion on which they were consulted by or involved with the SQA. Although we raised a number of issues over two years or so, at no time were we aware that things would turn out as they did in August—I do not believe that anybody else was aware of what would happen.

For example, in late 1997 and early 1998, the fact that there were difficulties with the awards processing system was brought to our attention. Schools said that they were concerned that their information systems could not cope. We discussed that at a joint meeting with the higher still development unit and—because the further education sector is very involved with higher still—with the Association of Scottish Colleges. As a result of that meeting, a consultant from the microelectronic development centre talked to the SQA about the concerns that had been expressed. He reported that he was satisfied. As I said, the SQA always satisfied the experts. There was not at that time or later any suggestion that the technology did not exist to do the job. The issue is how the technology was used.

In April 1997, I was concerned about the slippage in the time scale for national assessment bank items coming to schools, because that affected the way in which schools were able to run their courses and prepare themselves for the coming year. That problem was raised on a number of occasions in meetings that I chaired. As Philip Banks knows, I offered to move the timings of meetings to ensure that the chief executive of the SQA could attend, so that all the stakeholders—the people with an interest in higher still—were round the table and had the opportunity to question him about that slippage. The national assessment steering group held a meeting to see what could be done to help the SQA to keep to its time scale. As a result, the higher still development unit plugged some of the gaps. We facilitated that in response to the concerns that had been raised with us.

On 22 November 1999, another meeting was convened, following further concerns expressed by schools and further education colleges to the inspectorate about the fact that they were contacting the SQA and not receiving satisfactory responses, or were being asked for information that they had already sent. As a result of that, one of my colleagues spoke to the co-ordinator of higher still at the SQA and was told that the awards processing system was up and running and that discs had been issued to all schools so that they could access the information. To be doubly sure, a letter was to be issued to all schools to explain what had been done and to encourage them to contact the SQA if that was not satisfactory.

In January 2000, we were told that the SQA was setting up a group to co-ordinate arrangements. That was our first indication that such a group did not already exist and that the SQA now saw the need for one. During all that time, we were feeding information from schools and from a number of education authorities into the education department, which led to the processes that John Elvidge has described. In February this year, at our request, the SQA issued advice to schools in response to the issues that were being raised with us. That is the trail of our knowledge and what we did about it during those months. The SQA always satisfied the experts. We are not experts on data processing.

Cathy Peattie:

I do not know who the experts are, but the people who have given evidence to us had concerns for more than a year and continued to make those concerns known in the spring and summer. However, you are saying that the SQA satisfied the experts. That is not an answer—

Douglas Osler:

No.

You have a list of dates and so on in front of you—that will not give the answers that we need.

Douglas Osler:

The list of dates shows that we were aware and we were acting—

It does not tell us anything; it tells us that we have been ignored.

Douglas Osler:

I think that it tells you a bit more than not anything. It tells you the action that we took about particular advice given to us at particular times. I said that earlier this year the information that we were receiving became part of the advice that went to the education department, as John Elvidge described to you. We were part of the subsequent process of engagement between the education department and the SQA in the period before summer.

Cathy Peattie:

Would you agree that you have failed in your duty to report the information to the Executive? Inspectors must have been hearing what people in schools and others were saying, yet you seem to be dismissing that. Do you think that you should have taken that further?

Douglas Osler:

I do not understand how you can say that I am dismissing it. I have explained in some detail the occasions on which we were aware of the information, where it was coming from and our timetable of communicating it to the SQA. I have explained that we involved the HSDU, that we made further information available to schools, that we made sure that our colleagues in the education department were aware of the information that we had and that we associated ourselves with them in the events that John Elvidge described to you. We were part of all that—we were very much aware of it and we were feeding information to the correct people.

With the same outcome, unfortunately.

Mr Monteith:

Evidence from teacher union representatives is that there were fundamental problems. On 4 October, David Eaglesham of the Scottish Secondary Teachers Association said:

"From our point of view, the key to this is the extent to which the advice—the virtually unanimous but separately arrived at advice—of the teaching unions, representing the whole profession, was ignored for what can only be regarded as a narrow political purpose . . . We were seen as reactionaries who were holding up the process. However, all the time we were totally right, as we were reporting back what practitioners were saying in the classrooms—that there were fundamental problems."—[Official Report, Education, Culture and Sport Committee, 4 October 2000; c 1540.]

What do you say to that comment, which is similar to comments from representatives of the other teaching unions? Do you believe that there were fundamental problems that were being put to the ministers and the various liaison groups, through the press and by the teaching unions, and that that should have further delayed the implementation of higher still?

Douglas Osler:

Are you talking about fundamental problems to do with the SQA's data management or to do with higher still? They are not the same thing.

The point was that there were fundamental problems in the schools.

Douglas Osler:

To do with higher still?

The evidence from teacher union members was that fundamental problems were being experienced in schools and that those problems were being put to ministers and the wider public. Would you agree that there were fundamental problems?

Douglas Osler:

I do not believe that there were fundamental problems in the implementation of higher still in its first year. There were fundamental problems in the management of data in the SQA.

It might be worth suggesting what answers we would have been giving to the questions about the first year of higher still if the problems within the SQA had not occurred. I think that we would have noted that 81 per cent of all the highers sat were new and that the courses had been successfully delivered in the classroom. We would have noted that standards of learning and teaching in classrooms were better for higher still courses than for previous courses, even though S5 and S6 were always the best-taught sector of schools—we have evidence to show that. We would have noted that 40,000 young people had qualifications at intermediate levels because of higher still when in previous generations they would have had no coherent record of success. That 40,000 is despite the fact that the higher still timetable did not require schools to make those courses available; the schools chose to do so because the courses were the best ones. We would have been congratulating teachers on a programme well started and we would have been congratulating 32 education authorities on successfully supporting teachers.

We have been waiting for the outcome of the review of the first year of higher still, which the Executive initiated in order to iron out the problems that were being raised. Had the situation that arose in the summer not occurred, we would not be sitting here questioning whether higher still was full of fundamental problems; we would be congratulating the profession on having delivered a very high standard in a complex programme.

Mr Monteith:

Listening to your answer, I presume that you dispute the evidence that we heard last Wednesday, when a number of representatives of the teaching unions spoke. In response to a question from Mr Stone, David Eaglesham stated:

"There was haste that eventually proved to be damaging."—[Official Report, Education, Culture and Sport Committee, 4 October 2000; c 1541.]

That was about the implementation of higher still through the schools, not about the SQA itself.

George MacBride of the Educational Institute of Scotland said that

"undue haste was an important factor."—[Official Report, Education, Culture and Sport Committee, 4 October 2000; c 1541.]

There is clearly a difference between those views. On a number of occasions, it has been put to us that teacher unions and members of other bodies conveyed to ministers a real worry about the position in schools—I reiterate that we are talking about schools. However, the report from HMI to ministers said that implementation was achievable and could be delivered, and that schools were well advanced with the work. How do you explain the conflict in perceptions about schools' positions?

Douglas Osler:

Things were being achieved in schools—many higher still courses have been delivered successfully in schools. That is a fact. We have inspected about 55 schools, on which we reported in June. Those reports show a high level of learning and teaching in the classrooms where higher still was the new course. There is no doubt that teachers have delivered exceedingly well. There is also no doubt that, in the course of doing so, they have found a number of hiccups and problems with higher still. That is what the Scottish Executive's review relates to.

I want to take members back to the SSTA's evidence. The SSTA is represented on groups, such as the higher still strategy group and the implementation and liaison groups, in which it can have its say about the extent of the problems.

The issues that were raised in those groups dealt with the extent to which local authorities made adequate resources available for higher still, bilateral teaching in subject groups and the late arrival of national assessment banks. The discussions in the groups were about learning and teaching and did not herald the events that took place last summer.

Mr Monteith mentioned haste. Following the Howie report in 1992, ministers issued "Higher Still: Opportunity for All" in 1994. That document underwrote higher still with a view to introducing the examinations between 1999 and 2004. That is hardly rushing things.

There is a more general issue in education about the length of time that it takes to deliver new developments. As I said two weeks ago, there is always tension between proper planning and bringing a new, desirable development to young people as quickly as possible.

Members should also consider that, of the 32 most commonly taken highers, only eight—with the agreement of the unions—have major new content. There was wide consultation on all the issues, and there was a measure of agreement about all the courses.

There have been delays and there has been phasing, but it all came together in December 1998, when all the main bodies—including the SSTA—signed the circular that endorsed the principles of higher still and endorsed the time scale by which they agreed to implement it. All those decisions belong to us all, not only to the inspectorate.

Mr Monteith:

I understand the difference that you have emphasised between matters in schools and matters in the SQA. Some of the evidence that has been submitted to the committee has been discussed this morning and I wish to draw your attention to comments that were made by South Lanarkshire Council. A written submission from the council stated:

"Initial difficulties experienced by centres came to light early in session 1999-2000 when SQA co-ordinators in schools drew attention to problems of registration of candidates."

After further examination, it was shown that the deadline for the registration of candidates—31 October—had to be extended. South Lanarkshire Council's submission also said that the

"SQA was not able to confirm any entries for units and courses until February 2000".

Is that the sort of information that HMI would pick up? If so, would you relay that information, possibly conveying any concern that you had, to ministers or to appropriate officials?

Douglas Osler:

Those issues were raised, as I think I have mentioned. The great majority of the issues that were raised with us were, rightly, about the operation of courses in classrooms, the availability of resources, the adequacy of materials and learning and teaching generally—far more than were ever raised about the issues that Mr Monteith mentions. Issues that schools raised with us about difficulties that they were having with the SQA in handling material were very few and far between.

Because such points were raised, in late 1997 we brought together groups that ought to be made aware of them. A consultant examined the awards processing system and came away satisfied. Thereafter, we shared with our colleagues in the Scottish Office the concerns that were being raised with us. That formed part of the Scottish Office's approach to the SQA, which has been described to members in some detail.

I would certainly expect inspectors to be aware of the issues that Mr Monteith raised, and we were. However, such issues were much smaller than the other types of issue that were being raised with us. Nothing that was raised would have led any of us to have any reason to suppose that there was about to be a disaster, such as that which materialised.

Mr Macintosh:

I want to clarify what the unions were saying to HMI. There might not be a direct link—or audit trail, as you put it—between this and the exam difficulties, but several unions made this point. The SSTA said that it

"regularly raised problems of late or very late delivery of promised materials, late changes in NAB materials"

and so on.

The SSTA also stated:

"On every occasion, whether at the Higher Still Liaison Group, or meetings with Ministers and officials, we were accused of over-reacting and misreporting."

The implication is that HMI was either downplaying or contradicting the evidence that teachers gave to ministers. Do you accept that?

Douglas Osler:

If that were just an implication, I would be inclined to deny it. I would like the people who implied that to give the chapter and verse on the occasions on which anybody accused the SSTA of "over-reacting and misreporting". Of course I do not accept that, but I accept that such points were being raised. There is no doubt that the SSTA and other groups, in their regular attendance at meetings, raised issues about the late supply of materials and about the late issue of the national assessment bank from the SQA—they were right to do so. After all, we were co-ordinating the programme and we were the right people with whom to raise such matters.

I assure the committee that on every occasion on which we raised those matters, we followed them up and took action on them. I recall no occasion on which I said to David Eaglesham or to any of his representatives—or any other union representatives—that they were overreacting.

We always took such matters seriously at meetings and I think that the minutes make that clear. They also make it clear that action was taken on all those matters. I was not happy that materials were late or that the national assessment bank's timetable was slipping. That did not help schools and was not going to help smooth implementation of the higher still programme. I was not at all relaxed about such issues—I ensured that action was taken to put things back on course.

Mr Macintosh:

I want to return to a matter—which Mike Russell raised earlier—about the complexity of higher still. You used the expression "necessarily complex", whereas I believe the National Association of Schoolmasters and Union of Women Teachers said that things were unnecessarily complex. Do you accept that the complexity of higher still was a factor in this year's difficulties?

Douglas Osler:

If I said no, I would be going back over—or contradicting—what I said about not having a role within the walls of the SQA. I know that higher still was successful up to the point at which young people completed their examination scripts. After that, something went wrong. As I do not have a role in the SQA, it would not be right for me to speculate about the nature of the complexity—that is for other people to do. In previous years, the SQA issued results satisfactorily, but the difference this year must be that the system was changed. Whether the technological hurdle was impossible to clear—which I do not believe—is for others to judge. However, things were different this year and it would be pointless to deny that.

Did HMI take the decision on the exam timetable? Was it on your advice that the exam timetable was shortened?

Douglas Osler:

We do not take decisions on such issues—they are matters for the SQA. The exam timetable was consulted on and as I understand it, by and large, those who responded to the consultation wanted the exam timetable to be left as it was. HMI was not involved. On the other hand, had nobody consulted on the exam system, we would now be asking why not. However, the exam timetable is not an issue for us. If I had taken decisions about higher still, it might look different in some respects. Decisions about higher still are taken by ministers as a result of consultation, not by my colleagues or me.

I would like to double-check that. Are you saying that the advice to shorten the exam timetable came from the SQA and not from HMI?

Douglas Osler:

Indeed. Philip Banks will confirm that; he was closer to that part of the process.

Philip Banks:

We should recognise that the SQA had a difficult task on its hands in running the exam timetable, because of the increase in the number of exams and the continued existence of previous exams. It was running a lot of exams at the same time. The SQA handled the matter by putting proposals out for public consultation. The running of the exam diet is the SQA's property and responsibility. As Douglas Osler said, if the SQA had not consulted properly on that issue, HMI would have pointed that out.

One of the pieces of evidence that has been submitted to the committee refers to the unrealistic deadlines that were set outwith the SQA. Did you impose deadlines on the SQA without consulting it?

Douglas Osler:

Members will expect me to repeat that HMI would not impose deadlines on the SQA, and that we did not. The deadlines to which the SQA had to work were the result of ministers' decisions about when courses would be required to be offered and which courses would be subject to phasing. The SQA also had to meet deadlines relating to the point at which schools return information. Those deadlines are entirely a matter for the SQA and schools. We would not want to be involved in setting them and it would not be appropriate for us to be involved. However, if schools told us that deadlines were unrealistic, I would want to pass that on to the SQA.

Mr Macintosh:

You say that HMI's remit does not extend to the SQA and that HMI was not represented officially on any body that dealt with the running of the SQA. When teachers and other people brought their worries about the implementation of higher still to you, did you report those worries directly to the SQA in any form?

Douglas Osler:

We did that in a variety of ways. If the source of concerns was an education authority, we often advised the director of education—because of the influence that he could bring to bear—to raise the concerns directly with the SQA. It was important that the SQA heard about concerns from other people and not just from us. When concerns were raised by individual schools, we collated them. For example, one of my colleagues held a meeting of the national assessment steering group that resulted in an approach to the SQA. When the concerns that were expressed were of greater magnitude, we ensured that our colleagues in the education department knew about them. We made absolutely sure that concerns were brought to the attention of the right people and that they were followed up.

Johann Lamont:

I was interested in your earlier comments about the successes of higher still. It struck me that that was similar to saying that, if we put to one side the issue of the iceberg, the Titanic provided a very nice travel experience. Is your position that higher still was doable, but the problem lay in the SQA?

Douglas Osler:

I believe that higher still is entirely deliverable. I also believe that the evidence that the committee has received so far shows that problems in the management of the information that came to the SQA are the main reason that we are sitting here today talking about higher still. I believe that, without those problems, there would have been progress towards the Scottish Executive's review of the first year of higher still, that a number of necessary refinements and adjustments would have been made—which is necessary in any exam system—and that the profession would have been congratulated on having taken us this far. Nobody has ever suggested that the problem happened before pupils reached the examination halls. It happened after the scripts left the schools.

Some of our evidence suggests something different. I will return in a moment to the question—

Douglas Osler:

What I have just stated is not simply my personal view. It is based on the evidence that comes from inspection, on the results of the consultation process and on the discussions that took place in the groups involving all the main stakeholders, all of whom subscribed to the principles of higher still and the current programme for higher still when the liaison group met in December 1998.

Johann Lamont:

I am interested in exploring how our education system came to be in hock to an organisation that was unable to deliver. Was there a stage at which something different could have been done? We have been told in earlier evidence that there was going to be a completely new IT system and that it was not possible to know whether it would work. We have also been told that, once it was decided to set up the SQA, there could be no safety net, because there is no substitute SQA and the Scottish Executive education department could not intervene.

Today, you have told us that HMI had no input into the SQA. Was a risk assessment done at an early stage—by HMI or anyone else—that would have indicated that we were in danger of handing over the integrity of the education system to a body that might not deliver, and with no capacity being available to pull things back if there were problems?

Douglas Osler:

We must remember that the SQA was formed from two organisations, one of which had very long experience and the other of which had fairly long experience of delivering examination systems. In any country that one visits, one will find that responsibility for processing examination results and qualifications is vested in an examinations body of some kind. It is difficult to envisage a different way of bringing together expertise in that area. The expertise in assessment arrangements that exists in Scotland is vested in the SQA. I have made it clear that, by choice, HMI was not part of the policy or operational bodies of the SQA. If it were thought appropriate, HMI could have a role in evaluating what happens within the examinations body, provided that it is a meaningful role. I do not think that sitting as an observer or assessor is meaningful.

Johann Lamont:

If HMI had expressed anxiety about handing over the huge responsibility for the integrity of the process to another body, and said that there was a high risk in doing that, it would not have made policy. It would certainly, however, have influenced decisions that were made. HMI could have said that this was a risk too far, that it needed to hold on to some of the responsibility and that there needed to be a safety net in case the whole system went pear-shaped.

Douglas Osler:

Johann Lamont is taking me beyond my area of responsibility. It is not for HMI to advise ministers about the nature of an assessment body. However, I know, because I was in the department at the time, that two bodies—both of which had credibility in delivering qualifications—came together for a period of time to deliver jointly, as the SQA, the kind of qualifications that they had delivered separately. The SQA was then given the responsibility of preparing for higher still. You would have to ask the SQA how it went about doing that.

As the crisis began to unfold and anxieties were expressed—

Douglas Osler:

I am sorry—I am afraid that I missed the beginning of your question.

Johann Lamont:

When you began to hear about problems that were emerging—when things began to collapse might be a better description—was there any stage at which HMI could have intervened? If HMI had picked up earlier the concerns about markers and things not being delivered on time and so on—the kind of thing to which you referred—would HMI have been able to intervene and say, "This has gone too far"?

Douglas Osler:

Given what I knew—as I have described it to you—and what my colleagues in the department knew, which John Elvidge described to you, the answer would have to be no. We did all the right things based on the knowledge that we had at the time. We have no remit within the walls of the SQA, so it was not possible for us to go knocking on its door. If I know that something is going wrong in a school or a local authority, I have statutory powers to ask questions, but I do not have those powers over the SQA. I remind members of John Elvidge's phrase. He said that we were at "one minute to midnight". If SQA officials did not know, I do not believe that HMI could have known.

I have a question for Mr Banks, who is listed as a recipient of a range of memos and letters about which we have heard in evidence. Mr Banks, do you have regular contact with the SQA and its officials?

Philip Banks:

Yes, I do.

Michael Russell:

In that case, you might have seen Mr Morton's report. In the couple of months that he has been at the SQA, he has found an organisation whose job had not been properly scoped, where planning and preparation were poor, where there was limited risk assessment, where there was no adequate contingency planning, where there was poor project management and where there was poor management information. The list goes on and on.

You had contact and are—I presume, as one of HM inspectors—a man who is naturally sceptical and looks out for people who might not be telling the truth. Given that, are you telling the committee that you had no inkling that any of the problems were going on?

Philip Banks:

Yes. The situation was clear. The SQA was under the management of a chief executive, a board of management and a set of directors. I was a professional adviser to policy colleagues as part of the normal liaison meetings that we had with the SQA. At those meetings, the reports that we received—which we had to take at the value that was accorded to them by the status of the organisation—were entirely convincing. Our concerns, which led to the visit by Paul Gray to the SQA, resulted in a top-level discussion with SQA officials. We went through matters point by point and SQA officials delivered assurances on those points.

Michael Russell:

So as far as you are concerned, the letter of 17 April that gave the SQA a clean bill of health is significant. We have discussed that letter. I will use Johann Lamont's analogy; it seems that the SQA management was putting on an Oscar-winning performance of competence and ability to deliver while standing on the deck of the Titanic as it headed for the iceberg, and that HMI was quite convinced by it.

Philip Banks:

You have discussed the matter with Ron Tuck, the chief executive.

I ask whether you were quite convinced.

Philip Banks:

I am not responsible for the management of the SQA.

Were you quite convinced?

Philip Banks:

I was convinced by the views of experts in our service who examined the situation and reported to colleagues. I am not an IT expert, nor is it my responsibility to be one. The evidence that was submitted satisfied the experts—

Those are not IT issues.

Mike, do not interrupt the witnesses.

Douglas Osler:

Mr Russell, if you expect the inspectorate to be able to obtain that kind of information, we have to be part of the board or have a role within the SQA. We did not have that.

You had daily dealings with the SQA, yet you suspected nothing. Indeed, on 17 April, HMI gave the SQA a clean bill of health. Either Mr Morton's paper is inaccurate or the organisation was in crisis and HMI gave it a clean bill of health.

Douglas Osler:

I do not think that anybody from the inspectorate gave the SQA a clean bill of health on any occasion.

Cathy Jamieson:

My question follows logically from Mike Russell's questions. Philip Banks and Douglas Osler have both said that they are not IT experts, nor would anyone necessarily expect them to perform that role. However, we have a situation in which two organisations came together, with their separate cultures. One might have anticipated that that would be a risk. There was a new exam system, in which people had varying degrees of confidence—some say that they had total confidence in the system; others say that they had some concerns—which might also be said to have been a risk. There was also the roll-out of a new software system, which has been described in evidence to the committee as one of the biggest roll-outs of a new software and IT programme that has been seen in the United Kingdom. Given that, is it not the case that HMI would have wanted some practical assurance that the system was capable and that all the elements would come together, notwithstanding the fact that the witnesses are not IT experts?

Douglas Osler:

If I had had a role to play in the SQA, I am sure that the answer would be yes. It would be quite inappropriate for the inspectorate to dig around in the SQA's IT arrangements. That is a matter for the SQA.

Yes, but would you not be digging around in schools to ask them what their experience of the IT system was?

Douglas Osler:

When schools raised problems with us, we acted on them.

Cathy Jamieson:

Would you not have asked the schools? Would that not have been pro-active, given that there was a new IT system and a new exams system, with a huge amount of data? Would you have gone into the schools and asked, "Is everything working? Are there any problems? Are there any issues?" as part of your normal work?

Douglas Osler:

Indeed—that is where the information that we have put before the committee today came from. It was from the answers to questions that were put to schools that we came to realise that a number of schools were having problems.

When did that emerge first? You said that it was at "one minute to midnight". Some of the information in the evidence suggests that it was a good 24 hours before that and that there was perhaps time to raise those issues.

Douglas Osler:

Members would have to go back to the evidence that they took from John Elvidge. At that point, we were part of the process that he described. I cannot say what I might have done in an area in which it was not appropriate for me to do something. Any advice or comment that we had was fed into the SQA, which handled it in the way that has been described to the committee.

Cathy Jamieson:

I am struggling with the fact that there were all those different strands running together, but that ultimately—in a system such as this, which must deliver for young people—no one was overseeing it. Who, in your understanding, was ultimately responsible for ensuring that all the strands were pulled together and tied in a knot that would not unravel?

Douglas Osler:

I am not sure that I am the right person to whom to put that question.

I ask only for your opinion.

Douglas Osler:

From where I sit, I would point to the senior management of the SQA and the board. I know also that the committee has had described to it the nature of the statutory relationship between the Scottish Executive and the board. Somewhere therein lie the answers. Perhaps there is a gap somewhere—I know that there has been discussion about that. Perhaps the gap could have been filled by involving people such as HMI, who are close to the system, in the management of the SQA. I do not know. However, the situation was not as Cathy Jamieson said it should be.

That leads me to my other question. With the benefit of hindsight—always an exact science—what would you recommend as the way forward to ensure that this does not happen again and that confidence in the system is restored?

Douglas Osler:

Do you mean in the SQA?

I mean in general. I am considering the bigger picture—all the strands.

Douglas Osler:

I am not sure that I am in a position to answer that question. I am not familiar with the possible governance arrangements. As I said, if it was considered appropriate for HMI to have a role of any kind, I would want that role to be meaningful. That is not impossible. I was never happy with the assessor/observer role, because it means that we are there, but that we are not influential. There has been talk of independent scrutiny and the like. That is a matter for discussion with other people in other places. I am not the expert on that.

Thank you.

Mr Stone:

I must say at the outset, Mr Osler, that I find your answers difficult. You say that HMI's role is limited. You have said repeatedly that it is up to the SQA and others to think about matters. However, you are Her Majesty's chief inspector of schools and you are responsible in the first instance—correct me if I am wrong—to the Scottish Executive, on whose behalf you advise and act. One of your roles was to ensure—or to oversee—and report back on the successful implementation of higher still. Has the failure of the SQA—no matter what we say, it is a shambles—impaired that implementation?

Douglas Osler:

Indeed.

Has that failure cast a cloud over the work of HMI?

Douglas Osler:

That is the question.

Do you agree?

Douglas Osler:

I cannot accept that because, as I said, HMI has a responsibility to co-ordinate the implementation of higher still—successive ministers have asked us to do so. We have statutory responsibilities to inspect and report on the quality of learning and teaching in classrooms. At the end of the certification process, we have a role to play in ensuring that the SQA maintains standards from year to year. However, we do not have a role to play between the point at which examination scripts leave the schools and when the results end up on certificates. We are not involved in that.

So you are saying that HMI has no responsibility in this fiasco. You are, effectively, blameless.

Douglas Osler:

No, I am not. I would not want those words to be put in my mouth.

I have explained what information we received from the system. I have explained the role that we perform and I have told you what we did with the information that we received. I do not think that any of us who have a role in Scottish education would want to walk away and say that we are blameless. Something went wrong for young people in Scotland's schools. That is a matter of concern for us all; it is a matter of concern for me.

I would not try to deny any responsibility for higher still, or for what happened during the summer. We want to ensure that the best possible arrangements are in place at all times. It is upsetting to us all when those things go wrong.

Mr Stone:

That is a fair answer.

With the benefit of the aforementioned splendidly effective hindsight, do you agree that it is—to put the best gloss on it—unfortunate that HMI did not pick up the signals that were coming from either the SQA or the chalkface and relay them to the appropriate minister?

Douglas Osler:

We could go round in circles on that, Mr Stone. I have told you that we picked up the signals that were coming from the chalkface. I have explained what we did with those signals. Unfortunately, those signals did not alert us—or anybody else, including the SQA's senior officials and board—to the fact that the event would be cataclysmic. We had concerns, which we relayed. None of us in the system can hold our hands up and say that we were the prophets who saw the way in which this would turn out.

When did you first warn ministers that a problem could be coming up the tracks? What did you say?

Douglas Osler:

You are aware that I am not at liberty to comment on how advice was given to ministers. I can say that, on such matters, we feed our professional advice to our colleagues in the department's policy divisions. Advice on such issues would be given to ministers through that route. The inspectorate would take that route. John Elvidge has replied on behalf of the policy divisions as to the nature of that advice.

Are you saying that you never spoke directly to the minister about this?

Douglas Osler:

About what?

You said that you fed your information to John Elvidge and that the department would advise the minister. Are you saying that you have never spoken to the minister about this?

Douglas Osler:

No, I am not saying that.

The Convener:

I want to follow up Jamie Stone's point. You are saying that you were aware that there were difficulties, and that you passed that on. You do not strike me as someone who would go to that trouble only to find that nothing had been done about it. What should be done to ensure that, if difficulties were to arise in future, action would be taken to ensure that we would not get into this situation again?

Douglas Osler:

You would have to predicate all that on an assumption that the SQA's arrangements would be put right to the extent that the essentially irritating things that happened to schools—that is how schools described those things to us—such as being asked twice for results, would not happen again. If—and it is for others to say whether this is the case—what happened concerned management or information technology, experts in management or information technology must police the system, not HMI. If what happened was about learning and teaching issues related to examination arrangements, it is entirely reasonable to invite us to play a part. We are not experts on the management of data in an examination body. It is about getting the right people to examine those matters. As we are currently constituted, we would not be the right people.

No, but you have your role and the SQA has its role. How can those roles be joined together to ensure that when you pass on concerns, they are acted on?

Douglas Osler:

I do not think that the concerns were not acted on. We asked for feedback every time we raised the issues that I have discussed with the committee this afternoon. We were always careful to do so, for the reasons that you have suggested, so that the issues would not disappear into a black hole. We received assurances via independent experts, at committees and from the chief executive, that various steps had been taken. There is also evidence that things were being done. Further advice was given to schools as a result of those issues being raised. The trouble is that they turned out to be the wrong issues; they were low-level issues in terms of what eventually went wrong.

As I have said, we had a different role in relation to the two previous organisations. If we were sitting here today discussing matters in relation to those two organisations, I would feel a great deal more responsible for what happened, because we were represented heavily on their committees and at the highest level. However, I am not sure that that form of representation with SCOTVEC and the SEB was all that meaningful, because it was done in the capacity of observer. If you want to scrutinise an organisation, you must have the relationship that we have when we inspect a school. You must be able to go in, with a statutory right, and insist on seeing what is happening, on seeing paperwork and on answers to questions, and to come to a professional evaluation. If that had been our role with the SQA, things might have been different. I do not know.

Ian Jenkins:

Did part of the problem arise with the uniting of the two bodies? As I have said before, I worry about the culture of SCOTVEC; I think that it was short sighted, a wee bit pedestrian and utilitarian in its tradition. That is perhaps cruel, but that is the drift.

When SCOTVEC and the SEB were united, people accepted that it was reasonable to have an umbrella organisation, but they were upset when the SCOTVEC culture dominated. Perhaps that happened because SCOTVEC was a bigger organisation. Did the SEB have a good record because its main focus was on running highers? Highers became less important in the new umbrella organisation. Is that part of the reason for some of our troubles?

Douglas Osler:

It is certainly interesting speculation, but again you are asking me to comment on a body that I was not part of. I do not know what the culture was inside the walls of the SQA.

But you know the culture of its testing.

Douglas Osler:

SCOTVEC worked for a great many people; the SEB clearly worked for a great many people. I knew the SEB much more intimately than SCOTVEC.

If the two bodies had not been merged, the delivery of higher still would have been a nightmare. It would have been very difficult to deliver higher still across two bodies, because of the level of co-ordination and contact that would have been required.

I do not know whether one culture dominated. Perhaps part of Bill Morton's management review will consider whether any culture did, or should have been allowed to, predominate. That is an aspect of the management of the organisation. I have no experience of bringing together two organisations like that. I am not sure how it would be done.

Ian Jenkins:

I am talking about the nature of the testing.

On the idea of distancing yourself from the SQA at a certain point, the handover from the higher still development unit—or whatever it was called at that time, it seemed to change its name a wee bit—to the SQA seems to have taken place quite late. When we examine the documents, we find letters that were sent to schools from the higher still development unit in January 2000, which talk about assessment and matters such as when units can be done. There are also letters from Mary Pirie to advisers in Aberdeen and so on. Those letters were written by the development unit, in which the inspectors were involved, as opposed to the SQA.

Douglas Osler:

In all aspects, there was a point at which the programme ceased to be a development and had to be handed over to the statutory body—the SQA—that had responsibility for it. The point at which it was handed over was a matter of judgment; much of the discussion in the committee focused on that. I am sure that Philip Banks will want to add to this, but often the HSDU was still producing material after the point at which we might have expected it to have stopped because it was reacting to issues that had been brought to our attention. We were using the HSDU to help the SQA to keep up to date on some of its commitments.

Philip Banks:

The document to which Ian Jenkins referred was the result of feedback from senior management team seminars in autumn 1999, at which directors from the SQA made presentations. Senior management team members at all the national seminars agreed that they would welcome advice from the HSDU produced in consultation with the SQA. That document was the easiest way to make the advice available.

But do you acknowledge that January 2000 was a bit late to be telling people when to jump which hurdles in the exams—

Philip Banks:

The problem was that in autumn 1999, schools experienced difficulty in meeting the SQA deadline to finish unit 1. As a result, the SQA introduced more flexible arrangements. In light of that, issues arose to do with reassessment. The HSDU produced that genuine advice on the programme, because it was thought that that would be helpful at that point.

Ian Jenkins:

Some of those problems had been raised at in-service days a year and a half earlier. The HSDU stayed in touch with the inspectorate, which was advising schools directly. I am not saying that it should not have done so, just that it was not a case of a cut-off, then "Cheerio, it's the SQA now."

Douglas Osler:

You could make too much of that. With any such programme, there is a ragged handover of responsibility. We wanted to ensure that somebody did what needed to be done, quickly and effectively. That was what we were aiming at, rather than constitutional niceties.

Mr Monteith:

On the future of higher still, it has been mentioned in evidence to us that there is some confusion among parents and pupils, and possibly employers, about the certificate structure, which may have contributed to the data processing problems. Although the committee has heard about that before, I like to get the opinions of people who are giving evidence.

Five to 14 uses A to F, where A is the lowest entry level and F is at the other end. Standard grade has seven levels, of which level 1 is the highest. In higher still arrangements, access 1 is the lowest entry level, and A to C are passes—C is the lowest pass—and so on. There are different approaches in the different examinations. Do you consider that to be confusing? Would that be an issue on which HMI might recommend some change?

Douglas Osler:

You seem to have concluded that it is confusing. When five to 14 was introduced, there were many debates about the fact that it seemed to be standing on its head in its relationship with standard grade. That issue would be well worth considering and trying to resolve in some way.

I remember chairing a group—you will expect me to say that it involved all the main stakeholders—at which the chief executive of the SQA offered us a sample of a certificate. We all agreed that it looked fine. In retrospect, I agree that perhaps it is complex. That is another issue that might emerge in the review of the first year of higher still, and something can fairly easily be done about it. Part of the problem is unfamiliarity. When standard grade was introduced, people said that nobody would ever understand the credit, general, foundation and national certificate SCOTVEC modules, and the short courses, all on one certificate. We would have to be sure that the problem is not just lack of familiarity.

Those are the kind of issues we should consider for the future. The status of group awards and so on are issues that will no doubt keep running and can be addressed. Had we not had that cataclysm in August, those issues would have arisen from what I suggest to you was a successful first year's experience of higher still in schools. That would be a more constructive way to view such matters.

I thank Mr Osler and Mr Banks for answering our questions this afternoon.

Meeting adjourned.

On resuming—

Good afternoon, Mr Morton. I ask you to introduce your team.

Bill Morton (Scottish Qualifications Authority):

Good afternoon. On my right is Dr Dennis Gunning, the SQA's director of development, and on my left is Jean Blair, a member of our staff who has helped me project manage my internal operational review.

Do you want to say a few words about the situation at the moment or would you prefer to take questions?

Bill Morton:

I am quite happy to take questions, convener.

Thank you. I believe that Michael Russell has a question on the missing data and software issues.

Michael Russell:

We have received evidence from Ron Tuck and others about the search for what they call the golden bullet. The head teacher of Strathaven talked about the 60 occasions on which they were asked for missing information. What is the situation today? Do you understand what went wrong with the computer system, the software and the data management? What actions are being taken to correct what went wrong? This is the difficult question: can you give us a categorical assurance that that is now behind the organisation and that it will not affect a future diet?

Bill Morton:

I have made it clear that the data management problem is essentially a behavioural one. The SQA did not get that right. It is not the same as there being problems with the technology or the hardware.

With regard to giving a guarantee, it is important to recognise that the SQA owes a number of people a sincere apology, which I am happy to give. I apologise to the candidates and their families who suffered anxiety and to the centres and colleges that we work with. I will do my best to ensure that all aspects of improvement and change that are required in the SQA to prevent a recurrence of the experience this summer are addressed as diligently as possible. It would be foolish at this point to give an absolute guarantee.

Michael Russell:

Mr Tuck listed five possible causes for the problems with data: the failure of electronic transfer of data between centres and the SQA; the existence of duplicate entries creating a false impression of missing results; the submission of data out of sequence; paper reports being filed without being processed; and errors by data punch bureaux going undetected by the SQA's staff. He said that, although the possible causes had all been investigated, none of them provided the golden bullet.

Are any or all of those the causes of the problems? Are there additional causes? What have you done to make sure that the problems cannot recur?

Bill Morton:

There is evidence to suggest that each of those took place and contributed to the problem. From my review, it is clear that poor data management or information handling is at the crux of the problem. However, as you know from the evidence that I have submitted, there is a series of other contributory factors, such as not adequately scoping the task that the SQA took on at its commission, inadequate planning and preparation, and poor risk assessment. In some areas, such as examination paper preparation, assessment moderation and some of the areas of work of our development division, contingency plans were triggered effectively but, essentially, I have discovered that a number of issues needed to be fixed or that the behaviours were such that perhaps some more fundamental change was required.

We are planning carefully to ensure that we have in place a process of improvements to ensure a safe delivery of certification in 2001. That will entail some fairly radical propositions in terms of management of change, which will take a long period of time.

Michael Russell:

Does the process include an examination of the qualifications and experience of those who head up information technology in the SQA? The committee has discussed the fact that IT did not seem to be given its proper place in the management structure and that there were people in charge of IT who had limited experience of writing programs. Will that be changed as part of the solution?

Bill Morton:

We will examine that, but I have not encountered any direct evidence that would suggest that the in-house capability was failing in the sense that you imply. Clearly there were instances of poor project management in the sense that user specifications for the software were being thought up at the same time as the software was being created. There is clear evidence that the software was not adequately tested. We need to think about the software as part of the planning for next year. If we believe that we need a capacity and capability that we do not have, we will procure it.

Michael Russell:

Since you were not there at the time, Mr Morton, perhaps Dr Gunning is better placed to answer my next question. I understand from previous evidence that when the final tape went to print the certificates around 1 August, a check would automatically be run to identify missing data. However, it took three to six weeks for you to confirm the total amount of missing data. At that stage, the SQA was still saying that 1,500 candidates were affected, but the number turned out to be much larger. Why were the missing data not picked up at that time?

Bill Morton:

I am not trying to absolve myself of any responsibility, because the responsibility is mine from here on in, but, as you say, I was not a member of the SQA at that time, so I will pass this question to Dennis, who will be able to fill in some of the detail.

When I joined the SQA on 15 August, there was a genuine belief that the error had affected around 1,000 candidates. Between 10 August and 17 August, when the revalidation exercise ended, the numbers involved were revealed to be higher than that. At that point, we were involved in a process of ensuring that the situation affecting candidates who had received incomplete or inaccurate results—data that should have been included properly when the results were processed—was resolved quickly.

Dennis Gunning (Scottish Qualifications Authority):

The program that you are referring to is the one that converts the information that we have in the system into the certificate that is issued. It takes the data that are in the system for individual candidates and transforms them into a certificate. It is not specifically looking for gaps; it is taking the marks and the unit results that exist and reporting on them. A separate process would be required to identify gaps.

I understood that that process checked for gaps.

Dennis Gunning:

It did. It was being run regularly until 1 August.

However, when it was running, the number of gaps that it identified was far lower than the number that actually existed. Why was that?

Dennis Gunning:

I do not know.
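Dr Gunning's distinction above, between a certificate run that simply formats whatever results are held and a separate check for expected results that are missing, can be illustrated with a minimal and purely hypothetical sketch. Nothing in it reflects the SQA's actual software; the names, data structures and figures are invented for illustration only.

```python
# Purely illustrative sketch; none of this reflects the SQA's real systems.
# The candidates, subjects and grades below are invented.

entries = {                      # what centres said each candidate was entered for
    "candidate_001": ["Higher Physics", "Higher Maths"],
    "candidate_002": ["Higher English"],
}

results = {                      # what the awards system actually holds
    "candidate_001": {"Higher Maths": "A"},          # Physics result missing
    "candidate_002": {"Higher English": "B"},
}

def print_certificate(candidate):
    """Certificate run: format whatever results exist for the candidate.
    It never asks whether anything is missing."""
    for subject, grade in results.get(candidate, {}).items():
        print(f"{candidate}: {subject} -- {grade}")

def find_gaps():
    """Separate gap check: compare entries against held results and
    report every entered subject with no result recorded."""
    return [
        (candidate, subject)
        for candidate, subjects in entries.items()
        for subject in subjects
        if subject not in results.get(candidate, {})
    ]

for candidate in entries:
    print_certificate(candidate)

print("Missing results:", find_gaps())
# Missing results: [('candidate_001', 'Higher Physics')]
```

The point of the sketch is only that a certificate run of the first kind can complete without error even when an entry has no corresponding result, so missing data come to light only if a separate gap check is run and its output is acted on.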

Michael Russell:

Mr Morton, in addition to issues of data handling and so on, your report has a catalogue of damning items about what you have found in the organisation. I quoted some of them to Mr Banks and Mr Osler. I do not think that a reasonable observer would say that the problems you have listed could have happened overnight.

Problems such as cumbersome management systems, inconsistent practices, confusion about the different functions of the business units, lack of accountability, poor communication, concern about bullying and performance management not operating correctly are not likely to have happened in August. They would have existed in 1999 and perhaps in 1998 and 1997. How long do you think they had existed and why did they not affect previous diets as badly as they affected this one? Given that—as you say—they were so severe, why on earth did nobody notice? With some of them, it seems that someone would have had only to walk into the place to see them.

Bill Morton:

Any answer to that would be speculation on my part. I reiterate what has been said many times in this committee and in others: retrospective wisdom is a luxury not afforded to the people on the ground. I suggest that the accumulated effect of those factors became most apparent when the organisation, the structure, the processes and the behaviours within the SQA were under stress. That was the case with the greater volume of work that was associated with the diet in 2000. Having said that, I want to make it clear that, had the task been scoped as adequately as it should have been, that would not have been an issue.

Like other committee members, I probably have 100 questions to ask, but I will try to restrict the number.

Thank you.

Cathy Peattie:

I would like to pick up on Mike Russell's last point. You have talked about behaviours and organisational structures. Reading through some of the evidence that we have taken, I can agree with some of the things that you are saying. Ron Tuck indicated that the SQA nearly made it—it was only one wee thing that went wrong, and it was just one guy's fault. Are you saying that that is not the case?

Bill Morton:

You would have to ask Ron about his precise meaning. Serious issues have to be addressed. Things that are cracked and broken have to be fixed or replaced but, in my review, I have not yet come across anything that was not capable of being remedied. I would also make the point that not everything is affected by the problems that were experienced this year. Many of the SQA's activities are still running normally, which is important for the customers and stakeholders involved.

In this particular instance, a combination of effects centred on the data management problem, which I feel is at the heart of all this. That gave rise to 2.7 per cent of the certification in the diet being affected by incomplete and inaccurate data. Unfortunately—I have apologised for this before and am happy to do so again on behalf of the SQA—that affected some 16,700 candidates this year.

Cathy Peattie:

You will be aware that we took evidence from young people this morning. They had some interesting points to make. One of the young women is still waiting for information because her papers got lost. Have you found the missing data? Do you know what went wrong? What is the situation for any young person who is still waiting for information about his or her exams?

Bill Morton:

When you consider this from the SQA's perspective, all the data that are required have been obtained and the cases have been resolved. However, I would not say that that was the end of the story. It is only when the schools and colleges—and especially the candidates themselves—feel that the results are robust and that they can be confident in them that we can consider the case resolved.

Some candidates may be awaiting the outcome of an appeal. For the vast majority of them—for highers and sixth year studies—the deadline will be 31 October. Some candidates might not regard their results as fully confirmed until they receive a replacement certificate—and I can understand that. In the meantime, they will have advice through their centre and a letter from the SQA with their corrected results. It is important to move as quickly as we can to provide a replacement certificate that is complete and accurate. Only when 100 per cent of the 16,700 candidates who have been affected feel that their individual concerns have been resolved will I regard the issue as having been resolved.

Cathy Peattie:

Evidence that we have heard has raised a catalogue of issues about markers. We have heard about the late appointment of markers, about people not attending marking meetings and about people receiving unsolicited papers. What can we do about this year? The 2001 diets have already begun and markers will need to be in place, but many teachers are saying that they will never mark again, that the situation has been handled badly and that they do not want to be involved. How can you overcome that?

Bill Morton:

Many of the anecdotal concerns about markers have passed into the mists of urban myth. I will openly and candidly concede that the administration of the marking system this year was extremely poor. It was handicapped right from the outset when—and this was the SQA's problem and nothing to do with the information that was transferred from the schools and colleges—the SQA was unable to verify that the base data for registrations and entries were complete and accurate. Unfortunately, that had a knock-on effect on assessment moderation and on recruiting the right number of markers.

It would be disingenuous of me to say that we have completed our examination of the marking question because, through the appeals process, we are taking the opportunity to verify the standards. I know that your inquiry, and the Deloitte & Touche study, will consider it in great detail. The administration was clearly poor and we need to consider that for next year. I sincerely hope that the markers who have been inconvenienced this year will not let that stand in the way of their assisting with future marking. What matters most is to put in place a service for the young people undergoing education in Scotland.

Have you put systems in place for finding markers for next year?

Bill Morton:

We are considering the planning of all aspects of certification in 2001. As you will appreciate, that is an urgent task. It has to be handled pragmatically and diligently. At the moment, we are trying to ensure that the registration and entry process for schools and colleges is simplified—and simplified in a way that gives the schools and colleges that originate the data the right to verify those data as complete and accurate, and the responsibility for doing so. The concern has been expressed that that has given rise to a small measure of slippage. That is true, but we will go out to the schools and colleges and carefully explain what the improvements are. People may say that time is being lost at the moment, but I would prefer to say that it is being invested to allow us to get things right. I hope that people will agree that the simplification involved in those improvements will allow us the time to catch up.

You have conducted an initial review of the marking procedure. Will you give the committee more detail of what the review revealed? What tasks did you undertake in the review?

Bill Morton:

We considered the administration and wondered what lessons could be learned. As I have already indicated, a significant volume of improvement and change is required and that has been built into the planning process. We considered whether the quality assurance checks that normally run with the marking process did in fact run. By and large, they did.

It should also be borne in mind that the people who undertake the marking are professionals—they are the selfsame people who set the courses, moderate the work in the centres and invest in the teaching and learning of the candidates. In summary, we would have to consider whether the administration and quality assurance were in place. By and large, the latter seems to be.

Mr Macintosh:

The young people whom we heard from this morning outlined the problems of data management. The continuing problem is that people are still unsure about the quality assurance of this year's exam results. We are looking for hard facts about the standard of this year's marking, to ensure that everybody's results are valid.

One of the parents from whom we heard this morning raised the concern that unit assessments are not standardised for all schools; the units may be easier to pass in some schools than in others. That suggestion was refuted by one of the teachers from whom we heard subsequently, but he said that he had phoned the SQA helpline and had not received the reassurances that he would have liked. What is the true state of affairs?

Bill Morton:

That is a question for my colleague, Dr Gunning, as I am not an educationist. No evidence has been brought to my attention of any inconsistency in the internal assessment of units.

Dennis Gunning:

Nor has that been brought to my attention. Part of the moderation process is to ensure that there are no inconsistencies. The combination of the moderation process and, for most centres, the use of nationally standardised tests—and now there is the national assessment bank—is designed to remove any inconsistency.

I would be surprised if a telephone call to the helpline raised that kind of issue, as it is a technical and professional issue. The helpline is there to give general, not technical, advice.

Perhaps you can clarify the moderation process, as that seems to be the key to this. Are you saying that some or all of the unit assessments were moderated?

Dennis Gunning:

Moderation is always conducted on a sample basis, but the units are moderated.

However, the marking of the final exams was not moderated.

Dennis Gunning:

The marking of the final exams is moderated in a completely different way. All the marking that is carried out by individual markers is quality assured by a member of the examining team, and any differences in standards are ironed out during that process. It is a very careful process indeed.

Mr Macintosh:

In a standard letter that is dated July 2000, David Elliot says:

"A variety of difficulties faced this year has meant that the SQA has been unable to undertake the moderation of the materials submitted by your centre for the above subject."

Can you explain that?

Dennis Gunning:

I am puzzled by that statement. That is an area for which David Elliot was responsible. We undertook less moderation this year than we wanted to, but I would need to know the subject to which the letter refers to comment further.

This morning, a head teacher told us that the sample that they had expected to send in was returned to the centre.

Bill Morton:

If it would help the committee, I would be happy to address individual cases and return with supplementary information and evidence as required.

We are discovering that there are a number of systems for maintaining standards and ensuring that this year's candidates are treated fairly, one of which is moderation. Are you saying that all the unit assessments were moderated?

Bill Morton:

A sample was moderated.

On a sample basis, every unit assessment was moderated. However, only some of the final exams were moderated. Is that correct? Could we discover that some schools were not moderated? I am not sure how that system of moderation works.

Dennis Gunning:

Let us start again. Individual unit assessments are moderated by moderators—that is part of the process of quality assuring internally assessed units. That moderation is conducted on a sample basis, whereby some centres and some units are sampled. Last year, the units that were completed by centres early in the session were unlikely to have been moderated because at that stage we did not have entry data to indicate who was being entered for which unit.

So how could you ensure that the same exam was not easier in some schools than in others?

Dennis Gunning:

The assessment material is nationally standardised: national assessment bank items are used, to which a marking scheme is attached, and a pass mark is agreed nationally for examinations. Therefore, as well as the moderation process, the design of the internal assessment allows standardisation.

There are two components of the external assessment. Some external assessments include marks for material that is assessed internally, which is then moderated by the SQA. The exams are marked by our markers, not by the internal assessment people, and are standardised completely separately. That process is run by the examining team—the principal assessor, the principal examiner and so on.

Mr Macintosh:

I am not sure that that process is entirely clear, but let us discuss the exams themselves. I understand that you are able to examine all the results and check whether this year's results in a subject are comparable with those of previous years. Do you undertake that comparison?

Bill Morton:

Yes. In analysing the trend, one would have to take account of the fact that new subjects and exams were introduced this year. However, I understand that those statistics are produced.

Mr Macintosh:

One of the things that has worried many of us is that if the volume of appeals is up—say, instead of 10 per cent of the pupil population appealing, nearer 50 per cent of them are appealing; I am not sure of the actual figure—and the same proportion of appeals are successful, that does not reflect well on the exams. The number of inaccurately marked scripts ought to be an absolute number; it should not rise simply because more candidates appeal. If half the appeals were successful—half of 10,000 appeals would mean 5,000 appeals being granted, whereas half of 50,000 appeals would mean a far larger number of badly marked exams—how could you assure us that the quality of marking was maintained throughout?
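To make the arithmetic in that question concrete, the short sketch below applies the same assumed success rate to the two hypothetical appeal volumes that Mr Macintosh uses. The 50 per cent rate and the volumes of 10,000 and 50,000 come from the question itself and are not actual SQA statistics.

```python
# Illustrative only: the 50 per cent success rate and the appeal volumes of
# 10,000 and 50,000 are the hypothetical figures used in the question above,
# not actual SQA statistics.

success_rate = 0.5

for appeals in (10_000, 50_000):
    upheld = int(appeals * success_rate)
    print(f"{appeals:,} appeals at {success_rate:.0%} success -> {upheld:,} results changed")

# 10,000 appeals at 50% success -> 5,000 results changed
# 50,000 appeals at 50% success -> 25,000 results changed
```

On the same assumed success rate, the larger appeal volume implies a fivefold rise in the absolute number of mis-marked scripts, which is the distinction being drawn between a proportion and an absolute number.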

Bill Morton:

There is no standardised trend that can be used to identify a pattern of behaviour in appeals. The higher volume of appeals this year was expected and has nothing to do with the complexity of the higher still examinations. The data management problem that has given rise to inaccurate and incomplete results has caused concern about the credibility of many of the exams and has inspired that greater volume of appeals.

It could be argued, constructively perhaps, that the combination of external and internal assessments has provided more evidence in individual cases to judge whether a candidate has attained the standard that may be reflected in a successful appeal. It is too early to make definitive judgments on any of this, but we will amass a volume of viable research through the appeals process this year.

I agree that it is too early to judge, but I hope that the SQA will consider this issue in detail.

Bill Morton:

Yes, indeed.

Mr Macintosh:

I agree that one reason for the number of appeals might be the alarm that has been caused by a lack of faith in the SQA. However, if the marking has been consistent, the number of appeals should not be any greater—at least, not massively greater—than in any other year. If there were a huge increase in the number of appeals granted, that would perhaps tell us something about the lack of moderation of the marking.

Bill Morton:

That may be true if you ignore the fact that the course marking is a combination of internal and external examination or assessment. The appeals process deals with both aspects. If there were enough evidence, based on internal assessment, to grant an appeal, an appeal would be granted. If there were any doubt at the end of that stage, both internal and external assessments would be considered and a judgment would be made on the outcome of the appeal.

Mr Macintosh:

I gather that you might not be able to answer all these questions at this stage. The point was made that the final exam is much shorter for most subjects than it has been in the past, and that that might have created anomalies—there might have been greater statistical variation because students did not have enough time to demonstrate their abilities. You might not be able to say whether that factor had an effect this year, but I hope that we will be able to answer that in the fulness of time.

Bill Morton:

We will consider all the lessons that are learned and all the intelligence that is gathered. We will make that information public, as there is a wide constituency that can use it to make improvements in the next year.

As I understand it, the structure of internal and external assessment is intended to capture the product of teaching and learning in terms of a series of outcomes. The externally assessed exam is built into the structure of assessment with that intention in mind.

You said that many stories that have circulated had the status of urban myth. If I give you a couple of examples, you might be able to tell me whether they are true. Were many inexperienced markers used this year?

Bill Morton:

Eight markers out of 7,000 were inexperienced. If my memory serves me correctly, I think that the vast majority of those inexperienced markers attained the classification A in the assessment of the quality of their marking: six were awarded an A, and the other two were given a B.

I do not have my notes of all the myths that have circulated, but perhaps my colleagues can help me.

I might be able to help on that point. Are those eight markers the probationer teachers to whom the minister referred in his statement on 6 September?

Bill Morton:

Yes, the inexperienced teachers to whom I referred were probationers.

Mr Monteith:

In asking about inexperienced markers, Ken Macintosh may have had in mind not probationers but teachers or lecturers who might be qualified but did not have much experience of marking. In that sense, can you address the question whether the use of inexperienced markers was a myth?

Bill Morton:

We will be happy to publish information when all the reviews have been completed and we have all the facts. There were new subjects in the exam diet this year, so one would expect that new teachers, who had experience of those subjects, would be required. It should be borne in mind that marking is under the supervision of principal assessors, who are experienced teachers in their subjects.

Another story that I heard was that markers did not attend markers' meetings. Did that happen?

Bill Morton:

I do not think that there is evidence to suggest that that was a major concern. There is a misconception that markers' meetings took place over, say, a couple of hours. The standards and processes for marking were agreed at those meetings, and where markers were unable to attend them, by and large the principal assessors briefed markers and marker teams in parallel.

Is it compulsory for markers to attend markers' meetings? How many people did not attend them?

Bill Morton:

I could not answer off the cuff, as I do not yet have the analysis.

Are we talking about tens or hundreds?

Bill Morton:

I do not want to speculate until I know the outcome of the review.

Will that be clarified in your internal review?

Bill Morton:

It will not be clarified as such in great detail. However, we are conducting a review of marking in the appeals process and the Deloitte & Touche exercise is examining the matter in detail. That information will not only be made available to the committee but will be published at the end of this month.

Cathy Jamieson:

From the evidence that we took from the teaching unions, it is clear that many people went out of their way and took on extra work to get marking done. However, a representative of one of the teaching unions said:

"On a minor practicality, it is obvious that people will not be attracted to giving up two and half weeks of their time . . . for less than £8 per hour. That will be a big problem this year."—[Official Report, Education, Culture and Sport Committee, 4 October 2000; c 1554.]

To what extent do you think that attracting markers will be a big problem, and what solutions are proposed?

Bill Morton:

Not being an educationist, I am mildly surprised that there is a reliance on the voluntary contribution of the time and expertise of the teaching profession in the education system, which is a key aspect of Scottish community life. There is a perennial dilemma because, if the SQA were to pay the markers better, there would be a knock-on effect on entry charges. Whether that is desirable or practical would have to be the subject of further consultation. It is clear that we will have to produce a more attractive proposition to overcome some of the natural reservations that the profession has about participating in marking in future. We are considering that matter and will present proposals to address it constructively and quickly.

I wish to change tack slightly. Is it deliberate that, under "Structure", in your submission, you do not mention the board?

Bill Morton:

My submission mentions the board.

I am sorry. Where does it mention the board?

Bill Morton:

I talk about the board in relation to assurances that were given. On the last page of the submission, under "Behaviours", it says:

"The SQA Board sought assurances, and in large part these appear to have been given."

Mr Stone:

I stand corrected. On the issue of structure, have you given thought to the future direction and manner of conduct of the board? Some members have perceived a blurring and lack of clarity between the board's strategic role and its overseeing of the chief executive's reporting function. Equally, have you thought about the role of members of the board? Although members are on the board in their own right, some are also members of other bodies, such as the Scottish School Board Association. I realise that you may say that decisions on those issues lie with the Scottish Executive, but I should be interested in the advice that you would give on them to the Scottish Executive.

Bill Morton:

The decisions lie elsewhere, but I think that the board, as it is currently constituted, represents very well the various stakeholders in education. I have met the board on only two occasions. I am due to meet it in just under a week to present the findings of the operational review.

Any non-executive board has two roles in effect. First, it has to contribute to the leadership and strategic direction of the organisation. The SQA board is very capable of doing that. The board is equally capable in its second role, which is corporate governance. If I were to make one suggestion for improvement, it would be that a better balance should be struck between those two roles. I suspect that due diligence was applied in the corporate governance role to ensure that propriety and probity and other standards of good practice were in place, but that assumptions were made about the interpretation of strategic direction. The board could become more active in that field, and I for one would certainly welcome that.

Mr Stone:

My second-last question is a quick one. Douglas Osler referred repeatedly to the SQA and what HMI might or might not have said to the SQA at various times, about how the SQA's problems could be dealt with. Do you think that the relationship between HMI and the SQA should be strengthened, changed or radically changed in some way? We have heard Mr Osler's side of the story. I know that you have only recently started the job, but your impressions of the other side of the story might be useful.

Bill Morton:

It is difficult to comment on that but, like any good chief executive, I will not let that deter me.

I am glad to hear it.

Bill Morton:

What I have ascertained as the problems that beset the SQA, which might constrain its positive progress in future, are essentially internal organisational and management issues. They concern the structure, the process and the behaviours of the SQA. If I remember correctly what Douglas Osler said, he focused on whether there was any contribution to be made in the realms of teaching and learning. HMI may well have a role to play in those areas.

However, at this point in time, the emphasis is on organisational development and change to make the necessary improvements. That is a leadership and management issue. My responsibility is to ensure that the changes and improvements that are necessary are implemented effectively and quickly.

HMI was responsible for overseeing and ensuring the successful implementation of higher still. Given those responsibilities, do you think that the communication channel with the SQA was as strong as it should have been?

Bill Morton:

I do not have the authoritative knowledge to comment on the past. The task of delivering higher still was clearly the SQA's responsibility, and the organisation should have scoped and prepared for it better. I agree that it was a feasible proposition and a commission that could and should have been delivered. The fact that it was not delivered to the standards that the SQA had achieved in the past is a matter of significant regret.

Cathy Jamieson:

Your written evidence was helpful in identifying a number of issues. One of the problems seems to have involved training and development opportunities either not being available or not being taken up. From other evidence, I have formed a picture of lots of staff working very long hours and really trying their best to deliver, but without there being an absolute focus. I am concerned about the resource implications of trying to get the correct training in place and dealing with the cultures of the different organisations at the same time as managing a process that is already beginning to slip behind schedule for this year because of the time involved in all these inquiries. What will that mean in practice, and what will be the knock-on financial effects of doing that job in the proper time scale?

Bill Morton:

Big issues are involved. I have been genuinely impressed by the dedication, commitment and professionalism of the staff of the SQA. They have been badly shaken by the experience that the organisation has corporately visited on the candidates and centres this year. There were instances in which training and development opportunities were available but were not taken up. We will have to look more assertively at the training that is available, so that we can get some of the key capacities and capabilities in place where required and make them operate effectively.

That will be a big challenge, as the people upon whom we rely to prepare for diet 2001 are the self-same people who have been involved in clearing up the residual problems of diet 2000. The same people are managing a higher-than-ever volume of appeals at the moment. The workload that is placed on a small number of people in the organisation is disproportionate, and that is something that we must examine seriously. We are considering new ways of organising the SQA to align it better with what we are here to do, as well as ensuring that the right staff are in the right place at the right time. Staff must feel properly supported and must have a clear understanding of the direction in which the organisation needs to go. They must feel that management is around them to provide on-the-ground support that enables them to do their job. Those are all important issues that we are addressing right now.

You also identified a lack of contingency planning. To what extent will it be possible to have a contingency plan in place for 2001, in case anything goes wrong this time?

Bill Morton:

The contingency plan stems from having a good identification and assessment of risk. The risk assessment seems to have focused on the wrong area in the recent past. The view was that the clear risks lay in software development or in the processing of results. I do not think that anyone expected the risk to materialise in an area where past practice had suggested that an excellent outcome could be expected. An element of blind faith was perhaps involved in managing that.

We will not have a perfect resolution of all those issues, but we will have a pragmatic and practical approach that, first and foremost, will be based on the SQA becoming a much better listener than it appears to have been recently. We have not been as attentive to the concerns expressed by centres, teachers, candidates and other stakeholders as I would like us to be in future. I will do my best, as will my colleagues throughout the organisation, to accommodate a better understanding of risk, a simplification of the processes that should allow us to avoid risk, and some pragmatic contingencies that can be called in.

One of the problems that I highlighted was that there was a compounding effect. As the problems of one stage rolled over into the next, the problems simply got worse, and solutions that were well intentioned did not have the effect that they should have had. We will try to ensure that there is a proper sequence to all of that this year, but we do not have a lot of time and we are working hard to ensure that we rectify problems and prevent what we were not able to cure.

I am going to take a risk now and say that Brian Monteith's question could be the last one of the afternoon.

Mr Monteith:

That was a risk on a grand scale, convener. First, Mr Morton, I would like to clarify a couple of points from your written evidence. Under section 2, headed "Since 10 August", the first list of bullet points, "Data checks and clear-up", includes the statement:

"Initial verification of the database was completed by 17 August."

The final bullet point in that section states:

"It was confirmed that, overall, 2.7% of results and 16,748 candidates were affected by missing or incomplete data."

I take it that those two bullet points go together and that it was by 17 August that those figures were confirmed. Is that the case?

Bill Morton:

No. By 17 August, we had looked specifically at the areas of greatest priority: the candidates whose results were incomplete or inaccurate and whose entry to college or university was at stake. That was the initial focus. Gradually, as we resolved those issues, we were able to examine each successive component of certification this year. At the end of that process, we were in a position to conclude and to make it public that of the courses taken by 147,000 candidates this year, 2.7 per cent were impeded by the problems that we visited upon centres and candidates. In fact, the number of candidates who were affected was 16,748.

Was that concluded by 29 September, the point by which you had achieved clarification of the standard grades? I am trying to put a date on when that was concluded.

Bill Morton:

Those data were obtained for the production of the submission; that was an up-to-date position as of last Thursday. I would regard that as concluded.

Mr Monteith:

Fine. Your submission states:

"All other Highers and CSYS results were confirmed by 22 September"

and that

"Standard Grade results were clarified by 29 September".

In his statement on 6 September, the minister said:

"The SQA has completed its checks and has confirmed final grades for all of this year's higher and certificate of sixth year studies candidates. It has also confirmed final grades in all but 85 standard grade cases and has promised to complete the last of those by Friday at the latest."—[Official Report, 6 September 2000; Vol 8, c 21.]

That is Friday 8 September. Can you explain why there is a difference between the statement made by the minister on 6 September and the statement in your submission that the results were not confirmed until 22 or 29 September?

Bill Morton:

It was a moveable feast—it was an iterative process, as would be any clear-up activity. The information that was given to the minister was correct at the time. There might have been other clarification issues that we considered important for the purposes of the SQA.

Mr Monteith:

Your submission says:

"The delivery of certification in 2000 as a whole had not been properly scoped."

Who do you believe was responsible for that task? Who had departmental responsibility? Do you think that the SQA experienced a problem in moving from four directors to three, given that a director who left was not replaced and departments were brought together?

Bill Morton:

I shall answer your questions in reverse. I had no basis on which to judge whether the number of directors was relevant to the situation. Personally, I believe that it is the role and responsibility of the chief executive and the senior management team to scope the exercise that the organisation takes on. However, I say that with the benefit of hindsight. To a degree, the failing was a corporate one.

We will take a different approach in future—the whole organisation will have a corporate understanding, based on sound communication and simplified and streamlined management. That is our task. Some aspects of the task were not thought through in terms of the logistics of guaranteeing successful delivery. That is what I mean when I say that it was not scoped properly. The dimensions of the task were unclear, so communication within the organisation was unclear. One could look back and say that, at the time, that judgment was based on a firm belief that the track record of the SQA suggested that the bigger task could be delivered. It is not for me to criticise that directly.

Your submission says that the

"SQA was unable to confirm that Entry and Registration data was complete and accurate from the start."

Are you confident that the SQA will be able to undertake that task this time round?

Bill Morton:

We are planning to ensure that we have the data in a complete and accurate condition. It is a balance of risk. Do we spend two weeks simplifying the process in order to guarantee that it works or do we become concerned about the slippage and the bigger risk that we might find ourselves in a similar dilemma next year? At the moment, we are simplifying the data-capturing exercise—registrations and entries—on the basis of listening to what the centres have told us about what they want. They want us to give centres the right and responsibility to originate and verify the data. That facility was previously available to colleges and was withdrawn last year. We want to reach a position where the centres are comfortable that the data that we hold are complete and accurate. When we reach that position, all the processes that flow from that will be less prone to the risks that arose this year.

There is clear evidence that where the SQA staff were aware that the data were not complete and accurate, they tried to fix the problem. In many cases, the schools and colleges were not made aware that that was happening. The level of understanding within our marketplace—our clients and partners—of the extent of the problem at the start was not as high as it should have been. We will do our best to ensure that the processes are simplified. If that means that there is less of a burden on the schools and colleges—and indeed, the SQA—any slippage that occurs in the meantime can be made up.

Mr Monteith:

The evidence that we have received from the chairman of the board, Ron Tuck, and David Elliot suggests that they thought that there was a problem of incomplete or incorrect course grades—that would end up on certificates—which had started at a high figure and were being reduced. There was some surprise, even as late as 9 and 10 August, because the problem was larger than they had been led to believe. They portray that as a problem of having been misled about the information that was available and have pointed to one person in particular, Jack Greig, from whom we hope to take evidence later. He was on sick leave for much of June and when he came back at the beginning of July, Bill Arundel had replaced him.

Have you been able to ascertain whether, and if so why, the information was not as accurate as it could have been? Given the comment made by Dr Gunning that, even recently, there is no real understanding as to why the information from the IT system is incorrect—I hope that Dr Gunning will forgive me if I have misrepresented him—can you be confident that you will have the right information available for the next exams?

That was your final question, was it not?

No; I have another small one.

Bill Morton:

Those who have given evidence before me were there at the time. You have referred to their conclusions that they were misled. I cannot comment on that; I cannot contradict or confirm what they said. However, one needs effective management information to be able to manage an organisation, and in the case of the SQA that has been identified as inadequate. I can only presume that the information that was made available was sufficiently credible for people to conclude, in advance of more detailed knowledge of what had happened, that the problem related to 1 per cent of candidates and might affect somewhere in the range of 1,000 to 1,400 people. Clearly, that was not the case.

One of the most important things that we must do is ensure that we have robust, reliable and accurate management information, so that we know exactly what is happening with the leadership and management of the organisation. There has been great concern about the extent to which the SQA has been unable to advise people and keep them informed. I have seen how inadequate the management information was. I am not saying that there was no management information, but it was produced in the wrong form, at the wrong time, and not enough heed was paid to it. We need to improve that situation radically and urgently.

Mr Monteith:

At last week's meeting of the Enterprise and Lifelong Learning Committee, the Minister for Enterprise and Lifelong Learning said that you were taking a number of big sticks to the organisation. In your evidence, you suggest that there was concern about bullying in the organisation and that

"communication both externally and internally was poor."

However, in the improvements and changes identified in your review and your suggestions for forward planning, there is nothing that could be called a big stick. Obviously, you cannot speak for the minister, but are you able to tell members of the public and parents what you will do to make a significant difference in the SQA?

Bill Morton:

I can think of a parliamentarian who described me as a slippery haddock, and I do not recognise myself either in that description or in the statement to which Mr Monteith just referred. I am a chief executive with 13 to 14 years' experience, and I will address, in a positive and constructive way, whatever needs to be fixed or replaced because it is cracked or broken. What happened is regrettable, but most of the contributory factors related to poor data management. Those problems can be put right. I am also encouraged by the capability and capacity of the staff of the SQA to ensure that we do that very quickly.

I have one very quick question about markers. We look forward to seeing the Deloitte & Touche report, but can you tell us what the percentage of unsatisfactory markers was this year compared with other years?

Bill Morton:

Markers are classified as A, B or C according to quality. The initial indications that I saw over the weekend suggest that this year there are slightly more As than in the previous two years and slightly fewer Bs. Offhand, I cannot remember how many Cs there were. The general impression is that there is not much of a difference in markers' scores between this year and the previous two years.

Ian Jenkins:

Most of your comments today and suggestions for changes relate to data management, and I understand very well why that is. If, however, there are changes to be made to higher still and the assessment process, who would make those and what would be the procedure? Mr Osler has said that he would not drive it. Who will consider the effects that the introduction of higher still has had on the examination system?

Bill Morton:

I will pass this question to Dennis Gunning. However, I have recently had contact with the stakeholders involved in education about higher still, and the SQA has proposed simplifying the way in which data are captured. I take issue with Ian Jenkins when he says that my suggestions are all about improvements and changes in data management at the SQA. My evidence indicates that there need to be corporate changes to the organisation as a whole. There has also been discussion about how the natural process of refinement may lead to simplification of internal assessment. However, that does not put in question the fundamental importance of internal assessment in higher still or the linkages between educational and vocational learning as part of a lifelong process.

Dennis Gunning:

I would like to say something about the arrangements that we make—the syllabus, the composition of the units in higher still and the assessment. Douglas Osler referred to the point at which those arrangements were handed over to the SQA. That was the point at which development was finished and implementation was in progress. We have a committee called the national qualifications committee, which is responsible for overseeing this family of qualifications, which includes all the higher still arrangements. There is also a committee called the national assessment steering group, which is chaired by an inspector and has membership from the SQA and the higher still development unit. Normally we would discuss proposed changes in assessments with that group, to ensure that all the key stakeholders are signed up to them. However, ultimately arrangements for the syllabus and assessments in higher still are the responsibility of our national qualifications committee, as these qualifications have now become operational.

The minister set up a review before all this happened.

Dennis Gunning:

Indeed.

Is the national qualifications committee the body to which he will report?

Dennis Gunning:

No. We are running higher still within the policy that was set at national level. It is not the job of the national qualifications committee to challenge the policy of higher still. That would be done at a higher level than the SQA.

Michael Russell:

I have one last question, which might be the final question from the committee. One of the problems that we have had to contend with—and which others have had to contend with all along the line—is that of assurances that turned out to be false. I am not accusing you of anything, but before today—presumably on bad advice—you said that early on there was no reason to doubt that the quality control mechanisms were in place, and it turned out that they were not. Today we have heard that the minister was given an assurance—presumably by you—which he repeated in his statement on 6 September and which turns out to be misleading.

The inevitable question is: quis custodiet ipsos custodes—who will guard the guards themselves? We heard Mr Osler talking about the possibility that he might come in—heaven forfend—to inspect you. That is not something that you would look forward to. Given the sensitivity that exists about the organisation and the difficulties that there have been, what external assurances will you be able to give over the coming period that are just that—independent and convincing? If we have to rely on assurances of the sort that have been given in the past, there will be some nervousness.

Bill Morton:

I gave an answer to a different question, and that answer was used in the context of a question, posed after the fact, about the running of quality assurance checks. Concordancy is simply a way of validating the relationship between school estimates and outcomes. I understand that concordancy checks are run where there is a statistically competent track record to make them a meaningful exercise, such as in standard grade and the old higher. The reason why they were not run for the new higher is that, in the first year, there is not a statistically competent track record from which to establish a reliable trend.

I will do my utmost to ensure that the information that I provide is complete and accurate. I can only give the committee my word for that. I will apply complete diligence to my role as interim chief executive of the SQA.

Michael Russell:

I am not questioning that. However, given the circumstances in which the organisation now finds itself—and I do not doubt your word in any way—do you not think that some external independent reassurance needs to be given, perhaps over the next year or couple of years, which will make people feel that they are getting the truth? I am sure that they would get the truth from you, but do you not agree that, given the difficulties of the past year, extra reassurance would be helpful?

Bill Morton:

I do. I was trying to answer your question rather obliquely by saying that primacy of responsibility and accountability rests with me, as the chief executive. However, I would support greater openness and transparency. I am not in a position to make a decision on that or to judge how best that can be achieved. The committee may offer some guidance on that.

The Convener:

I will indulge myself and ask one final question. You have submitted a written report to us that I suspect is part of your internal review. Is it part of the review or is it the whole? If it is not the whole, when can we expect the review to be complete?

Bill Morton:

The report covers the whole review. This was the right opportunity to make my findings from the operational review public. Members will see from the completeness and candour of the report that nothing has been left out.

The Convener:

Thank you for giving evidence to us this afternoon. I am sure that you will be very interested in our on-going proceedings, particularly as they relate to the last point that Mike Russell made, which I am sure we will consider in the future.

We cannot leave without expressing again our thanks to South Lanarkshire Council for its hospitality today. I hope that the council has enjoyed our visit as much as we have. I am sure that we will want to hurry back at some point in the future. Thank you for looking after us so well.

Do we agree that our next meeting, which is on 23 October, should open in private?

Members indicated agreement.

Is it at 1.30 pm?

Yes.

Meeting closed at 17:26.