Good morning, everybody. Welcome to this morning's meeting of the Education, Culture and Sport Committee. I extend a particular welcome to Irene McGugan, who is joining us for the first time. There will be more of this to come, Irene, so you must have been very wicked.
What were the effects of the rate of introduction of higher still? What are your thoughts on the advisability of the manner in which it was introduced? In previous meetings, the committee has touched on the issue of phasing. This is a key area for the committee, and I would be grateful to hear your comments.
All the teacher trade unions were deeply concerned about the initial timetable for the introduction of the higher still programme. We argued strongly for and welcomed the postponements that were implemented under the Conservative and Labour Governments that were responsible for the programme before the Scottish Parliament was set up. The Educational Institute of Scotland was deeply concerned that there was pressure on teachers in Scottish schools to implement the programme before they were ready. We were pleased by the decision to phase in the programme in those subjects where people felt that it could not readily be implemented during the first year of higher still. We welcomed the fact that teachers were not put under pressure to do that.
I do not disagree with anything that George MacBride has said. One of our concerns—and I am not sure whether this is political with a large or a small P, because it was the case under both Administrations—was that there seemed to be a reluctance to accept the professional judgment of teachers that the programme was not ready to be implemented. There was a political will that the timetable should be met. When Brian Wilson was minister of state at the Scottish Office, he was heard at one meeting to say, "Why am I hearing this for the first time?" There was a reluctance on the part of Her Majesty's inspectors of schools and, possibly, the higher still development unit to give ministers messages that they might not have wanted to hear.
I echo what John Kelly and George MacBride have said. From our point of view, the key to this is the extent to which the advice—the virtually unanimous but separately arrived at advice—of the teaching unions, representing the whole profession, was ignored for what can only be regarded as a narrow political purpose. Like John Kelly, I will not say whether that was political with a capital P. Teacher unions were castigated for being luddite and for refusing change. We were seen as reactionaries who were holding up the process. However, all the time we were totally right, as we were reporting back what practitioners were saying in the classrooms—that there were fundamental problems.
Are you suggesting that there was what verged on a reckless haste in implementing higher still, despite warnings from the unions?
I do not know whether the haste was reckless or whether it was otherwise motivated. However, there was a determination to implement the programme within the agreed time scale, regardless of any objections.
Do you agree that there was undue and possibly damaging haste?
There was haste that eventually proved to be damaging. It created problems for schools when they came to decide whether they should go ahead with higher still. It would have been far better if we had been able to proceed on a properly planned, fully resourced and fully ready basis. That was not the situation in which we found ourselves. Schools had to make difficult decisions about whether to present in higher still or in the traditional higher.
I would not use the adjective "reckless", but undue haste was an important factor. Many teachers in schools and members of all the teacher trade unions perceived a determination to push ahead with higher still. We associate that with the general culture in the management of Scottish education, which until now has operated on a top-down model where decisions are made without involving teachers in much of the detailed planning. That led to many of the practical difficulties, and we had to push extremely hard for the additional resourcing that finally underpinned the higher still programme. Even when Brian Wilson, as minister of state at the Scottish Office, was announcing the postponement by a year of the introduction of the higher still programme, leading members of the Convention of Scottish Local Authorities clearly disagreed and argued that teachers, as employees, should simply implement it. We would therefore not identify any political party, but we would identify a political culture in Scottish school education.
I am anxious to get to the heart of what you think has or has not been going on with HMI. We heard that HMI was perhaps not carrying messages back to ministers, although it should have been. Can you expand on that and say why, if that is the case, that did not happen?
In our written evidence we have indicated that, at meetings with ministers, we were presenting our concerns. Certainly my organisation did so, and I imagine that other organisations did likewise. Initially, there was a clear tendency to dismiss those concerns as unimportant and not time critical. As time went on, it became clear that ministers were getting conflicting advice. They were getting advice from us saying one thing and advice from other sources—HMI and presumably others—saying very different things.
During the first year of implementation, we found that HMI, quite rightly, picked up the positive view of what was happening in schools, but seemed to be failing to pick up the problems and the more negative aspects, particularly in relation to internal assessment. HMI's view did not reflect the reality that most of us found in schools. Although it has been picking up curricular issues in school inspections, it appears to have failed to pick up the administrative difficulties. To that extent, we certainly have concerns.
As a general observation, the problem is not with personalities in the inspectorate but with the role of the inspectorate as it has been constituted in the past 10 years. In our evidence on the Standards in Scotland's Schools etc Act 2000, we pointed out that the inspectorate is now both the generator of policy and the policeman of policy, which cannot be right. If the inspectorate is pushing higher still—and it could be something else tomorrow—is it the best-suited body to listen to and represent the problems that might occur in implementation?
In the Scottish Secondary Teachers' Association submission, one statement seems to sum up everything about the implementation phase. It says:
I was referring to something more general. There seemed to be a principled determination that this was some kind of flagship programme that had to go ahead. We were unclear why a programme that was devised under one Administration and taken up under another Administration should suddenly become that second Administration's flagship programme. It seems that there was multifaceted ownership of it. It was important—nobody was under any illusion about the fact that it was a vital programme that had to go ahead. However, the principle seemed to be that it had to go ahead in a certain time frame and on a certain basis and that, if it did not, the Administration might be seen to have failed. We did not think that the Labour Administration would be failing if higher still was suspended or postponed for a further period to enable it to be implemented properly. In that sense, our comment was not a political reflection. However, there seemed to be a political determination that it must go ahead; perhaps it became the educational millennium dome, which is unfortunate.
To borrow the SSTA's phrase, we believe that the principles that underpin the higher still programme are important ones that should be realised in Scottish education. They build on the success of our comprehensive schools and seek to promote social inclusion by bringing in those who are socially disadvantaged. They seek to recognise progression, by building from standard grade through higher still and into higher level qualifications. They seek to ensure that adults have access to further education and they break down the traditional divide between academic and vocational education. All those principles are important and should be sustained, whatever the outcome of this and other inquiries into recent events.
I do not dissent at all from the views of my fellow witnesses. Higher still has been seen as a significant advance in Scottish education in terms of social inclusion and recognising the worth of young people for whom the higher was not appropriate. However, if we are talking about principles dominating practicalities, we need look no further than the continuing failure to implement the new higher in English and communication. We all agree that there should be no bar between the academic and the vocational, but perhaps there should be two separate highers: higher English and higher communication. The profession has been saying that for six years, but it seems to go against the political mantra of equality of recognition. We have a political one-size-fits-all straitjacket, but one size does not fit all. However, when teachers have said that, we have been pooh-poohed.
There seems to be an idea that all courses ought to look the same, with the same number of boxes and units. What are your views on that? People have mentioned the SCOTVEC culture, with units being done sequentially. Will you comment on the shifting of the sands when the higher still development unit eventually recognised that concerns were being expressed and changed the times at which units could be done? We might have thought such changes acceptable if they had been made beforehand, but a shift in the middle of the year can cause difficulties.
Substantial changes in the middle of a course would have caused considerable difficulty. In my school, colleagues in different subject departments had already adopted different models for delivering higher still courses. Although most departments were doing it sequentially, some—for sound professional and educational reasons—were doing it on a parallel basis. They believed that the timing of assessments was, to a considerable extent, under their control, but the immediate issue was the number of internal assessments, which was in many subjects far more than the number of units, as there could be several assessments within one unit.
I would like to pick up a number of points in the SSTA's submission; others will no doubt want to comment too. Mr Eaglesham, at one point in the submission, you refer to two specific meetings with HMI and ministers—one in January 1998 and one in May 1998. You say that, in January, HMI's reaction was "patronisingly dismissive" and that, in May, it was "aggressive and ill-judged". Will you expand on those comments? What points did you put at those meetings and what were the reactions?
At both meetings, we presented the same argument—that higher still was demonstrably not ready. We pointed out instances in which material should have been delivered and prepared but had not been. Our line did not change much between the two meetings, except that further evidence was available in May that had not been available in January.
You say that the minister had a dilemma, and you say in your submission that he was receiving conflicting advice at that stage, from HMI on one hand and from professionals on the other. Whose advice was he following?
That question is difficult to answer. I suppose that the person to ask would be the then minister. My view, from informal conversations, is that the minister had become convinced that there was a genuine problem that had not been fully addressed and that steps might have to be taken. Of course, at that point, a new minister came into post, who would receive the same advice as had been given to the outgoing minister. There was a transition period, which was unfortunate—although not because of the two people involved, for both of whom I have the greatest respect. The timing of the change between ministers made things more difficult. Had Brian Wilson continued in the post for the rest of the year, I imagine that there would have had to be some kind of reaction. However, it is understandable that the new minister with a new brief would take advice from the appropriate advisers.
The common factor during the period was HMI and the officials who, I presume, continued to give advice that was at variance with the experience of classroom practitioners.
I know of no evidence that suggests that HMI's advice changed over that period. We detected no change.
It is one thing to identify, and to be proved right on, defects in the implementation of higher still; it is another thing to prove any causative effect on what happened during the summer. Others may want to comment, but to what extent did the problems that you flagged up on the implementation of higher still contribute to the chaos of the exam results and issuing process this summer?
It would be nice to be able to draw the audit trails together and say exactly what happened at each stage—that is clearly what this committee is trying to do with the evidence that it is taking. To be frank, I cannot answer your question. All I can say is that, had the advice that we and other unions were giving at the time been listened to, the problem would not have occurred as it did occur, because the development of higher still would have taken place in a different time frame. For example, the merging of the two computer systems would have gone ahead, but not in hot pursuit of higher still. It is also possible that some of the assessment issues would have been resolved, which would have lessened the burden on our colleagues in the SQA.
With the benefit of hindsight, do you feel that, if some of your concerns had been listened to and acted on, it is at least likely that the crisis this summer might have been averted?
The impact of the crisis might have been less had some of our concerns—and those of other colleagues—been listened to.
Nicola Sturgeon's last question was on a point that I wanted to explore. We may take evidence from the previous ministers at some stage, but I understand that the transitions between Government and Government, and between minister and minister, were almost seamless as regards the advice that people were receiving, and that that advice has contradicted the advice that you, as union representatives, were giving on the introduction and implementation of higher still.
If witnesses could keep their answers shorter than that question, we would be grateful.
We will try. As far as we understand them, the issues that arose this summer were largely operational. Like David Eaglesham, we could not, and did not, predict them. The EIS was aware of some continuing difficulties with data processing, but nothing more than that. I appreciate that I am speaking with the benefit of hindsight, but we would say that the clear difference between previous years—when the Scottish Examination Board, SCOTVEC and the successor body, the SQA, operated as efficiently and effectively as they had ever done—and this year was the additional volume of data that arose from internal assessment. Had some or all of our concerns about the burdens and pressures of internal assessment on young people, teachers, schools and, as it turned out, the SQA been listened to, we believe that that would have been a significant factor in preventing this year's difficulties.
You said that the SQA could not handle the volume of data that was generated by internal assessment—a point that has been made by many people. However, you say in your submission that teachers could handle it and that schools could handle it.
Teachers handled it, at some cost to themselves and possibly also to their on-going work. Senior staff in schools would input data in various formats, and teachers at all levels would have to input the data again, either for whole groups or for individual youngsters. They would often then have to check with their colleagues and principal teachers, who might in turn have to check with class teachers on the exact status of the data.
Mr Shanks, as a principal teacher, will you tell us how the administrative burden and assessment procedures of higher still impacted on the rest of your teaching and on the other classes that you taught? How large did it bulk in your workload?
It had a huge impact. I was one of the few enthusiasts who implemented higher still English and communication this year. I believed in it and thought that, as my colleagues have said, it was a good thing for the pupils of the school. It became clear very early on that the amount of assessment that was involved in internal unit assessments would be burdensome. Classes were large and we were dealing with a larger number of pupils because of the inclusion of intermediate 1 and intermediate 2. The marking took a great deal of time. The higher still assessments took a long time, sometimes an hour and a half, but critical listening took a great deal longer and individual presentation could take hours and hours. I had teachers who worked through their lunch hours and after 4 pm. I had pupils queuing up outside. First-year, second-year, third-year and fourth-year pupils were affected. Every aspect of teachers' work was affected by the burden of assessment for higher still. It has had an impact not just on the kids in fifth year and sixth year this year but on the quality and delivery of education by teachers throughout the education system.
That is precisely what I expected that you would say. If the results had been okay this year, the assessment procedures would have become part of the programme, which would have had a major impact. Is that right?
The burden of assessment needed to be examined. From early on we told the SQA that the amount of assessment was a burden. I think that all teachers, and in particular English teachers, would agree that the burden of assessment was almost too much and would have had to be reviewed to enable higher still to continue for another year. Many teachers were under a huge amount of pressure. Higher still could not have continued as it operated this year. There have been changes, which will reduce some of the burden of assessment. Certainly, there would have had to be a major review of assessment for higher still to run smoothly this year.
In response to Brian Monteith's question, I think that developments could have taken place outwith the hot pursuit of higher still. For example, the computer systems could have been brought together and there could have been trials of procedures. A number of changes occurred simultaneously, some of which could have taken place even with the previous diet of examinations. That would have allowed changes to take place and to be bedded in, which would make the introduction of higher still in the next year more straightforward. It was the overlap of a series of different things that led to some issues dropping through the gratings in the street.
David Eaglesham has largely covered the point that I had intended to make. The SQA got one thing wildly wrong. The introduction of higher still assessments at intermediate 1 and intermediate 2 resulted in far more candidates taking national exams, which required markers. I am not sanguine about what will happen in the current session when we get further into the implementation of higher still and more subjects come on stream.
I am anxious to move on. Mr Kelly has now raised the question of markers.
We are hearing a mixed message about higher still. To describe higher still as the millennium dome of Scottish education suggests that you thought that it was an expensive white elephant of the Scottish Executive rather than something to which you were committed. It was also described as a political mantra and a one-size-fits-all approach—presumably, the other side of the argument is that what we had before higher still was a one-size-fits-all approach from which many youngsters got no benefit.
That is a complicated question, to which there are several layers.
My association was committed to the principle of higher still from the very beginning, in 1994—indeed, there was a difference of opinion on higher still between it and other associations. Since then we have been committed to the principle that something like higher still should go ahead. It is important to articulate that the examination system at the end of secondary school should ensure that every pupil has equality of opportunity and access to every possible avenue for the development of their abilities.
Is it reasonable to say that, despite the practical problems, the teaching unions and the teachers delivered? Of course, the fact that that was done at huge cost has to be examined with a view to modifying higher still. Despite the anxieties of the profession, it managed to deliver higher still. It was something that was not foreseeable that caused the problems in the end.
Mr Kelly, do you wish to respond?
I will not respond to that, but will develop the point that David Eaglesham made. I emphasise again that the teaching profession has been committed to higher still. Higher still in its current form has had a fair input from teachers. If we had not had higher still, we would have had Howie, which we all said was not the best way forward for Scottish youngsters, particularly at lower ability levels. When there was talk of delay, teachers said that they would do intermediate 1 and intermediate 2 first, because the higher suits the market at which it has always been aimed. That advice was not taken. I do not think that it can be said that the teachers were not committed to the principles of higher still.
Let us move on to marking. We have heard stories about timing issues and about unsolicited scripts being sent to teachers. I am interested in exploring the marking issues.
Cathy Peattie will know that as well as being president of the EIS, I am a member of the board of the SQA, although I do not appear in that capacity today. It is fair to say that marking would have been an issue regardless of all the other operational problems that arose this year, partly because, under higher still, intermediate 1 and intermediate 2 had to be marked externally.
There have been several statements about how much people were paid and about markers being paid less this year. Can you clarify the position?
I am a marker. My payment did not drop, but it increased only a little. Payment for marking is based on the time that it takes to mark a paper. There is disagreement about how papers should be paid for, and the system needs urgent review, particularly considering the stress under which markers were put.
You talk about the quality of the markers. We have had mixed messages about that. You said that the quality was good; others have said that it was not good. Teachers and whole departments have stressed that candidates whom they had expected to achieve reasonable results did not achieve them. Does everyone agree that the quality was good, or is that simply your interpretation?
I know from my dealings with the SQA that the quality of the markers was high and equal to the normal standard. However, the time scale within which the markers had to mark the papers was significantly shorter than normal.
There have been many stories suggesting that the marking was not of such a high quality. We want those stories to be investigated. If it turns out that the marking was not of such a high quality as a result of time pressure, the way in which markers were recruited or the fact that quality assurance checks could not be performed—it would be speculation to talk about that issue—that will be a serious issue that requires to be addressed. It may be that, to some extent, some of the stories have been fed by media accounts that have not been entirely accurate. At this stage, we think that it needs further investigation.
Teachers that I have spoken to—I am sure that others say the same—say that they will never mark again. They talk about problems with timing and payment. How do we move forward? The current examinations diet started in June. How do we ensure that, this year and in future, there are markers to carry out the work?
To some extent, that is an issue for the Executive rather than the SQA. As Margaret Nicol has pointed out, the Executive must make a clear statement that funding will be available to ensure that markers will be adequately remunerated. The current pay levels are far too low to repay people for the stress and responsibility associated with the job. There must be a clear statement that the timing of the exam diet and the announcement of the results—matters within the control of the Scottish Executive rather than the SQA—will allow for marking and the subsequent and accompanying quality assurance procedures to take place. That must include more senior markers considering how markers have performed, and the concordance procedures that are run through computer programs. Sufficient time for those checks and balances must be built into the time scale. That is a first and important step.
There are a couple of points that I would like to expand on. In our submission, we questioned the need for Scottish children's results to be out before those of English children—they all go through the same Universities and Colleges Admissions Service system. We are still dominated by the timetable for entry to universities. We have examinations for the whole of our cohort, regardless of ability. The fact that we have done something in time for the universities for the past 100 years does not mean that we need to continue to do it. We need to consider how our system reports and the purpose of our assessment—that is a big question.
Several members want to ask questions.
I want to explore further the issue of quality. It is beyond doubt that, this year, there are question marks over pupil performance against predicted performance—there is a wealth of anecdotes about that. One of the explanations for that is quality of marking. Are there any other factors that might have contributed to that, such as the burden of assessment or the timing of the exams? Are there matters that are intrinsic to higher still that might explain in part why so many pupils appear to have underperformed in relation to their predicted results?
One of our great concerns has been the simplistic association of two elements—the assumption that performance is inevitably linked to poor marking standards. We reject that. We do not believe that to be the underlying cause. If there is a problem, its extent is not yet clear.
If one looks at the results overall, a major factor has been the number of appeals. There has been a huge increase in the number of appeals, although the percentage of successful appeals is not so different. That can be explained by the fact that the SQA did not run its usual concordance checks. In effect, in schools with a proven good track record of estimates, the SQA carries out a sort of appeals procedure before the results are issued, so that pupils are much more likely to have a grade that is consistent with their estimated grade. That was not done this year, which would have had a major influence on the difference between estimates and performances.
Margaret Nicol has touched on an important point: the holy grail of concordance. For many years, schools have been told that they must be accurate in their estimates, for the very good reason that if the child underperforms on the day, they can be accorded the performance of which they were thought to be capable. In the past few years, schools have been told only once how concordant they were. That is information that they need to get every year. The information that comes back from the SQA must be more transparent. If I, as the man in charge of a certain subject, am consistently wrong in my estimates without knowing it, I am doing a disservice to the young people in my charge. We need that information.
I have a slightly different point. One of the things that we must bear in mind is the fact that the procedures of the SQA and its predecessors have resulted in a highly sophisticated and ambitious system that requires only a few things to go wrong for the whole thing to crumble, because of the knock-on effects. The standards of the SQA—I am referring to the way in which it is organised rather than its exam standards—make it one of the most sophisticated examining bodies in Europe.
I want to explore a different element of the marking issue. Every time I carried out marking I said that I would never do it again, so I am entirely sympathetic with people who feel that way. The pressures that markers were under this year were phenomenal.
Yes. The EIS raised those issues in several forums. First, we had meetings with the SQA. It was clear that we were raising the markers issue—there is a minute from 27 March. We contacted the SQA because of our serious concerns about estimates, grades, non-transmission of materials, marking and moderation, which was also a problem last year.
Would that minute have appeared before the board of the SQA? At what level were those alarm bells being rung? It seems to be a simple, practical point. There were three times more exams to be marked, but the appropriate number of markers was not being taken on, and those who were recruited were taken on late. I can understand that different groups discussed it and received reassurances, but did somebody go to the SQA board and mention, or have in front of them, evidence of the unions' strong concerns that this was going to be undeliverable if there were not enough markers?
The alarm bells would have been rung through the assessment focus group. They were being rung in the national qualifications committee and they were being rung at the board. In the minute of the board meeting on 27 June there is reference to the difficulties that there had been in marking, the fact that new markers had been identified and the statement that the remaining papers that needed to be marked were to be marked shortly. The board was being given reassurances. That is specific; it is in the minute. It can be checked easily.
If those concerns were being raised over several months, were you, as a board member, raising those matters consistently at board meetings? The minutes do not reflect that.
It was not at variance—the reasons were at variance. The information that we were getting from schools was that information was being submitted and not being recognised by the SQA. From the SQA's point of view, the problem was perceived to be that schools had not submitted the information. We were being reassured—as late as 22 June, which was the last time that the board met before all this took place—that the entries were coming in, that steps had been taken to identify people who would be in touch with the SQA during the holidays and that it was confident that the information would arrive in time for the completion date of 10 August.
What I am trying to get at—it is the same point that I pursued with Ann Hill at the meeting of this committee on Monday—is how vigorously those concerns were being pursued at the level of the SQA board. We have heard that reassurances were being given, yet we have heard from your union and other unions that teachers were continuing to raise concerns. Were those reassurances being accepted at face value? Was there vigorous discussion of those matters at board meetings? It came out at our previous meeting that the minutes of the board meetings do not reflect that those issues were being pursued vigorously.
How vigorously can you pursue a question of what is happening to the results—
Issues such as that are not mentioned. Apart from marking being mentioned on 22 June, other concerns that were being raised do not feature in the board minutes.
I would have to agree with that. The reports that were given were often oral. The problems were often identified by the SQA officers. They said, "We have had this problem and this is how we are going to resolve it." Board members did not always have to ask the question. I am satisfied that every question that I could think of asking was asked, and that every answer that could have been pursued was pursued. I am confident that it was not just me—other members of the board felt the same way.
I will ask David Eaglesham this question, which is not on markers, but on a similar subject.
There is a problem with the audit trail of each issue that members raise. When we met the SQA in April, we reported a series of SSTA members' concerns about data transmission. We met the SQA to explore the problems surrounding that. The SQA informed us of the procedures that it was following to deal with those problems. We were satisfied that those measures seemed to be right. We agreed separate releases timed to be at the same stage. People did not tell us subsequently that measures had been ineffective in specific ways.
David Eaglesham is saying that there were a series of different problems, which were difficult to trace.
Clearly, the recruitment of markers was an on-going issue, but it is important to bear in mind that that would not impact on many teachers in schools, because they would not be aware of difficulties. A person who had been recruited in the past as a marker, but who had not been recruited again, might have asked why that had happened in their case, but that would not necessarily impact on any other teacher in the school who was teaching a higher still course.
You raised the issue in September and you raised it continuously until April, but did you accept the reassurances every time they were given?
One does not merely accept the reassurances—one assumes that people are doing their best to resolve an issue when they say that they are.
Our members' experience essentially backed up what we were hearing from the SQA, which was resolving the issue by asking our members to mark double the number of papers. The members who got in touch with us were those who were displeased because they were not being asked to do that.
I want to move on.
I want to return to what has happened in the past 12 months. George MacBride said a few moments ago that the SQA examination system was so sophisticated that it required only a few things to go wrong for it to be in trouble. I am paraphrasing, but that is what was said. All three organisations' evidence identified the same problems: late vetting of papers during the year; difficulties with candidate entries in the autumn; difficulties with updating lists in January and February; inability to confirm entries; difficulties with unit results, which were asked for as many as four times without a response from the SQA; deadlines which took no account of holiday periods; a breakdown in communication between schools' and the SQA's computer systems; insufficient or wrong exam papers being provided; and all the difficulties with markers that we have discussed.
It is fair to observe that no one was there to put together the pieces of the jigsaw, but you must remember that we are teachers first and trade unionists second and that we do not have the time to get the bigger picture.
It was a genuine observation and I would like you to comment on it from your practical experience.
I am the SQA co-ordinator in my school and everything that Mike Russell enumerated is correct. It all goes back to last September and the failure of the SQA to capture properly the original data entries. The reason for that—whether it was a software problem or a management problem—did not concern me at the time, but all the subsequent problems came from it. It is a garbage-in-garbage-out situation. If one's initial information is not correct, one cannot judge properly what will happen further down the line.
Many of the examples that have been given are aspects of the same problem, which is not the setting of exams—we will leave that to one side because it is a different issue—but the input of data. As John Kelly said, we are not clear whether the problem was due to a failure of systems, of hardware or of management. However, many of the problems resulted from data not being input once they had left schools, data being input inaccurately and—most important—corrections not being made to incorrect data. That was extremely time-consuming for SQA co-ordinators such as John Kelly, but the knock-on effects—schools being given the wrong number of papers on the day of an exam, for example—could not have been reasonably predicted by people in schools.
In theory, one could put all the information together, decipher the pattern and produce a construct. In fairness, the unions operated together fairly closely on higher still, and we operated fairly closely with the SQA and other groups. No combination of any of those organisations was able to come up with such a construct. It would have been difficult for that to happen, and it was not for want of trying. If, by a process of fusion of some sort, we had identified that we were on to something, we would have pursued it vigorously. There is no doubt about that. In theory, it would have been possible to produce the construct, but in practice—and with the best will in the world and the co-operation of colleagues in other unions and the SQA—that did not happen.
When I asked that question of head teachers and teachers in my area, I heard two interesting responses, which I will put to the witnesses. One response was that the SQA was in effect saying—albeit politely—that the problem was the schools' fault, because they did not know how the system worked and they did not use it properly. There was a tendency to blame the schools and teachers for weaknesses in the SQA system. I am not saying that it was done maliciously, but the SQA tended to think that it knew its systems so well that the problem could not possibly be with the SQA.
On the view that the SQA blamed teachers, head teachers or schools, although a few EIS members have said that such comments were made to members or to schools by representatives of the SQA, that is not our general perception. That attitude is not what we perceive from SQA management or SQA employees who were in communication with schools. Although there might be some evidence to support the view that Mike Russell expressed, there is little evidence to sustain the view that the SQA blamed schools and teachers for failures.
Like George MacBride, I have the greatest respect for colleagues at the SQA. They have been unfailingly helpful and courteous during the nine or 10 years that I have been doing this job. Even with the pressures that they were under—particularly from March to June when there were constant phone calls and faxes between us—they were tremendously supportive. We should put up our hands and say that there were only minor changes to the way that data were sent to SQA; the major difference was the volume of data. I pressed the same buttons, only a hell of a lot more often.
I have a final question for each of the union representatives. The word "reassurance" has been overused throughout the piece; indeed, Margaret Nicol used the word again today. Have people in Scottish education been inclined to be too easily reassured by the SQA and other bodies? Could the unions learn the lesson of being less easily reassured and asking harder questions?
That is an important lesson for us all to learn. However, those who run the system should also learn the important lesson that there must be considerably greater transparency, openness and opportunities for dialogue, which is the conclusion that we reach in our written evidence. We must investigate the culture of non-departmental public bodies and how they relate to the stakeholders with whom they work. Furthermore, we need to explore the culture in the Scottish Executive at Victoria Quay—not the Scottish ministers themselves—which has tended to be modelled on the Westminster culture and which decrees that "We know best because we are the senior people in the system."
The lesson to learn is that there should be greater transparency.
Yes.
There has been much anecdotal evidence of concerns about marking. Now that appeals are under way, we are still receiving anecdotal evidence about dissatisfaction with the marking of appeals. Do your members have any experience or knowledge of whether appeals have worked well? Furthermore, do you support the idea that scripts from appeals—not all exams—should be returned to schools as a means of determining whether the initial marking of papers and appeals scripts, or the contributory factors of sitting higher still, have led to so many students' dissatisfaction with their marks?
Although I do not have a union role, I can say as a teacher that there is dissatisfaction with some of the results, and a sense that the results in English and communication, and in other subjects, are depressed this year—as are the pupils and teachers. We feel that the results do not represent the candidates' potential fairly. This year, most departments submitted far in excess of the usual number of appeals in the hope that, on re-examination of the papers, some justice would be done to candidates. Although we have not received the results of the bulk of the appeals, the emergency appeals that relate to university and college entrants have been returned, with somewhat disappointing results. It remains to be seen whether the results of the bulk of appeals allay the prevalent pessimism.
Mr Eaglesham, in your submission, you say that the suggestion that papers be returned is unacceptable. Can you comment on Brian Monteith's point?
The return of papers poses technical and practical difficulties. When scripts are marked for the SQA, the principal assessor must be satisfied that they have been marked to the appropriate standard. That is an entirely different exercise from a teacher returning prelim scripts to their own class, when they need to explain to pupils where they have gone wrong and how to improve their performance. There is no point in sending back final examination scripts to candidates, saying, "If you do this or that next time, you will get a better mark."
You are free to give your opinion on the matter. My point, however, is that there is a large degree of mistrust of the marking and the appeals process, and it seems that in England papers can be returned. Mr Kelly, given that the NASUWT crosses the border, do you have any anecdotal evidence on the experience of returning papers in England?
Although I have no observation to make on that matter—I have not had much to do with it—I would probably endorse David Eaglesham's views.
I do not object to commenting on the situation south of the border, where scripts were returned in a planned way. The decision was taken to return exam scripts and the marking process took account of that decision.
I understand the practical differences between prelim papers, exam papers and the English example of the planned introduction of returning candidates' scripts. You made those points well. I raised my point in order to establish what might be done to achieve credibility if exam scripts were to be returned. My point was to suggest that we limit the return of scripts to those candidates who are dissatisfied with their appeals. If that approach would not work, what more might be done—other than for every school to ask for examiners' reports—to try to re-establish the credibility of the SQA and the exam system?
Get it right next year.
That sounded like a good answer to me.
No—that was not a good answer.
That was a slightly different question.
Many of those students are still at school.
I stress Mr Monteith's point: many of those students are still at school and it is important that they realise that what will happen this year will be credible. To echo Margaret Nicol's point, we must get it right next year. That means that we must put in resources—not only money or more pay for markers, but resources of time, hardware and software at the SQA—to ensure that data processing and operational management systems work effectively this year. Young people and their parents must get that message, which must come from all levels of the Scottish education system.
I am keen to move on. Are there any more questions?
I was not here when you discussed the line of questioning.
I would like to round up this part of the meeting.
People have looked for various reasons why it all happened. One argument that was put to us forcibly was that the difficulty lay in bringing together the SEB and SCOTVEC. There is a view that SCOTVEC's bad practice infected the new organisation. We were told that that view was commonly held in Scottish schools. Do you believe that view to be commonly held and do you give it any credence?
It would be fair to say that in schools, as opposed to colleges, the SEB was the major provider of the highers and standard grade systems. Among Scottish teachers, there is very strong loyalty to the SEB and a long-standing sense of its quality. The same feelings do not exist for SCOTVEC, because teachers had much more limited involvement in the more than 700 modules that SCOTVEC provided. There are some teachers who do not think as much of the quality of the SCOTVEC modules as they do of the new intermediate 1 and 2. However, to extrapolate from that view, as Johann Lamont has done, is perhaps not valid for the majority of Scottish schools.
I did not extrapolate. I presented a case and asked for your view.
In our evidence on the proposals to merge the SEB and SCOTVEC, we indicated that we hoped the prevailing culture of the new organisation would be that of the SEB, in which we had more confidence. The culture and mores of the SEB were much more conducive to teachers and schools. We would not be overtly critical of SCOTVEC or say that it was wrong, but the preferred culture was that of the SEB. There have been difficulties in bringing together those two cultures.
I understand that Johann Lamont was referring to someone else's view. However, whatever our views, we do not think that the use of emotive language is helpful to the debate.
A concern that has been around since the beginning of the higher still programme is that of parity of esteem. Many schools were not comfortable with the concept of internal assessment under the SCOTVEC model. Nationally, that was recognised from the beginning, which is why there is a hybrid system of internal unit assessment and the stamp of external examination. Like George MacBride, I would prefer not to use emotive language. The people at SCOTVEC were always fine and professional in their dealings with schools. We should not allow aspersions to be cast on them. However, schools have been more comfortable with the SEB tradition. Perhaps they will have to learn to change, too.
The verification procedures for SCOTVEC did not seem to be as rigorous as those for the SEB.
Please ask a question.
I was coming to that. There is the problem this year, and there is the review of higher still that Margaret Nicol mentioned. What can we do this year? Is there something that we should do in the short term and should we take a wee bit longer to do the things that Margaret Nicol talked about? I wonder whether we have time to make the changes that you are suggesting in the current year.
Although we believe that there is a need for radical review of and change in the assessment procedures—especially the internal assessment procedures—and for a drastic reduction in the amount of internal assessment, it would be totally inappropriate to start that on a major scale during a session that has already begun. At all levels of higher still, young people are now several weeks into their courses. Making radical changes this year would disadvantage them and cause confusion to them and their teachers. However, we must be ready to act quickly on the conclusions of inquiries such as this and of the Executive-sponsored review of the implementation of higher still, so that changes can be made not in the far distant future, but in the session after this one.
Would it be possible, even at this stage, to shift the exam timetable, or would that be a step too far?
Bringing the exam diet forward considerably would pose problems, because people are planning on the basis of 160 hours. John Kelly has already indicated the more flexible date—the date on which young people are told the results of exams.
When the examination diet is being composed, regard must be had for the pressure that young people are under at that point. That can be done without running into the problem of reduced teaching time.
Are you content with that, Mr Kelly?
Yes.
Good. I thank our witnesses for their attendance and for their answers to our questions. We will be taking evidence from other witnesses, and you are welcome to stay to hear that.
I thank members of the committee for their questions, which have been thought provoking.
There has been a delay, for which I apologise, in starting the next evidence-taking session. We are time constrained. I thank Pat Cairns and Alex Easton for their patience. We will proceed immediately with questions from members. I will stop taking questions at about a quarter past 12.
I intended to ask a general question, but I have decided to change tack.
Please make questions specific.
Section 2 of your written submission relates to concerns about marking; in it, you give a full explanation of one school's results and describe how they were
Those were marks that were sent to the school in response to inquiries. The school in question was one in which the marks would be expected to be high; it was in the independent sector and had a very good track record. When it received the results, it discovered that virtually the whole cohort had failed, when normally they would have been awarded highers at grades A and B. That suggested that something was awry. The second return gave all the pupils a band 1 pass. Now the school has received final results, but it is still asking how certain it can be about those. I chose that school from a range of examples as one of two extreme cases, just to make my point.
In the last paragraph on page 1 of your submission, you say that:
That quotation was taken from SQA documentation that was sent out. I passionately believe in equity, and that has been one of the great things about the Scottish system. To put it bluntly, opportunistic parents have latched on to that. Parents' appeals were accompanied by a lawyer's letter giving a lawyer's interpretation of the statement and telling the school that, in an appeal, the school would have no choice.
Your submission also mentions with regret
Once the possibility that estimates could be amended got into the public domain, pressure was put on schools to do that. That was not helpful.
I do not quite understand that point. Were those instructions issued to centres by the SQA in August?
We were asked to resubmit estimates. On a few occasions, we were told that estimates could be revisited, as some estimates had been lost somewhere. That was interpreted by some parents as—
Was it done in response to some parents? Were parents pushing for that? Do you think that the instruction came from the SQA in response to parents pushing for it?
The SQA would have to speak for itself about that, but I suspect that the answer is almost certainly yes.
The paragraph of your submission that deals with the effects on pupils and parents says that one of the difficulties for pupils was that they were unable to seek reassurance from their teachers. That takes us back to the failure to provide schools with copies of the results. What was the effect of that and what action did you take in response to it, knowing that you would not be in a position to reassure pupils?
We all felt completely foolish, because there had been no communication to schools that we would not receive that information. I am sure that every school in the land had its board of management in place, together with secretarial support, to deal with the usual rush of phone calls. Also, because we were all interested in knowing what the results were, we were all there. People come back from holiday specifically for the exam results date. We were there, but there were no results. We were informed only on that day, when we started telephoning, that the results would not be made available to us.
When you were told that the results would not be made available, were you also told that, at the end of that week, the pupils would not get their results either?
On the day that the results were due to come out, we were told that neither we nor the students would get the information.
In the past, did you get the information prior to the day when the students got their results?
No. The students and the schools usually got the results on the same day. Occasionally, the schools got them the day before, but we could usually rely on getting them in the first post on the same day.
You would be in a good position to tell us what is happening now and what your fears are for the future. I have evidence from two places in Scotland that some students are still waiting for written confirmation of results today—10 higher biology students in Greenock and five history and geography students in Newton Stewart. Do you have evidence of students who are still waiting for final confirmation of their results?
Yes. I think that that is still the case. There has been talk of suggested action. The suggested short-term solution is to get the data-handling and administrative procedures really tight. I think that that can be done. The other suggested solution concerns quality assurance management. I am sure that that will be done. Because of the good reputation of the Scottish Examination Board, perhaps too much faith was put in the system.
You have gone a step ahead of me. I will stop you on that first point. You are telling us that you believe, from your experience or from what your members have told you, that some pupils are still waiting for final confirmation of results.
The final appeals are being mopped up.
Do you mean appeals or results?
I am talking about appeal results.
I asked about results. Is anybody awaiting final confirmation of results?
There are still people who have not received their standard grade results. I do not know of anybody who has not received their higher results, but there are still youngsters who have not received confirmation of their standard grade results.
Are you aware that that is contrary to the expectation that the minister expressed in his statement of 6 September?
Yes.
Let us move on to the second point, on the future. Your submission proposes several actions, including a review of the SQA administration, the removal of stress points by the higher still liaison unit, and the expansion of the role and responsibility of principal assessors. The suggested actions on which I will focus are, first, the proposals to second some school SQA co-ordinators to assist with data management and, secondly, to
School SQA co-ordinators and, in particular, principal assessors know that side of the system inside out. It is important to involve such knowledgeable people in the review to ensure that the systems are in place.
It should be remembered that any reduction in the teaching year would have an impact on the course content, and would therefore require a review of courses. The time lines will have been set for this year, as youngsters have already embarked on courses.
But you are in favour of that proposal for future years.
It is worthy of consideration within a review of other matters, such as course content.
You say, rightly, in your paper that pupils have been the hardest hit and that many are disillusioned. Given that we know that some of our pupils have not yet received their standard grade results, how do we encourage young folks and give them confidence in the examination process? A senior teacher at a school in my area told me that some kids whom he was assessing for standard grade maths asked him whether they would receive marks for the assessments.
Probably the people whom pupils and parents trust most are—God bless them—head teachers. Parents know us, have worked with us and will accept reassurance from us that they would not accept from other people. They know that they can knock on our door and meet us in person.
Will that strategy help to give pupils confidence? There is still a feeling among pupils that, if teachers are unhappy with the situation, pupils cannot be happy with it either.
I think that teachers are confident in their estimation of pupils' abilities and that pupils trust those estimations. We can give pupils only our view of how they are performing, and the reasons for that view, and tell them to carry on with what they are doing. It is a matter of trust.
In a previous answer, you expressed concern about slippage in processing pupils for entrance to the 2001 diet. Did such slippage occur for the 2000 diet? Are you saying that we are already repeating some of the mistakes that were made last year?
As far as I can judge, a different mistake is being made. Schools submitted data timeously last year, but the data were lost and not processed.
For entering?
For entering. We did not get the feedback that we had called for—confirmation that the data had been entered.
Are you concerned that something similar is happening?
I hope that everything will be made as bomb-proof as it can be in the short term. That is what is being done at Dalkeith and through our internal quality assurance. There may well be sound reasons in an action plan—which I hope exists—to explain why it helps to delay the request for the initial data. The fact is that that request is later than we had anticipated.
It is important to note that a key change was made to the approach to the delivery of higher still. We had started on the premise that children would begin a unit, be assessed, and move on to the next unit. It emerged that too many youngsters were failing, so the critical change was made to test pupils when they were ready. Of course, that led to a huge slippage in data input and pushed the bulk of data input to the end of the year.
Obviously, if a strategic decision is taken to carry out all the unit assessments at the end, an enormous amount of data will arrive late in the year.
Is there any theory as to why so many pupils failed assessments after completing units?
You will appreciate that the difference between standard grade and higher is huge. In the traditional higher, youngsters had the opportunity to do assessments, be regraded and benefit from that, but this time youngsters had to pass the assessments. Youngsters are unable to produce a top-quality essay first time around, three or four weeks into their fifth year. The steps that were taken were sensible, but they meant that we were not operating under the original premise.
In paragraph 7 of your submission, on suggested action, you say:
Yes. The simplistic view had been that there was a mixture of internal and external assessment and that, therefore, the external assessment was shortened. As has been said, there is a lot of pain in administering the internal assessment. Assessments have served the purpose of motivating youngsters and teachers, but it has transpired that internal assessments have operated a bit like a class ticket. As long as a pupil had completed the unit assessments, their final grade seems to have been determined entirely by the final exam—by whether they had passed or failed it. That raises the question of the balance between internal and external assessment.
Can you clarify that the marks for a pupil who does three unit assessments in a term do not count towards the final grade?
The exam is a separate entity, but one must pass all the internal assessments as well as the external exam.
A pupil must pass the internal assessments, but the marks that they get for them do not count towards the final grade.
The final grade is decided entirely by the external exam. In some cases, there are practical components that count towards the final grade.
The final exam was the be-all and end-all. It had not been understood that the final grade was not really a balance of internal and external assessments.
So the system is not continuous assessment. The internal assessments are just a hurdle to clear.
I think that the SPTC is suggesting that internal assessments should not be mandatory this year. They may serve another purpose, which is to motivate and check, but we should step back from them this year.
Your submission makes no clear suggestion about the SQA. What are your thoughts about it? The EIS and other bodies from which we have heard have made suggestions, such as that the SQA should have closer links with the Executive.
I covered that in points a) and b) of paragraph 7 of my submission. The SQA was starting up a new system of administering exams with 100 years of excellent experience behind it. To start from scratch would be crazy. People made the mistake of thinking that there was a new computerised system. There was not—the SQA had faxes and bits of paper flying in from all over Scotland. As soon as possible we want to move to an e-mail system. I signed the papers for the current census in my school and took accountable responsibility for that. I am comfortable with that, as it is what I am paid for. Eventually we should reach a situation in which schools input data and they are e-mailed to the SQA. That would avoid the danger of wrong buttons being pressed, as has happened at the SQA. Not all schools are on e-mail yet, so what I am suggesting may be a year or two down the line. In the short term, there should be rigorous scrutiny of what the SQA is doing. I imagine that the Scottish Executive will adopt a much more inspectorial role regarding the quality assurance procedures.
Do you see no case for reconsidering how the board operates with the executive of the SQA, for example?
To dismantle or to change radically the SQA at this time would not be helpful. That is a personal opinion.
In my view, a failure to manage and a failure to manage data were at the root of this problem. Apart from the schools, everybody failed to understand the huge complexity of the task and how much information was going to descend at the very last minute. The problem was flagged up regularly, but schools were not listened to as they should have been. Whether the SQA was able to implement in one year all that was being asked of it is a question that I could not possibly answer. We cannot allow these failures to happen again.
Do you think that the SQA was unable to deliver all that was being asked of it?
Probably. There were three changes. Two boards had been amalgamated, there was a totally different structure of course delivery and there were new information technology systems, which in the first instance rarely function as they should. The combination of those three things made it very difficult to have success first time round.
Does blame lie with the SQA's political masters?
There was pressure on schools to comply, which they did. The only success story is that schools delivered for young people. I am very happy about that.
There was an overambitious—I use that word rather than reckless—push on schools from the higher still development unit and HMI. We were perceived as conservative if we tried to suggest that we move forward at a reasonable pace, as happened with standard grade. If at public meetings we said that we ought to ca' canny and think things through, the response was often very sarcastic.
Three members have indicated that they wish to speak and there are only two minutes for questions, so I ask members to keep their questions brief.
I am interested in the impact of the way in which the SQA responded to the crisis. You say in your submission:
We are referring to statements that were made very late on, when it was like Saigon in the last days of the Vietnam war. People were under pressure and close to panic, and it would be inappropriate to criticise them. Nevertheless, some of the statements that appeared in the press late on suggesting that schools had been contacted were misleading. These were mistakes made by individuals. I am not pillorying anybody, as people were under tremendous pressure. There was an element of panic at times.
I would like to put the same question in a different way. Would it be possible for you to comment on particular statements when submitting fresh written evidence to the committee? Could you identify statements that you thought were misleading and tell us why, so that we could consider that?
I have statements from schools about e-mails that were sent to parents saying that the school had not sent any information. We chose to submit a succinct report based on the considerable amount of material that we have received.
We are grateful to you for that.
I want to return to the issue of complexity. Are we still trying to do too much? Does the complexity of the information that appears on the certificates that youngsters receive reflect the impractical manner in which higher still has been implemented, and does it need to be simplified and clarified?
With the wisdom of hindsight, I would accept that. There are even worse things in the pipeline, to do with core skills and working with others. I hope that there will be a review that will consider what is practical. The higher still development unit does not yet have all the material that it needs. There are youngsters in my school doing an access 3 Spanish course without materials, even though those were due in August 2000. We are still short of teaching materials.
It has been stated that in England pupils do not receive certificates as such. Instead, notification is sent to the schools. Would you consider that?
The current system has been very successful over the years. An SEB certificate was someone's lifelong record of achievement. It has been so successful over the years that I would not want to step back from it.
It would be possible for schools to produce the unit assessment certificates and for the SQA to produce the core certificates.
That is one option in the debate about the purpose and function of internal assessments.
It should be acknowledged that amendments have already been made to some courses to reduce the amount of internal assessment. That is hugely appreciated by the staff concerned. Some of our comments have been listened to.
I apologise for the frantic questioning, but it meant that we asked more direct questions and got the information that we wanted much more quickly. Thank you for attending this morning and for answering our questions. Thank you also for your succinct submission.
Thank you. I am sorry that we were a bit like a Gatling gun with our answers, but we were aware of time pressures.
We will proceed straight to our next set of witnesses. I believe that Mr Elliot will join us first.
I would like to ask about the board's finance, planning and general purposes committee. Do you think that it gave proper consideration to the issues that were raised regarding the 2000 diet and that its meetings were minuted appropriately?
I cannot help the committee there, because I did not attend meetings of the finance, planning and general purposes committee of the SQA. I have not been in the SQA's offices since 11 August.
What was your impression of how the board operated apropos the executive of the SQA? Do you think that the board was fully informed, aware and in control of what was happening?
It was my responsibility to report to the board from June 1999 until March 2000. At each board meeting I reported on the progress of work on the software. The board had a good discussion after I made my report and asked the officers of the SQA some penetrating questions about how we were progressing.
Were you never sent copies of the minutes of such meetings?
I had access to the minutes.
Were they a fair reflection of what was said and done?
Of the board meetings?
Were they a fair reflection of your contact with the board? Or were your reports made informally?
No, I reported formally to the board. Inevitably, the minutes of board meetings are a summary. They do not contain all the discussion that took place, but they are a reasonable summary of what was discussed. I cannot speak for the finance, planning and general purposes committee, because I did not attend its meetings regularly.
I want to ask you about the operations unit. How many people work in the operations unit?
The blueprint was for approximately 30 people, but additional staff were drafted in quite early to assist with the testing of software.
Roughly 500 people work for the SQA.
The SQA has more than 500 members of staff, who are divided among three divisions and 21 units. I was responsible for seven units in my division. One of those units was the operations unit. As far as I recall, the number of staff was in the 30s, but we had built it up a little to cope with software testing.
Does the operations unit handle the data that come into and go out of the SQA?
That is correct.
Is it also the unit that was using the new software?
Responsibility for developing the new software lay with the information technology unit, but the operations unit was probably the biggest user of it—the whole organisation uses IT.
Mr Tuck's submission says—I do not have it to hand—that 200 of the 500 members of staff had to reapply for their own jobs or for other posts. Were there many people in the operations unit who were not used to working there, or was there continuity of staff?
There was great continuity of staff, but the job changed radically.
How did the job change radically?
The staff of the operations unit were primarily former SEB staff, who were very skilled and had an excellent track record in running the SEB's examinations. However, they were now working in the SQA and were responsible for running more than a million national certificate modules, Scottish vocational qualifications, higher national certificates, higher national diplomas, as well as all the examinations that had existed before and the higher still examinations.
Did the SCOTVEC people who were responsible for handling the data for that part of the operation merge with the SEB people to become the operations unit?
Not exactly. One of the difficulties was that the SCOTVEC staff were based in Glasgow and the SEB staff were based in Dalkeith. This year there were probably four or five former SCOTVEC staff in the operations unit, as we had been keen to ensure that that expertise was present in the unit.
Is it right to say that all the data handling that had been done by SCOTVEC was now done by the operations unit, which was in effect a continuing SEB unit?
By and large, that is the case.
There was continuity within the unit. Before the new computerised system was introduced, did you use a paper system or a different computer system for entering data?
The SEB used the examination processing system—EPS—which was new in the early 1990s and had bedded in nicely by 1999. Last year's exam worked very smoothly. SCOTVEC used a system that was based on an IBM AS400 computer. Both of those systems were then replaced by the new SQA computer system.
How were the data handled? Was there a brand-new system for the paper that came into the building?
The system for handling data that came in on paper was a continuation of what had happened before. Electronic data were submitted using new software and were loaded on to the new system.
Is there an electronic data transmission service as well as a back-up paper system?
The paper system is not a back-up. There are certain data that are by and large received electronically, such as registration and entries. In the past, standard grade internal assessment grades have come in on paper and that was also the case this year. The method that was used depended on the qualification and the choice of centre. We tried to accommodate those centres that were very keen to use electronic methods as well as those that wished to rely more on paper methods.
Did the people in the operations unit who handled the data work with the IT people to design the system, or did the IT people design the system and then tell them how to use it?
The users, the operations unit, had to tell IT what they wanted. We used industry-standard techniques in developing the software. Some people were allocated the role of senior business user. Those people had to tell IT what data processes they wished to be supported by IT. IT then had to produce the software. The development was very much user driven.
Can you describe the system of checks that is in place to ensure that the information that is received is acknowledged and is verified with those who have sent it?
When a file containing candidate entries comes in, it goes through validation checks to ensure that, as far as we can determine, the data are in good order. A report is then sent back to the sender if any problems are identified with the data that have been submitted.
There have been problems with the system continuously since it was first used in October last year.
We had a demanding year right from the start. However, we had the software facility in good time to receive entries—it was available in September. Entries came in pretty slowly, and my recollection is that by November we had received only about 10 per cent of the total that we expected. The rest of the entries came in eventually, and by March we were able to run the first of the various examination procedures.
There were various critical points before then. For example, at Christmas, there was intensive dialogue with schools about the data that were coming in and the fact that deadlines might be slipping. Is that right?
When I took over responsibility for the IT project in April last year, it was already running a little late and had been reconfigured to ensure that it would deliver on time. It was decided that, instead of planning the whole system before starting to write the software, we would write the software to enable us to register candidates, as that was the first thing that we needed to do, and then move on to results software. Such decisions were being taken all the time, and we were constantly monitoring, managing and prioritising to try to meet the date of the examination, which, of course, was not negotiable.
But the dates for the registration of candidates, for example, were slipping even at the beginning.
It is true that our recommendation was that candidates should try to get their entries in by 31 October, and not many entries were received by then, but that was not crucial. We could allow slippage at that stage. We were bending over backwards to be helpful to the centres because, of course, they too needed to bed in new software. Entries were coming in more slowly than usual, but we tried to be understanding and did not push the centres too hard.
That date slipped, and a series of other dates slipped. Did the alarm bells not start ringing? Did you not realise that that would jeopardise the final programme?
Those dates were not hard deadlines. We were prepared to accept entries until January or February of this year. We realised that entries were coming in more slowly than usual, but we assumed that that was because schools were experiencing difficulties with their software. We were not alarmed at that stage. It was not until later that the flow of data caused us very great concern.
The operations unit was staffed mostly by former SEB employees. Although the system was new, they were used to working to operational deadlines. Did they not report their alarm to you or to the head of their unit?
The sense of alarm was not transmitted to me at the time that we are discussing. I began to be alarmed in March, when I noticed the amount of overtime that was already being worked. I could see that the staff were getting tired then, and I was conscious that there was a long way to go until the end of July. I thought in March that the operations unit was not coping with its full range of responsibilities. We took action on that, and one of the first and most drastic courses of action was to move all the employer and training-provider work back through to Glasgow to give operations staff a clearer run at the higher still and standard grade exams, as they affected schools and colleges.
Were there ways for the staff to report their anxieties to you? Were you aware that they were anxious that things were not working? Did they report that to you or to their line managers?
I could not say what staff reported to their managers, but, in my discussions with them, they said that they felt strongly that we were up against a great challenge in the year ahead. I shared that feeling utterly. I would rather that the staff had shouted louder and sooner. The tradition in the operations unit was that staff would perhaps be reluctant to say that they were not coping as soon as I would have liked.
You say that you were reluctant to hire more people to operate that unit, or at least to rejig or restructure it. Did you not think at that stage that you could have hired more people to work in the unit to help it along?
This is a very important point. From last summer, I realised that I would like to restructure the unit. As the year wore on, I increasingly took the view that the unit not only needed restructuring, but needed considerably more staff. I thought, however, that it would be fatal to start restructuring the unit, going through all the SQA processes of advertising posts and interviewing, when staff were already working a massive amount of overtime every week. It would have been very disruptive and would have placed us in greater jeopardy.
I wish to ask you about two points, and to see whether our rally—in tennis terms—can be shorter than the one that you have just been through.
Sorry.
It is normal practice with such a huge project for there to be a contingency plan should the software for a new computer system not be delivered or should it not deliver what you wanted. For example, air traffic control software could not be introduced until it was entirely failsafe. What was your contingency plan?
That is a very interesting analogy. The air traffic continues, but what people are trying to produce a computer system to cope with is not changing, apart from an increase in its volume. We were not in such a situation; we were trying to process qualifications that were only just being introduced. Neither the SCOTVEC software nor the exam board software could support higher still. The software did not exist anywhere in the world. We did have contingencies round the edges of the system, but, had the new software failed to operate on time, we could not have delivered higher still. There was no other way to do it than by producing new IT.
So there was not a fallback position if the software did not deliver, if the computer system did not deliver or if, as we now know, the data management system did not deliver.
The fallback would have had to be quite drastic: basing results simply on the examination and processing them using the old SEB software. We could not have processed higher still that way, as it required software to collate unit and exam results.
The system was therefore coming together at a crucial point where, if the software or data management system failed, there was no alternative. Is that correct?
That is the pressure of running a public examination system—it was not possible to delay the examination by a month or two. We were constrained by all the requirements that were placed upon us.
I am not sure if you have seen the evidence that David Miller gave us on Monday—
I have not.
Let me read a paragraph of it:
I had not spoken to the minister, but I was speaking frequently with members of the Scottish Executive. I am quite surprised by the comments that Mike Russell has just read out. I think that I had a fair share of responsibilities to execute at the time. Managing the contact with the centres was not my responsibility at directorate level. I am not sure what that conversation was, who it was with or why my views on it were deemed to be particularly significant.
Did the chairman raise that with you subsequently?
No.
So Mr Miller went to the centre and spoke to an individual who said that not much more than 80 per cent of the results would be right. He said that he was "knocked sideways" by that comment. He later identified that staff member as Bill Arundel. Does Bill Arundel work in your division?
Yes. Bill Arundel was head of data processing, and was promoted to acting head of the operations unit.
Bill Arundel told the chairman that not much more than 80 per cent would be got right. Did Bill Arundel tell you that at any stage?
Bill Arundel and I were in constant discussion. Bill must have arrived at that figure on the basis that nothing would happen between that date and the date of issue of results. At one point, we were missing a substantial amount of results, for whatever reason. We made the situation good, which is why the certificates that we issued were in a much better state than a 20 per cent deficit. There is no doubt that there was a lot of anxiety among the staff. I was extremely anxious myself.
I want to concentrate on the fact that Bill Arundel told David Miller that not much more than 80 per cent would be got right, yet the organisation displayed an airy self-confidence. Does that reflect your experience at the time?
It does not. I was extremely anxious about the situation from March until August. We felt that the situation was retrievable and worked extremely hard to retrieve it. However, the airy self-confidence that Mike Russell referred to in no way reflects what I experienced during the 12-hour and 14-hour days that we were working in Dalkeith.
This is the key point. We have constantly heard the word reassurance. We have an example of such reassurance. The chairman believes that he has been reassured when, after being given a piece of very bad news, he is told at a videoconference that the SQA is a can-do organisation and that the members have their heads up. However, you do not consider that to amount to reassuring the chairman that everything was fine.
This is a very difficult issue on which to comment; it is a matter of how the words are used. Officers came to the videoconference meetings that we held every morning from the beginning of July and were encouraged by Ron Tuck and me to be utterly frank about the situation. I am sorry to hear about that comment to the chairman, because I felt that officers were being totally frank at the meetings. None of them would suggest that Ron Tuck and I were intimidating people, and we really encouraged them to tell it how it was. We tried to tell officers that we could not afford to get very despondent and start panicking—we could not cancel the examination. We continued to work very hard to keep a rational perspective on the situation and to keep solving problems, and we were successful in doing so. The situation improved significantly from the end of June to the end of July.
Was Bill Arundel right to make that comment to the chairman? Furthermore, from the evidence, do you think that Mr Arundel and others might have been preparing themselves for the worst case scenario—that they were not going to get it right—whereas you appear to have been doing something else?
The staff were very concerned, which caused me to dwell quite a lot on what constitutes leadership. Indeed, wartime analogies were quite common. Even when the situation looks very difficult, staff need to be reassured that things can be done to get to where we want to go. There were things that we could do. We were taking quite drastic action, such as taking staff off other very important work and putting them into operations to try to bolster the unit. We felt that the staff were taking the perfectly sensible view that this was not a typical year and that the situation was very worrying. The late access to marks data was causing operations particularly grave concerns, and we just had to keep reassuring them that we were being very active in remedying the situation and that doing nothing would make things very difficult. Although staff felt very despondent, we took the bigger-picture view and put in more resources to make the situation better than they thought it was. We were successful in doing so, but not nearly as successful as we would have liked.
The SQA staff have clearly been under a great deal of pressure. Have you been asking too much from them? For example, last April, your operations line manager took on the role of IT line manager. That seems like a lot of ground to cover. Might not some reasonable strategic management at the start of the situation have alleviated the stress suffered by members of staff?
The operations unit was not constructed to cope with the SQA's business. I took over the unit in April last year with no background in data processing. However, as the 1999 exams worked very smoothly, there was no immediate sign of any problems. My greatest anxiety was the development of the new software, without which no one would have received a certificate this year. As a result, I began to realise over the winter and into the early part of this year that the operations unit could not cope with the SQA's business but, by that time, we were already into a firefighting situation.
Yes, but planning and strategic management can take into account the fact that a certain year might be difficult due to new IT systems and structures. Did any such planning take place or was there a real failure on the part of management to plan for the situation that the SQA was facing?
There was a failure to plan the structure of the operations unit to cope with the SQA's business. However, other factors that put pressures on the unit could not be so readily foreseen. Although we succeeded in getting the software in place to do the core processing, a lot of pressure was put on the operations unit by the late delivery of the tools to manage data. Furthermore, the system was sometimes slow to respond, which delayed the work and put more pressure on the unit. Finally, insufficient markers were appointed, which was probably one of the most nightmarish situations for the unit as it did not have the markers to send scripts to for marking.
Could you not have planned for that? Would not planning have at least been helpful in the situation?
Of course the situation could have been planned better. I was informed quite late about the problems with marking, which were causing grave problems for the operations unit and for many of our procedures. However, I do not know whether I could have been expected to foresee that particular problem.
I am not suggesting that your planning was at fault. It seems that, instead of one person being to blame, the problem stems from the communication between different departments and from the whole organisation's strategic planning for dealing with the situation.
Although it sounds awful to say this, given this summer's dreadful problems, the organisation did pretty well within the allocated time scale. In 1997, not only did we not have any software, we did not have an organisation. There were two separate organisations. We had to create the SQA first and then plan the data processing. Although an awful lot of very good strategic planning was undertaken, we had so much to do that we did not have enough time to get it all right.
At any time, did you say to HMI that you had too much to do this year and that you needed to wait?
I took over the operations unit and IT two months before the courses started in schools. It was just not an option to ask the Government to tell the schools to take another year.
So when you took over two months before the courses started, it was too late to do anything. Do you think someone should have said something earlier on?
There was probably a reaction to the fact that it took 20 years to implement standard grade and there was a feeling that we should get higher still up and running. Because the higher still programme had been running for six years, the schools were more or less in place. However, the SQA had not been running for the same time; it came into existence only in April 1997. A lot of people have been saying that it was doable. It was, but any major change to the public examination system comes with risks, and the bigger the change and the quicker it is introduced the bigger the risks. With higher still, the implementation of changes at intermediate 1 and 2 and higher within the time scale put too much pressure on us and did not allow us to be resilient when things went wrong.
You mentioned the lack of markers. Which unit head was responsible for appointing the markers?
My colleague Don Giles was directly responsible for that, with the appointments and committee unit responsible for making the appointments.
What is the name of that unit again?
Committee and appointments, I think.
We heard earlier that Mr Miller, the chairman of the board, had received an indication that 20,000 results might not have been completed. On Monday, he explained how that figure decreased over a period of time to about 1,400 on the weekend before the certificates were issued. Even then, as he drove down to Dalkeith to congratulate the staff on their good work, he received a call telling him that the figure had been further reduced to about 400 candidates.
The way in which we were managing the organisation was to have daily videoconference meetings. On Fridays, we met the Scottish Executive. The chairman attended some of those meetings—I cannot recall how many. When he visited Dalkeith, where I was spending most of my time, I would discuss with him the general situation on the data. It was certainly not me who made the phone call to him on his way to Dalkeith.
Would the job of collating the information on the number of outstanding certificates be that of Jack Greig or Bill Arundel, depending on who was in situ at the time?
Given the number of developments in the time scale in which we were working, the reports that we were getting out of the system were not as user-friendly as I am sure they will be next year. We discussed the situation at the daily videoconference meetings. Various people were using different tools to get an insight into where we stood on the data and they brought those insights to the meeting. More than one person was involved in bringing the data together.
I am trying to establish how the reduction in estimates came about. David Miller tells us that, on the day he went to Dalkeith, he turned up to find what he called thousands of certificates still waiting to go out. There seems to be some conflict in the evidence that we have received. A number of people who have given evidence have said that they felt misled. They seemed to point to Jack Greig as the person responsible, but from his evidence—you may want to dispute this—it would appear that he was not in his post after June and that he had been on sick leave for part of June. The chairman told us that the numbers were coming down, but it cannot have been Jack Greig who supplied that information; it must have been Bill Arundel or other people. Can you shed any light on that?
Jack Greig was responsible for the management of the unit until he took sick leave in June, but we were not working in a hierarchy. I have no evidence that anyone was deliberately misleading anyone else. The problem was that we were dealing with a complex system out of which we were trying to get management information. We had different ways of doing that—people were doing their best. We identified that a lot of data that should have been on the system were not; we took a cut of the data almost nightly, to see how we were getting on with reducing the outstanding data. That was being done by a range of people, such as Colin Urie, David Falconer, Bill Arundel and others.
Given that Jack Greig had been relieved of that post from the end of June, it would be unlikely that the chairman would be taking information from Jack Greig. He would be getting it from other people in the organisation.
By that time, yes.
I want to clarify something. You said that you had videoconferences every morning. We have a note of management meetings that were held regularly in July. Is that what you are talking about?
Yes.
You said that you met the Scottish Executive on Fridays—was that every Friday during July? Was it earlier?
It was not earlier. We met on 4 August and, I think, the two weeks prior to that. That will be in the public domain.
There are notes of the management meetings, so would there be notes of those meetings?
I cannot recall. They were very much working meetings—papers were presented at those meetings, which gave the latest information on outstanding data, as we saw it.
We have not been provided with working papers from or notes of those meetings. We should request them, because they are germane to the inquiry. I am surprised that we have not had them.
The meetings were to update the Executive. I have my own papers here. We went over all the outstanding issues, the time scales and how many unit results, component scores and standard grade assessment grades we seemed to be lacking, as well as the various other jobs that were having to be done to ensure that we—
Who attended those meetings from the Scottish Executive?
It was not always the same, but Eleanor Emberson, Mike Ewing and Philip Banks attended some of them.
From the information technology section?
Philip Banks was from Her Majesty's inspectorate of schools.
Was anybody from the IT section there?
Not that I recall.
I am happy to ask for that information, but we will have the chance to ask Mr Tuck for further details.
I want to focus on something slightly different. You agreed that there was no contingency plan. The Scottish Executive representative we had here said that there was no substitute SQA. What risk assessment would have been done? Would you have been involved at an early stage, when it was proposed to go forward with a system for which there was no contingency plan and no one who could bail the SQA out if the system crashed? Would you have been involved in assessing whether it was too risky to go ahead at an earlier stage?
I was not involved—nor did I expect to be—in the important decision about when the SQA should be created and when higher still should be implemented.
That is not quite what I was asking. You were involved in facilitating the process of bringing the APS on stream. I am asking whether it was your—or someone's—professional judgment that it was hugely risky to go into the dark, with no contingency plan in case the system crashed. We have already heard from the Scottish Executive that it had no contingency plan, because it could not substitute for your organisation's expertise. As far as you are aware, was there at any stage a pros and cons discussion about the risks of that? If there was not, do you think that there should have been?
We accepted as a given that higher still would be implemented in 1999-2000. At the regular meetings of the APS project board, there was a standing item on risk assessment in relation to delivery of the software. We were very conscious of that and we were always monitoring what the risks were.
Were the videoconferences digitally recorded?
No, they were not recorded. The SQA instituted a video link between the Dalkeith and Glasgow sites, which proved extremely helpful. The meetings were held daily throughout July. They were minuted, but there is no electronic recording of them.
If there are no further questions, I thank Mr Elliot, especially for having come back for a second week. Thank you for your answers.
I will ask you about a set of minutes of the chairman's committee, which have been provided to this committee. Mr Miller said that the role of the chairman's committee was to consider issues between meetings of the board. The minutes of the chairman's committee appear to discuss only SQA fellowships at some length. If that committee's role was to discuss matters between meetings of the board, should it not have been more active between April and August? Were you responsible for drawing up the agenda for that meeting? What did you tell the chairman at that meeting and other meetings about what was happening?
All I can say in response to that is that the chairman's committee was not used in that way. It was the chairman's prerogative to decide how he used it. It tended not to be used to discuss those matters.
He told us that it was used to deal with urgent matters between meetings of the board, yet all it discussed was fellowships, which, although interesting, are not urgent. Why did the chairman not use the committee to deal with urgent matters? Did you suggest to him that he should?
The chairman and I had regular discussions. His favoured route between meetings was to send letters to members of the board, which he did from time to time. He may also have occasionally phoned board members. He tended not to use the chairman's committee in that way.
There seem to be problems with the minutes of several committees. One of the recurrent themes of the evidence that we took on Monday was that the chairman and others claimed that there had been lots of discussions about the difficulties that the SQA was experiencing with higher still implementation, such as the problems with data handling and centres, yet the minutes do not reflect that. The minutes are anodyne and do not give much detail of those discussions. Do you recall the board meetings discussing in great detail the matters to which I am referring?
Absolutely.
Why would the chairman describe himself as being "knocked sideways" by Bill Arundel's comment, in their conversation at the beginning of July, that they were not going to get much more than 80 per cent right? If the chairman was that well briefed, he presumably knew that those were huge concerns, as Mr Elliot told us, across the organisation.
As I made clear in my submission and in my evidence last week, there were different concerns at different stages. The concerns that proved fatal emerged only in middle to late June. I would support everything that David Elliot said about events leading up to that. When we gave reassurances it was not in the sense of being sanguine. At no point during the year were we calm or getting a full night's sleep. It was a difficult year. However, when faced with that situation, what are your options? Either you are so concerned that you say, "This definitely cannot happen" or you set about addressing the problem, which is what we did.
But you said that in July as well. David Miller said:
I do not recall Bill Arundel making the 80 per cent comment to the chairman. That does not mean that it did not happen. A lot was happening then. I remember that Bill made that comment to another member of staff, the head of human resources, who relayed it to me. We had a discussion with Bill about it.
What did you say to Bill?
I said, "We have heard your estimate that we will get it 80 per cent right. Tell us about that." We had been in a meeting earlier, when he had not conveyed that point of view. Understandably, what we got from Bill Arundel was a general anxiety. We wanted to pin it down. We wanted to know why he was saying that, what the key problems were that gave rise to his estimate and what we could do about it. We did not get that information. A lot of the staff expressed general anxiety. A manager cannot act on general anxiety; you need to know specifics.
But you took a specific action. I presume you would speak to your chairman from time to time. According to the chairman:
I agree with David Elliot's comment that the duty of the leadership of an organisation is not to get bogged down in despondency. You must lift your staff. You must project an image of confidence—but not undue confidence because the staff are well informed. Our tone through that period was, "Yes, we understand the concerns. There are real difficulties. This is what we are doing about it. Let us get ahead and do it."
You were wrong. The gaps were not small enough.
That is right. We were mistaken in our belief. I was explaining the basis of the advice that I gave the chairman and the nature of the decision-making process during July.
According to the chairman, the advice you gave him at the beginning of July was
I cannot remember using those exact words. I certainly could not tell you the exact occasion. At a certain stage in July, the tenor of the advice changed.
The second piece of advice to which you referred was that if the data gap was small enough you would go ahead on 9 August. That also turned out to be wrong. With hindsight, did you give the wrong advice to the chairman on both those occasions? Heads were not up. Among others, Bill Arundel was saying that things were pretty dreadful. Your advice that the data gap was small enough turned out to be untrue. Who told you that the data gap was small enough?
The remark that heads were up was a comment that I deliberately made to David Miller in context. David Miller knew perfectly well—and I was not trying to pretend otherwise—that the staff were not all singing, dancing and happy with life; they were working very hard and were under a great deal of stress. Heads were not up in that sense, but what was impressive—and this is the point that I was making—was that staff were volunteering to come in. Sometimes we had to persuade staff to take a day off. That happened because of their commitment to getting things right for the candidates. Heads were up in that sense. Staff wanted to get it right and were working extraordinarily hard to do that. That is the context in which that remark was made.
Mr Tuck, you said that you felt that everybody was working hard and that people were putting in extra hours as everyone in the organisation was gearing up to ensure that the results were produced. Would you say that those who were managing and who should have had an overview of what was going on were so busy with the immediate difficulties that they were not able to get that overview and that that was why you were not able to see what was coming?
I think that that is a plausible part of the explanation.
In your answer to Michael Russell's question, you explained that you had a figure that you wanted to reduce and that you were confident that you were managing to reduce it because of the amount of data coming in, but that, because you left the post on 10 August, you were unable to give a definitive answer as to why the discrepancy arose. Could it be that, while those data were coming in and you were at liberty to reduce your estimates, the problems that were inherent before that point—poor entering of data and so on—were continuing, and that some of the incoming data were not being applied properly? That would explain why, even though you thought that the number of certificates that would be wrong had decreased, it had not.
It is possible. As you say, it is difficult for me to speculate as to why that information gap appeared.
I want to talk about Jack Greig. He had his request for early retirement approved in May or June, I understand.
I believe that it was in April.
Thank you. At the end of June, it was decided that Bill Arundel should take up the task that Jack Greig had previously been in charge of. What were the factors behind the decision to approve his request for early retirement and what weight did you give them? Was his state of health a factor, given that he had been absent for some time in June? Were the concerns that were being raised by people such as Bill Arundel about the state of play in the operations unit another factor?
Both were factors. When Jack Greig went off sick, we had no idea when he would return from sick leave and we had growing doubts about his ability to manage the situation.
Had he been sick prior to June?
He had had a back problem. I am sure that David Elliot could inform you of when that occurred.
Was the problem stress related?
No. He had earlier been off with stress following the death of his wife. The subsequent health problem was to do with his back, and I cannot recall what the final health problem was.
Back problems can be related to stress, but I will not go into that.
Thank you, Dr Monteith.
Indeed.
Both. He took part in videoconferencing meetings on two or three occasions and, by July, we were in daily contact by telephone. The chairman obviously wanted to know what the state of play was in relation to the issues that we were pursuing.
He explained to us how he travelled down to Dalkeith to congratulate the staff on the work that they had done and on getting as far as they had on higher still. He told us that, on the way, he was telephoned with the information that the number had come down to about 400 candidates. Are you aware of that phone call? Did you make it?
What date are we talking about?
I suspect that it would be 9 August—the date on which certificates were due to go out.
The figure of 400 candidates would relate to only one component. As I recall, there were reported to be 400 missing internal course assessments and 400 missing standard grade assessments.
The chairman explained to us that, when he arrived at Dalkeith, he was gobsmacked—to paraphrase—to find a large number of certificates in a state of unpreparedness and unlikely to go out. That genuinely seemed to shock him, given all the estimates that he had heard.
I think that you must be talking about 10 August, the day on which candidates should have received their certificates.
In that case, what David Miller was explaining to us on Monday was about two separate things. He discovered certificates that had not gone out for the reasons that you explained, as opposed to certificates that had gone out with missing data.
Yes.
Was it the SQA's decision not to carry out concordancy checks on the new national courses?
Our decision was to carry on with concordancy checks for standard grade and the revised higher examinations.
But not to carry them out for higher still examinations?
That is correct.
Why was that?
There was a combination of reasons. We did not have all the estimates—a situation for which we accept responsibility—and that made it difficult to conduct concordancy checks.
Another issue was raised by the representatives of the HAS. They said that they regretted your decision to allow centres to amend estimates at the time of appeal. They implied that, at the point when you realised that there would be difficulties, you gave in to pressure and allowed a situation to arise in which pushy, articulate, middle-class parents could advantage their children.
The decision that we made was to relax a normal rule. The normal rule is that centres can submit an appeal on behalf of a candidate only on the basis of a previously submitted estimate and that they cannot suddenly change their estimates and appeal for something higher.
Do you accept that some parents and children were more likely to take advantage of that than others were? Some schools would be more able to take advantage of the ability to change the estimates than other schools.
Perhaps. Would not that apply to appeals in general? Parents from affluent backgrounds might put more pressure on schools to submit appeals in the first place. I am not sure whether that is a particularly new factor. However, I have not given the matter much thought.
Your submission makes it clear that the key problem was data management. Mr Elliot said earlier that he thought that the markers were a significant and disruptive factor in the data management.
In my submission, I said that the fatal problem turned out to be data management, rather than the software or the markers. However, the late development of software and the late recruitment of markers added to the pressures on data management. That is what David Elliot is saying, too. The fact that some scripts were being marked late meant that the normally strict sequence of events on which good administration depends broke down. David and I are saying that we have no evidence to suggest that the quality assurance of marking was any different from normal.
Many people have raised the issue of marking and it has emerged that concerns about marking were present from last October. However, at the last minute, there were still not enough markers. How can you explain that?
There is a difference between a general concern and a specific concern. We had a general concern about markers, which is why we talked about it with the Association of the Directors of Education in Scotland and the Headteachers Association of Scotland. We thought that the problem was a time bomb—that each year it was getting more difficult and that if we did not do something soon, a problem would emerge. In September, no one said to us that they thought that we would fail to recruit enough markers for the summer and we did not believe that to be the case.
Can you tell us about your relationship with the higher still development unit? We have talked about overload—your organisation must produce national assessment bank material—and at some point there was a shift of responsibility from the HSDU to the SQA. Was that an easy shift, or did it put on so much extra pressure that it became a reason for the overload?
I do not think so. The national assessment bank was a huge enterprise, which we shared with the HSDU. It was fraught with all sorts of difficulties, not just because of the scale, but because we had to rely on people who were not under our direct control, we had to recruit teachers, and there were copyright issues and all sorts of things. It was a huge, complex logistical exercise. There were no particular handover issues that added to the difficulty.
I am surprised.
Do you have an example of something specific?
We were talking about markers being appointed late and so on and, from the teachers' point of view, assessment bank materials were being requested in a volume that was not what they expected. The whole thing exploded. I am surprised that you say that the NAB did not give you extra problems.
You asked me about the relationship with the HSDU, which, on the whole, was a good and effective partnership. Most of the national assessment bank materials were produced on time, although often later than when teachers wanted them. However, there is another issue about when teachers wanted the material in relation to the published schedule—most of the material was published on time.
And some of the material was better than others.
That will always be the case.
I hope that this point does not relate to the time after you left the SQA. Usually, when results are prepared for the certificate run, they are prepared for schools and a tape is provided for the Universities and Colleges Admissions Service. Was that done this time, and if not, why not?
The computer run to produce the statement of results took much longer than anticipated. It was a bolt out of the blue—to me—that the statement of results was going to be quite as late as it was. However, as I recall, we produced the electronic version by the evening of Wednesday 9 August, only to discover that there was a subsequent problem in the ability of the software suppliers to receive that information. Again, that is so close to the end of my period of tenure that I am not really able to shed any more light than that.
When you say that it was a bolt out of the blue, do you mean that it was not foreseen that the printing would take so long or was the delay in printing a result of the earlier problems?
The statement of results can be produced only when the certification run is complete. The problem was that the time that it took to process the statement of results was much longer than I had been advised it would be.
I have taken a close interest in the reporting control mechanisms between the SQA and the board. You will have heard the evidence that has been given by the board members and others. Given that the members of the board are there in their own right, was it your impression that those members were making representations at the board to suggest that there was something wrong? I am interested in your impressions.
Board members, particularly those from the education sector, brought their direct experience to bear and raised issues. However, many of the issues related to the implementation of higher still. With hindsight, it appears that we should have spent the whole year thinking about data management. However, we did not, because our eyes were focused on the new things: higher still implementation, getting feedback from schools on higher still and developing the new software system. I accept the criticism that we should also have been examining the old things, which had worked in the past. However, we did not, and people did not see it coming. Board discussions were about higher still implementation, unit assessment, delivery of the national assessment bank and the progress of the APS. People were raising issues and we were addressing them.
We have been provided with the minute of the SQA and Scottish Executive education department liaison committee. There is a curious item in the main minute in which you are quoted as saying that you were reorganising the SQA conference to allow Mr Galbraith to arrive later than anticipated, which would have the benefit of ensuring that Mr Galbraith did not attend the question-and-answer session and was not asked awkward questions. What awkward questions did you expect the minister to be asked at your conference?
Mr Russell, you would have to ask that of the person who wrote the minute.
The minute quotes you as saying that.
I am quoted as saying what?
The minute quotes you as saying that you had rescheduled a session and that the advantage of that would be that Mr Galbraith would not be at the question-and-answer session to be asked awkward questions. What awkward questions did you expect?
I would have to dispute the accuracy of that minute. It is not any part of my job to defend the minister from awkward questions. We issue an invitation to the minister to speak at our conference. It is up to the Scottish Executive to determine when the minister attends and whether he will take questions.
So the minute is plain wrong?
The minute is plain wrong.
A small issue was raised by Judith Gillespie of the Scottish Parent Teacher Council. She pointed to the February meeting of the liaison group when it took a decision that effectively allowed unit assessments to take place at the end of the courses. She said that at the time she did not realise the significance of that decision, which meant that all the unit assessments would be delayed. Did you realise that that had implications for the SQA?
We are in the area of advice to schools on implementation, rather than SQA regulations. It has always been up to the centres to decide when they want to conduct unit assessments. They could do it sequentially or they could save them all up to the end. However, if that were to become widespread practice, we would have to step in with regulation because it would cause us administrative difficulties. The discussion that you are talking about might reflect a change in the tenor of the advice that HMI or the HSDU were giving to schools about implementation. That was not a matter for the SQA.
Thank you, Mr Tuck, for returning to the committee and giving us your answers this morning.
Thank you.
Does the committee agree to begin Monday's meeting in Hamilton with a private item?
At what time?
At 9.30 am.
Some members may have difficulty getting to Hamilton for 9.30 am.
Members will have to make an effort, given the fact that we have overrun this morning. Is it agreed that we open that meeting in private?
Meeting closed at 13:38.