
Education, Culture and Sport Committee, 27 Sep 2000

Meeting date: Wednesday, September 27, 2000


Exam Results

The Convener:

Good morning and welcome to the Education, Culture and Sport Committee. I particularly welcome members of the Scottish Executive education department. Mr Elvidge will introduce his team.

John Elvidge (Scottish Executive Education Department):

On my left is Douglas Osler, the senior chief inspector of schools, whom the committee knows well. On my right is Eleanor Emberson, the head of the division that deals with policy on higher still and other matters.

I will repeat what I said in my letter to the clerk about why I thought it would be helpful for Douglas Osler to be here today as well as in a couple of weeks' time, when you will see him separately. It is important to recognise that higher still is a corporate responsibility of the department and that Douglas and some of his staff, as well as members of Eleanor Emberson's division, are involved in the issues. Although I am not sure of the extent to which the committee will want to get into issues relating to higher still, I thought that it would be more convenient for you to have everyone who is responsible for the issue present in one place to facilitate the questioning.

The Convener:

Thank you. I am sure that that will be helpful.

We have your written submission in front of us. People have had an opportunity to read through it and we have a number of questions to ask. Specific questions will be put and I will try to bring in any members who have supplementary questions. We have approximately an hour for this section.

Ian Jenkins (Tweeddale, Ettrick and Lauderdale) (LD):

Before we get into the nitty-gritty of what went wrong in the past few months, I want to ask three questions. First, to what extent does the department now consider that the decision to merge the Scottish Examination Board and the Scottish Vocational Education Council was responsible for difficulties relating to corporate philosophy and structural amendments to the testing regime? Secondly, did the decision to go ahead with the implementation of higher still come a little early for everyone concerned? Thirdly, taking those two points together, does the department accept that the volume and complexity of the data that were subsequently mishandled were at the heart of the problem? The decision to merge the SEB and SCOTVEC was contentious in certain quarters. Some people thought that the organisations had different philosophies and that, although it was logical to create an umbrella organisation, the two did not fit together well.

John Elvidge:

Those questions go to the heart of the matter. As members will know, when the previous Government consulted on four options, the view was widely held in Scotland that the merger of the two organisations was the best solution. Given what we know now, was that judgment right?

Inevitably, one is drawn into having an opinion about what precisely went wrong. The Enterprise and Lifelong Learning Committee heard some very useful evidence on that—I am sure that it is ground that this committee will explore in much more detail. Without jumping to the conclusion that the answers that were given yesterday will prove to be definitive, I will assume, for the purposes of answering this question, that they are substantially correct.

If those answers are correct, the root of the problems was in a part of the Scottish Qualifications Authority that was not substantially affected by the cultural mix. It was a part of the SQA's operations that was run almost exclusively by people who brought with them the expertise of the former Scottish Examination Board. Although one can speculate that some of the cultural issues may have impacted on the quality of internal communications, the mystery of how the problems could exist without being widely known in the organisation—I think that that will come to preoccupy us all—did not have its root in cultural issues relating to the merger. As far as one can judge from what is known now, one would conclude that, whatever problems of merging cultures existed in the organisations, those problems were not at the root of what went wrong.

Ian Jenkins:

I will jump to my third question. A consequence of the SCOTVEC philosophy entering the exam system was modularisation and the use of performance indicators. That led to a volume and complexity of data of a different order from what had existed before. Of course, I know that there had already been a move to internal assessment, but I am thinking of the amount and types of data and the way in which internal assessment was to be merged with exam performance. Even given what you say about the culture, do you think that the considerable change in the way in which things worked—in addition to the volume and complexity of data—was a problem?

John Elvidge:

I have spent some time thinking about that, as one of the natural questions to ask was whether there was a step change in the volume of data that clearly meant that the old ways of doing things were no longer adequate.

It was helpful to get a feel for the total number of pieces of information that were being handled, which I understand to be in the region of 4 million, and what proportion of those were exam scripts—the pieces of information that would have been around under a system that had not added on the elements of internal assessment. I understand from the SQA that, in round terms, that second number would have been around 3 million pieces of information. In volume terms, the new system added something less than a third to the total volume of data. I say something less than a third because the old system was not devoid of pieces of internal information that had to be handled by the SQA. It does not seem to be an obvious conclusion from that that a step change in volume of data is the explanation for why the SQA failed to live up to its former high standards in handling those volumes of data.

Ian Jenkins:

I am rather surprised that only a third of the volume was added, but I am not in a position to argue with your statistics. Do you accept that some of the material that ended up on certificates was of no particular use and that a great deal of work had gone into producing something that was of questionable value but that must have added to the complexity of things? I am talking about such things as core skills and the long-term move to group awards. That sort of extra has made the whole business more complicated than it needed to be.

John Elvidge:

We should bear in mind the fact that, in relation to this summer's exams, we are dealing with a subset of the SQA's customers. I am not sure that anyone ever thought that for candidates coming from schools, particularly those candidates for whom the primary purpose is to obtain a passport to higher education, some of the data about core skills would be the most important part of the certificate. For the other client groups that the SQA serves, that information was considered extremely useful. I am thinking now about candidates whose primary concern is entering the world of work.

We must remember that the whole foundation is built on the strong belief that existed in Scotland in the early 1990s, when the system was being designed, that integration of academic and vocational education was the one guiding principle that should shape our work. The inclusion on certificates of things that candidates in schools may not find particularly relevant is a natural consequence of that.

Ian Jenkins:

Everyone accepts that there was a gap in provision for the population that was coming into fifth and sixth year in schools and a need to change the system accordingly. Did the decision to go ahead with implementation come just a little bit too early for people?

John Elvidge:

That was a difficult decision. The fact that implementation had already been postponed twice obviously suggests that the question of when it was right to make the move was at the forefront of people's minds. We certainly believed that we had reached the stage at which implementing the change would be challenging but manageable. That is different from saying that anyone thought that implementing it in the year just passed would result in everything going smoothly. Implementing a change of that magnitude is never accompanied by the expectation that things will go absolutely smoothly in the first year. However, there was a belief that we, the SQA and schools were essentially ready to cope with the change. In the light of what we think we know about what went wrong, I would not be inclined to revise that opinion.

Why was implementation postponed twice? Why were things different the next time?

John Elvidge:

I think that there was a combination of two factors, although I may ask Douglas Osler to comment further in a moment, as he is better acquainted with the history. There were two essential preconditions: whether we were prepared for teaching higher still in schools and whether we had the necessary materials; and whether the SQA was ready to undertake its part in the process. In previous years we concluded that, on both fronts, a bit more time would be helpful in getting to the starting line in good shape, as both preconditions involved substantial undertakings and a lot of work on a broad front.

Douglas Osler (HM Chief Inspector of Schools):

There was a clear view that the previous system was not meeting the needs of all young people. In education, if there is a highly desirable change, one always wants to bring the benefits of that change to young people as quickly as possible. The first year proposed for the introduction of higher still was quite obviously unrealistic, so ministers took the view that more time should be given. That was the story of the subsequent postponement and the rephasing decisions that were taken more recently.

I should add that our inspection evidence shows that the levels of achievement of young people in S5 and S6 are significantly higher in higher still courses than in courses that are not higher still. It is quite clear that the quality of learning, teaching and attainment has risen. Our evidence shows that schools had coped well up to the point at which examination scripts left the schools to go to the SQA.

The 25,000 or so young people who will have intermediate certificates will have something that no previous generation had. There will always be questions about how quickly we wanted to bring that benefit, but our inspection evidence suggests that schools that did higher still had coped very well. S5 and S6 have always been of high quality in Scotland. The signs so far are that the higher still courses have improved on that.

Nicola Sturgeon (Glasgow) (SNP):

You said that the SQA provided information saying that there were 4 million pieces of data this year, compared with 3 million in previous years. Is that correct?

John Elvidge:

That is not precisely what I said; what I said was intended as an approximation of the volume of data.

Nicola Sturgeon:

Even if it is an approximation, I want to challenge you on that figure, because there are three unit assessments in every subject for higher still. It strikes me that, where there had been one piece of data—the exam script—in previous years, there would have been four pieces of data under higher still. Do you believe the figures given by the SQA that you have quoted this morning?

John Elvidge:

What you have said would be true if highers were the only exams that were being handled in the system. However, highers form a relatively small proportion of the total number of exams being handled. The increase in data attached to the new higher has only a limited effect on the total volume of data handled by the SQA in the summer diet. I have been through the same process of questioning that you have just gone through.

It would be useful to have a breakdown of the figures that you have provided. Would that be possible?

John Elvidge:

Certainly.

Nicola Sturgeon:

My second question is about the speed of implementation of higher still. When standard grade was introduced, it was piloted in the first year in a limited range of subjects. Was piloting higher still ever discussed in your department? Was it something that the SQA ever suggested? If piloting was not discussed, why was it not discussed? It would seem to be a reasonable way to introduce change of that magnitude so that problems could be ironed out.

John Elvidge:

I will need Douglas Osler's help as we go further back. However, I understand that it was not felt necessary to pilot because we had had two successive delays and had been able to prepare adequately across a wider range of subjects. As Mr Osler said, the demand from the education community for the introduction of higher still, which was believed to be a better system, was a considerable factor in our way of handling the situation. Every year that we did not introduce higher still was a year when the candidates whom the new features were intended to benefit would lose something. As a result, there was a desire not to leave any young person in that position any longer than was necessary.

Nicola Sturgeon:

The demand for speedy introduction of higher still from the education community is something that has passed me by.

In the Scottish Executive's written evidence, there is a reference to the expression of concerns about higher still from members of the education community. Mr Osler, will you tell the committee how far back those concerns go, to whom they were communicated, what exactly the concerns were and how those concerns might have impacted on this year's problems? Furthermore, the submission says that the Executive considered the concerns expressed by stakeholders very seriously. What was done to respond to those concerns?

John Elvidge:

With respect, that is not a question purely for Mr Osler. The education community's opinions do not come in through any one channel. The department has a variety of ways of contacting the stakeholders. I will do my best to answer that question.

Nicola Sturgeon:

I am not really bothered about who answers the question—I just want some answers.

John Elvidge:

The dialogue with the stakeholder community goes back a long way; a series of groupings in which stakeholders could be consulted have been carefully maintained in the process of thinking about what needed to happen and when it should happen. Clearly, the dialogue about when and how to introduce higher still stretches back a long way.

From the evidence of the previous year in particular—we have supplied the committee with a lot of evidence of discussions in groups where stakeholders were present—there is no thread of feeling that the introduction of higher still this year was a mistake. Instead, there is a series of discussions about how to manage the process. Concerns that were expressed further back in time about whether it would be premature to introduce higher still—the concerns that led to the various postponements—had moved down a hierarchy of concern to narrower issues about how the implementation was made to work. This year, the department received remarkably few direct representations from any source about the process of implementation. Between October 1999 and June 2000, we received precisely six letters to ministers from any source about that set of issues.

Are those letters from individuals or representative organisations? I am sure that you appreciate the difference.

John Elvidge:

There is a difference. The letters are all from individuals. There is a natural explanation for that, because the representative organisations were all in a structured dialogue with the SQA and us and, by and large, did not need to write letters in order to communicate their views. We frequently sat around the table together and they expressed their views. In the context of those discussions, it is interesting to note that those education authorities that decided to submit detailed views on their experience of implementation were a small minority. The impression that I get on reviewing the record of the dialogue with stakeholders is that they had moved into a constructive partnership with us and the SQA about how to manage the detail of the implementation process.

Mr Brian Monteith (Mid Scotland and Fife) (Con):

I would like to pick up a similar point to the one that Nicola Sturgeon explored on the piloting of standard grade and the introduction of higher still. It would appear that teachers managed to cope with the introduction and that the two delays stemmed from concerns about the preparedness of teachers in schools and the SQA. However, it was then agreed that progress should be made and that higher still should be implemented. It appears, from media coverage and from communication that MSPs have received, that most of the concern that was expressed about higher still came from teaching staff about their preparedness.

Nevertheless, the political decision to go ahead with implementation was taken. Only after that decision was taken did reports of concerns relating to information technology, exam markers and so on begin to emerge. However, by that time people were sitting exams and others were beginning to mark the papers. Some reports go back as far as October 1999, but the main reports came through in March and April 2000, by which time it was too late.

There was great focus on whether schools and teachers were prepared. With hindsight, do you think that the department paid enough attention to the preparedness of the SQA, rather than that of teachers?

John Elvidge:

That is an extremely good question. Higher still appears to have worked in schools. It has not been perfect or absolutely smooth, but by and large, the schools have delivered. It seems fairly clear that the point when things went wrong was when the SQA received the outputs from the schools.

I find it helpful to work through the succession of issues that arose during the year—which one can trace from the papers—and to ask whether they suggest that lack of preparation on the part of the SQA was a problem.

There are four sets of issues that can be traced through the documents. The first set of issues surrounded the registration of candidates and entry for examinations. That was the dominant consideration in people's minds from around October last year to about March this year. It is clear that the process did not go as smoothly as it should have. It is equally clear that, by working together, the SQA and schools managed to resolve the practical implications of that difficulty.

The next issue that comes to the forefront of everybody's minds is the exchange of assessment data between the centres and the SQA. The system appeared to work perfectly well for some centres, but not for others. However, as far as one could tell, the difficulties were not the centres' responsibility. By that I mean that it was not, by and large, something that the centres did wrong that accounted for the difficulties. Quite a lot of energy was expended on that issue, as it was thought that it might be indicative of a general problem with the IT systems, which might impact on the issue of results. The SQA spent a lot of time investigating case by case what was going wrong but found no consistent pattern, which is always frustrating. Life is a good deal easier if a problem has one head that can be cut off, rather than the problem appearing to have multiple causes.

Broadly speaking, both we and the SQA satisfied ourselves that the IT systems were, essentially, working as intended. I would not argue that the IT systems and the relationships between centres and the SQA work as perfectly as one would wish. However, it was established that nothing had gone so wrong with the planning of the SQA's IT systems that the exam results could not be expected to run properly. From the evidence that the Enterprise and Lifelong Learning Committee received yesterday, that conclusion appears to be correct, bearing in mind my caveats about whether we are sure we know precisely what went wrong.

There was a period when people were concerned about markers and it is clear that there was a difficulty in recruiting the last small cohort of markers. That problem was also resolved adequately, but later than one wished. I understand that the root of that difficulty was that the SQA was not as well placed to predict the number of markers that were required for each subject as it would have been in a more conventional year. Therefore, the estimate of what the SQA needed came later than was desirable. However, I am not sure that that is a planning fault—it might be one of the inevitable consequences of having one's first experience of a new practice.

During the last phase of the problems, which began at the end of June in our view, attention moved to what proved to be the crucial issue at the end of the day—the SQA lost track of some information. For a long time during that period, we thought that that was somehow a manifestation of IT problems. People talked about the problem as if it were a data transmission or IT problem. If the evidence that was given to the Enterprise and Lifelong Learning Committee yesterday is correct—I have no reason to suppose that it is not—that problem was nothing to do with IT systems. It appears to have been because of the SQA's procedures for handling physical pieces of information—pieces of paper. During yesterday's meeting, people said that that was incredible. I am not sure that anyone who visited the big shed in Dalkeith, which is full of pieces of paper on racks, would find it quite so surprising that pieces of paper could be mislaid. However, it is surprising because handling pieces of paper is precisely what the Scottish Examination Board has always known how to do well. That does not lead one to the conclusion that a lack of pre-planning was eventually the crucial factor.

The Convener:

We will return to questions on IT, data management and marking arrangements. I am sorry to have interrupted you.

John Elvidge:

I am not saying that planning could not have been better. Planning is one of those things that could always be better. I am not saying that clearer project planning by the SQA would not have helped to deal with some of the difficulties that arose along the way; I am questioning whether better advance planning would have helped prevent what went wrong.

Mr Monteith:

Thank you for that clear and full answer to my question. Your answer suggests what I have suspected for some time—there was not one problem, but several, which added strains to existing strains, and that compounded the difficulty.

Given that one can predict that there will be an effect further down the line if something goes wrong, would it not have been better to run several pilots? Almost every other educational policy that has been introduced by the Scottish Executive or the Scottish Office in recent times has been piloted.

John Elvidge:

That is an imponderable question. Who knows what would have happened if we had piloted higher still? The strains of a situation such as we have experienced arise only when one runs the system at full capacity. One can run the system at a fraction of its capacity and everything can seem fine. It is only when one tests the system in real life and at full capacity that one discovers whether it will work.

We offered the SQA advice on IT issues in April and told it that we knew that it had carried out certain kinds of testing on its IT system. We also told the SQA that it had to test the system at volumes at which it would have to operate and that until it had done that, it would not know whether the system would stand up to the strain that it would face. The argument against piloting is similar.

Piloting worked for standard grade. Before one tests a jet engine at full throttle, one tests it at half throttle. Do you not agree that, in retrospect and with all the facts that we now have, it was a mistake not to pilot the scheme?

John Elvidge:

I could not conclude that from the information that we received. The fact that we use piloting in many situations demonstrates that we believe that it can have advantages. However, from the information that is available to us, I do not conclude that the decision not to pilot made any difference in this case. I sense that Douglas Osler is itching to make a contribution.

Douglas Osler:

We are in danger of holding standard grade up as a model for introduction of a new examination system, but history would not bear that out. Higher still has received far more support from all the main stakeholders than the introduction of standard grade received, and higher still has been consulted on far more extensively than any previous development programme that I know of.

I do not want to quibble over the wording, but technically, standard grade was not piloted. Part of the reason why standard grade took so long to introduce was that it was held up by a period of industrial action, which delayed it for about two years. Standard grade was phased in: groups of subjects were introduced in four separate phases.

I understand that the exam boards that were responsible for that found it difficult, because several subjects ran in parallel. Rather than one subject having one course, subjects such as English were being run in two different forms. That was demanding of markers, IT systems and so on. It was also confusing for young people, parents and employers. Over a long period, people left school with standard grades and O-grades—the relationship between the two caused much confusion.

When higher still was being introduced, all those issues were discussed. It was decided that the standard grade phasing had not been a good way to introduce a new scheme and that a big bang approach would be better for the system. However, that did not happen with the introduction of higher still—some higher still subjects ended up being partly phased in. That meant that the format of introduction was similar to that used for standard grade, although it was not exactly the same.

Nevertheless, there was no degradation of O-grades or standard grades—at least, not that I am aware of. Were ministers asked to make a decision on whether to pilot?

John Elvidge:

I am not sure whether we are allowed to tell you that. We are not talking about our present group of ministers; we are talking about the business of previous Governments.

Mr Stone:

You might want to reflect on that answer.

There is some circumstantial evidence that suggests that although the teachers at the chalk face used every endeavour to implement higher still—such is their professional ethos—there was some consternation. It has been suggested to me that HMI was aware of the situation and that representations were made to HMI by high schools and secondary schools throughout Scotland. In some way, that information was not relayed back to you, the civil servants in the Scottish Executive education department.

John Elvidge:

Regarding Douglas Osler and the other members of the inspectorate as if they were a different species is a misconception of the way in which the Scottish Executive education department operates—they are, for most purposes, officials of the department. We should be regarded as knowing what they know. We are not free from imperfections in internal communication—no organisation is.

Throughout that crucial period, we had many opportunities to sit down with representatives of the teaching unions and the education authorities, as well as having continuing contact with individual schools. From the papers that the committee has received, it will be clear that the process was not without bumps, but in the majority of cases, the message that we received was that implementation was going ahead as planned. The excellent job that schools have done bears that out—they taught the courses and prepared their candidates for the exams. The system was not breaking down.

There were irritations over whether materials were being made available as quickly as classroom teachers would have liked, and there were sometimes misunderstandings over the freedom for manoeuvre that classroom teachers had, which might have removed some of their frustrations. However, the message from our sources of contact about what was happening in the schools is not that the process was proving unworkable—or anything that closely approximated to unworkable—but that it was generating the kind of teething troubles that one would expect.

Given what you know now, do you accept that you were not aware of certain signals and storm warnings that were coming from the chalk face?

John Elvidge:

We were aware of the warnings—that is what I have been trying to say. The evidence that has come to light includes communications from the SQA to schools. That communication plainly acknowledges the existence of anxieties in the schools and seeks to work with the schools to deal with their anxieties.

I am constantly trying to refrain from leaping to conclusions about what happened, because the committee and others are in the middle of a process of trying to pin that down. However, although it is possible to improve the processes that govern how higher still functions in schools, the schools did an excellent job. What happened in the schools is not a contributory factor to the problems with the summer diet of exams.

The Convener:

I am anxious to move on, but a couple of members still have supplementary questions. I shall invite them to speak, and we will then move to the next section.

Fiona McLeod (West of Scotland) (SNP):

Let us return to the evidence that was presented to you and the way in which you used it in the implementation of higher still. You talked about six letters that were sent to the Scottish Executive. I would like you to think about the consultation and representation that was made to you in terms of its content and source, rather than its quantity. It would be interesting to know exactly what was said to you in those six letters.

There seems to be a contradiction in what you have said. When you answered Nicola Sturgeon's question, you said that most of the representation that you received concerned the impact on the production of the exam results. However, in paragraph 4.4, on page 23 of your submission, you say:

"Schools and colleges did raise concerns . . . but far more feedback was received on implementation issues relating to learning and teaching".

I return to a question—one that has been asked more than once—about the phasing and piloting of higher still. If the representation that you received was about learning and teaching, why was the decision made to go ahead with higher still this year?

John Elvidge:

I do not recollect saying to Nicola Sturgeon that the representations that were made to us were primarily about handling of the exam results. If I gave that impression, it was a false one. The letters that we received were about a combination of what was happening in the schools and people's anxieties about bits of the process.

Typically, the letters focused on individuals' experiences, which led to worry that such experiences were widespread. The majority of the contacts that we had with people in schools naturally concerned learning and teaching aspects. It is difficult for the average teacher to see into the exchanges between their school and the SQA, and it is certainly near impossible for them to see into what is happening inside the SQA. One would expect teachers to talk to us about their first-hand experience—the process of teaching the courses.

As I tried to say, nothing in those representations suggested that the delivery of higher still in the schools was in any way near to failure. There were irritations for individual teachers and, as we know, individual teachers hold differing views on higher still. Some teachers believe strongly that the process of internal assessment is simply wrong in principle, but that is not a majority view. Some teachers felt that materials that were produced centrally for their subject could have been made available sooner—and they were right. The materials for most subjects were produced on time, but some lagged behind a little.

What one heard from the classroom seemed perfectly natural and was entirely consistent with the facts as we knew them. It did not lead one to the conclusion that something was going fundamentally wrong with the delivery of higher still in schools. The evidence suggests that nothing went fundamentally wrong with the delivery of higher still in schools. The problem is of the SQA's making, not the schools'.

The Convener:

Nicola, has that answered your question as well?

Nicola Sturgeon:

I have two further questions—we still have not received answers to some points that have been raised.

Are you confident that the concerns that may or may not have been raised by teachers—for example, through the higher still development unit—were getting back to you and, by extension, to ministers? The picture that you paint does not accord in all respects with the reality of the past few months, during which it seems that there were real concerns in schools that things were not as they should be. An example of that—this is a delivery problem—is the difficulty that schools had in communicating information to the SQA and that the SQA had in processing that information. It amazes me that you have received only six letters this year about the problems that were associated with higher still. Perhaps that is because those concerns were being directed through the higher still development unit. If that is the case, can you guarantee that you were being made aware of those concerns and that those concerns were being acted on?

My second question is one to which we have still not received an answer. Can you tell the committee—to the best of your knowledge—whether piloting of higher still was ever discussed in your department?

The Convener:

I am not sure that we have not received an answer to that question. Please make your answer concise, as I am aware of the time.

John Elvidge:

I shall be as concise as possible, although these are complicated matters.

It would be foolish of me to give an absolute guarantee that every representation that anybody made found its way through the system. A significant volume of comment came through the groups of stakeholders who we were consulting on the problems of transmitting information. From the evidence that the committee has, it should be clear that no one who was dealing with the issues was unaware that there was a problem on a significant scale. Regardless of whether every warning got through, there was certainly a substantial body of warnings and, as I said to Mr Monteith, those were dealt with systematically.

In answer to Nicola Sturgeon's second question, I have said that I am debarred from saying what discussions took place with the ministers of a previous Government.

What about the current Government?

John Elvidge:

I am also debarred from saying what advice we gave to the current Government. The committee needs ministers' agreement to release information about our advice to them, not my agreement to tell the committee what we might have said to ministers.

The Convener:

I am sure that we will return to that at some stage, but I am anxious to move on. We will address a specific issue that has been raised and will then progress to the chronological order of the issues that have been discussed. Let us begin with discussion of the introduction of the IT system, data management and whatever followed that.

The new IT system was obviously a major project. Can you explain the difference between the data communication problem and the IT system problem? Those seem to be two distinct difficulties.

John Elvidge:

Indeed. When I talk about the IT system, I am talking about the system that the SQA operates internally, which it uses to process the information that is available to the SQA. When I talk about the communication of data, I mean the electronic passing of data from computer systems in the centres to the SQA's computer system. For some centres, that link did not work as well as it was designed to, which led in some cases to paper being substituted for the electronic transmission of data.

I would like clarification of the situation. Was the SQA computer system—which was different from the computer systems that are used in the various local authorities—unable to read the data from the local authorities' systems as intended?

John Elvidge:

It is not the case that the SQA computer was incompatible in principle with any of the systems that are used in schools, because the communication worked fine for many centres. In many ways, it would be a relief if one could say that we know what the problem was—that one of those systems that are used in schools was not working, for example—but it is not as simple as that. By and large, the systems in schools seem to have done their job and the SQA's system seems to have done its job. However, for reasons that are not simple, that linkage did not work in a proportion of cases.

Ron Tuck talked about that to the Enterprise and Lifelong Learning Committee yesterday, but I do not profess to have his depth of knowledge about what the SQA found when it checked the individual instances of what had gone wrong.

Earlier, you said that the problem is continuing. Is the system still not working properly?

John Elvidge:

We are still not in a position to say that electronic communications between every centre and the SQA are guaranteed to work smoothly. One of the pieces of work that the SQA is carrying out at the moment, as part of its exercise to ensure that there is no repetition of this year's problems next year, involves simulated testing in an artificial environment of all the systems that are used in centres as well as the system that is used in the SQA centre. That is being done to check whether any of those systems misbehaves in a way that—although apparently random—can be tracked down to a feature of the system.

According to reports, although the computers were supposed to talk to each other, all the data were eventually input manually. You returned, in effect, to a paper system and input the information into a computer. Is that true?

John Elvidge:

I cannot claim to know the breakdown of data that were entered electronically and manually.

We can ask the SQA about that.

I am sure that we can ask about that elsewhere.

Were you satisfied that the SQA had the IT knowledge to be able to introduce a brand new software system, make it work properly, pilot it and produce the results?

John Elvidge:

We were satisfied that the SQA had used reputable and established firms of advisers to design its software and that it was building the system on a widely used database system, so that there was nothing novel or risky in the foundations of the system. Although I cannot pretend to be able to judge the quality of another organisation's IT staff, our contacts with the SQA have not suggested that there is any reason to doubt the professional knowledge of its internal staff.

Mr Tuck's submission says that you sent in an IT specialist at some point in March. I could find no reference to that in your submission. Is it referred to there?

Yes, it is.

It is there, but I missed it. That is my fault. Sorry.

Cathy Jamieson (Carrick, Cumnock and Doon Valley) (Lab):

I want to examine the relationship between the education department and the SQA, to find out how some of the problems were brought to light.

Reading the minutes of the liaison meetings between the Scottish Executive and the SQA, I was struck by the fact that an issue was raised in November 1999 in relation to previous standard grade exam appeals. A computer problem had been flagged up and reassurances were given that that could not happen again because a new awards processing system was being introduced. The error with the standard grade exams appeared to have occurred because the wrong program had been used. Did that not give some warning that all had not been well in the past and that, even with a new computer system, all might still not be well?

John Elvidge:

The only inference that I would draw from that is that mistakes happen, even when organisations have run the same processes many times. I am not led to conclude that there was any inherent reason to doubt the SQA's ability to handle its IT systems, although that possibility is the reason why we asked the questions.

What prompted the Scottish Executive to send an IT expert in and when was that decision made?

John Elvidge:

I shall try to be brief and sketch what I regard as three phases of our relationship with the SQA. The relationship changes demonstrably at points in this story.

Until the beginning of March, our relationship was entirely normal. A series of pre-planned, structured meetings took place and whatever questions we had to ask were asked in the context of those meetings. We were going through what we regard as the normal process of ensuring that we were informed and that we were raising matters that worried us.

From the beginning of March, we began to worry—largely because of questions that were being raised about data exchange—that reliance on that normal relationship was perhaps insufficient. We started to deal with the SQA outside the usual pattern of meetings. We did things that could by no stretch of the imagination be regarded as a part of our normal relationship with a non-departmental public body, such as suggesting that it take advice from our deputy director of information technology. We took that decision between early March—when discussion of data communication problems began to be widespread—and April, when the deputy director of IT visited the SQA. After we had spent some time trying to understand what was going wrong, we concluded that it would be helpful for the most experienced member of our IT staff to visit the SQA and offer it a peer review of its processes.

Who did that person report to? Will a copy of that report or its key findings be made available to us?

John Elvidge:

The key findings are summarised in one of the documents that you have been given.

Is that document the letter to David Elliot?

John Elvidge:

From memory, I think that the key findings of that visit are summarised in a letter from Alastair Wallace to David Elliot.

The report by our deputy director of IT was widely shared among people dealing with the SQA.

I will press you on that, as I do not think that the letter gives much information. To whom was the report passed at that stage?

John Elvidge:

I cannot from memory give you a full list of the people who saw it. It was not a report as such—it did not have a front cover and it did not set out the deputy director's investigations and findings. It was a statement of a number of suggestions of actions that the SQA should ensure that it took, largely as a matter of prudent contingency planning. The report did not consist of much more than a list of what those actions were, and that is largely what you will find in the letter from Alastair Wallace.

Are you content that the SQA took on board the report's recommendations?

John Elvidge:

In discussions over subsequent weeks, we asked the SQA what it was doing. We clearly identified—or the SQA clearly told us, and I have no reason to doubt it—that it had followed up many of the recommendations. If you are asking me whether I can say, hand on heart, that the SQA carried out all the actions in precisely the way in which we suggested that it should, I cannot say that I know that.

There are some other points to which I will return, but I am aware that other members wish to ask questions.

I recall reading somewhere in this tome of evidence that the SQA said that the IT information and advice that it had been given was not relevant and that it did not use it.

John Elvidge:

I think that you are referring to Ron Tuck's evidence to the Enterprise and Lifelong Learning Committee—I have not seen his evidence to this committee—which I would paraphrase as saying that the advice did not tell the SQA anything that it did not already know that it should be doing.

It is maybe unfair to ask you to paraphrase what someone else said. I am sure that we can raise that point elsewhere.

I would paraphrase Ron Tuck's statement as saying thanks, but no thanks. What was your response to that, given that you had sent someone in to respond to concerns coming from many sources about IT and data management?

John Elvidge:

It is not my impression that that is how we perceived what was happening. Our perception was that, whether as a result of our suggestions or of its own thinking, the SQA was doing most of the things that our deputy director of IT had suggested would be sensible for it to do.

Johann Lamont (Glasgow Pollok) (Lab):

The evidence that we have heard was that the IT expert had no particular advice to offer. If there is a report that identifies things that ought to be done, logic dictates that your department would have a series of tick boxes in which it would subsequently be confirmed that those things had been done. The SQA was saying that there was no particular advice, but you then pressed the SQA, so I presume that it then realised that there had been advice. Did you know whether it was doing something? Was someone specifically responsible for checking that? Is there a report that identifies what ought to have been done?

John Elvidge:

There is not a report as such. Our deputy director of IT had a long discussion with David Elliot at the SQA. He did not come back and write a report, because the purpose of his visit was to offer helpful advice, not to write a report on the state of the IT systems. However, there was an identifiable list of suggestions that he made, so I do not agree that we had no particular advice to offer.

Did your department have a means of checking whether that advice was taken?

John Elvidge:

Yes. Well—

If you were concerned about the way in which the system was working, you would give specific advice, and then either ensure that that advice was taken or find out why it was not taken. You would not just hope that it was taken.

John Elvidge:

We raised all those issues in the course of subsequent discussions.

What would the consequences have been if advice was not taken? What was the next step?

John Elvidge:

I am aware of no instance where we felt that the SQA had not taken our advice, in some way or other.

Yet the SQA has said that no particular advice was offered. You think that it was taking advice; it does not think that there was any advice.

John Elvidge:

Members have in front of them a letter from us to the SQA, which identifies a number of pieces of advice that we offered. Members are as well placed as I am to decide whether that constitutes no advice.

But the issue is whether that was pursued, to see whether the advice was taken.

John Elvidge:

We did pursue it.

We have information in front of us, and there seems to be an inconsistency. That is why people are pressing you on this, to get your view.

John Elvidge:

Of course.

Mr Monteith:

We hear about IT and we hear about data processing. I would ask that questions and answers be specific as to what those terms mean. Yesterday, in the Enterprise and Lifelong Learning Committee, we had the same problem. IT can cover a wide variety of things. We need to talk about both hardware and software. Some advice was specifically about hardware, and some was specifically about software. It is important that we differentiate between them.

The Convener:

Members and witnesses will take that on board. We should be clear about what we are referring to.

Michael Russell (South of Scotland) (SNP):

I would like to ask about the letter from Alastair Campbell that has been referred to us—sorry, from Alastair Wallace, although it is spun almost as well as Alastair Campbell would have spun it. Looking through the letter, I can see no specific advice whatsoever. Indeed, the letter says:

"we had no reason to doubt the thoroughness of SQA's analysis or the robustness of its planning."

It goes on to say:

"we recognise that SQA's basic analysis and project planning is sound."

That does not sound to me like specific advice on things to do; it sounds, frankly, like a commendation of what was taking place.

John Elvidge:

It may help me if I look at the letter. My recollection is that that letter also refers to a number of things that we suggested would be helpful for the SQA to do. It is true to say that the impression was formed that there was no fundamental problem with the IT systems. Our suggestions were about prudent checking of various things and about contingency planning.

Michael Russell:

The letter comments on various things that are happening, but that is pretty far from being an assessment of the SQA's systems and where the problems lay. I know that you do not want to mislead the committee, but many of us thought that you were talking about an assessment of the situation, of where the computer system was going and of any problems. We did not think that you were expressing some satisfaction with the way that things were going and were simply making some suggestions.

John Elvidge:

I apologise if I have given the impression that what we did was other than it was. There was an extended discussion, the broad conclusion of which was that there was no identifiable major flaw in the SQA's systems. However, we thought that there were some checks that it would be sensible for the SQA to run and listed those. In following up, we asked the SQA whether it had carried out the checks.

The broad conclusion of the letter is:

"we recognise that SQA's basic analysis and project planning is sound".

John Elvidge:

Yes.

That was your position then.

John Elvidge:

Yes, and it is my position now.

The evidence may suggest otherwise.

John Elvidge:

I would be interested to hear what evidence you think runs contrary to that.

The distress caused to pupils and others is the evidence, Mr Elvidge.

The Convener:

I would like to move on.

Mr Macintosh:

As far as I can work out, the letter makes three specific suggestions. I am interested in finding out about your relationship with the SQA. You say:

"we would be grateful for information on your plans for volume testing at, say, 25% above the highest expected volume of output."

Did that testing happen? You go on:

"It would be helpful to have a note of what priorities SQA have in mind if problems arise, (for example, I understand Music takes a particularly large amount of processing time)."

You then offer the use of PCs and of your integration centre

"to simulate data interchange between centres and SQA, if this would be helpful."

The key thing that we need to know is whether those suggestions were followed up. What was your relationship with the SQA at this point? Did it take advantage of your offer? We will speak to Mr Tuck later, but his response appears slightly dismissive.

John Elvidge:

I will need to deal with the suggestions individually. The basic answer is that we followed up the letter. As I have already said, the SQA told us that it had undertaken volume testing of its system. As far as I know, that is an accurate statement. We had subsequent discussions with the SQA that demonstrated that it was thinking through the prioritisation issues and that it was going through the process of analysis that one would expect.

On the offer of our integration centre, my recollection is that the SQA said that it would prefer to do the testing itself in another way. The SQA was worried about having first to resolve some issues of commercial confidentiality attaching to each of the proprietary systems that it would be testing. As I understand it, the SQA was anxious that a number of software providers had designed different systems that were intended to incorporate the best features that they could incorporate. The SQA did not want to leave itself open to accusations of having contributed to some form of industrial piracy by allowing the intellectual property of one company to be transferred to the system of another. I understand that it was working on finding a way round that anxiety, so that it could undertake the kind of testing that we suggested. The SQA did not disagree that that kind of testing has value.

Mr Macintosh:

This may be internal documentation, but I could not find the report or memo of the IT expert who visited the SQA. It would be helpful to have that. It would also be helpful to have some reference to how these issues were followed up, not necessarily in written form.

John Elvidge:

You should be able to find the evidence of the follow-up in the documents. By and large, it took place in meetings that we had with the SQA, especially in the meeting on 28 April. We will try to make that more accessible.

That would help.

Johann Lamont:

You said that you asked the SQA to say what its priorities were when there were difficulties, and that you were confident that there was evidence of the SQA thinking about those. In hindsight, was it sufficient that the education department had evidence that the SQA was thinking about priorities, or would it have been reasonable to expect it to say what its priorities were, given that that was identified as a problem?

John Elvidge:

The evidence to show that the SQA was thinking the matter through consisted of it telling us its priorities.

So the SQA was saying to you, "We have thought this through, and these are our conclusions"?

John Elvidge:

Yes.

The education department's role was then to monitor the situation, by asking for specific responses and checking that those responses were received.

John Elvidge:

Yes. It is a constant of our relationship with the SQA that we cannot pretend that we are the people who know how to run the processing system of examinations. The SQA has the people with the professional expertise. Our role is to say, "Show us that you have thought about this and explain to us the conclusions that you have reached." Substituting our judgment for theirs would be extremely difficult.

Johann Lamont:

What would your role have been if you had been unhappy with the SQA's response? Given that you cannot substitute your expertise for the SQA's, at what stage would you say, "Wait a minute. We do not think that your thinking through is rigorous enough", or "This is creating anxieties for us"? How would you intervene, given the distinction that you have drawn?

John Elvidge:

In the framework of regular meetings, we are not shy creatures. We have no difficulty in saying when we are not given an adequate explanation and asking for further reassurance. If we had thought that the SQA's conclusions were manifestly illogical or wrong, we would have continued to press it towards a conclusion that seemed more satisfactory to us.

So throughout the whole process, you were reassured by what the SQA was doing?

John Elvidge:

Yes, that is true. I spent quite a lot of time thinking about the issue of reassurance, as I am conscious that part of the currency of this debate is the question of how we could have accepted reassurances when things went wrong. I have also been thinking about the general question of how one knows when someone else is not giving one a satisfactory explanation of something on which they are an expert.

There are two basic things that one can do: when that person purports to provide factual information, one can try to verify that against some other source, and one can look for the internal consistency and logic of what they are saying. If the reassurances pass those two tests, it is difficult to get beyond them, and the reassurances that we were given passed those tests. Despite the fact that Michael Russell thinks that that is inconsistent with what happened, it is consistent with the view that it was not the functioning of the IT system that led to what happened. We still have no reason to believe that the kinds of issues that we were raising with the SQA were the root of the problem.

Johann Lamont:

It may have been the general process of monitoring that created the difficulty. You received reassurances but, because of the barriers that you have identified, you did not have the capacity to establish that that was really what was happening.

John Elvidge:

Short of having a member of staff standing behind every member of staff of the SQA, it is not possible to obtain absolute certainty that everything that one is told about what that organisation is doing is 100 per cent true. We subjected the SQA's reassurances to the tests to which it was possible to subject them.

Marilyn Livingstone (Kirkcaldy) (Lab):

We discussed this issue yesterday in the Enterprise and Lifelong Learning Committee. The evidence that we received from Ron Tuck about the APS development is crucial. He said:

"At all our regular liaison meetings with Scottish Executive officials, progress with Higher Still implementation was discussed in great detail."

The APS development was discussed and

"was running somewhat behind schedule, but not critically".

Ron Tuck assured us that the issue was not critical.

I asked him about the implementation of the software. He said that, at that stage, the problem was data management and that the

"crucial flaw which led to the problems with August certification lay in the management of data. Other problems such as software"

and other issues

"did not. . . . impact directly on the accuracy of certification, although these factors did exacerbate the data management problems."

It became clear through all our discussions that the problem was data management. That was my impression. Cathy Peattie might want to comment on that too.

I am anxious not to repeat what happened yesterday. I am sure that that is useful information that we can pick up on.

I am just saying that we were told clearly yesterday that the APS was not the issue, so maybe we should not go back to exploring that.

Fiona, do you still have a question?

Fiona McLeod:

It is really a statement. I wonder what it takes to set alarm bells ringing in the Scottish Executive. Throughout the questioning, we have heard that you never thoroughly checked out the reassurances that you were given. What does it take to make the Scottish Executive say, "Let's investigate a bit further and ensure that we are getting the right answers"?

John Elvidge:

That is a bit of a cheap shot. I am not sure what your definition of thoroughly checking something out is.

We want to hear your definition.

John Elvidge:

I have given you at some length my definition of the processes that it is possible to follow to try to check that something that one has been told is true. A judgment must be made about whether individual questions are central to the issues. Throughout the process, we discussed with the SQA a wide range of emerging issues that might have been a source of anxiety. We pursued each issue in a way that was proportionate to the risk that it appeared to pose to the outcome. We followed a proportionate process of checking with the SQA.

Can we move on, if there are no specific questions?

Nicola Sturgeon:

On the same general theme, I am interested to know what further information you can furnish us with to help us resolve some of the questions and form a judgment about the rigour of your relationship with the SQA. I read the memorandum that you sent, which notes concerns about data management and data processing and about delivery of the examination results. Sometimes, vague assurances from the SQA that things were all right were accepted at face value. The SQA once reportedly told you that

"we remain optimistic that the diet will be delivered successfully."

That was hardly a hard and fast assurance that things were okay. What information can you give us to allow us to form a picture of the questions that you asked, the steps that you took to investigate the vague assurances and the efforts that you made to find out what was really going on in the SQA?

John Elvidge:

We are happy to give you a fuller account. Notes of meetings necessarily constitute a summary of what may have been long discussions. We are happy to try to give the committee a better flavour of the issues than it can get from the documents available.

Nicola Sturgeon:

That would be useful, because those issues go to the heart of our inquiry.

The memorandum refers to other forms of assistance that you offered the SQA. Can you go into some detail on what those forms of assistance were? How did you respond when, as this memorandum suggests, the assistance that you offered was rejected out of hand?

John Elvidge:

We were not providing itemised lists of things that we thought might help the SQA. Help comes in two forms: people and money. We were saying that if the SQA could find any way of using either of those things, we would try to help it. The SQA's response, generally, was that it thought it had the resources that it needed. It had employed significant numbers of extra staff to help deal with the problems and it did not need our money to do that.

Another constant theme of the SQA's response to us was that, when one is in the critical phase of a major processing operation of this kind, introducing untrained staff—who have to be trained by the staff who are getting on with the job—would create more disruption than value. That is a managerial judgment. It is not an implausible argument, so we listened to it and accepted the SQA's judgment. In my experience, when people say that they do not want help when one offers it, their decision is usually founded on a sound judgment. We normally face the opposite situation, with people asking for more resources.

Nicola Sturgeon:

That in itself might have set alarm bells ringing.

This is something of a theoretical question, but I want to ask you about the legal relationship between the Scottish Executive and the SQA. To what extent do you think it would have been open to you, through ministers or HMI, to intervene if you thought that help that you were offering and that was being turned down was required and that advice that you were giving to the SQA was not being followed? Could you have intervened to ensure that certain things happened?

John Elvidge:

There are two parts to your question. I would describe what we did, particularly between the end of June and the critical date, as intervention of a sort, in that it bore no relation to the normal relationship between a non-departmental public body and the department. We were offering advice and putting our minds to the problems faced by the SQA on a completely abnormal scale. To go beyond that, we would have had to find a set of circumstances in which one could identify a particular thing that we knew would solve the problems and that for some reason the SQA was refusing to do. In this set of circumstances, I cannot identify such a thing.

Are you saying that you could have intervened? If that is what you are saying, was Sam Galbraith wrong to say in Parliament that he had no power over the SQA?

John Elvidge:

I do not think that he said that in Parliament. He gave a rather lengthy explanation of the nature of his powers and of the way in which they are hedged around. I am saying that I cannot see a set of circumstances in which those powers, whatever they may be, could have been brought to bear in this case.

But the powers existed.

John Elvidge:

It is a matter of record in the legislation that the powers exist. Matching them up to this set of circumstances is an entirely different thing.

Unless anybody is desperate to ask a question, I would like to move on.

Johann Lamont:

The fact that you cannot think of circumstances in which you would intervene may explain why there was not any intervention, because at each stage the reassurances given were sufficient. I would like you to reflect on how you manage the rest of your department and the way in which tasks are allocated to staff. What are the obvious things that you as a manager were unable to do that you would have done if it had been in-house?

John Elvidge:

That is a big question. If this had been an entirely in-house operation, all the relationships would have been different. I am hesitant about saying that if it had been in-house, the problems could have been solved. I go back to Ron Tuck's evidence to the Enterprise and Lifelong Learning Committee yesterday. He was unaware, as chief executive, of the things that eventually went wrong. I cannot put my hand on my heart and say that if the operation had been in-house we would necessarily—simply by virtue of that—have been able to spot something going wrong that the management of the SQA was not able to spot. I am not saying that nothing would have gone wrong. I cannot say with certainty that because we were running an operation of this kind, we would have been infallible and would have spotted a problem that seems to have occurred a long way down the organisation.

But did you have a different process for ensuring that your advice was pursued?

John Elvidge:

Indeed. We do not have to give anybody any advice if we are running something—we just do what we think is right.

That is clear. I am conscious that others want to ask questions. I will bring them in at the end if we have time. We move to Mike Russell's questions.

Michael Russell:

You talk about your reluctance to intervene. Eleanor Emberson might want to pass you a letter from her dated 17 July 2000. Reluctance to intervene does not appear to be part of it. Committee members should comment on it, but the letter appears to be a series of instructions from you to the SQA. The letter says:

"By 21 July, you will have to consider the plan for data processing".

The following paragraph says:

"When the main data file is being closed, you will identify".

The paragraph after that says:

"The possibility of moving the 10 August publication date will have to be considered".

The next paragraph says:

"It might . . . be sensible to go through some of the arrangements".

I welcome your comments or Eleanor's on whether the letter was helpful or whether it was a set of instructions, which would have meant that you were by that stage running the SQA in all but name.

John Elvidge:

I do not dispute that we were intervening or that writing a letter like that is a long way outside the normal conduct of the sponsorship relationship, but that does not constitute running the SQA. We were leaning heavily and producing something that looks much like a checklist of actions. You are right to say that that letter is evidence of a relationship that has travelled a long way towards intervention.

Michael Russell:

We have already discussed the letter of 20 April from Alastair Wallace, which is not just about IT but about all the arrangements within the SQA. It says:

"This . . . enabled us to give reassurance to ministers about the true position"—

—and you call it the true position—

"including that we had no reason to doubt the thoroughness of SQA's analysis or the robustness of its planning."

You also say:

"we recognise that SQA's basic analysis and project planning is sound."

You have said that you will provide us with notes of meetings that the minister has indicated took place on 28 April; 27 June; 7, 14, 21 and 28 July; and 2, 4 and 9 August, but in the papers that you gave us there is nothing from your side to indicate that there was any great problem until this extraordinary letter dated 17 July. What happened in those two and a half months that changed your position from saying that everything was fine to giving instructions—as you admit—to the SQA about how to do its business?

John Elvidge:

That is a good question. A discernible change in the nature of the relationship can be located quite precisely at the end of June.

In answer to an earlier question, I sketched out what I saw as the first change in our relationship with the SQA, which took place in early March. I would characterise the relationship between early March and the end of June as one of trying to ensure that the SQA had in its sights what seemed to us to be the appropriate issues. However, we were taking it as given that, once issues were identified, the SQA would be capable of dealing with them.

As you will see from the evidence, written assurances were given internally to the SQA board and others about the position in the week ending 23 June. At the beginning of the following week, on 26 June, Eleanor Emberson made a telephone call to the SQA in which information emerged that seemed to us to contradict the impressions that had been given as recently as the previous week.

With whom did that call take place?

John Elvidge:

With Ron Tuck. Through that telephone call, we came into possession of information either about the number of scripts that were still unmarked or about the number of markers still being sought. For the first time, reference was made to the issue of the missing data. Knowledge of those things seemed to us to place us in a different position and made us feel less secure in the belief that we could operate in the same mode as we had been operating in.

From that point on, we moved into a relationship of greater intensity of meetings and greater intrusion on our part into things that we would normally regard as the management business of an NDPB. The letter that you quoted is symptomatic of something that was happening through a period that stretches from that week to the end of the process.

Michael Russell:

Your submission to the committee says that you had

"been aware for some time that recruitment of sufficient markers might be an issue . . . and there had been some discussion of possible problems at the meeting on 28 April on the Awards Processing System".

Two months before that momentous phone call, you had an indication that things might be wrong. What did you do in those two months?

John Elvidge:

One can know that something might be a problem, but that is not the same as concluding that it has become a serious problem. We were in discussion with the SQA about how those things were going, and we eventually received entirely adequate reassurances that it had the number of markers necessary to do the job.

On the APS, we come back to the key distinction that members of the committee have been anxious to make. The issue of missing data was not about the functioning of the APS. It was not a matter of what was happening inside a computer system, but something that was happening in the assembly of the information that needed to be available to allow that computer system to operate—a quite distinct issue.

Michael Russell:

Let me take you from the end of June through July to the beginning of August. The letter from Eleanor Emberson indicates that in all but name—we may disagree about this—you were issuing instructions to the SQA on what it should do. One of the key issues became whether it should put the exam results out. There is debate about that, and I think that we need a lot more correspondence. There are six letters and five e-mails here, but there must have been a great deal more correspondence and I ask you to consider supplying that.

In the midst of all that discussion, there was an e-mail on 1 August—eight days before the results were due—from Alastair Wallace, which was copied to a number of people, including Eleanor Emberson. It said:

"There is also the more general problem of how to manage public reactions . . . We would expect Ministers to issue a supportive statement making these points".

We would expect ministers to issue a supportive statement on those points, but there was no decision on whether to postpone. The minister has said that that was an SQA decision. Given that you were telling the SQA all sorts of things that it had to do, why did you not tell it not to issue the results on that day if, as you feared, many were going wrong?

John Elvidge:

It was felt that that was a judgment that the SQA was best placed to make. You will see from the papers that its estimate of how many candidates might be affected by incomplete data was moving around—if not on a daily basis, then over a wider margin than one would have expected in such a short period.

The final judgment had to rest on the balance between candidates who would be disadvantaged by the late issue of results and candidates who would be in a better position as a result of delaying the issue of certificates. At the end of the day, that came down to a practical management question: if there were a certain number of extra days, how many candidates who would otherwise receive incomplete information would receive complete information? We could not possibly make that judgment—only the people at the coal face of the system could. Having agreed with the SQA what the criteria were, Mr Galbraith decided that it was right to leave the decision to it in the light of that operational knowledge.

Michael Russell:

I will mention Mr Galbraith in a minute.

On the evidence of 17 July, it seems that Eleanor Emberson was involved in all the detail of all the decisions. There is an item about the postal service and another about whether there were enough telephone lines—she was involved at an extraordinarily detailed level—yet from that day on your department did not recommend a postponement. Would it have been possible for you or the minister to remove the chairman and the board?

John Elvidge:

If there were reasonable grounds, yes.

Michael Russell:

Given what you have told us about the phone call of 26 June, there might have been reasonable grounds, yet that was not done. Despite the fact that you were running the organisation and had those options before you, you did nothing to stop the disaster taking place on 9 August. Do you regret that?

John Elvidge:

I do not think that any of those actions would have affected the outcome at that stage.

Before I forget, let me go back and correct myself on one point. I do not think that it is within our power to remove the chief executive.

I did not ask you that; I asked whether it was in your power to remove the board.

John Elvidge:

I thought that you did—in that case I have not misled you. It is within the minister's power to remove the chairman or members of the board if there are grounds to do so and if the unfitness of those individuals to hold their roles can be demonstrated.

Michael Russell:

It is clear from the letter of 17 July that you had realised that the organisation was in chaos. The detail of this letter is extraordinary. You knew that the issuing of the results was of vital importance to hundreds of thousands of young people. Do you regret not having done something more in the period leading up to 9 August? Do you regret not having advised the minister to do something more? Did you so advise him?

John Elvidge:

If I were able to identify an action that I believed would have changed the outcome, I would certainly regret that we had not taken it. However, we must recognise that, at that time, we were at one minute to midnight. I cannot think of any intervention that we could have made at that stage that would have changed the outcome.

Did you advise the minister to take any action at that stage?

John Elvidge:

You are tempting me, but you know that I have already said that I cannot reveal what advice, if any, I might have given the minister. You can infer from the fact that I cannot think of an action that could have been taken that I would have had difficulty communicating such an action to anybody.

Mr Stone:

For much of the time, you were communicating only with the SQA, and I regret that. There were warning signals from other organisations. As Mike Russell has pointed out, your approach by then was hands-on, and you weighed right in and started to try to sort things out. Do you now regret that you did not go in further and grab the organisation, as it were, to ensure that it ran properly?

John Elvidge:

I do not believe that that is within our power. The clear intention of the legislation is for the operation of the examination system to be separated from political control. It is not my understanding that the powers could have been used effectively to take control of the organisation. The SQA remained—and remains—free to thank us for our advice, but to choose to do things differently.

In fact, we were working closely together. Eleanor Emberson's letter may look like a list of instructions, but it may more accurately be regarded as a record of the steps that had already been agreed with the SQA as sensible.

Yet you had the ability to offer IT support, which was not taken up. You could have gone further than you did.

John Elvidge:

We could have offered the SQA people or money: we could have offered to second people to it; we could have offered to supply it with access to contractors; we could have offered it the money to employ other people. In effect, we did offer it those things.

I do not think that it is within our power—I am conscious that we are in the sort of territory about which a legal textbook could be written—to say to people at the SQA, "You as an employer, irrespective of your own judgment, will employ X or Y."

Mr Stone:

But the storm signals were there. You could see the shambles that was developing. You could see the conflicting figures that were appearing day by day, yet you are telling us that you felt it best to let the SQA run with it and to stick to the date for issuing the results.

John Elvidge:

I am saying that—

Do you admit that that was a mistake?

John Elvidge:

I am saying that it is a huge decision to take the running of a very complex system out of the hands of the people who know how to run it into the hands of people who do not have experience of it.

But there was an extreme situation in which thousands of pupils stood to have their qualifications compromised, possibly blighting them for life. Was such action not demanded by that situation?

John Elvidge:

There is a danger of my sounding complacent or excusatory in saying this, but we have to consider the numbers.

Thousands were affected.

John Elvidge:

They were thousands out of more than 100,000. As far as we could tell, the system would work correctly for 95 per cent or more of cases and might not work as well as it should in a proportion of cases. At times, including towards the end of the process, it was suggested that that proportion would be less than 1 per cent of cases.

Removing the running of a complex system from the hands of the people responsible because one believed that one knew some other group of people who could achieve a 100 per cent result over the few weeks at their disposal would have been a huge decision. I would certainly not suggest to members that if such a decision had been taken the result would have been better. The most probable result of such a course of action would have been that many more thousands of young people would have experienced problems.

So you are saying that you acted with the best intention at the time, but you now agree, with hindsight, that you made a mistake.

John Elvidge:

No, I would not agree with that. It is not like substituting a football player: there is no second SQA waiting to be brought on to the pitch. There is no organisation that could obviously have been substituted for the SQA.

Mr Monteith:

We keep hearing about IT. I want to be quite specific about this. According to the evidence before us, there was a meeting with your head professional adviser in the IT sphere to consider what stage things had reached with regard to software—I am referring to the APS. It seems that, following that meeting, you were mainly offering help with hardware. Is that the case? Were you in a position to help the SQA with software?

John Elvidge:

I do not think that we could have helped the SQA with either hardware or software. There was no indication that either the hardware or the software was the problem. Even if they had been, the SQA was dealing with perfectly reputable companies that were responsible for supplying it with both hardware and software.

We offered, in one form or another, people—foot-soldiers if you like. If the number of people trying to do the work was the problem, we were offering to try to help with that very simple difficulty.

I wish to clarify that. Our evidence was that the Executive's integration suite was offered. That is hardware—or does that include people working there?

John Elvidge:

The integration suite is a physical facility in which one can simulate the interaction of two computer systems. We did offer its use. It is not like some sophisticated science lab of which there are only two in the UK; it offers a relatively well-understood testing approach. We simply happen to have a dedicated place where it is easy to do that, and we were offering the use of that physical facility.

What would the people who were offered have done?

John Elvidge:

We were not specifying that; we were saying to the SQA, "If your judgment, as managers, is that more bodies will help you, tell us, and we will try to get you those bodies."

Mr Monteith:

Given the evidence that we have heard, particularly in the past 15 minutes or so, about the number of meetings, the escalation of your instructions, the change in the nature of the relationship between you and the SQA and the decision to let the SQA decide that it should proceed with the issuing of certificates on 9 August, for delivery on 10 August, was any advice given to the minister by you or by your department that a public relations disaster was about to, or could, happen? Was it suggested to him that he might be required to be available? Irrespective of what advice you gave the minister, did you not think it surprising that the minister was not making himself available, given all the preceding, escalating discussions?

John Elvidge:

I will try to stand on firm ground in approaching those questions. Once we had entered the phase in which it seemed certain that some candidates, even if a small number, were going to experience a problem, there was a lot of discussion about the public impact of that. Our primary concern was how to manage information flows in such a way as to minimise the distress caused to candidates. We believed, in the run-up to 9 August, that a set of arrangements to do that was in place.

What happened on 9 and 10 August did not bear any relationship to the set of arrangements that we believed would be in place to avoid that distress to candidates. Had we known what was going to go wrong, we might well have thought that there would be a serious public relations issue that would have a political dimension. As it was, we believed that a sensible set of plans for differentiating the substantial majority of candidates who would not have a problem from the small minority who would have one, and for offering targeted help to the small minority, was in place. We were wrong.

Mr Monteith:

I follow your train of thought and your rationale, but given that higher still has been seen as a flagship policy of this Executive, if not of previous Administrations, and given that the date on which exam results are to be sent out and delivered is known well in advance, would you not have advised the minister that this would be an appropriate time for him to be available to champion the delivery of the policy?

John Elvidge:

Generally, I do not take the view that ministers need advice on political matters from me. We were not thinking of this situation as a set of issues that linked to wider debates that had a political profile; we were thinking of it as a concerning practical problem that we wanted to ensure was managed as effectively as possible and whose impact on candidates we wanted to minimise. The time to worry about the splash is when one has run out of time to control the size of the thing that is going to make it.

Mike Russell has a quick question about that.

Michael Russell:

I would like Mr Elvidge to reflect on this. Eleanor Emberson might have expected—and may have told you to expect—what was going to happen in PR terms. There is an e-mail, dated 1 August, from Alastair Wallace, that discusses what would happen if there were a delay in issuing results, and specifically

"the panic and thousands of calls from anxious students and parents"

because of

"the reason for the delay—missing data affecting candidates' results."

In the same paragraph he makes the point:

"We would expect Ministers to issue a supportive statement making these points".

When you read in the Official Report your answer to the previous question, you may want to reflect on the fact that at least one of your officials had projected what was about to happen and given a view on how ministers might be involved and that your senior staff knew that.

John Elvidge:

We need to separate two things. Postponement of the issue of results would have affected every candidate in Scotland and would clearly have required a different kind of approach. Because one would have no communication with candidates individually, one would have to communicate in a mass sense—what might be described conventionally as public relations.

Michael Russell:

I am sorry to interrupt you, but the paragraph that I quoted starts with the words:

"I know that you have been looking at scripts/ads/letters on this; and for the more likely scenario that the results are issued on time but still with substantial numbers of incomplete certificates".

The quotation that I gave you refers specifically to what took place. I am asking you to reflect on that when you reconsider your answer to Mr Monteith.

John Elvidge:

General statements to try to put the problem in context were part of the communications strategy to back up the individual communications that candidates would receive.

So your staff thought that supportive statements from ministers would be enough.

John Elvidge:

I did not say that. The strategy for communication with candidates was a good deal more complex than that. It was focused primarily on what they would receive individually and what the centres to which they would naturally turn for advice were supposed to receive. We did not think for a moment that some generalised public statement would be an adequate way of dealing with the situation. However, we did think that a centralised public statement was a sensible component of any attempt to communicate with people and to distinguish the situation of the majority of candidates, who we expected to be wholly unaffected by the difficulties, from the position of the minority who were likely to be affected, as far as we could tell at that stage.

I will take three quick questions before trying to wrap this up.

Ian Jenkins:

I want to take a couple of steps back and to reconsider the decision to issue results. Was the department aware of the fact that concordance checks had not been carried out and that the first issue of results would consist of crude results to which amendments were likely? That decision affected the quality of the first issue of results. What about the missing data in the results that were issued?

John Elvidge:

That is a shortish question, but because it takes me into completely new territory it is not easy for me to give a short answer. Concordancy was applied to some categories of exam and not to others. It was applied to standard grade and old higher, but not to certificate of sixth-year studies and new higher. The effect of concordancy is to substitute for part of the appeals process. The likely effect of applying concordancy, had it been statistically possible, would have been to change the results of perhaps 4,000 candidates. I do not think that that was material to the decision that was made, because it would still have left the proportion of candidates who could expect to get correct results at well over 90 per cent. The crucial judgment in everybody's mind was the balance of interests between the 90-odd per cent for whom things were okay and who would be distressed by a delay, and the remainder for whom the process was not going to be okay. We had to weigh up those two numbers in our minds.

Who decided not to carry out the concordance checks?

John Elvidge:

It was entirely a matter for the SQA. I understand that it decided on technical statistical grounds that it did not have the evidence necessary to enable concordancy checks to be run for new higher. Because concordancy is a judgment on individual schools' records of forecasting results in individual subjects, one needs a certain sample size before a prediction can be statistically valid. If that condition is not satisfied, one cannot make concordancy work.

Johann Lamont:

You said—this may be at the heart of the problem—that the problem in dealing with the difficulties that were emerging was that there was no substitute SQA. Is it your department's position that once a body such as the SQA has been set up to take responsibility, unless a crisis comes very early it is impossible to do anything about that, except to cajole, encourage and advise? Are you saying that, once you have passed responsibility on to another body and the relevant expertise has gone to that body, time is against your being able to do anything to put problems right?

John Elvidge:

It depends on the type of public body one is dealing with. In the case of a public body such as the SQA, running a big operational system, it is true that unless you know at quite an early stage that you want somebody else to do the job, saying "We have changed our minds and would like someone else to do this" is not a practical option. Major changes of supplier are usually planned over several years.

When the SQA was being set up, would the possibility of difficulties emerging have been flagged up to those who were establishing this system for delivering educational qualifications?

John Elvidge:

That is a bit hypothetical. By common consent, we were incorporating into this structure an organisation, in the Scottish Examination Board, that appeared to have an exemplary track record of running this very specialised business. I do not think that anybody was likely to doubt that a body incorporating that expertise could discharge this kind of function.

Johann Lamont:

So even though the SQA was taking responsibility for a flagship policy, the possibility of its not being able to deliver it would not have been discussed. Are you saying that if it did not have the capacity to deliver, there was nothing that you could do unless that became apparent very early?

John Elvidge:

If we had had reason to doubt its capacity to deliver, I am sure that it would have been reasonable to consider that. I am driven back to the answer that, of all the bodies, this was the one whose ability to do this particular kind of task one was least likely to doubt—

Once you went down that road though, there was no safety net.

John Elvidge:

No. Constructing a safety net to run systems of this scale would be an enormously expensive business.

Cathy Jamieson:

I have a brief question on a practical point. You have mentioned a number of times the majority of candidates who were expected to have accurate results and the minority. For the record, will you tell us what your understanding of the projected balance was? On the basis of the information that you had been given, what percentage of people did you expect to get and not to get accurate results? How does that compare with the usual margin of error? Who took the final decision to go ahead and issue the results? When was that decision taken?

John Elvidge:

I will work through that series of questions. There was never a point at which the information suggested that the size of the majority would be smaller than 95 per cent. At times, the central estimate was that it would be closer to 99 per cent. As we now understand it, the correct proportion was as near to 97 per cent as makes no difference. Therefore, the eventual outcome seems to have been pretty much in the centre of the range of estimates.

The SQA took the decision on whether to stick to the planned date for issue on the basis of its knowledge of what was happening and its understanding of the balance. My memory fails me as to the precise day on which it took that decision, but it was later than we had expected it to be taken.

Nicola Sturgeon:

The memorandum states that the Scottish Executive made contact with the Association of Scottish Colleges, the Committee of Scottish Higher Education Principals and the Student Awards Agency for Scotland to discuss the effects of a delay in issuing the results. If the decision was entirely down to the SQA, based on its judgment, why was the Scottish Executive making that kind of inquiry at the same time as the SQA was taking its decision?

John Elvidge:

We were offering the SQA every co-operation to assist in the management of the situation. The views of the various bodies seemed a relevant factor in the decision. If any of them could point to a compelling consideration, which was not part of our consideration, one would want to know about it.

Part of your consideration? I thought it was for the SQA.

John Elvidge:

I used "our" in a collective sense. We were discussing the issues with the SQA daily. We had our sleeves rolled up and were doing what we could to achieve the right outcomes. I do not think that we thought that who contacted a particular set of organisations was a significant decision. It is a question of where the free pairs of hands are. The SQA was doing the job that only it could do. It made sense for us to do some of the things that anybody could do.

The Convener:

I am afraid that I am going to have to wrap this up. I am conscious that a number of questions have either not been asked or have arisen during the discussion, which has been thorough. The committee will consider whether it wants to invite the witnesses back to the committee to take those issues further. For now, I thank them for their evidence.

We will take two minutes while we change witnesses. Our time for this room is running out, so members will need to arrange their own comfort breaks.

Meeting adjourned.

On resuming—

The Convener:

I offer my apologies to the witnesses for the delay; I know that they were told that they would be taken a lot sooner. However, I am sure that, having been in the audience, they will appreciate that the questioning was thorough and that we wanted to continue with it.

I am grateful to Ron Tuck and David Elliot for agreeing to appear on the panel together, which will facilitate the questioning. We have outlined a number of questions and, as before, I will bring in other members for supplementaries as we go through. I aim to be finished shortly after 1 pm. As I said to the officials from the Scottish Executive, if we do not get through the questions, we may need to invite you back to future meetings. However, we will try our best to get through them this morning.

What strategic planning took place within the SQA for what was going to be a difficult year?

Ron Tuck (Former Chief Executive, Scottish Qualifications Authority):

Our strategic planning was done through the corporate planning process, led by me and senior managers. The corporate plan is discussed and approved by the board and then finally approved by ministers. Underneath that sit all sorts of operational plans. Led by an individual head of unit, we planned for all our higher still-related tasks across the organisation. In addition, there was a body known as the APS project board, which from 1997 oversaw the development of all the software.

Cathy Peattie:

In your submission, you seemed quite surprised that, despite that strategic planning, there were not enough markers, information technology was inadequate and several other issues were coming up. In the spring or summer, did you consider trying to get an overview of the organisation to examine where the problems were?

Ron Tuck:

It would not be correct to say that we had entirely failed to anticipate problems in relation to markers. As I said in my submission, the extent of the problem took us by surprise. I admit that. However, I also explained that we had become aware of a growing issue about markers, which arose partly because of remuneration levels and partly because marking was increasingly eating into summer holidays. Therefore, the marginal benefit to staff of undertaking marking was reduced. We did not expect the marker problem to be on the scale that it was. I admit that there was perhaps a failure to anticipate the full consequences. Moreover, because of the change in the timing of the examination period, we had reduced the period of marking from three weeks to two weeks and we gave markers the option to take a full or reduced allocation. With hindsight, that was probably a bad decision.

Cathy Peattie:

So better strategic planning might have helped.

I am a reporter to the Enterprise and Lifelong Learning Committee and I was interested in some of the evidence that you gave yesterday. I am not sure whether someone asked you about this directly, but you give the impression in your submission and in other comments that I have read that you think that you were misinformed about the capacity of the SQA to deliver. Will you expand on that? Were you told lies? Did members of staff not give a full and accurate picture of what was happening?

Ron Tuck:

It is important in examining all these issues not to treat this as a global problem. In some ways the issues have connections with each other but, as I tried to explain in my submission, different issues arose at different times. In the end, the physical management of data was the crucial issue. Our failure to spot the problems with that proved to be critical. In relation to all other matters, the information that I had—and therefore passed on to the board and the Scottish Executive—and the assessment that we gave of the situation were broadly accurate.

We knew from the beginning that the awards processing system would be difficult but doable, and that is how the APS turned out. As far as I know, the software performed its core functions. As with any major IT project, all sorts of annoyances and frustrations arose from using it, both for our staff and for staff in centres, but it did what it set out to do. Our general assessment of the APS was not far short of the mark. We had anticipated a bit of difficulty, but the problem turned out to be worse than we expected.

Remember that for large parts of the year the issue that preoccupied us was the implementation of higher still. We were developing the vast amount of national assessment bank materials, listening to teachers' responses on difficulties in specific subjects and trying to do something about them. As a result, I assumed that one part of the organisation, which had always been the fabled well-oiled machine, would continue to operate as it had always done. With hindsight, one can say that that was an error of judgment and that we should have examined all the risks more comprehensively.

Cathy Peattie:

I read in The Scotsman this morning about a blame culture within the SQA. It reminded me of the part of your submission that talked about the poor management of the operations unit. I also read the paper that the head of the unit gave to us. He mentioned bereavement in his family. Good management is about supporting people who have been through difficulties. That does not seem to have happened, but you seem keen to condemn someone because of their poor management. Were you aware of the poor management that you now talk about? If so, why did you not do something about it at the time?

Ron Tuck:

What happened is almost the reverse of the inference that you are making.

It is what I have got in the submissions in front of me.

Ron Tuck:

The fact that the head of operations had suffered a bereavement led us to be gentler in our handling of him than would now seem advisable. We were probably more tolerant in our judgments. We did not probe his performance at an early stage. I do not think that we were guilty of insensitivity; we were perhaps too sensitive in our handling of him.

There is not a blame culture within the SQA. In any organisation, you will find individual managers whose styles are different. I would not claim that every manager in the SQA is a paragon of virtue in the management of staff. I do not think that the suggestion that there was a blame culture across the organisation would be substantiated by the SQA staff.

Do you agree that good management would be to anticipate a problem with a member of staff and offer support rather than wait until something goes wrong?

Ron Tuck:

We did not wait. One of David Elliot's performance objectives for 1999-2000 was to examine the management style of the head of operations and to support him in improving his performance. That is what our performance management policy says. One does not move to disciplinary action; one first tries to help the manager to improve his or her performance. That is what we tried to do.

Cathy Peattie:

Yesterday, it seemed clear that you thought that you could have achieved your aims but for one or two problems. Does that not highlight the poor communication within your organisation? It was never going to happen and it did not happen, yet you still think that it could have happened.

Ron Tuck:

In my submission, I itemise the challenges that we faced. The overall conclusion that I draw is that we managed to get there in relation to most of those challenges: preparing the materials for higher still; preparing more than twice as many question papers; developing a large software system; and dealing with the unexpected crisis with markers. I would not suggest that the process was comfortable or smooth, but we got there. If we had not had any other problems, this committee would not be meeting today to discuss the matter. The fatal flaw was the management of data. That came at us from left field late in June and caught us cold.

I accept justifiable criticism that I should have been able to spot those problems earlier. The fact is that I did not. I do not think that you can draw a wider inference from that about management and communication within the SQA.

If communication had been better within your organisation, perhaps you would have been able to highlight issues earlier.

Ron Tuck:

It may be that there are communication problems in the operations unit. As I said to the Enterprise and Lifelong Learning Committee yesterday, one of the difficulties that I have in drawing conclusions to present to this committee is that the problems turned out to be far worse than I had understood them to be. Since 11 August, I have not been back in the SQA so I am not privy to the further investigations that have taken place, which will have highlighted why we had such a significant data management problem.

As I explained in my submission, from the end of June we attempted to identify why the perplexing and unprecedented situation of data going missing was happening. I itemised five audit trails that we followed, all of which yielded only partial explanations. The clock was ticking at that stage; 9 August was approaching and time that was invested with staff to find out what had gone wrong would be time taken away from getting in and sorting data. It was a difficult judgment call but, because none of our investigations was yielding the golden bullet, we focused on getting the data from centres with a view to issuing certificates on 9 August.

Your most important benchmark seemed to be 9 August, even though you must have known that things would not work properly.

Ron Tuck:

I did not know that. Until the point at which it would have been impossible to reverse the decision, I would have changed the date of certification had I known the true scale of the problems. In fact, the 9 August issue date was decided on and made public months ago, but we never regarded it as some kind of sacred icon. It would always have been open to us—although it would obviously have been embarrassing at the last moment—to change the date. I would much rather have delayed certification by two weeks had I known the scale of the problems and had I thought that those two weeks would have made a crucial difference.

The basis on which I made the decision was that we honestly believed that we had a missing data problem affecting 1 per cent of candidates, that we knew exactly what those missing data were and that, once schools returned and we were able to talk to principal teachers, we would be able to get those data in within one or two weeks and matters would be resolved. That would not have been a good outcome but, given the scale of the problem that we thought we had, we did not think it justifiable to delay the certification of, as we thought, 99 per cent of candidates, giving them two weeks' additional worry. We were wrong in our understanding of the scale of the problem, but that was the basis on which the decision was made.

Nicola Sturgeon:

My first question is about staff management. In April 1999, one of your directors resigned or retired, and the information technology and operations units were added to the responsibilities of David Elliot's division. Why did you not pursue the alternative of employing somebody who had an IT background, rather than simply lumping those responsibilities into somebody else's work load?

Ron Tuck:

We considered a straight replacement to maintain a four-director structure. We went through interviews, but the view of the interviewing panel, chaired by the chairman, was that none of the candidates would have added significant value and that it was therefore preferable to move to a three-director structure. That might have been part of our general move towards de-layering anyway, but it is important to remember that, at director level, one is not expecting technical operational expertise. The head of the IT unit is an IT expert. Below him are people with even greater expertise in IT. The job of the director is strategic management—to understand the needs of the business and to attempt to ensure that IT developments meet those needs. A director in charge of that unit has to be IT literate, but he or she does not have to be a hands-on IT expert. Indeed, it can sometimes be a problem if someone at senior management level is too hands-on, as there is then a tendency to interfere in the work of subordinate colleagues.

Are you saying that the implication in David Elliot's submission, that you simply chose the cheaper option, is not true?

Ron Tuck:

In running a public sector organisation, cost-effectiveness is always an issue. We went through the step of seeing whether we could, by external or internal recruitment, find somebody who we thought would add value to the senior management team of the SQA. However, it was the judgment of the interviewing panel that we were unable to do that.

Nicola Sturgeon:

Do you accept that, at a time when you were implementing a new IT system, which is described in David Elliot's paper as

"the largest of its type current in the UK",

a layperson might find it bizarre that you decided to merge IT with another division, rather than employing a director who would continue to have separate responsibility for it? There is a clear implication in David Elliot's paper that it was simply a matter of cost. Is that correct or not?

I shall give you a chance to contribute, Mr Elliot, as soon as Mr Tuck has answered that question.

Ron Tuck:

I do not think that it was a bizarre decision or that the scope of responsibilities of directors was excessive. What David Elliot could bring to that post, which none of the other external or internal candidates could bring, was an understanding of the examination system based on a couple of decades—I cannot recall exactly how many years—of service. In our view, that was the sort of expertise that was necessary.

David Elliot's submission also says:

"Of the two units which were new to me, IT gave greatest concern."

Were those concerns expressed at the time and were they factors in your decision?

Ron Tuck:

David Elliot can speak for himself, but I do not think that we had concerns about the IT unit. It was well led and had a very capable manager and good staff. What David may have meant was that the APS project was regarded as a No 1 strategic priority, as indeed it was.

We shall give Mr Elliot a chance to explain what he meant by that before we move on.

David Elliot (Former Director of Awards, Scottish Qualifications Authority):

I would not say that the decision to go down from four directors to three was bizarre. It was a judgment call. I was simply making the point that someone with more experience of data processing might have spotted the problems sooner. I was clearly on a learning curve.

I can confirm that the IT unit was well managed. As I point out in my paper, the organisation did not exist until April 1997, and the courses started just over two years after that, so there was an enormous amount to achieve in a short period. Internal reorganisation of the body had to be carried out and the business processes had to be planned before we could start writing the software. I should make it clear that the operations and IT units were responsible not only for the examination system, but for processing all vocational qualifications. That was going on in tandem with the preparations for the examinations.

There was a great deal to be achieved in a short time. That is why I was concerned about whether the software would be ready in time. Without the software, nobody would have got their exam results this year. The legacy computer systems were not an option, because they could not support higher still. I did not have the same initial concerns about the operations unit, because it had a good track record. It gradually dawned on me over time that there were problems in operations. Without the learning curve that I had, I might have spotted those problems sooner. That is the point that I was making in my paper.

Nicola Sturgeon:

The suggestion that the data processing problems might have been identified earlier if somebody with an IT background had been employed is critical; we may want to return to that with other witnesses.

My next point is for Ron Tuck. You have spoken about the burdens of implementing higher still. Correct me if I am wrong, but I think that you implied that your preoccupation with other aspects of preparation, such as the national assessment bank, may have led you to take your eye off the ball in other parts of the organisation. Your general comments seem to suggest that the overall burden of higher still came too soon and that the SQA was simply not ready for it at the time.

That brings to mind a comment that you made the day after your resignation, when you said that, with hindsight, a piloting of higher still might have been desirable. You have heard the non-answers that we got from officials on that point. Is it something that you ever raised with the Scottish Executive? To the best of your knowledge, was the possibility of not implementing higher still in one go and in one year ever discussed? Would piloting or phasing have helped?

Ron Tuck:

Before I answer that second question, let me offer an adjustment to the summary that you made of what you thought I had said. I was not focusing solely on higher still; I was focusing on all the things that were new, including the APS. As a manager, if one is looking at where the risks are, one naturally assumes that the risky areas are the big new things that one has to do. One does not necessarily go round the organisation checking whether people are continuing to do the jobs that they have done for the past 20 years. That is the point that I was making.

I am not convinced that piloting would have helped us. I draw a distinction between piloting and phasing. Under piloting, one would have new-style qualifications operating alongside old-style qualifications. The problem is that such a system adds to the burden, because there would be even more examinations to run. Although piloting might have been beneficial in terms of the lessons that we might have learned, it would have been an additional burden.

Nicola Sturgeon:

I do not want to discuss the merits or demerits of the system. Unless you were misquoted, the press reported you as saying that, with hindsight, that option might have been desirable. My question is whether you actively discussed it with the Executive as an option.

Ron Tuck:

No, but that is why I have drawn the distinction between piloting and phasing. Phasing was discussed and, indeed, higher still was eventually phased.

Are you finished, Nicola?

For the moment.

Ian Jenkins:

The first two pages of your submission detail all the activities that you had to undertake at once. I was slightly involved with the process and acknowledge where you are coming from. I see the SQA as an overloaded plane that managed to take off but is having difficulties with the landing—I do not think that we can change the pilot at the last minute. John Elvidge has said that there was nothing much wrong with the computers and that the teachers did all right, and you have pointed out that the SEB had a very good track record—indeed, that was the last place where anyone expected anything to go wrong. However, the problem lies in the quantity, quality and complexity of the data and in marrying the internal assessment with the exam results. Do you accept that that is where you were caught cold?

Ron Tuck:

I do not think so. However, I accept your general analysis that we were overloaded. In light of what has happened, I think that, although the whole venture might have been doable, it was risky; it was my job to advise the Scottish Executive of that fact. We did not do so, which is why we have to take responsibility for what happened.

Complexity of data is not a fundamental issue. We, the Parliament or whoever, need to decide whether unit assessment is a good idea for pupils. If it is, admin systems are able to cope, because large-scale IT does that job quite well. Unit assessments are actually a less complicated part of the process than other parts. Nicola Sturgeon said that the highers system was simpler because there is only one exam paper. However, that is not the case; for example, one higher has nine components of course assessment, which is quite tricky. With unit assessments, it is simply a matter of deciding yes or no using the software. The software for pulling together various aspects of course assessment is far more complex, because it has to collate all the information into a final result. We have had software systems for such complicated results processes for some time. Although unit assessment is an additional burden, it is not a fatal flaw. IT and admin systems could and should be able to deal with it.
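To make the distinction that Mr Tuck draws more concrete, the following is a minimal sketch, in Python, of the two kinds of processing: a unit assessment is a simple yes/no record, whereas a course award must collate several weighted component marks into a final grade. The field names, weights and grade boundaries are illustrative assumptions, not the SQA's actual rules.

    # A unit assessment is recorded as a simple pass/fail; a course award
    # must collate weighted component marks into a grade. All names,
    # weights and boundaries below are invented for illustration.

    def units_complete(unit_results, required_units):
        """A candidate needs a pass recorded for every required unit."""
        return all(unit_results.get(unit, False) for unit in required_units)

    def course_grade(component_marks, weights):
        """Combine weighted component marks into an overall grade band."""
        total = sum(component_marks[c] * weights[c] for c in weights)
        if total >= 70:
            return "A"
        if total >= 60:
            return "B"
        if total >= 50:
            return "C"
        return "No award"

    marks = {"exam_paper": 65.0, "coursework": 70.0}
    weights = {"exam_paper": 0.7, "coursework": 0.3}
    units = {"unit_1": True, "unit_2": True, "unit_3": True}

    # A course award is made only if the units are complete AND the
    # collated course assessment reaches a grade boundary.
    if units_complete(units, ["unit_1", "unit_2", "unit_3"]):
        print(course_grade(marks, weights))  # -> B

The yes/no unit check is trivial next to the collation step, which is the point Mr Tuck is making.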

Ian Jenkins:

The whole system seems genuinely complex, even if you know what you are looking for. For example, pupils can have the option of doing two or three units and being awarded others later. I seriously believe that, through no fault of their own, some of your staff sometimes did not know what data they were looking at. People in schools or at the SQA probably did not know, for example, how an internal grade for spoken English related to other unit assessments or how qualitative assessments or gradings were translated into results. I suspect that that was the point at which some data went missing, were misinterpreted or were put in the wrong place.

Ron Tuck:

That is possible. However, if it were the case, it would be a matter of briefing and training staff. Unit assessments are not complex in admin terms—you either get them or you do not. For example, a pupil without three unit assessments will not get a course award. As far as the software is concerned, the process is straightforward. It is far more complex to combine four or five bits of course assessment with different marks into an overall grade. However, it is possible that some of the staff in the operations unit were not adequately trained for the task. That is our responsibility, but it does not mean that unit assessment is a fundamentally flawed idea.

Michael Russell:

In previous file systems or databases, a number of boxes had to be electronically ticked. When the system is set up for a particular task, it will present a box with four or five things that have to be ticked, and the user will run a check at certain times of the year—particularly during the examinations—to ensure that all the information has been received. That system will automatically present any information that has not been received, which allows the system user to discover that they do not have a piece of information on a certain candidate. As long as it is properly set up, such a system can cope with one piece of information or 20 pieces. Were such checks carried out at the appropriate time, and did they throw up the missing data, or was the filing system not set up properly?
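For illustration, the kind of completeness check that Mr Russell describes might look like the following minimal Python sketch, in which the entries expected for each candidate are compared with the results actually received and every gap is surfaced automatically. The candidate numbers and unit names are invented; this is not a description of the SQA's system.

    # Compare what is expected for each candidate with what has been
    # received, and report every expected item with no result on file.
    # All identifiers are invented for illustration.

    expected = {
        "candidate_001": {"maths_unit_1", "maths_unit_2", "maths_exam"},
        "candidate_002": {"english_unit_1", "english_exam"},
    }
    received = {
        "candidate_001": {"maths_unit_1", "maths_exam"},
        "candidate_002": {"english_unit_1", "english_exam"},
    }

    def missing_items(expected, received):
        report = {}
        for candidate, items in expected.items():
            outstanding = items - received.get(candidate, set())
            if outstanding:
                report[candidate] = sorted(outstanding)
        return report

    print(missing_items(expected, received))
    # -> {'candidate_001': ['maths_unit_2']}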

David Elliot:

We must consider that question in the context of the implementation of new software, which was challenging for us and the centres. We accepted that the information would come in much more slowly than in previous years, because we wanted to be flexible. The centres were having difficulties, which meant that the data were coming into the SQA somewhat later than normal. However, we took a cut of the data in about April and sent what we were holding to centres that had submitted electronic data to us so that they could check them.

One of the stark differences is that, in the traditional SEB system, internally assessed grades and estimates were generally expected to come in from centres by 31 March or in April. For very good curricular reasons, higher still was different and unit results could come in right up to the end of June. That was not a flaw in the system but another matter that made the management of the examination in its first year that bit more difficult.

Michael Russell:

With respect, you have not answered my question. Let me repeat it. A certain box requires a number of things to be ticked to say that data have or have not been received. From the mountain of correspondence that I have received from schools and others that have submitted the information time after time—some schools have done so six times—it seems that, in previous years and exams, the system said whether specific information had or had not been received. Did you have and run such a system? If so, when was it run and what did it tell you?

David Elliot:

That question requires a complex answer, depending on which data we are talking about. Standard grade internally assessed grades—which have existed for a long time—were to be sent in by 31 March. It came to my knowledge that the normal practice was to pursue any centre that had not submitted data. However, as staff were already working an enormous number of hours by April, we did not carry out such a pursuit.

As for the new qualifications, it is fair to say that any reports that the system generated were not terribly user friendly and could have been improved. It was not always easy to get a grip on the amount of data that we held, as in many cases we had to make a judgment call about either gathering management information or getting on with processing data. That decision was a constant feature until August. However, by June, we were receiving regular reports from our IT and operations staff on the extent of missing data. On the basis of those reports, we pursued centres to try to get the missing information to the SQA.

I want to be specific about this, Mr Elliot. Did you have such a checking mechanism? You seemed to indicate that you had. Yes or no?

David Elliot:

Yes, we did.

When did that checking mechanism pick up that there were substantial amounts of missing data?

David Elliot:

As I said in my earlier reply, when do you construe data to be missing—

When they are not there. When did you understand that there were substantial amounts of missing data?

David Elliot:

They are not missing until after the date at which you expect them.

In that case, was there a date that you can think of when you had expected to have data but you did not?

David Elliot:

As I said, for the standard grade data it was clear on 31 March. We identified, later than we should have because of the pressure on staff, that a substantial amount of data were missing and we went about trying to make that good. With the higher still qualifications, the higher still programme had decided with the SQA that one unit result should be submitted by the end of March, but the other two unit results were not required until the end of June, so technically they were not missing until—

After you knew that the data were missing on 31 March, did you do something about it on 1 April, or 2 April or 3 April? When did you know there were problems with the data that were due in March?

David Elliot:

It was at that time that we issued to centres what we were holding on our system for them to check, to establish whether it was complete.

Did the centres get back to you immediately, or was there a time delay before they replied?

David Elliot:

As I recall, the centres reacted differently. Some were grateful for the information, while others, because they were working extremely hard as well, were not all that pleased to be given a request to check all the data again to see whether we were holding—

Michael Russell:

But a lot of centres have said that they submitted the information to you several times and that what they got back from you was inaccurate. Quite a number of centres said that they went on submitting data to you up to and beyond August without your system registering it. At what stage did the system pick up the problems? At what stage did you pick up the problems? At what stage did you think that the problems were solvable? What was happening?

David Elliot:

I think that we were aware of the problems from April onwards and became increasingly aware of how many data were missing. The difficulty was identifying the reason. The centres began to tell us that they had already submitted the data. My staff were suggesting that it was a complex situation. We had changed forms that had been unchanged for many years. We were asking for all sorts of different information, for example for unit passes and component scores, so some of the advice that I was getting was that the centres might have thought that they had sent the information but they had not. It took a long time for us to accept that the centres genuinely were sending data into our building but that the data were somehow not ending up in the IT system.

May I make an observation? I am sure that this is not the case, but from what you are saying it would appear that nobody had the faintest idea what data were there and what were not.

David Elliot:

We knew which data were on our IT system.

But you did not know whether they were the right data, or whether they were complete, or whether they were data that you should have, and you have said that you did not know whether there were missing data.

Mike, can we give Mr Tuck an opportunity to come in on this point?

Ron Tuck:

We presumed that the data were outstanding. They were not on our system, so the natural assumption was that they were outstanding. By the time we talked to our board on 22 June, I was not aware of a significant problem of missing data. I think that my Scottish Executive colleagues said that I made a telephone call to them on 25 June. I am sure that it was around then, but I cannot verify it. It was around that time that it became clear to us that in fact the problem was not one of outstanding data, but of missing data. In other words, the centres had been sending in the data and for some reason they were going missing.

Michael Russell:

I want to ask you that specific question. We have heard about the phone call of 26 June. What prompted it? Why did you make it? What did you say? Who did you speak to? Tell us about the circumstances. Ian Jenkins talked about the aeroplane. This is the moment at which the Titanic hits the iceberg. Tell us what happened.

Ron Tuck:

We had set up a schools desk team to make phone calls to centres to pursue the data question. It was at that point that it became clear that what had previously been anecdotal evidence of some centres having sent in data more than twice was actually widespread. At that point it became clear to us that it was not an outstanding data problem, but a missing data problem. While I disagree with your statement that at that point the Titanic hit the iceberg, it certainly was the case that at that point we became seriously concerned.

When was the desk set up? How long did it take? Who came to you and said, "I think we have a problem"? What happened? What decisions did you make that led to that call, and who did you call?

Ron Tuck:

I cannot recall precise dates, but it was around the middle of June when we established the schools desk. We gradually built up the staffing. All schools were pursued. We pulled together management reports and we started to find the consistent theme that data had been sent in already.

Who did you call?

Ron Tuck:

The call would have been made to the school to speak—

No, who did you call in the Scottish Executive education department?

Ron Tuck:

It was Eleanor Emberson.

And what was her reaction?

Ron Tuck:

Clearly, she was concerned.

Was that unexpected?

Ron Tuck:

Scottish Executive officials can understand what is going on inside the SQA only on the basis of the information that we provide. Throughout, we provided information in good faith, which is why, as soon as we became aware of a problem that was new to us, we alerted our Scottish Executive colleagues. At that time it was a matter of concern. It certainly was not, at that point, inevitable that there would be a problem on 9 August. What it gave us was a significant challenge in retrieving missing data.

If it was not inevitable at that moment that there would be a problem, yet it still happened, what did you not do during that period that could have averted it?

Ron Tuck:

We are talking about knowledge of a situation. In my submission I itemised five main reasons—there were probably more—that we identified as possible explanations of why the problem occurred. We pursued them. Any one of them, according to our knowledge on 26 June, might have been the silver bullet. At one point we were optimistic that the results were not missing at all, and that what we had was a massive problem of duplicate entries. At that point we even said, "This is a very promising lead. Let's hold on for two days before chasing up schools for data, because this seems quite a convincing explanation. The system may be showing up missing results that are not missing at all: if Mike Russell is entered for maths 1 twice and there is one result, that shows up as a missing result." That proved to be the source of a tiny proportion of the missing results. We followed five or six audit trails of that kind.

I assure you that this was the most inexplicable and frustrating experience of my professional life: discovering that we had a problem of vast volumes of missing data for which we could not find a reason. As David Elliot said a moment ago, a judgment call had to be made. The clock was ticking towards August. Whether it was 9 August or 16 August, it was still ticking. Eventually, we had to concentrate staff resources on getting data in again and entered on the system, rather than on completing the investigation into why this strange situation had occurred in the first place.
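The duplicate-entry lead that Mr Tuck describes can be illustrated with a minimal sketch: if a candidate is entered twice for the same unit and one result comes back, a naive comparison of entry rows against results reports a phantom missing result, which disappears once the entries are de-duplicated. The data below are invented for illustration.

    # A duplicate entry makes a naive entries-versus-results count report
    # a missing result that is not really missing. Data are invented.

    entries = [
        ("russell_m", "maths_1"),
        ("russell_m", "maths_1"),   # duplicate entry for the same unit
        ("smith_j", "maths_1"),
    ]
    results = {
        ("russell_m", "maths_1"): "pass",
        ("smith_j", "maths_1"): "pass",
    }

    naive_missing = len(entries) - len(results)
    print(naive_missing)  # -> 1: a phantom missing result

    # De-duplicate first, then check which distinct entries lack a result.
    real_missing = {e for e in set(entries) if e not in results}
    print(real_missing)   # -> set(): nothing is actually missing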

Did you consider abandoning the entire system and starting again?

Ron Tuck:

That is not possible.

So you were locked in to a system that did not work. You did not know how it worked.

Ron Tuck:

By this time candidates had sat their examinations.

We are talking about data. You were locked in to a data handling system that was flawed and you did not know why. That is what you have just said.

Ron Tuck:

No, I am not saying that. I am saying that we had a problem of missing data and we could not understand why data had gone missing. It was possible at least in theory to pull in those missing data, and to a large extent we succeeded.

When 9 August arrived, I honestly believed that data were missing for 1,500 candidates. The number turned out to be more than that. The strategy that we followed—of trying to retrieve the missing data—was essentially feasible; it just did not succeed.

David Elliot:

That is right. We had no alternative. One of the most demanding features of a public examinations system is the rigidity of its timetable. The start of examinations cannot be delayed and, although the issue of certificates can be delayed by a week or so, the system allows almost no freedom. We had to work towards the issue date. We thought that we had cracked the great majority of the data. The balance of advantage was to let the vast majority of candidates for whom there were accurate and complete certificates receive them on the due date.

I am slightly confused about why you did not know how many results were affected and why the number of results that you thought were affected kept chopping and changing.

Ron Tuck:

They did not; they fell.

Nicola Sturgeon:

Nevertheless, you did not seem to know with any consistency what the figures were.

Mike Russell talked about the check that you run to throw up which pieces of data are missing. I understand that the practice is to run that check on the final file, which is sent on to produce the certificates.

Ron Tuck:

Correct.

Was that check run on the final file? If not, why not? If it was, surely you must have known with some certainty how many results were affected.

Ron Tuck:

That check takes a statistically large sample. It involves verifying the results of all the candidates at one school. Staff produce the results for all candidates manually, and those are checked against what the computer throws out. If that works for all subjects and all candidates in one school, it is a statistical certainty—near enough—that the whole system will work. That is what the SQA, and the SEB before it, have done for years. That is a test of what the software does with the data that are entered; it is not a check on missing data.
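As a sketch of why that verification check cannot detect missing data: it compares the software's output with hand-calculated results for the candidates who are on file, so a record that is absent from both sides never appears. The identifiers and grades below are invented for illustration.

    # Compare hand-calculated results for one school against the
    # software's output. This validates the processing of data that are
    # present; a record missing from BOTH sides is never flagged.

    manual = {"c001": "A", "c002": "B", "c003": "C"}     # worked by hand
    computed = {"c001": "A", "c002": "B", "c003": "C"}   # software output

    mismatches = {
        c: (manual[c], computed.get(c))
        for c in manual
        if computed.get(c) != manual[c]
    }
    print("software verified" if not mismatches else mismatches)
    # A candidate who never entered either dictionary produces no
    # mismatch, which is why this check says nothing about missing data.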

I would like to clarify some questions about the missing data. Are we talking primarily about paper or electronic data?

Ron Tuck:

I think primarily paper, but you must bear in mind that I have been out of the organisation since 11 August, so it is difficult for me to give an accurate answer.

When you were in charge and you became aware that data were missing, were you under the impression that they were paper or electronic data?

Ron Tuck:

I would say predominantly paper, but David Elliot might have a view.

David Elliot:

We tried to keep an open mind. We ran checks on the software systems. It was up to education authorities and colleges to choose their own software suppliers. Phoenix and the Strathclyde educational establishment management information system—SEEMIS—were dominant in the school sector. We studied Phoenix and found one problem, but that related to one diskette. We checked SEEMIS. It seemed to have a problem on 3 May, but apart from that all data were being transferred accurately. We kept an open mind and considered electronic and paper causes. However, I found it hard to accept that paper forms were sitting unprocessed in the office. I instigated several checks to ensure that no unaccounted-for forms were languishing in the organisation. We cleared new accommodation in which to store the forms, because that explanation seemed so implausible. I am in the same situation as Ron Tuck. I do not know what the SQA has discovered since 11 August about the predominant explanation.

Mr Monteith:

The clear implication of both papers in the evidence that you submitted is that, although the software was delayed, which caused knock-on effects, it mainly worked. The problem was not the software, so you are suggesting that it lay with paper that had somehow been mislaid or with data entry into the system—or do you use tapes?

David Elliot:

Paper forms are sent to data punch bureaux to be punched and then turned into an electronic file that can be input. We had difficulty locating sufficient data punch bureaux because they are beginning to disappear from the land. I am not sure whether all the checks were carried out on the data when they came back from the bureaux. Staff were being overwhelmed.

Peripheral aspects of the IT, such as the screens that one uses to amend data, did not work quite as smoothly as we would have liked. There were delays in inputting amended data, and that contributed to the general difficulty that operations staff experienced in doing all the checks that they would normally like to do. Late marks data were being fed in long after they would normally have been put to bed, so staff were not free to do query checks. The normal query resolution process—which was needed so much more this year—was affected by the input and management of the core data. We were so constrained for time that, clearly, we did not resolve as many of the queries as we should have.
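One form that the checks Mr Elliot mentions might have taken—purely as an illustrative sketch, not a description of the SQA's actual procedure—is a batch reconciliation between the forms dispatched to a punch bureau and the records in the electronic file that comes back. The batch names and counts are invented.

    # Reconcile the number of forms sent to a data punch bureau with the
    # number of punched records returned. All values are invented for
    # illustration.

    dispatched = {"batch_017": 250, "batch_018": 312}   # forms sent out
    punched = {"batch_017": 250, "batch_018": 309}      # records returned

    for batch, sent in dispatched.items():
        returned = punched.get(batch, 0)
        if returned != sent:
            print(f"{batch}: {sent - returned} forms unaccounted for")
    # -> batch_018: 3 forms unaccounted for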

In effect, the quality checks that would normally have taken place were overtaken by events.

David Elliot:

In terms of resolving outstanding queries, the checks were less thorough than normal. The sort of things that have emerged since 11 August would certainly have been caught in a previous year if the staff had not been so hard pressed.

The Convener:

We have to be out of this room by a quarter past one at the latest. I know that several members have a lot of questions that they still want to get through, but I am reluctant to rush through them just now, so I suggest—and I hope that the witnesses agree with this—that we close the meeting and arrange a suitable date for both witnesses to come back. I do not know how convenient that will be, but I think that you will appreciate that a number of questions remain and that we are determined to get the answers.

David Elliot:

I would be pleased to come back.

The Convener:

I apologise for this, and I apologise to members who are waiting to ask questions, but we have to finish. I thank the witnesses for their attendance and patience this morning, and I look forward to seeing you again fairly soon.

Before anybody else leaves, the committee has to decide which items of written evidence will be made available to the public. I suggest that we make all the papers that we have seen this morning available.

Members indicated agreement.

That would be the second issues paper that we received from the two individual members.

And the material from the SEED?

Yes.

Given what has been said on the record and in public about the circumstances of Jack Greig, it would be appropriate that that paper be made available as well.

The reissued paper is on the table. You should have received it.

I mean available to the public.

Yes. I remind members that we are meeting again on Monday in Glasgow. We will meet at 1.30 pm to discuss our questions, and we will start taking evidence at 2 pm.

From whom are we taking evidence?

Evidence will be taken from SQA chairmen and board members, from the Scottish Parent Teacher Council and from the Scottish School Boards Association.

At that stage, will we discuss arrangements for Monday 9 October?

We will need to discuss arrangements for Wednesday 4 October and Monday 9 October.

I thought that the meeting on 9 October was going to be a different type of meeting.

The Convener:

It is. We will e-mail members this week with information about that meeting. We will discuss it next week as well. Obviously, we will also have to continue with this meeting, so that further questions can be put. Again, I will e-mail members, and we will schedule that as soon as possible.

I will contact members about the fact that information was made available late on, and I will suggest ways of ensuring that that does not happen again. I will listen to any members' comments on that.

Mr Monteith:

Notwithstanding some discussions that took place earlier, I would like to make my position clear. As spokesman on education for the Conservative group, I will feel entirely free to comment on any matters that are in the public domain and that were discussed here. I will seek to couch my language carefully, but if I am asked for comment I do not think that I can ignore any of the information that is in the public domain.

The Convener:

No one could say that, throughout the introduction to this inquiry, I have not sought to ensure that everything is as public as possible. However, I would say to you, Brian—and to all members of the committee—that if this committee is to act effectively, we need to act as a united committee. Members should ensure that any comments that they make reflect the fact that we are acting as a committee of this Parliament.

Mr Monteith:

Indeed, convener—but you will find that nothing that I have said so far has been an attempt to score points or to gain any party political advantage. I will take your points on board. However, if matters arise in the public domain that require comment, there are a number of different hats that members have to wear. Some parties have only one representative on this committee, who, by definition, is education spokesman.

I have never thought that you were in any way disadvantaged by being the only member of your party on the committee, Brian.

Nor have we.

However, it is important that members bear in mind their responsibilities as members of this committee of the Parliament. That is all that I will say on the matter.

Meeting closed at 13:11.