Good morning to all our witnesses; we are pleased to have you here. We have now had an opportunity to examine a paper from ADES and we will follow the usual process. I will give the witnesses a few minutes to introduce themselves and to add to their written statement. We are especially pleased to see Victoria MacDuff, and we will be interested to hear her comments. Then I will open up questioning to committee members. Will Mr Bloomer introduce the rest of his team?
ADES is delighted to have the opportunity to present oral evidence. Our team consists of Michael O'Neill, the previous president of ADES and the director of education in North Lanarkshire, and me, the current president of ADES and the director of education in Clackmannanshire. We have been fortunate to be able to bring with us Victoria MacDuff, a sixth-year student at St Modan's High School in Stirling. Victoria will amplify two points that relate to our evidence, if you will allow her a few minutes to do so at the end of my opening statement.
Absolutely.
Thank you very much.
Thank you. We noted the section in your submission headed "The Way Forward" and welcome your recommendations. We will come back to those in questions. Victoria MacDuff would like to add something to what has been said.
I am head girl of St Modan's High School, Stirling, and I chair Stirling Council's students forum. As a fifth-year pupil in the academic year 1999-2000, I experienced several problems, such as the late arrival of materials and constant changing of courses. This year I am doing higher modern studies and things have not improved. Some of the internal assessments were removed on Friday last week, and the materials for advanced higher Italian and French arrived last week. The course started in June, so, as members can imagine, that has caused problems. I feel that I am suffering a substantial increase in work load, because I am having to catch up. On top of that, there are internal assessments to be completed.
Thank you; that was clear. Pupils and teachers are not the only ones who have been left with misunderstandings; we have found it difficult to come to grips with the issue as well, and your own experience has added to our understanding of how difficult the process has been.
Before we start our line of questioning, I would like to return to something that Keir Bloomer said about who was responsible for the flaws in the design. Could you elaborate on that, Keir? It would be helpful for us. Who do you think is responsible?
I would be happy to do that. Michael O'Neill may wish to contribute as well. The design arose, as you know, out of the political discussions following the Howie report, and the programme was initiated during the period in which Michael Forsyth was the Secretary of State for Scotland. The detailed design passed to the higher still development unit, and throughout the period in which the development was going on, a lot of the stakeholders within the system were extremely concerned about aspects of the development and delays in the implementation; much the same things that Victoria MacDuff spoke about in relation to this year.
The committee may wish to reflect on the way in which national curriculum changes take place in Scotland. I was part of a team that helped to introduce standard grade. The model for standard grade was that the education authorities of the then nine regional councils worked closely with the Scottish Consultative Council on the Curriculum to produce the programme. In the early days of higher still the model was different in that, for reasons unknown to me, there was no role for the SCCC, despite it being the main advisory body to the Government on the curriculum.
You mentioned the paper that you submitted to the minister in 1997. Your submission also refers to that paper and goes on to say that the problems that you raised at that time were not addressed. Could you expand on that point? Which problems were not addressed?
The paper that you refer to, convener, was sent to the minister at the very beginning of 1997. In other words, it is almost four years since we first raised those concerns.
Could you clarify which minister you are referring to? There have been several.
It would have been Raymond Robertson at the beginning of 1997.
Are you saying that you did not get a constructive response from the minister and that he did not want to take on board the concerns that you had identified?
I cannot say that I recall what the written response of the minister was, since I was not president at the time. More important than that is that we did not get a proper response with regard to what the higher still development unit did next. The consequences of that lack of response are evident to us now. Michael O'Neill could probably answer your question better.
We had meetings with about five ministers, who were concerned and quite rightly allowed for delays in the programme. Ministers listened to us and wanted to delay the process. However, the point is not about delay, although delay helped with the delivery of the materials that Victoria MacDuff mentioned. Issues such as the design of the programme, which were professional and technical issues rather than matters for the minister, were not tackled. In particular, the clash of cultures between SCOTVEC and the Scottish Examination Board over how assessment would be carried out lies at the root of many of the problems that schools experienced.
You make a clear statement that you are in favour of the general principles that underlie higher still, but your submission goes on to identify such a number of problems with it that we can hardly take your endorsement as ringing. You state that there was no philosophical debate and identify the difficulty arising from trying to reconcile the systems. Do you think that the systems can be reconciled? Is it possible to reconcile the differing demands placed on an examination system designed to meet the needs of youngsters in academic courses in schools and also the needs of youngsters in other courses and of people in the further education sector? Is it unreasonable to expect those to be reconciled?
It is precisely because our paper goes on to point to many difficulties that I thought it worthwhile to indicate in my introduction what I mean by support for the principles that underlie higher still. I mentioned two points, if you recall.
Do you take the view that an increased emphasis on internal assessment can lead to a dilution of standards?
It might be helpful if Michael O'Neill commented as well. It is necessary to be clear about what the objectives of the separate elements of the assessment programme are. The aim of internal assessments tends to be to demonstrate simple competence in what has been covered by a particular unit, whereas the external examination is designed to serve the traditional need for examination at a certain stage of the school career, which has to do with entry into higher or further education or into other opportunities. Those are different objectives, which need to be reconciled.
In a previous life, I spent 15 years teaching the programme in a secondary school. The difficulty now relates not to the level of assessment, but to the need to record assessments formally, to transfer the assessment data and to deal with assessments in a different way. Schools, and I am sure FE colleges, have always used—recorded and kept—internal assessment for diagnostic, formative and summative purposes. Schools have always used prelims. There have always been unit assessments. In the previous system, an appeal was based on the school's evidence and the system worked well. The difference now, and the issue for us, is the rigidity of the system—the need to approach assessment in a certain way and within certain time scales—and the significant volume of paperwork involved in recording and transmitting the data to the SQA. The question is whether it is necessary.
You have said that the over-complex certificate should be abandoned. What should the certificate reflect?
When I opened my certificate on results day, I thought that it was very complicated. It was difficult to distinguish between the grades from this year and those from previous years. The writing was in small type, although bold type was used. I have brought a copy of my certificate. It is not at all clear, as it has many pages and units.
But what should the certificate include to reflect the broad needs of youngsters doing courses, as opposed to just those of youngsters for whom highers are a stepping-stone into higher education? What is the minimum amount of information that certificates should include?
The unit information that the certificate contains is unnecessary where there is course information. The core skills information is not yet in a form that adds anything useful to the information about the pupil's performance. Those are the areas in which there could be simplification.
I will return to a point that was raised earlier. You have referred several times to the importance of the development of higher still in what went wrong. The implication of your submission is that you spotted the difficulties and alerted the appropriate authorities to them beforehand. In section 3, you say that
When the letter to which you refer was written, I am not sure whether the SQA even existed. The letter was directed to the minister. I do not think that our expression of concern was unique. Many other stakeholders in the system at various stages expressed concern about how the programme was going, the lack of readiness of aspects of it and its over-complexity. Our concerns were merely part of a fairly steady stream of similar kinds of material that were directed to the minister and then to the SQA.
I am interested because when we have examined many of the concerns that were raised, it has turned out that they were about assessment in the classroom and the difficulties that teachers have faced. What has emerged so far is that very little attention was paid to the SQA's inability to handle higher still until problems occurred in June, July and August. We are considering the SQA's handling of this matter.
I will answer, as someone who was in the higher still development unit. As Keir Bloomer pointed out, this debate has gone on for several years and the SQA did not exist in its early stages. We have to remember that the SQA's job is to deliver rather than to devise the examination system. If our debate is about the nature of the exam system and about how the assessment was devised, it is true to say that at various points in the process the association made clear that the assessment model that was being developed was potentially too complex, weighty and hard to deliver, whoever was asked to deliver it.
Were you not referring to teachers? That is the implication of the evidence that we have seen so far. If you were not referring to teachers but to the assessment body, I would be interested to see the letters in which you made that point.
There are two separate aspects to that. One aspect, to which you referred, is well documented: the concern that was expressed by many people—not just ADES—was that the formalised model of internal assessment, and all that went with it, put additional burdens on teachers and took them away from teaching. The model that involved a sequential period of assessment—a linear assessment or progress through modules, which was the SCOTVEC approach and which did not necessarily fit the system comfortably—was also a change for classroom teachers. There are plenty of examples of comments that were made about the assessment that was devised and how it was going to put a burden on the learning and teaching process that was not helpful and that did not add anything to that process.
We raised concerns about the management information system and about information technology at the time, but those concerns focused on the difficulties experienced by schools more than on our anticipation that the SQA would experience difficulties in those areas. We also drew attention to our belief that insufficient attention had been paid to the practical implications of recording, reporting and conveying information. Although we did not elaborate on that issue, it emerged subsequently as a significant concern.
We will move on to the situation that arose during the summer months.
That situation also ties in with the topic that Kenneth Macintosh touched on.
The consequences to which we refer at that point in our submission are those for data management, which emerged out of the internal assessment arrangements. As I just said to your colleague Mr Macintosh, in early 1997 we did not anticipate the extent to which information overload would bring about the breakdown of the system.
Mr O'Neill, you said that it seemed to be the worst of both worlds, because putting the SCOTVEC and higher systems together led to an overload, as Mr Bloomer put it. Have you heard anything from the SQA in the months since the shambles that suggests that the burden might be lightened? What you have said this morning suggests that things are no better. There have been announcements from the SQA, but are you reassured in any way?
I am not sure whether I feel reassured at this time. However, I reiterate that the SQA's role is to deliver the system; it is not the SQA's role to make changes, unless it is instructed to do so. We have taken part in negotiations and discussions in a variety of working groups trying to simplify the system for the current year, so that the problems that Victoria MacDuff is currently experiencing do not continue into next year.
I would like to pick up on a point from your earlier oral evidence that was not mentioned in your written evidence. You observed that the Scottish Consultative Council on the Curriculum was not involved in higher still in the same way as it was with the introduction of standard grade, and that the higher still development unit took on that role instead. From your experience of the introduction of standard grade, why do you think that that happened? Why was the SCCC not involved in the same way?
I am not particularly clear about that. I posed the question on the basis that Learning and Teaching Scotland, as it is now called, remains the main advisory body on the curriculum. That is the body that delivered standard grade and has been heavily involved in delivering the five to 14 programme. It is not clear to me why the SCCC, as it was then, was not charged with delivering higher still, but I am not in a position to comment on why that decision was taken.
On the issue—or the non-issue—of results and printouts, paragraph 4.14 of your submission mentions the
I am not aware of our having been given any prior notice about that. As you know, we were advised comparatively late in the day that a small number of certificates might be slightly delayed. That is as much warning as we got about the difficulties that arose. We have not subsequently had specific reassurances in relation to the matter of information printouts going to schools. That is something that we hope will be handled properly in the next diet of examinations, because, as our submission states, it is extremely difficult for schools to assist pupils in pursuing inquiries at the point of issue of the certificate unless a printout is available.
ADES's submission mentions the difficulties with marking and moderation. In today's oral evidence, you said that you were aware of problems with the quality management of the marking, but were not aware of deficiencies in the marking. Have head teachers or teachers notified directors of education of any dissatisfaction with the marking? From the evidence that the committee has received, it appears that head teachers in particular have concerns about the quality of the marking, based on their expectations about not only the performance of some pupils in exams but the outcomes of appeals. Have representations not been made to you, or do you not accept such evidence if it is presented to you?
Mr Monteith raises a couple of relevant points. The appeals process exists to deal with such inquiries, of which there are a fair number every year. One of the problems—I hope that it will not be long-lasting—resulting from the summer's disastrous events has been a querying of the marking, which has never happened before. Examination and assessment are not exact sciences. Without question, there have been some difficulties involving marking and wrong assessment in every year that the examinations have been conducted. We do not maintain that every paper by every candidate has been marked in a manner that is beyond reproach. Whenever human judgment is central to a task, human fallibility is a factor. However, we have no reason to think that the quality of assessment is lower than the high standard that has traditionally pertained.
Is it therefore fair to say that if those philosophical aspects are not reconciled, the level of appeals will be higher than it was before?
That is difficult to say. In principle—for the reasons that Michael O'Neill gave about the availability of different levels of course—the number of appeals should be lower. However, other factors are operating in the opposite direction. One of them is obvious: the events of the summer have created a climate in which it is clearly expedient to appeal. It will take us some time to claw back from that situation and ensure that appeals have a genuine basis in the future. If the underlying problems to which I just referred are not addressed, they will be another source of future appeals. Mr Monteith is right to point up that issue. That problem will increase the number of appeals until it is addressed.
Your paper mentions the difficulties with moderation procedures. Were there any early warning signs to directors of education about problems with those procedures? In this committee, we have heard of examples of papers being returned unused. What was the reaction of directors when that was discovered?
I am not sure that I can comment on that exact point, but I can respond to the general point about directors' awareness of the growing problem in April, May, June and July. In April, I wrote to the SQA on several occasions. At that point, there was the beginning of an awareness—at both school and local authority level—that the scale of the problem might be greater than anyone had expected. As Victoria MacDuff has pointed out, the problems of materials arriving late that are being experienced now also arose last year, when teachers made do and used other materials.
I would like to pick up on Brian Monteith's point about moderation, and to talk about the past and the future. With the new system of internal assessment, how do the demands of moderation and validation of the work in schools impact on you? I am thinking about the demands that are made on schools to release people to do moderation. In evidence that we took from the higher still development unit, we learned that it had originally asked for 15 days of a moderator's time, but then said that it could make do with three or four days. How that could happen I do not quite know.
That is okay. Ian tells us what he thinks before he lets people answer his questions.
Mr Jenkins has identified a significant problem, which relates back to earlier comments on the nature of the assessment system. Two assessment systems are being merged; I commented about the worst of both worlds. The operation of the examination system has, for years, required little moderation or validation, because the external examination provided that. The exam board operated a system of concordancy.
To pick up on what you said about the second diet, in your evidence you mention the possibility of putting things such as the core skills on hold—sorry, I am telling people what to say again. There seems to be a logic about suggesting that we hold on and keep it simple for the moment. Would you comment on that?
You can say yes or no, if you wish.
Yes. There is compelling evidence to suggest that the group awards and related core skills in the second diet should be put on the back burner while we sort out the current system, so that pupils in the system do not suffer.
I want to return to validation and reliability. Problems arise if the validation system is not operated properly. I must be careful what I say, but that is why SCOTVEC was sometimes not given as much respect as it might have been. Sometimes a course was validated one year, then the following year someone came along and questioned and changed it. Is there a danger of that problem continuing?
I am not sure that there is a problem there. I suspect that that problem did not arise in the school sector, where schools were validated to provide courses under a local authority umbrella and the local authority had responsibility for quality assurance and for ensuring that staff were qualified to deliver the course. It may have been a problem outwith the school and formal further education sectors, with private providers and trainers, where SCOTVEC wanted to ensure that they were able to provide the course. Perhaps, in different years, a provider could not be validated for that reason.
Your paper talks about "lack of customer focus." What needs to be done to make the SQA management more accountable to stakeholders?
Our perception is that the main form of accountability of the SQA over the past few years has been purely political and that it has been placed under pressure to deliver programmes in a particular time scale. The questions that have been asked of it have related to that demand more than anything else.
You also talk about strengthening the SQA board. Would that facilitate the relationship between the stakeholders and the SQA?
Clearly the board is a mechanism through which stakeholder interests are represented. We think that it is worth considering seriously how the role of the board could be strengthened. The alternatives to doing that, to which our paper also refers, are much more drastic. The board is likely to be keenly apprised of the need to take an active role in monitoring the work of the organisation. It could be argued that it is more keenly aware of that need than it was six months ago. That may help the board to be more responsive than it has been to its constituents.
Does the board need to be restructured? Should there be a change in the way appointments are made so that there is a model that is more representative of stakeholders?
Yes. We have not considered in detail whom we would like to be involved, but we think that it is important that the board should be more representative of the range of stakeholders. We are only one kind of stakeholder. We are partners in the management of the system, but there are others who are more clearly customers than are the local authorities. The full range of stakeholders needs to be represented effectively on the board and to be confident that the board is representing its interests in the management and operation of the SQA.
I want to explore your opinion on the role of an intermediary body or commissioner, who would work between the Executive and the SQA. Would such an arrangement work and be helpful?
We have stated in our submission—and this is a firm view of ADES—that we do not think that the suggestion made by some people, at least early on in the crisis, that the SQA should be more closely integrated with the Executive is a wise one. So many of the problems in the management of change in Scottish education generally, including the management of higher still, have their origins in over-centralisation that further centralisation of control would be a retrograde step. To that extent, an intermediary would be a much more satisfactory outcome from ADES's point of view.
I want to return to moderation, which Ian Jenkins asked about. In your written evidence, you say:
Reassessment is not a problem as far as moderation is concerned; the problem is one of wasting time. Reassessment concerns teachers greatly because of the requirement under the current system to ensure that a young person who is completing a higher course has passed its three internal units. Teachers are naturally extremely reluctant for young people to sit the assessment unless they are sure that they will pass. A lot of time is therefore wasted on teaching the bit of assessment concerned, giving it as a piece of homework, doing a practice in class and then doing the assessment. That was the old SCOTVEC style for units.
The point about the lack of consistency relates to the use of teaching time, not to standards—is that correct?
Yes.
You made a number of points about standards, validation and moderation. The SQA was not able to answer our questions on this fully; are you able to take a position on whether the exams were moderated and validated this year to the usual standard? Do you have any evidence that they were not? Are there serious questions about the quality controls that were put in place over all the exams?
We know that certain quality control mechanisms were not in place. The standardisation procedures were not carried out. We have concerns about that.
Yes, you have made that point.
I know, but it is relevant to what you are asking now. We feel that that concern should be appropriately addressed by giving greater emphasis to the importance of the external examination component and refocusing—not entirely, but partly—the internal assessment on its diagnostic value. That would resolve some of the issues that members have been talking about, especially in combination with the point that Michael O'Neill emphasised a couple of times: there is no point in giving huge emphasis to the transmission of data on individual units when the course itself is likely to be completed successfully.
You have made that point a number of times. I am trying to find out whether you think that this year's exams were not up to the standard that we would expect.
We regret the fact that the standardisation and concordance procedures did not take place.
You mention unit registration in your written submission. Paragraph 5.2 says that
Yes. The comment in the written evidence, which we have recently amplified, is that the vast majority of people—the school customers, so to speak—who take part in courses such as highers, according to the SQA's advice and evidence to us, are school pupils and are doing complete courses. The exception to that is a small number of new higher subjects such as travel and tourism and hospitality. We have argued that, for people doing complete courses, the matter of which units make up those courses is not of particular relevance or interest to future employers or to the university system. That information should therefore not be registered.
I ask members to try to wind up this section of questions. Jamie Stone has a brief question.
My question concerns housekeeping; I always ask about Her Majesty's Inspectorate of Schools. You make no reference to HMI in your written submission. Do you have anything to say to us about its past role, possible future role or any changes to its role?
The view of ADES on the role of HMI is very clear—the function of HMI is, at least at national level, to be the main quality assurance mechanism in Scottish education. That function is compromised seriously by involvement in policy formulation. That has been a feature of various developments, particularly over the past decade, and we have seen the consequences of that in relation to a wide range of things.
Irene McGugan will ask absolutely the very last question.
In setting out your recommendations, you rightly draw attention to your concern for young people. I am sure that everyone on the committee shares that concern. At what stage, though, would those recommendations need to be implemented to effect any improvement for students next year? We heard of Victoria MacDuff's experience of the late delivery of course materials. Some of the recommendations are fairly significant—they would not be easy to bring about or implement—and I wonder what time scale we are looking at, if the next cohort of students is not to face the difficulties that were faced by Victoria and her colleagues last year.
In our paper and our initial presentation, we separated into two groups the actions that we think should be taken. Some actions require to be taken over a period of time to address what we regard as inherent difficulties in the programme. Those actions will not be taken in time for next year's diet of examinations, or in time for the subsequent diet. They are necessary, but are not part of the programme that we suggest to tackle the immediate problem of simply ensuring that the examinations operate properly next year.
Thank you for your contribution this morning. The committee is grateful for the points that Victoria MacDuff raised about this year's situation. Next week, we will have the opportunity to raise the issues with the minister and, we hope, with Bill Morton, the chief executive of the SQA. We hear what you are saying about the fact that the situation should have been resolved by now, and that, if it is not, we have a very short time scale in which to ensure that it is sorted out. The committee takes that very seriously. Thank you for answering our questions this morning—particularly Victoria MacDuff.
COSLA's submission covers the political, professional and policy spectrum. I am Danny McCafferty, the education spokesperson for COSLA. Gordon Jeyes is the director of children's services in Stirling Council and David Henderson is the head of policy development. With your indulgence, I would like to make a few opening comments, as I suspect that most of the questions will focus on professional policy.
Thank you. You have raised several points that members will want to return to in their questioning. Have you made any representations on finance to the Minister for Children and Education?
Individual local authorities that have made costings have submitted their own bills.
You mentioned HMI. I believe that Jamie Stone has some questions about that.
In your written submission, which I have read and taken on board, you make considerable mention of HMI. Do you wish to add anything to what you say there? Your points are clear and well made.
There is a difficulty in the discussion of policy because of the different meanings that can be attached to that word. In the education service, a school will have a policy for each aspect of how learning and teaching is taken forward. When schools talk about policy, they are thinking about something as detailed and fundamental as guidance to staff on the way in which they work. Those policies are heavily influenced by HMI reports and advice. For that reason, schools associate them with HMI policy. When Douglas Osler says that ministers make policy, he is absolutely correct. However, he is using the word "policy" in a different sense. That causes the debate to become confused.
Thank you for that. I would like to narrow the focus to HMI. In the previous evidence, the word "compromised" was used. I want you to focus directly on HMI—where it is and where it may go in the future. At the chalkface it is alleged—you will have heard this from the teaching profession, just as I have—that there was a breakdown in communication with HMI about the problems that were becoming apparent. That is just an allegation; I am not saying that it is right or wrong. Would COSLA want to associate itself with that view?
I have received anecdotal evidence from teachers in classrooms that I have visited—and I have no reason to disbelieve their claims—that when inspectors visited schools teachers flagged up on-going concerns to them. Teachers feel that they were ignored.
We have no reason to doubt that inspectors were passing the message on. "Compromised" is probably too strong a word. I am sure that, as the year went on, the message came through from the part of HMI responsible, through Graham Donaldson, to Douglas Osler. I am sure that it was put across and that reassurance was sought.
You seem to be saying that HMI is a wagon that has only three wheels and will not go very far. You have outlined succinctly what is wrong: the fact that HMI is acting as both judge and jury. I would like to get down to the nitty-gritty. What changes would COSLA advocate in the role of HMI? Is legislation needed? How important is it that there should be changes?
We think that it is very important. HMI has developed quality assurance through performance indicators extremely well, but it has been compromised—that word again—by being too involved in development. COSLA is on record as saying to this committee and others that it would like the functions of quality assurance and policy development to be separate. We heard from ADES about the SCCC being to some extent sidelined under the previous Government, when it was being developed as a non-departmental public body rather than as a public committee. When it moved to Dundee, it was intended to act as a profit centre. That may have something to do with the fact that development work was entrusted not to the SCCC but to the HSDU, which was run by a chief inspector of schools.
The setting up of the Parliament gives us an opportunity, which we should acknowledge, to think about whether what was suitable in the past will be suitable in the future. New partnerships between local authorities, ADES and the Parliament may be a more appropriate way of developing policy. Perhaps HMI's role should be that outlined in the Standards in Scotland's Schools etc Act 2000—to carry out inspection of authorities.
I would like there to be more accountability. Perhaps we should focus on the SQA board and how it is managed. Do you have a view on how the SQA board could be restructured or strengthened?
We have asked for a number of things. COSLA has made proposals on best value in response to a request from the Government. The Government has given a commitment in principle to extending best value across the whole public sector. The SQA, like all non-departmental public bodies, will be affected by that. We have asked for intervention powers and set out how they might work. A range of bodies would be involved: COSLA, the Society of Local Authority Chief Executives and Senior Managers, Audit Scotland and inspectorates.
Focusing and strengthening the board is necessary, but it is insufficient because the board, by definition, is part of the SQA. The most straightforward way of improving the credibility of the SQA in the eyes of the public is, as with other monopolies, to appoint an independent regulator. Any other solution would be too radical to get the change that is needed straight away.
The most important way of strengthening the board would be to define its function, because its strength would develop out of its function.
Gordon Jeyes has answered my next question before I have asked it.
My apologies.
That is okay. I want to ask about powers of intervention. How would you view the introduction of an intermediary body or a commissioner? Do you think that that would be helpful?
Given where we are now, it is the most logical option. As Keir Bloomer said, the difficulties are the result of the creation of a monopoly. We must ask whether we knew what we were doing when we chose that course. If this is supposed in part to be a response to the marketplace, where is the market testing? SCOTVEC could have gone on to develop national certificate highers, which could have operated in competition with Scottish certificate of education highers. Universities and other recognising bodies could then have made judgments accordingly. If having a profile of core skills was regarded as a significant advantage, schools would have ensured that their candidates acquired that advantage.
Who should be represented on such a regulating body?
Since the difficulties in the summer, there has been a series of meetings with the stakeholders, which have been chaired by the Executive and attended by the SQA. Those have been effective meetings of a consultative group that clearly stands on the outside. The directors of education are monitoring appeals and will report to that group and then to the ministers. Such an arrangement could be formalised. However, the SQA's reaction is that it does not want the creation of another group to which it will have to account. We are not fussed if the idea is not pursued, as long as there is somebody to whom the SQA gives effective account. The initial meetings of that group have been very effective.
I take it that the intervention model for local government that is described in the submission will apply here. Is that model intended to clarify the relationship between the SQA and the Scottish Executive or regulator?
Yes. We are developing policy on this front and have made proposals. The best value advisory group, which we expect the Government to set up shortly, will make proposals on intervention. The proposals will deal with intervention across local government, but we would like them to be widened. This relates to the discussion about the nature of the intervention role, which has still to be decided. We suggested that intervention powers should be as they are at present for the minister but that they should be subject to parliamentary scrutiny through affirmative resolution.
This matter has been raised in Parliament, because the power of ministers to intervene seemed to be unclear. Is the proposal for a protocol rather than for legislation to regulate the minister's relationship with the SQA?
In effect, yes.
Where does the regulatory body that you recommend fit into the model?
It would require legislation. We have suggested that there needs to be independent regulation of the SQA. What I was describing separately were proposals that are being developed for local government, which I think read across to this matter. We have not reached the stage of marrying the two ideas, although I think that it would be sensible to do so.
The proposed model is for the relationship between the minister and the SQA, but the regulator would be a separate entity.
Yes.
You have talked about regulation and previously you talked about HMI having a split role of inspection and policy making. If the inspectors are not to generate policy, should there be a forum where policy is debated? From where do you think policy should be driven?
In fairness to HMI, I should say that it took the lead from standard grade through five to 14 because there was a policy vacuum. I suspect that the balance has shifted over the past few years, as we moved towards the new constitutional settlement and had a more assertive Executive. In addition, ladies and gentlemen, the new element is you. It would have been interesting to have had a debate and the introduction of some common sense, rather than fashion dressed up as policy, during the development of higher still, had it been subject to parliamentary scrutiny. Such scrutiny is a crucial test of future policy development.
Sometimes it takes a tragedy to bring people together in unity. Many organisations with a single vested interest have appeared before you in this investigation. There has been an extraordinary consensus on finding solutions rather than problems. We should learn from this disaster, take on board the wealth of knowledge that exists in Scotland and consider new mechanisms and a new type of forum in which people can have ownership of how policy is developed, rather than having it handed down from on high. There is potential, but we have to address our minds to the matter.
At the beginning, you said that the unit structure of higher still sprang fully formed into the public arena and was never really debated. Where did that structure originate?
As has been said, Howie came up with an excellent analysis of the problems of the fifth-year rush, but the Scottish education community, for a variety of reasons, had difficulties with some of the solutions, which were characterised as twin track. There was detailed consultation and a document, "Higher Still: Opportunity for All", was produced by HMI for the Scottish Office as a solution. We went pretty quickly from that to detailed discussions about the technicalities. Douglas Osler is right to say that there has been more consultation on higher still than there has been on any other development, but it has been at the level of technical details rather than of building proper support.
Instead of a two-term dash, we had a two-term obstacle race, with the obstacles being shifted.
Yes, but the jury is still out on higher still. It has many merits and candidates have gained from it. I speak also as a parent. In keeping the pressure on young people throughout the year, the system can be seen to have raised achievement. There is no reason to doubt Douglas Osler's evidence to this committee of enhanced learning and teaching, although I find it interesting that he makes that observation before he knows the pass rate. We could be raising achievement in learning and pupils could be doing everything right except the value-added feature of passing exams. That brings us again to the culture clash.
I find your evidence refreshingly blunt. You gave a picture of a vehicle that was misdesigned, although the intention was laudable. Assuming that your account is right, and given the present mechanisms in the Scottish Executive and agencies, how confident are you that the problems will be sorted? Do you think that if we are not careful we will see more of the same next year?
There is potential for more of the same next year. Ministers should be spoken to as soon as possible so that a proactive stance can be taken and people on the ground are listened to. The SQA co-ordinators are already beginning to flag up problems that will arise with getting the show on the road for next year unless we start things moving within weeks, not months. If we start to listen to practitioners, who know what they are doing, and if we trust them, we can start to move forward. If we do not listen or if we procrastinate, what happens will be a self-fulfilling prophecy.
In a perfect world, is there a case for freezing higher still for a year, going back to what was happening before and rethinking?
That is not necessary. When the difficulties first emerged there was optimism that this would be the high-water mark of compliance and that all of us in central posts would learn a wee bit more humility and how to listen. As the weeks have passed, I am less convinced about that; I see a regrouping around the view that what happened was just a problem with management and information handling and that there is nothing wrong with the design. I have strong doubts about that.
From what I have heard, seen and read, there appears to be nothing to indicate that higher still should be held back. However, we should have the courage to hold it back if, on more detailed examination, something emerges to suggest that we should. There is nothing wrong with asking people to slow down. We have for too long been in a culture that says that if we slow down we will miss the train. We should look at the long-term future of education in Scotland and if it is necessary to slow down we should. However, the evidence is not there for that yet.
It sometimes helps to ground an argument by illustrating it. When I raised the example of the premature reporting of core skills in the case of Victoria MacDuff, I was told by HMI that it was an anomaly. When Victoria sends her certificate to the University of Oxford to support her application, the almost wilful misrepresentation of her capacities in the core skills will not be "an anomaly". That illustrates Councillor McCafferty's point that there is something wrong. I think Willis Pickard summed it up in an article that he wrote quite early on—many of us have felt as though we were saying that the emperor had no clothes, but the reply has been, "Not at all, lifelong learning is a beautiful creature."
I do not think that I need to put my questions—Gordon Jeyes and Councillor McCafferty have already answered them more than adequately.
I should explain that, even though the questions may seem to indicate that members are beginning to make up their minds, we are still taking evidence and we will not be coming to conclusions until we have heard all the evidence.
Thank you for giving me this opportunity to give evidence. I am anxious to put on record some of the problems that I perceived during my tenure as head of operations at the SQA.
I want to ask about the awards processing system—the APS. We have not had much opportunity to discuss that. Are you satisfied that there was a need for a brand new system or do you think that the old SEB system was adequate for handling the data?
The SEB could have implemented higher still. Over the years, the SEB had grown used to implementing changes, such as those to standard grades and highers. The administration of higher still was not dramatically new—it contained internal and external assessment, just as standard grade does. The SEB developed the examination processing system—EPS—in 1995. That was a model that could have been used to process higher still results. However, higher still was not handed to the SEB, but was given to a merged organisation which did not have a computer system that could cope.
Was the operations unit practically involved in setting up the APS, to allow you to use the knowledge that you had gained from setting up the EPS?
Yes. The APS was set up on the basis of senior users. I cannot remember how many senior users there were, but the operations unit was the main user of the computer system. There were eight modules in the APS and I was the senior user for five of them. I had been the senior user for the implementation of the EPS, so I had some experience of computer systems specification. I am not a computer expert—I do not write computer programs or analyse such systems. However, I have vast experience in designing systems.
Some of the evidence that we have received suggests that, although you were trying to set up a new computer system to handle the data, data processing and validation rules were not enforced in the new system. Is that the case?
I am sorry, could you clarify the question?
There should have been data validation rules for when the centres sent exam scripts to the SEB. Did you help to draw up those validation rules?
We had a major role to play in implementing the validation rules, which were determined by senior management. For example, the validation rules were set up at the front end to ensure that candidates could not do a particular combination of subjects if that was not allowed. Validation rules were put in place to ensure that people could not do the same unit twice. However, decisions were made to allow units to be taken more than once, which meant that we had to relax many validation rules. That resulted in a large duplication of data. If centres were unsure whether they had sent unit information and sent it again, the system would end up with twice the amount of information. Often, the centres changed the completion date on the second set of data and the system was left holding both.
That has emerged as a key problem. The transfer of data—a process that the SEB had handled well in the past—was not handled well by the SQA. There were several examples of data being transferred from centres to the SQA and being entered twice. However, one would have thought that the SQA had simple validation rules to prevent data from being entered twice and to prevent the replication of data.
It was the case that identical data should not have been entered twice. However, the APS software was delivered in stages—it was not a completely integrated system. It was being delivered as we were processing data and was not properly tested because there was no time to do so. In an ideal world, one would build in a period of six to eight weeks from the delivery of the software to implementation in order to test the software, but we were getting software on the day on which it was necessary to use it.
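To illustrate the duplication problem just described, here is a minimal sketch in Python, using hypothetical field and function names rather than the actual APS record layout: when the key used to spot duplicates includes the completion date, a resubmitted unit result carrying an amended date is treated as new data and both copies are held, whereas a key that ignores the date would recognise the resubmission as a duplicate.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class UnitResult:
    # Illustrative fields only; not the actual APS schema.
    candidate_id: str
    unit_code: str
    outcome: str
    completion_date: str

def key_with_date(r: UnitResult):
    # Treats a resubmission with an amended completion date as a distinct record.
    return (r.candidate_id, r.unit_code, r.completion_date)

def key_without_date(r: UnitResult):
    # Ignores the completion date, so a resent unit result is seen as a duplicate.
    return (r.candidate_id, r.unit_code)

def load(records, key_fn):
    """Keep the first record seen for each key; later records with the same key are dropped."""
    seen = {}
    for record in records:
        seen.setdefault(key_fn(record), record)
    return list(seen.values())

# A centre, unsure whether its data arrived, resends the same unit result
# with a changed completion date (candidate and unit codes are invented).
records = [
    UnitResult("C0001", "UNIT01", "pass", "2000-03-17"),
    UnitResult("C0001", "UNIT01", "pass", "2000-04-02"),  # resubmission
]

print(len(load(records, key_with_date)))     # 2: both copies held (duplication)
print(len(load(records, key_without_date)))  # 1: resubmission dropped as a duplicate
```

This is only a schematic of the failure mode described in the evidence; a real system would also have to decide which of the two records to trust once a duplicate is detected.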
Obviously, you had to make a decision to relax the validation rules because if the software was not working, you would not get back the right information. Who would take such a decision—someone more junior in the department, someone more senior or you?
Those decisions would have been made either by a senior member of staff or by me. In most cases, they were made by a senior member of staff.
Another specific problem that has been raised is that, when data containing errors and problems came in, the operations unit tried to correct those problems by trying to re-enter the data. However, the unit did not report back to the centres.
What the unit reported was that some of the errors that were being produced were system errors. Staff did not want to repeat the punting back of pages and pages of error prints to centres, which had been done under previous regimes, and tried to massage the error prints in order to cut down on the work for centres. Unfortunately, staff underestimated the volume of the problems.
Either another member or I will come back to that issue in a minute.
Yes. The staffing for operations was handed to me by the previous director, Thomas Salvona. Then I was given an opportunity, which I took, to make my own suggestions, which were put to the senior management team for consideration. At that point, Ron Tuck took a unilateral decision to cut almost one third of the staff in operations.
When did Ron Tuck make the decision to move the SCOTVEC certification services staff?
Very early on. The first thing that happened when the SQA was formed was the appointment of senior officials, then the unit heads were put in place. At that point—about April or May 1997—the unit heads put forward their blueprint for the unit. That worked okay for the first two years, during which we operated as two organisations. We were virtually the SEB and SCOTVEC, so we still issued Scottish certificates of education and SCOTVEC certificates. Where it really bit was when we made the transition to a single, integrated system.
But at the time the decision was taken, you and David Elliot, among others, were concerned and made your concerns known to Ron Tuck?
Yes.
There were continuous concerns about staffing after that, were there not?
Obviously, when we got into the planning stages for higher still, there was a lack of input from Glasgow in the operations unit. At that point, I asked whether I could get some input from Glasgow-based staff at a reasonably senior level. It took them about six months to approve the appointment of one officer. She was in post for only about a month before she came back to me and said, "This job is massive. I need another four or five staff."
How did your relationship with David Elliot and Ron Tuck work? How did you communicate your feelings about, for example, the need for more staff?
My dealings were directly with David Elliot. I rarely dealt with Ron Tuck other than when he was at the same meeting, such as a project board. My feelings were communicated, either in writing or verbally, to David Elliot. In fact, that goes back to Thomas Salvona's time. As you probably know, David Elliot took over operations midway through the process.
I will stay on the management of the organisation. It must have been difficult for you to see some of the answers that we have had from people such as Ron Tuck.
I do not think that there was a master plan for the whole organisation for the implementation of higher still. It came in as a change and it was left to individual units to develop systems to support their units. The figure has been quoted that, unfortunately, 60 per cent of staff were not in the same posts as they had been prior to the big merger. That presented us with difficulties.
Would you agree that there was a communication problem within the organisation?
Yes.
We have also heard that there was a void in staff development. You are saying that people took on jobs in areas in which they had no experience. Was there opportunity for people to gain experience and training in aspects of work that they were asked to take on?
I can comment only on operations. We were a working unit used to working flat out in a 12-month cycle to deliver each examination. There was not a lot of slack to provide opportunities for development. We built in some slack to allow those opportunities, but the staff were unwilling to devote the time that was necessary to develop themselves in other areas. They were under so much stress and so overworked that the opportunity did not present itself.
Do you agree that that is what happens when people are firefighting?
Yes.
Do you also agree that management's role was to consider what was ahead and do the strategic planning?
Senior management lost sight of the core of the business. The operations unit, which is key to the organisation and the delivery of correct results, was not given the place within the organisation that it deserved. The operations unit did not have the number of senior managers or the quality of staff of other units. We were undervalued. The staff did a tremendous job under the circumstances, but they were overworked and undervalued.
The other point coming to us through our inquiry is the reluctance of senior managers to hear of any problems or to react to them. You raised some problems on 17 May and subsequently. What was the response from senior managers when people said, "This is going to be a problem," or, "We are not going to manage to deal with this"?
Senior management did not have a full grasp of the situation. Ron Tuck felt that the emphasis was on getting the examination under way and that everything else would take care of itself. That was not the case.
Is the suggestion that there was a blame culture in the organisation accurate?
I do not understand the accusation that there was a blame culture.
In the sense that, when people were unable to manage a particular piece of work, it was considered to be their fault rather than the result of a lack of strategic planning, a lack of staff or the department's lack of preparation in taking on new work.
Yes, that is a fair point.
I noted what you said about staff development and training. Could you expand on the staff's credentials? You said that you are not an expert in programming, but given that your unit was in charge of all data processing and information technology—
Not IT.
Okay. What professional qualifications do you have in this area? How many of your staff had appropriate professional qualifications in data processing, and which posts were they in?
I have no professional qualifications in data processing. I joined the organisation almost directly from school, so any expertise that I have in exams processing has been gained through experience. Almost 100 per cent of the operations staff had been employed by the SEB and their development had been on the job. I cannot think of anyone who had a qualification in data management or IT.
Would you say that that should not have been the case and that people with qualifications should have been recruited?
When the operations unit was set up within the SQA I sought to import staff with sufficient qualifications, but the human resources process that was in place did not allow that. Staff were matched into posts; there was no opportunity for me to say, "Look, we have staff in these posts who are well-meaning individuals, but they do not have the qualifications to carry these jobs through." That was not part of the HR process. There was no way in which we could import staff with appropriate qualifications into operations, either from within the organisation or without.
Was that factor acknowledged in what happened later?
No. Due weight was not given to that.
Given the significant problems that existed at the time, what action did you take to deal with them, and how did you make sure that what you were doing and saying was being transferred up and down the chain of command?
As you will know, I can comment only on the problems that occurred up to the end of June, because I was no longer in post within operations after then. It was apparent very early in the 1999-2000 exam cycle that there were problems. We were late in getting information out to centres, and centres were having difficulty getting their software, which was provided by commercial firms, in place. As a result, the flow of data, which normally would start in September and October and flow through until January and February, did not start until about January. That gave us a difficulty, in that information was being stockpiled.
Are you satisfied that you and your department made those concerns and problems known?
Yes.
Before we go on, I remind everyone to switch off mobile phones and pagers, as it can be off-putting if they ring.
This is a question that you do not need to answer, but you have been on the edge of disciplinary proceedings in the past—let us put it that way. If you feel inclined to tell us anything about that, we will listen with interest. If you do not want to say anything, we will have no problem with that either.
I do not understand what you mean by "on the edge of disciplinary proceedings in the past".
I mean recently, but I do not want to put you on the spot.
I am quite happy to answer your question. That is what I am here for. When I came back from leave in August, I read in The Scotsman—I was not informed in person—that I had been suspended from duty. I could not understand that, because I was not in charge of operations when, as I understand it, all the problems occurred.
Did you receive any written communication about your status, other than the fact that you were due to retire in September?
No.
You had no written communication?
When I phoned Ann Campbell, I was told that a letter had been sent to me to inform me that I was being suspended. She told me that it had been sent the previous Friday, but it was postmarked the day of my conversation with her, and I received it the following day. I think that my phone call prompted them to issue the letter.
Was your phone call provoked by reading the article in The Scotsman?
I did not read the article in The Scotsman myself. I was in Spain and my brother phoned me to tell me about it.
Your written evidence states that you were on sick leave in June. How long were you on sick leave?
For two weeks.
I do not mind if you cannot remember the precise dates, but it might help to give us a picture of the situation. As you have already said, you were no longer in post in operations in July.
I do not have details of those dates, but my sick leave was certainly prior to the fixing of the first pass marks, which generally takes place in the middle of June. Before I went off sick, we were still specifying the software that would enable us to fix the pass marks, and it had not been delivered by the time I went off. I was off sick with back problems, not stress, surprisingly enough. Although I was signed off for two weeks, I do not know whether senior management was sure that I was coming back; however, I certainly had a closed medical certificate.
You say that after your return to work in July you had no powers to influence operational matters because Bill Arundel had been brought in to replace you. Given your 32 years' experience, were you surprised that you were not asked to work in tandem with Bill, to shadow him or at least to sit in on committees with him?
I was astonished. I could not understand why, given my experience of dealing with and solving problems, I was not given that role. In the organisation's defence, other things needed to be done: the appeals system and the system for reporting results to schools had not been written, and someone of my experience was needed to ensure that they were pushed along. What I cannot understand is why I was not involved in any committees or decisions or consulted on any matters.
In previous hearings, it has been suggested that your personal circumstances meant that managers above you were perhaps not firm enough or did not press you hard enough. Do you concur with that observation?
No, not at all. I was as answerable to senior management as anyone else was.
How was your relationship with David Elliot in relation to your ability to handle the work, the level of training that was made available to you and the assessment of your handling of the job during higher still?
David Elliot took over as head of operations, IT and other divisions while I was off sick. My first dealings with him were when I returned. It was obvious that he did not have a good understanding of the operational side of things, but he certainly had enough experience of how the SEB dealt with matters to allow him to cope. When I returned from leave after the summer of 1999, David Elliot had a discussion with me in which he pointed out that there was a view within the organisation that I was perhaps not as corporate as I might be and that I was perhaps not showing the level of commitment that I should have been showing. I found that strange, as I had been off for four months on sick leave. That was the extent of the discussion, training and development that I had with David Elliot. At no time after I returned from sick leave was I offered any advice, training or support, such as having someone to shadow me.
It was not put to you that there was concern about your performance and that you might benefit from additional help or training?
No.
I now want to go back to the meeting of the examination diet 2000 group on 17 May. I understand that at that meeting you estimated that more than 1 million estimates and marks had to be processed and that approximately 10,000 forms needed to be processed each day to keep on target for certification. You also said that it was decided at that meeting that it was preferable to issue accurate certificates late than to issue suspect certificates on time. What was the follow-up to those issues? How realistic was it at that stage to believe that accurate certificates could be issued?
It was difficult at that early stage to predict how things would pan out. Processing is done separately by subject and course; it is only in July that all the candidates' profiles come together. At that stage, although there was a feeling that we might not be able to cope with all the data, we could not be sure until we reached the later stages of the process in July. The indications were that there was likely to be a problem; otherwise, the issue of delaying results would not have been raised.
You have mentioned that you were absent on sick leave for two weeks in June. During the period between the meeting of 17 May and your relinquishing your responsibilities in that area, did you have any further concerns about the build-up in the volume of data that were still to be received for unit assessments?
As I explained earlier, there were concerns, based on evidence, that some staff were stockpiling queries and that we were not getting through the work at the rate that we should have been. At that point, we recruited additional staff. We had difficulties, as we did not have easy access to additional computers, but we did all that we could to bring in as many staff as possible. Staff worked on a two-shift basis, making maximum use of the space and the personal computers that were available. Efforts were made to recover the position.
The final meeting would have been around 27 June. Your final duty as head of operations was to attend that meeting at Victoria Quay. Were those present at the meeting made aware of the outstanding problems with the volume of data entry?
My recollection of that meeting was not accurate. I suspect that the meeting to which I was referring took place earlier, as the agenda included the matter of the examination diet. I was involved in only one meeting at Victoria Quay, at which discussions centred on whether pupils would be able to sit the examinations and whether the Scottish Executive could offer any assistance. We were told that the Executive supported the view that, if there was any likelihood of error in the results, their issue should be delayed. That was the substance of the meeting.
That offer of assistance would have included the IT suite.
It would have included the IT suite. Paul—I do not remember his surname—visited the Dalkeith office and met me, David Elliot, Bill Arundel and David Falconer. He offered us every assistance that the Executive could provide if we were in difficulties. Our IT people's answer at that point was that they would be able to deliver the software in time and that I would have sufficient time to test that software. That proved not to be the case.
Can you give us any indication of when that meeting took place?
It must have been round about the time of the first examination. I have no access to any information from the office.
Would that have been the end of May or the beginning of June?
The date will be in my diary in the office, but I have no access to that information.
I think that we have heard reference to that meeting before, and we wanted to confirm that.
I attended no subsequent meetings. Meetings took place fairly regularly between the SQA senior management team and the Scottish Executive, but I was not party to them.
As members have no further questions, I thank you, Mr Greig, for your attendance this morning and for answering our questions. I am sure that it was not easy for you, and we are very grateful to you.
Thank you.
That concludes our deliberations this morning. We will meet on Monday and, as we have agreed, we will begin in private.
Meeting closed at 12:43.