I apologise for the delay in starting today's meeting, but I am sure that the witnesses appreciate that the committee needed to deal with some business. I welcome members of UCAS to the meeting—they have had a long journey and I thank them for their efforts. The committee has received copies of UCAS's written submission. Mr McClaran will introduce the other two witnesses and we will then move to questions.
Thank you. On my right is Paul McClure, who is the head of application services—the department that is responsible for operations at UCAS. On my left is Ross Hayman, who is director of corporate communications at UCAS. We are grateful for the opportunity to attend this afternoon's meeting.
Thank you very much.
Among students who are Scotland domiciled, the number who have found places in higher education in Scotland has increased by 7.8 per cent from the 1999 figure. The number who have found places in higher education in the UK has increased by 7.2 per cent, which compares with an overall 2 per cent rise in the number of students finding places in higher education. This year, we have had the highest number of admissions into full-time HE since the previous high point in 1997.
We will move on to some of the difficulties that UCAS experienced and how the organisation dealt with them.
I thank the witnesses for their helpful submission, in which it was observed that there was no independent quality control system. What system should be put in place to ensure independent quality control in Scotland?
A number of options could be considered, but I do not think that UCAS has the monopoly of wisdom on or insight into that matter. Structurally, it would be good practice to have an independent quality control mechanism. The English and Welsh systems use intermediate bodies that are responsible for quality control, but which are separate from the examining bodies' functions. It might be worth considering such a model.
I accept that you might not be able to go into the details of such a mechanism. However, who should be involved in such a body? Who would the stakeholders be and how would they represent education as a whole?
Any quality control body might include professionals who were involved in the day-to-day work and a board that consisted of stakeholders in the system. The examinations system has many stakeholders. We are conscious that higher education represents only a part of the total number of users and stakeholders in the exam system. Clearly, a quality control body would need to be broader than that.
I was interested in the comment in your submission that you were not necessarily criticising the higher still programme, but that you felt that
Ian Jenkins raises several points that need to be disentangled. UCAS has had an entirely constructive experience of working with the Scottish Qualifications Authority in the development of higher still and the new qualifications. The UCAS Scottish office convened the subject panels that considered many aspects of the higher still curriculum. We have been involved in the process and we are delighted that, for the first time, a point-score system has been achieved that embraces the new Scottish qualifications. Under the new UCAS point-score system, we have a UK-wide system. That has been extremely positive.
Thank you.
It is difficult to work out how many students were disadvantaged this year. Page 6 of your submission states that
I have some overall figures, but the situation is very complex. I can provide the committee with figures relating to applicants whom we coded as taking either highers or certificates of sixth year studies. The total coded for that cohort was 18,816. Of those applicants, 13,351 were placed at their first-choice institution, 1,084 were placed at their second-choice institution, 1,015 were placed in the UCAS clearing system and 3,366 were not placed. We cannot draw any conclusions about the applicants who were not placed—they might have chosen not to enter the clearing system.
How do those figures compare with those of previous years?
We have not done such a comparison, but we could provide the committee with that information at a later date.
That would be good. At the end of your submission, you mention a total of 10,300 amendments. Am I right in saying that that refers not to 10,300 students, but to 10,300 amendments, some of which would apply to the same student?
That was the figure at the time that the report was written. The current figure is 18,400 individual result amendments.
A particular problem that was mentioned was the fact that students who wanted to get into medical and dental courses had to apply by 15 October 2000. That problem was not stressed to the SQA. Can you update the committee on that? Do you know whether any students were adversely affected?
We gave advice to schools and colleges in Scotland that any student who was completing their UCAS form for entry in 2001—particularly those who were aiming for a medical or dental course—should make clear in their application form any results that were subject to appeal.
How many students have been affected by that?
We do not have those figures. The applications for 2001 entry are paper based, so we do not yet have those data.
Normally, if a pupil had missed the application deadline, would they be able to appeal to UCAS?
They might come to us for advice, but normally they would get advice from their schools or from colleges. I hope that any student who is applying for medicine or dentistry courses for entry in 2001 will indicate on their UCAS application form whether any of their higher results are subject to appeal and I hope that they will not be disadvantaged.
I understand that it is difficult for you to provide hard statistical information, but would you say that you are not being inundated with inquiries?
Since we issued that advice, we have not been inundated with inquiries, or other indications of concern, from Scottish applicants.
We should also add that we raised with the SQA our concern about the earlier deadline for students who were applying for medical and dental courses. We understand that priority was given to appeals from students who were in that position. I hope that that helped to resolve that situation.
I want to return to some of the points that Ian Jenkins raised. You say in your paper that you think that higher still was
It is difficult to give a definitive answer to that until we know what the cause of the problems is. All that we know is what problems we experienced. Perhaps a year's further testing of the systems that were being used to implement the new arrangements might have been useful. Some of the bugs that led to the large-scale loss of data might have been resolved in that period.
I have never been able to get my head around the idea of tsars. From newspaper reports, I understand that there might be a qualifications tsar. Could witnesses help me by explaining what such a person would be able to do that an eight-cylinder minister for education could not?
I hold no particular brief for the appointment of a tsar, but people who argue for a tsar, or for an independent quality control body, talk about the advantages of having an intermediate body between the Government and the body that is responsible for exams. They argue that, if there is no intermediate quality control mechanism, quality problems might not be discovered in time and that any problems that occurred would have an immediate political impact, although they would not necessarily be resolved at that level. I imagine that a tsar would be reasonably expert in relation to the issues that surround the administration of large-scale public exams and would be able to intervene early.
This might be commenting on the matter in a sideways fashion, but does what you have said to the committee suggest anything about the way in which the board of the SQA was chaired?
We cannot comment on that—we are not privy to information about the SQA board or how it was chaired.
I want to ask about communications with the SQA prior to the issuing of exam results. I note from UCAS's paper that you first raised with the SQA the issue of the timeous issuing of results at the end of June. What prompted that?
That was prompted by rumours—through reports in the press and anecdotes from higher education institutions—that there might be problems and delays in the issuing of results. The concerns were vague at that point, but they led us on 29 June to seek reassurance from the SQA that there was unlikely to be any delay in the release of results, which was planned for the weekend of 5 and 6 August—a date that we had previously agreed with the SQA. We were assured on 3 July that there would be no change to that timetable. We had to accept those assurances.
You were due to receive a copy of the results on 5 and 6 August, but by 8 August it had not turned up, which is when you contacted the SQA. At that stage—before the whole situation blew up—what reasons or excuses did the SQA give for non-delivery of the results?
The SQA said simply that it was having problems with its systems. We were not given any great detail.
When were you told the results would appear?
We contacted the SQA on 4 August. At that stage, we still expected delivery of the results over the weekend of 5 and 6 August.
Your paper says that you contacted the SQA on 8 August, which was the day before the results were due to arrive at pupils' addresses. At that stage, did the SQA say when UCAS would be likely to receive them?
Following that weekend, we were twice promised the results before they finally arrived. The SQA was obviously having difficulty with its systems. It worked through those problems and it got the results to us as quickly as possible.
When it became obvious that there was a problem—although you could not have known the extent of it—did you begin to take any steps—
We were not aware that there would be any problems with the data. Obviously, we were going to receive the results later than planned, but at that point we were still confident that the data would be 100 per cent correct when they were supplied to us.
Is there anything that UCAS could have done to help if—when you sought reassurance at the end of June—the SQA had said, "Yes, we have a problem here," or was the problem compounded by the fact that UCAS might not have been getting the full picture even at that stage?
If it had been clear to UCAS at that stage that there was a problem, it might have been possible to renegotiate the date of release of the exam results. It is not an oddity but a quirk of the system that Scottish results are released before A-level results. In the past, Scottish students have had an advantage in that they have known their exam results before students south of the border have. There was some time to play with, in which one could have negotiated a planned release of the exam results—say, a week later. It might have been possible to resolve the difficulties in that additional time.
When you contacted the SQA on 4 August, was there any suggestion that you would not receive the results on 5 August? I am trying to find out the extent to which anybody who came into contact with the SQA—even at that late stage—heard an admission that there was a problem. Am I right in thinking that UCAS had to contact the SQA on 8 August to ask for the information that it was expecting, and that the SQA did not contact UCAS to explain—
I think that that is correct.
Yes, it is.
That is the position—we contacted the SQA. As we entered that difficult period around the time of the issuing of exam results, there was a problem about obtaining information. That made it difficult to know exactly what we ought to say publicly. We issued a series of press releases—no fewer than six—throughout the crisis, which expressed our best understanding of the position. However, we were not always able to be clear about what the next few stages might be.
This question is about the future—witnesses may wish not to give an opinion. You said that, in terms of university admissions, there is no real reason why Scottish results should be issued earlier than A-level results. Given that the higher still exams come later in the academic year, is there an argument for delaying the issuing of the results and bringing them into line with the issuing of A-level results?
I can give only a partial answer to that question. From the perspective of one who is involved in running the national higher education admission system, one can see a number of options for dates. Other factors will also be important, so I would not presume to comment on what dates should be chosen. The Scottish examination system must serve the needs of a number of stakeholders—we would not want to argue that the system ought to be run purely for the convenience of admissions into higher education.
Yes, but would adding a couple of weeks to the timetable cause UCAS any difficulties?
I do not think that it would.
Adding a week to the timetable certainly would not cause us any difficulties, but adding two weeks might.
How might that cause difficulties?
If the results were issued two weeks later than they are currently, that would be more or less a week later than the current A-level results issue. The timetable is built on the release of A-level results.
I am sure that witnesses would agree that it would have been better for the results to be two or even three weeks late this year, in order to have avoided the fiasco that occurred.
Data that were 100 per cent correct—even if they were received five or six weeks later than they were expected—would have been better than only partially correct data that were delivered on time.
It could be dangerous to set future timetables purely because of what happened this year. We received amended data right up until 13 September, so the problems went on for a long time. The situation this year has been highly abnormal and, as it turned out, having even an extra week would not, in retrospect, have been tremendously helpful.
What kind of problems were caused by the fact that you were receiving amended data until 13 September?
I can give one very positive response to that: it did not cause any problems with the transmission of results to higher education institutions. We received 13 separate releases of amended data from the SQA and we were able to turn those corrected data round to all the higher education institutions within—in most cases—24 hours.
Were students disadvantaged?
That is difficult to answer. As the process went on into clearing, there was the potential that students might be disadvantaged through not being on a level playing field with students who already had their full results. However, if one considers the final position, one sees that record numbers of Scottish students have been admitted to higher education. The probable reason for that is the attitude of the Scottish higher education institutions, which decided, by and large, to give the benefit of the doubt—where any existed—when admitting students. The fact that the process is funded is also important.
Some young people in schools in my constituency have asked whether what you have just said is true across the board. Some of the more competitive courses would have been more difficult to get into, and some students have said that they have been disadvantaged. Is it possible that—although there might not have been problems with some courses—some students who wanted to get into courses where there was more pressure for places have not had the opportunities that they might have had?
That might be the experience of some of your constituents, but we have not seen direct evidence of that.
Is there any way to assess that objectively?
The figures that Paul McClure read out indicate that, of the 18,000 students we are talking about, 13,000 were placed at first-choice institutions. That is a high percentage to be placed at first-choice institutions and that is reassuring. However, more than 3,000 students were not placed, for one reason or another. We do not know why that is, so there might be scope for more research into why those students were not placed and whether they were disadvantaged.
Young people have expressed concerns that there will always be an element of doubt about this year's exams and that colleges and universities might still put a question mark over them. How can young people be reassured that they have been admitted to courses because they merited their places? How can young people who have not got into courses be reassured that they have not been unduly disadvantaged?
The first rather narrow point that I must make is that UCAS cannot give that assurance. We are not the examining body and we do not know whether the marks are finally correct. Ours is a system that transmits marks—as they are given to us—to higher education institutions to enable them to make their decisions.
UCAS has gone back to higher education institutions several times with new information from the SQA. Given UCAS's unique role in being able to consider what has happened, and taking on board what happened regarding A-levels, are you in a position to speculate on how higher education institutions might perhaps reconsider evaluation procedures for the future? Might there be ramifications for the way in which higher education institutions consider qualifications in years to come? Are you prepared to do some crystal-ball gazing into a matter that will be pretty crucial for future generations of pupils?
When it comes to crystal-ball gazing, it is rather hard to look beyond the results of this and the other inquiries that are being conducted. It seems that everything hinges on whether the inquiries can determine accurately which systems failures, or other failures, caused the problem. If the inquiry reports show how those problems can be addressed, there is every prospect—all other things being equal—that the system will enjoy full confidence. Academic quality is not in doubt; everything hinges on our being able to say definitively what went wrong and how that can be addressed, so that we can rebuild credibility into the system.
Do you think that if, God forbid, we do not restore credibility to the system, the HE institutions might get shirty and proceed rather differently in future?
I cannot speculate on that. Are you asking what might happen if there were another year of uncertainty on the scale that we have seen this year?
I wonder whether, if there was continuing uncertainty, HE institutions would do things differently, in terms of evaluating would-be entrants and administrative processes.
Those institutions would, I suppose, adopt ad hoc measures if they did not have sufficient information, as they did last summer in trying to seek reassurances directly from the schools that the young people had attended. However, I am sure that one would not want that to become a permanent situation.
Paul McClure read out figures that illustrate how many students have been placed in first-choice institutions. I assume that that does not necessarily mean that they were admitted to their first-choice courses.
The first-choice institution is defined as the conditional offer that the applicant has firmly accepted—the applicant's first choice from the range of offers that they originally received.
Does that also mean their first-choice course in an institution?
Yes. The definition is based on the specified courses that were applied for. We can assume that, in the majority of those cases, the candidate will be allocated their first-choice course at their first-choice institution.
We spoke about quality control mechanisms. Does UCAS have a quality control mechanism that will be able to pick up what might go wrong in future? Obviously, it was difficult to pick up what went wrong this year—UCAS picked it up anecdotally. I assume that several mechanisms will be put in place to improve quality control, but will UCAS have such a mechanism?
We can make a contribution. UCAS and its predecessor organisations have been running a national HE admissions system since the early 1960s. For the larger part of our history, that has been a computer-based system. We have necessarily gained—sometimes painfully—a huge amount of experience in administering large-scale systems such as this and in examining the interface between school and college qualifications and entry to higher education. If we could contribute some of that expertise as part of quality control, we would be delighted to do so and would have a strong contribution to make.
That is an interesting suggestion.
I would like to return to the point about moderation and quality control. Until this year, a methodology of concordancy and marker checks—which seemed to work—existed in the Scottish exam system. It appears not to have worked so well this year. Much of the moderation of higher still that was meant to have gone on has been moderation of the internal assessments, which ultimately do not affect the end grades hugely.
The moderation system for qualifications is beyond our competence. Our discussions with the SQA focused on what is within our competence—the method for ensuring the effective transfer of reliable grades to the higher education institutions, which must make admissions decisions. Matters that concern moderation and internal and external examination of the qualification are beyond UCAS's remit.
We talked about the complexity of the system. Do you value the establishment of Scottish group awards?
As they are new, the value of Scottish group awards will be determined by the value that higher education institutions place on them. Our experience this year has been that HEIs have not, by and large, expressed offers in terms of group awards, but in terms of grades for higher subjects. We will facilitate the accurate transfer of whatever information HEIs decide that they need to make their admissions decisions, and we will work with HEIs and the SQA to do that.
The group awards system is not implemented fully, but I wonder whether it provides another unnecessary layer of elaboration.
That was the final question.
Meeting adjourned.
On resuming—
As everybody is back in position, we will make a start. I welcome the representatives of the Committee of Scottish Higher Education Principals—we are grateful for their time. We have copies of their written submission, but Mr Caldwell or Professor Stringer may wish to add to that now.
I have nothing to add to the submission at this point, other than to say that we welcome the opportunity to assist the committee's investigation.
Absolutely. In your written submission, you say that you might want to add to what you said about the appeals process, as the situation is on-going. Is there anything that you would like to add?
Could you point me to the relevant page of the submission?
Unfortunately, it does not have page numbers, but the relevant part is at the top of the fourth page.
Our concern was the large number of appeals, which was much higher than ever before, and the consequences that that would have. At that stage, we did not know the number of people who would have to be admitted late to institutions. In fact, as the committee has just heard from the representatives of UCAS, the number of Scotland-domiciled pupils going into higher education has reached record levels this year. That has happened for a variety of reasons. A contributory factor is that a lot of people were admitted late because their appeals were successful. However, the sector has managed to accommodate them.
Thank you. We will move on to other questions.
I note that you have no desire for the SQA to be incorporated into the civil service. Why do you think that it should be an independent organisation?
That view emerged from consultation with COSHEP members, but was not backed up by much detail. However, I believe that it was felt that the SQA's professional accreditation function was not simply administrative and should be undertaken by members of the wider academic community, which could best be done at arm's length.
I am sure that the other underlying consideration that our members had in mind was that it is a good principle for judgments on whether academic standards have been satisfied to be made by a body that enjoys a degree of independence.
There has been some discussion about the establishment of an intermediary body and an independent quality control system. Do you have a view on that? If there were such a body, who would be involved?
We are fairly agnostic about that. Speaking personally, I tend to be against arrangements that are excessively complex. None the less, quality assurance is an important matter and we need to have systems in place that ensure that whatever the SQA—or a successor body—does, it is robust, stands up to scrutiny and can be subject to independent verification if necessary. However, I am not sure that COSHEP would want to commit itself to supporting any particular mechanism.
Clearly, the public have concerns about quality assurance within our exam structure. What can be done to reassure the public—young people in particular—that they should not give up?
The best thing would be to ensure that things are right next time and that there is no repetition of what happened this summer. It is worth emphasising that a great deal was got right this time and that what was being attempted was ambitious—perhaps, with the benefit of hindsight, too ambitious in some respects. You are right that it is important that we should give the maximum reassurance to the young people and their families who suffered uncertainty this year. That is what matters; it was what motivated us in the higher education sector to reassure those young people who were contemplating entry into higher education.
I am interested in the points that you make towards the end of your submission about how the situation should move on. You highlight the late processing of higher national diplomas and higher national certificates, which struck a chord, as the issue was raised with me locally—people felt that we had not heard what was happening to HNC and HND students. Could you expand on what you think has happened, or what you understand the current situation to be and how it has impacted on those students?
As far as I am aware—David Caldwell may have better statistics—only a small number of higher education institutions, perhaps four or five, offer HNCs and HNDs. I do not think that there has been a particular problem in higher education institutions in respect of those awards.
I think that that is correct. However, an increasing proportion of entrants into higher education are coming with HNC and HND qualifications, so we were concerned that, if the problems with the highers and standard grades had a knock-on effect on HNCs and HNDs, that could impact on entry into higher education. Although we do not have hard data, there is little evidence that the effect on HNCs and HNDs was significant this year. The problems were primarily to do with the highers results.
Nevertheless, in your written evidence you say that there is evidence that there were delays in processing. Those delays would have had a significant impact on students, particularly as the situation was not highlighted.
The delay was unhelpful. The point that must be made strongly is that higher education admissions offices are always working under tremendous pressure during August and September, when they have to handle a large number of complex cases in a short period of time. Any delay is unwelcome and the delay in the highers results was unwelcome this year. Similarly, the delay in awarding HNCs and HNDs was a problem, but it was on the whole dealt with successfully. However, we hope that the situation does not recur. It is much better when results are published timeously and on the dates on which they were expected.
The committee will, at a later stage, want to look at the impact of the problem on people who were using HNCs and HNDs for purposes other than accessing higher education. Do you feel that the delay has been overcome?
I cannot comment on cases other than those for entry into higher education, but we did everything that we could to ensure that no intending entrant into higher education was disadvantaged because of the lateness of their results. We very substantially accomplished that.
I want to pick up on something that you say in your written submission. Like everybody else, COSHEP was misled by the SQA right up to the point at which the results were due to be issued. By that time, confidence in the SQA was diminishing rapidly. However, on 13 August you felt able to issue a statement jointly with the SQA, in which you said that you were satisfied that the problem was one of missing data, not inaccurate data. How did you go about satisfying yourself of that, given that by that stage nobody had much confidence in anything that the SQA was saying?
As I was the person who put COSHEP's name to the joint statement, I will deal with that question.
Given that account of the situation on Sunday 13 August, do you accept that in putting your name to the statement that was issued you were risking making the same mistakes that the SQA had made? By putting your name to something that you were not sure was an accurate reflection of the situation, were you not running the risk of increasing confusion by giving out information that quickly turned out not to be accurate?
I would not and did not put my name to something of which I was not sure. I took a cautious line and advised our institutions that they should seriously consider suspending making offers of admission to candidates who were applying on the basis of SQA results.
It was reported widely that you were satisfied that the problem was missing data, not inaccurate data.
If the statement was reported in that way, it was not wholly accurately reported. I would not have given the advice that I gave to our members the following morning if I had taken that view.
You have answered my next question, which was going to be whether COSHEP is satisfied that the problem was with missing, as opposed to inaccurate, data.
Yes. All the evidence points to that. There were quite a lot of missing data and, although we should not understate the seriousness of the problem, we should not make out that the problem was something that it was not.
In your view, do the institutions have confidence in the quality of this year's results? Aside from the problem of missing data, do the institutions have the confidence in the accuracy and quality of this year's marks that you would normally expect?
I have every reason to believe that to be the case, but Professor Stringer, as principal of one of those institutions, is in a better position to add to my reply. All the feedback that I have received is that the institutions have every confidence in the quality of the students that they admitted this year. They treat the admissions process extremely seriously. You ought not to underestimate the additional effort that was made to consider cases in the light of the problems that occurred this year.
My institution was not alone in its approach to the situation. We were concerned to take into account a number of factors. Because we wanted to ensure that students were not disadvantaged as a consequence of the situation, we extended the admissions period. In fact, we stopped taking applications from prospective students only in the past few weeks. We devoted a lot more resources to the admissions process.
I want to check out a statement in the last paragraph of your submission. You say:
You ask a very good question, and I wish that I had a definitive answer to it. The truthful answer is that we do not yet know the reasons for that decline, which is sharper than one would expect in a single year. There are obviously year-to-year fluctuations and, as ever, we ought not to rely too much on one set of figures. None the less, the drop from last year of 14.9 per cent in the number of students from England being admitted to Scottish institutions causes us concern.
I wish to repeat a question that I asked earlier in the meeting. You said that you did not like overelaborate systems. Do you believe that the higher still system of assessment is overelaborate? Does the reporting system—the stuff on the certificate at the end of the process—include material that is not necessary for your purposes? I realise that you are not the only stakeholder in or receiver of the system, but could the results set-up have been simplified or streamlined with a different form of assessment?
You are right to say that we are not the only user of the system, so we would not like to determine what should be on the certificates. I am no expert on the detail of what is certificated and why. In general, however, I am a great supporter of the new type of certificate. It provides much more information than ever before about a candidate's qualifications. It also means that individuals will increasingly be able to carry with them a single certificate that attests to all the lifelong learning that they have accumulated.
That is a powerful defence of the system, but I think that you would agree that it ought to be comprehensible and easily understood.
Absolutely.
Do you know how many institutions have taken advantage of the relaxation in the numbers and the financial limits covering the intake this year?
I am not certain that what is happening is about taking advantage of the relaxation. The data will soon be available, probably later this week or early next week. My institution has recruited 5 per cent above the funded numbers that we have been given. The relaxation was 4 per cent; unintentionally, we have gone a little over as a consequence of not wanting to disadvantage students. I do not think that COSHEP has collected or collated the statistics.
The statistics have not been collected, but the UCAS figures indicate that the number of Scotland-domiciled students who have been accepted to Scottish institutions this year is 7.5 per cent higher than it was last year. That implies strongly that several member institutions will have had to take advantage of the greater flexibility that has been provided.
The UCAS figures show that admissions were up 7.2 per cent and 7.8 per cent across the UK and in Scotland respectively. What was the planned increase in student numbers?
We expected a much more modest increase, in the order of 1 or 2 per cent.
A long-term funding issue is built into that, which I hope the Executive and the funding council will address at the appropriate time. Those students will have to be carried through the system for three or four years, potentially without the additional funding to support them.
We ought not to see this as a problem. One could argue that it is a great benefit that this year we have been able to admit a higher percentage of young people in Scotland into higher education. Although we have a remarkably high participation rate in higher education in Scotland, there is capacity to extend it still further, especially in social groups that have been under-represented. The increase may partly have come about by accident this year, but we should rejoice in the fact that an historically large number of young Scots has been admitted into higher education.
I do not mean to overelaborate the point. I agree that we should welcome the expansion of higher education and that there might be potential problems with funding; I am more concerned about who is being admitted into university and under what criteria. Is this situation a one-off, or will it be repeated year on year? Perhaps Professor Stringer might know from her institution whether the extra number of students being admitted this year would have been rejected in a normal year.
The problems at the SQA have not been as direct a cause of the extent of the overshoot as we might at first assume. However, the situation does have something to do with those problems. Institutions have fairly narrow targets and are usually extremely good at hitting them. This year, we overshot the targets to some extent because of the uncertainties and the institutions' desire not to disadvantage students. We are still collecting information on this issue, but I would think that very few—if any—of the students who were admitted will have entered with lower than appropriate grades.
When they take decisions on admissions, institutions are heavily influenced by the test of whether a candidate has the capability to complete the course successfully. Our success rates matter a great deal to us. We should be relatively happy that the success rate of students in Scotland is as high as it is in the rest of the UK, even though we have a much higher participation rate. Our institutions will certainly want to maintain that very high rate of success.
I should stress that I was not so much concerned about the academic qualifications dropping off as about being fair to students across the board. It is not necessarily the case that, under the current system, all students from all backgrounds will have fair access to higher education.
I would like some clarification on the funding of the relaxation and the overshoot—and I apologise for asking this question in quite simplistic terms. Will the relaxation and the overshoot have financial implications for institutions that might not be fully funded? Will institutions be financially worse off because they are taking in more students this year?
Unfortunately, the situation is more complicated than that, as it almost always is. We should recognise that there is a difference between average and marginal costs. It is not necessarily the case that, in taking a small number of additional students, institutions will incur costs at the same rate as the average cost of educating a student.
In real terms, institutions are financially worse off because of the additional students.
That is difficult to measure. The institutions are receiving some additional income. In that sense, they are not worse off. However, there are also additional students. The actual cost of looking after those additional students is quite difficult to compute.
The effect will be to depress slightly the average cost of taking each student through their programme.
Does that have any real-terms implications for students? If the notional average cost is depressed, does that have implications for the service the students are offered by the university?
It should not. My institution has considered the matter very carefully. As Mr Caldwell has pointed out, those additional students bring some funding, although it is marginal. All institutions should be able to cope with the situation but, as ever, we would like to receive more funding for each student.
Is that okay?
It is probably just my innumeracy, but I am not clear on that point. Perhaps we could get more information on that. Could you give us a written briefing on the surrounding issues?
We could provide an additional written statement, although it would not take us much further forward. The basic difficulty lies in calculating the exact additional cost of taking five more students on one course, eight more students on another and so on, spread over many different institutions. The case that I am arguing is that where there is a relatively modest increase in the number of students compared with the figure that was planned, the students can probably be accommodated with relatively modest additions and with no detriment to the quality of education that the students receive. If we were talking about doubling the number of students attending particular programmes, the situation would be entirely different—the whole cost base would need to be re-examined. I am happy to provide additional notes, but I doubt that we will be able to provide a more precise answer.
We are trying to ascertain whether there are financial consequences for the institutions which will have a detrimental effect on the students. You seem to be saying that that is not the case, given the size of the increase.
We will have to examine the situation very carefully. If, when we have the final numbers, we think that the impact on the costs for institutions is greater than was expected, we will draw that to the attention of the funding council and we would expect the funding council to raise the matter with the Executive. However, we are not inclined to argue such a case unless and until we have the evidence on which to base it.
If you have an increase of approximately 7 per cent this year, instead of the 1 or 2 per cent increase that you might have expected, will we see some strange figures in the few years to come as a result of some kind of compensatory mechanism to make up for the fact that the funding for the huge increase this year will have to be carried into the students' second, third and fourth years?
That is partly what I was alluding to earlier, but a definitive answer cannot be given until we have the information about where the students are and exactly how many of them there are. We do not know how much of the increase is attributable to the problems with the SQA and how much is to do with other factors. We want, as Mr Caldwell said, to collect the information. I have no doubt that, in the light of that information, each institution will want to determine its own strategy. I do not expect that that 7 per cent increase will be spread evenly across all institutions. Each institution will have to gauge whether there has been a disproportionate intake in certain areas and whether there has been any distorting impact. Until that has been done, it is difficult to know what individual institutions will do.
It is important to stress one aspect of that. I would be concerned if people were worried that, because we have taken extra students this year, the number of places available next year and the year after will be lower.
I thank our witnesses for their attendance. We will take a short break while we change witnesses.
Meeting adjourned.
On resuming—
Good afternoon. I welcome you to the Education, Culture and Sport Committee. You will know that our procedure is to give you a couple of minutes to introduce yourselves. If you wish, you can say a few words, especially given that we have not received a written submission from you. I will then open the discussion up to questions from members.
I am the chief development officer of the higher still development unit. With me is my depute, Tony Keeley. Ours is a service unit—our job is to support the programme and the direction of various groups in the programme. We have a broad remit, fairly wide contact with the profession and six years' experience of producing the sort of support that teachers, college lecturers and others have asked for. We were unsure what specific questions you would ask, but we came prepared to do our best to assist.
Is it your view that higher still was implemented too early, before you were ready?
We heard earlier that there may have been unresolved issues with regard to the SQA's information technology systems, which became obvious only quite late on, but the programme has been running for a fairly long time, having started about four years after the Howie committee proposals, when there was extensive negotiation about the way forward.
What would you say to the fact that various people, including many local authority directors of education, have reported that no attention was paid to their concerns about their lack of readiness? There is a feeling that, after you started, you were always running behind time. On the in-service days, for example, people were asking for materials, which were promised but did not arrive on time. It has been suggested that there was a lot of consultation, but in practice people felt that their professionalism was overlooked.
I think that the consultation on this programme has been very thorough. It was thorough at the beginning, when we got consensus from the profession on the nature of assessment and the way to go ahead; and it was thorough later on when we consulted about the development and support materials that staff felt they needed. We have exceeded what those people asked for by a considerable amount; we have not just met the minimum requirements.
In the consultation meetings, teachers of English raised great concerns about the integrity of internal assessment, the amount of assessment and the validity of some of the requirements. That was three years ago, yet we still have shifting sands. Is that not correct?
I would not use the term "shifting sands". You could say either that we did not pay attention and did not do anything about English or that we did pay attention and have moved forward. Views on where English should be going were split: there was never 100 per cent agreement that we should go in one particular direction, although that happened in other subjects where we have had far fewer problems.
But it is now three years since the implementation was announced. I do not want to go on too much about English.
None of the content of the November letter was new information. We were asked to pull together the key messages and publish them so that centres could be confident about going forward on the basis of an established set of rules. That is what we did.
North Lanarkshire Council has given us a document that mentions the
That is your final question, Ian.
I knew that you were going to say that.
On the point about giving advice to people during the first year of implementation, a number of issues could not have arisen until the programme was run. For example, we piloted some of the assessment on a small scale, in as wide a variety of schools as we could, but, until the programme was run nationally, it could not be known how it would work. When feedback from the seminars said that the cut-off score in physics, for example, was a bit too challenging, we had a choice: either we could leave that course to run for a year, and cause difficulties for students in that year, or we could respond quickly, which is what we did on several occasions.
Unless there had been some kind of piloting.
Let Mr Keeley answer, Ian.
We piloted some of the assessment with small groups. However, it is not until a programme is implemented on a large scale that a national feedback of people's views is received. At that point there is a choice: put off changing the programme until next August—and let the problems run—or respond, which is what we did. Because the issue concerned assessment, we took it from the seminars back to the SQA and, in consultation with the SQA, came up with what seemed to be a solution in response to what people had asked for. We then passed that information out as quickly as possible, which seemed a fairly reasonable way in which to respond.
Was any consideration given to whether the SQA could deliver the results that you expected?
Throughout the development programme, subject groups dealt with the details of where we were going; the SQA was represented on those groups, along with ourselves. Over the four years that Tony Keeley and I have been in post, as steps have been taken along the way, SQA officers have given us feedback on whether those steps were manageable. That feedback always suggested that they seemed reasonable and manageable—internal assessment seemed eminently so, as it would provide either a yes or no answer.
Did you have no indication over the spring or summer that things were about to go wrong? I know that you, like others, had been given assurances, but teachers and others have told us that they had concerns.
Twice a year we meet school managers from every school in Scotland, and we have groups that advise us on issues that arise. From as early as autumn 1999, school managers told us that they were experiencing problems in registering candidates. Registration was part of the administrative procedures of the SQA, over which we had no remit, but we were concerned—it was not in our interests, or those of anyone else, for students to be placed at a disadvantage.
We arranged a meeting with local education authority representatives in March, because from working with local authorities we had become aware that they had concerns. That meeting, which was attended by both the SQA and Her Majesty's inspectors, gave local authorities the opportunity to discuss directly with the SQA the data-handling issues that were emerging.
There seems to be a recurring pattern of the SQA being informed of problems. Did you have a role in checking whether concerns had been taken on board? I accept that you told the SQA promptly about concerns that needed to be addressed. Was there any mechanism by which you could check what had been done to address those concerns?
The higher still development unit is a temporary unit. Our job ends in June 2001. We are part of Learning and Teaching Scotland, which is a different organisation. It would not have been proper for us to hold anybody to account—it is beyond our competence to do that. That does not mean that we did not have any idea of how things were going. Colleagues in schools and colleges had sought information about, for example, prelims. As the SQA issued amendments and improvements to, for example, cut-off scores, we were aware that it was causing insecurity. Mr Jenkins' point was right. It was reassuring that people were listening and doing something about it. The SQA tried hard to be responsive at that time and we got copies of the letters and question-and-answer briefings that it issued, but we were never in a position to hold the SQA to account. We could only inform the SQA and be reassured when we saw evidence that it was picking up on a number of the problems.
So your organisation was reassured that the SQA was responding to concerns.
We were reassured on the areas within our remit—the management of internal assessments—in which we were involved with the SQA. In general, when we flagged up an issue, although it sometimes took a little time for it to be dealt with, there was no lack of intention to deal with it. That was our experience.
That is your view. It is just your impression that there was no lack of intention.
Yes, it was our impression.
Some people say that the past few months have damaged people's perception of higher still. What is your view of that? How can that be overcome?
We are about to find out, as we are about to do another round of seminars. We will have face-to-face contact with colleagues in schools and colleges. In June 2000, we had a round of seminars at which practitioners who had taken part in our implementation studies presented how they had found the first year to colleagues from other schools. They were warts-and-all presentations in which people set out the problems that they had encountered, what they had done and what they intended to do next year. The feedback from that round of seminars was the most positive that we had ever had. The feeling was that we were meeting challenges, but were succeeding in getting somewhere.
Teachers and schools delivered on higher still—there is no question about that—but do you agree that the perception among students and young people is one of mistrust? Do you agree that those who have another year of exams to sit, or who have just done standard grade, are not confident about going on to do highers? They think that the system will make it difficult to get through and to do assessments. How do we deal with those young people and with the fears of parents about their children's future?
The situation can best be dealt with through the schools and colleges, which is where parents and students put their confidence. Working with them on an on-going basis, as we do, is the only way to restore that confidence. There must be reassuring messages about administration—absolutely. We need to get the system bedded down, and people need to be clear about what needs to be done so that the system is managed well in the coming year. However, there are also a lot of students out there who did intermediate and access courses last year and who will not feel that it was a bad year. There will be people who got the units, but not the highers, who will not have thought that it was a bad year. They are in the system too, and they tell their friends about that.
I have two questions. First, did the introduction last year of a new exam system contribute to the problems over the summer?
The introduction of a new information technology system certainly contributed—
I am asking about the exam system.
The external exam system used shorter exams than had been the case previously. There were no more exams in the system than there were previously—if there have been problems, I do not understand what they might be.
Is your position that there is nothing that is peculiar to higher still that contributed to the problems that were encountered this year?
The fact that problems were also encountered at higher and standard grades suggests that my view has reasonable back-up.
Does that accord with the views of teachers, pupils and other people whom we have heard evidence from: that what we encountered this year is not independent of the introduction of the new exam system?
One cannot say that there will be no problems when a new system is introduced, a new body is created and a new IT programme is initiated. I would never be naive enough to say that. However, the new exam system was not an outstandingly important contributory factor.
It was a factor, however. My question is not designed to have a go at you; I am just trying to tease out whether there were aspects of the implementation of the new exam system that you think—with the benefit of hindsight—might have contributed to the problems.
I did not think that you were having a go; I thought that you were seeking clarity. Your question is more for the SQA than for me—in a sense, the administration of the exam system is within the SQA's expertise, not that of the higher still development unit.
With respect, it is not a question for the SQA. The SQA can answer questions about the administration of the exam results. I am asking a question that should properly be directed at you. Does anything connect the introduction of a new exam system—for example, the change in examination timetables and the assessments, and the fact that that led to a higher volume of material for the SQA to deal with—and the problems that were encountered? Are there aspects that might have had a bearing on those problems?
They would have had little impact. For example, senior managers were consulted on the timetable and they said that they could live with it. When recording accomplishment of course units, the input is a yes or a no; it is not a large input into the electronic system. External examinations were no different from past examinations. In terms of the references that Nicola Sturgeon gave me, the impact of the new examination system was small.
Given the problems that arose with higher still in its first year—I am not talking about the SQA problems—that Ian Jenkins talked about, and given that, as has been argued before the committee, some of those problems were flagged up well in advance of higher still going on stream, what changes is your temporary organisation making to the way in which you will communicate with and liaise with the teaching profession over the next year of higher still, to make sure that its concerns are fed directly into your processes?
The development of the national liaison group has been helpful. It brings a wealth of information to the programme. The higher still development unit works directly with all stakeholder institutions—such as local authorities—and will continue to do so. We will continue to run staff development, to meet practitioners and to run programme implementation studies. We will also continue to use the professionals who advise us on what is needed in different subjects. There are ways in which we can improve matters and there is no question but that we will do so.
What are those ways?
We can examine the feedback that we have received. Feedback has come into the system since June, and we are working on it with our partner organisations. It includes feedback on changes to assessment and on additional support materials.
The implementation studies for each subject have been and will be helpful. In each subject, a number of schools agreed to work with us and to tell us how the first year went—what went wrong and what went well. We have produced reports that are based on that information, indicating where people have found solutions to the problems that have been identified. Those reports have been distributed to departments for the coming year.
One of the upsides of what has happened is that the problem has highlighted certain issues. People are saying that, in future, the exams should be held earlier in the year rather than later, as they were this year. There is also a heated debate about the future of internal assessment. Does the higher still development unit have views on any of those issues?
It is outwith our competence to take a view on when exam diets should take place. Rightly, that is a matter to be decided by the Scottish Qualifications Authority, in negotiation with the stakeholders.
I refer—not entirely at random—to submissions to the committee from two local authorities in which I have an interest: East Ayrshire Council and South Ayrshire Council. Both authorities pick up the fact that, at various stages, problems were flagged up. That point has been made repeatedly during the inquiry. The problems related both to the materials that were—or were not—being provided and to the assessment programme.
Your first point related to materials. We regret that any materials arrived late; undeniably, some arrived close to, or after, the date on which they were needed. However, most of the materials that were needed to implement the first year of higher still were available at least six months—and sometimes a year—ahead of when they were needed. All the support materials from our unit that are now in the system—which do not include the national assessment bank—are listed in our catalogue. There are more than 1,000 items; there is a great deal of support in the system.
I am sorry to interrupt you, but I was trying to get at more practical strategies rather than the philosophical background to assessment. Some teachers in schools told me that it is difficult to fit assessment into the timetable if one has to reassess students as well as work towards the next assessment. I represent a rural area, where the schools are not huge and do not contain large numbers of pupils taking each subject. The practicalities of timetabling assessments became a problem. Can you offer those schools any advice or guidance?
One school offers a good example of how having too many assessments in one week can be avoided: it divided the year into fortnights and gave each subject only one day in each fortnight on which exams could be held. That meant that no student would come in and have two exams on one day. Other schools have found different ways of producing the same result—students not having more than one internal assessment on the same day.
Might students who did not have the advantage of that strategy be entitled to claim that they had been disadvantaged, if, for example, they found themselves having to undertake several internal assessments in a short space of time?
I am sure that you will find differences among schools in the operation of the timetable this year. In most cases, schools saw the problems coming; they did not come as a surprise to anyone, and we would expect schools to plan to avoid clashes. However, schools took different approaches to avoiding those clashes.
I want to test some of the feedback that I received from students. Some said that, in some subjects, the course work, the expectations and the internal assessments did not relate to the final exam and that they were not as well prepared as might have been expected. Will you address that issue for the coming year?
That is already being examined. We know which subjects have internal assessment set only at unit level. It is the nature of the assessment that, in some subjects, students can perform beyond the level of the unit requirements for a particular task. In other subjects, they cannot do so, because levels of knowledge and understanding are set at the level of the unit. For those subjects, however, we had already produced additional material that people could incorporate into a prelim or a unit test, showing the demands that the course places over and above the units. I made a list of that additional material nine months ago, when somebody asked where it could be found.
Sorry to interrupt, but could I ask what the subjects were?
Mainly maths and sciences.
Could you comment on the lines of communication? Many bodies seem to be involved, but there does not appear to be a straight line of communication. You mention dealing with schools, and you presumably deal with the inspectors and with the Executive. Do you accept that there is a proliferation of bodies and that it is not clear who is in charge at what time?
I will start with the links with the SQA. If we received a letter from a local authority that raised an issue that belonged to the SQA, we would forward that letter to the SQA—
Why would the local authority write to you about something that was the responsibility of the SQA?
Authorities might sometimes know us as their link office and might not be so clear about whom to write to in the SQA. We would send the letter to the SQA and send back a letter to the local authority, explaining what we had done and specifying to whom we had sent our letter in the SQA. The authority would then have a direct line of communication.
In that case—I am honestly not trying to be difficult—what are you doing?
What is our work now?
Yes. Are you dealing with the content of the courses, for example?
We develop learning and teaching materials to help to deliver the courses.
As distinct from the national assessment banks?
As against national assessment material. We run the staff development programmes. We do the field testing through the implementation studies, to check how things are going and to try to resolve any problems. Recently, we held information sessions with the business community about the qualifications. Our work as a development unit is established through a group called the development unit advisory group.
That is where we start getting into difficulties.
That group has representatives in our parent organisation, Learning and Teaching Scotland, in the funding division of the Scottish Executive and in HMI. That is where the work plan for the year is set up.
So the people in the parent organisation are your bosses.
The organisation sets our tasks. The general work targets for the whole programme—of which we form only a part—are agreed by the implementation group.
So there is you, there is the other group and there is the implementation group.
The implementation group is the umbrella group, which makes all the policy decisions for the programme and allows for the representation of directors of education, parents, college principals and others—the appropriate stakeholders in the programme.
There is also the higher still liaison group.
The liaison group is a working group, which was set up alongside the implementation group by the Scottish Executive.
Does the inspectorate drive this?
The inspectorate works with the programme in several ways. The senior chief inspector of schools chairs the implementation group and the liaison group. That is a key role.
Thank you.
Ian Jenkins is happy.
I would not say happy.
We all understand the issue a lot more clearly than we did. I thank Mary Pirie and Tony Keeley for attending this afternoon's meeting and for answering our questions.
Meeting closed at 16:36.