January 1998

Report of a survey of current external examiner practice in biochemistry

Dr C J Skidmore and Dr M T Withnall

Summary

The operation of the external examiner system in biochemistry was investigated by questionnaire survey of the views of current external examiners and of biochemistry departments. Examiners and departments generally shared similar views regarding the role of external examiners, but examiners were less sure than departments that they were consulted widely on matters other than the determination of final degree outcome. The recruitment of external examiners from similar types of institutions appeared more prevalent in old than in new universities. Few departments or current examiners saw a need for training of external examiners; even fewer departments provided formal training for internal examiners to prepare them to act as future externals. Few examiners or departments would welcome the Dearing concept that departments should select external examiners from a limited, formally trained, pool.

The majority of examiners considered that their role at examiners' meetings was to assure both academic standards and the fairness of the process of assessment. There was little indication that their main role had become the assurance of process. This conclusion was not affected by whether the course examined was modular or linear. On the other hand, about half the departments and examiners surveyed considered that degree classification for modular courses was effectively decided well before the examiners' meeting, and a sizeable minority of examiners considered that they had little prior input into the decisions made. About half the departments and examiners recognised components of modular courses that could not be assessed readily by the external examiner at the time of marking.

Most examiners were satisfied that departments heeded the advice in their reports, but about half of them commented that they did not receive formal feedback from departments on action taken.

Background

1. The Higher Education Quality Council (HEQC) set out the purposes of external examining to be:

  • assisting institutions in the comparison of academic standards across HE awards and award elements;
  • verifying that standards are appropriate for the award or award elements for which the external examiner takes responsibility;
  • assisting institutions in ensuring that the assessment process is fair and is fairly operated in the marking, grading, and classification of student performance.

2. Whilst the external examiner system remains a cornerstone of UK higher education (HE) quality assurance procedures, a number of reports over the last decade have questioned whether the purpose of ensuring comparable academic standards across HE, in particular, is being achieved. The most recent (Silver et al., 1995) noted that changing curricular structures, diversity of programmes, increasing student numbers, modularisation, and new forms of assessment were making it increasingly difficult for external examiners to play the roles of moderator or guarantor of comparability of standards. Institutions were increasingly appointing external examiners from pools of institutions seen to share common philosophies or modes of provision, which could potentially compromise notions of 'national' standards.

3. Following consultation exercises, HEQC published a framework for external examining and guidelines to support effective external examining (HEQC, 1996a). The Graduate Standards Programme Report (HEQC, 1997a) recommended that internal and external examiners should have the opportunity to meet to review assessment practices and share perspectives.

4. The Biochemical Society intends to facilitate this process of strengthening peer judgement by organising discussion fora in 1998 for examiners and heads of teaching in biochemistry. The present study was intended to provide a picture of the current state of the external examiner system in biochemistry, as seen by departments, and by external examiners themselves. The information obtained will provide the basis for discussion in the planned fora.

Methodology

5. Questionnaires were compiled based largely on the recommendations for good practice in external examining given in Guidelines on Quality Assurance (HEQC, 1996b). The questionnaires for departments and examiners largely overlapped, but some questions were unique to each group (see Appendices 1 and 2). Questionnaires were sent to the Heads of 98 academic departments in the UK and the Republic of Ireland that teach biochemistry, as listed in the Biochemical Society Directory, and to 93 external examiners. The list of examiners was compiled by requesting departments to identify examiners that they had employed during the last 5 years, and by requesting members of various Biochemical Society committees to identify colleagues known to be external examiners. The population of departments characterised by the external examiners was thus not necessarily the same as the population that responded to the survey, although there would have been extensive commonality.

Outcomes

6. The questionnaires for departments and external examiners, and a summary of responses to each question, are given in Appendices 1 and 2, respectively. Since the numbering of the questions on the two questionnaires differed, in what follows specific references to questions on the departmental questionnaire are prefixed DQ, and those on the examiners' questionnaire EQ.

7. Completed questionnaires were returned by 32 pre-1992 (old), and 6 post-1992 (new) universities. Old university returns comprised 26 describing experience of modular courses, and 6 of linear; new university returns comprised 6 describing experience of modular courses only. Of 45 external examiners completing the questionnaire 30 were examining modular courses, and 15 linear.

Policy considerations

8. The great majority of departments had policies on the aims of external examining, and on the roles and duties of external examiners, and made these available to their external examiners. External examiners were clear about the existence of these policies (Q1a,b).

9. Examiners were less sure than departments of the particular outcomes expected as a result of the examination process; they were even less sure of the parameters of comparability by which departments sought to be judged. Only 33% of examiners considered that they were supplied with clear departmental policies on the latter, compared with 74% of departments stating that they had such policies. However, it was apparent from the replies that Q1c and Q1d were less well understood by respondents than Q1a and Q1b.

Role of external examiners

10. Departmental responses to what are the defined roles of external examiners (DQ2) were assigned to one or more of the 4 categories listed by HEQC in its consultative document on the external examiner system (HEQC, 1995). Responses were fairly evenly divided between the additional examiner role, the moderator role, and the calibrator role, but there was only a single reference to the departmental consultant role, although elements of this role could be considered to be encompassed in other categories. The most frequently mentioned activities were assuring comparable academic standards (21 respondents); commenting on appropriateness/consistency of assessment (21 respondents); commenting on or setting papers and assignments (18 respondents); and conducting vivas (14 respondents). In most cases examiners were informed of these roles in writing, or by receiving a copy of examination regulations (DQ3). The great majority of external examiners stated that they had a clear understanding of their expected role (EQ2).

11. A large majority of both examiners and departments considered that examiners were supplied with information on student performance in a useful numerical format, although not necessarily in the form of a spreadsheet. Six examiners and 3 departments commented on a lack of time for analysis (DQ4, EQ3). Three examiners had received timely information only after previous complaints.

12. Examiners were less sure than departments that they were consulted on course design, with only 51% agreeing with this notion (DQ5, EQ4). Whilst a clear majority of examiners (69%) considered that they were invited to comment on assessment modes, they were again less positive than departments (95%) (DQ6, EQ5). There was almost unanimous agreement that examiners were shown examination papers in advance, and examiners were mostly happy that any comments made were acted upon satisfactorily (DQ7, EQ6).

13. There was some disagreement as to whether examiners performed assessments at points in the course other than in the determination of final degree outcome. Sixty three percent of departments considered that examiners of modular courses did, compared to 37% of examiners. A similar proportion of examiners responded positively for linear degrees, but departmental responses contained too few linear courses for meaningful comment. Where it occurred, both examiners and departments agreed that the role was generally to comment on earlier years' examination results, including deciding on progression of problem cases or ruling on extenuating circumstances, or to verify course work marks (DQ8, EQ7).

14. A slightly higher proportion of examiners (67%) than of departments (55%) identified course components that fell outside the remit of the external examiner. These were predominantly modules or subsidiaries outside biochemistry, and were usually dealt with by the external examiner for the particular subject (DQ9, EQ8). In the case of modular courses, 53% of examiners and 59% of departments acknowledged components of the mark for each module that fell outside the remit of the external examiner, principally course work or continuously assessed elements. Course work was to some extent a grey area since several respondents said that course work material was available for inspection, but that this was rarely carried out (DQ10, EQ9). There was unanimous agreement that the external examiner was free to check mark all examination scripts and projects, but in practice it is likely that in all but the very smallest departments the examiner was sent a representative sample (DQ11, EQ10).

Selection and appointment of external examiners

15. Not surprisingly, departments gave as the main criteria for selecting potential external examiners their known repute as an examiner (24) or seniority in the field (19). Smaller numbers mentioned broad subject knowledge (9), familiarity with the type of course (7), ability to command respect (5), or perhaps crucially, availability (5). Twenty three out of 38 departments needed approval for the appointment of an external examiner at a level higher than faculty (DQ12 and 13). It is interesting to note that 63% of departments in old universities claimed to recruit examiners from similar institutions rather than from across the whole sector, whereas only one in six of those in new universities did. This may reflect a desire of the new universities to demonstrate that standards are comparable with those in longer established institutions (DQ14). Sixty four percent of examiners and 66% of departments stated that reciprocal appointment of external examiners between departments was expressly forbidden for their courses; only 20% of examiners and 16% of departments said that it was allowed (DQ15, EQ11). Most departments stipulated that a minimum length of time would need to elapse before a former staff member was appointed as an external examiner, most commonly 3 years, when the existing cohort of students would have passed through the system (DQ16). The most common appointment period stated by both examiners and departments was 3 years, with a smaller number of 4 year appointments. Three year appointments were often extended by a year (DQ17, EQ12).

16. The Dearing report recommended that institutions should be required to draw external examiners from a limited pool of specially trained individuals. Overall, only 33% of examiners and 24% of departments would welcome this concept. Departments in new universities seemed more prepared to accept it than those in old institutions, although this was based on a sample size of only 6 new universities (DQ18, EQ13). If a pool were introduced, the predominant view was that the department should be able to select an examiner from the pool, rather than have one appointed. This question elicited a very broad range of views, from the very enthusiastic to the very antagonistic. A number of respondents made the point that external examiners of the appropriate seniority and eminence would not have time to be part of a pool called upon to serve multiple departments, and that the pool concept would lead to mediocrity.

Development and training in examinership

17. Examiners and departments concurred that few departments provided specific training for external examiners (18% stated by examiners, 16% by departments), although larger numbers indicated that examiners are given a thorough briefing (24% stated by examiners, 37% by departments). A number of respondents commented strongly that training was inappropriate as they were experienced examiners, others that the department only selected experienced examiners. Very few departments (8%) provided training to develop internal examiners in order to prepare them to act as external examiners. A larger proportion (34%) commented that internal examiners develop by informal exposure to, and observation of, the examination process (DQ19, EQ14).

Board of examiners/ assessment panels

18. Responses concerning the roles of the external examiner at examiners' meetings placed emphasis on the additional examiner function (final decision on degree class, particularly as regards special factors or difficult cases; and commenting on vivas of borderlines). The moderator role of commenting on appropriateness/consistency of assessment was also frequently mentioned, as was the calibrator role of commenting on comparability of academic standards (DQ25, EQ15).

19. It was apparent that there is a wide variety of formulae and procedures for converting course marks to final degree grade, with varying degrees of rigidity (DQ20). Most examiners felt that they were able to influence decisions on borderline students, although 16% commented that they had little or no input to deciding degree classification (EQ16). The boundary of discretion available to external examiners was most frequently stated to be two percentage points (DQ20).
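The kind of formulaic conversion described here, with a narrow discretionary band, can be illustrated by a purely hypothetical sketch. The class names, thresholds, and function names below are invented for illustration and were not reported by any respondent; only the two-percentage-point discretionary boundary reflects the most common survey response.

```python
# Hypothetical illustration only: one possible mapping from an aggregated
# course mark (a percentage) to a UK honours degree class, with a
# two-percentage-point discretionary band below each class boundary, within
# which the examiners' meeting (and the external examiner) could exercise
# judgement, e.g. via a viva. Thresholds and names are assumptions.

CLASS_THRESHOLDS = [  # (minimum mark, degree class), highest first
    (70, "First"),
    (60, "Upper second"),
    (50, "Lower second"),
    (40, "Third"),
]
DISCRETION = 2  # percentage points of discretion below each boundary


def classify(mark):
    """Return (degree class, borderline flag) for an aggregated mark.

    A mark within DISCRETION points below a class boundary is returned as
    the higher class but flagged as borderline, signalling that the final
    decision rests with the examiners' meeting rather than the formula.
    """
    for threshold, degree_class in CLASS_THRESHOLDS:
        if mark >= threshold:
            return degree_class, False
        if mark >= threshold - DISCRETION:
            return degree_class, True  # provisional: within discretionary band
    return "Fail", False
```

Under such a scheme a mark of 68.5, say, would fall in the discretionary band below the First boundary and be referred to the examiners' meeting, whereas 65 would be classified mechanically.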

20. An identical proportion of examiners and departments (47%) stated that it was normal practice for the external examiner to conduct vivas alone, and very similar proportions of respondents (88% of examiners, 85% of departments) agreed that the external examiner had the opportunity to viva benchmark students within grades as well as borderline cases. A few respondents commented that they disagreed with the principle of benchmark vivas because of the unnecessary stress imposed on the students (DQ23, EQ18). Sixty seven percent of examiners and 76% of departments stated that the department examined provided clear guidelines to internal markers on the criteria for marking within specific degree classes (DQ21, EQ17).

21. Sixty percent of examiners of modular courses, and 50% of departments supplying such courses, agreed that most of the important decisions regarding final degree class (apart from borderlines) were made, effectively, well before the examiners' meeting. Of the total number of examiners who agreed with this notion, 56% considered that they had little or no prior input into degree classification. However, in other cases the external examiner had some prior input by moderating module or unit marks, or approving examination papers. It was apparent from the replies that there was some confusion, particularly among departments, as to the interpretation of 'well before the examiners' meeting'. The question had been intended to elicit information on courses where the degree class is determined in a formulaic manner from module grades, into which the examiner may have had little input (DQ22, EQ19).

22. Examiners and departments used similar methods to compare standards with those at other institutions (DQ24, EQ20). Examiners referred to experience of examining elsewhere, or comparing expectations of students with those applying in the examiner's own department. Departments relied on the comments of the external examiner, or used the experience of their own staff acting as external examiners elsewhere. Equal numbers of examiners said that they made their main comments on standards at the examiners' meeting or in the formal written report. Half the examiners who answered the question (but 38% of the total number of examiners) had expressed some criticism of standards or procedures, but this was balanced by many in the other half who were very positive about standards. The response of the department concerned to criticism had generally been positive, but not universally so (see EQ20). Three departments had expressed concern, but stated that they were constrained by university rules or policies to take poor quality students.

23. There was little difference of view between examiners of modular courses, or of linear ones, on what was the expected role of the external examiner at the examiners' meeting (EQ21). Forty seven percent of examiners of modular courses considered that they assured standards as well as assuring process of assessment, compared to 53% of examiners of linear courses; 30% of examiners of modular courses considered that the main role was to assure process, compared to 33% for linear courses. These figures are consistent with responses to question EQ15.

External examiners' reports

24. Similar proportions of departments (76%) and examiners (67%) considered that departments had clear policies specifying the nature and type of information sought from examiners (DQ26, EQ22). Only one department admitted that it had no formal policy to ensure that recommendations of the external examiner were acted upon; at most of the rest the recommendations were reviewed, and actions monitored, by committees at various levels (DQ27). Seventy six percent of examiners expressed satisfaction that their comments were acted upon, and a further 11% were generally or partially satisfied (EQ23). The proportion of departments stating that examiners received formal written feedback on their report, or minutes of appropriate meetings (66%) was higher than that of examiners acknowledging this form of feedback (44%). Examiners reported a higher incidence than departments of informal verbal feedback (20%), or no feedback other than observing any changes at the time of the next visit to the department (33%) (DQ28, EQ24).

Conclusions

25. The fact that a substantial proportion of external examiners did not consider that they were provided with departmental policies on the expected outcomes of the examining process, or on the parameters of comparability sought, suggests that departments need to make these policies more explicit. This is particularly necessary since it is clear from the survey that departments rely heavily on the external examiner's comments on standards.

26. There were no major differences in view between departments and examiners regarding the role of the external examiner; but there was some indication that examiners were less confident than departments that they were able to influence course design or modes of assessment, or that they were involved in the examining process at stages other than the determination of final degree outcome. The recognition by about half the departments and examiners surveyed that some components of modular courses were not assessed by external examiners chimes with the findings of other recent surveys, and may be a cause for concern. Whilst course work was available in principle for inspection, it was rarely examined in practice because of time pressures. The survey did not reveal to what extent this type of problem was dealt with by module examination boards.

27. Whilst more than 60% of departments in old universities recruited examiners from similar institutions it was not clear whether this might lead to pools of universities, each with common standards, but not necessarily overlapping fully. Many of the respondents in this category stated that their externals were from departments 'with similar high academic standards to our own'.

28. The dislike of many respondents for the concept of a pool of external examiners was on the basis that it would lead to a lowering of standards, and reduce institutional autonomy. It is thus not surprising that QAAHE has commented recently that the vision in the Dearing report of a pool of trained examiners being available for 60 days a year is not tenable.

29. Recent HEQC publications have stressed the need for new external examiners to have the opportunity to participate in institutional induction proceedings and to be fully briefed on their role by the department, and for internal examiners to undergo formal training to equip them for future roles as external examiners. Few biochemistry departments appear to see the need for this training.

30. The finding that more than half the departments and examiners surveyed considered that degree classification for modular courses was effectively decided well before the examiners' meeting has also been noted by others, and led to questions being raised about the ability of external examiners to influence events (HEQC,1997b). The fact that more than half of the examiners in the present survey who supported the above statement considered that they had little or no prior input into degree classification thus warrants further attention.

31. There was no indication in the present study that respondents considered that the principal role of the examiner of modular degrees (as compared to linear ones) was to assure the process of assessment rather than to assure standards, as has been reported by other recent broader studies across disciplines (HEQC, 1997b). This finding appears inconsistent to some extent with the previous observation that a number of examiners of modular degrees considered that they had little influence on degree classification.

32. Whilst a large majority of examiners was satisfied that departments took note of their comments, and acted upon them, the lack of formal feedback on actions reported by more than half the examiners gives scope for improvement in departmental practices.

References

  1. Higher Education Quality Council, The future development of the external examiner system. HEQC, London: 1995.

  2. Higher Education Quality Council, Strengthening external examining. HEQC, London: 1996a.

  3. Higher Education Quality Council, Guidelines on Quality Assurance. HEQC, London: 1996b.

  4. Higher Education Quality Council, Graduate Standards Programme Final Report. HEQC, London: 1997a.

  5. Higher Education Quality Council, Assessment in higher education and the role of 'graduateness'. HEQC, London: 1997b.

  6. Silver H, Stennett A and Williams R, The external examiner system: possible futures. Report of a project commissioned by the Higher Education Quality Council. Open University Quality Support Centre, London: 1995.

Appendix 1

DEPARTMENTAL RESPONSES

Note that for some questions the figures indicate the number of times that a particular response was made. Departments may have listed more than one response category, so the figures do not necessarily sum to the total number of departments. In other cases not all departments replied to particular questions.

Policy considerations

1. Does the University or Department have explicit policies that set out

  1. the aims of the system of external examining? [Yes (34); no (4)]
  2. the roles and duties of external examiners? [Yes (37); no (1)]
  3. the particular outcomes expected as a result of the external examining process? [Yes (32); no (5); unclear (1)]
  4. the parameters of comparability (appropriateness of standards for the award and comparison across institutions) that it seeks to be judged by its external examiners? [Yes (28); no (10)]

Role of external examiners

2. What are the defined roles of the external examiner within your Department?

  1. Additional examiner
    • additional marker: modular (4); linear (2)
    • advocate in aggregating marks: modular (7); linear (4)
    • commenting on or setting papers and assignments: modular (15); linear (3)
    • conducting vivas: modular (13); linear (1)
  2. Moderator
    • auditing exam procedures and conduct against regulations: modular (3); linear (1)
    • commenting on appropriateness/ consistency of assessment: modular (19); linear (2)
    • course monitoring; content and development: modular (7); linear (1)
  3. Calibrator
    • assuring comparable academic standards: modular (18); linear (3)
  4. Acting as consultant to department
    • on course content and development: modular (1); linear (0)

3. How is the external examiner informed of these roles? [Briefing before examiners meeting (2); sent copy of regulations (10); in writing (18); by department (2); by examinations officer (3); by report form (1); unspecified (2)]

4. Does the external examiner receive information in a useful spreadsheet format and with sufficient time to identify problem courses, questions, markers? [Yes (35); no (3); 3 commented on lack of time for analysis]

5. Is the external examiner consulted about curriculum design, course content, or any intended changes in courses? [Yes (25); no (9); informed rather than consulted (4)]

6. Is the external examiner invited to comment on assessment modes? If so, when? [Yes (36); no (2); when?: at examination board (23); at teaching board (1); in report (15); to department (1); during visit (2); when modules validated (1); at all times (3). At examination board and in report were listed together by several respondents]

7. Is the external examiner shown drafts of all examination papers for comment? What is the process for dealing with any comments under 6) and 7)? [Yes (38); action on comments: discussed, and agreement reached with: exam board/ staff meeting/ teaching or course committee (17); question setters (6); module leaders (3); head of school (1); exam officer (4); how agreement was reached unspecified (14)]

8. Do you use an external examiner at points in the degree course other than in the determination of final degree outcome? What is the role at these points? [Modular: yes (20); no (12): role: commenting on earlier years examination papers, verifying results (including problem cases), verifying course work mark (15); consultation on modules/ sitting on module assessment boards (3); other (1). Linear: yes (3); no (3): role: approving earlier years grades (2); commenting on curriculum (1)]

9. Are there any course components or modules that contribute to the final degree class that fall outside the remit of the external examiner, or that the examiner is not able to assess at the time of allocation of marks for that component (eg minor courses, options, earlier years' components)? How does the Department deal with the case where one or more course component may be supplied by another Department and fall outside the competence of the external examiner? [Modular: yes (19); no (13). What?: modules outside biochemistry (15), dealt with by external examiners for other subjects (11), or by accepting marks moderated by other subject degree boards (not specified if an external was involved) (4); 3rd year course work (1), handling not stated; earlier year marks or continuous assessment (2), handling not stated; oral presentation on project (1), where the external saw the dissertation. Linear: yes (2); no (4). What?: earlier year subsidiary courses (2), dealt with by the appropriate subject external examiner].

10. If the course is modular, are there components of the mark for each module that fall outside the remit of the external examiner (eg course work)? [Yes (19); no (13): what?: course work (7); continuous assessment (5); seminars, problem solving, practicals (2). A number said course work material could be inspected if desired, but this was rarely carried out].

11. Is the external examiner free to check mark all examination scripts and projects? [Yes (38)-some commented that in practice an examiner was usually sent a selection]

Selection and appointment

12. What criteria form the basis for the selection of external examiners? [Known repute, track record (experience) as an examiner (24); a senior person in the field (19)-of whom 3 specified professorial status; wide knowledge of subject (9); familiarity with type of course at department inspected (7); availability (5); ability to command authority/ respect (5); having a suitable personality (2); impartiality (2); not too far away (2); from a department of similar academic standing (1)]

13. What internal processes ensure that nomination and appointment criteria are being met? [Approval needed from university or college academic board/ senate/ council (23); faculty (10); department (teaching) committee (3); no formal procedure (2)]

14. Do you recruit external examiners from a subset of similar institutional departments to your own or from across the whole biological sciences sector? [Pre-1992 institutions: similar (20); whole sector (12); post-1992: similar (1); whole sector (5)]

15. Does your process allow reciprocal appointment of external examiners across institutional departments or sub-units? [Yes (6); no (25); not disallowed but has not occurred (5); unsure (2)]

16. What are the regulations for appointing a former member of staff as an external examiner? [Not allowed within 3 years (10); not within 5 years (6); not within 7 years (1); unspecified minimum time (5); not done/ has not arisen (8); no formal regulations (5); not known (3)]

17. What is the maximum appointment period for an external examiner? [3 years (23)-can often be extended to 4; 4 years (14); not known (1)]

18. Would you welcome the concept that external examiners should only be drawn from a limited pool of specially trained individuals? Should the Department appoint an examiner from the pool or should this be the role of an independent external body? [Pre-1992; yes (6); no (24); unsure (2); post-1992; yes (3); no (1); unsure (2): overall; yes (9, 24%); no (25, 66%); unsure (4, 10%): of those saying yes, 5 said the department should appoint the external examiner, 2 said the external body; of those saying no, if it had to be introduced, 8 said the department should appoint, 2 said the external body]

Development and training in examinership

19. Does your department have policies that

  1. provide training for new and continuing external examiners, particularly as regards briefing on all aspects of the expected role, and offering adequate and timely opportunities to consider and clarify their role within the institutional context? [Yes (6); no (18); briefing only (14); 4 commented that externals are experienced and would not welcome training]
  2. develop and foster the work of internal examiners in order to prepare them to act as external examiners? [Yes (3); no (22); informal learning through experience and observation of procedures (13)]

Board of examiners/ assessment panels

20. What formula or process is used to determine the degree class for students on this course? How rigid are the rules for converting marks to classes, and how wide are the discretionary boundaries? [A large number of different procedures, of varying rigidity, was used to aggregate marks to degree class. Discretionary boundaries: 1% (3); 2% (13); 2-3% (3); 3% (4); relatively narrow/ rigid (4); several % points (3); not indicated (7); all students vivaed, and marks could be adjusted down as well as up as a result (1). Degree class of borderline students was usually resolved by vivas, but 2 departments said that the degree class could be adjusted upwards depending on marks in particular key modules]

21. Is there written guidance for internal markers on the criteria for marking within specific degree classes? [Yes (29); no (7); unsure (2)]

22. If the course is modular, are most of the important decisions regarding final degree class made well before the examiners meeting? What has been the prior input of the external examiner? [Yes (16); no (16); most responding positively stated that the prior input was moderating unit/ module marks (7) or approving examination papers (4). Ten stated that the external examiner could influence degree class at the examiners meeting, chiefly by moderating borderline students]

23. Are vivas conducted by the external examiner alone, or in the presence of an internal observer? Is there the opportunity to viva selected students alone within degree classes to act as benchmarks? [Alone (18); with observer (16); decision left to external examiner (3); vivas not performed (1). Opportunity to viva benchmarks? Yes (28); no, only borderlines (5)]

24. What process is used by the Department to compare standards with other institutions? [Comments of external examiner (38), and some included experience of own staff acting as externals elsewhere (22). One included discussions with employers]

25. What is the role of the external examiner at examiners meetings?

  1. Additional examiner
    • contributing to final decision on degree class (particularly as regards special factors or difficult cases): modular (22); linear (3)
    • commenting on vivas of borderline students: modular (17); linear (5)
    • leading the discussion with authority: modular (5); linear (1)
  2. Moderator
    • auditing exam procedures against regulations: modular (2); linear (0)
    • commenting on appropriateness/ consistency of assessment: modular (19); linear (3)
    • commenting on course content and development: modular (6); linear (1)
  3. Calibrator
    • commenting on comparability of academic standards: modular (10); linear (1)

External examiners' reports

26. Does the Department have a policy specifying the nature and type of information to be sought from external examiners? [Yes (29); no (6); not formal, but understood (2); no reply to question (1)]

27. What procedures are in place to ensure that issues raised by external examiners are reviewed , and acted upon, as appropriate? [Report reviewed at level of senate/ academic advisory committee/ (deputy) principal (5); school/faculty (teaching committee)/by dean (16);at department level (7); by academic board/exam board/ board of studies (6); no official policy (1); unspecified/ no response to question (3)]

28. What form of feedback is used to inform external examiners of any actions taken as a result of their reporting? [Formal written response (16); sent minutes of appropriate meetings/ subject review (9); the examiner can see changes, or is informed, at time of next visit (5); unspecified informal feedback (6); no response to question (2)]

Appendix 2

EXTERNAL EXAMINER RESPONSES

Note that for some questions the figures indicate the number of times that a particular response was made. Examiners may have listed more than one response category, so the figures do not necessarily sum to the total number of external examiners. In other cases not all examiners replied to particular questions.

Policy considerations

1. Were you provided with Departmental policies on

  1. the aims of the system of external examining? [Yes (36); no (9)]
  2. the roles and duties of external examiners? [Yes (42); in part (1); no (2)]
  3. the particular outcomes expected as a result of the external examining process? [Yes (26); no (14); not directly/ explicitly (3); not understood (2)]
  4. the parameters of comparability (appropriateness of standards for the award and comparison across institutions) that it seeks to be judged by its external examiners? [Yes (15); no (24); not directly/ explicitly (5); not understood (1)]

Role of external examiners

2. Did you have a clear understanding of the roles that the Department expected you to perform? [Modular: yes (26); no (0); not initially (2); partial (2). Linear: yes (13); no (1); partial (1)]

3. Did you receive information in a useful spreadsheet format and with sufficient time to identify problem courses, questions and markers? [Yes (40), though 3 commented this was only after a previous complaint; no (2); partial information (3); 6 complained of lack of time for analysis]

4. Were you consulted about curriculum design, course content, or any intended changes in courses? [Yes (23); no (19); informed rather than consulted (3)]

5. Were you invited to comment on assessment modes? If so, when? [Yes (31); no/ not explicitly (14); when?: at examiners meeting (31); in report (12)-often both were stated; during the year (before examinations) (4)]

6. Were you shown drafts of all examination papers for comment? How did the Department deal with any comments that you made under 5) and 6)? [Yes (44); no (1); action on comments: department usually used the advice/ department accepted and implemented comments (31); we came to an agreement by negotiation (5); comments were partially implemented (2)]

7. Did you perform assessments at points in the degree course other than in the determination of final degree outcome? What was your role at these points? [Modular: yes (11); no (19); role: commenting on earlier years' examination papers/ results/ projects (6); determining progression (problem cases, extenuating circumstances) (2); moderating modules/ semester examinations (2); judging oral presentations (1). Linear: yes (6); no (9); role: commenting on earlier years' examination papers/ results/ projects (6); determining progression (1)]

8. Were there any course components or modules that contribute to the final degree class that fell outside your remit, or that you were not able to assess at the time of allocation of marks for that component (eg minor courses, options, earlier years' components)? How did the Department deal with the case where one or more course components may be supplied by another Department and fall outside your particular competence? [Modular: yes (21); no (9). What? Modules outside biochemistry (14), dealt with by external examiners for other subjects (12) or by accepting marks of other boards (2); earlier year marks or continuous assessment (2), relying on internal markers and/or examining selected scripts (2); sandwich year mark (1), relying on departmental assessment. Linear: yes (9); no (6). What? Subsidiary subjects outside biochemistry (3), relying on other examination boards (3); 3rd year course work/ practicals (2), relying on departmental assessment (2); earlier year marks or continuous assessment (3), relying on internal markers and/or examining selected scripts (3)]

9. If the course was modular, were there components of the mark for each module that fell outside your remit (eg course work)? [Yes (16); no (14). Few specified what: course work (2); continuous assessment (2)]

10. Were you free to check mark all examination scripts and projects? [Yes (45), although a number commented that there was not time to examine more than selected ones]

Selection and appointment

11. Does your own Department allow reciprocal appointment of external examiners with the Department that you examined? [Yes (9); no (29); not known to be specifically disallowed, but unlikely to happen (7)]

12. For what time period were you appointed as external examiner? [1 year (1); 2 years (4); 3 years (24); 4 years (14); 5 years (1); 6 years (1). Those appointed for 3 years frequently said that the appointment could be extended by 1 year]

13. Would you welcome the concept that external examiners should only be drawn from a limited pool of specially trained individuals? Should the Department appoint an examiner from the pool, or should this be the role of an independent external body? [Yes (15, 33%), no (27, 60%), unsure (3, 7%); of those saying yes, 6 said the department should choose the examiner from the pool, 5 said the external body, and 2 said either; of those saying no, if it had to be introduced, 7 said the department should choose the examiner, and no-one said the external body]

Development and training in examinership

14. Did the Department that you examined provide training, particularly as regards briefing on all aspects of the expected role, and offering adequate and timely opportunities to consider and clarify your role within the institutional context? [Yes (8); no (20); briefing rather than training (11); no, but it was not needed as I am an experienced examiner (6)]

Board of examiners/ assessment panels

15. What did you consider to be your role at examiners meetings?

  1. Additional examiner
    • contributing to final decision on degree class (particularly as regards special factors or difficult cases): modular (19); linear (7)
    • commenting on vivas of borderline students: modular (12); linear (6)
    • leading the discussion with authority: modular (1); linear (3)
    • acting as advocate for the students: modular (4); linear (1)
    • a cosmetic role, unable to affect decisions appreciably: linear (1)
  2. Moderator
    • commenting on appropriateness/ consistency of assessment: modular (10); linear (6)
    • commenting on course content and coverage: modular (6); linear (4)
  3. Calibrator
    • commenting on comparability of academic standards: modular (11); linear (7)

16. What was your input into the process of converting marks into degree class? [Advice on borderlines (19); could modulate up to 2% (3); could modulate up to 3% (1); an advisory role (8); adjudicating on extenuating circumstances (2); ensuring that the standard of marking allowed the required formulaic fixing of degree class (1); little or none (7); no response to the question (4)]

17. Did the Department have written guidance for internal markers on the criteria for marking within specific degree classes? [Yes (30); no (11); unsure (2); no response to question (2)]

18. Were vivas performed by yourself alone, or in the presence of an internal observer? Was there the opportunity to viva selected students alone within degree classes to act as benchmarks? [Alone (21); with observer or another examiner (16); part of formal committee (1); vivas not performed (5); no response to question (2). Opportunity to viva benchmarks (not necessarily alone)? Yes (28); no, only borderlines (4). Two examiners said that they requested an observer to be present; 2 said that they did not support conducting benchmark vivas because of the unnecessary stress imposed on the students concerned]

19. If the course was modular, were most of the important decisions regarding final degree class made well before the examiners meeting? What had been your prior input to these decisions? [Yes (18); no (8); no reply to question (4). What had been the examiner's prior input? None (7); limited/ very little (3); some involvement (1); advice on problem cases (1); moderating semester examination results (1); moderating marks (1); involvement at preliminary examination board meeting (1)]

20. What process did you use to compare degree standards with those at other institutions? [Experience of examining elsewhere/ discussions with other external examiners (35); comparison of expectations from students with those in examiner's own department (14); no response to question (4). How this was commented on: verbally/ directly to examination board or examiners (16); in a formal written report (17). Did you ever criticise standards? Yes (17; procedures stated by 3, presumably the rest concerned overall standards); no (17). Response of the department to criticism: positive (6); responsive (4); took serious note (2); considered formally (2); produced and implemented action plan (1); concerned, but constrained by university rules (to take poor students) (3); there is continuing dialogue on my criticism (1); the department attempted to defend its position (1); caused a stir, but not acted on (1); unsatisfactory (1)]

21. Did you consider that your expected role was one of assuring due process of assessment rather than one of creating and disseminating standards? [Modular: assuring process (9, 30%); assuring standards (7); both (14). Linear: assuring process (5, 33%); assuring standards (2); both (8)]

External examiners' reports

22. Did the Department have a policy specifying the nature and type of information to be sought from external examiners? [Yes (30); no (11); not really (2); not known (1); no response to question (1)]

23. Were you satisfied that issues raised in your report were reviewed by the Department, and acted upon, as appropriate? [Yes (34); generally or partially (4); in some cases (1); lack of feedback (2); reviewed, but unclear whether acted on (1); no (2)]

24. What form of feedback was used to inform you of any actions taken as a result of the report? [Formal written response (16); informal verbal feedback (9); sent annual programme report or minutes of appropriate examination board meeting (4); none, other than observing any changes the following year (15); no reply to question (1)]



