
Preparing trainees for the MRCPsych examinations

Published online by Cambridge University Press:  02 January 2018


Abstract

This article is aimed at organisers of courses for the Royal College of Psychiatrists' membership examinations (MRCPsych) and College tutors preparing trainees for the MRCPsych. Running revision courses requires planning and a good deal of work, but should be possible for most MRCPsych preparation courses. The theoretical background of assessments is explained. An overview of the types of examination used in the MRCPsych is provided, and advice is given on how trainees can best prepare for them. Advice is also given on the recruitment and retention of examiners for mock clinical exams, how to deal with simulated patients and what equipment is useful to buy for the use of trainees. We also explain how trainees can practise for the written papers and how feedback is best given to them. The new MRCPsych formal examination and workplace-based assessment programme are also discussed.

Type
Research Article
Copyright
Copyright © The Royal College of Psychiatrists 2006 

Before taking the Membership examinations of the Royal College of Psychiatrists (currently the MRCPsych Part I and Part II examinations, or MRCPsych for short) all trainees are required to attend a local ‘MRCPsych preparation course’. Feedback from trainees on the Birmingham MRCPsych preparation course has consistently suggested that it should be more exam-focused. In response we have set up a series of revision courses for both parts of the MRCPsych and this article lays out the principles for running such courses, based on our experience. We also describe the format of the new MRCPsych examination and workplace-based assessments, discussing the implications these will have for organisers of MRCPsych preparation and revision courses.

Assessment in medicine

In medicine the assessment of trainees is of particular importance. Allowing a trainee to pass who should be failed may pose a threat to society. Conversely, failing a trainee who should be passed wastes the trainee's time and money by requiring them to repeat some of their training and may deprive society of a competent practitioner (Schuwirth, 2004). Assessment should be seen as integral to any course or training programme and not merely an add-on (Harden, 1986).

The current MRCPsych is an example of summative assessment: trainees sit the two parts at specific stages during their training and the grades they attain form the basis on which decisions are made about their future. Since the MRCPsych leads to a professional qualification it is also a ‘high-stakes’ exam. By contrast, formative assessment is typically undertaken to provide feedback to the trainee and their educational supervisor about progress and potential difficulties but without contributing to pass/fail decisions. A recent trend in medical education is to soften the distinction between formative and summative assessment (Postgraduate Medical Education and Training Board, 2005) as the emphasis moves away from performance in high-stakes exams to gathering evidence of clinical competence and appropriate professional behaviour and attitudes. The Postgraduate Medical Education and Training Board (PMETB) argues that this evidence is best gathered in the workplace through workplace-based assessments.

Educational assessments should fulfil two particular requirements. First, the assessment should be reliable: it must produce consistent results. Second, it should be valid: it should measure what it is supposed to measure (Newble & Cannon, 2001). Any one method of assessment will have good and bad points. Therefore the best way to build up a complete picture of a learner is by combining a number of different assessments, each of which tests different areas of learning (Schuwirth, 2004). Thus, the current MRCPsych includes multiple choice questions (MCQs), objective structured clinical examinations (OSCEs), patient management problems (PMPs), essays and individual patient assessments (IPAs).

In its new format, which will be introduced in Spring 2008, the MRCPsych written papers will retain the MCQ and OSCE, but drop the essay, PMP and IPA in favour of the EMQ (extended matching question: see below). To these formal exams, and in keeping with the requirements of PMETB, will be added several methods of workplace-based assessment during the course of training. As these assessments involve personal witnessing of trainees’ interactions with patients over a period of time and in a variety of settings, they are thought to have high face validity (Brown & Doshi, 2006). Nevertheless, questions have arisen about the appropriateness of using workplace-based assessments, as they do not have published reliability data for psychiatry in the UK (Rose, 2006). To obtain such data the Royal College of Psychiatrists has carried out a pilot project across 15 sites and involving over 600 trainees. Preliminary results are being analysed at the time of writing.

Thus, the formal exams should preserve confidence in psychiatric training, while the MRCPsych as a whole conforms to PMETB's ‘overarching assessment strategy consisting of workplace based assessment, and examinations of knowledge and clinical skills’ (Postgraduate Medical Education and Training Board, 2005). The MRCPsych is therefore one part of the whole assessment package for the development of trainees.

Why run revision courses?

Trainees’ desire for revision courses for the MRCPsych can be seen from the advertisements that appear in the British Journal of Psychiatry and from the numerous courses advertised on the internet. Furthermore, the large number of psychiatric trainees who graduate from medical schools outside the UK may not have had the same exposure as UK graduates to the examination methods that the College uses. (This may explain Oyebode & Furlong's (2007) finding that foreign graduates perform worse than UK graduates on the MRCPsych.) Revision sessions should not be used merely to give trainees answers. They can be used as interactive sessions that encourage trainees to think for themselves about questions. Nevertheless, some educators feel uncomfortable with the notion of using local MRCPsych preparation courses for the purpose of exam revision, and this is worth brief discussion.

Learning styles

A key concern about revision courses is the type of learning that they can encourage. Newble & Entwistle (1986) divide learning styles into strategic, surface and deep learning. Strategic learning is motivated by a desire to be successful, and leads to patchy and variable understanding. Surface learning is motivated by fear of failure and a desire to complete a course: students tend to rely on learning by rote and focusing on particular tasks. Many feel that it is precisely these types of learning that revision courses encourage. In deep learning, however, individuals are motivated by their interest in the subject matter. They learn because they believe what they learn to be relevant and they are rewarded by acquiring knowledge that helps them to carry out tasks that matter to them. Deep learning is active rather than passive and it involves interaction with others (Gibbs, 1992). Deep learning is more likely to be associated with a better quality of learning (Oxford Centre for Staff Development, 1992).

Assessment-driven learning

We know that assessments drive the way in which learners study (Frederiksen, 1984; Misch, 2002). Ideally, assessments should match the aims and objectives of the curriculum, so that studying for exams becomes the same as studying to become a better doctor (Schuwirth, 2004). This view is implicit in recommendations that psychiatric trainees in Canada should sit regular mock exams throughout their training (Crockford et al, 2004). These incorporate training in ‘case vignettes’, oral examination skills and anxiety management skills. It is also implicit in the College's new competency-based curriculum and its emphasis on workplace-based learning and workplace-based assessments (Bhugra, 2006).

Infrastructure

Running mock exams for a large number of trainees requires a great deal of support and help. Box 1 outlines some of the preparation needed. The Birmingham MRCPsych course is currently organised by three consultants, supported by three honorary lecturers (one for the Part I course and two for the Part II course). This team also runs the revision courses that form part of the MRCPsych course. All are members of the Birmingham MRCPsych course board. With the new MRCPsych format these roles will be revised, with one honorary lecturer supporting each of the specialist training (ST) grades from 1 to 3. The honorary lecturers are specialist registrars on the West Midlands rotation and are appointed through competitive interview by the University of Birmingham. The revision courses are administered by the local psychiatric Postgraduate Medical Education Centre, and the role of the administrator is vital to the smooth running of the courses.

Box 1 Things to do when setting up MRCPsych revision courses

  1. Encourage as many consultants as possible to become College examiners

  2. Have a collection of mannequins (e.g. ophthalmology head, resuscitation dummy) and make these available to trainees. These can be bought from medical equipment catalogues

  3. Have a library of revision books for the examinations: these are useful not just for the trainees but for anyone trying to devise examination papers

  4. Check the College website regularly (in particular: http://www.rcpsych.ac.uk/exams.aspx and http://www.rcpsych.ac.uk/training.aspx)

  5. Obtain the College training videos of OSCEs and make these available to trainees (go to http://www.rcpsych.ac.uk/exams/regulationsandcurricula/examregulations/examformat/partiosce/trainingvideos.aspx)

  6. Make sure there is easy access to appropriate simulated patients (an internet search for ‘simulated patients’ brings up the addresses of suitable agencies)

  7. Ensure that clinicians helping out are aware of local ‘teaching the teachers’ courses

Although the revision courses are part of the Birmingham MRCPsych preparation course, trainees are charged a nominal fee to cover the cost of lunch and also to reduce the non-attendance rate. Any spare places on the course are offered at cost price to trainees from outside the region.

Recruiting and retaining examiners

It is important to have a bank of well-trained examiners for the mock exams. It is also essential to have a couple of examiners in reserve on the day of the exams. If there are no cancellations they could act as floating external examiners. Where possible we try to recruit consultants who are or have been examiners for the MRCPsych exams. Specialist registrars are often keen to examine and offer feedback and they are therefore a vital part of the pool of mock examiners. College tutors and MRCPsych preparation course organisers should encourage consultants who are educational supervisors to join the MRCPsych board (regular notices asking for applications appear in the Psychiatric Bulletin and an application form can be downloaded from http://www.rcpsych.ac.uk/docs/Examiner%20Application%20Form%20181202%20Protected.doc).

A word of caution is necessary: examiners may be party to information that should not be divulged to trainees (for instance, examiners must not remove any examination material from the OSCE stations during the College exams). Consultants who are involved in setting or organising exams for the College need to be particularly careful.

At Birmingham, revision courses take place on a Saturday and can involve a considerable amount of extra work. We have found that paying the examiners helps minimise non-attendance.

Organising revision courses for written examinations

Multiple choice questions

Multiple choice questions (MCQs) are ubiquitous in both undergraduate and postgraduate medicine. They are popular because of their reliability and, perhaps most important, the fact that they can be ‘marked’ by computers, thus making them ideal for testing large numbers of candidates. The MCQ format tends to be used for assessing factual knowledge (Schuwirth, 2004).

All three papers of the new MRCPsych will contain MCQs with a ‘1 from 5 single best answer’ format: that is, the candidate must select the most likely answer out of five options, some of which may be correct but one of which is more likely to be correct than the others. This is different from the current MRCPsych, which has individual statement questions (ISQs): individual statements are presented, each of which has to be marked as true or false. It is assumed that the new format will reduce the chances of guessing the correct answer. These changes are ongoing and readers are strongly recommended to keep up to date with information on the College website (http://www.rcpsych.ac.uk).

Extended matching questions

The three written papers of the new MRCPsych will each also include extended matching questions (EMQs). The EMQ (also known as the extended matching item (EMI) question) is a variant of the traditional MCQ, and it is increasingly becoming the preferred format for written undergraduate and postgraduate medical examinations. In an EMQ the stem is typically a clinical scenario about which related questions are asked. Candidates must choose their responses from a list of at least 5 options (usually 10–20 are offered). If, for example, 10 options are given, the candidate is likely to guess correctly only 10% of the time (George, 2003), and this minimises the recognition effect that occurs in standard MCQs. By using clinical vignettes instead of facts, the items can be used to test the application of knowledge or clinical reasoning (Case & Swanson, 1993; Schuwirth & van der Vleuten, 2004). Another advantage of EMQs is that they are perceived as being the ‘fairest’ examinations (McCoubrie, 2004).
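The guessing arithmetic is easy to verify. A minimal sketch (the item and option counts below are illustrative, not College figures):

```python
# Expected score from blind guessing on a paper of n items.
# Illustrative only: item and option counts are assumptions.

def expected_guess_score(n_items: int, n_options: int) -> float:
    """Expected number of correct answers if every item is guessed at random."""
    return n_items / n_options

# A true/false ISQ offers a 1-in-2 guess; an EMQ with 10 options a 1-in-10 guess.
isq = expected_guess_score(100, 2)   # 50.0 correct by chance
emq = expected_guess_score(100, 10)  # 10.0 correct by chance
print(isq, emq)
```

This is why a 10-option EMQ gives only a 10% chance of a lucky hit, against 50% for a true/false statement.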

EMQs are often written in sets, to be used together in the same exam, where the same theme and option list are used for two or more different vignettes. Case & Swanson (2002) have described how EMQs can be constructed, and the steps involved are outlined in steps 1–5 below, using as an example a pilot EMQ produced by the Royal College of Psychiatrists (Box 2).

Box 2 A sample EMQ produced by the Royal College of Psychiatrists

Theme: Disorders of perception

Options:

  A. Completion illusion

  B. Delusional perception

  C. Dysmegalopsia

  D. Extracampine hallucination

  E. Functional hallucination

  F. Haptic hallucination

  G. Hygric hallucination

  H. Pareidolic illusion

  I. Reflex hallucination

  J. Synaesthesia

Lead-in: Which of the above descriptive psychopathological terms refers to the following symptoms?

37) ‘I hear the voice of my long dead father, as if he were talking to me now, when I hear water running from the bath tap.’

38) ‘I hear the voice of my father speaking to me from the other side of the city.’

39) A young woman describes looking up into the clouds and seeing an image of her fiancé.

40) A 22-year-old woman with schizophrenia describes the sensation that somebody is touching her body in intimate places.

Answers:

37 – E

38 – D

39 – H

40 – F

(http://www.rcpsych.ac.uk/exams/regulationsandcurricula/examregulations/examformat/partiexamplequestions.aspx)

1. Decide on the theme for the set

The theme could be a presenting problem (e.g. low mood), a situation (e.g. admission/discharge from the emergency department), or a drug class (e.g. atypical antipsychotics). In the example in Box 2 it is disorders of perception.

2. Write the lead-in for the set

The lead-in indicates the relationship between the stems and options, clarifying the question posed. In the example it is ‘Which of the above descriptive psychopathological terms refers to the following symptoms?’

3. Prepare the list of options

The options should be single words or very short phrases; they should be listed in alphabetical order unless there is a logical alternative order.

4. Write the items or stems

The items within a set should be similar in structure: in the example they are the description of symptoms. Often they are patient vignettes.

5. Review the items

A check should be made to ensure that there is only one ‘best’ answer for each question. There should be at least four reasonable distractors (incorrect options) for each item.
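The construction and review steps above can be sketched as a small data structure with the step-5 checks automated. This is an illustrative sketch only; the field names and checks are assumptions, not any College tool:

```python
# A sketch of an EMQ set as a simple data structure, with the step-5 review
# checks automated. Field names and checks are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class EMQSet:
    theme: str
    lead_in: str
    options: list                               # single words or short phrases
    items: list = field(default_factory=list)   # (vignette, best_answer_index)

    def review(self):
        """Return a list of problems found; an empty list means the set passes."""
        problems = []
        if len(self.options) < 5:
            problems.append("needs at least 4 distractors plus the best answer")
        if self.options != sorted(self.options):
            problems.append("options should be listed alphabetically")
        for vignette, answer in self.items:
            if not 0 <= answer < len(self.options):
                problems.append(f"no valid best answer for: {vignette[:40]}")
        return problems

emq = EMQSet(
    theme="Disorders of perception",
    lead_in="Which term refers to the following symptom?",
    options=["Completion illusion", "Delusional perception", "Dysmegalopsia",
             "Functional hallucination", "Pareidolic illusion"],
    items=[("Hears a voice only when the bath tap runs.", 3)],
)
print(emq.review())   # an empty list: the set passes review
```

Automating such checks cannot judge whether distractors are "reasonable", but it catches the mechanical faults (too few options, unordered lists, missing answers) before human review.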

MCQ and EMQ revision

At Birmingham, revision for this part of the exam takes place throughout the MRCPsych course, as course speakers are encouraged to produce questions as part of their lectures. In addition, during the Part I course 4 hours are devoted to MCQs and EMQs. During these sessions MCQs and EMQs linked to the teaching of that semester are introduced and answers are discussed.

Practice MCQs are available on various websites and also from numerous books (e.g. McNamara, 2003; Michael, 2004a). Shortly before trainees sit the Part I written paper they sit a half-day mock exam under examination conditions. The candidates mark each other's papers and the course leader oversees a brief discussion of the answers. The subject matter of the mock exams should reflect the structure of the College examination (see the College website for details: currently http://www.rcpsych.ac.uk/exams/regulationsandcurricula/examregulations/examformat.aspx). An important part of the revision is not only to get trainees used to the format of MCQs and EMQs but also to encourage interactive discussion of possible answers.

Speakers on the Part II course are also encouraged to discuss MCQs with the trainees as part of their lecture. The Part II Written Revision Day includes a 1-hour session going through selected MCQs.

As the three knowledge-based papers of the new MRCPsych will contain only MCQs and EMQs one would assume that there will be greater demand from trainees for MCQ and EMQ practice in revision courses. Furthermore, training for assessors of workplace-based assessments will certainly emphasise preparing trainees for this assessment and there may also be a demand for this in revision courses.

Essay papers and critical appraisal questions

The current Part II written exam contains an essay paper, and essays are a good way of assessing a student's ability to summarise and integrate information, as well as to hypothesise, find relations and apply known procedures to new situations (Schuwirth & van der Vleuten, 2004; Tyrer & Oyebode, 2004). They do suffer from limited reliability and are expensive to mark (Schuwirth & van der Vleuten, 2004). There is no direct replacement for the essay paper in the new MRCPsych.

The introduction of the critical appraisal paper in the written Part II examination some years ago seems to have had a positive effect on the way journal clubs are run in psychiatric hospitals (Taylor & Warner, 2000). Paper 3 of the new MRCPsych will test the critical appraisal of research relevant to clinical practice, but in EMQ and MCQ format. The advantage of this will be that the paper can be marked reliably and without bias. As part of workplace-based assessments, trainees will also have to do four journal club presentations during the course of their training. This is another example of the power of assessment to drive student learning. Brown & Wilkinson (2005) and Ogundipe et al (2005) provide a guide to critical appraisal papers.

Organising revision for clinical exams

Simulated patients

The concept of using simulated patients in medical education was developed in the 1960s, initially to assess medical students’ performance (Barrows, 1993). Subsequently Harden and his colleagues developed their use in OSCEs (Harden & Gleeson, 1979). They are also widely used in a variety of educational roles in medicine (Fenwick et al, 2004; Wallace et al, 2002).

Recruiting simulated patients

Many medical schools now have a pool of people who are used as simulated patients for exams and teaching. Simulated patients are often actors, drama students or even members of amateur dramatic societies. Developing a bank of simulated patients solely for use in revising for exams would be extremely time-consuming (Ker et al, 2005) and it is better to tap into an existing resource. A professional agency is used for the Birmingham MRCPsych revision courses.

Preparing simulated patients

Simulated patients participating in a mock OSCE receive vignettes in advance. It is important to make explicit what information these individuals are expected to volunteer and how they are to divulge it. Before the OSCE begins, a briefing should emphasise the need for consistency in the simulation for all candidates and should also give the simulated patients the opportunity to clarify any points in their instructions. Floating external examiners can be used to verify the accuracy and consistency of the simulations during the exam. A debriefing of simulated patients after the exam is also useful for the course organisers, in planning future courses and giving feedback to trainees.

Feedback to candidates from mock clinical exams

For trainees, feedback provides invaluable information about how they might improve their performance. All examiners should be aware of how to give feedback in a manner that allows the trainee to feel safe. ‘Teaching the teachers’ courses are a good way to disseminate these techniques (Vassilas et al, 2003). Although they have been criticised (Walsh, 2005), Pendleton's rules (Box 3) are a useful starting point in this regard.

Box 3 Pendleton's rules

  1. The learner first performs an activity

  2. Subsequently the learner is asked what they thought was done well

  3. The teacher then talks about what was done well

  4. The learner then describes what might have been improved on

  5. The teacher then comments on the aspects that might be improved and offers suggestions in a constructive manner

(Pendleton et al, 1984)

Building on Pendleton's ideas, teachers at the University of Calgary Medical School developed the framework of ‘descriptive feedback’ (Silverman et al, 1996; Kurtz et al, 1998). Devised to help with teaching communication skills, this offers more specific guidance and techniques for giving feedback in a non-threatening manner (Vassilas & Ho, 2000). The essential components of descriptive feedback are listed in Box 4. The learner is assisted in finding a solution to any difficulties that arise by using as a starting point the problems that they themselves experience. A non-judgemental approach is used and the feedback describes what actually happened. During this session the focus is on desired outcomes and trainees are asked what they want to achieve. Feedback is balanced, in that comments are made about things that worked and things that did not, but the strict order required by Pendleton's rules is not necessarily followed. Trainees are encouraged to say what steps might be helpful next time. This latter point is particularly important in a revision course setting: frequently trainees want to know ‘the correct answer’, but in an examination, as in a real clinical situation, there may be no single correct approach. For the OSCE feedback sessions at Birmingham we try to get the group of trainees to suggest how problems that have arisen might have been better managed.

Box 4 Descriptive feedback

Feedback should be:

  1. non-judgemental

  2. specific

  3. directed to particular behaviours

  4. checked with the recipient

  5. outcome-based

  6. focused on problem-solving, in the form of suggestions rather than prescriptive comments

(After Kurtz et al, 1998)

OSCEs

The OSCE was first introduced into undergraduate medical teaching at Dundee Medical School (Harden et al, 1975; Harden & Gleeson, 1979). It has become ubiquitous in both undergraduate and postgraduate medical exams (Adamo, 2003). The OSCE became part of the MRCPsych in 2003, following an evaluation of the MRCPsych examinations by a medical educationalist (Tyrer & Oyebode, 2004).

The OSCE format allows a wide range of skills to be tested and reduces the influence of any one examiner on the overall outcome for the candidate (Wallace et al, 2002). The OSCE is a test of clinical skills that is independent of factual knowledge, and foreknowledge of the stations does not appear to influence performance (Wilkinson et al, 2003). In an OSCE examinees move through a series of ‘stations’ at which various clinical tasks are carried out. These tasks are observed and scored by an examiner. Reliability is increased with a larger number of stations (Newble & Swanson, 1988; Van der Vleuten & Swanson, 1990).

Smee (2003) outlines three areas that limit the extent to which OSCEs can be used to assess clinical practice. First, time-limited stations often require trainees to perform isolated aspects of the clinical encounter. This deconstructs the doctor–patient interaction, which may be inappropriate for formative assessments. On the other hand, limiting the time means that there can be more stations, allowing for reliable, summative decision-making, which is how they are used by the Royal College of Psychiatrists. Second, OSCEs rely on checklists, and this assumes that clinical interactions can be described as a list of actions. Checklists tend to emphasise thoroughness, which may become less relevant as the clinical experience of candidates increases. Hodges et al (1999) confirmed that OSCEs were not so good at assessing advanced psychiatric skills. Third, there are limits as to the type of clinical scenario that can be portrayed in an OSCE. Again, this becomes more of an issue for advanced trainees.

An OSCE station can be broken down into the following four components (Tyrer, 2005).

The construct or stem

The construct states the aims of the station.

The objectives or checklist

The objectives are the actions that should be taken in response to the information in the construct. The items should be (a) appropriate for the level of training being assessed, (b) task-based and (c) observable (so that they can be scored). A score must be assigned to every item, and in the MRCPsych scoring is on a scale from A to F (F is a severe fail). Individual items are assigned different relative weights, with items considered more important being worth more. Using such weighting may improve the validity of an individual's score in an OSCE, which may affect which candidates pass or fail while at the same time not changing the overall pass rate (Smee, 2003).
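As an illustration of how such weighted checklist marking might work, here is a hypothetical sketch; the A–F points mapping and the weights are assumptions for the example, not the College's actual marking scheme:

```python
# A hypothetical sketch of weighted checklist scoring for one OSCE station.
# The A-F -> points mapping and the item weights are illustrative assumptions.

GRADE_POINTS = {"A": 5, "B": 4, "C": 3, "D": 2, "E": 1, "F": 0}

def station_score(items):
    """items: list of (grade, weight) pairs. Returns the weighted % of maximum."""
    earned = sum(GRADE_POINTS[grade] * weight for grade, weight in items)
    maximum = sum(GRADE_POINTS["A"] * weight for _, weight in items)
    return 100 * earned / maximum

# Items considered more important (e.g. risk assessment) carry a higher weight.
marks = [("B", 2), ("A", 1), ("C", 3)]
print(round(station_score(marks), 1))  # 73.3
```

Because the weights scale both the earned and the maximum score, weighting shifts which candidates sit either side of a fixed pass mark without changing the range of possible scores.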

Instructions to candidates

Any relevant background information is given and the candidate's task is clearly stated.

Instructions to simulated patients

The approach to this is described above.

Box 5 shows extracts from a sample OSCE. The current MRCPsych Part I OSCE, with its 12 stations, shows high levels of reliability (Oyebode, 2002). The new-format clinical examination will be an OSCE in two parts, held on the same day and including complex cases. The first part will contain 10 stand-alone stations, each lasting 8 min (including 1 min of reading time). The second part will contain five pairs of ‘linked’ stations, which will allow the assessment of more complex competencies, each station lasting 12 min (including 2 min for reading and preparation).

Box 5 Extracts from an example OSCE

The construct

The candidate demonstrates the ability to establish rapport with a distressed relative and to explain the aetiology, nature, signs and symptoms of schizophrenia, its treatment using both pharmacological and psychosocial methods, in a way that the relative understands, and balances accurate and realistic information with instillation of hope.

Instructions to the candidate

This lady, Mrs Bennett, is the divorced mother of one of your patients, Stephen Bennett, who is a 21-year-old university student recovering from a recurrence (second episode) of a schizophrenic illness. This first presented with an acute onset 3 years ago…

Explain the nature of schizophrenia and the long-term prospects for her son.

(Royal College of Psychiatrists, http://www.rcpsych.ac.uk/exams/regulationsandcurricula/examregulations/examformat/partiosce/sampleinstructions.aspx)

Part I clinical revision courses

Two different ways of organising revision for OSCEs have been described – the OSCE workshop (Naeem et al, 2004) and the mock OSCE (Pryde et al, 2005). A comprehensive revision course will have elements of both types of revision. In the Birmingham MRCPsych Part I preparation course half a day per semester is dedicated to an OSCE workshop, and there is a separate mock OSCE revision course, held twice a year shortly before candidates sit the OSCE component of the MRCPsych.

OSCE workshops

In the OSCE workshop described by Naeem et al (2003), six candidates were divided into two groups of three, each group having a facilitator. After the facilitators had described how OSCEs work, they ‘role played’ a sample station. Each group then tried out a station that it had designed on members of the other group. Finally, the trainees were given a copy of their marks and feedback was given. In this type of OSCE workshop trainees gain experience of designing their own stations and take responsibility for their own learning.

Mock OSCEs

While the OSCE workshop provides practice for the contents of the various stations, a mock OSCE is a rehearsal of the format of the exam, allowing practice of essential skills such as time management, ‘thinking on one's feet’ and moving on from a difficult station.

Pryde et al (2005) have described in detail the organisation of a mock OSCE. On the Birmingham MRCPsych revision course the mock OSCEs are held after the results of the written paper have been published and before the actual OSCE. The venue should be large enough to allow separate rooms for each of the stations as well as rooms for waiting and for briefing the simulated patients. The Birmingham MRCPsych course uses a hospital out-patient department.

Examiners must be warned not to interact with the candidates apart from asking them their name and candidate number, unless the station demands it, and to resist the temptation to teach.

Designing the circuit

The circuit should closely mirror the format of the actual OSCE, with the same number of stations and the same timing for each station. Providing rest stations (stations with no examiner, where the candidate sits down for 7 min) will allow for extra candidates and for the use of paired stations, in which the content of the second station is linked to that of the first. To avoid confusion, it is essential to have successive stations in adjacent rooms and to make it clear to candidates where they should go. A wide variety of stations should be used, testing the five skills listed in Box 6. There are a number of books giving sample stations (e.g. Michael, 2004b; Murthty, 2004; Rao, 2005) as well as internet resources.

Box 6 The five skills tested in the OSCE

  1. History taking

  2. Physical examination skills

  3. Practical skills/use of equipment

  4. Emergency management

  5. Communication skills

(http://www.rcpsych.ac.uk/exams/regulationsandcurricula/examregulations/examformat/partiosce/about.aspx)

In Birmingham, simulated patients are used for at least half of the stations. For the rest we use senior house officers who have passed Part I, teaching dummies, investigation results, etc.

Mark sheets can be designed for each station using the one on the College website (http://www.rcpsych.ac.uk/pdf/osce.pdf) as a template. The construct of each station should be clearly specified, and the testable task should then be broken down into several components, each of which is graded on a 5-point scale. For most stations, communication skills will be a component.
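The structure described above can be sketched in code. The following is a minimal illustration only, not the College mark sheet itself: the component names and the averaging rule are invented for the example, and a real sheet would follow the College template.

```python
# Illustrative sketch of a station mark sheet: the testable task is broken
# into components, each graded on a 5-point scale (here 0-4).
# Component names and the aggregation rule are hypothetical examples.

STATION_COMPONENTS = [
    "Rapport",
    "History of presenting complaint",
    "Risk assessment",
    "Communication skills",  # a component in most stations
]

def score_station(grades):
    """grades: dict mapping each component to a grade 0..4.
    Returns the mean grade across all components."""
    missing = [c for c in STATION_COMPONENTS if c not in grades]
    if missing:
        raise ValueError(f"Ungraded components: {missing}")
    return sum(grades[c] for c in STATION_COMPONENTS) / len(STATION_COMPONENTS)

marks = {"Rapport": 3, "History of presenting complaint": 4,
         "Risk assessment": 2, "Communication skills": 3}
print(score_station(marks))  # 3.0
```

Keeping the component list explicit makes it easy for course organisers to adapt one template across many stations while flagging any component an examiner forgot to grade.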

Timekeeping

At least one facilitator should be assigned responsibility for timekeeping. Several software programs available on the internet allow computerised timekeeping, with recorded voice instructions that simulate exam conditions. Having a stopwatch and bell in reserve is essential.
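The logic behind such a timekeeping program is simple. The sketch below (a hypothetical illustration, not one of the programs mentioned above) builds the announcement schedule for a circuit of 7-minute stations with a 1-minute warning; in real use each announcement would trigger a bell or recorded voice at the stated time.

```python
# Minimal sketch of a computerised OSCE timekeeper (illustrative only).
# Builds the list of timed announcements for a whole circuit; a driver
# loop with time.sleep() could then play them in real time.

def circuit_schedule(n_stations, station_min=7, warning_min=1):
    """Return (minute, announcement) pairs for an OSCE circuit."""
    events = []
    t = 0
    for station in range(1, n_stations + 1):
        events.append((t, f"Station {station}: begin"))
        events.append((t + station_min - warning_min,
                       f"Station {station}: {warning_min} minute remaining"))
        t += station_min
        events.append((t, f"Station {station}: stop and move on"))
    return events

if __name__ == "__main__":
    for minute, msg in circuit_schedule(12):
        print(f"{minute:3d} min  {msg}")
```

Generating the whole schedule up front also gives the facilitator a printable running order to follow with the reserve stopwatch if the software fails.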

Feedback

Before the mock OSCEs at Birmingham there is a discussion with the examiners and simulated patients on how best to give feedback. The examiners are encouraged to discuss each candidate's performance with the simulated patient and to write down their comments using the principles of descriptive feedback outlined above. Feedback should say what was done well in addition to what could be improved on. After the mocks, trainees receive a copy of this written feedback. There is then a plenary session involving the simulated patients, examiners and trainees. Each examiner and simulated patient in turn gives feedback about how the trainees in general performed at their station, saying what was done well and what could have been improved on. The trainees then have an opportunity to question the examiners and simulated patients and to comment themselves on how best to approach the station. Offering feedback to learners immediately after they have completed the stations can improve competency in the performance of criterion-based tasks, at least over the short term (Hodder et al, 1989).

The individual patient assessment or long case

While OSCEs have many advantages, by their nature they break down complex clinical skills into small 'testable' tasks. This runs the risk of training doctors who are very good at these individual tasks but are unable to assimilate them into a coherent assessment (Wallace et al, 2002). For this reason, the current Part II examination includes a standard long case or individual patient assessment (IPA). The IPA is an assessment of trainees' ability to take a comprehensive history and demonstrate good communication skills during an observed interview that lasts up to 10 min. Candidates must process, analyse and integrate information in order to reach a diagnosis, plan a management programme and give feedback to the patient and carers.

A major drawback of the IPA is its lack of reliability. Each candidate would have to see several long cases with several different examiners to achieve the level of reliability appropriate for a high-stakes examination, and this is clearly not practical. Consequently, in the new MRCPsych format the formal IPA examination will be replaced by an equivalent workplace-based assessment, the assessed clinical encounter (ACE). Each candidate will have to complete a minimum of eight ACEs during the course of their training. Three of these will be assessed by a validated College-approved assessor, and the marks will count towards the final clinical mark of the OSCE as part of a summative assessment.

Patient management problems

Patient management problems (PMPs) are another aspect of the current Part II clinical examination. The examiner reads aloud clinical vignettes and asks the candidate three specified probe questions for each vignette. Such PMPs provide examiners with an opportunity to explore the candidates' skills in applying clinical knowledge in a wider and more practical setting (McCreadie, 2002). Sample questions are provided by McCreadie (2002) and are also available on the College website (http://www.rcpsych.ac.uk/exams/regulationsandcurricula/examregulations/examformat/structuredpatientmanagement.asp).

The PMP format is subject to the same reliability problems as the IPA, despite the College's attempts to standardise them across the various examination centres in the country. Consequently PMPs have also been dropped from the new MRCPsych. However, we believe that it is wrong to assume that the workplace-based assessment methods of the case-based discussion or mini-ACE are equivalent to the PMP.

Evaluating revision courses

As part of the process of trying to improve our teaching on the Birmingham revision course, trainees are asked to complete standard feedback forms at the end of each part of the course. Because of the limited number of places available on our clinical revision courses a certain number of trainees are also accepted as ‘observers’ in the mock OSCEs and the Part I and II clinical examinations. Initial feedback on this scheme (in the OSCEs) from examiners and examinees has been very positive.

The new MRCPsych

As mentioned above, the College's new examination and assessment programme is due to commence in Spring 2008. Changes to the MRCPsych format have been driven by the introduction of a national programme of competency-based training and the development by the College of a competency-based curriculum, due to go live throughout the UK in August 2007.

The new curriculum currently comprises a ‘core and general module’, together with modules for each of the six psychiatric specialties: adult; child and adolescent; forensic; learning disability; old age; and psychotherapy. PMETB has provisionally approved each of these modules. Additional modules on neuropsychiatry, liaison psychiatry, and social and rehabilitation psychiatry are under development and further modules are planned for the future.

Pilot studies of the curriculum are taking place in deaneries across the UK, supported by locally run workshops and by a package of supporting materials – the Pilot Pack. The latter can be downloaded from the College website (http://www.rcpsych.ac.uk/training/curriculumpilotpack.aspx) and it contains the new curriculum itself.

The formal examinations: Papers 1–3 and the OSCE

The new examination and assessment programme combines formal MRCPsych exams with workplace-based assessments. The distinction between Parts I and II of the current MRCPsych has been replaced with a modular approach. There are three knowledge-based papers (Papers 1, 2 and 3), containing MCQs ('1 from 5' single-best-answer questions) and EMQs, and a clinical exam in the form of an OSCE. Each component requires candidates to have achieved certain competencies at specified levels in their workplace-based assessments. The eligibility structure can also be downloaded from the College website (http://www.rcpsych.ac.uk/pdf/MRCPsych%20Assessment%20Programme%20Final%2020of3.pdf).

Paper 1 will be a basic paper assessing areas such as history-taking, treatment explanation and record-keeping. Paper 2 will be a more specialist paper focusing on areas such as psychotropic drugs, advanced psychology and neuropsychiatry. Paper 3 will contain complex questions assessing clinical specialties (including general adult psychiatry as a specialty) and critical appraisal of research relevant to clinical practice. Table 1 shows the contents of the three written papers. The approximate percentage of a paper allocated to each area will be published in due course.

Table 1 Content of the three written papers in the new MRCPsych

Paper 1

  History and mental state examination
  Cognitive assessment
  Neurological examination
  Patient assessment
  Aetiology
  Diagnosis
  Classification
  Basic psychopharmacology
  Basic psychological processes
  Human psychological development
  Social psychology
  Description and measurement
  Basic psychological treatments
  Prevention of psychiatric disorder
  Descriptive psychopathology
  Dynamic psychopathology
  History of psychiatry
  Basic ethics and philosophy of psychiatry
  Stigma and culture

Paper 2

  General principles of psychopharmacology (pharmacokinetics, pharmacodynamics)
  Psychotropic drugs
  Adverse reactions to treatment
  Evaluation of treatments
  Neuropsychiatry (physiology, endocrinology, chemistry, anatomy, pathology)
  Genetics
  Statistics and research (basic)
  Epidemiology
  Advanced psychological processes and treatments

Paper 3

  Research methods
  Evidence-based practice
  Statistics
  Critical appraisal
  Clinical topics:

    1. Liaison psychiatry

    2. Forensic psychiatry

    3. Addiction psychiatry

    4. Child and adolescent psychiatry

    5. Psychotherapy

    6. Psychiatry of learning disability

    7. Social and rehabilitation psychiatry

    8. Old age psychiatry

To allow for more flexibility and greater trainee-led learning, the written papers can be taken in any order, provided that the eligibility criteria are met. A suggested time frame is given, and there is a minimum mandatory training time of 12 months before Paper 1 can be taken. Papers 1 and 2 may be taken together or individually. A pass in Paper 3 may be ‘banked’ for up to 18 months, allowing three attempts at the OSCE (described above). If the third attempt is unsuccessful, Paper 3 will have to be taken again.

Workplace-based assessments

The College has established four principles for workplace-based assessments:

  1. they should focus on performance in the workplace

  2. decisions on performance should be evidence-based

  3. evidence must be triangulated with assessments by different assessors, at different times and using different methods

  4. records of assessments must be permanent (for further details see Bhugra et al, 2007).

The College has developed nine workplace-based assessment methods, four of which come from the new 2-year foundation programme that trainees must complete before embarking on specialist training. Each method uses a specific assessment form modified for use in psychiatric training. Most are scored on a 6-point Likert scale on which 4 indicates successful completion. Most are accompanied by a 'guide and performance descriptor' setting out the expected performance of candidates at each point of the scale for each of the criteria. Most methods are followed by immediate feedback to the candidate. The current rating forms available in the Pilot Pack are disappointing: the domains to be rated are very broad, they lack operationalised criteria and there is no space for the assessor to justify their scores in writing.

Implications of the new programme

A significant change for trainees is the requirement to complete several time-consuming workplace-based assessments. With the added pressures of the European Working Time Directive there will be precious little time for actual training and service delivery. Trainees from other countries who have not been exposed to workplace-based assessments during undergraduate training or the foundation years are further disadvantaged.

The new examination and assessment programme presents a challenge to organisers of MRCPsych preparation and revision courses. The standard lecture format of most preparation courses looks increasingly obsolete. As already mentioned, there will be increased demand for MCQ, EMQ and OSCE revision. There will also, no doubt, be an increased demand for revision courses aimed at workplace-based assessments, leading to the paradox that the introduction of assessment methods to measure performance in the workplace will drive the need for training in these methods outside the workplace.

Course organisers need to be aware that in June 2005 it became mandatory for psychiatric trainees to receive training directly from service users and carers (Fadden et al, 2005). The vision of the College is that service users and carers should be involved in the planning, delivery and evaluation of psychiatric training as well as the assessment of psychiatric trainees.

Conclusions

Preparing the materials for the revision courses, setting up banks of questions, liaising with simulated patients and organising mock examinations are time-consuming and labour-intensive. However, once the initial work has been done, materials can be re-used, examiners are often happy to return to examine and the process becomes much easier to manage. Box 7 gives an overview of the planning of such courses.

Box 7 Summary of revision course timetable

Throughout the local MRCPsych course:

  1. encourage speakers on the course to produce MCQs and/or PMPs and to use these in their talks

  2. have practice sessions of EMQs and ISQs, to encourage familiarity with the format

  3. run OSCE workshops during the Part I course, to encourage familiarity with the format

Prior to the Part I examination:

  1. before the written paper hold a half-day MCQ session

  2. before the clinical examination hold a half-day mock OSCE examination

If revision is provided uncritically, it may result in strategic learning. Thus, an important part of the process is to ensure that those running feedback sessions use the approaches outlined above. The traditional lecture, which is still used in many MRCPsych courses, can be improved by using interactive lecturing (Steinhert & Snell, 1999). Incorporating MCQs or even OSCE stations into MRCPsych revision courses may, if they are used as the basis for discussion, improve the educational value of the courses.

Organisers of MRCPsych courses should help learners to pass the examinations but do so using the principles of adult learning. Giving the trainees the tools to learn themselves and encouraging a deep approach to learning should mean better training for trainees.

The new examination and assessment programme will alter the way preparation and revision courses are run, but experience gained with running the current MRCPsych courses will not be wasted as the same educational principles still apply.

Declaration of interest

None.

MCQs

  1 The OSCE in the current MRCPsych examination:

    a was introduced in 1999

    b consists of 10 stations

    c involves the use of simulated patients

    d does not test physical examination skills

    e was developed by Barrows.

  2 Simulated patients:

    a are impossible to use in revision courses

    b can easily be recruited and trained at short notice

    c may be drama students

    d should receive a vignette or script on the day of the exam

    e should not contribute feedback to trainees.

  3 Feedback to trainees in mock exams:

    a is generally best given as a numerical score

    b is best given in a judgemental style

    c can usefully employ Pendleton's rules

    d should involve prescriptive comments

    e should always be given individually.

  4 EMQs in the new MRCPsych:

    a are a form of MCQ

    b are thought to be poor at testing clinical scenarios

    c make it easy to guess the answers

    d will be the only form of objective written exam used

    e cannot usefully be incorporated into a local MRCPsych preparation course.

  5 Long case examinations:

    a require the use of real patients

    b are also known as individual management problems

    c do not lend themselves to providing feedback to candidates in mock exams

    d test complex clinical skills

    e have been retained in the new MRCPsych.

MCQ answers

1  a F  b F  c T  d F  e F
2  a F  b F  c T  d F  e F
3  a F  b F  c T  d F  e F
4  a T  b F  c F  d F  e F
5  a F  b F  c F  d T  e F

Footnotes

Over the coming year APT will include a number of articles focusing on aspects of workplace-based assessments. Ed.

References

Adamo, G. (2003) Simulated and standardized patients in OSCEs: achievements and challenges 1992–2003. Medical Teacher, 25, 262–270.
Barrows, H. S. (1993) An overview of the uses of standardized patients for teaching and evaluating clinical skills. AAMC. Academic Medicine, 68, 443–451; discussion 451–453.
Bhugra, D. (2006) The new curriculum for psychiatric training. Advances in Psychiatric Treatment, 12, 393–396.
Bhugra, D., Malik, A. & Brown, N. (2007) Workplace-Based Assessments in Psychiatry. RCPsych Publications.
Brown, N. & Doshi, M. (2006) Assessing professional and clinical competence: the way forward. Advances in Psychiatric Treatment, 12, 81–89.
Brown, T. & Wilkinson, G. (eds) (2005) Critical Reviews in Psychiatry (3rd edn). Gaskell.
Case, S. M. & Swanson, D. B. (1993) Extended-matching items: a practical alternative to free response questions. Teaching and Learning in Medicine, 5, 107–115.
Case, S. M. & Swanson, D. B. (2002) Item Writing Manual: Constructing Written Test Questions for the Basic and Clinical Sciences. National Board of Medical Examiners. http://www.nbme.org/publications/item-writing-manual.html
Crockford, D., Holt-Seitz, A. & Adams, B. (2004) Preparing psychiatry residents for the certification exam: a survey of residency and exam experiences. Canadian Journal of Psychiatry, 49, 690–695.
Fadden, G., Shooter, M. & Holsgrove, G. (2005) Involving carers and service users in the training of psychiatrists. Psychiatric Bulletin, 29, 270–274.
Fenwick, C. D., Vassilas, C. A., Carter, H. et al (2004) Training health professionals in the recognition, assessment and management of suicide risk. International Journal of Psychiatry in Clinical Practice, 8, 117–121.
Frederiksen, N. (1984) The real test bias: influences of testing on teaching and learning. American Psychologist, 39, 193–202.
George, S. (2003) Extended matching items (EMIs): solving the conundrum. Psychiatric Bulletin, 27, 230–232.
Gibbs, G. (1992) Improving the Quality of Student Learning. Technical and Educational Services.
Harden, R. M. (1986) Ten questions to ask when planning a course or curriculum. Medical Education, 20, 356–365.
Harden, R. M. & Gleeson, F. A. (1979) Assessment of clinical competence using an objective structured clinical examination (OSCE). Medical Education, 13, 41–54.
Harden, R. M., Stevenson, M., Downie, W. W. et al (1975) Assessment of clinical competence using objective structured examination. BMJ, 1, 447–451.
Hodder, R. V., Rivington, R. N., Calcutt, L. E. et al (1989) The effectiveness of immediate feedback during the objective structured clinical examination. Medical Education, 23, 184–188.
Hodges, B., Regehr, G., McNaughton, N. et al (1999) OSCE checklists do not capture increasing levels of expertise. Academic Medicine, 74, 1129–1134.
Ker, J. S., Dowie, A., Dowell, J. et al (2005) Twelve tips to developing and maintaining a simulated patient bank. Medical Teacher, 27, 508–521.
Kurtz, S., Silverman, J. & Draper, J. (1998) Teaching and Learning Communication Skills in Medicine. Radcliffe Medical Press.
McCoubrie, P. (2004) Improving the fairness of multiple-choice questions: a literature review. Medical Teacher, 26, 709–712.
McCreadie, R. G. (2002) Patient management problems: 'the vignettes'. Psychiatric Bulletin, 26, 463–467.
McNamara, D. (2003) MCQs in Psychiatry. Gaskell.
Michael, A. (2004a) Get Through MRCPsych Parts 1 & 2: 1001 EMQs. Royal Society of Medicine Press.
Michael, A. (ed.) (2004b) OSCEs in Psychiatry. Churchill Livingstone.
Misch, D. A. (2002) Andragogy and medical education: are medical students internally motivated to learn? Advances in Health Sciences Education, 7, 153–160.
Murthty, S. P. M. (2004) Get Through the MRCPsych Part I: Preparation for the OSCEs. Royal Society of Medicine Press.
Naeem, A., Rutherford, J. & Kenn, C. (2003) The new MRCPsych Part II exam – golden tips on how to pass. Psychiatric Bulletin, 27, 390–393.
Naeem, A., Rutherford, J. & Kenn, C. (2004) The MRCPsych OSCE workshop: a new game to play? Psychiatric Bulletin, 28, 62–65.
Newble, D. I. & Cannon, R. (2001) A Handbook for Medical Teachers (4th edn). Kluwer Academic.
Newble, D. I. & Entwistle, N. J. (1986) Learning styles and approaches: implications for medical education. Medical Education, 20, 162–175.
Newble, D. I. & Swanson, D. B. (1988) Psychometric characteristics of the OSCE. Medical Education, 22, 325–334.
Ogundipe, L., El-Nadeef, M. & Hodgson, R. E. (2005) Lecture Notes on Paper Critique: Research Methodology and Statistics for Critical Paper Reading in Psychiatry. Trafford Publishing.
Oxford Centre for Staff Development (1992) The Council for National Academic Awards Improving Student Learning Project. Oxford Centre for Staff Development.
Oyebode, F. (2002) Commentary. Advances in Psychiatric Treatment, 8, 348–350.
Oyebode, F. & Furlong, E. (2007) MRCPsych examinations: cumulative results 1997–2002. Psychiatric Bulletin, 31, 61–64.
Pendleton, D., Schofield, T., Tate, P. et al (1984) The Consultation: An Approach to Teaching and Learning. Oxford Medical Publications.
Postgraduate Medical Education and Training Board (2005) Workplace Based Assessment: A Paper from the PMETB Subcommittee on Workplace Based Assessment; January 2005. PMETB. http://www.pmetb.org.uk/media/pdf/3/b/PMETB_workplace_based_assemment_paper_(2005).pdf
Pryde, I., Sachar, A., Young, S. et al (2005) Organising a mock OSCE for the MRCPsych Part I examination. Psychiatric Bulletin, 29, 67–70.
Rao, R. (ed.) (2005) OSCEs in Psychiatry. Gaskell.
Rose, N. (2006) Hazards ahead? Invited commentary on: assessing professional and clinical competence. Advances in Psychiatric Treatment, 12, 89–91.
Schuwirth, L. W. T. (2004) Assessing medical competence: finding the right answers. Clinical Teacher, 1, 14–18.
Schuwirth, L. W. T. & van der Vleuten, C. P. M. (2004) Different written assessment methods: what can be said about their strengths and weaknesses? Medical Education, 38, 974–979.
Silverman, J., Draper, J. & Kurtz, S. M. (1996) The Calgary–Cambridge approach to communication skills teaching. 1: Agenda-led outcome-based analysis of the consultation. Education for General Practice, 7, 288–299.
Smee, S. (2003) Skill based assessment. BMJ, 326, 703–706.
Steinhert, Y. & Snell, L. S. (1999) Interactive lecturing: strategies for increasing participation in large group presentations. Medical Teacher, 21, 37–42.
Taylor, P. & Warner, J. (2000) National survey of training needs for evidence-based practices. Psychiatric Bulletin, 24, 272–273.
Tyrer, S. (2005) Development of the OSCE: a College perspective. In OSCEs in Psychiatry (ed. Rao, R.) pp. 14–23. Gaskell.
Tyrer, S. & Oyebode, F. (2004) Why does the MRCPsych examination need to change? British Journal of Psychiatry, 184, 197–199.
Van der Vleuten, C. P. M. & Swanson, D. B. (1990) Assessment of clinical skills with standardised patients: state of the art. Teaching and Learning in Medicine, 2, 219–225.
Vassilas, C. & Ho, L. (2000) Video for teaching purposes. Advances in Psychiatric Treatment, 6, 304–311.
Vassilas, C. A., Brown, N., Wall, D. et al (2003) 'Teaching the teachers' in psychiatry. Advances in Psychiatric Treatment, 9, 308–315.
Wallace, J., Rao, R. & Haslam, R. (2002) Simulated patients and objective structured clinical examinations: review of their use in medical education. Advances in Psychiatric Treatment, 8, 342–348.
Walsh, K. (2005) The rules. BMJ, 331, 574.
Wilkinson, T., Fontaine, S. & Egan, T. (2003) Was a breach of examination security unfair in an objective structured clinical examination? A critical incident. Medical Teacher, 25, 42–46.