This is a time of significant change in medical education. Tomorrow's Doctors (General Medical Council, 1993) had a major impact on undergraduate education that is still reverberating around medical schools. Changes in post-qualification training have seen the development of some psychiatry house officer posts, restructuring of senior house officer training, and the introduction of the specialist registrar grade and the Certificate of Completion of Specialist Training (CCST). Revalidation, consultant appraisal and the personal development plan are likely to presage significant changes in the way that consultant psychiatrists plan their continuing professional development (Royal College of Psychiatrists, 2001). The College is consulting on a document that will set out the core competencies for psychiatry and all its sub-specialities. Such a specification is likely to have far-reaching implications for what psychiatrists need to learn and how they will be taught.
A parallel development has seen the introduction of new teaching methods that focus on encouraging learners to find things out for themselves. Self-directed learning, project work in groups, problem-based learning and the promotion of ‘finding-out skills’ are now prioritised. However, evaluation of these new teaching methods is often not rigorous — we may have evidence-based clinical practice, but what about evidence-based education? (Hutchinson, 1999; Petersen, 1999; Wilkes & Bligh, 1999; Lilley, 2000; Prideaux, 2002).
Petersen (1999) notes that survival of a surgical procedure is rarely seen as qualifying a person to perform that procedure. All doctors have been successful medical students but, on the same basis, survival of medical education should not in itself qualify doctors to teach. The task of teaching others requires training and expertise, an expertise that is recognised in the skills required for the CCST and in the new consultation paper on core psychiatric competencies. Equally, those of us entrusted with the task of teaching deserve access to high-quality evaluations of teaching methods, so that we can select the best methods to use when teaching others.
The ‘Education & Training’ section has become a regular feature of the Bulletin. In 2001, eight articles were published under this banner, and a further nine articles explicitly about education or training appeared under the headings ‘Special Articles’ or ‘Original Papers’. Other published papers were indirectly related to training matters. Unfortunately, many other papers on this subject were submitted and rejected because they contained no more than a brief description of an educational programme and a summary of learner feedback. The Royal College of Psychiatrists is committed to raising standards of education and training and, to this end, the Psychiatric Bulletin is seeking to raise the standards of published evaluation and research into teaching methods.
Good educational research can range from naturalistic studies, including detailed observational descriptions of teaching, through controlled comparisons of educational experience and outcomes, to experimental studies such as randomised controlled trials of different educational interventions. Both qualitative and quantitative methods have their place — evaluation of teaching process and learner experience is as important as evaluation of outcome. Audits of training experience, such as surveys of learners, have much to contribute but, as with articles about clinical audit, they need to start ‘closing the loop’: relating their results to existing standards of educational practice or generating new standards for others to audit.
What good-quality evaluations should have in common is an attention to the theory behind the educational process and a proactive evaluation strategy that considers, at the outset, how learning and teaching will be evaluated rather than tagging on an evaluation once teaching is complete. Most studies will not be randomised controlled trials; these are relatively infrequent in medical education (Petersen, 1999). However, many of the criticisms of their use (problems with randomisation, difficulties with ‘blinding’, number and complexity of other variables, difficulties in specifying and measuring outcomes and problems with manualising interventions) will be familiar to psychiatrists who have been involved with evaluation of psychological treatments. Murray (2002) provides a good overview of some of these problems and suggests that educational research shares many similarities with health services research. Interventions are multi-factorial and take place in the real world where economic, political and social factors may change during the study period, making interpretation of the results complex. Psychiatric experience in running complex trials of psychotherapies may enhance our ability to conduct controlled trials in education — both seek to measure the effectiveness of interventions designed to bring about behavioural change.
The British Medical Journal has already published guidance for authors of papers seeking to describe the evaluation of educational interventions (Abbasi & Smith, 1999). Such papers should have clear aims, appropriate design, samples, measures and analysis, and a structured discussion. An adaptation of these guidelines is a useful guide for readers and contributors to the Bulletin (Box 1). It is to be hoped that the coming years will see an increase in both the quality and quantity of published research into psychiatric education.
Box 1. Guidance for papers evaluating educational interventions (adapted from Abbasi & Smith, 1999), set out under the headings: Aims; Design; Discussion.
Declaration of interest
None.