What you should know about the changing role of the teacher in the health care professions

Prof. Ronald Harden – General Secretary, Association for Medical Education in Europe (AMEE), Professor of Endocrinology, Dundee University, UK

Making the subjective, objective: Assessing leadership and interprofessional teamworking

Prof. Judy McKimm – Emeritus Professor of Medical Education, Swansea University Medical School, Wales, United Kingdom

In this workshop, we will explore ideas and strategies for assessing specific aspects of professionalism, namely leadership (management and followership) and interprofessional teamworking and collaboration. These behaviours are often seen as hard to teach and assess, but are increasingly formalised in health professions’ curricula, stimulated by their inclusion in the outcomes and standards defined by regulators and professional bodies. We will first discuss how these behaviours can be inculcated in our learners, and then consider the various ways in which the requisite knowledge, skills and behaviours can be assessed, and how they can be observed, recorded, and deemed acceptable or not. Finally, we will discuss how these aspects also form part of fitness to practise or suitability for progression, consider the various challenges for educators and schools, and explore some solutions drawn from international examples.

Quality of Evidence in Measuring Outcomes in Health Professions Education

Prof. Kameshwar Prasad – Professor of Neurology, All India Institute of Medical Sciences, New Delhi, and Director, Rajendra Institute of Medical Sciences, Ranchi, India

Dr. Manya Prasad – Assistant Professor, Epidemiologist, North DMC Medical College and Hindu Rao Hospital, New Delhi, India

The workshop is focussed on the outcomes of medical education, not just the outputs, recognising that the latter are immediate results linked to educational activities. Outcomes indicate the achievement of an educational program relative to its objectives, and support the program’s efficacy, effectiveness, and efficiency. The term meta-analysis was first used by Gene Glass in his presidential address at the annual conference of the American Educational Research Association in 1976, but the method has become more popular in the health care field. In fact, evidence-based health care considers meta-analysis of well-conducted randomized trials the highest level of evidence for the effects of interventions. However, the health care field has moved further, with a framework for assessing the quality of evidence with respect to the outcomes of interventions using what is now widely known as the GRADE system. This workshop will apply the principles of the GRADE system to assess the quality of evidence for outcomes in medical education. Specifically, it will consider issues such as study design, study limitations, publication bias, inconsistency, generalizability/applicability, precision, and magnitude of effect to define the quality of the evidence supporting educational recommendations aimed at achieving a specific educational outcome.

Exam Bank Health Index: Know the physiology of items to recognize the pathology

Dr. Fadi Munshi – Director, Postgraduate Medical Education, Saudi Commission for Health Specialties (SCFHS), Kingdom of Saudi Arabia

An exam bank with hundreds of items (questions) needs continuous care and maintenance. As in a population, the health of each individual contributes to the health of the whole. Maintaining an exam bank requires both a grasp of assessment principles and innovative tools: understanding the physiology helps to identify the pathology and to recommend an intervention to cure the “sick and unwell”. The Exam Bank Health Index (EBHI) is proposed to determine the overall quality of the items; it is calculated from item parameters and from measures used to detect poor-performing items, or changes in item performance, that affect the overall health of the exam bank.

How to design assessment for learning programmatically? ONLINE

Prof. Lambert Schuwirth – Professor, Prideaux Discipline of Clinical Education, College of Medicine and Public Health, Flinders University, Adelaide, Australia

Often, it is assumed that programmatic assessment and assessment for learning require a complete redevelopment of assessment systems and that existing systems need to be abandoned entirely. In fact, both are sets of principles that can be applied to existing systems with only limited modification of the instruments already in place. What is more important is to re-purpose existing instruments and to reconceptualise the way information from the assessment is collected, triangulated and used. Having richer and more meaningful information about students’ performance, achievement and improvement automatically means that more information is also available to evaluate the educational organisation’s performance, achievement and improvement. In this workshop, we will design a simple assessment program whilst exploring and applying the principles of programmatic assessment and assessment for learning.

Assessing Professionalism: a practical guide

Prof. Trudie E. Roberts – Leeds Institute of Medical Education, UK

Prof. Richard Fuller – Director, Christie Education, Christie NHS Foundation Trust/Manchester Cancer Research Centre/University of Manchester

The aim of this workshop is to facilitate a better understanding of the importance of assessing professionalism in medical education. As part of the workshop, attendees will develop an understanding of what is meant by professionalism and how it might be assessed throughout the undergraduate medical programme so as to build a programme of assessment. We will also examine how assessing professionalism differs from other areas of assessment. Attendees will be helped to critically examine the tools currently in use to assess professionalism and to compare these with their own methods. Finally, we will briefly consider the difficult issue of remediating poor professionalism.

At the end of this workshop participants will:

  • Understand the importance of assessing professionalism
  • Have developed a definition of professionalism for their own context
  • Have developed ideas for a programme of assessment for professionalism
  • Have critiqued a number of tools for assessing professionalism
  • Have developed ideas on remediation of poor professionalism

Tools and procedures to arrive at valid summative entrustment decisions

Prof. Olle (Th.J.) ten Cate, PhD – Professor of Medical Education, Scientific Director of Education, Senior Scientist at the Utrecht Center for Research and Development of Health Professions Education, University Medical Center Utrecht, Universiteitsweg, Netherlands

This workshop will highlight both general aspects of data collection and processing for summative entrustment decisions and specific tools to use in the clinical workplace, for brief observations, entrustment-based discussions and longitudinal observations, including associated technology. After an introduction, the workshop participants will be asked to practice techniques in small groups. The workshop will conclude with a plenary discussion.

Outside the Walls of the Ivory Tower: Using the Community for Real Life Assessment

Prof. Trevor Gibbs – President, Association for Medical Education in Europe (AMEE)

Prof. David Taylor – Professor of Medical Education, Gulf Medical University

Participants: Senior faculty and teachers involved in curriculum planning and assessment and community based education.

Learning Outcomes: At the end of the workshop, participants will be expected to:

  • Clearly define what is meant by a community, specific to the context and culture in which the workshop participants are based.
  • Critically analyse the opportunities offered by their community towards teaching, learning and assessment.
  • Begin to create innovative opportunities for assessing their students within the real-life opportunities afforded by their community.

Background: Assessment at all levels of the health professions is based upon the expected outcomes of ensuring that the graduate is fit for purpose – “being the right person for the job” – and fit for practice – “having the necessary competencies required for the job”. The recent Covid-19 pandemic has made all faculty aware that, in the rapidly changing world of healthcare, there is also a need to create graduates fit for the real world – “equipped to cope with new life-changing events and technological challenges”. The majority of assessment methods rely on a by-proxy measure: an assumption that what can be measured and judged within the confines of the school carries forward into real-life events – an assumption that is not always credible. Judging students’ ability to perform effectively in the real world requires them to leave the confines of the medical school or institution – the Ivory Tower – and enter the real world of the community that the school or institute serves.

“The knowledge of the world is only to be acquired in the world, and not in a closet.” Lord Chesterfield, 1694-1773, English statesman & writer.

Workshop Format: The learning outcomes will be developed through a series of didactic inputs and group discussions and presentations relating to the stated outcomes. The emphasis will be on identifying opportunities and planning for the future.

Reflecting on how we evaluate educational interventions and how we can do it better

Dr. Louise Allen – Monash University, Monash Centre for Professional Development and Monash Online Education, Australia

This workshop offers an opportunity to reflect on our current evaluation practices and to explore together what we do well, what could be improved, and how we can improve our approaches. The culminating activity will be a case study, in which you will work in small groups to develop an evaluation strategy.