FFICM Examination Update and Response – 25 November 2021

Published 25/11/2021

The Faculty has been undertaking an analysis of the results of the October 2021 FFICM exam since these results were shared on 4 November 2021. We have also held online meetings with national and regional StR and other candidate representatives, regional trainers, and Examiners to collect information and listen to the range of questions and concerns raised. It was important for us to hear the experiences of people sitting the FFICM and to understand the impact on candidates.

The devastation, uncertainty and personal impact felt by so many was clearly articulated. We heard about the individual impact that the exams and exam outcomes have had on some of our doctors in training, valued colleagues alongside whom we work. We recognise the huge personal investment, not just financial but in terms of time, energy, and emotional wellbeing, that training in ICM requires.

We regret that we are not at present able to give a final conclusion to this issue and recognise this prolongs uncertainty. However, we wanted to honour our commitment to update you by 25 November and reassure you that we continue to work hard to resolve matters. At present, a definitive conclusion – the need for which was strongly articulated at the engagement events – is contingent on ongoing discussions with the GMC. We are not able to put forward a date of resolution on the GMC’s behalf but will write again on Thursday 9 December (two weeks from this update).

This update is also available to download as a PDF below.

 

Status update

  • There is no clear or single cause for the OSCE/SOE disparity in the exam data.
  • We are opening conversations with the GMC.
  • FICM’s position is that all passes already issued for this exam will stand.
  • This sitting will not count toward any candidate’s total number of examination attempts.
  • We are in discussion with the Lead Dean for ICM as to what steps can be taken to safeguard the training status and progression of affected candidates.
  • FICM commits to reviewing the OSCE question bank.

Further discussion

Disparity between OSCE and SOE results

We cannot be certain, from analysing the exam data itself, which precise factors led to the disparity in OSCE/SOE results, or in what proportion. The cause is likely multifactorial, and some factors may have affected different candidates in different ways – we have been reviewing as much information, data and feedback on this examination sitting as possible to help contextualise the results. Various factors were suggested at the engagement events, including the question content and conduct of the examination, the case-mix candidates had experienced, the impact of the pandemic on people, and the exam experience itself. We hear the feedback that the Faculty and FICM Examiners need to understand and learn from this experience.

Whilst we do not draw any final conclusions from this examination, and local circumstances vary, we cannot discount the serious impact of the pandemic on our candidates, trainers, and Examiners. Case-mix and training opportunities have been affected and our workforce is very tired. The added burden on professional lives and exam preparation has been compounded by ongoing difficulties and concerns for personal health, the health of loved ones, childcare issues, and significant disruption to emotional wellbeing. This is a level of change and disruption not faced by a whole cohort in previous sittings of the examination, which makes comparisons to previous sittings difficult. Furthermore, this issue may be more acutely felt in ICM than in any other medical or health profession in the UK, making it difficult to draw quick comparisons to other similar groups with national examinations regulated by the GMC.

Involving the GMC

We heard a number of requests from the engagement events for FICM to seek guidance from the GMC. We have been in contact with the GMC to outline what happened and seek a way forward. This further engagement with the GMC is ongoing and will unfortunately require more time, and we are not able to put forward a date on the GMC’s behalf.

We entirely appreciate the desire for urgent resolution to this issue and are working to conclude this as quickly as we can. As stated, we will provide a further update on 9 December.

Honouring existing exam passes

Some candidates who passed the exam have asked whether their pass might be rescinded following a review. The FICM position, which we have articulated to the GMC, is that previously issued exam passes must be unaffected; we believe that to change this would be unfair to those candidates.

Exam attempts

Regardless of the outcome of this situation, we can confirm that the October sitting of the FFICM will not count toward any candidate’s total number of examination attempts.

Workforce and training considerations

We are in discussion with the Lead Dean as to what steps can be taken to safeguard the training status and progression of affected candidates.

Further review

As previously discussed, FICM commits to reviewing the OSCE question bank. FFICM will also be participating in the broader independent external review of RCoA examinations, which is now out for tender.

Engagement event questions

We committed to responding within a week to the issues raised at the engagement events, as fully as possible: immediate answers where we could resolve matters straight away, and progress updates on areas needing additional consideration or input before resolution.

Questions were raised verbally on the call, in the meeting chat, and via email. We have considered all the feedback and questions and summarised them below with our responses – some questions are also reflected in the update section above.

We have clearly heard the feedback from candidates that the preparation material currently available for the examination is insufficient. The Faculty will prioritise this. We will set up a dedicated group to oversee the production of expanded exam resources and to review how our broader offering of courses and other educational content can be better geared toward exam preparation. Our hope is that this work can be directly informed by input and comment from the StR body, to ensure we develop materials and resources in a way that meets your needs – we are exploring the mechanisms for this involvement.

It is fair to say that moving examinations online quickly as a result of the pandemic was nobody's preferred choice: not StRs, their trainers, nor Examiners. When the pandemic first struck in 2020, the spring OSCE/SOE and July MCQ were cancelled. This was for a number of reasons, not least concerns that the impact of working in the pandemic was significant and that exam preparation would have been unfairly affected. However, it was subsequently made clear to colleges and faculties by the DHSC/HEE that keeping exams on hold was not an option; the Exams team and senior FFICM Examiners needed to design and test a new delivery format at pace to allow exams to go ahead. This had to be compliant with lockdown rules and to test, as far as possible, the same competencies at the same standard as the face-to-face exams.

A huge amount of work was put in to modify the format of exam questions pulled from the question bank and work out the logistics of how to deliver this via the online platform. These changes were then presented to, and approved by, the GMC. The Faculty remains immensely grateful to all the FICM Examiners and the RCoA Exams team for the enormous volume of work put into adapting the examination for online delivery, under unprecedentedly trying circumstances.

The RCoA has committed to online exams until April 2022 but will decide before that date whether to return to face-to-face or remain virtual. At the engagement events we discussed whether the SOE and OSCE could return to face-to-face while the MCQ remains remote to reduce travel. We will make sure your views on this are communicated to the RCoA Exams Board and considered.

Both the FICM and RCoA Exams teams would like to reiterate our sincere apologies to all candidates for the delayed and confusing release of the results, and for the additional distress this engendered. We are extremely sorry that this occurred.

Some background context to this issue is provided by the recent problems surrounding the issuing of FRCA Final results. Following these issues, the RCoA did not feel secure in issuing exam results via email Mail Merge. The decision was therefore taken to issue all RCoA and Faculty exam results via postal letter whilst the FRCA Mail Merge issue was fixed. However, this decision was not communicated to FFICM candidates. We acknowledge that this lapse was unacceptable.

The issue on the day was caused by a mixture of factors, including a communications breakdown between the two teams. As set out in our letter to candidates on 4 November, the initial delay was caused by the process of checking and rechecking the initial OSCE results; this work involved multiple Examiners and consumed the capacity of the RCoA Exams team. The delay meant that the plan to issue postal letters was no longer workable within the 4 November deadline; given the time constraints, the results plan switched to bespoke individualised emails. This process was much more time- and labour-intensive than Mail Merge and added to the delay.

Given the intensity and complexity of this work, the Exams team did not communicate the change of delivery method to the Faculty team. This meant that the Faculty team, when trying to advise candidates and update website information, used incorrect or out-of-date information. The final element of the delay was the simultaneous and unrelated failure of the doctors.net.uk email system, which meant that some candidates' results were further delayed. The Faculty and College again apologise wholeheartedly for this and resolve to eliminate the factors that led to this error.

This incident has highlighted the need for new processes within the operational mechanics of exam results:

  • We will agree clear lines of communication for candidates when dealing with exam and results queries.
  • Both RCoA Exams and FICM team members are to be physically present together in Churchill House when processing and issuing exam results – it is felt that remote working contributed to the breakdown in communications.
  • We make a clear commitment to all candidates to share with them any communication changes – whether they be necessitated by external factors or otherwise – ahead of time.
  • Cross-team briefings will be held prior to each diet of the examination to ensure all team members are aware of any changes or updates.
  • The FICM website pages and materials will be reviewed and updated – it should be obvious to candidates where they should go for exam updates.

A sheet of notepaper was allowed in the previous two online sittings of the OSCE/SOE but was removed for this OSCE sitting. This was to bring the online exam more into line with the face-to-face examination experience, where notepaper is not allowed, and to prevent misconduct in the examination and so remain fair for everyone. All candidates were told they could ask the examiner to show the initial question information again at any point during the station, if required. We have clearly heard the feedback that the removal of notepaper felt unfair to the October cohort of candidates and that the experience for online exam candidates has not been consistent.

Whilst the outcome of the examination is still uncertain, we do not think it correct to publicly release any breakdowns of the cohort. In the results as issued, the pass rate in the SOE was, whilst better than in the OSCE, still lower than in any other sitting since 2015. The reasons for this cannot be definitively gleaned from the exam data.

It should also be noted that FFICM reporting is run on the same governance lines as other RCoA-delivered examinations, which have not historically reported on demographic details. There may be cases where some demographics cannot be reported on because of cohort size, in compliance with data protection regulations. We will explore demographic reporting with the RCoA, but any such work must first be scoped in terms of feasibility and resources.

The majority of the examiners have many years' experience of examining the FFICM. All examiners conduct both OSCE and SOE exams and examine all types of OSCE station except the simulation station, which is examined by a small group of examiners with expertise in simulation. All examiners have their performance in the exam audited regularly by other examiners, are observed by both lay and external medical observers, and receive feedback at their exam appraisal. They also undergo regular training in equality and diversity relevant to examinations. Examiners have considerable prior experience as trainers and as examiners in other postgraduate or undergraduate exams, as well as additional areas of subject matter expertise. Many have an additional qualification in medical education, a higher postgraduate degree in a discipline relevant to ICM, and/or recent research and publications. All examiners are practising doctors working at consultant level. A number of those more recently appointed have FFICM by examination, but others were appointed consultants before the first sitting of the exam and have postgraduate qualifications relevant to their training.

New examiners are appointed through a competitive process and undergo training prior to being involved in a live exam. A small number (a maximum of 10 per year) are added to the pool. No new examiners were appointed in 2020, and eight new examiners started examining in the October 2021 exam. New examiners were trained specifically in OSCE and SOE examining and marking prior to their first exam. In the October 2021 exam the new examiners were 'supervised' during the OSCE by a senior examiner for the first two full days of their examining, as well as being audited by another senior examiner.

We were very concerned to hear the reports of examiner behaviour raised at the engagement events. FFICM Examiners are audited in accordance with standard exam practice and were observed in the October 2021 exam; we received no comments at the time from auditors or observers on examiner behaviour which might adversely affect candidates' scores, but we will look into this further with the Court of Examiners and discuss any necessary process changes. If Examiners have concerns about either the process or the question content, they are encouraged to raise them with their peer Examiners so that we can address them. We will remind our Examiners of the appropriate process to follow when raising concerns about the examination. Furthermore, we will ensure that all Examiners are apprised of the methodology for mapping items to the ICM curriculum, so we can demonstrate the appropriateness of question content and amend any question that does not comply.

An OSCE is an objective exam; the candidate's experience should be as independent of the Examiner as possible, whether face-to-face or online. For these reasons we have striven to standardise the wording and the use of re-phrasing and prompting in the OSCE. We have heard and understand the feedback that this limit on flexibility has been an issue for candidates where the phrasing of the original question is felt to be unclear or is misinterpreted, and the Examiner is unable to redirect them. The clarity of phrasing is considered when questions are written and reviewed and when the Angoff score is established, and it will be considered again as part of the review of the OSCE question bank and exam processes.

We are in discussion with the Lead Dean, HEE, and the devolved nations as to what steps can be taken to safeguard the training status and progression of affected candidates. We do not wish to see any candidate disadvantaged, considering the impact that the pandemic has already had on their training and wellbeing.

We have clearly heard the feedback that greater clarity is needed around the purpose and standard of the exam. The role of the exam is to test knowledge and competence of specific skills across the curriculum.

Questions are developed to represent the full breadth of the curriculum and to appropriately sample the content and aspects within it. Relevance to ICM is described by the curriculum, which has recently been updated and aligned to the GMC Excellence by Design standards and the Generic Professional Capabilities for training (more on the curriculum below). The development of the CCT in ICM has, since 2010, drawn extensively on the Competency Based Training in Intensive Care Medicine in Europe (CoBaTrICE) syllabus, developed by an international partnership of professional organisations and critical care clinicians working to harmonise training in Intensive Care Medicine worldwide. If topics are deemed important to ICM but absent from the curriculum, or if the curriculum is thought to contain items not relevant to ICM, this would be addressed by a review of the curriculum, followed by submission to the GMC for approval. Every major revision of the curriculum, including that undertaken for 2021, is subject to a public consultation process in which trainers and StRs have the opportunity to highlight any concerns of this nature alongside other content or structural concerns.

Educationalist review of specialty examinations is provided by the GMC, which reviews each specialty’s examination as part of the assessment system for its CCT, including the validity of the assessment methods and examination methodology used (MCQ, SOE, OSCE etc).

The FFICM Examiners have also routinely invited external or independent individuals to view and comment on the examination, its delivery and the processes used. Feedback of this kind has always been welcomed and seen as a positive and constructive way of benchmarking and reviewing what we do. During the recent exam we had a senior external observer from the College of Intensive Care Medicine of Australia and New Zealand.

The Royal College of Anaesthetists is commissioning an independent review of the entire RCoA examination structure. This review will cover the processes, infrastructure, delivery methods, resources, communications, roles and responsibilities and policies of the examinations. We intend to include the FFICM processes and examination within the scope of this review as the FFICM examination is supported by and delivered via the College Exams team.  The FFICM Examiners are confident the findings from any review of the examination process will contribute to the quality and ongoing development and evolution of the examination.  The Faculty will report the findings of the review and recommendations to the Board and publish these so members have oversight of the future direction of travel of the FFICM examination.

The contract for the review is now out to tender.

Examiners are recruited and trained to deliver the standards of the examination. The examiners do this on behalf of the FICM Board and are overseen by FICMTAQ (the FICM Training, Assessment & Quality committee). Therefore, it is entirely appropriate for the senior examiners, who know the examination and its processes, to undertake a review of the exam process and report to FICMTAQ and Board. This arrangement is common amongst medical Royal Colleges.

Exam frequency is dictated by the availability of examiners, who volunteer their time to invigilate, and the capacity of the RCoA Exams Team. We have found that examiners have increased pressure on their time, as they work alongside StRs in ICUs across the UK. The RCoA exams calendar incorporates FRCA as well as exams for FICM and the Faculty of Pain Medicine. These are limiting factors on the ability to put on additional exams.

We will do our utmost to ensure that, for candidates who must re-sit, there is space to do so. The RCoA has committed to online examinations until at least April 2022 and has yet to take a decision on whether to continue online or return to face-to-face after this time. However, whether virtual or in-person, we are constrained by the availability of volunteer examiners, the capacity of the RCoA Exams Team, and the capacity of Churchill House (though the building has now reopened, social distancing measures and capacity restrictions are still in place).

We are not currently able to make any comment regarding finances until such time as we have a better understanding of the issues and an agreed forward plan.

All the questions are mapped to Stage 1 and 2 training of the CCT in ICM curriculum (excluding the Special Skills Year). The competencies which underpin the 2021 curriculum's 14 High Level Learning Outcomes (HiLLOs) are the same as those within the previous curriculum, albeit reconfigured to adhere to Excellence by Design. The questions are all derived from and mapped to this curriculum, and are written and reviewed by current practising UK intensive care physicians; examiners are required to step down when they retire from their ICM clinical role.

The question selection for an OSCE exam is undertaken by the OSCE subgroup lead, following a predetermined structure, to provide one radiology, one ECG, one communication (professionalism), one simulation, and nine clinical/data questions. This format has not changed since the first OSCE, as approved by the GMC; the broad format has been replicated in the transfer to the online OSCE made necessary by COVID restrictions.

OSCE questions are selected from the database across a wide and representative range of curriculum topics to ensure content validity. The SOE and OSCE question selections are also arranged to avoid clashes or duplication of topics, so that candidates who sit both components in the same sitting are not examined on the same narrow topic area in both. Question selection also considers the difficulty of questions (as reflected in the Angoff scores) so that exams have a similar overall level of difficulty across the days. This question selection methodology has not changed.
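
For illustration only, the structural and difficulty constraints described above could be sketched along the following lines. This is a hypothetical sketch in Python: the station mix and the use of Angoff scores reflect the description above, but all identifiers and the tolerance figure are our own assumptions, not FICM's actual tooling.

    # Hypothetical sketch of the OSCE selection constraints described above.
    # The station mix is as stated; the tolerance value is an illustrative assumption.

    REQUIRED_MIX = {
        "radiology": 1,
        "ecg": 1,
        "communication": 1,
        "simulation": 1,
        "clinical/data": 9,
    }  # 13 stations in total

    def check_day(questions, reference_mean_angoff, tolerance=2.0):
        """Check one exam day against the predetermined structure and confirm
        its overall difficulty (mean Angoff score) is similar to other days."""
        mix = {}
        for q in questions:
            mix[q["type"]] = mix.get(q["type"], 0) + 1
        if mix != REQUIRED_MIX:
            raise ValueError("paper does not match the predetermined structure")
        mean_angoff = sum(q["angoff"] for q in questions) / len(questions)
        if abs(mean_angoff - reference_mean_angoff) > tolerance:
            raise ValueError("overall difficulty differs too much from other days")
        return mean_angoff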

Each OSCE exam day in October 2021 had 13 questions, of which one or two were new (i.e. not used in any previous exam). Over time, the number of new questions per day has reduced gradually, from all new questions in the first exam to one or two, plus the 'test' question, per day. Previously used questions in the October 2021 OSCE had been used between one and eight times in previous exams (an average of 2.8 times).

Given that these questions and topics (including fire safety/infection control) have been used successfully before, and that prior to this exam there was no reason to question their use, the Faculty stands behind the decision to include them.

However, we have heard the feedback from engaging with you that there is a perception that the OSCE questions place undue emphasis on rote recollection of specific non-critical facts. We commit to reviewing the OSCE question bank with these issues in mind.

The exam's content, delivery and standard setting remain under constant review, and they will continue to be reviewed both internally and with the help of external advisors.

As simulations cannot be provided identically by remote delivery, the last three OSCE examinations have instead included an e-simulation station, in which the candidate talks to the examiner about what they would do in a specific clinical situation rather than actually demonstrating it. In the October 2021 diet, by chance according to the criteria outlined below, most of the e-simulation stations were designated as the ‘test’ station in the OSCE, so did not contribute to the candidates’ overall mark.

At each exam, one of the 13 stations is designated as a 'test' station, based on pre-defined criteria. If an operational problem meant a question could not run consistently for all candidates, that 'unfit' station is designated the 'test' station. If there is no 'unfit' station, the newly Angoffed question whose average mark falls furthest below its Angoff score is designated the 'test' station. If no newly Angoffed question scores below its Angoff score, the one with the smallest margin above its Angoff score is designated the 'test' station.
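
To make the order of these criteria concrete, here is a minimal sketch in Python. The rules follow the description above; the data model and field names are our own illustrative assumptions, not FICM's actual systems.

    # Minimal sketch of the 'test' station criteria described above.
    # Field names are illustrative assumptions, not FICM's actual data model.

    from dataclasses import dataclass

    @dataclass
    class Station:
        name: str
        newly_angoffed: bool   # question Angoffed for the first time at this sitting
        angoff_score: float    # standard-setting score for the question
        average_mark: float    # mean candidate mark on the day
        unfit: bool = False    # operational problem prevented consistent delivery

    def select_test_station(stations: list[Station]) -> Station:
        """Apply the pre-defined criteria in order of priority."""
        # 1. Any station that could not run consistently for all candidates.
        for s in stations:
            if s.unfit:
                return s
        # Per the description above, each sitting includes newly Angoffed questions.
        newly = [s for s in stations if s.newly_angoffed]
        # 2. Newly Angoffed question scoring furthest below its Angoff score.
        below = [s for s in newly if s.average_mark < s.angoff_score]
        if below:
            return min(below, key=lambda s: s.average_mark - s.angoff_score)
        # 3. Otherwise, the newly Angoffed question closest above its Angoff score.
        return min(newly, key=lambda s: s.average_mark - s.angoff_score)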

The majority of OSCE questions used in the October 2021 exam, as above, were questions that had been used in previous sittings unaltered. As such, the Angoff score for those questions was unchanged from previous examinations. There were no ‘unfit’ stations in the October 2021 OSCE, so the test station was removed according to the rules outlined above.

The online OSCE does indeed take longer than the face-to-face version. The examining time is unchanged; however, moving candidates between stations online is necessarily more time-consuming than in person. In addition, any internet connection problem for an individual candidate introduces a delay for the whole rotation, to allow the affected candidate the best opportunity to complete their examination.

The face-to-face OSCE takes 104-120 minutes of exam time (depending on the number of rest stations), plus approximately 20-30 minutes of 'call' time beforehand and additional time for any delays between stations (rarely more than 5 minutes overall). A 'standard' exam time (without delays) is 135 minutes. If a candidate with extra reading time (a reasonable adjustment) is present, there is additional time between stations for all candidates.

The online exam takes 143-165 minutes (depending on the number of rest stations), plus 30 minutes for 'call' time and a security check, and any additional time for delays between stations. The additional examination time is caused by the time taken to move candidates into and out of stations individually, the time to read the candidate instructions, and an extra minute per station to compensate for any minor internet issues. Delays between stations, mostly caused by internet connection problems, can be longer in the online exam (rarely up to 10 minutes) and occur more frequently. Because of the sequential movement of candidates from station to station, if one candidate 'drops out', the whole circuit has to be paused while that candidate is brought back into the exam. Some exams have no delays; some have several.

Because of the increased overall time, the online OSCE has a 10-minute break after seven stations (mid exam), during which candidates and examiners can leave their screens for refreshments or a comfort break. Candidates are also able to leave their screens for a short break after the briefing but before the security check, which precedes the first station. A standard exam time (without delays) is 30 minutes, then a short break, then 160 minutes with a break after approximately 85 minutes.
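
Putting the quoted 'standard' (delay-free) figures together gives a rough sense of the additional time online – a back-of-envelope calculation only, ignoring the short breaks and any delays:

    # Back-of-envelope comparison using the 'standard' (delay-free) figures above.
    face_to_face_total = 135       # quoted standard face-to-face time, in minutes
    online_total = 30 + 160        # call time and security check + examined block
    extra_minutes = online_total - face_to_face_total
    print(f"A delay-free online sitting runs about {extra_minutes} minutes longer.")
    # -> A delay-free online sitting runs about 55 minutes longer.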

If a candidate who needs the reasonable adjustment of additional reading time is present in the online exam, they are moved into each station earlier, so no additional delay occurs for other candidates.

In two instances in the first online exam (October 2020), additional marks were awarded to a candidate who had insufficient time on a question due to technical issues. There were no such occurrences in the October 2021 exam.

The recent update to the ICM 2021 curriculum was mandated by the GMC, which initiated an update to all specialist medical curricula to ensure compliance with its Excellence by Design principles. The timing of this was not set by the Faculty, but we have endeavoured to keep the ICM community updated through the curriculum's development, consultation, and implementation.

The implementation date of the new curriculum was mandated by the GMC. StRs in Stage 3 of their training were permitted to stay on the old curriculum in order to close out their training. For StRs in Stages 1 and 2, it was necessary to move to the new curriculum because of the move to a new ePortfolio (more below): for financial and practical reasons, it is not possible for FICM to keep both NES and the new LLP running alongside each other for a prolonged period.

We note with interest the feedback that there is 'not enough specificity' in the curriculum and would like to understand more about this. The curriculum assessment is made up of 14 High Level Learning Outcomes (HiLLOs). In keeping with the GMC's requirements in Excellence by Design – applicable to all specialties, not just ICM – the HiLLOs of the new curriculum move us away from the previous practice of detailed evidencing of individual competencies, often cited as a drawback of the 2010 curriculum, to an outcomes-based assessment approach. Educational Supervisors must make an overall judgement, based on all the information available to them, as to whether the HiLLOs are achieved at the expected level at each key progression point. The FICM Training, Assessment & Quality Committee appreciates that this is a shift in mindset; however, the hope is that it is compensated for by the associated reduction in assessment burden.

We hear your feedback that further guidance is required on this and will take that forward.

The move to a new curriculum correspondingly spurred the need for an updated ePortfolio. The decision was whether to stay with the previous provider, NHS Education for Scotland (NES), or seek a new solution. Several medical Royal Colleges had decided to leave the NES ePortfolio, putting the future viability of the platform in jeopardy. FICM began looking into alternative providers in 2018 and engaged with several major providers, including the company hosting the RCoA's Lifelong Learning Platform (LLP). A thorough options appraisal was provided to the RCoA, to whom FICM is legally and financially responsible, and following careful consideration the LLP was selected as the most appropriate platform on both operational and ongoing financial grounds.

The largest cohort of ICM StRs are those on ICM and anaesthesia dual CCT programmes, who are already familiar with the LLP. For StRs on those programmes, cross-linking of capabilities will be easier on the LLP than on NES (update: this guidance is now live on the FICM website). With regard to other platforms, there is no single IT portfolio platform that encompasses all specialties and allows cross-linking of competencies. The Faculty is not currently resourced to build data linkage between the LLP and external college portfolios – this was not possible even between different curricula on the NES portfolio (e.g. ICM and Acute Medicine), due to inbuilt technical restrictions. FICM agrees it would be better to have one portfolio platform for all doctors in training, but colleges have autonomy and differing requirements in this area.

As before, the issue is resource. We are a small Faculty and have not had the funding to build a bespoke ICM logbook – however, we do have Logbook Resources on our website, including our Logbook Summary and links to external ICM logbooks. We can explore this as part of future LLP development, but it would likely require significant development work and financing. During the development of the LLP functionality, it was felt that priority should be given to the functions that already existed in the old ePortfolio; once these are implemented, we can concentrate on exploring additional functions. Previous feedback has suggested that the type of unit worked in and the case-mix encountered are a better reflection of experience. FICM will look to engage with StRs and trainers on this issue in the future.

FICM has guidance online regarding Less Than Full Time (LTFT) training and is looking to refresh this in the near future – the matter is already under discussion by FICMTAQ. FICM entirely supports the principle that LTFT training should be available to StRs; TPDs should work with their LTFT StRs, educational supervisors, Tutors, and school LTFT leads to provide the same breadth and depth of training that full-time StRs undertake. In terms of local availability of such training, FICM cannot mandate specific post numbers to deaneries/LETBs; it can only set the educational standards that must be reached. We would urge StRs who are having issues with this to contact their local Faculty Tutor and Regional Advisor.

Conclusion

On those areas we have responded, we hope we have addressed your points and provided further background and explanation, along with assurances of the steps we are taking to make necessary improvements. We know that there are outstanding issues, and we will write to you again on 9 December to give you a further update on progress.

If you raised questions which you do not feel are covered, please contact us.

It is important that we listen and use what you have told us as the basis for change which will benefit our members, our patients, and our specialty.

We know we are not there yet, but we are committed to continual improvement, to keeping our members at the forefront of what we do, and to upholding the degree of integrity that you expect.