The Royal College of Physicians and Surgeons of Canada: Overseeing Postgraduate Medical Education and Specialist Standards

The Royal College of Physicians and Surgeons of Canada (RCPSC) plays a leading role in specialty and subspecialty postgraduate medical education (PGME) in Canada. Established by an Act of Canadian Parliament in June 1929, the RCPSC has been instrumental in shaping medical education and ensuring high standards of specialist training and care for over 95 years.

Mandate and Responsibilities

The RCPSC's prime objective is to ensure the highest possible standards of specialist training and specialist care. To achieve this, the RCPSC:

  • Accredits PGME programs across Canadian universities
  • Sets the standards for specialty and subspecialty training
  • Administers certification examinations
  • Develops and oversees the Maintenance of Certification (MOC) program

The RCPSC influences the delivery of postgraduate medical education for every specialty except family medicine, which is overseen by the College of Family Physicians of Canada. Through its Competence by Design initiative and CanMEDS Framework, and through its accreditation of all postgraduate medical education programs in universities across Canada, the RCPSC directs postgraduate specialty training in Canada.

Certification and Examinations

RCPSC Certification is required by Medical Regulatory Agencies (MRAs) across Canada as a condition of entry to independent practice. To become certified, a physician must pass Royal College examinations. Access to these examinations is usually gained by completing a Royal College-accredited residency program at a Canadian university.

The RCPSC Certification process does not exist in a vacuum. While the RCPSC is an independent not-for-profit Canadian corporation with the mandate of overseeing postgraduate medical education, it exists within a legislative and academic environment that demands certain standards of practice for its Certification processes.
Examination Structure

From the 1940s to the 1970s, the Royal College conducted examinations at two levels in most specialties: Fellowship, the higher qualification, and Certification, a lesser designation. Today, Fellows of the Royal College use the designation FRCPC (Fellow of the Royal College of Physicians of Canada) or FRCSC (Fellow of the Royal College of Surgeons of Canada), depending on their qualifications.

Concerns Regarding Examinations

Recently, questions have been raised regarding the usefulness of such high-stakes examinations. If such examinations are to be fair and equitable, they must be designed and implemented in accordance with best practices for educational testing, and the processes for implementation and grading must be transparent and fair.

Some IMGs who have completed RCPSC examinations have reported difficulty passing these exams. These respondents attribute their difficulty to a variety of factors, such as the inclusion of Canadian-specific minutiae that they deem irrelevant to assessing entry-to-practice levels of competency, as well as a lack of access to the Canadian study resources that CMGs, and IMGs who trained in Canada, can readily obtain through their programs.

Postgraduate Medical Education (PGME)

Almost every provincial medical regulatory agency (MRA) in Canada requires RCPSC Certification as a condition of full registration as a specialist. While the RCPSC is not legislatively empowered to make entry-to-practice decisions, most provincial MRAs have delegated their legislative authority to decide on competence for entry to practice in a specialty to the RCPSC as a third-party assessor. As a result of this delegation, the RCPSC has assumed a leading role in determining entry to practice in medicine in Canada, with RCPSC examinations and Certification representing the final prerequisite to application for full licensure by a provincial MRA.

Competence by Design (CBD) and CanMEDS

In 1996, the Royal College adopted CanMEDS, a medical education framework it developed that emphasizes the essential competencies of a physician. Revised in 2005, the CanMEDS competencies have now been integrated into the Royal College's accreditation standards, objectives of training, final in-training evaluations, exam blueprints, and the Maintenance of Certification program. Since its creation, CanMEDS has been adopted and adapted around the world.
Competency-based medical education, or CBME, is an outcomes-based approach to the design, implementation, assessment, and evaluation of a medical education program using an organizing framework of competencies.

In 2012, the Royal College began a multi-year plan to design, develop, implement, and sustain a program of CBME. In 2013, it announced Competence by Design, the name the organization has given to its reorientation toward a CBME model of learning and assessment. Under CBME, medical education (for residents in training and for specialist physicians pursuing lifelong learning) progresses not according to how much time a resident or certified physician has spent practicing certain skills, as has been the case in the past in Canada, but according to demonstrated achievement of the required competencies.

Maintenance of Certification (MOC)

The Royal College develops and administers a continuing professional development program called Maintenance of Certification (MOC) that requires Fellows to engage in certain activities to maintain their competence throughout their careers. Introduced in 2000, MOC is a core service delivered by the Royal College and is also open to health care professionals who are neither Fellows nor physicians.

MOC Program Sections

The Royal College recognizes qualifying accredited activities as meeting the requirements for MOC Section 3 credits. It also recognizes participation by Fellows in accredited group learning activities developed by providers accredited within the ACCME system, and held outside of Canada, as participation hours in MOC Section 1.

The Royal College replaced MAINPORT with a new credit tracking platform, My MOC, in August 2024.
ACCME Agreement

Effective immediately and through December 31, 2030, the ACCME and the Royal College of Physicians and Surgeons of Canada have renewed their agreement enabling Royal College Fellows to earn Section 1 credits for accredited CME that fits the definition of Group Learning, and/or Section 3 credits for participating in accredited CME that counts for Maintenance of Certification (MOC).

Through an agreement between the Accreditation Council for Continuing Medical Education and the Royal College of Physicians and Surgeons of Canada, medical practitioners participating in the Royal College MOC Program may record completion of accredited activities registered under the ACCME’s “CME in Support of MOC” program in Section 3 of the Royal College’s MOC Program.

Royal College Fellows can visit CME Passport and search for activities registered for MOC for one or more of the certifying boards. Any of these activities will count for Royal College MOC Sections 1 and 3.

Reporting Participation

CME providers do not need to report Royal College Fellows’ participation in their activities.

International Medical Graduates (IMGs)

An international medical graduate is someone who has completed medical training outside of Canada or the United States and whose medical training is recognized by the World Health Organization. The Royal College has assessed 29 international jurisdictions and deemed them as having met Royal College criteria. For graduates of these jurisdictions, the Royal College assesses their training to determine the extent to which they have met the Royal College training requirements. When the training is deemed comparable and acceptable, the IMG is ruled eligible to take the Royal College certification examination; success at the examination leads to Royal College certification.

Standards for Educational and Psychological Testing (SEPT)

The SEPT are intended to provide guidelines for best practice in developing and evaluating educational tests and examinations, and for ensuring that tests and the interpretations of their scores are valid for the tests’ intended uses. The SEPT provide guidelines for establishing the validity and reliability of educational testing. Validity refers to the extent to which evidence and theory support the interpretation of test scores for their intended uses. According to the SEPT, the validity of a test may be evaluated in several ways. One is content validity, or “an analysis of the relationship between the content of a test and the construct it is intended to measure.” Another is convergent and divergent validity, which involves evidence of the extent to which test scores relate to, or differ from, other measures intended to address the same or different constructs. Reliability is typically assessed through replications of the testing procedure to determine whether test results are consistent over time.

In addition to validity and reliability, the SEPT address the issue of fairness in testing. Tavakol and colleagues, in their paper on ensuring fairness in assessment in health professions education, emphasize the important role the SEPT play in ensuring that potential biases in test development and administration are minimized, and that tests are fair for all intended groups regardless of examinee characteristics. According to the SEPT, fairness includes fairness in treatment during the testing process; fairness as a lack of measurement bias; and fairness in access to the constructs as measured, without the test taker being disadvantaged by personal characteristics such as age, disability, gender, race, ethnicity, or language, which ensures that the test measures only what it is intended to measure and not irrelevant factors. Fairness also includes ensuring the validity of individual test score interpretations for their intended uses. The SEPT note that when drawing inferences about an examinee’s performance, skills, and abilities, it is important to consider how the examinee’s individual characteristics, such as ethnicity, may interact with the design and implementation of the testing situation.

The Medical Council of Canada Qualifying Examination Part 1 (MCCQE1) provides an excellent example of the above SEPT standards in practice, and some might argue that it represents a “gold standard” in assessment of medical competency in Canada. The practices described in the MCCQE Part 1 Annual Technical Report correspond closely with the SEPT guidelines: the exam appears well researched, with clear, transparent reporting of its validity, reliability, and psychometric properties. The MCC also gathers candidate feedback through a post-examination survey and reports the results transparently. This transparent reporting creates confidence in the fairness of the MCCQE1 examination process, which is itself a key requirement of the SEPT.

Concerns Regarding Examination Transparency and Fairness

A search of the academic literature in Google Scholar using search terms such as “validity,” “reliability,” “RCPSC,” and “examinations” returned no studies examining the validity or reliability of RCPSC exams. This dearth of research may be a result of RCPSC confidentiality strictures, and it reflects a troubling lack of transparency regarding RCPSC exam processes.

The SEPT document described above sets out critical elements of fair testing, including fairness as a lack of measurement bias and fairness in access to the constructs as measured. Without a transparent, detailed technical report such as the MCC provides, and without access to psychometric data and reliability and validity studies, it is difficult to assess how fair the RCPSC exams are to different cohorts. As discussed below, some IMGs who have completed RCPSC examinations report difficulty passing them, which they attribute to Canadian-specific minutiae they consider irrelevant to entry-to-practice competency and to unequal access to Canadian study resources. Without a transparent report from the RCPSC on IMG performance, it remains unknown whether IMGs fare worse than CMGs on RCPSC exams, and we are forced to rely upon anecdotal reports.

Prat, in Healthy Debate, describes what he experienced as a lack of fairness and transparency from the RCPSC regarding his request for a formal review of his exam scores, based on his belief that he failed the psychiatry specialty exam because factually inaccurate questions were included in the exam. Prat notes that, “The examination process and appeal do not permit anyone to review the content of the examination as it is deemed ‘confidential to the Royal College and not shared with candidates.’ There is therefore no possibility for the candidates to check the accuracy of the expected answers, not even for the sake of understanding where they failed.” Prat goes on to state that, “When a request for appeal is made, there is limited chance of success since candidates do not have access to anything tangible. Formal reviews of examinations are conducted only based on alleged significant procedural irregularities in the assessment process, not because of alleged errors in content.”

SOCASMA Survey on RCPSC Examination Experiences

SOCASMA surveyed fifty-three anonymous participants who completed an online Google Forms survey about their experiences with a variety of RCPSC specialty exams. Caution needs to be used in interpreting the results of the SOCASMA RCPSC Examination Experiences Survey. The number of respondents is low, at only fifty-three. The summary posted online contains only a brief description of the survey’s methodology: it states that “participants were recruited through invitations and notices on various medical forums and through interested stakeholder groups,” and goes on to caution that “this survey may be subject to some degree of response bias.” Given the reported methodology, selection bias is likely, and the combination of selection bias and low respondent numbers limits the generalizability of the survey. Also, based on the online summary, the authors conducted only a surface analysis of the quantitative data. As an example, and as noted later in this paper, there is no reporting of how many CMGs versus IMGs had access to past MCQ questions, and no analysis of how such access may have affected respondents’ reported pass rates on RCPSC exams. Further, the study relies entirely on self-reported data, and the survey included considerable anecdotal qualitative data, which is subjective in nature and not subject to independent verification.

Despite these limitations, in the absence of any transparent reporting by the RCPSC of the results from its post-examination surveys, this survey appears to represent the only publicly available information regarding participants’ experiences of the RCPSC examination process.

Survey Demographics and Key Findings

In terms of survey demographics, the RCPSC Examination Experiences Survey reported that “67.3% of respondents were International Medical Graduates (IMGs), 30.8% were Canadian Medical Graduates (CMGs) and 1.9% were US Medical Graduates (USMGs).” One of the questions the Survey asks respondents is, “In your opinion, do you believe the RCPSC exam you wrote was objective and fair?” Of 33 responses, 18 answered “Yes” and 15 answered “No,” meaning 45.5% of respondents did not believe the RCPSC exam they wrote was objective and fair. This suggests a significant perception problem with the objectivity and fairness of RCPSC exams in the population sampled. It is all the more concerning given that 34 of the 53 respondents (64%) reported passing their RCPSC exams, suggesting that respondents were not simply complaining about exams they had failed. Another question asked in the Survey is, “When you think about the overall exam experience, do you believe it was fair and transparent?” Of 38 responses, 18 answered “Yes” and 20 answered “No,” meaning 52.6% of respondents did not think the overall exam experience was fair and transparent. As noted above, this data should be interpreted with caution given the methodological issues of the survey and the absence of any deeper data analysis.
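The percentages quoted above follow directly from the raw response counts; a minimal sketch of the arithmetic (the helper name `pct` is illustrative, not from the survey):

```python
def pct(part: int, total: int) -> float:
    """Percentage of `part` out of `total`, rounded to one decimal place."""
    return round(100 * part / total, 1)

# "Not objective and fair": 15 of 33 responses
fairness_no = pct(15, 33)    # 45.5
# Respondents reporting a pass: 34 of 53
passed = pct(34, 53)         # 64.2, reported in the survey summary as 64%
# "Not fair and transparent": 20 of 38 responses
experience_no = pct(20, 38)  # 52.6
print(fairness_no, passed, experience_no)
```

This confirms the reported figures are internally consistent with the stated counts, though it says nothing about the representativeness of the sample itself.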

For perspective on the above data: while the MCCQE1 Candidate Survey in the 2020 Technical Report does not ask specifically about candidates’ experiences of exam fairness, several questions are analogous and may offer points of comparison. The Candidate Survey asks respondents, “How would you rate your overall exam experience?” (p. 44). For the January 2021 examination cohort, 84% of candidates rated their exam experience as Good, Very Good, or Excellent. The Candidate Survey also asks whether “The MCQ section provided an opportunity for me to demonstrate my level of medical knowledge” (p. 42); 78% of candidates agreed or strongly agreed with this statement, with a further 18% neither agreeing nor disagreeing. Another question that speaks to perceptions of fairness is “The questions were clearly written” (p. 42); 74% agreed or strongly agreed, with a further 17% neither agreeing nor disagreeing. Against these MCCQE1 responses, the finding that 52.6% of RCPSC survey respondents did not think the overall exam experience was fair and transparent appears concerningly high.

Qualitative Feedback from Survey Respondents

Respondents had an opportunity to provide comments regarding why they answered the above questions on the RCPSC Examination Experiences Survey the way they did. Some of the comments included:

  • “I had some old exams but not the most recent couple years.”
  • “The exam contained a lot of new data that is not relevant to a newly graduated non sub specialized surgeon. Also they asked about Hazard ratio.”
  • “Questions were very poorly written, in some cases clearly had been translated from French as the grammar was French (word order). Multiple answers were often correct and clinically inappropriate to choose one over another given that both interventions/treatments are critical e.g.”
  • “Many of the questions are apparently repeats that certain individuals and programs have access to for practice and comprises a majority of the questions on the exam.”
  • “Canadian students/residents had access to questions well in advance and the ability to ask their supervisors (the ones writing the questions) what the answers were. These answers were shared around to Canadians. Speaking to colleagues who were CMGs and passed the exams well after the fact, they pretty uniformly say there’s no way anyone can pass that exam without the previous questions and specific coaching.”
  • “The RC do not provide the answers that are expected, so there is no way to know if they have made a mistake or not. We need to blindly rely on the knowledge content expert with no way to double checking.”
  • “No transparency at all.”
  • “I have been told by numerous people who have taken the exam (CMGs and IMGs) that there is no way to pass the exam without having access to illegally shared databanks of remembered past exam questions, and that the Royal College recycles up to 80% of past exam questions, that they are overly detailed and lacking in clinical relevance and that simply studying will not be enough to pass.”
  • “Canadian trainees have the old exam questions and the exam is almost entirely old questions. If you don’t have some way to get them, it would be very difficult to pass the exam, almost impossible. Additionally, there are no standardized/accepted materials such as books or question banks to study from as are available in other countries. In the USA, there are books th…

Awards and Grants

The Royal College's awards and grants program distributes $1 million a year in awards, grants, fellowships, and visiting professorships. Awards recognize the importance and potential impact of specialist physicians' work, and categories include original research, personal achievements, and visiting professorships. Among the more notable Royal College awards is the Teasdale-Corti Humanitarian Award, which recognizes physicians and surgeons who, while providing health care or emergency medical services, go beyond the accepted norms of routine practice, which may include exposure to personal risk. The award is named in honor of Dr. Lucille Teasdale and Dr. Piero Corti. The International Medical Educator of the Year Award is given to an international medical educator who has demonstrated lasting impact and a commitment to enhancing ethics and humanism in residency education.

Resident Membership

Since 1997, the Royal College has also offered a category of resident membership called “resident affiliate” in an attempt to engage residents at an early stage of their careers. Those who choose to join the Royal College receive complimentary membership during the time they are registered in a Royal College-accredited residency program.